Plutora Blog - Test Case Management, Test Environment Management, Test management

The Evolution of Test and the Enterprise Data Mart

Reading time 7 minutes

Software testing has seen significant evolutionary changes that have driven greater efficiency and improved test coverage. However, this has not removed the need for software testing to be tracked, managed and communicated to others.

Without this information, stakeholders and decision makers can’t determine whether a release has been sufficiently tested. Some level of risk is acceptable and unavoidable in business operations – but this lack of transparency makes it impossible to effectively manage the risk inherent in every software release. Considering the dire consequences of a glitch or bug making it into production, decision makers simply cannot let that risk go unquantified.

The challenge facing enterprise development teams has been that, for testing progress to be recorded, testing efforts had to be paused. Imagine a race car driver having to stop after each lap to record the lap completed, the lap time and any issues experienced – highly inefficient.


As software development continues to evolve, so do the activities associated with testing. While it seems that automation is being pushed at every possible stage of the delivery chain, complete automation of every test isn’t the answer. Neither is relying solely on exploratory testing. There needs to be a healthy mix of both.

Every organization will have a different ideal formula of automation vs. exploratory testing that works best for its product, environment, culture and delivery needs. The challenge is to find the mix that maximizes release quality while still meeting the timeline. But how can you determine the ideal proportions of this mixture when you lack the necessary testing data?

The Evolving Role of Testers

Traditionally, the tester’s role has been to sit at a desk and pound out test cases. While this is still a large part of a tester’s typical responsibilities, the role is also evolving. The smart tester always works with a key purpose in mind – ensuring that business-critical testing and testing records are correctly managed.

Correct management can include collating the data needed for decision makers to act decisively. It can also involve identifying which test results require other exploratory tests to hunt down potential issues.

The role of the tester has further evolved to include managing product quality. They do this by driving quality discussions and other activities that enhance the product. On many teams, testers become gatekeepers – they determine whether a release progresses or is sent back to the developers.

The most valuable testers know why, when and how the solution was tested. They know the quality of each release and of the overall product. A good tester will find ways to raise the bar on quality for the entire product team. Their provision of crucial data makes the decision maker’s job easier and more quality centric.

The role of tester can include activities such as:

  • Unit Testing
  • Test Environment Management
  • Continuous Integration Testing
  • Automated Testing
  • Exploratory Testing
  • Requirements Testing
  • End User Testing
  • Continuous Deployment Testing
  • Continuous Delivery Testing
  • Production-side Testing
  • Test reporting and analytics
  • Requirements coverage analysis
  • And much more.

While figuring out the ideal mix of testing activities is essential, the less talked about but equally important struggle is how to accurately capture the resulting test data.

Managing and Recording Testing Activities

This might seem simple at a glance, but upon implementation the complexity escalates quickly. It’s one thing to have a tester take an hour out of their workday to record their tests, results, etc. It’s quite another to implement this strategy for every tester across a multinational enterprise and have all of those reports roll up into a manageable, meaningful dataset.

This becomes exponentially more challenging when hundreds of projects, releases, requirements, release gates, compliance criteria, audits and thousands of test environments are in play. The complexity and scale involved only compound the question faced by decision makers – what is the risk of releasing this build to production?

Of course, this question really comprises multiple sub-questions. Have all the requirements been adequately tested? Have any new issues been introduced? Has each release been tested by the delivery team, the performance team, end users, etc.?
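Once test results live in a shared dataset, sub-questions like these stop being guesswork and become simple queries. As a rough sketch – with a hypothetical `TestRecord` shape that is purely illustrative, not any tool’s actual schema – the first and third questions might be checked like this:

```python
from dataclasses import dataclass

# Hypothetical test record; field names are illustrative assumptions.
@dataclass
class TestRecord:
    requirement_id: str
    team: str          # e.g. "delivery", "performance", "end-user"
    passed: bool

def untested_requirements(requirements: set[str], records: list[TestRecord]) -> set[str]:
    """Requirements with no recorded test at all."""
    return requirements - {r.requirement_id for r in records}

def teams_without_signoff(records: list[TestRecord], required_teams: set[str]) -> set[str]:
    """Required teams that have no passing test recorded for this release."""
    return required_teams - {r.team for r in records if r.passed}

records = [
    TestRecord("REQ-1", "delivery", True),
    TestRecord("REQ-1", "performance", True),
    TestRecord("REQ-2", "delivery", False),
]
print(untested_requirements({"REQ-1", "REQ-2", "REQ-3"}, records))  # {'REQ-3'}
print(teams_without_signoff(records, {"delivery", "performance", "end-user"}))  # {'end-user'}
```

The point is not the specific code but the prerequisite: these answers are only computable if every test run is captured somewhere central in the first place.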

Without current and complete information, there is really no way to effectively manage this type of risk. It’s this type of scenario that drives the crucial need for a testing data mart.

Establishing a Testing Data Mart

How can a data mart be established that tracks and associates user requirements with test environments, edge testing, test data, load tests, unit tests, regression tests, user testing, test cases, test plans, etc.? How can all of these things be tracked and communicated to decision makers? And finally, can the decision makers use this information to confidently release the product to production?
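At its core, a testing data mart is a set of linked records: requirements relate to test cases, test cases to runs, runs to environments and outcomes. The sketch below shows one minimal way such relationships could be modeled with SQLite – the table and column names are assumptions for illustration, not any vendor’s actual data model:

```python
import sqlite3

# Minimal illustrative schema linking requirements, test cases,
# environments and test runs. Names are assumptions, not a real product's model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE requirements (id TEXT PRIMARY KEY, title TEXT);
CREATE TABLE test_cases   (id TEXT PRIMARY KEY,
                           requirement_id TEXT REFERENCES requirements(id));
CREATE TABLE environments (id TEXT PRIMARY KEY, name TEXT);
CREATE TABLE test_runs (
    id             INTEGER PRIMARY KEY,
    test_case_id   TEXT REFERENCES test_cases(id),
    environment_id TEXT REFERENCES environments(id),
    outcome        TEXT CHECK (outcome IN ('pass', 'fail')),
    run_at         TEXT
);
""")
conn.executemany("INSERT INTO requirements VALUES (?, ?)",
                 [("REQ-1", "Login"), ("REQ-2", "Checkout")])
conn.execute("INSERT INTO test_cases VALUES ('TC-1', 'REQ-1')")
conn.execute("INSERT INTO environments VALUES ('ENV-1', 'staging')")
conn.execute("INSERT INTO test_runs VALUES (1, 'TC-1', 'ENV-1', 'pass', '2024-01-01')")

# The decision maker's question becomes a query:
# which requirements have no passing run anywhere?
gaps = conn.execute("""
    SELECT r.id FROM requirements r
    WHERE NOT EXISTS (
        SELECT 1 FROM test_cases tc
        JOIN test_runs tr ON tr.test_case_id = tc.id AND tr.outcome = 'pass'
        WHERE tc.requirement_id = r.id
    )
""").fetchall()
print(gaps)  # [('REQ-2',)]
```

Once the associations exist as data, the release question turns into a query over coverage gaps rather than a round of status meetings.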

There is a variety of test tools available to help with nearly every conceivable testing scenario. While these tools offer bells and whistles that set them apart from the competition, the smart tester knows that clear analytics and reporting are vital: a data mart that functions as a central source of truth rather than an isolated tool, and that automatically collects data and distributes analytics to decision makers and stakeholders regardless of their organizational or geographic location.

For environments where efficiency, process and reporting are paramount, Plutora provides a solution that competitors just can’t match. In addition to streamlined test management and improved workflow efficiency, its comprehensive data mart tracks everything: user information, versions, builds, test environments, test cases, requirements coverage, change management, defect management, automation, audit trails and even results and events from your favorite integrated tools.

Additionally, Plutora integrates seamlessly with your test team’s favorite tools, including Selenium, MS Test, REST and hundreds more. Plutora Test allows testers to focus on testing. With Plutora, you can focus on testing alone or build a more comprehensive end-to-end solution that also coordinates test environment management and release management. With the ability to completely configure your workflow and gate management for each project, Plutora gives you the flexibility to create the ideal solution for your organizational needs.

The Plutora data mart automatically captures every detail and result from every test throughout the enterprise. Powerful reporting and analytics tools then organize and feed real-time data to reports and dashboards, allowing testers, stakeholders and decision makers to know progress and results at any given moment.

Take the Plutora requirements traceability matrix: this matches all test activity with the requirements to ensure adequate test coverage and risk management. Additionally, Plutora has partnered with Tableau to give customers the most powerful analytics solution in the industry. This provides the ability to drill down into the data to further identify progress, coverage, areas of risk, KPIs, schedules, bottlenecks, and much more.
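Conceptually, a requirements traceability matrix is a pivot of run results: rows are requirements, columns are test phases, and each cell summarizes outcomes. A minimal sketch of that idea – with an assumed, illustrative data shape, not any tool’s actual export format – might look like this:

```python
from collections import defaultdict

# Illustrative (requirement, phase, outcome) records; shape is an assumption.
runs = [
    ("REQ-1", "unit", "pass"),
    ("REQ-1", "regression", "pass"),
    ("REQ-2", "unit", "fail"),
]
phases = ["unit", "regression", "end-user"]

# Build the matrix: "-" marks a coverage gap (no run at all for that cell).
matrix: dict[str, dict[str, str]] = defaultdict(lambda: {p: "-" for p in phases})
for req, phase, outcome in runs:
    cell = matrix[req][phase]
    # Any failure in a cell dominates earlier passes for the same cell.
    matrix[req][phase] = "fail" if "fail" in (cell, outcome) else outcome

for req in sorted(matrix):
    print(req, matrix[req])
# REQ-1 {'unit': 'pass', 'regression': 'pass', 'end-user': '-'}
# REQ-2 {'unit': 'fail', 'regression': '-', 'end-user': '-'}
```

Reading across a row immediately shows both failures and untested phases – which is exactly the coverage and risk picture such a matrix exists to provide.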

With Plutora, this data is always at your fingertips, whether you’re in a conference room or hiking a mountain trail. You can access the analytics or manage and initiate tests remotely via your laptop, mobile phone, tablet or even an Apple Watch. And because it’s SaaS-based, it’s further set apart from legacy tools by the ability to run in any web browser. With Plutora, testing is no longer limited to the confines of a cubicle.

Experience the freedom and flexibility of Plutora now with a free Demo.

Dan Packer

Dan is an Industry Specialist at Plutora. Dan got his first taste of programming in high school, coding games in Basic. Since then, he has been directly involved with nearly every aspect of the development and release lifecycle – coding, testing, project management, team management, architecture, databases, web and graphics design, and much more. He has implemented development lifecycle methodologies for companies such as Sears Financial, Novell, Sprint, Daimler-Benz Financial, Sabre, Centex and T-Mobile. In addition to his enterprise work, he has founded multiple companies and continues to work as a business and technology advisor on various domestic and international projects. In total, Dan has managed and orchestrated hundreds of deployments, development initiatives and thousands of iterative code enhancements.
