Effectively Utilizing Production Testing Data



The term “testing in production” refers to a practice that is critical to thoroughly testing an application: continuing to test and monitor code after release, both to prepare for disaster recovery and to analyze the user experience.

The process of setting up production testing is easy enough—you simply extend your testing tools and workflows to cover production environments. But making use of the data you collect during production testing can be more complicated.

With that challenge in mind, this article discusses several ways in which you can put production testing data to good use, such as diagnosing critical errors before they become catastrophic to the application, or even leveraging this test data in future feature planning. In addition, we will discuss the process for efficiently sharing this production testing data with the rest of the DevOps team in an effort to provide full visibility for the organization.


Let’s begin by explaining how you can get started with production testing and collecting production test data.

Testing in production can be achieved through several straightforward techniques. These include A/B testing, which evaluates the user response to new or modified features, and performance monitoring, which collects data within the application to surface performance issues that might otherwise be difficult to identify.

  • A/B testing – A/B testing, by definition, can only be done effectively in a production environment. Popular within web applications, it refers to releasing two different versions of the same feature (version A and version B) and collecting data to evaluate the user response to each. Every user who encounters that feature in the application is presented with one of the two versions, and metrics-tracking code gathers information about how users respond to each version.
  • Canary release – In essence, a canary release is dipping your toes in the water before a full release. It refers to deploying code changes to a server in the production environment that receives only a small subset of traffic. This allows those releasing the code to confirm that the new version works as intended with the production infrastructure before performing a full release across all production servers.
  • Performance monitoring – Application performance monitoring can be implemented using one of the many tools that exist today. The idea is to implement continuous testing for your web application in a production environment. The tool you choose should collect data such as page load times, and should also track error codes returned for specific requests.
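To make the A/B technique above concrete, here is a minimal sketch of how users might be split between two versions of a feature. The function name and experiment label are hypothetical; the key idea is that hashing a stable user identifier keeps each user in the same bucket across visits without storing any state.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "signup-form") -> str:
    """Deterministically assign a user to variant A or B.

    Hashing the user id together with the experiment name means the
    same user always lands in the same bucket for this experiment,
    while different experiments split users independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

In practice, many teams delegate this bucketing to a feature-flag service rather than rolling their own, but the deterministic-hash approach shown here is a common baseline.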

The question posed at the beginning of this piece was: How can we utilize production testing data effectively? Assuming the strategies above have been implemented properly, it is now time to leverage that data to improve the application. So how do we do that?


Setting goals is the first step to making the most of production testing data. These goals will vary depending on the type of data you are collecting. If you have implemented an A/B test, good goals would be to select the better-performing version (this will take some data analysis) and to gather ideas for future feature planning. Take the example below:

You have implemented two versions of a new feature on your website: a prompt encouraging visitors to sign up for news-related emails. The two versions (version A and version B) differ slightly in presentation. By adding some metrics-tracking code, you can record the views of each version as well as the actions taken with each, including whether the sign-up is completed. This helps you determine which version is more popular and effective with users.
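The metrics tracking described in this scenario can be sketched as a small counter: one tally of views and one of completed sign-ups per variant, from which a conversion rate falls out. The class and method names here are illustrative, not any particular library's API.

```python
from collections import defaultdict

class SignupExperiment:
    """Toy metrics tracker for the hypothetical sign-up A/B test."""

    def __init__(self):
        self.views = defaultdict(int)    # impressions per variant
        self.signups = defaultdict(int)  # completed sign-ups per variant

    def record_view(self, variant: str) -> None:
        self.views[variant] += 1

    def record_signup(self, variant: str) -> None:
        self.signups[variant] += 1

    def conversion_rate(self, variant: str) -> float:
        views = self.views[variant]
        return self.signups[variant] / views if views else 0.0

    def winner(self) -> str:
        """Return the variant with the higher conversion rate."""
        return max(self.views, key=self.conversion_rate)
```

A real implementation would persist these counts and apply a significance test before declaring a winner; this sketch only shows the shape of the data being collected.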

Given the above scenario, your DevOps team can use the data not only to determine which version of the feature should be rolled out fully in the web application, but also to serve a bigger purpose. Elements from the winning version of the A/B test may prove useful when planning future features for the application.

The same strategies for utilizing production testing data hold true when analyzing data from canary releases or performance monitoring software. Taking note of common issues within your application, and applying what has been learned to future iterations, allows the application to improve as new features or modifications to existing features are implemented down the line. For instance, if certain calls to the database are causing performance issues for certain pages, taking the time to analyze what is at the core of those database calls can help the development team learn as well. They can design more efficient queries for retrieving resources from the database in the future, avoiding the same issues on pages added to the application later.
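One lightweight way to spot the slow database calls mentioned above is a timing decorator that logs any wrapped call exceeding a latency budget. The threshold and function names below are assumptions for illustration; dedicated APM tools do this (and more) automatically.

```python
import logging
import time
from functools import wraps

SLOW_QUERY_THRESHOLD = 0.5  # seconds; tune to your own latency budget

def log_slow_queries(func):
    """Log a warning whenever the wrapped call exceeds the threshold."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - start
            if elapsed > SLOW_QUERY_THRESHOLD:
                logging.warning(
                    "Slow query: %s took %.3fs", func.__name__, elapsed
                )
    return wrapper

@log_slow_queries
def fetch_user(user_id):
    # Placeholder for a real database lookup.
    return {"id": user_id}
```

Reviewing these warnings over time surfaces the recurring offenders that are worth redesigning.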


Sharing production testing data with the rest of the organization is a huge part of making that data as useful as it can be. The first step is to leverage a cloud testing platform that makes sharing easy. For instance, Sauce Labs integrates with Slack, the popular collaboration tool. Taking advantage of platforms like these helps disseminate the data, ensuring that everyone in the organization who needs it has access and can use it accordingly.
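Even without a vendor integration, a test-result summary can be pushed to a channel via Slack's incoming webhooks, which accept a JSON body with a `text` field. The webhook URL below is a placeholder; you would generate a real one in your own Slack workspace.

```python
import json
import urllib.request

# Hypothetical placeholder; replace with a webhook URL generated
# for your own Slack workspace.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"

def build_slack_payload(message: str) -> bytes:
    """Format a message as the JSON body Slack incoming webhooks expect."""
    return json.dumps({"text": message}).encode("utf-8")

def post_test_summary(message: str, webhook_url: str = SLACK_WEBHOOK_URL) -> int:
    """POST a production-testing summary to a Slack channel.

    Returns the HTTP status code (200 on success).
    """
    request = urllib.request.Request(
        webhook_url,
        data=build_slack_payload(message),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```

Wiring a call like `post_test_summary("Canary release v2.3: error rate nominal")` into the end of a test run keeps the whole team in the loop without anyone polling a dashboard.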


Implementing a production testing strategy is just one step in the effort to build quality software. The data gathered from this testing is pointless if it is not used appropriately. Ensuring that the team is educated on the potential uses of production testing data and prepared to share and analyze conclusions drawn from the data can help to make production testing more valuable than ever before.

Scott Fitzpatrick has over 5 years of experience as a software developer. He has worked with many languages, including Java, ColdFusion, HTML/CSS, JavaScript and SQL. Scott is a regular contributor at Fixate IO.

