Best Practice for Handling Production Data in Local/Dev/Test Environments by Virto DevLabs

Virto Commerce · 4 min read · Jan 31, 2022

Many companies can tell you stories of a production database being copied into a test environment to develop new features, only for that data to be compromised or incorrectly processed, damaging the company’s reputation with customers.

Some may argue that access to production data (or a copy of it) lets you develop a feature or fix a bug faster, but broad experience suggests otherwise: it either takes the same amount of time or, more often, several times longer.

One of Virto’s partners shared a similar story from an undisclosed company: a copy of the production database in a developer’s hands was somehow indexed by Google, and a customer even placed an order for a product from that test database. Everything ended well that time, but the company had to apologize to the customer and process the order manually, which took a significant amount of staff time.

3 Rules for Handling Production Data in Local/Dev/Test Environments

What is the best practice for handling production data while developers are working on new features? At Virto Commerce, we do not recommend using production data in local/dev/test environments at all.

Never copy a production database for dev/test purposes

Any Chief Technology Officer or Chief Information Officer should immediately reject any idea of making full copies of production data for developers or testers. Why is this so critical?

  • First of all, many examples of using production data in dev/test have resulted in reputational damage. For example, an incorrect email notification or payment request may get sent from a test system to an entire client population.
  • Secondly, the level of security, specifically around the protection of private data, tends to be less strict for test systems. There is little point in having elaborate controls around access to production data if that data is copied to a test database that every developer and QA engineer can access. Although you can obfuscate the data, this tends to be applied only to specific fields, for example, credit card numbers (see the masking sketch after this list).
  • Lastly, copying production data to test systems can break privacy laws, for example, where test systems are hosted or accessed by dev teams from a different country or region. This last scenario is especially problematic for complex cloud deployments.
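
If a partial copy is genuinely unavoidable, identifying fields should at least be masked before the data ever leaves production. Below is a minimal sketch of that idea in C#; the CustomerRecord type and the masking rules are invented for this illustration and are not part of any Virto Commerce API.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Hypothetical record type, used only for this illustration.
public record CustomerRecord(string Name, string Email, string CardNumber);

public static class Obfuscator
{
    // Replace the e-mail with a short hash-based alias: the value stays unique
    // (useful for joins and repro steps) but no longer identifies a person.
    public static string MaskEmail(string email)
    {
        var hash = Convert.ToHexString(
            SHA256.HashData(Encoding.UTF8.GetBytes(email)))[..12];
        return $"user-{hash}@example.test";
    }

    // Keep only the last four digits of a card number.
    public static string MaskCard(string cardNumber) =>
        cardNumber.Length > 4
            ? new string('*', cardNumber.Length - 4) + cardNumber[^4..]
            : "****";

    public static CustomerRecord Mask(CustomerRecord customer) =>
        customer with
        {
            Email = MaskEmail(customer.Email),
            CardNumber = MaskCard(customer.CardNumber)
        };
}
```

Even with masking in place, the safer default remains the one below: generate the data rather than exporting it.
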
Do not touch production data; use dummy data instead

Top 5 Recommendations from the Virto Team

When developing, testing, or demonstrating something, it is often important to use test material that is not real data. This could be because you do not have the actual data yet, or because you do not want to use data from real users, which might contain sensitive information. From production data, you need only the symptoms and metrics, and these are enough to identify bottlenecks.

  1. Create fake data for dev/test as the safer approach, and employ the simple tools that exist to help create it. One such tool is Faker.Net, which makes it easy to build test datasets full of random names, addresses, emails, and other common data types. Plausible fake data can make a big difference when you are testing or demoing a project, both visually and functionally (see the seeding sketch after this list).
  2. Use Azure Application Insights reports to isolate the problem and gain insight without access to production data. Application Insights, a feature of Azure Monitor, is an extensible Application Performance Management (APM) service for developers and DevOps teams: it monitors your live applications, automatically detects performance anomalies, and includes analytics tools to help you diagnose issues and understand what users actually do with your app. It can also monitor and analyze telemetry from mobile apps by integrating with Visual Studio App Center (a minimal setup sketch follows this list).
  3. Use clean error messages. Every error should state its own cause.
    Do not rely on bare system error codes, which testers and developers then have to spend time deciphering. Assigning numeric codes to errors was common practice on mainframes, but today it looks outdated (see the before-and-after sketch below).
  4. Use unit and integration tests to find and fix the problem. Unit testing exercises individual application modules in isolation (without any interaction with dependencies), which lets you verify that the code works correctly; integration testing is the next step and ensures that the different modules still work correctly when combined. For performance issues, BenchmarkDotNet similarly protects you from popular benchmarking mistakes and warns you if something is wrong with your benchmark design or the obtained measurements (a minimal unit test sketch follows this list).
  5. Split the solution architecture into several subsystems or cells, for example, following the approach of Virto Atomic Architecture™ by Virto Commerce. Then you will need only a subset of the demo data in local/dev/test environments.
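
For recommendation 1, here is a minimal seeding sketch. It assumes the static helpers exposed by the Faker.Net NuGet package (such as Faker.Name.FullName() and Faker.Internet.Email()); check the documentation of the version you install for the exact API, and note that the TestCustomer type is invented for this example.

```csharp
using System.Collections.Generic;
using System.Linq;
// Requires the Faker.Net NuGet package.

// Invented DTO, used only for this example.
public record TestCustomer(string FullName, string Email, string City);

public static class TestDataFactory
{
    // Generate 'count' random but plausible customers for seeding a dev/test store.
    public static IReadOnlyList<TestCustomer> CreateCustomers(int count) =>
        Enumerable.Range(0, count)
            .Select(_ => new TestCustomer(
                Faker.Name.FullName(),   // assumed Faker.Net helper, e.g. "Jane Smith"
                Faker.Internet.Email(),  // e.g. "jane.smith@example.com"
                Faker.Address.City()))   // e.g. "Springfield"
            .ToList();
}

// Usage: seed a local database or an in-memory store instead of copying production.
// var customers = TestDataFactory.CreateCustomers(1000);
```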
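
For recommendation 2, the sketch below shows a typical way to wire Application Insights into an ASP.NET Core application and record a custom event, so problems can be diagnosed from telemetry rather than from a copy of production data. The endpoint and the event name are illustrative; the connection string is expected in configuration (for example, APPLICATIONINSIGHTS_CONNECTION_STRING).

```csharp
using Microsoft.ApplicationInsights;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Registers the Application Insights SDK (Microsoft.ApplicationInsights.AspNetCore);
// request, dependency, and exception telemetry are collected automatically.
builder.Services.AddApplicationInsightsTelemetry();

var app = builder.Build();

// Illustrative endpoint: record a custom event instead of logging customer data.
app.MapPost("/orders", (TelemetryClient telemetry) =>
{
    telemetry.TrackEvent("OrderPlaced");
    return Results.Accepted();
});

app.Run();
```

Once telemetry is flowing, the Failures and Performance views in the Azure portal will usually point to the failing dependency or slow query without anyone needing the data that triggered it.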
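
For recommendation 3, a small before-and-after sketch (the error code, class, and messages are invented): the second version tells the tester exactly what went wrong, with no lookup table required.

```csharp
using System;

public class InventoryService
{
    // Before: an opaque numeric code the caller has to decipher.
    public int ReserveOld(string sku, int quantity) =>
        quantity <= 0 ? -1041 : 0;

    // After: the error states its own cause.
    public void Reserve(string sku, int quantity)
    {
        if (quantity <= 0)
            throw new ArgumentOutOfRangeException(
                nameof(quantity),
                $"Cannot reserve {quantity} unit(s) of '{sku}': quantity must be a positive number.");

        // ... reservation logic ...
    }
}
```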
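
For recommendation 4, a minimal xUnit sketch. The OrderCalculator class is invented for this example, but the pattern, an isolated unit test plus generated test data, is what makes production data unnecessary for most bug hunts.

```csharp
using Xunit;

// Invented class under test, used only for this example.
public class OrderCalculator
{
    public decimal Total(decimal price, int quantity, decimal discountPercent) =>
        price * quantity * (1 - discountPercent / 100m);
}

public class OrderCalculatorTests
{
    [Fact]
    public void Total_AppliesDiscount_ToLineAmount()
    {
        var calculator = new OrderCalculator();

        // 2 items at 50.00 with a 10% discount should cost 90.00.
        var total = calculator.Total(price: 50m, quantity: 2, discountPercent: 10m);

        Assert.Equal(90m, total);
    }
}
```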

We do recognize there are reasons for specific elements of production data to be copied, for example, in the reproduction of complex bugs or for the training of specific ML models. In these instances, however, our advice is to proceed with extreme caution.

Written by Virto Commerce

Digital commerce software | the most scalable & customizable B2B open source .NET ecommerce platform