Improve and optimize the testing process

Building Test Architecture and Test Strategy in a CRM Project

The challenge

  1. It is not possible to determine test coverage because of an inconsistent approach to writing and storing test cases: manual testers use checklists, while test automation developers use Cucumber feature files.
  2. Automated testing in the test environment is not possible because the test data became incompatible after several updates.
  3. Chatbot testing is not automated, except for a couple of basic scenarios in the mobile applications.
  4. There is no performance or security testing in the project.
  5. Testers who validate releases in the production environment have access to users' personal data.

The solution

  1. Improved the transparency of the testing process in the project.
  2. Improved the measurability and manageability of the testing process.
  3. Built stable test environments thanks to more accurate test data.
  4. Achieved independence of development and testing in the teams.
  5. Achieved a significant reduction in testing time, not only by speeding up test execution but also by redistributing tests across testing levels.
  6. Made non-functional testing mandatory for every major release.

Project background

This is a general-purpose CRM project. It includes a user portal and two mobile applications, one for internal and one for external CRM end users, with Live Chat and several chatbots for solving delivery problems, placing orders, and contacting technical support.

Part of the CRM functionality is outsourced to other companies.

The Initial Testing Assessment

01.

From a business perspective

From the customer’s business perspective, the most important CRM features are mobile applications’ functionality, chatbots, and dashboards on the user portal.

Less important are the development of and improvements to the external API, the reporting system, and the company’s internal portal.

02.

In terms of data

In a test environment, all data is created by the tests themselves.

If problems arise in the production environment, defect root cause analysis is performed there.

Unit tests use a predefined set of test data.

The third-party companies to which part of the functionality is outsourced provide test accounts with a set of test data.

03.

From the application point of view

There are web applications, mobile applications, APIs, and external services that provide interfaces for:

  • mass e-mailing
  • creation of electronic documents using templates
  • IP telephony service

04.

In terms of technology

Each module is covered by unit tests in the main programming language of the module.

Integration and main business scenarios are tested by end-to-end tests for web applications and for mobile applications.

Gap analysis

After studying the current state of the project, the following gaps were identified:

  1. It is not possible to determine test coverage because of an inconsistent approach to writing and storing test cases: manual testers use checklists, while test automation developers use Cucumber feature files.
  2. Automated testing in the test environment is not possible because the test data became incompatible after several updates.
  3. Chatbot testing is not automated, except for a couple of basic scenarios in the mobile applications.
  4. There is no performance or security testing in the project.
  5. Testers who validate releases in the production environment have access to users' personal data.

Opportunities and Solutions

After careful analysis of existing testing processes, the following improvement steps have been proposed:

  1. Introduce a unified approach to writing and storing test cases. Establish a transparent connection between requirements and test cases. For automated tests, implement automated execution reporting.
  2. Split test data into master data (the initial data set in the database needed to start testing) and test data for testing the mobile applications and chatbots.
  3. Deploy test environments following the “Infrastructure as Code” approach.
  4. Integrate static code analysis tools into CI/CD pipelines to help meet security and performance requirements.
  5. Shift chatbot testing “to the left”: before starting the chatbot tests, prepare the necessary test data with automated scripts. Use the API to send chatbot commands and check the expected results (a minimal sketch follows this list). Redistribute the scope of testing: check the entire functionality of the chatbots via the API, and check the integration and the presence of graphic elements and controls on mobile devices.
  6. Implement performance testing of chatbots. To eliminate the influence of mobile devices’ network connections, carry out performance testing at the API level.
  7. Based on production data, generate de-personalized test data (see the second sketch after this list). Shift testing “to the right”: use production-like data in test environments and check the correctness of data updates after new functionality is released.
  8. Generate a large amount of test data and start performance testing of the web applications.
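
As an illustration of item 5, here is a minimal sketch of an API-level chatbot check in Python (pytest and requests). The endpoint URL, payload fields, and reply structure are hypothetical placeholders, not the project's real chatbot API.

```python
# A minimal pytest sketch for API-level chatbot checks.
# The endpoint, payload fields, and expected reply structure are
# hypothetical placeholders for the project's real chatbot API.
import requests

CHATBOT_API = "https://test-env.example.com/api/v1/chatbot"


def send_command(session_id: str, text: str) -> dict:
    """Send a command to the chatbot and return its JSON reply."""
    response = requests.post(
        f"{CHATBOT_API}/messages",
        json={"session_id": session_id, "text": text},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


def test_order_status_scenario():
    # The order itself is assumed to be created beforehand by a test data script.
    reply = send_command(session_id="test-session-1", text="Where is my order #12345?")
    assert reply["intent"] == "order_status"
    assert "12345" in reply["message"]
```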

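Item 7 (de-personalized test data) could be implemented, for example, as a small script that copies a production export and replaces personal fields with realistic fake values. The field names, the CSV format, and the use of the Faker library below are illustrative assumptions.

```python
# A sketch of de-personalizing production data for test environments.
# Field names and the CSV format are assumptions; the Faker library is
# one possible way to generate realistic replacement values.
import csv

from faker import Faker

fake = Faker()


def depersonalize(src_path: str, dst_path: str) -> None:
    """Copy a production export, replacing personal fields with fake values."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            row["full_name"] = fake.name()
            row["email"] = fake.email()
            row["phone"] = fake.phone_number()
            # Non-personal fields (orders, statuses, timestamps) are kept as-is.
            writer.writerow(row)


if __name__ == "__main__":
    depersonalize("production_export.csv", "test_data.csv")
```
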
Implementation Plan

This is a brief, high-level plan. I want to show how, once an updated testing strategy has been developed, specific tasks for its implementation are formed.

  1. Develop a test data generator based on production data.
  2. Develop test data requirements for testing chatbots.
  3. Based on the requirements from task #2, write scripts to generate test data for chatbot testing.
  4. Conduct a workshop with testers and developers on testing chatbots and using scripts to create test data.
  5. Configure automated test suites for the updated CI/CD pipelines.
  6. Expand the performance testing environment.
  7. Develop scenarios for performance testing and write automated tests (a possible sketch follows this list).
  8. Develop scenarios for security testing.
  9. Perform security testing of the latest major release on the production environment.
  10. Write or integrate libraries for testing the chatbots via the API into the test framework.
  11. Export existing test cases to the Test Management Tool.
  12. Implement automated reporting in all automated test frameworks (see the reporting sketch at the end of this plan).
  13. In the Project Management Tool, configure a dashboard with information about test coverage of each module, test results, and registered defects.
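
For task #7, the performance scenarios could be automated with a load-testing tool. The sketch below uses Locust as one possible option; the host, endpoints, and payload are placeholders rather than the project's real API.

```python
# A minimal Locust sketch for API-level performance testing.
# Locust is one possible tool choice; endpoints and payload are placeholders.
from locust import HttpUser, between, task


class ChatbotUser(HttpUser):
    wait_time = between(1, 3)  # simulated user think time, in seconds

    @task(3)
    def send_chatbot_message(self):
        self.client.post(
            "/api/v1/chatbot/messages",
            json={"session_id": "load-test", "text": "Where is my order?"},
        )

    @task(1)
    def open_dashboard(self):
        self.client.get("/api/v1/dashboard/summary")

# Example run: locust -f locustfile.py --host https://test-env.example.com
```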

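Task #12 could be addressed by attaching a reporting plugin to the existing test frameworks. The sketch below assumes pytest with the allure-pytest plugin, which is only one possible choice; the test body is a placeholder.

```python
# A sketch of automated test reporting with the allure-pytest plugin
# (one possible reporting tool; the test itself is a placeholder).
import allure


@allure.feature("Chatbots")
@allure.story("Order status")
def test_order_status_is_reported():
    with allure.step("Send an order-status command to the chatbot"):
        reply = {"intent": "order_status"}  # placeholder for a real API call
    with allure.step("Check the detected intent"):
        assert reply["intent"] == "order_status"

# Example run: pytest --alluredir=allure-results && allure serve allure-results
```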