Automated testing for a social survey service
The client company develops software to collect, process and analyze survey data. The product gained traction during election campaigns in the USA, where it was used not only for sentiment evaluation but also to compile campaign statistics.
As the client didn’t have a testing department, they decided to engage Performance Lab. A third party was involved as the development contractor. The Performance Lab testing team was tasked with creating an automated testing tool that would cover 100% of the main system functionality and run tests regularly.
Additionally, we needed to implement an option to run selected tests. The development contractor was to support and maintain the tool afterwards. Our team was going to interact with the following service components:
- The web app that is responsible for the survey structure and creation.
- Mobile apps (Android, iOS), which collect the respondent data.
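The required option to run selected tests (rather than the whole suite every time) can be sketched as a simple name-based filter over registered tests. This is a minimal illustration in plain Java; the test names and the `SelectiveRunner` class are hypothetical, not taken from the actual project:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of a "run selected tests" option: tests are registered under
// names and only the requested subset is executed, in registration order.
class SelectiveRunner {
    private final Map<String, Runnable> tests = new LinkedHashMap<>();

    void register(String name, Runnable test) {
        tests.put(name, test);
    }

    // Runs only the named tests and returns the names actually executed.
    List<String> run(List<String> selected) {
        List<String> executed = new ArrayList<>();
        for (Map.Entry<String, Runnable> e : tests.entrySet()) {
            if (selected.contains(e.getKey())) {
                e.getValue().run();
                executed.add(e.getKey());
            }
        }
        return executed;
    }
}
```

In a CI setup, a parameterized job would pass the selected names, while the nightly job would pass the full list.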
Technical challenges in deploying the testing process
The Performance Lab team used a combined approach to adapt the testing workflow to the client’s needs. We developed the automated testing framework and performed manual testing in parallel, which helped us discover most of the issues.
Automated testing was performed on two identical staging environments: one located in the Performance Lab office, the other on the customer’s side. We first debugged the tests on our staging and then delivered them to the client’s. The delivery process was organized as follows: every two weeks we sent the client a fully configured test suite for a given functionality area. Each test in the suite was independent, so the developers could evaluate exactly the functionality segment they needed.
Automated testing helped to discover issues such as:
- interruptions when four administrators were active simultaneously;
- no access to the created campaigns upon login;
- time-consuming event and survey creation.
We used the following technologies while testing:
- Java programming language (to write the automated tests);
- Jenkins (to facilitate continuous integration);
- Selenium ChromeDriver (as a tool to automate the web-app testing);
- Appium AndroidDriver (as a tool to automate Android testing);
- Appium iOSDriver (as a tool to automate iOS testing).
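As an illustration of how such UI tests are commonly structured, here is a driver-agnostic Page Object sketch in Java. The `SurveyPage` class, the `Driver` interface and all locators are hypothetical stand-ins so the example stays self-contained; in the real suite the driver role would be filled by Selenium ChromeDriver or an Appium driver:

```java
import java.util.HashMap;
import java.util.Map;

// Tiny stand-in for a WebDriver-like interface so the sketch is
// self-contained; real tests would call Selenium/Appium here instead.
interface Driver {
    void type(String locator, String text);
    void click(String locator);
    String readText(String locator);
}

// Page Object for a (hypothetical) survey-creation page: tests talk to
// intent-level methods instead of raw locators, so locator changes only
// touch this class.
class SurveyPage {
    private final Driver driver;

    SurveyPage(Driver driver) {
        this.driver = driver;
    }

    SurveyPage setTitle(String title) {
        driver.type("#survey-title", title);
        return this;
    }

    void save() {
        driver.click("#save-button");
    }

    String confirmation() {
        return driver.readText("#status-banner");
    }
}
```

The same page objects can back both the web and mobile suites by swapping in the platform-specific driver.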
To fulfill the task, 60 automated tests would have been enough. By the end of the project the customer had received 135 tests: 93 for the web application and 42 mobile tests adapted for the two platforms (iOS and Android).
The tests cover 54% of the overall system functionality and 100% of the main functionality. A consecutive run of 79 tests (13 web app tests, 33 Android tests and 33 iOS tests) took around 20 hours. To optimize the run time, the Performance Lab team proposed running the tests for each platform in parallel.
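The parallel-run idea can be sketched with a standard `ExecutorService`: each platform’s suite becomes a task, so total wall-clock time approaches that of the slowest suite rather than the sum of all three. The suite names and the `ParallelSuites` class are illustrative, not from the actual project:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch: run the web, Android and iOS suites concurrently instead of
// back to back, collecting a pass/fail result per suite.
class ParallelSuites {
    static Map<String, Boolean> runAll(Map<String, Callable<Boolean>> suites) {
        ExecutorService pool = Executors.newFixedThreadPool(suites.size());
        try {
            // Submit every suite first so they all start concurrently.
            Map<String, Future<Boolean>> futures = new LinkedHashMap<>();
            for (Map.Entry<String, Callable<Boolean>> e : suites.entrySet()) {
                futures.put(e.getKey(), pool.submit(e.getValue()));
            }
            // Then wait for each result.
            Map<String, Boolean> results = new LinkedHashMap<>();
            for (Map.Entry<String, Future<Boolean>> e : futures.entrySet()) {
                try {
                    results.put(e.getKey(), e.getValue().get());
                } catch (InterruptedException | ExecutionException ex) {
                    throw new RuntimeException(ex);
                }
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```

In a Jenkins setup the same effect is usually achieved with parallel stages, one per platform.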
This reduced the testing time to 8 hours. Automated tests were scheduled to run every night, so the client could see the results in the morning. At the beginning of each test run we configured a device check that sent an e-mail notification whenever the staging was not functioning correctly; this helped prevent test run interruptions. All automated tests met the client’s requirements and were delivered on schedule, which was an unquestionable achievement for the project.
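The pre-run device check can be sketched as a probe that short-circuits the run and produces the notification text when the staging is down. The `StagingCheck` class and message wording are placeholders; in the real setup the probe would ping the staging and the e-mail would be sent by the CI server:

```java
import java.util.function.BooleanSupplier;

// Sketch of the pre-run check: probe the staging before starting the
// suite and build an alert instead of launching tests that would hang.
class StagingCheck {
    // Returns null when the staging is healthy; otherwise an alert
    // message that the CI job would send by e-mail.
    static String preRunAlert(String stagingName, BooleanSupplier probe) {
        if (probe.getAsBoolean()) {
            return null; // healthy: the nightly run can proceed
        }
        return "Staging '" + stagingName
                + "' is not responding; nightly run aborted.";
    }
}
```

In practice the `BooleanSupplier` would wrap an HTTP health-check of the staging URL, keeping the decision logic itself trivially testable.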