From manual testing to automated testing
Our client is one of the largest commercial banks and one of the leading domestic banks in terms of key performance indicators such as equity, assets, profit, and the number of credit cards in circulation.
The customer uses its core banking system to automate its operations, accounting, and management systems. This system is one of the most widely used and effective automated banking systems.
In the summer of 2017, the Bank contacted Performance Lab with a request to automate testing of the core banking system.
The banking system includes Platform 1, which was the object tested in the project.
Platform 1 is a technology platform for the Retail Module, Management Accounting, Front Office, Budget Planning for corporate customers, and the Data Warehouse, as well as other systems built on it. It can handle high transactional loads, supporting thousands of users and millions of documents.
Application model data warehouse: Oracle Server tables and views;
Business logic: Oracle Server stored procedures;
User workspace (Navigator): a universal client that implements the logic to display the application model to the end user.
One of the main advantages of the core banking system is the powerful tools for creating bank statements. Financial regulators’ requirements for them change periodically, so updating system components means not only improving performance and expanding functionality, but also updating bank statements to comply with the law. These revisions are applied locally and as part of quarterly software updates.
Accordingly, the updates must be deployed as quickly as possible. However, they affect the basic functionality of the core banking system. They cannot be deployed without full testing, and the regression test model is constantly growing in size. The nature and depth of the changes require a high level of competency to verify them, so the burden falls largely on the shoulders of specialized analysts. Consequently, the costs of regression testing increase with each new update.
Automation tools could not recognize the application’s main menu, which is the entry point and an integral part of all business operations;
The contents of the tables that implement most of the core banking system’s functionality are not read beyond the visible area, so the time needed to retrieve table entries can grow unpredictably;
The Bank’s regression model was only partially documented: only specialized analysts knew how the system behaved when performing certain business operations.
The main test automation tool was Winium, a free, Selenium-based tool for automating Windows desktop applications (Selenium itself is a tool for automating web applications).
To work around the inability to access the application’s main menu and to handle UI objects that resist automation, we developed our own C++ solution, the System for Automated Testing of Bank Information Systems (SATBIS): a library that proxies ActiveX requests in order to obtain information about menu objects and provide access to the toolbar.
The JDBC driver was used to work with the database. Both JDBC and Winium support Java, which was chosen as the main development language, and the Java Native Interface (JNI) was used to integrate SATBIS into the framework. The framework itself was built on JUnit.
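As a sketch of how a stored procedure can be driven through JDBC in such a framework, the example below uses the standard `CallableStatement` API. The procedure name `deposits.open_deposit` and its parameter layout are invented for illustration, not the Bank's actual API; the small helper that builds the JDBC escape syntax is separated out so it can be verified on its own.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.Types;

public class ApiLevelCheck {

    // Build the JDBC escape syntax for a stored-procedure call,
    // e.g. callSyntax("deposits.open_deposit", 3) -> "{call deposits.open_deposit(?,?,?)}"
    static String callSyntax(String procedure, int paramCount) {
        StringBuilder sb = new StringBuilder("{call ").append(procedure).append('(');
        for (int i = 0; i < paramCount; i++) {
            sb.append(i == 0 ? "?" : ",?");
        }
        return sb.append(")}").toString();
    }

    // Hypothetical API-level step: call a stored procedure and read an OUT parameter.
    // The connection would come from the Oracle JDBC driver in the real framework.
    static String openDeposit(Connection conn, String accountFormat, double amount) throws Exception {
        try (CallableStatement cs = conn.prepareCall(callSyntax("deposits.open_deposit", 3))) {
            cs.setString(1, accountFormat);            // IN: account number format
            cs.setDouble(2, amount);                   // IN: deposit amount
            cs.registerOutParameter(3, Types.VARCHAR); // OUT: new account number
            cs.execute();
            return cs.getString(3);
        }
    }

    public static void main(String[] args) {
        System.out.println(callSyntax("deposits.open_deposit", 3));
        // prints {call deposits.open_deposit(?,?,?)}
    }
}
```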
We used Jenkins to support the continuous integration process, while the Yandex.Allure framework, which we tailored to the customer’s needs, was responsible for generating test automation reports.
The overall testing scope was divided into UI tests and API tests.
UI tests involve automating the test by simulating the actions of a real system user, e.g. filling out forms and fields and clicking on interface elements.
API tests call stored procedures, sending certain inputs and analyzing the values received in the response.
Moreover, some tests were written at both the UI and API levels in order to optimize and reduce the execution time. This approach let us accelerate the automation of business processes and achieve a significant increase in productivity, since auxiliary operations unrelated to the tested business processes were moved from the slow UI level to the high-speed API level that works directly with the database.
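The split described above can be sketched as follows. The stub implementations stand in for the real JDBC and Winium layers, and all the names (`ApiSteps`, `UiSteps`, `createTestClient`, and so on) are illustrative rather than taken from the project:

```java
// Sketch of the hybrid split: auxiliary operations go through a fast API layer
// (stored procedures via JDBC in the real project), while the step under test
// stays at the UI level.
public class HybridScenario {

    // Auxiliary setup done at the API level (fast, straight to the database).
    interface ApiSteps {
        String createTestClient();            // e.g. an INSERT via a stored procedure
        void fundAccount(String client, double amount);
    }

    // The business operation actually being verified, driven through the UI.
    interface UiSteps {
        boolean openDepositThroughForms(String client, double amount);
    }

    static boolean runDepositScenario(ApiSteps api, UiSteps ui) {
        // Fast path: prepare test data without touching the UI.
        String client = api.createTestClient();
        api.fundAccount(client, 1000.0);
        // The slow UI path is reserved for the step the test is actually about.
        return ui.openDepositThroughForms(client, 1000.0);
    }

    public static void main(String[] args) {
        ApiSteps api = new ApiSteps() {
            public String createTestClient() { return "client-42"; }
            public void fundAccount(String client, double amount) { /* no-op stub */ }
        };
        UiSteps ui = (client, amount) -> client.startsWith("client-") && amount > 0;
        System.out.println(runDepositScenario(api, ui)); // prints true with these stubs
    }
}
```

The point of the interface boundary is that the same scenario can be rebalanced later: a step can be moved from the UI implementation to the API implementation without rewriting the test logic.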
To address the lack of a regression test model, ensure high test coverage, and create flexible test scenarios, our top specialists were involved in the functional testing. They handled all communications with the customer’s specialized analysts and, thanks to the close cooperation of both parties, the Bank’s analysts’ knowledge of the tested business processes was successfully transferred to the test model. Among other things, the specialists audited the Bank’s existing test model and extended and adapted it as needed for automation.
The functional scenarios selected for automation specified every parameter the automated tests would use, along with the method of obtaining each parameter’s value (SQL query, data file, generation from a template, or hard-coded value). This approach let us create flexible, parameterizable automated tests that are not tied to a specific banking product or product configuration and remain usable even when the Bank’s product line is updated.
For example, we wrote a test for opening a term deposit, in which the deposit type is selected randomly from a list of deposit types, and other parameters, such as the minimum and maximum deposit amount, deposit term, account number format, and acceptable currencies, are determined based on the deposit type by extracting the relevant parameters from the database.
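A minimal sketch of that parameter lookup is shown below. In the real tests the deposit types and their constraints came from SQL queries against the database; here a map stands in for those queries, and the product names and limits are invented:

```java
import java.util.List;
import java.util.Map;
import java.util.Random;

public class DepositParams {

    // Constraints derived from the deposit type (fetched via SQL in the real tests).
    record Constraints(double minAmount, double maxAmount, int termMonths, String currency) {}

    // Stand-in for the database lookup; all products here are hypothetical.
    static final Map<String, Constraints> BY_TYPE = Map.of(
            "Savings", new Constraints(100.0, 50_000.0, 6, "USD"),
            "Term-12", new Constraints(500.0, 100_000.0, 12, "EUR"),
            "Pension", new Constraints(50.0, 10_000.0, 24, "USD"));

    // Pick a deposit type at random, then derive the remaining parameters from it.
    static Map.Entry<String, Constraints> pick(Random rnd) {
        List<String> types = List.copyOf(BY_TYPE.keySet());
        String type = types.get(rnd.nextInt(types.size()));
        return Map.entry(type, BY_TYPE.get(type));
    }

    // A generated test amount is valid only inside the chosen type's range.
    static boolean amountValid(Constraints c, double amount) {
        return amount >= c.minAmount() && amount <= c.maxAmount();
    }

    public static void main(String[] args) {
        var chosen = pick(new Random());
        System.out.println(chosen.getKey() + " -> " + chosen.getValue());
    }
}
```

Because the constraints travel with the randomly chosen type, one test body can exercise every product in the list without per-product configuration.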
This approach produced an increase in test coverage without additional costs to the customer, because the same automated test can be run repeatedly to check the business process on several product types without having to first perform a complicated configuration.
We successfully completed the tasks assigned under the project: Performance Lab helped the Bank reduce the time required for regression testing and enabled it to independently expand and automate the test model without slipping schedules or reducing product quality, keeping time frames and costs under control.
We also provided full assistance in deploying the product on the Bank’s infrastructure, along with technical support for the developed solution.
In summary, we can highlight the following key results of this project:
We created the technical framework for developing an in-house testing process at the Bank;
A test automation center of excellence was established at the Bank;
Regression testing costs were reduced;
Regression testing coverage increased.
With results like this, we can confidently say that the groundwork has been laid for further fruitful cooperation and that Performance Lab is ready to continue partnering with the Bank, including on automating the rest of the regression model, and with other organizations that use the core banking system in their business operations.