The Client
Operating since the late 1990s, this global company, founded and headquartered in London, England, has achieved market leadership in the business analytics sector and maintains a physical presence in over 40 countries. In addition to analytical solutions and information services, the client also specializes in credit services and marketing.
The Situation
To diversify its product range, the client developed a system that allows citizens in England to create a unique digital identity for accessing online services. The product underwent rapid development and growth cycles, with each upcoming release expected in a shorter timeframe. This necessitated additional resources for overall quality assurance.
The Challenge
Quality House undertook a significant portion of the testing, which included Manual Testing, Regression Testing, Migration Testing, Integration Testing, Cross-Browser Testing, Mobile Testing, Performance Testing and Sanity Testing. All of this work, together with the automation of all test cases, had to follow a strict and punishing schedule. Factors such as insufficient documentation and the geographical fragmentation of the team assigned to the project strained the work process, complicated planning and hampered communication.
The Objective
In addition to researching and organizing the work assigned to the various types of testing, Quality House specialists set up test environments for all testing activities and automated the test cases. This included test case preparation, execution, result verification and maintenance. Automated tests had to achieve a pass rate of over 97%, while manual testing had to improve the product's quality and then maintain it at that level.
The Solution
The first step in Manual Testing was a thorough review of the requirements documentation, which served as the basis for test case preparation and execution. The same approach extended to the work done on cross-browser testing, integration testing, migration testing and bug reporting. Quality House successfully met the quality criteria set by the client. We worked in Scrum to establish a steady work pace and relied heavily on Jira and TestLink.
Automation started with the creation and automation of test cases, followed by result analysis, verification and maintenance. The tools we relied on included Ruby, Selenium-WebDriver, STAF, Cucumber for Behaviour Driven Development and regular expressions as a supporting tool for Cucumber. Quality House employed the Page Object pattern and used MongoDB as the database. All server machines were virtual, hosted on vCloud, with the application servers running on Windows Server 2008.
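As an illustration of how these pieces fit together, the sketch below shows a minimal Page Object and the Cucumber step definitions that drive it through Selenium-WebDriver. The page name, locators and URL are hypothetical placeholders, not the client's actual identity screens.

```ruby
# login_page.rb -- a Page Object wrapping one screen (hypothetical page,
# locators and URL; the client's real identity screens are not reproduced).
require 'selenium-webdriver'

class LoginPage
  LOGIN_URL = 'https://identity.example.com/login'.freeze # placeholder URL

  def initialize(driver)
    @driver = driver
  end

  # Navigate to the page and return self so calls can be chained.
  def open
    @driver.navigate.to LOGIN_URL
    self
  end

  def sign_in(username, password)
    @driver.find_element(id: 'username').send_keys(username)
    @driver.find_element(id: 'password').send_keys(password)
    @driver.find_element(css: 'button[type="submit"]').click
  end

  def welcome_message
    @driver.find_element(css: '.welcome-banner').text
  end
end

# login_steps.rb -- Cucumber step definitions; the regular expressions in the
# step patterns capture the values written in the feature file.
Given(/^I am on the login page$/) do
  @driver ||= Selenium::WebDriver.for :firefox
  @login_page = LoginPage.new(@driver).open
end

When(/^I sign in as "([^"]*)" with password "([^"]*)"$/) do |user, password|
  @login_page.sign_in(user, password)
end

Then(/^I should see "([^"]*)" in the welcome banner$/) do |expected|
  # Plain assertion to keep the sketch free of extra test dependencies.
  unless @login_page.welcome_message.include?(expected)
    raise "Expected welcome banner to include #{expected.inspect}"
  end
end
```

Keeping the selectors inside the Page Object means that when the UI changes, only that one class needs updating, while the step definitions and feature files stay stable.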
The specialists assigned to the project also worked on automating the integration, cross-browser and migration testing. After each test run, the team created detailed reports for bug reporting and tracking. The client also requested that newly delivered features be automated.
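For the cross-browser runs, one common approach, sketched below on the assumption that the suite is driven by Cucumber hooks (the project's actual configuration is not documented here), is to create the WebDriver instance in a Before hook and select the browser through an environment variable, so the same step definitions run unchanged against each target browser.

```ruby
# features/support/env.rb -- a sketch of parameterizing the browser for
# cross-browser execution (hypothetical configuration, not the project's
# actual setup).
require 'selenium-webdriver'

Before do
  # Choose the browser from an environment variable, e.g.
  #   BROWSER=chrome cucumber
  #   BROWSER=firefox cucumber
  browser = (ENV['BROWSER'] || 'firefox').to_sym
  @driver = Selenium::WebDriver.for(browser)
end

After do
  # Always close the browser, even when a scenario fails.
  @driver.quit if @driver
end
```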
Despite all the work done, the team missed delivery dates as deadlines continued to shrink with each new assignment. To mitigate this, Quality House introduced planning meetings at earlier project stages and proposed improving the development team's awareness during the early stages of developing a new feature. As a result, the workflow improved significantly.
The Conclusion
In the end, Quality House concluded the project on time and delivered fully functioning software that met every client requirement and achieved the target pass rate in both manual and automated testing. The product's many features operate as specified on mobile devices as well as on desktop, providing an intuitive user experience.