How we manually test our applications to create high-quality software

We pride ourselves on delivering software solutions that are robust, polished, and user-friendly. Our manual testing process is one of the ways we achieve this. By carrying out this kind of testing as early as possible in the implementation process, we make a solution far more likely to be secure, reliable, efficient, and maintainable, and to provide a high standard of usability.

What is manual testing?

Manual testing is the practice of testing software by manually using the system, without any automated tools. This testing is completed once a piece of functionality has been created or altered, but before it is delivered to end users.

The main goal of this testing is to confirm that the system meets the requirements defined in the functional specification. In addition to this, manual testing improves the quality of our applications since it is always performed by a developer who did not develop the work that they are testing: a fresh pair of eyes. This means that every change made during a project build phase is scrutinised and critiqued by at least two people.

Isn't automated testing all the rage these days?

When appropriate, automated tests are also used to check that the system conforms to the specified requirements.

Once created, these tests can be run on demand with minimal developer input. It is highly unlikely, however, that a series of automated tests will be able to check that the system meets every single requirement. Furthermore, an automated test cannot investigate or provide feedback on a user's experience when using an application.

Manual testing overcomes these limitations and therefore forms a crucial part of the development process.

Manual testing process

We employ a structured approach to manual testing that consists of the following steps:


  1. Plan test cases
  2. Perform tests
  3. Log any bugs or issues
  4. Write a test report

Prior to carrying out these steps, some preliminary testing may be performed. This testing puts the tester in the shoes of a normal user, who may have never used the system before, and therefore does not have an understanding of how it works. This step can provide valuable feedback on the intuitiveness and usability of an application.

Step 1: Planning

Initially, but following any preliminary testing, the tester will make sure that they fully understand the requirements for the piece of functionality that they are testing so that they can create a thorough test plan. This plan details a number of test cases, with each one representing a single executable action that can be repeated. The test plan is designed so that executing each one of the test cases verifies that all of the specified requirements have been fulfilled.
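To make the idea concrete, here is a minimal Python sketch of how such a plan might be represented. The class and field names are illustrative only, not part of Enable's actual tooling:

```python
from dataclasses import dataclass, field

# A minimal sketch of a test plan; all names here are hypothetical,
# not Enable's real process artefacts.
@dataclass
class TestCase:
    case_id: str
    description: str          # the single executable action to perform
    expected_outcome: str     # what the requirement says should happen
    configuration: str = ""   # e.g. browser, data volume, user permissions

@dataclass
class TestPlan:
    feature: str
    cases: list = field(default_factory=list)

    def add_case(self, case: TestCase) -> None:
        self.cases.append(case)

plan = TestPlan(feature="User login")
plan.add_case(TestCase(
    case_id="TC-1",
    description="Submit the login form with valid credentials",
    expected_outcome="The user is redirected to their dashboard",
    configuration="Tested in both Chrome and Firefox",
))
```

Executing every case in such a plan, under each of its listed configurations, is what verifies that all of the specified requirements have been fulfilled.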

To fully test a requirement is not always as simple as just executing a number of test cases, and sometimes the configuration of the testing environment must be adjusted. For example, different browsers may need to be used, or the data volumes in the system increased, or we might need to test as a user with different permissions. These configuration details are all identified and outlined in the test plan to increase the thoroughness and repeatability of our test cases.

Step 2: Execution

Each test case detailed in the plan is executed and its result recorded. At Enable, we are not satisfied merely with ensuring that the application meets its specified requirements; we also make sure that it is intuitive, attractive, easy to use, and of the quality that our clients expect.

A test case is marked as passing only if the actual outcome matches the expected one and the quality of the implementation meets our high standards. If the requirement is met but this quality standard is not, the case is flagged for further review and action is taken to improve the implementation. A case is marked as failing if the system does not meet the requirement.
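This pass/flag/fail decision can be sketched as a small function; the status names below are hypothetical, not the labels used in any real tool:

```python
def grade_test_case(requirement_met: bool, quality_ok: bool) -> str:
    """Grade a single manual test case (illustrative status names).

    A case passes only when the actual outcome matches the expected
    one AND the implementation meets the quality bar; a correct but
    rough implementation is flagged for review; anything else fails.
    """
    if requirement_met and quality_ok:
        return "pass"
    if requirement_met:
        return "flagged"   # works, but quality needs improvement
    return "fail"
```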

We carry out all of our manual testing in an environment that mirrors the production environment as closely as possible. This gives us confidence in the validity and accuracy of our manual testing.

A member of the Enable development team, in Stratford-upon-Avon

Step 3: Logging

Any issues that are identified while executing our manual tests are logged as tasks within a project management system that allows us to track and manage work across the team.

Each separate issue that a tester identifies will be logged as a separate task, with information detailing how to reproduce the issue. Each task is then assigned to the developer who worked on the implementation of the feature so that they can fix it. Once they have resolved the problem, it is assigned back to the tester so that they can repeat the testing for any associated test cases. This iterative process of executing test cases, logging issues, resolving issues, and then re-executing test cases is repeated until we are satisfied that every test case is passing.
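The iterative cycle above might be sketched like this, with every callable standing in for a manual step (all four are hypothetical stand-ins, not real APIs):

```python
def retest_until_passing(cases, execute, log_issue, fix):
    """Sketch of the execute -> log -> fix -> re-execute cycle.

    `execute` runs one test case and returns True on pass,
    `log_issue` records a failing case as a task for the original
    developer, and `fix` resolves it. All names are illustrative.
    """
    outstanding = list(cases)
    while outstanding:
        failed = [case for case in outstanding if not execute(case)]
        for case in failed:
            log_issue(case)   # track the issue so it cannot be lost
            fix(case)         # developer resolves, then hands back
        outstanding = failed  # re-execute only the cases that failed
```

The loop terminates exactly when every test case is passing, which mirrors the process described above.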

By logging and tackling issues in this way, we ensure that every one of the issues that we discover is tracked and cannot be forgotten about or lost.

Step 4: Documentation

Once we have completed our testing, we create a test report that documents all of the test cases that have been completed. Documenting each test case in this way allows any developer to repeat the tests at a later date to ensure that the software has not regressed.
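As a rough illustration, a report summarising case results could be generated along these lines; the layout and names are assumptions, not a real report format:

```python
def write_test_report(feature, results):
    """Build a plain-text test report from case results.

    `results` maps a case id to its status string; both the names
    and the layout here are illustrative.
    """
    lines = [f"Test report: {feature}", "-" * 30]
    for case_id, status in sorted(results.items()):
        lines.append(f"{case_id}: {status}")
    passed = sum(1 for status in results.values() if status == "pass")
    lines.append(f"{passed}/{len(results)} cases passing")
    return "\n".join(lines)
```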


Strong manual testing results in high-quality software. The thorough manual testing process that we have established at Enable limits the number of issues in the solutions we build, and ensures that all of the work that we deliver meets our high standards. By getting the requirements for the solution right the first time, Enable's clients are free to focus on what matters: profitable growth, business transformation, and achieving their company's vision.

David Hunt
