Structure and Approach of UAT
Despite growing awareness of automated testing, survey respondents still rely on manual testing to run functional tests.
In fact, 96 percent of respondents use manual testing to run functional tests, while just under half (49 percent) supplement their functional tests with automated testing.
For non-functional testing, 45 percent of respondents manually run scripts for usability, load, and security tests, whereas 28 percent run these tests with automated testing tools.
The Frequency of UAT
Interestingly, there was wide variability in the frequency with which respondents perform UAT during their projects or cycles.
UAT is not a single activity; it is a process that encompasses several steps and benchmarks, so testing cadences naturally differ across teams.
Notably, of those surveyed, 66 percent perform UAT at least once a month.
The Tools That Enable UAT
Respondents also demonstrated that there is still wide variability in how organizations use software to enable UAT.
In particular, 43 percent of respondents use specialized test management tools, such as testing and automation applications, whereas 23 percent rely solely on generic tools, such as word processing and spreadsheet software. About 30 percent of respondents noted that their team uses both types of software to facilitate UAT.
The Most Time-Consuming Aspect of UAT
When it comes to the mechanics of running UAT, respondents showed that each phase of the testing process comes with challenges.
In particular, 36 percent of respondents stated that running tests is the most time-consuming aspect of their quality assurance process, followed by designing test cases and then planning test runs.
About 13 percent of respondents noted that analyzing test results and performing reporting activities are their most time-consuming activities.