2023 User Acceptance Testing (UAT) Survey Report

Download the results from TestMonitor’s comprehensive 2023 UAT Survey.



The last 12 months have proven very hectic in almost every industry. Many organizations have found it difficult to plan their next steps, adjust processes for new customer expectations, and maintain the pace that today’s customers expect in a highly competitive software marketplace.

It is more important than ever to have the latest insights and trends at your fingertips, especially in the software testing and quality assurance (QA) world.

To give your organization the data it needs, TestMonitor conducted our annual poll of the software development community to learn more about how they approach UAT testing, better understand common hurdles and outcomes, and collect the latest tips and best practices you can use to refine your QA processes to have a strong 2023 and beyond.

The following is a summary of those results as well as the key takeaways and lessons learned.

Download the survey results by filling out the form below.


Section 1

Overview of Survey Respondents

Throughout the fall of 2022, TestMonitor polled professionals from around the world and across the software development industry, including project managers, QA testers, engineers, developers, and more.

There were respondents from a range of industries, including IT services, government, and healthcare as well as entertainment, financial services, and telecommunications. Overall, the top three industry groups represented were:

  • IT services organizations (30 percent)
  • Government and public sector (14 percent)
  • Healthcare (14 percent)

Survey respondents represented organizations from around the world with a diverse portfolio of testing responsibilities, including:

  • 51 percent of respondents testing mobile applications
  • 64 percent of respondents testing enterprise applications
  • 74 percent of respondents testing web-based applications

When asked about the ultimate aim of pursuing UAT testing at their organization, 60 percent of respondents said testing was used to build the best product for their audience, 22 percent said it was used to collect end-user feedback, and 18 percent said it helped ensure compliance.

Ultimate Goal of UAT Testing Graphic


Section 2

The Structure and Approach of UAT Testing

Functional Testing

Even with so much buzz about automation and artificial intelligence, automated testing is used by only 34 percent of respondents for functional testing, compared to manual testing at 86 percent. In fact, the share of respondents who used automated testing for functional testing dropped from 49 percent to 34 percent. What is clear, however, is that both forms of testing—manual and automated—remain common practices across the industry.

Non-Functional Testing

When it comes to non-functional testing, 36 percent of respondents manually run scripts for usability, load, and security tests, while 24 percent run these tests with automated testing tools.

UAT Testing Cycle Frequency

In the 2021 survey, 66 percent of respondents performed UAT tests at least once a month rather than on weekly or yearly cycles. Among respondents to the 2022 survey, testing frequency was more evenly spread:

  • 34 percent performed UAT testing one or more times per week
  • 32 percent performed UAT testing one or more times per month
  • 20 percent performed UAT testing one or more times per year

Tools That Enable UAT

Similar to the results from the 2021 survey, respondents in 2022 confirmed a wide variability in how organizations use software to enable UAT.

In particular, 44 percent of respondents use specialized test management tools, such as testing and automation applications, while 20 percent lean solely on other generic tools, such as word processing and spreadsheet software. About 32 percent of respondents noted that their team uses both types of software to facilitate UAT.


Most Time-Consuming Aspect of UAT

Responses to the question about which aspect of running UAT was the most time-consuming showed that the preparation phase of “designing tests” took the longest (46 percent). “Planning tests” and “analyzing test results” were selected by 10 percent and 14 percent, respectively. The remaining 30 percent of respondents stated that “running tests” is the most time-consuming aspect of their quality assurance process.


Section 3

The UAT Testing Team Composition

Not every software testing effort is able to include end users in its UAT testing, and this shows in the responses. When asked if end users are involved in UAT efforts, 40 percent said “yes, always,” while 26 percent said “sometimes,” and 10 percent said “no.”

When asked how many end users are involved in their UAT efforts, the responses were:

  • 36 percent said 1-5 end users
  • 23 percent said 6-10 end users
  • 10 percent said 11-25 end users


Section 4

UAT Testing Outcomes

The Most Challenging Aspect of UAT Testing

Nearly one-third (32 percent) of respondents shared that the most challenging part of UAT testing was “designing tests,” while 26 percent selected “planning tests.” “Running tests” was the biggest challenge for 22 percent of respondents, and “analyzing results” was the biggest hurdle for 20 percent.

The Quality of UAT Testing

While all testing teams identified challenges to their UAT testing efforts, more than half (52 percent) of respondents believe that the overall quality of their approach is effective, rating it at least a 7 out of 10.

Forty-four percent of the respondents rated their UAT testing as a 6 or worse in terms of effectiveness, and 10 percent selected 4 or worse.

Section 5

Testing Team Working Model

Hybrid vs. In-Person Testing 

The 2022 survey expanded the number of questions focused on where organizations were performing their UAT testing, how this compared to the past, and how they expected it to change in the future.

Before the pandemic, 70 percent of UAT testing teams worked “mostly or primarily” in the office while 24 percent said that they had a hybrid model of in-office and remote work. Only 6 percent worked mostly remote before the pandemic.

In the last year, as expected, the results flipped dramatically. In 2022, the number of teams stating that they worked a hybrid model jumped to 70 percent while 14 percent worked mostly remote. Only 16 percent of teams worked mostly or primarily in the office.

The Impact of the Economy

Like many aspects of operations, UAT testing wasn’t immune to the effects of a shifting economic outlook. Forty-six percent of respondents stated that there were “minimal to no impacts” on their team, and 28 percent noted a reduction in their testing budgets. Only 18 percent noted an increase in access to tools, resources, and budgets.



Section 6

Best Practices Shared by Survey Respondents

In addition to sharing their perspectives on the logistics, quality, and structure of their UAT testing, respondents were also asked to share best practices.

The feedback focused on four main themes:

  • Structure bug tracking: As noted above, handling results can be a challenge for some UAT teams. Many respondents recommended creating a consistent standard for communicating identified bugs, prioritizing them, and tracking their resolution.

  • Plan tests with requirements gathering: While collecting software requirements, use that time to plan how, and with what kinds of tests, each requirement can be evaluated as satisfied or not.

  • Test design engagement: Respondents recommended involving testers as early as possible in test design to address their requirements, work styles, and communication preferences.

  • Test data preparation: Respondents also recommended preparing test data for use during UAT testing before reaching the testing phase. This can save a significant amount of time and uncertainty for testers as they begin to prepare for test execution.


Section 7

Bringing It All Together

The underlying industry trends from this year’s UAT Survey were quite clear: an increase in hybrid and remote work, a focus on tester engagement, and teams putting more focus on designing and planning tests. These trends bring common challenges, but the right test management tool helps organizations overcome all of these hurdles with an intuitive, cloud-based platform.

TestMonitor is specifically designed to make manual, time-consuming processes simpler and smoother. This includes facilitating test case design, storage, and test-run sequencing, as well as enabling collaboration down the hall or around the world. All of these benefits enable your team to focus more time and energy on building a better product for your customers.

Ultimately, we hope you use these survey results to better understand quality assurance trends across the larger community and refine your own internal testing approach.




Download a PDF version of this report to save for later or share with a colleague. 

Simply fill out this form and get immediate access to receive your own copy of our UAT Survey Results.
