
CrossBrowserTesting vs OpenText Functional Testing comparison

 

Comparison Buyer's Guide

Executive Summary
Updated on Mar 29, 2026

Review summaries and opinions

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

Categories and Ranking

CrossBrowserTesting
Ranking in Functional Testing Tools: 27th
Average Rating: 9.0
Reviews Sentiment: 7.6
Number of Reviews: 19
Ranking in other categories: none

OpenText Functional Testing
Ranking in Functional Testing Tools: 3rd
Average Rating: 8.0
Reviews Sentiment: 6.6
Number of Reviews: 98
Ranking in other categories: Mobile App Testing Tools (2nd), Regression Testing Tools (3rd), API Testing Tools (5th), Test Automation Tools (4th)
 

Mindshare comparison

As of May 2026, in the Functional Testing Tools category, the mindshare of CrossBrowserTesting is 1.5%, up from 0.8% the previous year. The mindshare of OpenText Functional Testing is 6.8%, down from 9.9% the previous year. Mindshare is calculated from PeerSpot user engagement data.
Functional Testing Tools Mindshare Distribution
Product: Mindshare (%)
OpenText Functional Testing: 6.8%
CrossBrowserTesting: 1.5%
Other: 91.7%
 

Featured Reviews

CN
Senior DevOps Engineer at a financial services firm with 10,001+ employees
Knowledgeable support, scalable, and stable
We use CrossBrowserTesting for testing our web-based applications. We had some issues with the onboarding process, and the cloud connectivity could improve. I have used CrossBrowserTesting within the past 12 months. CrossBrowserTesting is stable. I have found CrossBrowserTesting to be scalable.…
Kevin Copple - PeerSpot reviewer
Sr. Quality Assurance Project Manager at a tech services company with 501-1,000 employees
Has supported faster test execution and increased flexibility while offering room to improve support responsiveness
Reducing the levels of support is something they could continue to improve. They tend to have an entry-level person that may not be as familiar with the product that fields the calls, which creates another day of delay to get to the level that's needed. This is a common practice across most companies where you call, you get the entry-level person, and then they work their way up to help screen calls so that they are more focused.

Quotes from Members


Pros

"The CrossBrowserTesting Selenium API and live test features have greatly improved our team's ability to quickly and effectively perform QA."
"Aside from speeding up our processes, it also allowed us to tie in our automated test scenarios and integrate our reporting tools to make the entire process efficient and hassle free."
"With screenshots, I can quickly verify a page looks good universally in minutes."
"The ability to choose from many devices is the best feature."
"The support team is top-notch. I have a great relationship with them. They are extremely honest and responsive."
"We no longer need to have a full QA team, testing is more quickly and reliably reproduced, and scheduled daily tests assist in catching any bugs which fall through the cracks and make it to the production environment."
"I can run a page through the screenshot tool, then send a URL with the results to my team."
"The ability to replay sessions is valuable for tracking down issues."
"The most valuable feature is that it is fast during test execution, unlike LoadRunner."
"The most valuable feature for me is that it works on multiple platforms and technologies."
"It's easy to use for beginners and non-technical people."
"I like the fact that you can record and play the record of your step scripts, and UFT One creates the steps for you in the code base. After that, you can alter the code, and it's more of a natural language code."
"OpenText Functional Testing has an impressive ability to connect to mobile devices and its ability to test so many different types of software, whether it be mainframe, APIs, mobile, web, or desktop."
"When compared with UFT and manual execution, we have definitely saved a lot of effort, somewhere in the range of 60 to 70 percent when compared with our efforts to manually test."
"Its ease-of-use; it doesn’t take long to train staff on it, and our third-party script developers find it easy to up-skill staff to use UFT."
"UFT is easy to use for functional testing, so for me it’s very important that it can travel across a large range of technologies."
 

Cons

"The "Getting Started" documentation for Selenium testing could be improved."
"The screenshot tool defaults to a screen layout instead of a full page test. I find it a bit cumbersome that I can't have it run a full screenshot as my default."
"Being able to test on real devices via the virtual connection is wonderful, but it can cause some lag and load time issues while testing."
"Being able to test on real devices via the virtual connection is wonderful, but it can cause some lag and load time issues while testing."
"We had some issues with the onboarding process and the cloud conductivity could improve."
"There should be more detailed training on CrossBrowserTesting."
"I have had quite a few issues trying to use a virtual machine to test our application on."
"The five minute timeouts can cause irritation if you have just popped away to consult some supporting documentation."
"I'd like to see UFT integrated more with some of the open source tools like Selenium, where web is involved."
"It should consume less CPU, and the licensing cost could be lower."
"It often crashes."
"Object identification has room for improvement, to make it more efficient."
"Perhaps more coverage as far as different languages go. I'm talking more about object identification."
"The application can be buggy at times and takes up a lot of memory on your PC."
"It looks like User Acceptance Testing of the product is getting bypassed entirely because this design has precedence in UFT."
"Cost is the biggest issue with UFT. It is not cheap."
 

Pricing and Cost Advice

"CrossBrowserTesting offered the best value for its price."
"SmartBear offers bundles of products that work together."
"It is worth the pricing as the product is supported on multiple platforms and browsers."
"The lowest price point is very reasonable. It is also useful if only one person in the company needs to check on the browser display."
"A few intermediary pricing options for small QA teams would be nice, e.g., unlimited screenshots, "as you need it" parallel tests, etc."
"The way the pricing model works is that you pay a whole boatload year one. Then, every year after, it is around half or less. Because instead of paying for the new product, you are just paying for the support and maintenance of it. That is probably one of the biggest things that I hear from most people, even at conferences, "Yeah, I would love to use UFT One, but we don't have a budget for it.""
"The price is only $3,000. I don't know how many QA analysts you would have in any given company. Probably no more than five or 10. So if it's a large corporation, it can easily afford $15,000 to $25,000. I don't see that being an issue."
"Its price is reasonable compared to other vendors."
"HPE recently extended the demo license period from 30 days to 60 days which was a very wise and popular decision to give potential customers more time to install it and try it for free. Even if your company has a salesperson come in and demo UFT, I would highly encourage at least one of your developers or automation engineers to download and install it to explore for themselves the functionality and features included during the demo trial period."
"The solution is priced reasonably for what features it is providing. However, it might be expensive for some."
"For the price of five automation licenses, you simply would not be able to hire five manual testers for two years worth of 24/7 manual testing work on demand."
"It's a yearly subscription. There are no additional costs to the standard subscription."
"The pricing of the product is an issue."
Use our free recommendation engine to learn which Functional Testing Tools solutions are best for your needs.
893,244 professionals have used our research since 2012.
 

Top Industries

By visitors reading reviews
CrossBrowserTesting
Comms Service Provider: 10%
Computer Software Company: 10%
Construction Company: 9%
Transportation Company: 7%

OpenText Functional Testing
Manufacturing Company: 21%
Financial Services Firm: 15%
Computer Software Company: 7%
Retailer: 5%
 

Company Size

By reviewers

CrossBrowserTesting
Small Business: 9
Midsize Enterprise: 5
Large Enterprise: 10

OpenText Functional Testing
Small Business: 20
Midsize Enterprise: 13
Large Enterprise: 73
 

Questions from the Community

How does Micro Focus UFT One compare to Tricentis Tosca?
We reviewed MicroFocus UFT One but ultimately chose to use Tricentis Tosca because we needed API testing. MicroFocus UFT is a performance and functional testing tool. We tested it, and it was well...
What needs improvement with Micro Focus UFT One?
Reducing the levels of support is something they could continue to improve. They tend to have an entry-level person that may not be as familiar with the product that fields the calls, which creates...
What is your primary use case for Micro Focus UFT One?
I'm more familiar with Functional Testing. OpenText Functional Testing for Developers is a different product set that functions as an IDE for writing custom code. We don't leverage that product bec...
 

Also Known As

CrossBrowserTesting: No data available
OpenText Functional Testing: Micro Focus UFT One, Micro Focus UFT (QTP), QTP, Quick Test Pro
 

Overview

 

Sample Customers

CrossBrowserTesting: St. Jude Children's Research Hospital, Accenture, Sony, Los Angeles Times, ADP, Verizon, T-Mobile, Wistia
OpenText Functional Testing: Sage, JetBlue, Haufe.Group, Independent Health, Molina Healthcare, Cox Automotive, and TMNA Services
Find out what your peers are saying about CrossBrowserTesting vs. OpenText Functional Testing and other solutions. Updated: April 2026.