
BlazeMeter vs OpenText Silk Performer comparison

 


Categories and Ranking

BlazeMeter
Ranking in Load Testing Tools: 4th
Average Rating: 8.2
Reviews Sentiment: 7.5
Number of Reviews: 47
Ranking in other categories: Performance Testing Tools (4th), Functional Testing Tools (6th), Test Automation Tools (5th)

OpenText Silk Performer
Ranking in Load Testing Tools: 16th
Average Rating: 8.0
Number of Reviews: 1
Ranking in other categories: none
 

Mindshare comparison

As of November 2024, in the Load Testing Tools category, the mindshare of BlazeMeter is 16.3%, up from 13.1% the previous year. The mindshare of OpenText Silk Performer is 1.1%, down from 2.0% the previous year. Mindshare is calculated from PeerSpot user engagement data.
 

Featured Reviews

Bala Maddu - PeerSpot reviewer
Reduced our test operating costs, provides quick feedback, and helps us understand how to build better test cases
Overall, it's helped our ability to address test data challenges. The test data features on their own are very good, but version control for test data isn't included yet; I think that's an area for improvement. We can update the test data on the cloud, which is a good feature. There's also test data management, which is good. [Runscope] doesn't have test data management yet; mock services do, and performance testing has it. We can do the same test through JMeter, validating the same criteria, but the feedback from [Runscope] is quite visible: we can see the request and the response, what data comes back, and add the validation criteria. We can manage test environments and test data, but the ability to run the same API request against multiple sets of test data is missing; we had to clone the test cases multiple times to work around it, and they need to work on that. Version control of the test cases, and the ability to compare the current version with the previous version within [Runscope], would be really nice. The history shows who made the changes, but it doesn't compare the changes. In the future, I would like to see integrations with GitLab and external Git repositories so we could have some version control outside the platform as well; there is currently no mechanism for that. The ability to import API specifications directly, instead of converting them to JSON, would also be nice. There are some features they could work on.
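The data-driven gap this reviewer describes, running one API request against many rows of test data instead of cloning test cases, can be pictured with the minimal sketch below. It is a hypothetical illustration outside the platform, not a BlazeMeter or Runscope feature: the endpoint URL, CSV columns, and response fields are assumptions made only for the example.

```python
# Minimal data-driven API check: one request definition, many rows of test data.
# The endpoint, CSV columns, and expected fields are hypothetical placeholders,
# not part of BlazeMeter or Runscope.
import csv
import requests

API_URL = "https://example.test/api/orders"  # hypothetical endpoint

def check_row(row: dict) -> None:
    """Send the same request with one row of test data and validate the response."""
    response = requests.post(API_URL, json={"sku": row["sku"], "qty": int(row["qty"])})
    assert response.status_code == 200, f"unexpected status {response.status_code}"
    assert response.json().get("status") == row["expected_status"], f"mismatch for {row['sku']}"

if __name__ == "__main__":
    # test_data.csv columns: sku, qty, expected_status
    with open("test_data.csv", newline="") as f:
        for row in csv.DictReader(f):
            check_row(row)
            print(f"passed: {row['sku']}")
```

The cloning workaround the reviewer mentions is what this pattern would replace; the sketch only shows the shape of the missing capability.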
SR - PeerSpot reviewer
Scripting and basic test executions are good features; configuring the workload for tests is easy
In terms of areas for improvement, the Silk Performance Explorer tool, which is used for monitoring and analysis, could be improved, because that's where we spend most of our time when analyzing test data. Any enhancements in the monitoring sphere would be useful. When you have a large amount of data, the tool struggles and will sometimes crash, or there may be issues with too many metrics being collected while a test is running. The interface for scripting could also be more feature-rich. Integration with tools like Prometheus or Grafana, where we can visualize the data, would be great. As things stand, we have to use one monitoring tool to visualize data and another to visualize the test metrics. Integration would enable us to see the metrics from Silk and correlate them with the metrics from other servers or processes we're monitoring, instead of looking at Silk data and server metrics separately. That's the way things are going with newer tools. I think the solution is being phased out by Micro Focus, and their emphasis is now more on LoadRunner; we haven't seen much development in the last few years.
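The Prometheus/Grafana integration this reviewer asks for can be approximated by exporting test results yourself once a run finishes. The sketch below is a minimal example using the prometheus_client library and a Pushgateway; it is not a Silk Performer feature, and the gateway address, job name, and metric names are assumptions for illustration.

```python
# Minimal sketch: push load-test results to a Prometheus Pushgateway so they can
# be graphed in Grafana next to server metrics. Not a Silk Performer feature;
# the gateway address, job name, and metric names are hypothetical.
from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

def push_test_metrics(avg_response_ms: float, error_rate: float, virtual_users: int) -> None:
    registry = CollectorRegistry()
    Gauge("loadtest_avg_response_ms", "Average response time in ms",
          registry=registry).set(avg_response_ms)
    Gauge("loadtest_error_rate", "Fraction of failed transactions",
          registry=registry).set(error_rate)
    Gauge("loadtest_virtual_users", "Concurrent virtual users",
          registry=registry).set(virtual_users)
    # Pushgateway address is an assumption; point it at your own instance.
    push_to_gateway("localhost:9091", job="silk_performer_run", registry=registry)

if __name__ == "__main__":
    # In practice these values would come from the test tool's results export.
    push_test_metrics(avg_response_ms=312.5, error_rate=0.02, virtual_users=500)
```

Once the metrics land in Prometheus, a Grafana dashboard can plot them alongside the server metrics the reviewer wants to correlate.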

Quotes from Members

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

Pros

"I really like the recording because when I use the JMeter the scripting a lot of recording it takes me a lot of time to get used to. The BlazeMeter the recording is quick."
"Its most valuable features are its strong community support, user-friendly interface, and flexible capacity options."
"One thing that we are doing a lot with the solution, and it's very good, is orchestrating a lot of JMeter agents. This feature has helped us a lot because we can reuse other vendors' performance scripts that they have used with JMeter before."
"The solution offers flexibility with its configurations."
"The on-the-fly test data improved our testing productivity a lot. The new test data features changed how we test the applications because there are different things we can do. We can use mock data or real data. We can also build data based on different formats."
"It has helped us simulate heavy load situations so we can fix performance issues ahead of time."
"BlazeMeter's most valuable feature is its cloud-based platform for performance testing."
"The stability is good."
"A good monitoring tool, simple to script and easy to configure."
 

Cons

"If the solution had better support and the documentation was efficient it would do better in the market."
"One problem, while we are executing a test, is that it will take some time to download data. Let's say I'm performance testing with a high-end load configuration. It takes a minimum of three minutes or so to start the test itself. That's the bad part of the performance testing... every time I rerun the same test, it is downloaded again... That means I have to wait for three to four minutes again."
"I believe that data management and test server virtualization are things that Perforce is working on, or should be working on."
"The tool fails to offer better parameterization to allow it to run the same script across different environments, making it a feature that needs a little improvement."
"A possible improvement could be the integration with APM tools."
"Potential areas for improvement could include pricing, configuration, setup, and addressing certain limitations."
"The scalability features still need improvement."
"The reporting capabilities could be improved."
"If you have a large amount of data, the solution can struggle."
 

Pricing and Cost Advice

"The licensing fees are billed on a monthly basis and they cost approximately $100 for the basic plan."
"The solution is free and open source."
"When compared with the cost of the licenses of other tools, BlazeMeter's license price is good."
"The product pricing is reasonable."
"It is an averagely priced product."
"I rate the product's price two on a scale of one to ten, where one is very cheap, and ten is very expensive. The solution is not expensive."
"It's consumption-based pricing but with a ceiling. They're called CVUs, or consumption variable units. We can use API testing, GUI testing, and test data, but everything gets converted into CVUs, so we are free to use the platform in its entirety without getting bogged down by a license for certain testing areas. We know for sure how much we are going to spend."
"We pay a yearly licensing fee for the solution."
No pricing information is available for OpenText Silk Performer.
 

Top Industries

By visitors reading reviews

BlazeMeter
Financial Services Firm: 21%
Computer Software Company: 17%
Manufacturing Company: 8%
Retailer: 7%

OpenText Silk Performer
Financial Services Firm: 26%
Computer Software Company: 17%
Government: 8%
Insurance Company: 6%
 

Company Size

By reviewers

BlazeMeter: Large Enterprise, Midsize Enterprise, Small Business (percentages not shown)
OpenText Silk Performer: no data available
 

Questions from the Community

How does BlazeMeter compare with Apache JMeter?
BlazeMeter is a continuous testing platform that provides scriptless test automation. It unifies functional and performance testing, enabling users to monitor and test public and private APIs. We ...
What do you like most about BlazeMeter?
It has a unique programming dashboard that is very user-friendly.
What is your experience regarding pricing and costs for BlazeMeter?
BlazeMeter's pricing is competitive but can be negotiable.
 

Also Known As

BlazeMeter: JMeter Cloud
OpenText Silk Performer: Micro Focus Silk Performer, Silk Performer
 


Sample Customers

BlazeMeter: DIRECTV, GAP, MIT, NBCUniversal, Pfizer, StubHub
OpenText Silk Performer: University of Colorado, Medidata, Monash University