Senior Consultant at a computer software company with 5,001-10,000 employees
Tests the performance of our applications and has the ability to share the screen while you are running a test
Pros and Cons
- "This product is better oriented to large, enterprise-oriented organizations."
- "While the stability is generally good, there are a few strange issues that crop up unexpectedly which affect consistent use of the product."
What is our primary use case?
Our primary use case for Performance Center is testing the performance of all of our applications.
What needs improvement?
One thing that consistently fails at our company is that after you have checked in an application, it usually crashes in some way with a strange error message. We found out that if you close the application test you have set up and open it again, it usually works without the error the second time. That is quite confusing if you are new to the product, although after using the tool for a while you stop caring about, or even noticing, the inconvenience. Still, it does not seem very professional, and it is really buggy behavior that should be fixed.
One feature I would like to see in the next release of Performance Center is the ability to run more fluidly with TruClient so that you could put more virtual users in Performance Center. That would help. I'm not sure how easy something like that would be to implement, but it would be valuable.
For how long have I used the solution?
We've been using Performance Center for about a year.
What do I think about the stability of the solution?
We have had some problems with instability. At one point Performance Center suddenly went down for two days, but usually it works. It works okay now and has not been a problem, although it was worse in the beginning. They have changed something, so I think it is better now than it was.
What do I think about the scalability of the solution?
The scalability is good enough. Sometimes we get a message from the generators that they are at 80% or more capacity; that is an error we get quite commonly. We only have eight gigabytes of memory on the generators, and 16 gigabytes is recommended, so I guess that is likely the reason we have this problem. It happens a lot more often when we are running TruClient; the 80% capacity error comes up very fast in that case. We cannot run many users with TruClient at all.
How are customer service and support?
It is not usually me who calls tech support, but I got the impression that the team is quite pleased with it. Usually, it is good. On the other hand, we have some problems now that are not resolved. For example, one of my applications is not running at all because we are on version 12.53. There was a problem with the encoding of our REST (Representational State Transfer) services: we were using a very old encoding that we stopped using a long time ago, but it was still supposed to be compatible in 12.53, which is what we are using. I know the problem was fixed in version 12.56 and up, but we have not been able to complete the upgrade.
I'm able to run the tests on the application locally, but not in Performance Center. So we are waiting for this upgrade at the moment to resolve these issues.
Which solution did I use previously and why did I switch?
We are currently using 12.53 and we are trying to upgrade to 12.63, but it looks like there is a problem with the upgrade. We would like to switch to take better advantage of some features that are currently difficult to work with. We used LoadRunner concurrently for a while, and while it was a good product, there were things about Performance Center that we prefer.
How was the initial setup?
I was not included in the process when they installed the solution, but it took quite a lot more time than I would have expected. I guess, based partly on the length of time it took, that it was not very straightforward to set up and must have been a bit difficult. The other reason it does not seem easy is that the team has now tried to upgrade two times, and both times they had to roll back to the previous version. We'll see whether the issue is solved when a fix is issued and they try to upgrade again. It looks like there are problems with connecting properly. The team has a ticket in with Micro Focus about the problem, but we are not sure what the problem stems from and a resolution has not been provided.
What's my experience with pricing, setup cost, and licensing?
I'm not quite sure about the exact pricing because I do not handle that part of the business, but I think Performance Center is quite expensive. It is more expensive than LoadRunner, although I am not sure how many controllers you can run for the same price. They said Performance Center was costing us around 40 million kroner, which is about 4 million dollars. But I think that was with ALM (Application Lifecycle Management) as well, and not only for Performance Center.
Which other solutions did I evaluate?
Before we used Performance Center at all, we used LoadRunner (Corporate version, 50 licenses). But now we have changed over almost entirely to Performance Center and we are phasing LoadRunner out. For a while, we were running both at the same time to compare them. The nice thing is that we do not need to have many controllers connected with Performance Center. The bad thing is that more than one person may want to use the same generator, so sometimes we have problems. I guess we had the same problem before when we used LoadRunner, because not everyone can run a test at the same time.
There are some good things and some bad things about Performance Center in comparison to LoadRunner. The good thing is that you are able to share the screen while you are running a test. On the other hand, you do not get all the same information you get with LoadRunner when you run the tests. After you have done the tests, you can just copy the completed file and you get the same test results as if you had run on LoadRunner. So that is not really a problem. But when first running the Performance Center application for testing, I missed some of the information I got from LoadRunner. It is just a different presentation.
What other advice do I have?
The advice I would give to someone considering this product is to try LoadRunner first before starting with Performance Center, especially at a small company. After you have used LoadRunner, you can compare it to Performance Center in the right way, and if your small company expects to expand, you will know the difference. If you are already a very big company, you can save some money by using Performance Center directly. We are quite a big company, so Performance Center makes sense for us.
On a scale from one to ten where one is the worst and ten is the best, I would rate Performance Center as an eight. It is only this low because we have had so many problems installing and upgrading it. Sometimes it runs very slowly just setting up tests, or it just crashes. For example, when setting up a spike test, it can suddenly crash after you have almost finished everything. Executing the tests was a lot easier and more stable in LoadRunner.
You can manage to make Performance Center work, but you have to be patient.
Which deployment model are you using for this solution?
On-premises
Disclosure: My company has a business relationship with this vendor other than being a customer: Partner.

Sr. Engineer CSIT Quality Assurance at Verizon
User-friendly with up-to-date features, offers good visibility into script changes, and has a very responsive technical support team
Pros and Cons
- "What I like most in Micro Focus LoadRunner Enterprise is the comparison between two different exhibitions which gives value to my company. I also like that the solution is user-friendly, especially in terms of making specific changes. For example, in the past, you can't see the changes when you upload scripts into the Performance Center, but now, it has that visibility, so whenever you want, you can change the script in the Performance Center. I also like that Micro Focus LoadRunner Enterprise is the only tool you can utilize for all your needs, even for different protocols and scripting. The solution also has the latest features, for example, networkability, where it can, within the UI, follow the waterfall model. You can use the insights in the Performance Center of Micro Focus LoadRunner Enterprise to address or test URLs that usually take up much time."
- "A room for improvement in Micro Focus LoadRunner Enterprise is that it should take multiple exhibitions for a particular scenario and have automatic trending for that. This will be a very useful feature that lets users look into how many exhibitions happened for the scenario and their performance, and you should be able to see the data within the Performance Center dashboard. For example, there's one scenario I'm focusing on multiple times in a month, and if I check five times, there's no way for me to see the trend and find out how it went with those five exhibitions. It would be great if the Performance Center has a view of all five exhibitions, particularly transaction by transaction, and how they happened. If Micro Focus LoadRunner Enterprise shows you the time trends, information about one exhibition to another, and how each performed, it'll be an immense feature, and that should be visible to every user. Reporting should be simpler in Micro Focus LoadRunner Enterprise. If I did a scenario with one exhibition now, and I did that scenario again, then I should be able to schedule that scenario for the exhibition, and if that scenario is executed multiple times, there should be the option to turn it into a single view that shows you all the transactions, how the performance was, what the trend graph is for a particular time, etc."
What is our primary use case?
We use Micro Focus LoadRunner Enterprise for load testing and stress testing. We also use it for running performance schedulers during specific times. We also use the solution to determine testing trends. We're using Micro Focus LoadRunner Enterprise less for TruClient because the TruClient protocol takes up a lot of memory.
What is most valuable?
What I like most in Micro Focus LoadRunner Enterprise is the comparison between two different executions, which gives value to my company. I also like that the solution is user-friendly, especially in terms of making specific changes. For example, in the past, you couldn't see the changes when you uploaded scripts into Performance Center, but now it has that visibility, so whenever you want, you can change the script in Performance Center.
I also like that Micro Focus LoadRunner Enterprise is the only tool you can utilize for all your needs, even across different protocols and scripting. The solution also has the latest features, for example, networkability, where it can follow the waterfall model within the UI. You can use the insights in Performance Center to address or test URLs that usually take up a lot of time.
What needs improvement?
One area for improvement in Micro Focus LoadRunner Enterprise is that it should take multiple executions of a particular scenario and provide automatic trending for them. This would be a very useful feature that lets users look into how many executions happened for the scenario and how they performed, and you should be able to see the data within the Performance Center dashboard.
For example, there's one scenario I run multiple times in a month, and if I run it five times, there's no way for me to see the trend and find out how it went across those five executions. It would be great if Performance Center had a view of all five executions, particularly transaction by transaction, and how they happened. If Micro Focus LoadRunner Enterprise showed the time trends from one execution to another and how each performed, it would be an immense feature, and that should be visible to every user.
Reporting should be simpler in Micro Focus LoadRunner Enterprise. If I run a scenario once now and then run it again, I should be able to schedule that scenario for execution, and if it is executed multiple times, there should be an option to turn the results into a single view that shows all the transactions, how they performed, the trend graph for a particular time, and so on.
The report from Micro Focus LoadRunner Enterprise will show you the difference between two executions, for example, one today at 12:00 PM and another at 12:00 PM tomorrow, but if you want to see the difference between three or more executions, the solution doesn't have that option. To see the difference, you'll need to do more work in terms of uploading files and doing the comparisons manually, and this should be improved.
An added feature I'd like to see in Micro Focus LoadRunner Enterprise is a converter, along with performance file extraction within the scripting itself. For example, if I have a JMX file, I should be able to convert it within the solution, and the same goes for other files such as HAR and PCAP files. Whatever performance file exists, if I can extract it and make a script from it, that would be a very valuable addition to Micro Focus LoadRunner Enterprise. Another example is when you're not able to record the script in the solution; if there were an option to build it from a PCAP or HAR file via a converter, that would add some value. There is a conversion for HAR files, but with PCAP files, I'm not so sure.
For how long have I used the solution?
I've been using Micro Focus LoadRunner Enterprise for three years now.
What do I think about the stability of the solution?
Micro Focus LoadRunner Enterprise is a very stable solution. My company only had to contact the Micro Focus team twice when there was an issue related to tailor-made requirements within my organization, but it wasn't because of a Micro Focus LoadRunner Enterprise feature, and other than that, I didn't see any issues regarding its stability.
What do I think about the scalability of the solution?
Micro Focus LoadRunner Enterprise is a scalable solution.
How are customer service and support?
The technical support team for Micro Focus LoadRunner Enterprise is very responsive. My company contacted support about an issue that was related to requirements tailored to my organization and the team helped in resolving the issue and making the solution stable.
Micro Focus LoadRunner Enterprise has a very responsive technical support team that was upfront in informing my team when it was feasible to set up a meeting and when the issue needed to be redirected to another person who was knowledgeable about it.
On a scale of one to five, I would rate the Micro Focus LoadRunner Enterprise technical support team four out of five.
Which solution did I use previously and why did I switch?
Before Micro Focus LoadRunner Enterprise, we used Apache JMeter which was the only other option because it's an open-source tool. It was deployed on-premises and not on the cloud.
We also used the normal version of Performance Center before Micro Focus LoadRunner Enterprise, two years prior, and we had to install Performance Center on-premises and set up load generators and controllers. We've been using the Enterprise version for the past three years.
The main differences between Apache JMeter, the Performance Center, and Micro Focus LoadRunner Enterprise are the usability and insights given by the last two solutions. Both the Performance Center and Micro Focus LoadRunner Enterprise give more insights, and they also offer more automation versus Apache JMeter.
With Micro Focus LoadRunner Enterprise and Performance Center, users can run a quick execution after setting up the scenario, do quick checks, and create reports, but in Apache JMeter, users have to manually set up, observe, and produce the reports.
There are also more features in Micro Focus LoadRunner Enterprise, and it's cloud-based, whereas Apache JMeter is only a plug-in, so we have to do everything manually in Apache JMeter.
How was the initial setup?
In terms of how easy or complex setting up Micro Focus LoadRunner Enterprise is, I'm not the right person to ask because a different team handles the setup in my company. The solution is set up on the cloud, on-demand, and requires load generators or controllers, but I didn't take part in setting it up. I'm just an end-user that utilizes Micro Focus LoadRunner Enterprise.
What's my experience with pricing, setup cost, and licensing?
As I'm an end-user of Micro Focus LoadRunner Enterprise and not involved in its licensing, I don't have information on how much it costs.
Which other solutions did I evaluate?
I evaluated Apache JMeter before using Micro Focus LoadRunner Enterprise.
What other advice do I have?
In terms of the number of resources using Micro Focus LoadRunner Enterprise, I'm in a large organization, and in the beginning, there were ten resources. Nowadays, with the solution being tailor-made for my company, twenty-five to thirty resources belonging to different teams use Micro Focus LoadRunner Enterprise.
The solution is used every day as my company can't live without performance testing.
I'm rating Micro Focus LoadRunner Enterprise nine out of ten because it still has some room for improvement.
My company is a customer of Micro Focus LoadRunner Enterprise.
Which deployment model are you using for this solution?
Private Cloud
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Performance Test Lead at a financial services firm with 10,001+ employees
Full geographical coverage, integrates well with monitoring tools, granular project inspection capabilities
Pros and Cons
- "One of the most valuable features of this solution is recording and replaying, and the fact that there are multiple options available to do this."
- "OpenText needs to improve in terms of support. With the same support plan but when the product was owned by HP, support was more responsive and better coordinated."
What is our primary use case?
We use this solution for performance and load testing of different types of web-based applications and APIs. We want to make sure that before any application, or any upgrade to an existing application, is made available to an actual user, it is sufficiently tested within the organization.
We want to ensure that if there is a high volume of users, they have a seamless experience. We don't want them to experience slowness or an interruption in service, as a result of an increase in the number of users on the web service or website. Essentially, we test to guarantee that all of our users have a good experience.
How has it helped my organization?
When it comes to delivering enterprise-level testing capabilities, this solution is really good.
Using this tool, we are able to test an application end-to-end from any area. Specifically, we are able to test our applications that are used across geographies. This includes worldwide locations starting from one end of Asia to the other end of the Americas. Geographically, we have full testing coverage for virtually all of our enterprise applications.
In terms of application coverage, there have been very few or no applications at the enterprise level that we have not been able to test using this tool. I think there is only one, but that was a unique case. Apart from that, at an enterprise level, in terms of coverage and geographically as well as technically, we have been able to test everything using this solution.
OpenText has a platform where I can share what is good and what further improvements I can make. There is also a community where we can leave feedback.
As an admin, I have the ability to copy all of the details from one project to another. However, I don't recall functionality for cross-project reporting. If there are two projects available then I cannot run a load test or report metrics from the other project.
LoadRunner Enterprise offers multiple features to perform a deep dive into a project. For example, we can see how many load tests of a particular application were run over a certain period of time. We can also see what scripts and tests were built over a time period. There is lots of information that it provides.
It is very important that we are able to drill down into an individual project because we sometimes have to look into what set of tests was executed for a particular project, as well as how frequently the tests were run. This helps us to determine whether the results were similar across different executions, or not. For us, this is an important aspect of the functionality that this tool provides.
One of the major benefits, which is something that we have gained a lot of experience with, is the internal analytics capability. It has multiple graphical and analytical representations that we can use, and it has helped us a lot of times in pinpointing issues that could have caused SEV1 or SEV2 defects in production.
We found that when we ran the load test, those issues were identified by using the analytic graphs that LoadRunner provides. Based on this knowledge, we have been able to make the required corrections to our applications. After retesting them, we were able to release them to production. This process is something that we find very useful.
In terms of time, I find it pretty reasonable for test management. There are not too many things that we have to do before starting a load test. Once one becomes good at scripting, it does not take long. Of course, the length of time to run depends on how big and how complex the script is. Some load tests have five scripts, whereas some have between 25 and 30 scripts. On average, for a test with 10 scripts, the upper limit to set it up and run is a couple of hours.
Overall, we don't spend too much time setting up our tests.
What is most valuable?
One of the most valuable features of this solution is recording and replaying, and the fact that there are multiple options available to do this. For example, a normal web application can be recorded and replayed again on many platforms. Moreover, it can be recorded in different ways.
An application can be recorded based on your user experience, or just the backend code experience, or whether you want to record using a different technology, like a Java-specific recording, or a Siebel-specific recording. All of these different options and recording modes are available.
The scheduling feature is very helpful because it shows me time slots in calendar format where I can view all of the tests that are currently scheduled. It also displays what infrastructure is available to me to schedule a load test if I need to.
What needs improvement?
Something that is missing is a platform where I can share practices with my team. I would like to be able to inform my team members of specific best practices, but at this point, I can only share scripts and stuff like that with them. Having a private community for my own team, where I can share information about best practices and skills, would be helpful.
OpenText needs to improve in terms of support. With the same support plan, support was more responsive and better coordinated when the product was owned by HP.
The monitoring and related analytical capabilities for load tests should be brought up to industry standards. This product integrates well with tools like Dynatrace and AppDynamics, but improved built-in functionality would be a nice thing to have.
For how long have I used the solution?
I have been using OpenText LoadRunner Enterprise for approximately 15 years. It was previously known as Performance Center and before that, it was simply LoadRunner. In terms of continuous, uninterrupted usage, it has been for approximately nine years.
I am a long-time user of OpenText products and have worked on them across multiple organizations.
What do I think about the stability of the solution?
Our tool is hosted on-premises and we have not faced stability issues as such. One of the problems that we sometimes experience is that multiple machines (the load generators, in LoadRunner nomenclature) suddenly become unresponsive and cannot be contacted. When this happens, we have to restart the central server machine and then everything goes back to normal. That sort of issue happens approximately once every six months.
Apart from that, we have not observed any stability issues. There are some defects within the tool which from time to time, we have raised with OpenText. If they have a fix available, they do provide it. Importantly, it does not make the product unusable until that is fixed.
What do I think about the scalability of the solution?
This product is easy to scale and as a user, we have not encountered any such issues. Over time, if I have to add more machines to monitor, or if I have to add more machines to use during a load test, it's pretty straightforward.
If I compare it with other tools, I would say that it does not scale as well. However, as a user, it is okay and I've never faced any issues with adding more machines.
How are customer service and technical support?
Whenever we have any support required from OpenText, the process begins with us submitting a ticket and they normally try to solve it by email. But if required, they are okay with having a video conference or an audio conference. They use Cisco technology for conferencing and they are responsive to collaboration.
Unfortunately, technical support is not as good as it used to be. From an end-user perspective, coming from both me and several of my team members, we have found that over the last year and a half the quality of support has gone down a couple of notches. It has been that way since the transition from HP to OpenText; the support is simply no longer at the same level.
The level of support changes based on the plan that you have but our plan has not changed, whereas the responsiveness and coordination have. Generally speaking, interacting with HP was better than it is with OpenText, which is something that should be improved.
Which solution did I use previously and why did I switch?
I have not used other similar tools.
How was the initial setup?
I have not set up other tools, so I don't have a basis for comparison. That said, I find that setting up LoadRunner Enterprise is not very straightforward.
Whether it's an initial setup or an upgrade to our existing setup, it's very time-consuming. There are lots of things that we have to look into and understand throughout the process. It takes a lot of time and resources, and that is one of the reasons we are considering moving to the cloud version. Ideally, the effort of upgrading to newer versions will be reduced by making that transition. The last couple of upgrades have been very consuming in terms of time and effort, which could have been spent on more productive work.
To be clear, I was not involved in setting it up initially. Each time we deploy this product, we set it up as a new installation but use our older version as a base. Prior to the configuration, we have to update it; however, because the older version does not upgrade in place, we have to install it as a new version. I do not see a significant difference in time between installing afresh and upgrading an existing installation.
If I am able to identify the needs and what is required, from that point, it takes almost the same amount of time whether it is a clean install or an upgrade. The biggest challenge with LoadRunner Enterprise is to identify the database that we're using and then upgrade it. As soon as the database is upgraded successfully, 70% to 75% of the work is complete. It is the biggest component, takes the longest, and is the most effort-consuming as well.
What about the implementation team?
I am involved in the installation and maintenance, including upgrades.
What's my experience with pricing, setup cost, and licensing?
I have not been directly involved in price negotiations but my understanding is that while the cost is a little bit high, it provides good value for the money.
Which other solutions did I evaluate?
I did not evaluate other tools before implementing this one.
What other advice do I have?
At this time, we do not make use of LoadRunner Developer Integration. We are thinking of migrating to the latest version of LoadRunner, which probably has the LoadRunner Developer functionality. Once we upgrade to the new version, we plan to use it.
We are not currently using any of the cloud functionality offered by OpenText. In our organization, we do have multiple applications that are hosted on the cloud, and we do test them using LoadRunner Enterprise, but we do not use any component of LoadRunner Enterprise that is hosted on the cloud.
I am an active member of several online communities, including LinkedIn, that are specific to performance testing. As such, I have seen different experts using different tools, and the overall impression that I get is that LoadRunner Enterprise offers good value for the price. The level of coverage in terms of scripting and analysis helped to solidify its position as a market leader at least a decade ago.
Nowadays, while others have closed the gap, it is still far ahead of other tools in the space. My advice is that if LoadRunner Enterprise can be made to fit within the budget, it is the best tool for performance testing and load testing.
I would rate this solution an eight out of ten.
Which deployment model are you using for this solution?
On-premises
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Domain Engineer at an energy/utilities company with 5,001-10,000 employees
Helpful for documenting everything and doing various types of testing
Pros and Cons
- "With LoadRunner Enterprise, doing various types of performance testing, load testing, and automation testing has been very helpful for some of the teams."
- "I have seen some users report some issues, but I have personally not had any issues."
What is our primary use case?
We use it for doing automation testing, and there are various teams that use it for performance testing, load testing, etc. For ALM, we use it to create requirements, test labs, and test plans and get our test results in.
How has it helped my organization?
LoadRunner Enterprise has helped our organization in being able to document everything and maintain the history. This way, in the future, if the product gets enhanced or new features get added, you know what was tested previously, and then you also have the new stuff. You have the history of those changes and all the testing that has been done.
We use the TruClient feature for browser-based testing, but I am not the end user for it. My guess is that it is helpful for reducing scripting work.
LoadRunner Enterprise has affected our users. Testing is obviously a must before you release a software product. The fact that you can automate some of the testing, and you can do stress testing, performance testing, and simulate a load as if users are actually using it helps a lot. You better understand how the application will operate when it gets into production, and if there are issues, you can spot those early on.
LoadRunner Enterprise has helped streamline our testing processes. We have a process in place for users who need access to it. There is a governance team that sets the rules for how the product should and should not be used, so there is a little bit more structure. Based on what I have heard in the organization, it does appear to help them. LoadRunner Enterprise makes it easier for them to do their tasks; they can automate things without having to do them manually.
LoadRunner Enterprise has helped save us time, but I do not have the metrics.
LoadRunner Enterprise has helped improve our product quality. We can spot issues early on. That helps a lot. This way, we are aware of the issue before we go live in production. We can remediate it before the product goes live.
What is most valuable?
With ALM, being able to write our test scripts and being able to document our test results are valuable. That comes in handy. This way, we have a record of the things that have been tested. There are also workflows that we can create, which are very helpful.
With LoadRunner Enterprise, doing various types of performance testing, load testing, and automation testing has been very helpful for some of the teams.
What needs improvement?
I work more on the administration side, so I am not a daily user, but especially during this conference we are seeing that everything is going to the cloud, and we are trying to see what some of the benefits of going to the cloud are.
I have seen some users report some issues, but I have personally not had any issues.
For how long have I used the solution?
My organization was already using it when I joined, but I have personally been using it for over three years.
What do I think about the stability of the solution?
I have not seen any stability issues. I have seen some users report some issues, but we will usually open a case if we cannot figure it out.
What do I think about the scalability of the solution?
With LoadRunner, you get load generators. You can add more as needed. Its scalability is fine.
How are customer service and support?
We get our support through another vendor.
Which solution did I use previously and why did I switch?
When I joined the team, they were already using it, so I do not know what they were using before.
How was the initial setup?
I have been involved in the upgrades. If it is a patch or something small, it is less time-consuming, but if it is a major upgrade, it takes more time and more planning. We need to assess how long it will take, what resources are needed, and whether we need to request new servers.
What was our ROI?
My guess would be that we have seen an ROI.
What other advice do I have?
Overall, I would rate LoadRunner Enterprise a nine out of ten.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Test Automation Manager at Petronas
Has a good concept, but the installation process needs improvement
Pros and Cons
- "The product is good, and the concept is good as well."
- "The installation has not been straightforward, and we have had so many problems. We have had to re-install, try to install on a different machine, etc. We have not been able to launch the LRE server itself yet."
What is most valuable?
The product is good, and the concept is good as well.
What needs improvement?
Right now, we are in research mode, and we are yet to adopt the solution. The installation has not been straightforward, and we have had so many problems. We have had to re-install, try to install on a different machine, etc. We have not been able to launch the LRE server itself yet.
It's not a consistent solution. Sometimes it executes well, and at other times graphics will not show up or we'll need to restart the services, for example.
If I change my host controller, then my graphical report goes missing. I'd like to see this improved so that the graphical report is brought into the analysis. Only the LRE server is capable of giving the HTML report.
Also, instead of uploading the script, it would be good to have a check-in/check-out option. At present, because the script is uploaded, version control is missing. Version control would be nice to have as well.
For how long have I used the solution?
We have been using OpenText LoadRunner Enterprise for almost four to five months.
How are customer service and support?
We need some guidance from OpenText but are not able to directly contact them because we purchased the license via SAP. However, I think that the technical support team should be very proactive.
Which solution did I use previously and why did I switch?
We have been using LoadRunner Professional for a long time. We are looking into switching to LRE because it's centralized and has so many good features.
How was the initial setup?
The installation is very problematic and not that straightforward. We have had so many problems.
What's my experience with pricing, setup cost, and licensing?
We purchased the license via SAP.
What other advice do I have?
On a scale from one to ten, I would rate OpenText LoadRunner Enterprise at six.
Which deployment model are you using for this solution?
Private Cloud
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Test Lead Architect at a tech services company with 10,001+ employees
Straightforward to set up, good for modifying scripts, and offers helpful support
Pros and Cons
- "I like how you can make modifications to the script on LoadRunner Enterprise. You don't have to go into the IDE itself."
- "The solution is expensive."
What is our primary use case?
Mostly it's to test APIs. That's been the main use case.
What is most valuable?
I like how you can make modifications to the script on LoadRunner Enterprise. You don't have to go into the IDE itself. You can make a quick change.
The setup is straightforward.
It's a stable solution.
Support is helpful.
What needs improvement?
Honestly, there really isn't any area for improvement. I think it's a great product.
Maybe the scroll bars could be a little bit bigger.
The solution is expensive.
If they had an easy integration with, let's say, New Relic or Dynatrace, that would be something interesting. If we can see server monitoring data in the LoadRunner report, that would be ideal.
For how long have I used the solution?
I've been using the solution for one year.
What do I think about the stability of the solution?
The product is reliable. It's stable. There are no bugs or glitches. It doesn't crash or freeze.
What do I think about the scalability of the solution?
We don't have too many people on the solution right now. We have ten to 15 people using it. We're using it almost daily. It's used 60% to 70% of the time.
How are customer service and support?
Support has been okay. They've been pretty knowledgeable about everything. Responses are generally on time; I typically get a response within a day or so.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
We previously used WebLOAD. In WebLOAD, if your script needs to use a data file, you set it up through a wizard that takes six or seven steps. In LoadRunner, it's a lot easier; it doesn't take that long, and it's a very straightforward process.
The other thing is that RadView uses JavaScript for the script, whereas LoadRunner uses C, although LoadRunner has recently given testers the option to use JavaScript as well. You can also add more users on a LoadRunner test; their load generators are more scalable and allow more users per load generator than RadView.
Right now, we tend to prefer LoadRunner.
How was the initial setup?
The implementation was very straightforward.
I'd rate the process four out of five in terms of ease of implementation.
What about the implementation team?
The entire implementation process was handled in-house. We did not use any consultants or integrators.
What's my experience with pricing, setup cost, and licensing?
I don't know this as a fact. However, I've heard that LoadRunner is pricey.
I have heard from different customers that although LoadRunner's a great product, sometimes they are looking for alternatives, since the pricing model for LoadRunner's very expensive. Sometimes customers will look at other options for testing tools due to the cost.
What other advice do I have?
I'm an end-user.
I'd recommend the solution. For API testing, getting the script developed in LoadRunner is very straightforward; it's not super difficult. You can get a REST API script in LoadRunner done within an hour if you have all the information and know the HTTP headers and related details, along the lines of the sketch below.
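To illustrate the kind of REST API script the reviewer describes, here is a minimal sketch of a C-based Vuser action using LoadRunner's web (HTTP/HTML) protocol functions. The endpoint, headers, payload, and transaction name are hypothetical placeholders for illustration only, not details taken from this review.

```c
// Minimal illustrative Vuser action for a REST call (LoadRunner web - HTTP/HTML protocol).
// The URL, headers, body, and transaction name below are placeholders, not from the review.
Action()
{
    // Set the HTTP headers the API expects before issuing the request
    web_add_header("Content-Type", "application/json");
    web_add_header("Accept", "application/json");

    // Register a simple text check so the step fails if the expected field is absent
    web_reg_find("Text=\"status\"", LAST);

    lr_start_transaction("create_order");

    // Send the JSON payload directly, since there is no browser form to record
    web_custom_request("create_order",
        "URL=https://api.example.com/v1/orders",   // hypothetical endpoint
        "Method=POST",
        "Resource=0",
        "EncType=application/json",
        "Body={\"item\":\"ticket\",\"quantity\":1}",
        LAST);

    lr_end_transaction("create_order", LR_AUTO);

    return 0;
}
```

Once a request like this replays cleanly in VuGen, it can be parameterized and scheduled as part of a load test scenario.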
I'd rate the solution a nine out of ten. I would give them a perfect score if the pricing was better.
Which deployment model are you using for this solution?
Hybrid Cloud
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Managed Services Architect at a computer software company with 201-500 employees
A stable solution for enterprise-wide testing and collaboration
Pros and Cons
- "Micro Focus LoadRunner Enterprise Is very user-friendly."
- "The reporting has room for improvement."
What is our primary use case?
I am a managed service provider, a reseller, and a consultant. In other words, I am a total geek.
I submitted a whole bunch of feature requests and changes three or four years ago, but I don't know if they followed my recommendations; however, they did implement some changes that I suggested.
There's an onsite version and there's a cloud version. We typically don't want an enterprise type version because the clients that we work with are fairly large. The last place we used this solution employed 150,000 people.
We have clients that have as few as 10 employees, and other clients that have thousands of employees. I would say the mid-sized businesses that we work with are between 250 and 700 people.
It's all Citrix. We do load balance. We do load testing for Citrix deployments to determine whether or not we're going to get what we expected.
What we really use it for is the ability to run long packages for extended periods of time and actually mimic end users.
We use it for validation. When you put together a system that has two to three thousand people on it, you need to be able to test it. To do that, you need a product that allows you to put two to three thousand users on a system.
What is most valuable?
OpenText LoadRunner Enterprise is very user-friendly.
What needs improvement?
The reporting has room for improvement.
For how long have I used the solution?
I have been using this solution, on and off, for roughly six to seven years.
In the last 12 months, I don't think I've actually loaded it up, but I have had my PS team load it up several times.
What do I think about the stability of the solution?
It's a stable solution. I'd give OpenText LoadRunner Enterprise a 4.5 out of 5 on stability.
We never experienced any bugs or glitches; those are typically in the actual loads that you're running, but that's not their fault, that's your fault.
What do I think about the scalability of the solution?
Scalability-wise, I have not had any problems. It's gone as high as I needed it to go. There are issues when supporting two to three thousand users. I don't ever go any higher than that.
A typical test is between roughly 150 and 250 users, and the most I've ever gotten is 3000. The scalability has been there for what I needed it to do. I really can't speak outside of that realm.
How are customer service and support?
I have never called their technical support, but their online documentation is pretty good.
Which solution did I use previously and why did I switch?
We deployed three different solutions. One of them was free from VMware and the other one was Login VSI. We didn't really switch; it's just that there are different feature sets we're looking for or a methodology we want to use, and whether or not the client wants to spend a hundred grand upfront.
How was the initial setup?
For me, the initial setup is straightforward — I've done it a few times now.
What's my experience with pricing, setup cost, and licensing?
The price is okay. You're able to buy it on demand, as opposed to paying for a full year; you can purchase it for your users for just a day or two, which is nice in an MSP business like mine. If I need to use it for separate clients, I don't have to have a huge layout of capital upfront.
What other advice do I have?
Make sure you know what your use case is before you buy it.
On a scale from one to ten, I would give this solution a rating of nine. It's very good at doing what it needs to do. I think that the reporting needs a little bit of work, but that's pretty much it. I think every reporting system needs a little bit of work, so take that with a grain of salt.
Which deployment model are you using for this solution?
Public Cloud
Disclosure: My company has a business relationship with this vendor other than being a customer: Reseller
Senior manager at a transportation company with 10,001+ employees
The number of protocols it supports is a key asset for us
What is most valuable?
One of the things in the airline business is that our number of users varies on a day-to-day basis and from season to season. So, from an airline business standpoint, we are looking at scalability as one of the major things, and at how we can adapt the solution in an agile fashion. If we want to ramp up the "Our Views" account from, let's say, 10,000 to 50,000, how can we do that? That kind of scalability is the main thing we are looking at.
How has it helped my organization?
One of the key things we use it for is simulating the actual user experience under load. We have a huge set of applications, from front-end to back-end systems. How do we integrate all these systems and how do we simulate real-time user behavior? That's where we see a key value.
What needs improvement?
One of the things we were looking for is more DevOps support, like BlazeMeter has. It would be ideal to incorporate those kinds of features. I know there are some open-source products which have that, but it would be ideal to see those features in this product.
What do I think about the stability of the solution?
As of now, it's working great for us; it's excellent. We don't have any issues. That's one of the reasons we are pushing forward to version 12, to incorporate the other protocols that version 12 comes with.
What do I think about the scalability of the solution?
We are at version 11.5 and we are in the process of upgrading it to version 12. We are pretty happy with the solution we have.
How are customer service and technical support?
We do have a dedicated team. They work with our tech support and with the vendor's tech support on installations, stability of the product, and usability. They take all of those issues up with tech support.
Tech support is pretty excellent. We are getting pretty good responses back from tech support and, as of now, we are happy. We also have a contact on the United Airlines side, so I'm pretty happy with that.
Which solution did I use previously and why did I switch?
Ours is more of a historical basis. We were on version 9, we moved to version 11, and we are right now at version 12. It's more for historical reasons rather than an impulse buy.
How was the initial setup?
I didn't work on the installation of 11.5, but right now I'm working on version 12.
Internally, we have a lot of planning to do on our side, like a database upgrade, the LGs, and all that stuff, but we are coordinating that with HPE and Micro Focus and making sure that our timelines and their timelines match. And we do have upgrade licenses, so we should be pretty good to go.
I would say the relationship between us and Micro Focus is straightforward because all we are looking at is basically license upgrades. On our side, it's more complex because we have to work internally with various teams to coordinate all this activity.
What other advice do I have?
The most important criteria when selecting a vendor to work with are the product, how easy it is to use, how scalable it is, and how well it suits the needs of United Airlines. And, of course, the customer support and how technical support deals with issues.
Regarding advice to a colleague, it depends on the industry and what kind of problem they are trying to solve. If it is the airline industry, I would definitely tell them this is a perfect product because of the number of protocols it supports; we looked at other open-source software and couldn't find a product which matches Performance Center in supporting so many protocols. Especially in the airline industry, we use multiple protocols and we need that support, so I would definitely recommend it.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
