Team Lead at American Electric Power
Real User
Excellent reporting and good support for protocols
Pros and Cons
  • "LoadRunner Enterprise's best feature is the detailed reporting structure."
  • "Micro Focus's technical support could be more responsive."

What is most valuable?

LoadRunner Enterprise's best feature is the detailed reporting structure.

What needs improvement?

Micro Focus's technical support could be more responsive.

For how long have I used the solution?

I've been working with LoadRunner Enterprise for six years.

What do I think about the stability of the solution?

LoadRunner Enterprise is stable.


What do I think about the scalability of the solution?

LoadRunner Enterprise is scalable.

How are customer service and support?

Micro Focus's technical support could be more responsive.

How was the initial setup?

The initial setup is straightforward and takes around fifteen minutes.

What's my experience with pricing, setup cost, and licensing?

LoadRunner Enterprise's price is high, but it gives more value for money than some cheaper alternatives.

Which other solutions did I evaluate?

I've also evaluated JMeter, but it doesn't support as many protocols as LoadRunner Enterprise.
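To put the protocol comparison in concrete terms, here is a minimal sketch of what a LoadRunner Web (HTTP/HTML) script looks like in VuGen, which records and replays C code; the URL, transaction name, and think time below are illustrative placeholders, not details from this review.

    // Minimal VuGen-style Web (HTTP/HTML) action, replayed inside LoadRunner.
    // The URL and transaction name are hypothetical examples.

    Action()
    {
        lr_think_time(2);                           /* simulate a short user pause */

        lr_start_transaction("login_page");         /* time this step end-to-end   */

        web_url("login_page",
                "URL=https://example.com/login",    /* placeholder target URL      */
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("login_page", LR_AUTO);  /* pass/fail set automatically */

        return 0;
    }

The same scripting model extends to the other protocol bundles (SAP, Citrix, web services, and so on), which is the breadth of coverage being contrasted with JMeter here.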

What other advice do I have?

I would rate LoadRunner Enterprise ten out of ten.

Which deployment model are you using for this solution?

On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user671391 - PeerSpot reviewer
IT Manager at a comms service provider with 10,001+ employees
Real User
It allows you to share resources, which wasn't happening with LoadRunner.
Pros and Cons
  • "It allows you to work out how well you are doing project-wise because you see the number of scripts done, the number of tests run, and whether you have mapped all your requirements to it."
  • "The worst thing about it is it did not have zero footprint on your PC."

What is most valuable?

ALM centralizes everything. It allows you to work out how well you are doing project-wise because you see the number of scripts done, the number of tests run, and whether you have mapped all your requirements to it. You can produce metrics there fairly easily for your line management and higher. So, overall, it is better than people using Excel spreadsheets.

Performance Center is good because it allows you to share resources, which wasn't happening with LoadRunner. With LoadRunner, everyone was very specific: "I've just got these controllers and they're mine. I might only be using them five percent of the time, but I need them tomorrow, and I can't allow anyone else to use them because it will disrupt my schedule."

With Performance Center, you start to get into a position where people can say, "I need to run a test. How many assets are available? When can I plan to do it?"
It also provides discipline, because you stop getting people saying, "We're ready to do performance testing." They've got to schedule the test, and they've got to use the period they've scheduled. If they don't, we pull it back and somebody else can use it. You get a lot of people screaming that they've lost their slot, but what you've proven to them is that they're not ready for performance testing.

It's very good from that point of view. It focuses people's minds on actually using their time effectively.

For how long have I used the solution?

I have been using ALM for eleven years. I used it when it was version 9.2 and continued with a lot of versions, all the way through.

We picked up Performance Center when we started introducing LoadRunner. We kept that arrangement until we realized we had too many instances and it would be better strategically to go with Performance Center. I have been using it for ten years.

What do I think about the stability of the solution?

HPE Quality Center ALM is stable. It obviously doesn't have the attractiveness of Octane, and going forward, Octane probably does take it to the next step.

The one thing I always said about ALM, and I'll say this to everybody: the worst thing about it is that it did not have a zero footprint on your PC. The amount of effort and cost to upgrade to the next version, and the amount of problems it gave us when we tried to put on a patch that was particularly essential, was really bad for the business.

We had many different PC models out there on people's desks, so it wasn't just a case of patching or building a new MSI package for one PC. You had to do it for a whole range and then you had to deploy them at exactly the same time or somebody would find that they couldn't use Quality Center.

Octane, now being zero footprint, is probably going to be one of the biggest cost savings I see.

Performance Center seems to be stable. It's probably being utilized far more readily than, say, even Unified Functional Testing.
There are issues with it that mostly seem to be environmental. You'd be surprised how many people think they know about how to do performance testing and then they start using a server that's in one area of the UK to try and run a performance test on servers in another country.

I'm thinking, "Why are you running such a transaction load across our network?" when they should really be in the local area. So, with Performance Center, most of the issues are more user-based. Technically, it seems to meet the task you need it to do.

What do I think about the scalability of the solution?

Without a doubt, both Performance Center and ALM are very scalable.

How are customer service and technical support?

Sometimes support is good and sometimes it's not so good. Sometimes you hit an issue and trying to get across what the issue is, and then trying to get an answer back, can be a bit of a challenge. If you hit an issue that everybody else has hit and it has a solution, then you get the response back quickly. But in the majority of cases, the people who are on the case for you tend to do their best to answer what you've given them.

Which solution did I use previously and why did I switch?

Adaptability is what I look for in a vendor; it tends to pull the other qualities in. A good contact, ready to listen, who really knows how to deliver what you want. Someone who can listen to the problem or challenge that you need the tool to resolve. If the vendor is willing to adapt to that, then the tool might not be 100%, but it might make its way there. If you're fixed in your ways and say, "This is what our tool does, this is all it's going to do," then, to be honest, why continue?

How was the initial setup?

The biggest issue is that ALM is a thick client and you can't just patch it, because you've got hundreds and hundreds of PCs with several different standards on them. You can't do it. You leave it until there's a big release and then you run a massive program to deliver it. Get rid of that thick-client bit and you could patch on the server and be up and running the next day, which is the neat bit about Octane.

The setup of Performance Center seems fairly reasonable. No real shakes about it. Obviously, you've got to have VuGen on the PC. It tends to have to be a meaty PC, but then you are running performance tests. My biggest challenge with Performance Center is having people who claim to do performance testing, or to know how to do performance testing, when they're still wet behind the ears.
A good performance tester needs to have a good 18 months of experience behind them. They need to have done things with Performance Center and delivered projects. They need to use SiteScope. They need to use analysis tools on the network. They need to know how to get the best value out of the tool. Somebody who's just come to it for the first time has probably done a one- or two-week training course and says, "I know how to performance test."
They get results back and say, "We ran it for 100 users and it failed." Well, okay, where did it fail? Where's the analysis that helps us fix the problem? We didn't get that, which they would have done if they'd known to implement the additional bits like SiteScope against it.
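As a rough illustration of what "knowing where it failed" means at the script level, here is a hedged sketch of instrumenting a step with a content check so that a failure is attributable to a specific transaction rather than just "100 users failed"; the check text, URL, and names are hypothetical.

    // Illustrative VuGen sketch: make failures attributable to a specific step.
    // The expected text, URL, and transaction name are placeholders.

    Action()
    {
        /* Register a check BEFORE the request: the step fails if the
           expected text is missing from the server's response. */
        web_reg_find("Text=Order Confirmation",
                     "Fail=NotFound",
                     LAST);

        lr_start_transaction("submit_order");

        web_url("submit_order",
                "URL=https://example.com/order/submit",   /* placeholder */
                "Mode=HTML",
                LAST);

        /* With LR_AUTO, the transaction is marked failed when the check
           above fails, so the analysis (alongside monitors such as
           SiteScope) can point at the step that broke, not just the
           overall failure. */
        lr_end_transaction("submit_order", LR_AUTO);

        return 0;
    }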

So, with Performance Center, it's a skill issue for the people who are using it. One of my guys says he'd like to see people be able to grade themselves in Performance Center, or even in performance testing generally: "I'm at a Bronze level. I'm at a Silver level. I'm at a Gold level." Then you know how effective that person is going to be.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user671403 - PeerSpot reviewer
Team Manager at a financial services firm with 10,001+ employees
Real User
It is used for applications where we have many users.
Pros and Cons
  • "With Performance Center, the version upgrade is easy. You just have to roll out the new patch or the new version."
  • "For such an experienced team as mine, who have been with the product for over ten years, sometimes working with technical support is not that easy."

What is most valuable?

Performance Center, in our company, is used for important applications where we have a lot of users, or special needs for performance that are important.

We have a central team that implements the scripts and executes the tests. How well it goes depends on the users' years of experience. Investment goes down, then we have more issues; then money is spent and investment goes up again, so it is a curve. Overall, everything is going up, as it is with ALM. ALM is still a growing market.

What needs improvement?

With Performance Center, the version upgrade is easy. You just have to roll out the new patch or the new version. It is much easier. I'm not really the right person to say, because I run the environment. We have a specialized team that does development.

For how long have I used the solution?

I’ve been using Performance Center since 2007.

What do I think about the stability of the solution?

Performance Center is more stable than ALM. We roll out a version, and I think it fits our clients' needs. If it is a very early version, then we have to implement a patch. Afterwards, it is quiet, hopefully, for at least one or two years.

What do I think about the scalability of the solution?

For Performance Center, you have to add additional load generators, and then you can do more. I think it is a matter of the price, in terms of how many machines you can buy.

How are customer service and technical support?

For such an experienced team as mine, who have been with the product for over ten years, sometimes working with technical support is not that easy. Support does not have our knowledge. It takes a while to train them in what our issues are and we have to connect to second or third level support.

Which solution did I use previously and why did I switch?

The collaboration between us and HPE, especially over the past ten years, has been very good. This is the most important thing when looking at a vendor. For that reason, I try to bring in more HPE products, if needed.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user331326 - PeerSpot reviewer
Portfolio Testing Infrastructure Manager at a financial services firm with 10,001+ employees
Vendor
Helps us to uncover critical performance-related defects. Almost all the areas improve with each version; however, the correlation of scripts, analysis, and reporting can be further improved.

Valuable Features

  • Integration with the majority of enterprise tools
  • Scripting
  • Reporting
  • Admin Console
  • Reporting & Analysis tool

Improvements to My Organization

This helps us to uncover some very high-severity and critical performance-related defects, and we have kept production issues related to application performance at almost zero since then.

Room for Improvement

Almost all the areas improve drastically with each version; however, the correlation of scripts, analysis and reporting can be further improved. Their technical support could also be improved. Recording of the latest applications is an area for continuous improvement.
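For readers unfamiliar with "correlation of scripts": it means capturing a dynamic server value (such as a session ID) from one response and feeding it into later requests. A minimal hedged sketch, with placeholder boundaries, parameter name, and URLs:

    // Illustrative VuGen correlation sketch: capture and reuse a session ID.
    // The boundaries, parameter name, and URLs are hypothetical.

    Action()
    {
        /* Register the capture BEFORE the request that returns the value. */
        web_reg_save_param("SessionID",
                           "LB=sessionId=",     /* left boundary (assumed)  */
                           "RB=\"",             /* right boundary (assumed) */
                           "NotFound=ERROR",
                           LAST);

        web_url("login",
                "URL=https://example.com/login",
                "Mode=HTML",
                LAST);

        /* Reuse the captured value in a later request via {SessionID}. */
        web_url("account",
                "URL=https://example.com/account?session={SessionID}",
                "Mode=HTML",
                LAST);

        return 0;
    }

VuGen's design-time correlation rules can automate some of this, but manual correlation of this kind is still where much of the scripting effort goes, which is presumably the area the reviewer would like to see improved.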

Use of Solution

I've used it for the last 15 years, and for the last nine years at enterprise level.

Deployment Issues

No issues encountered.

Stability Issues

No issues encountered.

Scalability Issues

No issues encountered.

Customer Service and Technical Support

Mercury support was very good compared to HP's; however, HP is getting better day by day.

Initial Setup

It was straightforward initially with v8.1 and the various FPs for 8.1. However, it was very complex with v10, due to the way our security suites were designed. The design stage took one month, implementation took two months, and we had one month of dedicated support from offshore.

Implementation Team

It was a mix of an in-house team and vendor support. The vendor team is necessary for the initial setup, and upgrades can be done in-house, but major upgrades need vendor support.

Other Solutions Considered

We carried out various PoCs with different market-leading tool sets and chose HP Performance Center because it offers better test suites for our enterprise tools, ease of integration, and more collaboration with our existing tool sets. Also, its fit with the technology and the current and future demands of our various applications was better than the other options, and they offered better support arrangements.

Other Advice

It's generally for the enterprise level; however, they now offer a SaaS version for smaller companies or clients.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
Associate at Tech Mahindra Limited
Real User
User friendly with good reporting and many useful features
Pros and Cons
  • "The solution is a very user-friendly tool, especially when you compare it to a competitor like BlazeMeter."
  • "The solution is a very expensive tool when compared with other tools."

What is most valuable?

The solution is a very user-friendly tool, especially when you compare it to a competitor like BlazeMeter.

The custom meter is nice. It has a lot of features.

When compared with BlazeMeter, I use the plain data. In the cloud after one year it has been very good.

With reporting, we can see results on the main portal very quickly, because LoadRunner has very good analysis tools. You can analyze the data and get the error data as well. You can merge graphs together and dig down into specific points in time. It's great for correlating graphs against the number of users. These functionalities are not there in BlazeMeter.

What needs improvement?

The solution is a very expensive tool when compared with other tools.

The stability in some of the latest versions has not been ideal. They need to work to fix it so that it becomes reliably stable again.

The cloud solution of LoadRunner is not user-friendly when compared to BlazeMeter. They need to improve their cloud offering in order to compete. It also shouldn't be a standalone tool.

For how long have I used the solution?

I've been using the solution for about one year now.

What do I think about the stability of the solution?

In terms of stability, it depends on what you are using. Sometimes version 5.3 and the newer versions are not stable. The latest versions we are finding are not so stable when compared with the previous versions we've used, so some glitches are there. They need to rectify that. It was stable for two years, and now it's not.

What do I think about the scalability of the solution?

The solution is scalable, and it's based on the number of licenses you have. In comparison, with BlazeMeter, I ran thousands of users, because it's very cheap and we could scale up the number of users easily with very little overhead.

In my experience, I've used BlazeMeter to scale up to 5,000 users. With OpenText, I haven't gone beyond 2,000 users.

How are customer service and technical support?

With OpenText, I have worked with various types of clients. Some clients have platinum customer status and some have gold, and support is there for those levels. At the platinum level, technical support is very responsive and the support is good.

Which solution did I use previously and why did I switch?

I also use BlazeMeter.

LoadRunner is a paid tool, and since I am following the protocol, I need it to be easy to use. With BlazeMeter, we use it with JMeter, and we sometimes need to configure some properties before we can use it.

What's my experience with pricing, setup cost, and licensing?

The solution needs to reduce licensing costs. Its main competition, for example, is free to use, so I'm sure it's rather difficult to compete with it on a cost level.

What other advice do I have?

We're partners with OpenText.

I haven't found many products in this particular niche that compare to the JMeter and BlazeMeter tools.

I'd rate the solution eight out of ten.

I suggest other potential users review OpenText. If the client has the budget for the solution, I'd recommend it. If they don't have the budget, I'd suggest they instead look at a freeware solution and evaluate JMeter or BlazeMeter.

Which deployment model are you using for this solution?

On-premises
Disclosure: My company has a business relationship with this vendor other than being a customer: partner
PeerSpot user
cyrusm - PeerSpot reviewer
cyrusm, Product Manager - LoadRunner Professional and Enterprise at a tech vendor with 1,001-5,000 employees
Vendor

Hello and thanks for the review. One of our goals has been to simplify the entire performance testing process from script creation, to execution and analysis. Our mission is to be open. We hope that you get a chance to review our newer releases.

Test Management Architect at an insurance company with 1,001-5,000 employees
Real User
Provides testing at the integration or system level and the data to make testing decisions

What is most valuable?

It provides a different platform for testing in an organized fashion. One of the big things is data warehousing and data analytics: you want to go from being reactive to proactive to predictive. Those are the progressions we want to make. It's going to be extremely difficult when you start to incorporate testing platforms, testing techniques, and tooling into any of these DevOps pipelines. If we can't collect the data, if we don't really know what's going on, then it becomes very hard to make testing decisions, from tooling to technique to platforms.

Performance Center innately provides you the ability to manage those assets. It's also a different type of testing, independent of something that might be more unit-based. We want to be able to test at the integration or the system level, which is a completely different approach compared to a developer who may be doing something very low-level, such as changing a class.

We want to make sure that all these areas of testing are not just being done, but are also able to be audited, because without access to the data it becomes very difficult to implement solutions going forward, whether they're new or up for modernization to keep up with DevOps and pipelining.

What needs improvement?

It has to be fully integrated into pipelines; it needs to be DevOps-friendly. It needs to be easily digestible by management, and certainly by developers. It's a developers' world, as it should be: they're the ones who create the applications and solve the problems in those applications. So it has to be positioned as something that allows a team to make better decisions, to move through the progression I mentioned before, from reactive to proactive to predictive. Once you get to predictive, you can make better decisions about how you should be testing things, and Performance Center will have to follow the same trajectory. It has value, but the value needs to evolve and mature along with other aspects of application development.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user739554 - PeerSpot reviewer
Senior Presales Engineer at a tech company with 51-200 employees
Vendor
Enables testing a huge variety of applications, not just web-based systems but SAP, Oracle, etc.
Pros and Cons
  • "You can test a huge variety of applications, not just web-based systems, but SAP, Oracle, web services, pretty much anything out in the market place, but it's mobile-based testing."
  • "Canned reports are always a challenge and a question with customers because customers want to see sexy reports."

What is most valuable?

High scalability. Web-based testing. The interface. If you're familiar with the days of using LoadRunner, when you had to have the 32-bit client, using a web-based client is fantastic. You can spin it up relatively quickly despite the fact that it's enterprise software. You can test a huge variety of applications: not just web-based systems, but SAP, Oracle, web services, pretty much anything out in the marketplace, as well as mobile-based testing.

How has it helped my organization?

In my current organization, I honestly don't know so much. But in my previous organization, when I was doing consulting, we helped a huge number of customers prepare not to fail under scale, whether they were facing a big user-base-driven event like the Super Bowl, a major sale, or the release of a new product like the Samsung S8 or iPhone 7. Basically, whenever you get a huge push.

What needs improvement?

Canned reports are always a challenge and a question with customers because customers want to see sexy reports. They want to be able to show something to the CIO. So I think the dashboards are one of the features I'd like to see most.

I think it's more about getting into a world where you've got Tableau and dashboarding. The reporting needs to be a little bit fancier, as people expect sexier reporting. They don't expect just to have, "I ran a test. The test ran for this long." I think consumers' expectations of what reporting looks like have changed a lot: you do an Excel report or a Word report and they say, "No, it needs to be a very pretty dashboard."

The product itself, I think it's pretty good. I can't think of anything off the top of my head.

What do I think about the stability of the solution?

It's great. I don't have a problem with stability at all, as long as you have it scaled properly and you have sufficient hardware in place. If you're running it all on a VM, you're going to have a problem, but if you run it with the proper infrastructure, it's a very solid product.

What do I think about the scalability of the solution?

The nature of Performance Center is scalable, so you have the application server and then, when you need to have more generators to generate more load, you spin those up pretty quickly. You can use cloud-based generators as well, so that's a huge plus.

How are customer service and technical support?

It's been a long time since I needed to use tech support. Normally, as a consultant, I am the tech support, so I don't typically have to use tech support. But when I have, I normally am able to get quickly to either R&D-level or a level-two support because it's a real problem with the product, not necessarily just, "I can't figure this out."

Which solution did I use previously and why did I switch?

I help customers with this process all the time. I'm usually advising them on what, why, when, what the feature benefits are.

Unfortunately, as is human nature, customers decide that they need Performance Center because they've had a disaster. Hopefully not a horrible disaster, but they've had some kind of case where they released a product and it didn't scale. They didn't plan for their own success. A classic example is HealthCare.gov. Politics aside, when you've got the entire American population ready to enroll for healthcare and it tanks, it's a very bad experience for everyone. And that's not an uncommon occurrence across the board.

So then they realize, "Oh, well, we better do performance testing," and then they realize they didn't plan for that in the project lifecycle, so now they need to come and talk to Micro Focus about standing that up, or to talk to a partner at Micro Focus about how to do that for them.

There was a reason, for the longest time, that it had one of the largest market shares of any type of solution in the world, and now that Micro Focus has Silk and the LoadRunner/Performance Center product, they've got that market cornered.

How was the initial setup?

I have set up many, many instances of Performance Center. Recently, it's much more straightforward; a long time ago it was very complex. But it's pretty straightforward: you set up the application server, your generators, your controllers, and the database.

What other advice do I have?

When selecting a vendor, I would judge them on the criteria that I hold myself to: they've got to have experience, and they've got to have done the testing on the solutions they've worked on. I think seniority is good too; a few gray hairs don't hurt anything.

Regarding advice to others, invest in training. Invest in mentoring. Invest in experienced people that have done the job before. Don't go into it thinking that you're going to open the box, get it out, and it's going to be perfect. It's a complicated tool for a reason. You don't want someone operating on you who says, "Well, I read a book on brain surgery." It's complicated for a reason.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
Manager Performance Engineering at a retailer with 10,001+ employees
Vendor
We have Performance Center as a platform to share with others that don't do performance testing full-time.
Pros and Cons
  • "We have Performance Center as a platform to share with others that don't do performance testing full-time, so that they in an agile fashion, on demand can go ahead and get real issue-finding testing done."
  • "I know there are integrations with continuous testing. It's got tie-ins to some of the newer tools to allow continuous testing. I'd love to see us not have to customize it, but for it to be out of the box."

What is most valuable?

What I really like is our team's core competence in building good tests that really do find issues, because of our full-time dedication to it. We have Performance Center as a platform to share with others who don't do performance testing full-time, so that they can, in an agile fashion and on demand, get real issue-finding testing done, and then have that pulled into trending reports so that even subtle differences or trends over time are found, not just game-changing defects. Again, it's a platform for getting expert-level work done for the masses.

How has it helped my organization?

On the reporting end, it allows us to show that even though a release doesn't have a smoking gun that made everything so terrible we've got obvious quality issues, we know when a degradation started and that it's only getting worse. When you're tracking many subtle interactions, this is helpful.

What needs improvement?

I know there are integrations with continuous testing. It's got tie-ins to some of the newer tools to allow continuous testing. I'd love to see us not have to customize it, but for it to be out of the box.

I have some concern over its foundation for utilizing cloud testing hosts in the most integrated fashion. For example, in AWS it relies on using the default VPC, and there is also not deep knowledge about utilizing *nix hosts, though they are supported.

For how long have I used the solution?

I have used this solution at four different places starting 13 years ago.

What do I think about the stability of the solution?

It's good. It's been around a long time and we've been using it a long time. It's stable.

What do I think about the scalability of the solution?

We're up to 60,000 users. It's got a good system for being able to take a vast amount of data that you haven't put into a particular report and chug through it. It could take a while, but it's stable at that.

How are customer service and technical support?

It comes up periodically, typically when we're doing something we haven't done before. We actually have a combination of support through them and one of their value-added resellers, AVNET. We get level-one support through the reseller, so it's a partnered support arrangement.

Typically AVNET can handle anything unless it's truly about requesting a new feature or enhancement. You need to get back to the product management and developers to request such things.

How was the initial setup?

It has many tiers; it's not a single-system thing. You definitely have to take the time to architect it correctly, to have a full topology. I've done it a few times.

What other advice do I have?

As professionals, we're supposed to be somewhat tool-agnostic; we'll find a way to get it done. That said, it's a mature player in the space, and we do enjoy some long-time knowledge about squeezing the good stuff out of it.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user