QA Manager at a financial services firm with 501-1,000 employees
Vendor
Mar 27, 2014
My greatest success was with a horizontally scaled Oracle environment; at the time it was cutting edge. We had a 20% test environment: the same servers as Production, just 1/5th the capacity.
The target was 25,000 users in Production, so we aimed for 5,000 in Test. We successfully executed with the required data volumes for 25,000 users, performed a database restore, cleansed the data, and hired a full-size environment from Sun Microsystems.
On the first test at 25,000 users we were within 2% of our estimates. The results then fed into HyPerformix for a Capacity Plan, and the client has used my model as a working methodology for large-scale Performance Testing ever since; this was back in 2005.
70% of that success was down to the pre-test planning.
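As a rough sketch of the scale-model arithmetic described above, assuming roughly linear scaling across the horizontally scaled tier (the 1/5th ratio and the 2% tolerance come from the review; every other figure below is invented):

```python
# Illustrative sketch of scale-model capacity extrapolation, assuming
# roughly linear scaling across a horizontally scaled tier. The 1/5th
# ratio and 2% tolerance come from the review; all other numbers are
# hypothetical.

SCALE_FACTOR = 5          # Production is 5x the 20% test environment
TEST_USERS = 5_000        # concurrent users driven in Test
TARGET_USERS = 25_000     # concurrent users expected in Production

# Hypothetical measurements from the 20% environment.
test_tps = 420.0          # transactions/sec sustained at 5,000 users
test_cpu_pct = 62.0       # average CPU utilisation across the tier

# Naive linear projection: same per-server load, 5x the servers.
projected_tps = test_tps * SCALE_FACTOR
projected_cpu_pct = test_cpu_pct        # per-server load stays constant

# Validate the model against a full-size run (hypothetical number).
measured_tps = 2_070.0
error = abs(measured_tps - projected_tps) / projected_tps
print(f"Projected {projected_tps:.0f} tps, measured {measured_tps:.0f} tps "
      f"({error:.1%} error)")  # the review reports staying inside 2%
```

The extrapolation is only trustworthy when the test environment mirrors Production's per-server configuration, which is exactly why "same servers, just 1/5th the capacity" mattered.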
Manager IT, Scalability & Performance at a retailer with 10,001+ employees
Real User
Mar 27, 2014
My greatest success has been optimizing LoadRunner with a risk-based approach while incorporating architecture review and performance tuning into a single service. My greatest failure was misconfiguring the Run-time Settings, which gave us a false picture of performance. I also greatly prefer the major commercial testing tools for their immediate analysis and ease of configuration.
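To illustrate how a Run-time Settings mistake can produce that false picture: whether recorded think time is replayed changes the effective load generated by the same virtual-user count. A back-of-the-envelope sketch using Little's Law, with all figures invented:

```python
# Back-of-the-envelope sketch (Little's Law) of how ignoring recorded
# think time in a load tool's run-time settings inflates the effective
# request rate for the same virtual-user count. All numbers hypothetical.

vusers = 500              # concurrent virtual users
response_time_s = 0.8     # average server response time per iteration
think_time_s = 7.0        # recorded user think time per iteration

def iterations_per_sec(vusers, resp_s, think_s):
    """Steady-state iteration rate: users / time per iteration."""
    return vusers / (resp_s + think_s)

with_think = iterations_per_sec(vusers, response_time_s, think_time_s)
without_think = iterations_per_sec(vusers, response_time_s, 0.0)

print(f"Think time replayed:  {with_think:6.1f} iterations/sec")
print(f"Think time ignored:   {without_think:6.1f} iterations/sec")
print(f"Load inflated by {without_think / with_think:.1f}x "
      "for the 'same' user count")
```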
Success in using a testing tool comes down to a combination of human involvement (not just intervention) to define the right test strategy, a good deal of analysis of the product's features, and a product that is at least 60% stable before testing begins.
I have experience using QTP and its first cousin UTP to define a keyword-driven test harness for an ad campaign product (QTP, then a Mercury product, didn't yet have this feature). Testing through to deployment was completed successfully after a Six Sigma project established the stability and suitability of the product: a complex UK ad campaign application of about 3,500 web pages.
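For readers unfamiliar with the technique, a keyword-driven harness maps plain-language keywords in tabular test data to executable actions, so tests can be authored without scripting. A minimal illustrative sketch in Python (the original harness was built with QTP/UTP; the keywords and steps below are invented):

```python
# Minimal keyword-driven test harness sketch. Keywords in tabular test
# data map to executable actions, so non-programmers can author tests.
# The keywords and actions here are invented for illustration.

def open_page(url):
    print(f"opening {url}")

def enter_text(field, value):
    print(f"typing '{value}' into {field}")

def verify_title(expected):
    print(f"asserting page title == '{expected}'")

# Keyword table: each row is (keyword, *arguments), typically loaded
# from a spreadsheet rather than hard-coded.
KEYWORDS = {
    "OpenPage": open_page,
    "EnterText": enter_text,
    "VerifyTitle": verify_title,
}

test_steps = [
    ("OpenPage", "https://example.test/campaign/new"),
    ("EnterText", "campaign_name", "Spring Sale"),
    ("VerifyTitle", "New Campaign"),
]

for keyword, *args in test_steps:
    KEYWORDS[keyword](*args)   # dispatch each keyword to its action
```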
IT Administrator at a tech services company with 51-200 employees
Consultant
Top 20
Mar 26, 2014
My greatest success was using LoadRunner to load and stress test company-critical software, testing HTTP(S)/HTML applications with J2EE deep diagnostics monitoring as well as SAP application chains. It really helped iron out the major performance wrinkles and memory leaks before pushing to production environments. My greatest failure was using open source tools like JMeter and OpenSTA to achieve the same. It required too much time to parameterize user interaction and associate performance metrics. The maintenance and startup effort of these open source tools has come down over the past couple of years, but I'd still prefer a major commercial testing tool over open source any time.
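To make the parameterization point concrete: data-driving a recorded script means replacing captured literal values with values pulled per virtual user from a data file, which the commercial tools largely automate. A hedged sketch of the idea, with invented endpoint, field names, and data:

```python
# Sketch of data-driven parameterization: each simulated user replays
# the same request with its own credentials pulled from a CSV, instead
# of the literal values captured at record time. Endpoint, field names,
# and the CSV contents are all hypothetical.

import csv
import io

# Stand-in for users.csv, one row per virtual user.
USERS_CSV = """username,password
alice,pw-alice
bob,pw-bob
carol,pw-carol
"""

def login_request(username, password):
    # In a real tool this would be an HTTP POST; here we just format it.
    return f"POST /login user={username} pass={password}"

for row in csv.DictReader(io.StringIO(USERS_CSV)):
    print(login_request(row["username"], row["password"]))
```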