Anuj-Kataria - PeerSpot reviewer
QA Manager at Next Solutions
MSP
Top 5 Leaderboard
A very manageable tool that offers users a great UI and UX
Pros and Cons
  • "UI and UX are pretty easy to understand without much of a problem."
  • "Tricentis qTest's technical support team needs to improve its ability to respond to queries from users."

What is our primary use case?

On a day-to-day basis, my company works on an in-sprint Agile model. Whatever testing needs to be done in a sprint is managed in Tricentis qTest, and those test cases are linked to particular stories in Jira, so an integration with Jira also exists.

How has it helped my organization?

Tricentis qTest helps the employees in our company since it provides the functionalities of a test case management tool. Initially, instead of Tricentis qTest, we used to write test cases in Excel. With Tricentis qTest, our work is much more visible to various stakeholders, and the executions we carry out are much more manageable. We execute test cases and mark them as pass or fail straight away in Tricentis qTest, which then provides us with an overall report indicating things like the pass percentage. Regarding one-to-one mapping, when we automate test cases, Tricentis qTest is a very effective tool since we don't need to go to Excel sheets to look for them and make markings. One-to-one mapping with an automated test suite is very effective in Tricentis qTest.
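As a rough illustration of that pass/fail marking, a script driving an automated suite could record a result against a qTest test run over its REST API. This is only a sketch: the base URL, project and run IDs, and field names below are assumptions for illustration, not a verified qTest contract.

```python
import json
from datetime import datetime, timezone

# Hypothetical qTest REST base URL; the host, project ID, and run ID
# used in the commented-out POST below are placeholders.
QTEST_BASE = "https://yourcompany.qtestnet.com/api/v3"

def build_test_log(run_status: str, note: str = "") -> dict:
    """Build the JSON body for a qTest test-log entry (field names assumed)."""
    now = datetime.now(timezone.utc).isoformat()
    return {
        "status": {"name": run_status},   # e.g. "Passed" or "Failed"
        "exe_start_date": now,
        "exe_end_date": now,
        "note": note,
    }

payload = build_test_log("Passed", note="Automated run, build #123")
print(json.dumps(payload["status"]))

# Posting would look roughly like this (requires an API token; not executed here):
# requests.post(f"{QTEST_BASE}/projects/42/test-runs/1001/test-logs",
#               headers={"Authorization": f"Bearer {token}"}, json=payload)
```

With a helper like this wired into a test runner's teardown, the "one-to-one mapping" the reviewer praises would update itself instead of requiring manual marking.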

What needs improvement?

Our company still hasn't found a way to mark results in Tricentis qTest using APIs when our automation scripts run; if that capability were introduced in the future, it would probably improve the tool. It exists in other tools like Azure DevOps, Zephyr, or Xray, but not in Tricentis qTest.

In the future, the ability to trigger an automated test suite and have the execution of automated tests marked in Tricentis qTest would be a great addition to the tool.
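If such triggering and marking were available, a CI job might collect an entire suite's outcomes and translate them into qTest-style statuses for one batch upload. A minimal sketch, assuming hypothetical status names and a hypothetical batch endpoint:

```python
# Sketch: collect automated-suite outcomes and map them to qTest-style
# statuses before a single batch upload. The status names and the idea
# of a batch endpoint are assumptions for illustration.

SUITE_RESULTS = {          # test case ID -> raw outcome from the runner
    "TC-101": "passed",
    "TC-102": "failed",
    "TC-103": "skipped",
}

STATUS_MAP = {"passed": "Passed", "failed": "Failed", "skipped": "Incomplete"}

def to_qtest_logs(results: dict) -> list:
    """Translate raw runner outcomes into a list of log entries,
    sorted by test case ID for a stable upload order."""
    return [
        {"test_case": tc_id, "status": {"name": STATUS_MAP[outcome]}}
        for tc_id, outcome in sorted(results.items())
    ]

logs = to_qtest_logs(SUITE_RESULTS)
print(len(logs))  # entries ready for a hypothetical batch POST
```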

Tricentis qTest's technical support team needs to improve its ability to respond to queries from users.

For how long have I used the solution?

I have been using Tricentis qTest for six to eight months. I am a user of Tricentis qTest.

Buyer's Guide
Tricentis qTest
October 2024
Learn what your peers think about Tricentis qTest. Get advice and tips from experienced pros sharing their opinions. Updated: October 2024.
816,406 professionals have used our research since 2012.

What do I think about the stability of the solution?

Stability-wise, I rate the solution a seven out of ten.

What do I think about the scalability of the solution?

Scalability-wise, I rate the solution a six or seven out of ten.

More than 15 people in my company use the tool.

My company plans to increase the usage of the tool in the future. My company is in the process of expanding it to other LOBs as well.

How are customer service and support?

Responses from Tricentis qTest's technical support team weren't frequent whenever our company needed help. When we were struggling with the setup phase of the solution, during which we wanted to integrate qTest with Jira, we tried to reach out to Tricentis qTest's technical support team, but we didn't receive any substantial response from their end. I rate the technical support a six out of ten.

How would you rate customer service and support?

Neutral

How was the initial setup?

I rate the initial setup phase of the product a five on a scale from one to ten, where one is difficult and ten is easy, since in our company, we did see some hiccups in the beginning.

Considering that my company wanted Tricentis qTest to be integrated with Jira, the initial setup phase took around a week or two to complete.

The solution is deployed on the cloud.

During the initial setup phase, some users need to be created, and then there are different access levels to configure once you buy the product. There is a two-way linkage: in Jira, there is a qTest extension that you need to install, which can be done very easily. But the two-way linkage from qTest to Jira was something we had to figure out in the settings, a process that only an admin can handle and no other users can do. These steps need to be followed before you can start using Tricentis qTest.

One person having full admin access should be enough for deploying the tool.

What's my experience with pricing, setup cost, and licensing?

Based on whatever I heard, I can say that Tricentis qTest is a little costlier than other test management tools, like Jira, Zephyr, or Xray.

What other advice do I have?

I would recommend qTest to others since its UI and UX are pretty easy to understand without much of a problem. With Tricentis qTest, folder structures are easy to set up, but the only area of concern comes into the picture when you try to integrate it with your other project management tools. Integration is an area you need to look at to ensure that it allows for easy integrations. When trying to integrate Tricentis qTest with Jira, my company faced a lot of problems. If I compare it with other test management tools like Zephyr, which is built into Jira, you just need to enable it, and it is linked. With other project management tools, like Jira or maybe Azure DevOps, you will have to figure out how the integration with Tricentis qTest can be made possible. Linkage with other project management tools, other automated tools, or other automation suites you have developed is an area you need to take care of when it comes to Tricentis qTest.

I rate the overall tool an eight out of ten.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
reviewer1215417 - PeerSpot reviewer
Senior Director of Quality Engineering at a tech vendor with 1,001-5,000 employees
Real User
Gives us more efficiencies and overall improvement in transparency and visibility of the testing progress
Pros and Cons
  • "The main thing that really stuck out when we started using this tool, is the linkability of qTest to JIRA, and the traceability of tying JIRA requirement and defects directly with qTest. So when you're executing test cases, if you go to fail it, it automatically links and opens up a JIRA window. You're able to actually write up a ticket and it automatically ties it to the test case itself."
  • "The Insights reporting engine has a good test-metrics tracking dashboard. The overall intent is good... But the execution is a little bit limited... the results are not consistent. The basic premise and functionality work fine... It is a little clunky with some of the advanced metrics. Some of the colorings are a little unique."

What is our primary use case?

The primary use case is to manage the overall testing process and our test cases, as far as their design, creation, review, and archiving go. We use it to manage their overall status.

We are cloud users, so we've got the latest and greatest version. They transparently push updates to us.

How has it helped my organization?

The solution’s reporting enables test team members to research errors from the run results. We have some metrics and dashboards set up which allow the testers themselves to get good visibility into where things are at, and which allow others to see "pass," "failed," and "blocked."

qTest has been very useful for us. It's helped in productivity. It's helped in automating a lot due to the seamless integration with JIRA. It has taken us to the next level, in a very positive way, in the management of our overall test cases. It has been outstanding.

In comparison to managing test cases in spreadsheets or other tools we've used in the past, qTest is saving us a couple of hours a day.

Investing in Insights to have one location for a dashboard of all reports and metrics has allowed us to minimize the number of reports or URLs which other stakeholders have had to go to in order to get status on the testing. There has definitely been an improvement there.

Use of the solution also provides our team with clear demarcations for which steps live in JIRA and which steps live in qTest. Test cases and tickets are assigned to test plans, etc. through the tools within qTest and they are all linked back.

What is most valuable?

The main thing that really stuck out when we started using this tool is the linkability of qTest to JIRA, and the traceability of tying JIRA requirement and defects directly with qTest. So when you're executing test cases, if you go to fail it, it automatically links and opens up a JIRA window. You're able to actually write up a ticket and it automatically ties it to the test case itself.

It has seamless integration with other key defect-tracking or ticket-tracking tools, with overall good API integrations.
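As a hedged illustration of that fail-to-ticket flow, a script could compose a Jira create-issue body that points back at the failing qTest run. The payload shape follows Jira's REST create-issue format, but the project key, URL, and field values here are illustrative assumptions, not this team's actual configuration:

```python
def build_jira_defect(test_case: str, run_url: str, summary: str) -> dict:
    """Compose a Jira issue body for a failed test. The structure mirrors
    Jira's REST create-issue format; the project key and issue type are
    placeholders."""
    return {
        "fields": {
            "project": {"key": "QA"},
            "issuetype": {"name": "Bug"},
            "summary": summary,
            "description": f"Automated failure in {test_case}.\nqTest run: {run_url}",
        }
    }

issue = build_jira_defect(
    "TC-102",
    "https://yourcompany.qtestnet.com/p/42/portal/project#tab=testexecution",
    "Login form rejects valid credentials",
)
print(issue["fields"]["summary"])
```

The qTest UI does this linking automatically when you fail a test case; the sketch just shows the kind of payload that flows between the two tools.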

What needs improvement?

The Insights reporting engine has a good test-metrics tracking dashboard. The overall intent is good, compared to other test tracking or test management tools. But the execution is a little bit limited. The overall solution is good, but the results are not consistent. The basic premise and functionality work fine. When you try to extend the use of it a little bit more, it struggles. It is a little clunky with some of the advanced metrics. Some of the colorings are a little unique. They are currently working on a new flavor for Insights.

We do have dashboards and links set up for executive-level access. Overall, the numbers are accurate, based on what we're putting into it, but where we lose integrity, or where we lose the overall perception of things, is when the colors start changing or when red is used to mean good. That's when executives lose respect for it. We've used it as a dashboard during key deployments. And then, as progress is being made and the reports are being updated, colors start to change, and that distracts from the overall intent of reporting progress.

We chose to leverage Insights so that we didn't have to manually create charts via either a Google Sheet or Excel, since we don't have the resources, time, or bandwidth to do that. That is what excited us about Insights. But then, it just didn't meet our expectations.

We have voiced our concerns to Tricentis and they definitely have empathy. We talk about it and they keep us updated. With an acquisition they're going to leverage their analytics tool. We are excited about that, once it launches. 

We have also discussed with our account manager a couple of possible enhancements here and there, but nothing that's critical or major. One example is when you're trying to link test cases to requirements; a lot of the time there is duplication between the two. Sometimes you want to tie some of the same test cases to the same requirements. An enhancement would be a quick way to copy that over directly without having to manually link every single one again. We have some instances where a large chunk of test cases are tied, re-used, and similar. When you get upwards of 15 or 20, to limit some of the tediousness of doing them all manually, if you could take a copy of the links from one and switch them over to another, that would be helpful. It's not a major concern. It would just be nice as a quick way to do it.

Another example is that with the charts — and again, great intention — you can put in a date range and apply it. Then you get to another screen and come back. After updating several charts, the date range is gone again. You have to go back in and it's sometimes two to three times before that date range is saved. 

For how long have I used the solution?

It's just about a year since we procured licenses. We've been using it for about 11 months.

What do I think about the stability of the solution?

Stability with qTest is not an issue at all. We've had no downtime and no complaints, along those lines, with anything at all. qTest, by all means, is definitely one of the top test management tools out there.

What do I think about the scalability of the solution?

We're not a big shop so for our situation it's fine. We haven't seen any bandwidth issues with running in the cloud. People are accessing this tool across the globe and we've had no complaints or issues.

We don't plan on rolling it out further until we see the analytics portion of it. Our plan is that we will pick back up again at the start of the calendar year, once we see, at the end of this year, what analytics has to offer and once we get that working. Then we'll go back to the drawing board on how we can use it and then we'll roll it out and provide training.

How are customer service and technical support?

They have been doing okay in terms of the suggestions we make. It depends on the severity of what occurred and what changes are needed. But they're responsive. We get communications from the support team pretty regularly, and our account manager is pretty good about following up on things.

For the most part, first-tier support has to ask some basic questions, but they're pretty good. There is room for improvement on communication response time from first-tier support. What we do is we wind up copying our account manager on tech support requests so she can assist in following up a little bit quicker. Ideally, we shouldn't have to do that, but we have learned to do that and it does make it a lot faster.

Which solution did I use previously and why did I switch?

We worked with a customized plugin within JIRA, not even a basic, off-the-shelf version. It was an in-house created module that was built to integrate. They couldn't afford to buy a plug-in, so they made one. That was why we started looking for a new solution. It was horrible. I would have preferred Excel.

How was the initial setup?

Because we have used tools like this in the past, we knew what we were getting into and we hit the ground running. So the initial setup was pretty straightforward. Compared to vendors we've worked with in the past, they've been extremely responsive, especially on the client success side of things. We've had that type of support and they have made sure that our needs are met. They have set us up with training and the like and that has been a really good experience.

Our deployment of the solution took a couple of months. Our complexity was that the test cases were being managed as tickets within JIRA and not necessarily using a test management plugin. The conversion of the test cases, and ensuring they were being transferred and translated into a single entity of the test case, was quite a big project.

What we were using before was a JIRA plugin. Given the way it was designed, we had to extract everything into Excel and then import it. That part of the tool works phenomenally. It's just that we had well over 20,000 test cases to deal with. We wanted to make sure we organized them into libraries. So it took a bit of time to get everything instated in proper order, to make sure that we didn't just dump everything in there.
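That extract-then-import step can be scripted. A minimal sketch using Python's csv module to reshape an exported ticket list into a flat import file, tagging each case with a target library so cases land organized rather than dumped in one place; the column names here are assumptions based on the description, not qTest's actual import template:

```python
import csv
import io

# Simulated export of JIRA-managed test "tickets" (columns assumed).
EXPORTED = """ticket_id,title,steps
JIRA-1,Login works,"1. Open page; 2. Enter creds; 3. Submit"
JIRA-2,Logout works,"1. Click logout; 2. Verify redirect"
"""

def reshape_for_import(raw: str, library: str) -> list:
    """Read exported rows and tag each with a target library folder."""
    rows = list(csv.DictReader(io.StringIO(raw)))
    return [
        {"Library": library, "Name": r["title"], "Steps": r["steps"]}
        for r in rows
    ]

cases = reshape_for_import(EXPORTED, library="Regression/Auth")
print(len(cases), cases[0]["Library"])
```

At 20,000-plus test cases, running a pass like this per library is what turns a bulk dump into the organized structure the reviewer describes.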

We had one person doing the initial deployment. On Tricentis' side, there were two people involved in training us as well as our client support person. At this point, there are just two of us who are managing the tool. We tag team, but being that I am the senior director of the organization, I've tried to become the subject matter expert. I didn't really have anybody to delegate it to. That's why it's been a challenge that Insights is not behaving for us.

We've got 50-some licenses, but we probably see a peak of no more than 15 to 20 concurrent users. We're a medium-size company with about 1,300 employees. Mostly it's quality engineers who are using it. Developers have access to help with test cases. We're trying to get scrum masters in there to use Insights, but with the challenges we've had with it, we've backed off the roll-out of that.

qTest is being used quite extensively. But there are just two of us who mostly use Insights. It's good in its ability to correlate all of the results coming from a double-digit number of scrum teams across the globe. We can see the status of that testing.

For our team, the adoption of the solution has been fantastic. It has been well-received. You couldn't ask for a more straightforward, user-friendly, easy-to-use tool on the qTest side, from a user perspective.

What was our ROI?

We have absolutely seen ROI. We didn't have good visibility and transparency.

Don't get me wrong about Insights. For basic "not run," "pass/fail"-type metrics it is fine. It gives us much more visibility than we had in the past in terms of the ability to collaborate on the design, review, tracking, and archiving of the test cases, and the basic results of some of the sprints.

What's my experience with pricing, setup cost, and licensing?

We're paying a little over $1,000 for a concurrent license. One of the solutions we looked at was about half of that but that one is very much a bare-bones test management tool.

There are no additional costs. We pay a flat yearly rate for each license.

Which other solutions did I evaluate?

We looked into SmartBear and Zephyr, and not that we would purchase Quality Center, but it was used as a benchmark.

The main reason for going with qTest was not only that their test management application is more feature-rich and a good solution compared to others, but the ability to create a dashboard and report on a ton of metrics. We could have saved a lot of money, but I pushed hard for paying a premium to get the Insights dashboard.

What other advice do I have?

The biggest lesson I've learned from using the solution, because of the Insights challenge, is that I would probably do more of a formal trial. They are aware there are issues with it, and they are going to work on it.

Absolutely use it for its test management capabilities, without a doubt, but have an alternative solution for your reporting metrics.

Your testing using the tool is not going to change the result of the testing. It's just that the means are more efficient. Our testing scope has been the same and our processes have all been the same. But we're implementing a tool that's a little more organized. We're not really going to become better testers just because we're tracking things a little bit differently. It gives us more efficiencies and an overall improvement in the transparency and visibility of testing progress and its status. qTest has been pretty rock-solid.

Which deployment model are you using for this solution?

Private Cloud
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
PeerSpot user
Senior Architect at a manufacturing company with 1,001-5,000 employees
Real User
Helps us to quickly come up with a test plan, but overall, it's not as intuitive to use as it could be
Pros and Cons
  • "The most valuable feature is reusing test cases. We can put in a set of test cases for an application and, every time we deploy it, we are able to rerun those tests very easily. It saves us time and improves quality as well."
  • "You can add what I believe are called suites and modules. I opened a ticket on this as to what's the difference. And it seems there's very little difference. In some places, the documentation says there's no difference. You just use them to organize how you want. But they're not quite the same because there are some options you can do under one and not the other. That gets confusing. But since they are very close to the same, people use them differently and that creates a lack of consistency."

What is our primary use case?

We use it for QA software that we build. 

How has it helped my organization?

It boosts productivity because we're able to quickly come up with a test plan, as opposed to doing it from scratch each time or from something homegrown.

What is most valuable?

The most valuable feature is reusing test cases. We can put in a set of test cases for an application and, every time we deploy it, we are able to rerun those tests very easily. It saves us time and improves quality as well.

It also helps us to identify defects before we get them into production. And, overall, it has increased testing efficiency by 30 percent in terms of time.

What needs improvement?

The information that qTest provides to executives could be better. If there are tests that have a lot of steps in them, people will go through and do seven out of eight steps, but it doesn't show the test is complete. So from a metrics perspective, what executives normally see is that it looks like nothing was done, even though they did seven out of the eight steps.

In addition, you can add what I believe are called suites and modules. I opened a ticket on this as to what's the difference. And it seems there's very little difference. In some places, the documentation says there's no difference. You just use them to organize how you want. But they're not quite the same because there are some options you can do under one and not the other. That gets confusing. But since they are very close to the same, people use them differently and that creates a lack of consistency. My preference would be that qTest establish the way they do it and everybody has to do it that way, so everything is done the same way.

In response to my ticket, they said that they are the same and that you can choose whichever one to best organize how you want to organize. But the problem is that everybody in the organization makes a different choice. And they sent me a link to the documentation. Some of the documentation does say that there are some differences. There was one thing, like importing tests or something, that we could do under one but not under the other. That really made it a mess. That's the only really big concern I have had.

For how long have I used the solution?

We've been using qTest for between six months and a year.

What do I think about the stability of the solution?

It seems very stable.

What do I think about the scalability of the solution?

Scalability gets to be a little bit of a mess. I've never seen a performance issue but, as we continue to add projects, especially if somebody has access to a lot of the projects or is an administrator who has all the projects, it feels a little bit unorganized. There's too much stuff. When I create projects, for example, they're in my dropdown forever, as far as I know. That just creates a huge list of projects. I would like, when a project is done, to get it out of my face.

How are customer service and technical support?

Tech support did answer promptly. My issue is not the fault of the tech support. The tech support did fine. The issue I described above is the only time I've contacted them.

Which solution did I use previously and why did I switch?

In this organization, Tricentis was the first. In my last job we used Micro Focus Quality Center. Both it and qTest are a pain. They're pretty similar.

How was the initial setup?

The initial setup is a little bit wonky. What you need to do to get the job done is not intuitive. It takes more time to train people than if it were a little bit simpler.

Getting all the products set up and getting all the testers assigned took a while.

The adoption of qTest in our organization has been average. People aren't against it. They comply. But again, because we don't have a formal QA team, it's our biggest option. When we ask people on the business side to use it, they are pretty good about using it, as long as we show them how to.

What was our ROI?

It does what it's supposed to do. I don't know what the organization paid for it, but it is getting the job done that it's supposed to get done.

What other advice do I have?

I would recommend planning how you're going to organize using it and have everybody organized the same way as they use it. A lot of times you see this in software: They build in flexibility thinking they're doing you a favor because they're making it flexible and thinking you can use it the way you want. But if you have ten users, those ten users each use it ten different ways. If there's no flexibility at all, the ten users use it the same way. To me, that's almost better. Even if it's not exactly how we want, at least it's the same. Uniformity, over being able to choose exactly how I use it, would be my preference.

The biggest lesson I've learned from using qTest is that we need dedicated QA people. What will happen is something like the following. I have a developer, Amos, who, thinking he's doing the right thing, goes in and loads up 20 tests and then he gives that to the business to test. And they think, "Hey, the expectation is that I do exactly what this thing says." The problem is we only then test it from the perspective of the developer. We're not actually getting the business to think about what they should look at or, better yet, developing a dedicated QA team which knows to look for defects. It's a myopic perspective on testing. And because of that, we do not find as many defects as we otherwise would. That is not a qTest issue, though. If we had a dedicated testing team using qTest, that would be ideal.

We have not seen a decrease in critical defects and releases since we started using it but I wouldn't blame qTest for that. It's more that we do not have a dedicated QA team. My management team seems to think that qTest is a substitute for a dedicated QA team and we have the developers and the business desk use it to test. But developers and business are not as good at finding defects as a dedicated QA team is.

In terms of maintenance and for administration of the solution, we don't have anybody dedicated to those tasks. People do the maintenance needed to get done whatever they need done. It's mostly me who creates projects, adds users, etc.

We have 56 users, who are primarily developers and on the business side.

Overall, it gets the job done, but it's a struggle to do it. It's not as intuitive to use as it could be.

Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
PeerSpot user
reviewer1229907 - PeerSpot reviewer
Division Chief with 10,001+ employees
Real User
Puts my entire team in a central testing system and results are automatically sent to tickets in JIRA
Pros and Cons
  • "The JIRA integration is really important to us because it allows our business analysts to see test results inside the JIRA ticket and that we have met the definition of "done," and have made sure we tested to the requirements of the story."
  • "The installation of the software could be streamlined. We pay for the on-premise support and they help us a lot, but the installation is something which is very command-line oriented."

What is our primary use case?

We use it for centralized management of our test cases and test requirements. 

We get pretty specific with it. We have a lot of scenarios and some pretty complicated business logic we have to test.

How has it helped my organization?

What has improved is that I've got the whole team now actively in one, central system, developing their test cases and recording the results. The results are automatically captured and sent over to the tickets in JIRA to show that the work has been completed and everything has passed.

I wouldn't say it solves issues, but it definitely gives me a quick way to say, "Yeah, we did test that." So when issues are presented, I can quickly go in and look, as the manager of the group, and say, "Yeah we did test that and it passed."

We have also seen a reduction in critical defects, by half now, over the last three months. And overall, the solution has increased testing efficiency by at least 50 percent.

What is most valuable?

The JIRA integration is really important to us because it allows our business analysts to see test results inside the JIRA ticket and that we have met the definition of "done," and have made sure we tested to the requirements of the story.

The integration with JIRA works great. We had to get support involved to help increase the number of connections because we had a lot of tickets, but it's very seamless. Once you have it set up it works really well.

What needs improvement?

The installation of the software could be streamlined. We pay for the on-premise support and they help us a lot, but the installation is something which is very command-line oriented. I don't know if there's a way to get the installation process to be less command-line oriented, but I would like to see that happen.

For how long have I used the solution?

We've been using Tricentis for six months.

What do I think about the stability of the solution?

The stability has been great. It's very stable and we have had no issues.

What do I think about the scalability of the solution?

We're such a small group, we only have 19 user licenses so we don't ever really push the limits on it. The users are all testers and we're using it daily.

I recently learned that my business analysts and some of the program managers need access to it, so I have to look into getting more licensing. At the time, we thought it was strictly for testers. We didn't realize everybody could use it and benefit from it.

I'm the configuration manager and the test branch chief and I support my application management division chief who has the developers and the business analysts. They're the ones who put it all in motion. We have about 60 people who do the whole application development and software development lifecycle.

How are customer service and technical support?

Technical support has been awesome. There is no doubt in my mind that it's been very good.

Which solution did I use previously and why did I switch?

We were using Rational ClearCase before and we switched because it's antiquated and outdated.

We did Excel spreadsheets and saved stuff on network drives. We used Unified Functional Testing for our automated tests, but it was all very manual. This is way better than what we were doing before.

How was the initial setup?

Getting the software installed was pretty complex. It was all command line. They removed MongoDB in 9.6, which was good.

But once it was installed, the configuration of getting the users in there was pretty straightforward and the integration for JIRA was straightforward. Setting up the projects was straightforward. I had to fish around a little bit to understand the different ways I could set up project admins and the working groups. And I'm working on who are the people who will have read-only access.

We spent about a month on the deployment.

As part of the implementation strategy, I got my strong, federal staff involved to look at it and understand how we would use the integration with our planning, and how we would expect people to use it. We considered what kind of training we would put together for them. We had the training from Tricentis and we went through that, but then we had our own in-house training, based on our business practices, to try to show how we use the tool.

Adoption of the tool is really easy. The tool is intuitive enough. It was easy to get people into it.

What about the implementation team?

It was all internal. In the government, we don't have a lot of money.

What's my experience with pricing, setup cost, and licensing?

We're paying $19,000 a year right now for qTest, with 19 licenses. All the on-premise support is bundled into that.

Which other solutions did I evaluate?

We were looking at Tricentis Tosca, when qTest was still QASymphony, before it became Tricentis qTest. Those were really the two we were looking at. We weren't looking to replace our UFT solution because, to me, it's something we already purchased. But we were looking to expand, for our testers; looking for easier ways for them to get their work done. Not all of them are coders.

QASymphony met the immediate needs, which were that we needed a central place to manage all our test cases. We needed to get away from the Excel spreadsheets. We needed to get away from storing stuff on network drives. This solved a long-time problem of stuff being scattered everywhere.

The con with qTest is that it relies on plugins to run UFT automation or to create Selenium scripts for us. But Tricentis Tosca is the one that I am looking to download a demo of. I've got some links here to evaluate how it would help my testers, who are really good at manual testing, to start creating automated tests using that software suite. And, ultimately, how do we integrate it with qTest?

What other advice do I have?

qTest is something that the whole software development team can utilize. It's not just for testers. You would probably get your main licenses for the testers, but for the rest of the team, who are in and out of the tool throughout the day, you can get a set of concurrent licenses for them.

The biggest thing I've learned from using this solution is that we should have done it sooner.

I've used Insights a little bit to help me with managing my people and it looks pretty cool, but that's about as far as I've used it. And in terms of the solution's reporting enabling test team members to research errors from the run results, we haven't gotten quite that far yet. As we get more information in there, that will be the next step for us to start looking at, so that they can start researching errors. We're really working on the quality to where, hopefully, we're releasing good quality and no production issues.

We've got the UFT automation set up on the server. We just haven't finished putting in the scripts and seeing how to use the test execution part of qTest to run that, and how the results are put into qTest. That's our next step.

Right now, I can say it's an eight out of ten, and that's just because we haven't made it through all the features of the product yet. But we are very happy with what we see so far.

Which deployment model are you using for this solution?

On-premises
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
PeerSpot user
reviewer1219371 - PeerSpot reviewer
Manager, IT Quality Assurance (EDM/ITSRC/Infrastructure) at a financial services firm with 1,001-5,000 employees
Real User
Integration with JIRA makes all test cases available to anybody in the company with JIRA access
Pros and Cons
  • "The solution's real-time integration with JIRA is seamless."
  • "qTest offers a baseline feature where you can only base sort-order for a specific story or requirement on two fields. However, our company has so many criteria and has so many verticals that this baseline feature is not sufficient. We would want another field to be available in the sort order."

What is our primary use case?

qTest is our test case management tool.

How has it helped my organization?

Our company's workflow starts in JIRA. We create epics, stories, bugs, etc. All of those things are integrated within qTest. There was a disconnect before, with the testers working in Quality Center, while developers and business analysts were working in JIRA. qTest has eliminated that piece, because there is a specific JIRA integration. All the test cases are available in the links section within JIRA, so they're visible for anybody in the company who has access to JIRA. They can pick up the item, the cause-of-issue type, and look at a story or bug and see what level of QA testing has been done and whether its status is pass/fail. All of the test statuses are available in the story itself, so there is one place to view things.

We also use that information for release management. Every release will have an associated JIRA tag for release to production. It's easier for the change-management people to look at JIRA itself and see what level of testing has been done, if it's pass/fail, etc.

We use Selenium WebDriver for test automation. We use Python automation scripts which are located in BitBucket, the central location where we keep all our automation scripts. We execute these scripts with Jenkins and then use a qTest plugin to push the results from Jenkins to qTest test results, once the executions are over. We can also run the same automation scripts within the qTest Automation Host feature. Through the Launch feature we can kick off automation scripts, which are available in BitBucket. So we can either use Jenkins or qTest to run the automation scripts. Because of the reporting mechanism we are directly passing test results to the execution tab, so senior staff can see how many scripts we ran, how many passed, how many failed, in a detailed report in Insight. Jenkins has the ability to talk to qTest. Previously, when we used Quality Center, it didn't have any capability to talk with JIRA.
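The Jenkins-to-qTest handoff described above can be sketched against qTest's REST API. This is a minimal illustration, not the plugin's actual implementation: the tenant URL, token, project ID, and test-run ID are placeholders, and the `/test-logs` endpoint path and field names are assumptions based on qTest Manager's API v3 that should be verified against your version's API documentation.

```python
# Sketch: push one automation result from a CI job (e.g. Jenkins) into qTest.
# All IDs, the URL, and the token below are placeholders for illustration.
import datetime
import json
import urllib.request

QTEST_URL = "https://yourcompany.qtestnet.com"  # placeholder tenant URL
TOKEN = "YOUR_API_TOKEN"                        # placeholder personal API token
PROJECT_ID = 12345                              # placeholder project ID

def build_test_log(status: str, note: str = "") -> dict:
    """Build the JSON body for a qTest test-log submission."""
    now = datetime.datetime.now(datetime.timezone.utc).strftime(
        "%Y-%m-%dT%H:%M:%SZ")
    return {
        "status": {"name": status},  # e.g. "Passed" or "Failed"
        "exe_start_date": now,
        "exe_end_date": now,
        "note": note,
    }

def submit_result(test_run_id: int, status: str, note: str = "") -> bytes:
    """POST one result against an existing test run (network call)."""
    url = (f"{QTEST_URL}/api/v3/projects/{PROJECT_ID}"
           f"/test-runs/{test_run_id}/test-logs")
    req = urllib.request.Request(
        url,
        data=json.dumps(build_test_log(status, note)).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read()
```

In practice the qTest Jenkins plugin handles this mapping automatically; a hand-rolled call like this is mainly useful when wiring a CI system the plugin does not cover.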

qTest also provides our team with clear demarcations for which steps live in JIRA and which steps live in qTest. It's a positive feature. It improves our understanding of expectations: which requirements are to be filled in through JIRA, and which test-case management and controls are available in qTest. It also separates roles and responsibilities and allows people to work within their boundaries.

What is most valuable?

The solution's real-time integration with JIRA is seamless. With ALM or QC, we had an additional plugin which was a schedule-based integration between the tool and JIRA. Sometimes things errored out and there were too many integration issues. We didn't have an updated sync-up between JIRA and Quality Center. Quality Center was predominantly used by testers and JIRA was being used by other users in our company, including PMs, DSAs, and devs. qTest solved one of those challenges for us.

The reports qTest provides to executives are pretty good because what we do is mimic what we used to do with Quality Center: defect reports, throughput reports, aging reports, and an execution summary report to show how many test cases there have been. The executives are reviewing them and they do not find any difference between what we had in Quality Center versus what we are doing in qTest.

What needs improvement?

We are starting to use qTest Insights a little bit. Right now, on a scale of one to five, I would say the Insights reporting engine is a three because we are facing some performance issues.

For example, qTest offers a baseline feature where you can only base sort-order for a specific story or requirement on two fields. However, our company has so many criteria and has so many verticals that this baseline feature is not sufficient. We would want another field to be available in the sort order. When tickets come over from JIRA, it would be helpful to be able to sort by sprint, to begin with. And within the sprint there are labels or subcategories. Currently, it allows us to only sort on a sprint and then subcategory. We would like to see things bucketed or placed in a folder with a status within the subcategory. We need three fields instead of two. When we raised this item, Tricentis said that it's a feature request. 

Also, the features that are customizable or specific for a team are still not available in Insights reporting. We have submitted approximately 15 or 20 tickets to Tricentis so far to address those features/enhancements/bugs. That's all in the works.

Another important issue concerns exporting from JIRA. When we export, any attachments in JIRA will be part of the export. However, because qTest is just plugging into the test cases, it doesn't let us export attachments from JIRA. Everybody else in the company who operates in JIRA would like to see the attachments that come out of the integration links, meaning the test cases. That is actually a feature that has to be set by Tricentis and not JIRA. Again, that required a new feature request to be submitted.

For how long have I used the solution?

We have been using qTest for around six months.

What do I think about the scalability of the solution?

So far the scalability looks pretty good. I cannot say for sure because in a matter of six months we have 26 projects that are live and functional. So far, so good, but I cannot talk about the scalability yet.

From what I have heard from Tricentis, there is no restriction on data storage. In terms of latency, because the application itself is in the cloud, we shouldn't see any performance issues accessing qTest.

We have about 55 people, contract testers, who have access to the edit, add, and execute features. We have three admins. And we have about 100 people who are view-only users of qTest items. We don't require any people to maintain the solution since it's hosted on the cloud.

We definitely anticipate increasing the number of projects in qTest.

How are customer service and technical support?

Technical support is friendly and quick. Most of the time we get a response the same day. They're located in Vietnam. There is a ticketing process. If we have an issue we open a ticket with them. If we need to, they will schedule a meeting with us to complete the request. They respond on time.

Representatives come over or Skype us to tell us about the next version date and the like. We get the communications from Tricentis indicating the dates of rollout of new versions.

Which solution did I use previously and why did I switch?

We used to have Micro Focus ALM Quality Center as our test management tool and we were nearing our licensing limitation at the time. We evaluated a couple of tools in the market and we picked up qTest because it had a better reporting mechanism and dashboard features, along with a clean integration with JIRA.

How was the initial setup?

The initial setup was straightforward. There were clear project templates and clear user templates available. We were able to add and update roles as needed. The user list was already available. All we had to do was checkmark and save. It was really seamless setting up users within the tool. 

Likewise, we could model it as a waterfall or agile template, and it then gave us the workstreams created in the folder structure mechanism within qTest. These are all good features that allowed us to quickly set things up and keep things moving.

It's hard to say how long it takes to set up qTest because it's handled by Tricentis. All they told us was that they had finished their deployment.

We were given a sandbox and some sample projects to be evaluated and tested. We had a month or so during which all our testers were given access to those sample projects. We tested them and we said we were good to go. The production environment was then available for us to roll out our projects.

Our organization’s adoption of the solution has been pretty positive. Users were looking forward to it. They embraced it pretty quickly and pretty well.

What was our ROI?

It's too early to tell about a return on investment.

What's my experience with pricing, setup cost, and licensing?

I believe we have an annual subscription.

Which other solutions did I evaluate?

We evaluated QASymphony and QMetry. To begin with, we had a list of about ten tools that we researched on the internet and via some phone calls. We narrowed it down to these two and Tricentis.

The main differentiators were the dashboard and reporting mechanism, the artifact reporting mechanism, and the JIRA integration. Those were the reasons we chose Tricentis.

What other advice do I have?

It's a simple tool. The usability is pretty straightforward. For a QA tester who is an active user, the UI is pretty simple, the linkage of requirements to test cases is simple, and test cases are searchable across the project. Overall, it's better than Quality Center in the ways that I have explained.

My suggestion would be to always put your use cases up-front with vendors whose tools you're looking at. Ask for a demo and make sure that your use cases match up to the demo that's being performed. We had some challenges with JIRA and the Quality Center integration, real-time interfaces, the time lag, and visibility for all people into the tool. These were challenges in Quality Center that qTest has overcome.

At this time the report section in qTest is used by QA managers and above. Our QA testers are not looking directly into the reports to gather stats. It's the QA managers, directors, and VP, as well as people in IT, who are starting to look at the metrics within the reports and trying to form a consensus on reporting bugs. Our testers just log bugs, and Insights reports are gathered by a QA lead to create a defect summary. Then the QA lead talks to the PM, dev, etc. So the reporting itself is not affecting the productivity of the testers; overall, qTest is not affecting our testers' productivity.

In terms of executives or business users reviewing results provided by qTest, we are just starting that process. They are reviewing the results but there is no active, standardized communication, back and forth, on the artifacts that they review.

I can't say we have seen a decrease in critical defects in releases. qTest is not even enabling people to put out the right requirements. The defect reduction is in the upstream work: how they write the requirements, how they code, how we test. qTest is just the tool that allows us to do a pass or fail. The tool itself is not an enabler of defect reduction.

It's a flexible tool but, overall, I cannot say that it has increased testing efficiency.

Which deployment model are you using for this solution?

Private Cloud
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
PeerSpot user
reviewer970854 - PeerSpot reviewer
Sr. Product Manager - Intelligent Automation & RPA at a financial services firm with 5,001-10,000 employees
Real User
Great for test management and automates a lot of the testing functions
Pros and Cons
  • "Works well for test management and is a good testing repository."
  • "Could use additional integration so that there is a testing automation continuum."

What is most valuable?

The solution works very well for test management and it also automates a lot of the testing functions so that you don't have to manage them in Excel spreadsheets. It doesn't go all the way to automated testing, but it becomes a good testing repository. If an organization is not fully ready to take advantage of automated testing, qTest is a good first step. The value with qTest is that it hooks in nicely; when it comes to test management, it has good integration.

What needs improvement?

I'd like to see better integration in the platform so that there is a testing automation continuum, where customers can easily mature through qTest and Tosca functionalities.

How was the initial setup?

It took a little bit of back and forth to get started but since then, it's working very well for us. We had to work with our support team and it took a little longer than expected. 

What other advice do I have?

I rate this solution eight out of 10. 

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
PeerSpot user
QA Expert with 10,001+ employees
Vendor
Agile Lifecycle Testing, Requirements Management, And Mapping To Test Cases

What is most valuable?

Provides agile lifecycle testing, requirements management, and mapping to test cases. Simple interface and good reporting. Good integration with JIRA. Importing and exporting artifacts is quite easy. Also, some automation integration with popular test frameworks to provide true end-to-end test case execution.

How has it helped my organization?

Our organization was managing test cases using Excel, so bringing in any tool was helpful indeed. In addition, this tool allows greater requirements and coverage tracking. Integration with automation allows testers to initiate test cases from the test case management tool itself, with no additional work to mark results explicitly in the management tool.

What needs improvement?

There is some room for improving the documentation for the APIs that they expose. It is limited and requires some trial/error to figure out.

For how long have I used the solution?

15 months.

What do I think about the stability of the solution?

No stability issues so far.

What do I think about the scalability of the solution?

No scalability issues so far.

How are customer service and technical support?

Good. Eight out of 10.

Which solution did I use previously and why did I switch?

We used Excel for test case management and evaluated the open-source TestLink for some time; being an open-source tool, it did not have many features. We used Rational Quality Manager in a previous organization, but it did not have as many third-party tool integration options or automation integration.

How was the initial setup?

It is deployed on the cloud and we use it as a service, so I have no comments on the setup itself, but the configuration needed for our organization and projects was fairly simple.

What's my experience with pricing, setup cost, and licensing?

I can't comment, I wasn't involved.

Which other solutions did I evaluate?

I can't comment, I wasn't involved.

What other advice do I have?

It's quite easy to use and integrate with any popular tools you might already be using. Would definitely recommend.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user589632 - PeerSpot reviewer
Sr. Portfolio Manager - Testing at a tech vendor with 201-500 employees
Vendor
Ensures proper test coverage and test status tracking. The graphical reports need to be improved.

What is most valuable?

The test repository, requirement traceability matrix, test execution and reports are the most valuable features.

How has it helped my organization?

It is the single source for repository and traceability. This ensures proper test coverage and test status tracking.

What needs improvement?

The graphical reports, API integration with the customized automation test tools and support for the same need to be improved.

For how long have I used the solution?

I have used this solution for around two months.

What do I think about the stability of the solution?

There were no stability issues.

What do I think about the scalability of the solution?

There were no scalability issues.

How are customer service and technical support?

The technical support is good.

Which solution did I use previously and why did I switch?

We were not using any other solution before.

How was the initial setup?

The setup was straightforward, but we have yet to use the automation integration.

What's my experience with pricing, setup cost, and licensing?

The pricing could be more competitive.

Which other solutions did I evaluate?

We evaluated the Hiptest solution.

What other advice do I have?

This is a good test management tool, but you need to watch out for the price aspect and the test harness automation integration of the tool.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
Buyer's Guide
Download our free Tricentis qTest Report and get advice and tips from experienced pros sharing their opinions.
Updated: October 2024