What is our primary use case?
We use it to do end-to-end testing for the business. Once development is complete and we're verifying that no existing functionality has regressed, that's the stage of testing at which we deploy it.
How has it helped my organization?
We are a subsidiary of a larger company and, at the moment, we are focused on rolling it out to that parent company. Because the tool is simple to use and allows for code reviews, it will be useful in rolling it out to the wider company. We will be able to give it to the people who are experts in their areas, rather than trying to pass off test cases to one centralized location. The automation will be centralized and we'll just have one central center of excellence (COE).
Automation using the solution has saved testing time. I couldn't give you a number of hours or days because we're still in the beginning stages of trying to roll it out globally, and we haven't yet been able to reuse much automation. The whole point of automation is that the upfront cost to automate something is heavier and then, as you reuse it, it reduces the testing cycle. We're still in the earlier, investment stages, where the time spent is perhaps equal right now, but we expect to see a reduction as we capture more and more.
Certify has also enabled us to find more defects. I'm focused solely on automation and testing, so I don't have access to the defect counts, but I know we have found defects, which tells me that we are finding defects that wouldn't have been found otherwise, or that wouldn't have been found as quickly.
What is most valuable?
- The dataset.
- The reusability.
It does allow for good reusability. When it's designed and utilized properly, we can structure things in a way that allows for reuse, meaning a lot of reuse of a transaction like VA01 across very similar flows, to keep things simple. And if we do have problems with a more complex flow, we'll make another version of VA01 that targets those edge cases.
In terms of web UI testing, we've done very limited Fiori testing, but we have done Salesforce and a few others. Our experience is that when we get that applied and working properly, it works very well. The interfaces are usually built well, and if we do have problems with them we can get Worksoft to fix them. A lot of the time, if we're running on something that doesn't have an XF definition, by understanding how it builds objects we are able to map objects fairly easily and quickly.
The solution's ability to automate testing for packaged applications like SAP and Salesforce depends on whether they have that XF definition, but I do think it works very well. That's especially true for the SAP integration. That interface is very solid and objects are almost always discovered properly.
Since they updated the Capture feature to a more "Snagit" look and feel, it has become our primary tool. We've moved off of the old LiveTouch functionality. We will use it occasionally, but with Capture being built in, it's easier to train users on one tool. Capture has enough capability to both verify properties and record for playback. It works well for us.
What needs improvement?
Looking at it as a fully packaged product, I would like to see more documentation, or easier access to the documentation. Sometimes documentation does exist, but we have to search three different sites to find the proper way to do things, or track down the technical document that explains certain fields.
That, in turn, relates to the ease of use and to how objects interact with each other. The application could lend itself to being simpler.
Another area that I would like to see improved is how permissions are applied. If you're applying permission groups to a user, one of the options is to delete the group entirely and lose the whole permission group, rather than just removing the permission from the user, which seems a little silly. In my opinion, the whole permissions module is very confusing and lends itself to common errors. We have to rebuild permissions occasionally.
The functionality is all there. I just think the way it's packaged can be confusing. We are successful and we can get things working the way they're intended to in Worksoft. It's just that sometimes finding how to do that, or where it is described, can be difficult.
For how long have I used the solution?
I've been using Worksoft Certify for about the last year-and-a-half.
What do I think about the stability of the solution?
It's a pretty stable application. When it works, it works well, and it seems to work consistently. And when it doesn't work, it really doesn't work, if that makes sense. When we've got everything just right, it functions solidly. And when we do have problems, it seems not to function at all, meaning tests will not run, or we cannot get a script to work in a particular way at all. But we've been able to work through all of our non-functioning issues with their support.
What do I think about the scalability of the solution?
The solution will enable us to scale up our testing. With our focus being more on regression testing, it increases coverage of existing functionality first, and then we'll bring in new functionality.
We are planning on rolling this out to more people, multiples of the number we have using it today. We think it should be scalable, but we haven't yet used it at that scale, so we don't know for sure. We do feel it will scale well.
Our usage is pretty narrow at the moment. Approximately 10 people are using it right now, mainly automation engineers. There are a few directors using it to understand what the product is. People we would consider "automation champions," who will help champion the product at our global headquarters, are being trained on it right now. They're not actually going to use the application; they're just going to understand it so they can help champion it and bring it on, full-scale, through user acceptance.
Our main users in the future will be the information systems business analysts who know their respective products very well, the ones who are making the changes in targeted areas and who can easily reach out. They will be able to quickly test and record whatever they need to record for testing. We're looking at anywhere between 20 and 50 additional users within the next year, depending on how well user acceptance goes, and expansion will continue from there.
How are customer service and technical support?
I'll start with our positive experience. We always end up with some kind of resolution whenever we do submit something through support.
There have been times, though, where their support has been very slow or difficult, where we've stayed at level-one support for much longer than we should have, given the issues involved. These are high-severity issues that mean we can't function. That's been a frustration point for us. We've had to meet with Worksoft to talk about the support we're getting.
As we build better in-house knowledge of some of Worksoft's caveats, though, we have needed that support less. That has made things a little better for us, and it's why we focus heavily on training and on having supporting documents for what we're doing.
Which solution did I use previously and why did I switch?
I have used Winshuttle as well as DataLoad, which is open-source and much more simplistic. Winshuttle is used more for something like an RPA function.
Certify has a much deeper bench in terms of what it can actually do. Winshuttle, to my knowledge, only works with SAP applications because it's built on the scripting portion of SAP. Its focus isn't testing, so it's not a good tool for testing. But it is more simplistic in the sense that it looks like a spreadsheet and the result from the status bar is provided in the last column. It is really designed for one transaction code at a time, in my opinion. With Certify, you can run a larger-scale test, or even a larger-scale RPA function, compared to what Winshuttle can support, although the complexity involved is much greater. It's something of a Catch-22, but Certify does enable you to do much more.
How was the initial setup?
I can't speak to the installation process, as we have a different person who manages installation. As far as setting up users goes, it's fairly simple within the application, once it's installed and functioning on the servers.
We started out with a centralized model and we're moving to a decentralized model of sharing it out with more users and increasing usage. It's almost like we're in a second deployment of the product, and we're using more of the tools.
We're rolling it out to the specialists in each business area, on the information systems side. These are the people who are producing the changes and who understand the changes and updates quite well. We'll have them write the scripts themselves, with our support as the center-of-excellence team. The idea is that they will submit the scripts they've written back to us for code approval and then promotion to gold, so the scripts can be run regularly as validated scripts. It should work well and be successful for them. We'll give them help with training, etc., in the Worksoft product itself. We're trying to focus on having somebody become an application expert for each application we're testing, as well as an expert in the automation product, so they can function well within the Worksoft application.
The person who is responsible for installation is also responsible for maintenance of the solution. Like me, he is an automation engineer, but we have different focuses.
What other advice do I have?
My advice would be to develop a very good training program to go with it. Also, understand how to build a good structure that allows for success and limits exposure where people might edit things they shouldn't be editing. You should also partner or work with other businesses that have used the solution successfully. Build up industry contacts who can help you understand where they're going and where they're having problems with the model they're implementing.
The biggest lesson I have learned from using Certify is that you can design things to be far more complex than you need to, so you have to be very careful, when designing a solution, to keep it simple. It's almost like code, in that it enables you to do very complex things, but you need to be cognizant that you shouldn't always take the most complex flow and shouldn't over-engineer the logic of any one script. Scripts should be relatively simple.
Regarding ease of use, once you understand how to use it, you can use it very effectively. But at times it's difficult to understand what the application is doing and what you are actually editing within the application. With certain objects, you might not realize you're editing another object unless you've used the application and understand how it all fits together. It is simple once you know what you're doing and understand how the objects work together, but getting to that point can be more complex. We overcome that with training, reference documents, and a lot of training materials. We did an intro training with our team just yesterday. We're rolling out more globally, so we're training and trying to have a center-of-excellence team that can help out with these concepts. For example, they can help design better training to explain, "Hey, when you're editing here, you're doing this." We're trying to do more targeted training on the things we do with our standards inside of Worksoft.
As far as the Capture documentation goes, for us it's almost too detailed. We've actually implemented a custom solution for documenting, because we need something that's simple, much like what users would experience with test cases for manual testing. We also designed our own solution for that, in part, because we utilize a lot of Selenium-style code and we need to be able to record the results occurring in that application. We'll call both Selenium and Worksoft, and we need a consolidated results report. We don't utilize BPP (Business Process Procedure) and, just to be clear, we've never purchased it, so I don't know that functionality; with our unique setup, it did not make sense to utilize those reports. The reporting built into Worksoft is good for development cycles and developing scripts, but we don't use it for result reporting, in the sense of whether a test passed or failed. We've moved that into a custom application.
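To give a sense of the kind of consolidation I'm describing, here is a minimal sketch, in Python, of merging pass/fail results from a Selenium-style run and a Worksoft run into one report. The file names, CSV layout, and column names are hypothetical illustrations for this example only, not our actual application or any Worksoft export format.

```python
import csv
from pathlib import Path

# Hypothetical inputs: one CSV exported from Selenium-style runs and one from
# Worksoft Certify runs. Both are assumed to have "test_name" and "status"
# columns; real exports will differ.
SOURCES = {
    "selenium": Path("selenium_results.csv"),
    "worksoft": Path("worksoft_results.csv"),
}

def load_results(source: str, path: Path) -> list[dict]:
    """Read one result file and tag each row with the tool that produced it."""
    with path.open(newline="") as handle:
        return [
            {"tool": source, "test": row["test_name"], "status": row["status"].upper()}
            for row in csv.DictReader(handle)
        ]

def consolidate(sources: dict) -> list[dict]:
    """Merge all sources into a single pass/fail list for one report."""
    combined = []
    for source, path in sources.items():
        combined.extend(load_results(source, path))
    return combined

if __name__ == "__main__":
    results = consolidate(SOURCES)
    failed = [r for r in results if r["status"] != "PASS"]
    print(f"{len(results)} tests total, {len(failed)} failed")
    for r in failed:
        print(f"  [{r['tool']}] {r['test']}: {r['status']}")
```

In practice, the real value is having one place where a business user can see whether a combined Selenium-plus-Worksoft flow passed, without opening two tools.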
While it does allow for good reusability, even when best practices are followed it can at times be hard to identify that the same components or processes are being built more than once. For example, there will be duplicate login scripts. The application doesn't seem to lend itself to easily managing duplication of processes. We are putting workflows in place on our team to help identify and reduce duplication, and we intend to use Analyze as a way to help catch duplicate workflows.
We are working towards use of the solution for RPA testing, but our primary charter is to industrialize our testing cycle, and then we can move into something like that.
Which deployment model are you using for this solution?
On-premises
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.