Technical support for the solution should be faster. We have to do further analysis ourselves to determine which CVEs are in the reported libraries and which parts of the code are affected; that analysis could be included in the report that Contrast Security Assess generates. The solution should provide more detail in the section that shows third-party libraries with CVEs or other vulnerabilities. The onboarding and setup of Contrast Security Assess could also be a little easier.
Senior Manager of Information Security at Kaizen Gaming
Real User
Top 20
May 2, 2023
The product's retesting capability needs improvement. The tool also needs to improve the suggestions it provides for fixing vulnerabilities; it relies more on documentation than on quick fixes.
Director of Threat and Vulnerability Management at a consultancy with 10,001+ employees
MSP
Jun 18, 2021
The automation via its instrumentation methodology is very effective when the underlying application technology is supported. To instrument an agent, it has to be running on an application technology that the agent recognizes and understands. It's excellent when it works. If we're developing an application that is using an unsupported technology, then we can't instrument it at all. We use PHP and Contrast presently doesn't support that, although it's on their roadmap. My primary hurdle is that it doesn't support all of the technologies that we use.
Senior Customer Success Manager at a tech company with 201-500 employees
Real User
Feb 17, 2021
Contrast is good at listening to its customers and setting product directions based on their feedback. Contrast continues to improve along multiple axes. One axis is languages and platforms. Support for Python was recently added and Go is in beta. Another axis is the deployment and configuration of agents. Contrast offers a lot of flexibility in agent management but is working on enhancements to improve centralized control.
Technical Information Security Team Lead at Kaizen Gaming
Real User
Sep 14, 2020
During the period that we have been using it, we haven't identified any major issues. Personalization of the dashboard, and how to make it appealing to an organization, is something that could be improved on their end. The reports could be adaptable to the customer's preferences, but this isn't a big issue, as it's something the customer can address as he builds his experience with the tool. During the initial approaches in the PoC and the preparation of the solution, it would have been more efficient if we had been presented with a wider variety of scenarios aimed at our main concern, which is system availability. However, once we fine-tuned the scenarios they provided later in our discussions, we resolved it and went ahead.
Regarding the solution's OSS feature, the one drawback that we do have is that it does not have client-side support. We'll be missing identification of client-side libraries such as jQuery and other JavaScript libraries. The same thing is true on the custom code side: client-side technology support is missing. Although client-side technologies are inherently less risky than server-side technologies, which is where Contrast focuses its testing, it would definitely help for this tool to identify both server-side and client-side findings in libraries, as well as in custom code. This would help us move away from using multiple tools. For example, if we have Contrast for our server-side testing, we still need to use some sort of static scanner for the client-side. In a perfect world, it would just be Contrast Assess doing both.
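To make the gap concrete, the kind of client-side library identification this reviewer says is missing can be approximated with a small stop-gap script. This is purely an illustrative sketch: the regexes, the sample page, and the jQuery version cutoff are assumptions for the example, not an authoritative vulnerability list.

```python
import re

# Stop-gap sketch: identify client-side libraries (here, jQuery) from script
# tags in served HTML, since Assess reportedly does not flag them.
SCRIPT_SRC = re.compile(r'<script[^>]+src="([^"]+)"', re.IGNORECASE)
JQUERY_VER = re.compile(r'jquery[.-](\d+)\.(\d+)\.(\d+)')

def find_client_side_libs(html):
    """Return (src, version) pairs for jQuery references found in the page."""
    hits = []
    for src in SCRIPT_SRC.findall(html):
        m = JQUERY_VER.search(src.lower())
        if m:
            hits.append((src, tuple(int(g) for g in m.groups())))
    return hits

page = '<script src="/static/jquery-1.12.4.min.js"></script>'
for src, ver in find_client_side_libs(page):
    # Flag anything older than an illustrative cutoff of 3.5.0
    status = "review" if ver < (3, 5, 0) else "ok"
    print(src, ver, status)
```

In practice, dedicated tools exist for this kind of client-side scan; the sketch just shows why the reviewer still needs a second tool alongside Contrast.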
I would like to see them come up with more scanning rules. I don't know how it is done within the tool, but there is always room for improvement. We recently had a call with the vendor about a finding where the tool combined all of the instances of the finding into one. Whenever a new instance shows up, that finding is reported again. We want it to work so that once we mark a finding as "not a problem," a new instance is reported as a new finding, rather than the old finding popping up again as a new instance.
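The triage behavior this reviewer is asking for can be sketched in a few lines. This is a hypothetical model of the desired workflow, not Contrast's actual internals: findings are keyed by rule plus location, and once a key is marked "not a problem," a later instance of that key surfaces as a fresh finding instead of reopening the suppressed one.

```python
from dataclasses import dataclass, field

@dataclass
class Triage:
    suppressed: set = field(default_factory=set)       # keys marked "not a problem"
    open_findings: dict = field(default_factory=dict)  # key -> list of instance ids

    def report(self, rule, location, instance_id):
        key = (rule, location)
        if key in self.suppressed:
            # Desired behavior: re-surface as a NEW finding, not the old one.
            self.suppressed.discard(key)
            self.open_findings[key] = [instance_id]
            return "new-finding"
        self.open_findings.setdefault(key, []).append(instance_id)
        return "merged-into-existing" if len(self.open_findings[key]) > 1 else "new-finding"

    def mark_not_a_problem(self, rule, location):
        key = (rule, location)
        self.suppressed.add(key)
        self.open_findings.pop(key, None)

t = Triage()
print(t.report("xss", "app/login.jsp", "i1"))  # first instance opens a finding
t.mark_not_a_problem("xss", "app/login.jsp")
print(t.report("xss", "app/login.jsp", "i2"))  # resurfaces as a new finding
```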
Senior Security Architect at a tech services company with 5,001-10,000 employees
Real User
Jun 7, 2020
Contrast Security Assess covers a wide range of application technologies like .NET Framework, Java, PHP, Node.js, etc. But there are some, like Ubuntu and .NET Core, which are not covered; they have these agents on their roadmap. Once they have them, we will have complete coverage. Say you have .NET Core in an Ubuntu setup: you probably don't have an agent that you could install at all. If Contrast builds those out and provides wide coverage, that will make it a masterpiece. So they should explore more of the technologies they don't support, including newer and future technologies. For example, Google is coming up with its own OS. If they can support agent-based or sensor-based technology there, that would really help a lot.
Director of Innovation at a tech services company with 1-10 employees
Real User
Jun 2, 2020
The effectiveness of the solution’s automation via its instrumentation methodology is good, although it still has a lot of room for growth. The documentation, for example, is not quite up to snuff. There are still a lot of plugins and integrations coming out from Contrast to help it along the way. It's really geared more for smaller companies, whereas I'm contracting for a very large organization. Any application's ability to be turnkey is probably the one thing that will set it apart, and Contrast isn't quite to the point where it's turnkey. Also, Contrast's ability to support upgrades of the actual agents that get deployed is limited. Our environment is pretty much entirely Java, and there is no automatic update mechanism for the Java agent. You have to actually download a new version of the .jar file and push it out to the servers where your app is hosted. That can be quite cumbersome from a change-management perspective.
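The change-management chore described here boils down to tracking which hosts still run an old agent .jar and need the new build pushed out. A minimal sketch of that bookkeeping step follows; the host names and version numbers are made up for illustration, and the actual file distribution (scp, config management, etc.) is out of scope.

```python
# Hedged sketch: decide which hosts need the new agent .jar pushed out,
# given an inventory of currently deployed agent versions.
def hosts_needing_upgrade(inventory, target):
    """Return hosts whose deployed agent version is older than the target."""
    def key(v):
        return tuple(int(p) for p in v.split("."))
    return sorted(h for h, v in inventory.items() if key(v) < key(target))

inventory = {"app-01": "3.8.2", "app-02": "4.1.0", "app-03": "3.8.2"}
print(hosts_needing_upgrade(inventory, "4.1.0"))  # ['app-01', 'app-03']
```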
There is room for improvement in the reporting. We're looking for a dashboard. One of the things that I have to do right now is export to Excel spreadsheets to get the management-level view that I need to present to the leadership team. We can run a report on applications within the tool, but management wants to see a high-level view of all the applications. They want to see the total number of applications, the total number of criticals, and the total number of highs, and then they want to break each of those down and be able to manipulate that data in a dashboard. That's something that's missing. Because we have it on-premise, anytime we need maintenance, we have to schedule it and tell our developers to get out of it because we're going to be doing an upgrade. That's our main gripe with our on-premise version. We don't get the updates as soon as the cloud version does, so just two weeks ago we started talking about going to the cloud. We had our meeting about what we need to do, and we have a follow-up meeting this Thursday to continue that.
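The management roll-up this reviewer assembles by hand in Excel is straightforward to script. In practice the finding records would come from Contrast's REST API; the record fields and sample data below are placeholders for the example, not the documented schema.

```python
from collections import Counter

# Sample records standing in for findings pulled via API (fields are illustrative).
findings = [
    {"app": "payments", "severity": "CRITICAL"},
    {"app": "payments", "severity": "HIGH"},
    {"app": "portal",   "severity": "HIGH"},
]

def dashboard_summary(records):
    """Portfolio totals plus a per-application severity breakdown."""
    per_app = {}
    totals = Counter()
    for r in records:
        per_app.setdefault(r["app"], Counter())[r["severity"]] += 1
        totals[r["severity"]] += 1
    return {
        "applications": len(per_app),
        "totals": dict(totals),
        "per_app": {a: dict(c) for a, c in per_app.items()},
    }

summary = dashboard_summary(findings)
print(summary["applications"], summary["totals"])
```

Feeding a result like this into a charting or BI tool would give leadership the drill-down view the reviewer describes, without the manual Excel step.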
The out-of-the-box reporting could be improved. We need to write our own APIs to make the reporting more robust.