Contrast Security Assess is deployed on-cloud in our organization. I would recommend Contrast Security Assess to other users. It's a really good tool. It provides lots of details on web-based vulnerabilities, source code reviews, and third-party library issues. Overall, I rate Contrast Security Assess an eight out of ten.
Director of Threat and Vulnerability Management at a consultancy with 10,001+ employees
MSP
Jun 18, 2021
Be prepared for the cultural change, more than the technology change. Most of the benefit I get from the solution is the time saved by not having to run scans and analyze the results. I now spend a lot of my time explaining to people how Contrast works, how it changes our program, and how it fits into their development life cycle. If you're approaching it from a purely technical perspective, you're missing a big piece of what you're going to be spending your time on.

I don't have any major complaints. Most of our challenges with Contrast have been how it changes our program and how it impacts the internal culture around development. Those are not really issues with the product itself. If we have had any kind of technical hurdle, it has been that a lot of our application owners might not understand the process for deploying the agent, which is how they instrument their environment. So we spend quite a bit of time supporting that part of the process technically, which is not necessarily a good fit for a security program: having to tell people how to install an agent (a minimal sketch of that step appears below).

Contrast gathers data in real time, from agents. It doesn't perform scans; rather, it observes traffic, and that's fundamentally different from the other tools and from how those tools are used in our existing processes. We spend a lot of time on culture and process and explaining how the technology is different. I find it very intuitive, but our users do not. We have developers who have spent the past 20 years thinking of application security in terms of a scan, and they're passive in that activity. The scan is something that's done for them, done to their environment, and then they're given data. Contrast's agent is also passive, in that it just gathers information, but it gives that information to the developers directly, and that means they have to participate. They have to ingest that information and be prepared for what they're going to do with it. They're not used to having that role. They're used to being the recipients of information, or to other people performing the service of scanning their environment, and Contrast doesn't work that way.

The biggest lesson I've learned is around how our developers think about security. When they're passive in that process, when somebody else is running scans for them and telling them what to fix, they operate differently than when you give them an agent in their environment, start giving them data from multiple environments, and start automatically sending that information to a bug tracker they use. It's the automation and visibility they've been asking for, but now that they're getting it, they are not exactly sure what to do with it. I was not prepared for having to have conversations about culture and process. Now that I have a better understanding of how our developers operate, what their metrics are, how they're evaluated, and what constitutes success and security on their part, I have a much better idea of how to interact with them. Before, we would talk about how we were seeing a certain type of security issue in our environment, and then we would try to figure out why our developers kept making that mistake. Now, it's more about how developers utilize security data in their process and how we can improve that.
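As an illustration of the agent-deployment step this reviewer describes, here is a minimal sketch of launching a Java service with an instrumentation agent attached at JVM startup. The agent path and jar names are assumptions for illustration, not Contrast's actual distribution layout.

```java
import java.io.IOException;

// Minimal sketch of "instrumenting an environment": the agent is attached
// at JVM startup and observes the application from the inside, so there is
// no separate scan job to schedule. Paths and file names are illustrative.
public class LaunchWithAgent {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "java",
                "-javaagent:/opt/contrast/contrast-agent.jar", // hypothetical agent path
                "-jar", "my-service.jar");                     // hypothetical app jar
        pb.inheritIO();                                        // stream the app's output
        Process process = pb.start();
        System.exit(process.waitFor());
    }
}
```

The point of the sketch is the `-javaagent` flag: instrumentation happens inside the running application, which is why there is no scan for the security team to run on the application owner's behalf.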
Right now, the visibility the solution gives us is probably a little bit painful, because this is data the developers didn't have before. We're identifying more vulnerabilities, and that is something they were not expecting. They were used to the results from our previous tools, where they only had a handful of vulnerabilities to address. Contrast is now finding more issues with their applications, as well as issues associated with libraries. That's a lot more data than they're used to receiving, and potentially they're surprised by how vulnerable their applications are. Initially, having additional vulnerabilities that you were previously unaware of seems like a significant resource impact. A developer who normally only had to deal with a handful of findings may now have 10 or 20 or 100 findings to deal with. That may feel like a resource burden because you now have more things to fix, but ultimately it's going to be less expensive than the cost of a breach, a loss of contract, or anything else that might affect the business in the larger sense.
Senior Customer Success Manager at a tech company with 201-500 employees
Real User
Feb 17, 2021
Start with a small app team before scheduling a larger rollout. Teams that have been using SAST tools find that Assess changes how they think about AppSec in their development workflow and helps them identify process modifications that maximize the value of the tool. Overall, on a scale from one to ten, I would give this solution a ten. The product is strong and improving, support is responsive and effective, and the supported integrations work for many customers.
Technical Information Security Team Lead at Kaizen Gaming
Real User
Sep 14, 2020
I would recommend trying and buying it. This solution is something everyone should try in order to enhance their security. It's a very easy, fast way to improve your code's security and health. We do not use the solution's OSS feature (through which you can look at third-party open-source software libraries) yet. We have not discussed that with our solutions architect, but it's something we may use in the future when we have more applications on board. At this point, we have a very specific path for increasing the number of those critical apps on board; then we will proceed to more features. At the renewal, or maybe even earlier, we will go with more apps, not just three. One of the key takeaways is that in order to have a secure application, you cannot rely on just pentests, vulnerability assessments, and periodic reviews. You need real-time feedback, and Contrast Assess offers that. We were amazed to see how much easier it is to be PCI-compliant once you have the correct solution in place. We were humbled to see that we had vulnerabilities that were so easy to fix, but that we wouldn't have noticed if we didn't have this tool in place. It is a great product. I would rate it a nine out of 10.
Make sure you understand your environment before deploying. Try to get an idea of which technologies your applications use so you can group them, and group the deployment and implementation accordingly. That way you can focus on automating .NET deployments first, for example, and then move on to Java, etc. (see the sketch below). The biggest lesson I have learned from using this solution is that there is a tool out there that is really changing the way we run security testing. In the security realm we're used to the static and dynamic testing approaches. Contrast Assess, along with some other tools out there, offers this newer interactive application security testing approach, which really is the future for developer-driven security, rather than injecting security auditors as bottlenecks into the software development life cycle. I would rate Contrast Security an eight out of 10, and that is because of the lack of client-side support and the difficulty of automating the deployment holistically across an organization.
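As a rough sketch of the grouping advice above (the inventory, app names, and runtime labels are hypothetical), one rollout wave per technology might be planned like this:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch: bucket the application inventory by runtime so that
// agent rollout can be automated one technology at a time (.NET first,
// then Java, etc.). App names and runtime labels are made up.
public class RolloutWaves {
    record App(String name, String runtime) {}

    public static void main(String[] args) {
        List<App> inventory = List.of(
                new App("billing", ".NET"),
                new App("checkout", "Java"),
                new App("reports", ".NET"),
                new App("search", "Java"));

        Map<String, List<App>> waves = inventory.stream()
                .collect(Collectors.groupingBy(App::runtime));

        // One deployment recipe (agent package, config, restart procedure)
        // is written and tested per wave, then applied to every app in it.
        waves.forEach((runtime, apps) ->
                System.out.println(runtime + " wave: " + apps));
    }
}
```

The design choice this mirrors is writing the deployment automation once per runtime rather than once per application.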
It depends on the company, but if you want to manage, maintain, and onboard, I would recommend having Contrast as part of your toolkit. It is definitely helpful. My advice would be to install it in the environment in which more routes are exercised, whether that is the testing environment or Dev, to get the most out of the tool. In terms of configuration, we have Contrast on one application in our testing environment and on the other in the Dev environment. Deciding on that took us some time because we didn't have access to all the environments of a single application. Findings-wise, Contrast is pretty good. It's up to the app engineer to identify whether a finding is due to the functionality of the application or really is a finding. Contrast does report some false positives, but there are some useful findings as well. It cannot give you only true positives, so it's up to humans to work out which ones are true and which are false. Applications behave in different ways, and the tool might not understand that. But there are definitely a few findings that have been helpful. It's a good tool. Every tool has false positives, and Contrast is better than some of the others. We are not actively using the solution's OSS feature, through which you can look at third-party open-source software libraries, because we have other tools internally for third-party library scanning. It's been a good journey so far.
Senior Security Architect at a tech services company with 5,001-10,000 employees
Real User
Jun 7, 2020
If you are thinking about Contrast, you should evaluate it for your specific needs. Companies are different, and the way they work is different. I know a bunch of companies that still follow the Waterfall model. So evaluate it and see how it fits your model. It's very easy to go and buy a tool, but if it does not fit well into your processes and your software development life cycle, it will be wasted money. My strongest advice is: see how well it fits your model and your environment. For example, are developers working more in pre-production? Are they using a Dev sandbox? How is QA working, and where do they work? It should work in your process and it should work in your business model.

"Change" is the lesson I have taken away from using Contrast. The security world evolves and hackers get smarter, more sophisticated, and more technology-driven. Back in the day, when security was very new, people would say a four-character or six-character password was more than enough. But now there is distributed computing, where attackers can have a bunch of computers computing permutations and combinations of your passwords. As things change, Contrast has adapted well. Even five years ago, people would sit in a war room and deploy on weekends. Now, with the DevOps and DevSecOps models, Contrast is set up well for all the changes. And Contrast is pretty good at providing solutions.

Contrast is not like other, traditional tools that tell you about a security issue immediately as you write the code. When you have the plugin and something is deployed and somebody is using the application, that's when it's going to tell you there's an issue. I don't think it has an on-desktop tool that tells the developer about an issue as he writes the code, like Veracode Greenlight. It is more of an IAST.

We don't have specific people for maintenance. We have more of a DevSecOps model. Our AppSec team has four people, so we distribute the tasks and share them with the developers. We set up a Teams integration or a notification with them, so that as soon as Contrast finds something, they get notified. Our concern is more about how long it takes a developer to fix a vulnerability once it is found. We have worked all that out with Power BI, so it actually shows us, when a vulnerability is found, how long it takes to remediate it. It runs more or less on autopilot; it's not a maintenance type of thing. I would rate Contrast at nine out of 10. I would never give anything a 10, but Contrast is right up there.
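The remediation-time metric this reviewer tracks in Power BI could be computed from exported findings along these lines. This is a hypothetical sketch; the Finding fields are illustrative, not Contrast's actual export schema.

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;
import java.util.List;

// Hypothetical sketch of the metric described above: for each remediated
// vulnerability, measure the days from discovery to fix and average them.
public class TimeToRemediate {
    record Finding(String id, LocalDate found, LocalDate fixed) {}

    public static void main(String[] args) {
        List<Finding> remediated = List.of(
                new Finding("VULN-1", LocalDate.of(2020, 3, 1), LocalDate.of(2020, 3, 9)),
                new Finding("VULN-2", LocalDate.of(2020, 3, 5), LocalDate.of(2020, 4, 2)));

        double meanDays = remediated.stream()
                .mapToLong(f -> ChronoUnit.DAYS.between(f.found(), f.fixed()))
                .average()
                .orElse(0.0);

        System.out.printf("Mean time to remediate: %.1f days%n", meanDays);
    }
}
```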
Director of Innovation at a tech services company with 1-10 employees
Real User
Jun 2, 2020
Make sure that you have a very good change-management strategy in place ahead of time. Also, it's not enough to have the solution itself. It still requires proactive management on the part of your developers to make sure they understand what the product offers and that they are using it in a way that will benefit them.
My advice is: don't think about it; do it. The benefits you'll get from implementing it are enormous, especially for your development teams. They'll be able to look at these vulnerabilities and start remediating them in their environments before even passing them along in the SDLC.

In terms of the accuracy of Contrast in identifying vulnerabilities, my assessment is that so far we've had no false positives. However, if you speak to our developers, they will say it does have false positives, but that's not true. Let me give an example. We have several high vulnerabilities that Contrast found where it says: "Application disables secure flag on cookies." In the lower environments, our Dev and QA environments, we want it that way. We want those cookies to be shared within development, but we want that flag set in the higher environment. So developers will say, "Well, that's a false positive." My argument to them is that it's not a false positive. We do want to make sure those cookies are protected in the higher environment, even if the development team is okay with leaving them open in the lower environments. We need to know that the flag gets set in the higher environment. Therefore, as far as AppSec is concerned, this is not a false positive; we can simply mark it as not an issue. There's that type of "tennis" between AppSec and the development teams (a short sketch of the cookie setting in question appears below).

We are using the OSS feature and looking at those libraries. Our leadership team is planning around the vulnerable libraries in our code base, but rather than fixing these issues individually, application by application, they're going to take an enterprise look at it. There's an initiative comparing what Contrast provides versus Black Duck and WhiteHat, and they're doing that assessment at an enterprise level. What we're looking for is a tool that alerts us about the status of a library before we pull it down and bring it in-house. Contrast sits further right than that. The leadership wants something farther left, to look at these libraries even before the developer downloads them, and right now Contrast doesn't give us that.

We are really building around Contrast. When we bring in new tools, one of the questions we ask is how they integrate with Contrast. When a new tool is coming in, it goes to our board and they look at how well it integrates with Contrast. We're looking for tools that complement our growth with Contrast rather than anything that replaces it. I would rate it at about nine out of 10. If they come out with a management-level dashboard, that would be the icing on the cake.
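The cookie finding in that example comes down to a one-line, environment-dependent setting. Here is a minimal servlet-style sketch of why the same code is acceptable in Dev/QA but a real issue in production; the APP_ENV variable name is an assumption, and it presumes the javax.servlet API on the classpath.

```java
import javax.servlet.http.Cookie;

// Sketch of the environment-dependent cookie flag discussed above. Assess
// reports the missing Secure attribute wherever it observes the cookie;
// whether that is a true finding depends on the environment. The APP_ENV
// variable name is illustrative.
public class SessionCookies {
    public static Cookie buildSessionCookie(String value) {
        Cookie cookie = new Cookie("SESSIONID", value);
        cookie.setHttpOnly(true);
        // Secure flag off in Dev/QA (cookies shared over plain HTTP there),
        // but it must be on in the higher environments.
        boolean isProduction = "prod".equals(System.getenv("APP_ENV"));
        cookie.setSecure(isProduction);
        return cookie;
    }
}
```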
I would rate the solution a ten out of ten. It is a cost-effective solution that is easy to implement. You should try the solution through a PoC.
I rate Contrast Security Assess 8.5 out of 10.