I use SonarQube for Google's web services, from a security perspective, as well as Oracle Forms, HTML Forms, and script.
SonarQube is deployed on-premises.
Some of the most valuable features have been the up-to-date OWASP rules, the monitoring, the reporting, and the ease of integration with the IDE plugins.
SonarQube's level of security detail could be improved. It would be helpful to have additional detail for Oracle PL/SQL; for example, the support there is neither as built out nor as thorough as it is for Java. For now, this is the only additional feature I would like to see.
I have been working with the Community Edition for at least ten years, and I have been working with the Enterprise version for about a year.
So far, we are happy and haven't had any issues with stability.
The only maintenance this product needs, for now, is just updates and patches.
SonarQube is an auditing requirement from our side and for our SDLC, so it is a gate in our SDLC.
SonarQube is easy to scale. As we've opted for the Docker builds, we haven't had issues yet.
At this point, there are at least 300 people in my company who are working with SonarQube.
I have minor experience with Q One. The main difference is the licensing structure, which is based on lines of code. We have noticed that Q One offers a bit more detail, but its support for various languages is lacking.
The setup process of SonarQube is straightforward. Deployment took about a week, but the integration of the multiple teams—introducing them and getting them on board—took about a month.
We implemented this solution through an in-house team.
Compared to similar solutions, SonarQube was more accessible to us and had more benefits with regard to the size of the code base and supported languages. Apart from the Enterprise licensing fee, there are no additional costs.
I rate SonarQube an eight out of ten.
To anyone who is looking into implementing SonarQube, I would recommend they look at what their requirements are, with regards to languages. If it's just Java, then the Community Edition is fine, but if there are any additional languages, then I would recommend Enterprise.
We use it for the static analysis of the source code to find issues or vulnerabilities.
The static code analysis is very good. In the banking sector, we have found several vulnerabilities and many issues in the source code.
If you don't have any experience with configuring the files, it can be complicated. Both the installation process and the interface could be more user-friendly.
I use the full trial version of SonarQube. I have been using the latest version of SonarQube for six months.
There are issues with stability. It needs improvement.
We have four members in our organization who are using this solution.
I am not able to evaluate the scalability yet. Once we go with the Enterprise version, we will know after three months how efficient and scalable it is with large applications.
I have not contacted technical support.
The initial setup is straightforward. This solution is easy to install. It only takes five minutes.
We require a team of five to deploy and maintain it.
I completed the installation myself.
We are also evaluating Acunetix and will know what direction we want to go in the next few weeks.
Based on the testing, Acunetix offers something different. Acunetix has many features that are not found in SonarQube.
The enterprise version comes with many features. I have not been able to test it all because I am using the evaluation version. After three months of using this solution, I will have a better understanding of it.
We plan to continue using SonarQube. Some feel that it is unfair to compare SonarQube with other solutions as it has so many features.
I would rate this solution a seven out of ten.
We use this SonarQube solution for code quality and as a basic security issues solution for our clients.
It has improved the options we can offer our clients: products that better meet their needs, lower costs, and improve code quality and basic security.
Code analyzing is very valuable for detecting vulnerabilities but it has limitations.
With a static code analyzer or dynamic code analyzer, we would like to see zero vulnerabilities. That is currently not achievable with any available code analyzer, so it is not the fault of this one product. We would also like to see the latest CVEs (Common Vulnerabilities and Exposures) represented; that would be more useful, but it does not always happen.
If we had a better idea of the likelihood of reaching zero vulnerabilities, the product would be more useful for user communities.
The product is stable.
We use a centralized machine, so scalability is not an issue. We have yet to encounter a limitation.
We have little or no interaction with technical support.
We service client needs so we consider all solutions we are aware of and weigh the pros and cons for deployment with a specific client.
Implementation is easy and very straightforward. We do a POC with our client and based on that we make a comparison to the client's needs and available solutions. We compare that with any of the open source options and with any of the premium commercial tools. We go with the one that makes sense. But the implementation of this product is not complex especially as we have experience with it.
We do our own implementations for various clients. We do not need the assistance of another team.
The return on investment is enhanced code and security. The actual ROI is difficult to measure, except that licensing a commercial product will cost more over the long term if this product is enough to meet the user's immediate needs.
The product is basically free, so implementation is the greater cost. It will cost in man-hours for deployment and resources, or in consultation. The licensing fee is negligible.
We are constantly evaluating other products. So it might be that we will go with Micro Focus, for example, or any other tool in the future. It depends on what is offered by the product and what fits the client needs and budget.
I would rate this product somewhere between six and seven. It works for many clients, but if the user's need and application are super critical, people should go with commercial products like Micro Focus. If the deployment is less critical, they can go with SonarQube or another open source software solution.
Our primary use case for this solution is security testing using the FindSecBugs plugin.
This has improved our organization because it has helped to find security vulnerabilities.
The most valuable feature is the FindSecBugs (Find Security Bugs) plugin, which finds security vulnerabilities.
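FindSecBugs itself targets Java bytecode, but the class of issue it flags can be sketched in any language. A minimal illustration in Python (the table, data, and function names here are hypothetical, not from the review): SQL built by string concatenation versus a parameterized query.

```python
# Sketch of the kind of finding a security-focused analyzer reports:
# attacker-controlled input concatenated into SQL versus a placeholder.
# Uses the stdlib sqlite3 module with an in-memory database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # Flagged pattern: user input becomes part of the SQL text (injection risk).
    query = "SELECT role FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Preferred pattern: the placeholder keeps the input as data, not SQL.
    return conn.execute("SELECT role FROM users WHERE name = ?", (name,)).fetchall()

# A crafted input widens the unsafe query to every row:
payload = "x' OR '1'='1"
print(find_user_unsafe(payload))  # [('admin',)] -- the injection leaks data
print(find_user_safe(payload))    # [] -- no user is literally named the payload
```

The unsafe variant is exactly the shape such analyzers report as a potential SQL injection; the safe variant passes the scan.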
The product's user documentation can be vastly improved.
Its dashboards, quality profiles, quality gates, and CI integration features (such as the build breaker plugin) are the most valuable features for me.
Personally, I have used SonarQube for educational purposes. SonarQube is helpful for motivating a small development team (10 members or a little more) to improve code quality with little effort.
My team uses just two features - dashboards and the CI build breaker - for checking the quality and stability of our code base. For those purposes, SonarQube has done its job well. We have seen a decrease of about 25% in issues since we first started using it a few months ago, and my team's code bases are getting better.
The only thing I don't like is that they removed the design libraries and dependencies-checking features from v5.2. I hope they reintroduce these features in the future.
I've used it for approximately two years, since December 2013.
I have not encountered any issues.
I have not encountered any issues.
I have not encountered any issues.
I've not had to use them. I think its online documentation is up to date, and it is enough to solve problems and understand the features.
Technical Support: I've not had to use them.
My development team adopted SonarQube in January 2015 for code quality improvement, and had not used any code quality checking tool before.
The initial setup is easy. They provide a step-by-step online guideline to follow for installing it.
It has decreased the efforts of my team for finding and fixing potential issues which exist in our code base.
We are only using the free features.
Just keep following their online installation and plugin development guide.
We have literally thousands of rules and they are of medium effectiveness. The problem is that most people bypass the rules or turn them off. But even that is information to us. The fact that they have to turn the rules off is as much value to us as the rules themselves.
Code coverage of tests is their most valuable feature. Code coverage is of no value if it's high, but if it's a low number then that's of great value to me.
I would like to see something around mutation testing included in SonarQube. I'd like to see some mechanism of quality which has real meaning. The problem in metrics is that they're correlated. I'd like to see how they can add a feature to detect genuine quality, instead of numbers that people can game. The number can be manipulated. There are a few ways to do this, and mutation testing is one of them.
I would also be interested in more security scanning.
Stability has never been a problem. It would have to be unstable for me to experience a problem, and we haven't. So it's good.
I don't really know how scalable this solution is, but I know we use it on thousands of projects, so it's probably good.
We have a pipeline. The pipeline currently runs 4000 teams through it, and all of them have SonarQube but usually with default rules. So that's pretty expensive. Now, we can't increase it because everything goes through it. We are evaluating what our best option is as we migrate our pipeline. We're migrating the pipeline and we're wondering what to do. If SonarQube did more security scanning, there's a good chance that we would use it more, in a different role. We're already using SonarQube everywhere, in some aspect.
It was years ago. They probably evaluated other solutions.
We're evaluating the use of different solutions at the moment, but I've just withdrawn from that task.
In all the companies that I've worked with, nobody has ever had a problem with the initial setup. It takes time to set up. It's a big thing and you do it, but it's just a project.
We used people in-house to deploy. We have about 100 people in our pipeline maintenance team. SonarQube has not led to any significant increase in that number. It's just absorbed as a part of the cost. There are no dedicated staff working on it.
My advice is to focus on quality, not on tools. Work on the quality of your code and get a quality culture, but don't require the use of a tool. SonarQube is an okay tool. I'd suggest it as a default tool, but I wouldn't rave about it.
In all of my previous jobs, there has been somebody using SonarQube. They're usually very positive. I don't share that positiveness, but the reasons for that are that I don't believe you can have metrics of code quality based upon code analysis. I don't think it's possible for a computer to do it.
I don't rate any tool higher than a five or six, ever. JUnit is the only tool that gets a rating of ten. On a scale of one to ten, where ten is JUnit, I would rate SonarQube as about a five or a six.
My primary use for this solution is to perform static code analysis.
The most valuable feature is the display of issues, like in Jira. That is very helpful for us to track our coding.
Improvements could be made in terms of security.
I would like to see dynamic code analysis in the next version of the software.
The stability is good.
Scalability is good; we currently have five users but we will definitely be increasing our usage of this solution.
We have not required technical support for this solution.
This solution is not as easy to install as SonarLint.
We are using the free, unlicensed version.
We evaluated other solutions including Cobra Static Code Analyzer, but we were not satisfied with their customer support in the open source community.
We advise all of our developers to have this solution in place. That way, whenever they are developing, they will get live tracking of the quality of their code.
I would rate this solution a seven out of ten.
Better live process: More automated quality control in the lifecycle of development/testing/deployment/production. This includes preventing potential bugs caused by ineffective code, as well as keeping a more unified style across solutions, thanks to the standard solutions offered by the issue tips. It raises code maintainability, as well as flexibility to some extent.
Quality Gate: Automated rules for determining if a project is above or below a quality threshold. This is a concise "red"/"green" style, basic quality-control. This is integrated in the development and deployment process.
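The red/green gate described above boils down to comparing metrics against thresholds. A minimal sketch (the threshold values and metric names here are made-up examples, not SonarQube's defaults or API):

```python
# Toy quality gate: green only if every metric passes its threshold.
THRESHOLDS = {
    "coverage_pct_min": 80.0,      # minimum test coverage
    "new_blocker_issues_max": 0,   # no new blocker issues allowed
    "duplication_pct_max": 3.0,    # maximum duplicated-lines density
}

def quality_gate(metrics):
    """Return 'green' if all conditions hold, else 'red'."""
    ok = (
        metrics["coverage_pct"] >= THRESHOLDS["coverage_pct_min"]
        and metrics["new_blocker_issues"] <= THRESHOLDS["new_blocker_issues_max"]
        and metrics["duplication_pct"] <= THRESHOLDS["duplication_pct_max"]
    )
    return "green" if ok else "red"

print(quality_gate({"coverage_pct": 85.2, "new_blocker_issues": 0,
                    "duplication_pct": 1.4}))  # green
print(quality_gate({"coverage_pct": 62.0, "new_blocker_issues": 2,
                    "duplication_pct": 1.4}))  # red
```

In a CI pipeline, a "red" result is what fails the build before deployment, which is the integration the review describes.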
Issue Explanations: Documentation with detailed samples. Helps in growing technical knowledge and re-writing logic to conforming solutions.
Deep intelligence and smarter code analysis: There are many cases where a bug or critical issue is reported, yet due to the circumstances there is very little chance of rewriting the solution in some other way, and the written solution is actually safe.
Recognizing such complex constructs, so they are not wrongly flagged as issues, requires more advanced heuristics.
There is a manual false-positive feature that compensates for this. However, time and again some reported issues become annoying, since they are not actually issues. This can be fine-tuned and configured over time while working with the tool.
There were no stability issues. I can't think of any serious issues.
There were no scalability issues, not as far as the development environments are concerned. I guess if there were tens of repos and maybe hundreds of commits per day, the analysis time would probably suffer. I suppose there is a way to cluster the solution somehow. I'm not sure. I never needed anything like it at the current scale that we have operated with it.
I had no direct contact with tech support by myself, but I haven't heard any complaints about it going around either. I guess it is adequate.
Previous to this solution, we used static code analysis using built-in IDE tools and plugins. SonarQube just centralizes the same thing and adds some extra layers to systemize and create a somewhat better pipelining for the quality analysis process.
IDE-related tools and plugins are still in use today, as first-in-line hints and helpers. SonarQube manages the quality threshold and it is part of the larger overall process.
The initial setup was not complex at all. There are default configurations out of the box in many ways. It was rather straightforward.
I have no advice on that part, as I'm not directly related to these aspects of the product myself.
Try it, get used to it, configure, and fine-tune it. Make it part of your everyday quality pipeline as gates necessary to pass before the green light to production deployment.
While occasionally annoying with its issue reports, it is actually an invaluable source of knowledge that you can apply in practice to your solutions.
It saves you a bunch of headaches and debugging/fixing sessions in production, which are ten times as costly as using the help of this tool.