
Apache Spark vs IBM Spectrum Computing comparison

 

Comparison Buyer's Guide

Executive Summary

Review summaries and opinions

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

Categories and Ranking

Apache Spark
Ranking in Hadoop: 2nd
Average Rating: 8.4
Reviews Sentiment: 6.9
Number of Reviews: 67
Ranking in other categories: Compute Service (4th), Java Frameworks (2nd)

IBM Spectrum Computing
Ranking in Hadoop: 6th
Average Rating: 8.2
Reviews Sentiment: 5.9
Number of Reviews: 9
Ranking in other categories: Cloud Management (25th)
 

Mindshare comparison

As of October 2025, in the Hadoop category, the mindshare of Apache Spark is 19.0%, up from 18.7% a year earlier. The mindshare of IBM Spectrum Computing is 2.1%, down from 2.2% a year earlier. Mindshare is calculated from PeerSpot user engagement data.
Hadoop Market Share Distribution
Product: Market Share (%)
Apache Spark: 19.0%
IBM Spectrum Computing: 2.1%
Other: 78.9%
 

Featured Reviews

Omar Khaled - PeerSpot reviewer
Empowering data consolidation and fast decision-making with efficient big data processing
The solution improves how the organization functions by reducing the time it takes to make decisions. To make the right decision, you need the right data; an organization can get this by hiring talent and employees who can consolidate data from different sources and organize it, together with a solution that processes it. Not every solution can deliver that data fast enough to be usable, but Apache Spark Structured Streaming can. To make the right decision, the data must be both accurate and fast. Apache Spark itself is similar to the Python programming language: Python has many libraries for mathematics and machine learning. Apache Spark is the solution, and within it you have PySpark, the API for writing and running Python code on Apache Spark. Within it there are many APIs, including SQL APIs, which let you write SQL code inside a Python function in Apache Spark. You can also use Apache Spark Structured Streaming and the machine learning APIs.
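To illustrate the pattern the reviewer describes (combining Structured Streaming with the SQL API inside Python), here is a minimal PySpark sketch; the socket source on localhost:9999 is an assumption for illustration only, not part of the review:

from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split

# Start a local Spark session (assumes PySpark is installed).
spark = SparkSession.builder.appName("streaming-sql-sketch").getOrCreate()

# Read a stream of text lines from a hypothetical socket source.
lines = (spark.readStream
         .format("socket")
         .option("host", "localhost")
         .option("port", 9999)
         .load())

# Split each line into individual words.
words = lines.select(explode(split(lines.value, " ")).alias("word"))

# Register the streaming DataFrame as a temporary view so plain SQL
# can be used from Python, as the reviewer describes with the SQL API.
words.createOrReplaceTempView("words")
counts = spark.sql("SELECT word, COUNT(*) AS cnt FROM words GROUP BY word")

# Print the running word counts to the console until the job is stopped.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()

Registering the streaming DataFrame as a temporary view is what allows SQL to run inside the Python program, which is the capability the reviewer highlights.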
OmarIsmail1 - PeerSpot reviewer
Senior Technical Specialist appreciates intelligent workload management, strong support, and scalability
The best features of IBM Spectrum Computing are common across many of their storage products. The software is solid, meaning that the code is stable. They take business seriously, which is what IBM stands for - International Business Machines. They always maintain a business-oriented approach in their software development. It's not simply clicking through interfaces; in IBM software, they consider their actions, process flows, and workflows around business processes. It requires understanding IBM and their methodology, as the software operates accordingly. I have utilized IBM Spectrum Computing's intelligent workload management feature. We use Insights, which is connected to the cloud. This provides AI capabilities for analyzing the configuration, offering smart recommendations on new code, warning about bugs in current code, and suggesting configuration improvements through its advisor tool. The predictive analytics feature in IBM Spectrum Computing enables optimal software performance through Insights. However, being a storage administrator requires foundational knowledge and understanding beyond these tools. For troubleshooting, it's efficient in spotting bottlenecks, but understanding the terms and metrics is essential as it provides answers that need interpretation.

Quotes from Members

 

Pros

"I appreciate everything about the solution, not just one or two specific features. The solution is highly stable. I rate it a perfect ten. The solution is highly scalable. I rate it a perfect ten. The initial setup was straightforward. I recommend using the solution. Overall, I rate the solution a perfect ten."
"The product’s most valuable feature is the SQL tool. It enables us to create a database and publish it."
"The fault tolerant feature is provided."
"The product is useful for analytics."
"Spark helps us reduce startup time for our customers and gives a very high ROI in the medium term."
"The product's deployment phase is easy."
"The most valuable feature of this solution is its capacity for processing large amounts of data."
"It's easy to prepare parallelism in Spark, run the solution with specific parameters, and get good performance."
"This solution is working for both VTL and tape."
"The comparison was challenging, but the IBM Spectrum Scale offered a balanced solution. Our engineers rated itsanalytics capabilities equally high as Pure Storage. For workload management, Spectrum Computing provided effective solutions that met our needs. Workload management is part of a complete solution that uses different tools. There were the cloud and HPC parts; within HPC, there were parts like liquid cooling, simple computing, storage, and orchestration. The orchestration team handled the workload management."
"We are satisfied with the technical support, we have no issues."
"The best features of IBM Spectrum Computing are common across many of their storage products."
"The most valuable feature is the backup capability."
"IBM's ability to cluster compute resources is impressive, with built-in support for scenarios like VR and active-active configurations,"
"Easy to operate and use."
"Spectrum Computing's best features are its speed, robustness, and data processing and analysis."
 

Cons

"We are building our own queries on Spark, and it can be improved in terms of query handling."
"Its UI can be better. Maintaining the history server is a little cumbersome, and it should be improved. I had issues while looking at the historical tags, which sometimes created problems. You have to separately create a history server and run it. Such things can be made easier. Instead of separately installing the history server, it can be made a part of the whole setup so that whenever you set it up, it becomes available."
"The initial setup was not easy."
"I know there is always discussion about which language to write applications in and some people do love Scala. However, I don't like it."
"When you are working with large, complex tasks, the garbage collection process is slow and affects performance."
"There were some problems related to the product's compatibility with a few Python libraries."
"From my perspective, the only thing that needs improvement is the interface, as it was not easily understandable."
"Apache Spark's GUI and scalability could be improved."
"The deduplication software isn't quite up to speed with the market. While IBM has excellent compression technology, specifically on their FlashCore modules, they lag behind competitors such as NetApp in deduplication capabilities."
"We'd like to see some AI model training for machine learning."
"IBM's sales and support structure can be challenging."
"Lack of sufficient documentation, particularly in Spanish."
"Spectrum Computing is lagging behind other products, most likely because it hasn't been shifted to the cloud."
"This solution is no longer managing tapes correctly."
"In Pakistan, IBM's disadvantage is the lack of OEM support and presence."
"The deduplication software isn't quite up to speed with the market."
 

Pricing and Cost Advice

"They provide an open-source license for the on-premise version."
"It is quite expensive. In fact, it accounts for almost 50% of the cost of our entire project."
"I did not pay anything when using the tool on cloud services, but I had to pay on the compute side. The tool is not expensive compared with the benefits it offers. I rate the price as an eight out of ten."
"Apache Spark is an open-source solution, and there is no cost involved in deploying the solution on-premises."
"Apache Spark is not too cheap. You have to pay for hardware and Cloudera licenses. Of course, there is a solution with open source without Cloudera."
"The solution is affordable and there are no additional licensing costs."
"On the cloud model can be expensive as it requires substantial resources for implementation, covering on-premises hardware, memory, and licensing."
"The tool is an open-source product. If you're using the open-source Apache Spark, no fees are involved at any time. Charges only come into play when using it with other services like Databricks."
"Spectrum Computing is one of the most expensive products on the market."
"This solution is expensive."
 

Top Industries

By visitors reading reviews

Apache Spark
Financial Services Firm: 26%
Computer Software Company: 11%
Manufacturing Company: 7%
Comms Service Provider: 7%

IBM Spectrum Computing
No data available
 

Company Size

By reviewers

Apache Spark
Small Business: 27
Midsize Enterprise: 15
Large Enterprise: 32

IBM Spectrum Computing
Small Business: 3
Midsize Enterprise: 1
Large Enterprise: 6
 

Questions from the Community

What do you like most about Apache Spark?
We use Spark to process data from different data sources.
What is your experience regarding pricing and costs for Apache Spark?
Apache Spark is open-source, so it doesn't incur any charges.
What needs improvement with Apache Spark?
Regarding Apache Spark, I have only used Apache Spark Structured Streaming, not the machine learning components. I am uncertain about specific improvements needed today. However, after five years, ...
What is your experience regarding pricing and costs for IBM Spectrum Computing?
IBM Spectrum Computing consistently offers competitive pricing. When solutioning new implementations, IBM always presents the best solution and price. In a recent comparison with Pure Storage and N...
What needs improvement with IBM Spectrum Computing?
IBM Spectrum Computing had limitations with remote copy services between head office and disaster recovery sites. In the last year, IBM has improved the code by re-engineering it to policy-based re...
What is your primary use case for IBM Spectrum Computing?
The typical use case for IBM Spectrum Computing is that it's an all-rounder. It can be used in various scenarios, such as the retailer I work for that has batch processing. It's on-demand when perf...
 

Also Known As

Apache Spark: No data available
IBM Spectrum Computing: IBM Platform Computing
 

Overview

 

Sample Customers

Apache Spark: NASA JPL, UC Berkeley AMPLab, Amazon, eBay, Yahoo!, UC Santa Cruz, TripAdvisor, Taboola, Agile Lab, Art.com, Baidu, Alibaba Taobao, EURECOM, Hitachi Solutions
IBM Spectrum Computing: London South Bank University, Transvalor, Infiniti Red Bull Racing, Genomic
Find out what your peers are saying about Apache Spark vs. IBM Spectrum Computing and other solutions. Updated: September 2025.
869,883 professionals have used our research since 2012.