
Qlik Replicate vs Quest SharePlex comparison

 

Comparison Buyer's Guide

Executive Summary (updated on Dec 19, 2024)

 

Categories and Ranking

                               Qlik Replicate    Quest SharePlex
Ranking in Data Integration    13th              50th
Average Rating                 8.2               9.0
Reviews Sentiment              6.9               7.3
Number of Reviews              17                5
Ranking in other categories    None              None
 

Mindshare comparison

As of March 2025, in the Data Integration category, the mindshare of Qlik Replicate is 3.1%, up from 2.8% compared to the previous year. The mindshare of Quest SharePlex is 0.7%, down from 0.8% compared to the previous year. It is calculated based on PeerSpot user engagement data.
 

Featured Reviews

KrishnaBaddam - PeerSpot reviewer
Lightweight tool that ensures data is replicated across different systems and simplifies complex tasks such as defining relationships
Qlik Compose is something that will automate a user's overall data modernization, where data modernization includes data modeling, ETL jobs, and so on. The advantage is that users can automate the overall process of data engineering and data modeling through Qlik Compose; being able to automate around 60% of that workload is very useful. Replicate itself does not have great AI capability; the AI capabilities are present in Qlik Sense.

Qlik Replicate is a very light tool. It is only meant to capture data from the log files, read the table structure, create that structure on the target, and transfer the data whenever there is a change. It basically integrates with the kernel of the operating system: the tool accesses the redo log files of the database, which also give it access to the structure of the schema. Using that technique, it reads the data structures, creates a similar structure in the target schema, table, and database, and then starts tracing the changes that happen.

For example, if data is inserted into a table, an insert statement is fired on that table. That insert is captured, and based on the insert statement, Replicate pulls the data from the redo log itself rather than going to the database. It then passes that transaction to the target system, where the data is inserted. This happens almost instantaneously, within a microsecond. So whether there is an insert, an update, or a delete, everything is transferred immediately: the change lands in the redo log, the redo log is read by Qlik Replicate, and Replicate applies it to the target system.
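The log-based CDC flow the reviewer describes can be summarized in a short sketch. This is a generic Python illustration, not Qlik Replicate's actual code: it assumes the source's redo log has already been mined into an ordered stream of change events, and every name here (ChangeEvent, apply_changes, the named-parameter SQL style) is hypothetical.

```python
# Generic illustration of log-based CDC (not Qlik Replicate's implementation):
# replay ordered change events mined from the source's redo log on the target
# through a standard DB-API connection, instead of re-querying the source tables.
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class ChangeEvent:                 # hypothetical structure for one mined log record
    op: str                        # "insert", "update", or "delete"
    table: str
    key: dict                      # primary-key column -> value
    row: Optional[dict] = None     # full row image for insert/update

def apply_changes(target, events: Iterable[ChangeEvent]) -> None:
    """Replay mined log events on the target connection in commit order."""
    cur = target.cursor()
    for ev in events:
        if ev.op == "insert":
            cols = ", ".join(ev.row)
            vals = ", ".join(f":{c}" for c in ev.row)
            cur.execute(f"INSERT INTO {ev.table} ({cols}) VALUES ({vals})", ev.row)
        elif ev.op == "update":
            sets = ", ".join(f"{c} = :new_{c}" for c in ev.row)
            cond = " AND ".join(f"{c} = :key_{c}" for c in ev.key)
            params = {f"new_{c}": v for c, v in ev.row.items()}
            params.update({f"key_{c}": v for c, v in ev.key.items()})
            cur.execute(f"UPDATE {ev.table} SET {sets} WHERE {cond}", params)
        elif ev.op == "delete":
            cond = " AND ".join(f"{c} = :{c}" for c in ev.key)
            cur.execute(f"DELETE FROM {ev.table} WHERE {cond}", ev.key)
    target.commit()                # apply the batch atomically on the target
```

The point mirrors the quote: the data comes from the log stream rather than from querying the source tables, and each change is applied to the target as soon as it is read.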
MJ
Excellent replication with good stability and very helpful support
I don't know how easy it would be to change the architecture in an already implemented replication. For example, if we have a certain way of architecting a particular database migration and want to change that partway through, is that an easy or difficult change? There was a need for us to change the architecture in the middle of the migration, but we didn't do it. We thought, "This is possibly complicated. Let's not change it in the middle," because we were approaching our cutover date. That was one thing we should have checked with support about, or covered in training. Also, it would help to have a separate section showing out-of-sync tables in Foglight, instead of looking through the "warning" messages.

Quotes from Members

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

Pros

"It enables us to transform data at the latest stage rather than in ETL loads, so it's more ELT which is one of the advantages. It is also in near real-time, which brings significant advantage for our embedded analytics approach."
"The cost is under control with this solution, unlike other services where it's not."
"The most valuable features of Qlik Replicate are its CDC performance and trigger functions. CDC feature is important to the financial industry."
"Qlik Replicate's ability to correctly map and replicate data, especially when converting complex fields like packed decimal fields into integers, stands out."
"Great with replicating and updating records."
"We use Qlik Replicate to change data capture of databases in production environments."
"A valuable feature of Qlik Replicate is that you do not need ETL. It's easy to use—you choose two systems and it automatically replicates them. Even if there is no CDC available, if you insert it and update it, and there is nothing to find out, then you can use Qlik Replicate. It's a good product."
"The CDC and the flexibility to use QVD as a source are the most valuable features of Qlik Replicate."
"I like SharePlex's Compare and Repair tool."
"The core features of the solution we like are the reliability of the data transfer and the accuracy of data read and write. The stability of the solution is also excellent."
"There are some capabilities within SharePlex where you can see how the data is migrating and if it still maintains good data integrity. For example, if there are some tables that get out of sync, there are ways to find them and fix the problem on the spot. Since these are very common issues, we can easily fix these types of problems using utilities, like compare and repair. So, if you find something is out of sync, then you can just repair that table. It basically syncs that table from source to target to see if there are any differences. It will then replicate those differences to the target."
"Because of the volume of the transactions, we heavily use a feature that allows SharePlex to replicate thousands of transactions. It's called PEP, Post Enhancement Performance, and that has helped us scale tremendously."
"The core replication and its performance. Performance is crucial, and SharePlex is by far the fastest. The way it handles replication to multiple targets along with basic filtering, as well as from multiple sources to a single target, is very efficient."
 

Cons

"In the next release, I would like to see closer integration with data catalyst."
"The product should improve its licensing limitation."
"The disadvantage is people are not going for this license because it is not marketed properly."
"Some features can also be overly dependent on each other. So, there is room for improvement."
"In various scenarios, an important consideration is when we encounter issues and Qlik Replicate suggests reloading a specific table. If we face any problems or encounter errors with that table, it becomes necessary to make a change in Qlik Replicate. Performing a full reload every time is not feasible or practical. Instead, we should identify the specific issues and address them without repeating the entire reloading process. Based on this approach, we can investigate and resolve the problem by performing targeted loads from the source itself. This change aligns with my perspective and is something I would like to implement."
"The solution's flexibility to work with APIs should also be improved since it is very weak in working with APIs."
"It's not possible to replicate the QVC files in data analytics."
"This product could be improved by providing more insight regarding errors. One of our customers that uses Qlik Replicate has had an issue. We tried to debug it, but we could not trace the error message. The infrastructure site should give us more insight about errors. Qlik Replicate is not a business solution, it's an IT solution. The reporting tools and bug site should be improved."
"For its function in relation to replication (i.e. filtering), I'd give it a six or seven out of 10. GoldenGate has much more functionality by comparison."
"I would like more ability to automate installation and configuration in line with some of the DevOps processes that are more mature in the market. That would be a considerable improvement."
"I don't know how easy it would be to change the architecture in an already implemented replication. For example, if we have a certain way of architecting for a particular database migration and want to change that during a period of time, is that an easy or difficult change? There was a need for us to change the architecture in-between the migration, but we didn't do it. We thought, "This is possibly complicated. Let's not change it in the middle because we were approaching our cutover date." That was one thing that we should have checked with support about for training."
"The reporting features need improvement. It would be very good for users to have a clear understanding of the status of replication."
"I would like the solution to have some kind of machine learning and AI capabilities. Often, if we want to improve the performance of posting, we have to bump up a parameter. That means we need to stop the process, come up with a figure that we want to bump the parameter up to, and then start SharePlex. Machine learning and AI capabilities for these kinds of improvement would tremendously help boost productivity for us."
 

Pricing and Cost Advice

"Unlike Azure, where you pay based on consumption, Qlik Replicate seems to charge per endpoint."
"Pricing for this solution is very reasonable."
"Overall, Qlik is an expensive solution. You need to pay for all additional features that you would like to use."
"Qlik Replicate is not expensive, compared to GoldenGate."
"Qlik Replicate is mainly suited for large companies. However, it is too costly for small businesses. Its pricing is high."
"On a scale from one to ten, where one is cheap, and ten is expensive, I rate Qlik Replicate's pricing a nine out of ten."
"It is not as expensive as Oracle GoldenGate and has worked really well within our budgets."
"It's really good value for the money. There are some things they could improve on, but in terms of the pricing, features, and support, as a holistic package, we are not thinking of anything else at this point in time."
 

Top Industries

By visitors reading reviews

Qlik Replicate                        Quest SharePlex
Financial Services Firm      19%      Financial Services Firm      19%
Computer Software Company    11%      Computer Software Company    11%
Insurance Company            10%      Real Estate/Law Firm          7%
Manufacturing Company        10%      Manufacturing Company         6%
 

Company Size

By reviewers: breakdown across Large Enterprise, Midsize Enterprise, and Small Business for Qlik Replicate; no data available for Quest SharePlex.
 

Questions from the Community

What do you like most about Qlik Replicate?
The main valuable feature is its real-time change data capture (CDC) capabilities, which process data with minimal latency. There is not much delay. It also performs well with batch-wise data appli...
What needs improvement with Qlik Replicate?
There is complexity involved in the licensing part of this system. It is a core-based licensing, which, especially in the banking industry, results in the system capacity being utilized up to a max...
What is your primary use case for Qlik Replicate?
An example involving the banking project I am currently working on is for a public sector bank in India. Primarily, they are using a database, and the ultra architecture is meant to take a backup o...
 

Also Known As

Replicate, Qlik Replicate
Dell SharePlex, SharePlex
 

Overview

 

Sample Customers

American Cancer Society, Fanzz, SM Retail, Smart Modular, Tangerine Bank, Wellcare
Bodybuilding.com, Priceline.com, Ameco Beijing, Viasat, SK Broadband