
Domo vs Teradata comparison


Executive Summary (updated on Oct 6, 2024)
 

Categories and Ranking

IBM Cognos (sponsored)
Ranking in BI (Business Intelligence) Tools: 7th
Average Rating: 8.0
Number of Reviews: 134
Ranking in other categories: Reporting (4th)

Domo
Ranking in BI (Business Intelligence) Tools: 11th
Average Rating: 7.8
Reviews Sentiment: 6.9
Number of Reviews: 36
Ranking in other categories: Data Integration (30th), Business Performance Management (13th), Reporting (8th), Data Visualization (7th)

Teradata
Ranking in BI (Business Intelligence) Tools: 10th
Average Rating: 8.2
Reviews Sentiment: 7.0
Number of Reviews: 76
Ranking in other categories: Customer Experience Management (5th), Backup and Recovery (19th), Data Integration (18th), Relational Databases Tools (7th), Data Warehouse (3rd), Marketing Management (6th), Cloud Data Warehouse (6th)
 

Featured Reviews

Carlos Larrad Salgado - PeerSpot reviewer
Improved the quality of our KPIs, while reducing calls to the IT department
I don't like that when we use Colab packages, we get less functionality. For example, you can group data with Excel or with the data sets from the packages, but when you use the Colab packages directly, you can only group the data when you analyze it with Analysis Studio. I think Cognos needs to improve this functionality.

The user experience is also very important. Cognos is sometimes not easy to understand, especially when the layout changes while the functionality stays the same. The help is not very visual and there are no examples. Cognos has to make a big effort to help users understand the functionality by improving the documentation. There is a lot of documentation, but the examples are hard to find, and the help section should be easier for non-technical users to understand.
James John Wilson - PeerSpot reviewer
Robust, powerful, and easy to use
There were very few cases, on some of the data tables, where I wished there was an additional feature or two, and they were fairly specific. What I wanted was the ability to collapse a group when you group a set of rows, say by status or health, so you have your red projects grouped up top: compress or collapse the red group, then open the yellow projects and then the green projects. Those were the few extra features I wanted to see in the tables.

They have a widget that you can use in Microsoft PowerPoint to pull data into your decks and refresh graphs, charts, metrics, or tables. I would love to see that available in Google Slides. I used it successfully in PowerPoint; however, one company I worked with used only Google products, so the widget didn't help with reporting in slides. We therefore had to do a bit more manual work for our quarterly and monthly business reviews to produce our executive presentations.

Sometimes the fonts were difficult to read if you were trying to put a lot of data in a table and show many rows. The fonts got too light, and you really had to play with it to figure out how to make it readable.

One thing I had to do, and I don't know if it's necessarily a bad thing, was turn off the data jobs when I was running a meeting. People were often scrambling in the background to make updates even as the meeting was occurring, which caused the page to render very slowly and sometimes pause or freeze. I found that if I turned off the data update jobs that were pulling data from Smartsheet, the meetings ran more smoothly, with no interruptions or delays.
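The manual slide-deck work the reviewer describes (pulling Domo data into presentations when the PowerPoint widget is unavailable, for example in a Google Slides shop) can often be scripted against Domo's public API instead. Below is a minimal sketch using the open-source pydomo client; the credentials, dataset ID, and output file are hypothetical placeholders, not anything taken from the review.

```python
# Minimal sketch: export a Domo dataset to CSV so it can be dropped into a
# slide deck manually. Assumes the pydomo client (pip install pydomo) and
# placeholder credentials/IDs; adjust to your own Domo instance.
from pydomo import Domo

CLIENT_ID = "your-client-id"                         # placeholder
CLIENT_SECRET = "your-client-secret"                 # placeholder
DATASET_ID = "00000000-0000-0000-0000-000000000000"  # placeholder dataset ID

# Authenticate against the Domo API host.
domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host="api.domo.com")

# Pull the dataset into a pandas DataFrame and write it out for slide prep.
df = domo.ds_get(DATASET_ID)
df.to_csv("quarterly_business_review.csv", index=False)
print(f"Exported {len(df)} rows for the QBR deck")
```

A script like this can be scheduled ahead of a quarterly or monthly business review so the export step no longer depends on the PowerPoint widget.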
SurjitChoudhury - PeerSpot reviewer
Offers seamless integration capabilities and performance optimization features, including extensive indexing and advanced tuning capabilities
We created and constructed the warehouse. We used multiple loading utilities such as MultiLoad, FastLoad, and TPump (Teradata Parallel Data Pump). But those are loading processes; Teradata is a powerful tool because, compared with older technologies, its architecture of nodes and virtual processors is a unique concept. Other technologies later adopted the concept of nodes: Informatica did so from PowerCenter version 7.x, moving from a client-server architecture to a node-based one. For example, the database can be available 24/7, 365 days a year; if one node fails, other nodes take over. Even Oracle databases have since moved their architectures in a similar direction. Teradata, however, started out with its own distinct architecture, which major vendors later adopted.

It has grown since, but from the start, whatever query we sent would be mapped to a particular component. From there it goes to the virtual processor and down to the disk, where the actual physical data is stored. In between there is a map, which acts like a data dictionary: it holds information about each piece of data, where it is loaded, and on which virtual processor or node the data resides. Teradata comes with a four-node architecture, or however many nodes we choose, and the initial cost is determined by that. It is a shared-nothing architecture: whatever task is given to a virtual processor is processed by it, and if there is a failure, another virtual processor takes over.

Moreover, this solution has impacted query time and data performance. In Teradata there is a lot of joining, partitioning, and indexing of records. There are primary and secondary indexes, hash indexing, and other indexing processes. To improve query performance, we first analyze the query and tune it. If a join needs a secondary index, which plays a major role in filtering records, we might reconstruct that particular table with the secondary index. This tuning involves partitioning and indexing, and we use these tools and techniques to fine-tune performance.

When it comes to integration, tools like Informatica connect seamlessly with Teradata. We ensure the Teradata database is configured correctly in Informatica, including the proper hostname and properties for the load process, and we didn't find any major complexity or issues with integration. But these technologies are quite old now. With newer big data technologies, we've worked with a four-layer architecture, pulling data from a Hadoop lake into Teradata: we configure Teradata with the appropriate hostname and credentials and use BTEQ queries to load the data. Previously, we converted the data warehouse to a CLD model per Teradata's standardized procedures, moving from an ETL to an EMT process. This allowed us to perform gap analysis on missing entities based on the model and retrieve them from the source system again. Overall, we found Teradata integration straightforward and compatible with other tools.
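To make the tuning steps the reviewer mentions (primary index, secondary index, partitioning, and analyzing a query before rebuilding a table) more concrete, here is a minimal sketch using Teradata's teradatasql Python driver. The host, credentials, table, and columns are illustrative placeholders, not the reviewer's actual warehouse objects.

```python
# Minimal sketch: create a partitioned Teradata table with a primary index,
# add a secondary index used for filtering, and EXPLAIN a query to see how
# the optimizer uses them. Connection details and table are placeholders.
import teradatasql  # Teradata SQL Driver for Python (pip install teradatasql)

DDL = """
CREATE TABLE sales_fact (
    sale_id   INTEGER NOT NULL,
    store_id  INTEGER,
    sale_date DATE,
    amount    DECIMAL(12, 2)
)
PRIMARY INDEX (sale_id)
PARTITION BY RANGE_N(sale_date BETWEEN DATE '2024-01-01'
                     AND DATE '2024-12-31' EACH INTERVAL '1' MONTH)
"""

SECONDARY_INDEX = "CREATE INDEX (store_id) ON sales_fact"

EXPLAIN_QUERY = """
EXPLAIN SELECT store_id, SUM(amount)
FROM sales_fact
WHERE store_id = 42
  AND sale_date BETWEEN DATE '2024-03-01' AND DATE '2024-03-31'
GROUP BY store_id
"""

with teradatasql.connect(host="tdhost.example.com", user="dbc", password="dbc") as con:
    with con.cursor() as cur:
        cur.execute(DDL)               # partitioned table, hash-distributed by sale_id
        cur.execute(SECONDARY_INDEX)   # secondary index to speed up store_id filters
        cur.execute(EXPLAIN_QUERY)     # inspect the optimizer's plan after tuning
        for (step,) in cur.fetchall():
            print(step)
```

Reading the EXPLAIN output before and after adding the secondary index or repartitioning is the usual way to confirm that the tuning actually changed the access path.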
 

Comparison Review

it_user232068 - PeerSpot reviewer
Aug 5, 2015
Netezza vs. Teradata
Originally published at https://www.linkedin.com/pulse/should-i-choose-net Two leading Massively Parallel Processing (MPP) architectures for Data Warehousing (DW) are IBM PureData System for Analytics (formerly Netezza) and Teradata. I thought talking about the similarities and differences…
 

Top Industries

By visitors reading reviews

IBM Cognos: Educational Organization 55%, Financial Services Firm 7%, Computer Software Company 4%, Government 4%
Domo: Computer Software Company 12%, University 9%, Manufacturing Company 9%, Retailer 8%
Teradata: Financial Services Firm 26%, Computer Software Company 11%, Manufacturing Company 8%, Healthcare Company 7%
 

Company Size

By reviewers: Large Enterprise, Midsize Enterprise, and Small Business.
 

Questions from the Community

Seeking lightweight open source BI software
It depends on the data architecture and the complexity of your requirements. Some great tools in the market are Qlik ...
What do you like most about IBM Cognos?
The solution's most valuable feature is its ease of use, which makes it easily compatible with other tools.
What needs improvement with IBM Cognos?
I need improvements, particularly with the Framework Manager, which has an outdated user interface from older version...
What do you like most about Domo?
All our client SLAs and daily and weekly dashboards are tracked on Domo.
What needs improvement with Domo?
Dashboard settings in Domo are not much different from other platforms like Power BI. Data integration is okay, but n...
Comparing Teradata and Oracle Database, which product do you think is better and why?
I have spoken to my colleagues about this comparison and in our collective opinion, the reason why some people may d...
Which companies use Teradata and who is it most suitable for?
Before my organization implemented this solution, we researched which big brands were using Teradata, so we knew if ...
Is Teradata a difficult solution to work with?
Teradata is not a difficult product to work with, especially since they offer you technical support at all levels if ...
 


Also Known As

IBM Cognos: Cognos, Cognos Analytics, IBM Cognos Analytics
Domo: Corda
Teradata: IntelliFlex, Aster Data Map Reduce, QueryGrid, Customer Interaction Manager, Digital Marketing Center, Data Mover, Data Stream Architecture
 


Sample Customers

IBM Cognos: More than 23,000 leading organizations across multiple industries use Cognos. Examples include BMW Financial Services, Quinte Health Care, Troy Corporation, Michigan State University, and GKN Land System.
Domo: Capco, SABMiller, Stance, eBay, Sage North America, Goodwill Industries of Central Indiana, Telus, The Cliffs, OGIO International Inc., and many more.
Teradata: Netflix