Director, Business Intelligence at a healthcare company with 5,001-10,000 employees
Real User
Top 10
Aug 23, 2024
When Salesforce bought the solution, they claimed that Einstein would become part of Tableau, but that has not materialized. Tableau's integration with Einstein Analytics could be better; it's not even up to par with other products. The ad hoc, self-service, and data analysis features are not working.
The product's features for cloud integration need improvement. They should revise licensing and pricing models to cater to smaller enterprises. Users should be able to customize and write their own code, as they can in one of its competitors. Many companies have in-house data science models for Twitter or Facebook based on predictive analysis. It should be possible to integrate these models seamlessly into Tableau.
Tableau is excellent at visualizing data; however, I think improving the data preparation features would be a great addition. Navigating through activities like cleansing, reshaping, and wrangling extensive or complicated datasets can prove challenging within the Tableau environment. The settings for working with complex datasets also need to be improved. In the next version, it would be good to add user-friendly resources for beginners, such as interactive tutorials and templates, to make Tableau even more accessible to a wider audience.
The product's readability could be better. When we put more information on a single screen, it gets compressed and superimposed in many places while scrolling.
CEO at a tech services company with 1-10 employees
Real User
Top 20
Jul 12, 2023
One specific thing I have looked into in Tableau is how to add pictures to reports. I don't have the ability in Tableau to create a tooltip and see a picture of a piece of jewelry or a watch that is a best seller.
Despite the number of available interface languages, I would like to see as many languages as possible represented in Tableau. So many companies really want to use this program, and it suits them better than any other similar program, but due to a lack of understanding of English, it is difficult for them to decide to integrate this particular product. In addition to the existing set of common data types, Tableau Desktop has a wonderful feature: a multi-table union of data based on relationships. This type of union makes it very easy to work with the data source itself and does not overload the system. At the moment, this type of union only exists for connections from databases. It would be nice if Tableau added this type of merging for data sources that are on Tableau Server/Tableau Cloud. Every three months, Tableau adds new features. There are always new features coming up.
When you're working on a dashboard, you can't select multiple components at a time and align them, so you have to go one by one. This is very cumbersome if you're floating, and it loses in comparison to Power BI, which does allow multiple selections. In the next release, I would like to see an enhancement of the prescriptive analytics features.
Expert Analyst at a tech services company with 501-1,000 employees
Real User
Feb 25, 2022
When you create new fields in Tableau and enter formulas, there is a small window in the interface where you enter the calculated fields; it could be more user-friendly. At this time it is limited and hard to understand at the beginning. The fields should be easier to use, as in Microsoft Excel. You can have a difficult time understanding what to do in the fields, and you end up doing trial and error to figure it out.
The development part should be better. We are putting a lot of effort in during development, so if we face any struggles, we have to find workaround solutions on the internet. It would be nice to have new workaround solutions and other options. Every customer has different expectations, so sometimes it's hard to find the right solution.
Most of the problems in Tableau Online that I have noticed have to do with performance or weird, inexplicable bugs that I can't pin down. For example, you might try unloading some data, and you'll be waiting for a long time without anything happening. These bugs always seem to happen when we perform big upgrades or do maintenance work, and we have had to send a lot of tickets for unexplained issues during these times. It doesn't seem to be a problem only for us, but also for customers all over the world, such as in Ireland, Western Europe, Eastern Europe, and the US, too. As for future features, I would like to see major upgrades in Bridge and the Flow Tool, allowing us to do more data engineering work. I think it would give Tableau a big edge in the market to look into how to incorporate more data engineering tools into their product. Besides that, I would also like the charts to be more realistic and easier on the eyes.
There were a lot of dashboards everywhere in the organization; however, when the company wanted to access the operational databases, they were not connected. The solution needs to improve its integration capabilities. The performance and security could be better. Many people saw Tableau as a silver bullet, and it isn't. It's good for small things, however, not for an institutional way of doing things. I'd like to see better integration with SAP. I'd like an integrated ETL or some sort of data preparation capability.
Director, Business Intelligence at a healthcare company with 5,001-10,000 employees
Real User
Top 10
Dec 2, 2021
We need big servers to perform the operations that we are doing. They should probably relook at its architecture. There are limitations to the data source that we are building. We can put only 32 tables in a data source, which means we have to transfer some of the workload to a database.
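One common workaround for that 32-table ceiling is to consolidate the joins inside the database so Tableau connects to a single object. Below is a minimal sketch using Python's built-in SQLite as a stand-in database; the table names, columns, and figures are invented purely for illustration.

```python
import sqlite3

# Sketch: collapse several joined tables into one database view so the
# Tableau data source only needs a single object instead of many tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE visits (id INTEGER PRIMARY KEY, patient_id INTEGER, cost REAL);
INSERT INTO patients VALUES (1, 'A'), (2, 'B');
INSERT INTO visits VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);

-- One view stands in for what would otherwise be multiple tables
-- inside the Tableau data source.
CREATE VIEW patient_costs AS
SELECT p.name, SUM(v.cost) AS total_cost
FROM patients p JOIN visits v ON v.patient_id = p.id
GROUP BY p.name;
""")
rows = cur.execute(
    "SELECT name, total_cost FROM patient_costs ORDER BY name"
).fetchall()
print(rows)  # [('A', 200.0), ('B', 50.0)]
```

Tableau then sees `patient_costs` as one table, keeping the data source well under the limit while the heavy joining stays in the database.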
Senior tech architect at a computer software company with 1,001-5,000 employees
Real User
Nov 23, 2021
One thing I would want to change in Tableau is to have a lower-cost model; it's pretty high for enterprise deployment. In the next release, I would like the capability to call machine learning models within Python while I'm building a dashboard. The value calculation should be a machine learning model that is running somewhere else, on, say, Amazon. These tools give good outputs, like calculated fields and all, but today the outputs are not straightforward. In simple terms, I need machine learning on the fly. That is not there.
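Tableau's analytics extension, TabPy, is the closest existing route to this: a calculated field can call a Python function deployed to a TabPy server. This is only a sketch; the `score` function, its toy coefficient, and the field names in the commented calculated field are assumptions for illustration, not a real trained model.

```python
# Sketch of a scoring function you might deploy to a TabPy server so a
# Tableau calculated field can call a model "on the fly". The model is a
# stand-in: a real deployment would load one trained elsewhere (e.g. on
# Amazon) instead of using this toy coefficient.

def score(prices, quantities):
    # TabPy passes each argument as a list, one element per row/partition.
    coef = 0.8  # hypothetical model coefficient, for illustration only
    return [coef * p * q for p, q in zip(prices, quantities)]

# In Tableau, the calculated field would look something like:
#   SCRIPT_REAL("return tabpy.query('score', _arg1, _arg2)['response']",
#               SUM([Price]), SUM([Quantity]))

print(score([10.0, 20.0], [2, 3]))  # [16.0, 48.0]
```

The dashboard stays drag-and-drop; only the scoring logic lives in Python, which is roughly the "machine learning on the fly" the reviewer asks for.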
Business Analyst at a media company with 10,001+ employees
Real User
Nov 3, 2021
The price is definitely a point that can be improved, because smaller firms, like my bank firm, don't use Tableau because it's an expensive tool. If there were an option that catered toward smaller firms, that would be great, because Tableau does in fact help with a lot of different kinds of data sources. For instance, it lets you upload CSV or Excel files. However, other tools that we currently use, such as Looker, do not let you upload Excel files for ad hoc analysis. So, definitely, this is something price-wise that can be catered toward smaller firms. Creating variables and new fields in Tableau during analysis actually adds columns to the data. It would be good to have an option there: do you want it as a column added to the data set, or ad hoc in the visualization sheet? If you create a measure or a dimension, that creates a new column, but if you try to create a new filter directly on the visualization, it doesn't let you rename it; what you see is just the calculation that you put in there. If you want to create something without making it an extra column in the data set, you can't rename it to a more user-friendly short name. An improvement would be adding the ability to rename ad hoc creations when you create a mark or a filter on the visualization, since those don't really get added to the actual data fields.
They currently don't have a great Workday connector. Right now, Tableau can connect to more than 80 different types of databases or data sources, but it's challenging to connect with a few types, like Workday. So if they can come up with a better version or a connector for Workday, it will solve a lot of problems.
There is a lot more that can be done with Tableau than what is actually happening within Juniper. The company is not getting the answers to its questions directly from the Tableau database, for example. Of course, Tableau can be extended to answer those questions. What is happening, with so many tools coming up in the market, is that people have to continuously get educated in order to use some of the more advanced features. With Tableau, except for the dashboard view and all the filtering that happens from a dashboard perspective, it doesn't seem very good at making me understand trend insights. For example, if I saw that the average sales price for Product A was lower than the average sales price for Product B, I'm not saying that B is inferior to A or anything; I'm just noting what I found, and I cannot give more details. It doesn't go deeper into the analysis. I'd like more analysis to better understand what a trend might mean, and not just a report that a trend is happening. Right now, Tableau is not so good at providing that extra bit of insight. Tableau data is used very often: from the quarterly business reviews onward, the executives have direct access to the Tableau dashboard, and more than anything else, they're able to do all this filtering. They could probably improve the user interface response times. When it comes to slicing and dicing data and viewing the results, it needs to be easier in general, as executives are using it and looking at it, and they are not very technical. When executives look at the Tableau dashboard, they want to know why, for example, Product A is bringing in less than Product B. Those kinds of key questions, which come from executives reviewing the Tableau data, need to be addressed in a simple-to-understand way. I think Tableau has to work a little more on the business insights aspect, where it communicates with the user and answers their questions.
That intelligence part needs to be developed in Tableau. Something great would be if, as in Google, you could ask a question and it could feed back potential information. I don't want to compare everything to Google; however, it's so easy to find the answers you need in the way Google is set up. If Tableau could do something similar to showcase answers to questions, that would be ideal. It needs some sort of smart dashboard.
Manager, BI & Analytics at Perceptive Analytics
Real User
Sep 10, 2021
An advanced type of visualization is a bit tricky to create. It has something called a Calculated field, and that sometimes gets a bit difficult to use when you want to create an advanced type of visualization.
Global Head of Professional Services at Arteria AI
Consultant
Aug 31, 2021
From a downside perspective, some of the more advanced modeling techniques are actually fairly difficult to do. In addition, I just fundamentally disagree with the way you have to implement them, because you can get incorrect answers in some cases. One of the key challenges is that you never know whether it is how your developers developed it or whether it was the tool. We did find that once we got into more complex models, keeping objects that should tally the same way, but didn't, in agreement became more and more difficult. That was probably the big thing for me. I don't know enough about how the tool was developed to know whether that was because they didn't follow a recommended practice, but it was probably the number one thing I found frustrating. When we started to try to get into some very granular data sets with complex relationships in them, the performance degraded pretty quickly. It degraded to such an extent that we couldn't use it; we had to change what we were trying to do and manage its scope so that we could get what we wanted out of it, or reduce the scope of what we needed. It doesn't have a database behind it, per se, so while doing some of the more complicated things that you might otherwise do on a database, we started hitting some pretty significant challenges.
There should be a focus on in-memory data, which is the core concept of Tableau: they squeeze the data into memory. Because of that, we see performance issues on the dashboards. The architecture should be improved in such a way that the data can be better handled, like we see in other market tools, such as Domo, in which everything is cloud-based. We did a POC in which we compared Tableau with Domo, and performance-wise the latter is much better. As such, the architecture should be improved to better handle the data. We are seeing a shift from Tableau to Power BI, toward which most users are gravitating. This owes itself to ease of use and their mindset of making use of Excel; Power BI offers greater ease of use. For the most part, when comparing all the BI tools, one sees that they work in the same format, but if a single one must be chosen, one looks at where one's data can be integrated best. Take real-time data, for example. I know that they have the live connection, but they can still improve that data modeling space.
Associate at a financial services firm with 10,001+ employees
Real User
Jul 11, 2021
I have noticed that Tableau is not very compatible with ClickHouse. There's no direct connection to ClickHouse; you have to set up an ODBC connection. Tableau's performance takes a hit if you have huge data. The stability and scalability could be improved.
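For reference, the ODBC workaround looks roughly like this: build a connection string for the ClickHouse ODBC driver and hand it to an ODBC client such as pyodbc. The driver name, host, and credentials below are assumptions; check them against your actual driver installation.

```python
# Minimal sketch of the ClickHouse-via-ODBC workaround. All names here
# (driver string, host, database, user) are illustrative assumptions.

def clickhouse_odbc_conn_str(host, port=8123, database="default",
                             user="default", password=""):
    """Assemble an ODBC connection string for the ClickHouse driver."""
    return (
        "Driver={ClickHouse ODBC Driver (Unicode)};"
        f"Url=http://{host}:{port};"
        f"Database={database};Uid={user};Pwd={password};"
    )

conn_str = clickhouse_odbc_conn_str("ch.example.internal")
print(conn_str)

# With the driver installed, you would then connect, e.g.:
#   import pyodbc
#   conn = pyodbc.connect(conn_str)
```

Tableau itself would point at the same ODBC DSN, which is why a missing native connector shows up as extra setup work rather than a hard blocker.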
The process of embedding the dashboards on external portals and websites could be improved. We also experienced challenges with integration with analytics. In an upcoming release, if the capabilities of Tableau Prep were improved and expanded, that would be an added advantage.
Operations & BI Analyst at American Hospital Dubai
Real User
Apr 5, 2021
There is another ETL tool for Tableau that is new, and it takes time to reach some level of experience with it. In Power BI, they have Power Query. I find it easier to convert the information in Power Query with a single shortcut key; that's not an option in Tableau. You have to prepare your data, and it will take a lot of time to clean the data. There's no mature ETL tool in Tableau, which is quite a negative for them. They need to offer some built-in ETL tool that has nice and easy drag-and-drop functionality. There needs to be a bit more integration capability.
Its price should be improved; it is much higher than Power BI and QlikView. Programming is not easy in Tableau. For programming, you have to have a separate module. They should include programming directly in the web portion of Tableau Desktop so that people can write Python or JavaScript code for customizations instead of using a different module. Currently, Tableau Data Prep is a separate application that you have to purchase. It would be helpful if they could include Tableau Data Prep and programming languages such as R and Python in the next version. Tableau Public, which is the community version, doesn't allow you to save your work on your desktop; they should allow it. Currently, you can only upload it to the community.
The product could be improved with more features in data analytics. Tableau is not currently a good database for handling built-in models for data science in order to test, train and run the models. It's not currently an AI tool or a tool for machine learning. Right now it's more for non-expert users. If they could improve their analytical capabilities for data science tasks, it would be a better product. In order to carry out data science tasks now, we have to use Vertica for big data projects to discover and run machine learning models. It would be very good if they had their own machine learning capabilities built in. I'd like to see more features in data analytics, AI and machine learning capabilities.
Lead Data Scientist at International School of Engineering
Real User
Top 10
Mar 1, 2021
I have used Power BI as well as Tableau. There are a couple of interesting features that I like in Power BI that are not present in Tableau. For example, in Power BI, if I am looking at country-wise population, I can type and ask for the country that has the maximum population, and it will automatically answer and address that query. This kind of feature is not there in Tableau. Similarly, Power BI integrates with the latest ML algorithms; we have decision trees and multiple other machine learning algorithms. The decision tree essentially visualizes the patterns in the data. We don't have such a feature in Tableau. If Tableau could integrate with machine learning algorithms and help us do visualizations, it would be a wonderful combination. Most people go for Tableau primarily for visualization purposes. However, in the data science industry, users want to do model building as well as tell a story. As of now, Tableau fulfills the requirements for visualization purposes. If they can bring it up to a level where I can use it for machine learning purposes as well as for visualization, it would be very helpful. Many people who want to do data science don't want to write code. Tableau is already a drag-and-drop tool, and if they can provide those options as well, it will be a powerful combination.
Principal Partner at a tech consulting company with 51-200 employees
Real User
Feb 23, 2021
With Tableau, when you're dealing with very large datasets, it can be slow so the performance is an area that can be improved. The security can be improved.
An area needing improvement involves the complexity of the product should you need to alter a lot of parameters. Generally speaking, it's straightforward and very easy, and implementation problems can be dealt with by the client in place of a consultant. Let me give you some examples of things that could take long in a Tableau implementation. Suppose you have five different business areas in your company: marketing, supply chain, finance, HR, and procurement. Suppose that access to HR salaries is not company-wide but is limited to only a select number of people in HR, such as the manager or the director of the department, yet I want people in the supply chain to be able to see and access different data from different areas. While this would not be technically difficult, it would be time-consuming if the businesses are very particular. There may be many policies involved in access authorization, data availability, and the like. This can involve a very strict security process using an outside identity provider. Instead of just logging in with your username and password, you may have different technologies that are safer and more secure and that need different providers to interface with Tableau. Depending on the need, this will be time-consuming. For instance, while I don't know how this would be in your country, suppose you have an identity provider in Brazil working with Tableau. If you go to Asia, you may sometimes have a biometric identity, using your hand or fingers, to authenticate you. In that circumstance, they are going to send a number or a code to your cellphone, requiring two steps: one to enter the bank and the other to withdraw your money. These are what we call outside identity providers, meaning different vendors or companies who manage the servers that manage identities. This would entail an integration between Tableau and these outside companies for security purposes.
This would involve them sending me files and me sending them back in order to authenticate the user into the Tableau server. This can be time-consuming because it involves or requires a different partner. Tableau is made for basic needs, such as requiring a user and a password to log in to the server, an unsophisticated architecture, or use of a single server instead of a cluster of servers. If you have non-specific data security needs or you just want to analyze and sell data, that can take less than a day. But if you have technical servers, many interfaces, different providers, and more serious processes, that will be time-consuming. While Tableau does integrate with R server and Python server, the integration process is slow and the information is integrated in a protracted fashion. Sometimes your data will vary: you may have a vector of data, or you may have a matrix of data. For some algorithms we do not use regular data but a different data structure, and Tableau does not work with these different data structures. As such, interfacing with R server and Python server, which serve languages that are widely used in machine learning, all happens slowly; it does not work with a matrix of data or a data vector.
The price could be better. The overall scalability can also be improved. I would like to see more machine learning components to do predictive analytics. It should be simple for our customers to use. Tableau should include an automated machine learning feature in the next release.
An issue that is common to both Tableau and Power BI is with large data sets; the data should be extracted faster. Tableau should offer end users a free desktop version where they can go in and practice. Other solutions offer this for free, such as Huawei, and the desktop version of Power BI is also free. People may want to learn visualization, but if they don't have a proper tool in place, they don't know how or where to go to learn. If you give them the tool to learn and let them explore, then when they want to go into production, people are able to purchase the license. A 14-day trial version would not be enough time.
Research & Development Expert at a energy/utilities company with 11-50 employees
Real User
Feb 3, 2021
The integration with other programming languages, like Python, needs to be better. I know the capability is there; however, there needs to be better integration. There also needs to be integration for machine learning and AI. That would help data analysts and data scientists quite a bit.
Senior Software Engineer at a tech services company with 1,001-5,000 employees
Real User
Feb 2, 2021
This solution has some features which really need to be improved. For example, the sorting feature: if we compare it with ClixSense, ClixSense has a direct sorting feature available to users, whereas in Tableau we have to go and create a parameter, make it dynamic, and force users to click somewhere else on the filter, and then maybe you can sort it. Tableau is really behind on sorting features. With performance tuning, it generates a pretty complex query when it is not required. We do not actually write 100 lines of code for a single KPI indicator; what we do is run the performance tuning model, which will show 100-200 lines of code for a single KPI. That is not an optimized query. When running performance tuning on the query, it should be pretty optimized, but it does not seem to be.
Lead Data Scientist at a financial services firm with 11-50 employees
Real User
Jan 15, 2021
It would be good if the server could be more stable, and I would like technical service to be more reliable. I would like a better response time, without having to wait for a week just to get feedback.
General Surgeon at a healthcare company with 51-200 employees
Real User
Jan 14, 2021
Some of the functionality of the dashboard can be difficult to operate, and the color palettes are limited. They need to improve the icons and the filters because they look too old, resembling Excel from 1997. It would be helpful if the solution was less difficult to use.
I think predictive analytics is the main driver of business decisions, and hence Tableau should strengthen its ability to make predictions. The forecasting feature in Tableau, in my view, is too limited because it must have dates, but I should be able to predict the outcome of an event without having a date as part of the input. In situations where you are analyzing or using just one measure, such as Sales, Tableau does not create the header for you, and it is not straightforward how to create it. I would also like the ability to perform multiple pivots and create different variables. For example, if I have the regional population for six regions and branch offices, together with the number of clients per branch, all as one record or observation, then I should be able to pivot them separately, resulting in Region, Population, Branch, and Clients.
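The separate pivots described above can be sketched outside Tableau in a few lines. The field names (Region, Population, Branch, Clients) follow the reviewer's example; the values are invented for illustration.

```python
# Sketch of the "multiple pivots" request: one wide record holding both
# regional and branch figures, pivoted into two independent long-form sets.

record = {
    "Region": ["North", "South"],
    "Population": [1_200_000, 800_000],
    "Branch": ["B1", "B2", "B3"],
    "Clients": [300, 450, 150],
}

# Pivot 1: Region / Population pairs.
region_rows = list(zip(record["Region"], record["Population"]))
# Pivot 2: Branch / Clients pairs, independent of the first pivot.
branch_rows = list(zip(record["Branch"], record["Clients"]))

print(region_rows)  # [('North', 1200000), ('South', 800000)]
print(branch_rows)  # [('B1', 300), ('B2', 450), ('B3', 150)]
```

Tableau's built-in pivot works on one set of columns at a time, which is why reshaping a record like this currently has to happen in a prep step rather than in the worksheet.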
Director of Professional Services, Analytics at a computer software company with 5,001-10,000 employees
Real User
Dec 25, 2020
With Tableau, there is a gap in its ability to handle very large-scale data. I would like it to be similar to the rest of the solutions, which can handle terabytes of data.
I attended a Tableau conference recently, and a quick improvement came to mind. When I am training people how to use Tableau, I've come across situations where I've found it difficult to explain relationships. For example, when you want to blend data or show relationships, like when linking multiple tables: if you're an IT person, that's easy, but if you are not, you don't know anything about entity relationships, and it becomes a bit difficult to follow along. It takes me a long time to get people to understand, even when I feel I have gone to the lowest level I can in terms of explaining it. I realized that many people don't really have any experience or knowledge of relationships between objects, which makes it hard for me to get my teaching across. So I suspected, and I think I made this recommendation, that Tableau could find an easier way to introduce relationships. For now, if you want to build relationships in Tableau, or even in Excel, you have things like Access modules and Sheets. But how do I know that I need to use one object with another for the relationship? And if you then put in a table, what do you do after that? You have to double-click, but people don't know that. I was hoping they could make that process a bit easier, though I don't know how they would do it. Perhaps when you load Tableau and connect to a data source, there would be a prompt that asks if you want to link two tables together; if so, maybe you do A, B, C, D. That might help with the self-service idea. If you're talking about self-service, then it should be easy for people who do not have the time, or who do not have an IT background, to pick the data and use it correctly. In addition, and more generally, what I would like to see more support for is predictive analytics.
When you're doing descriptive analysis, Tableau is excellent, and it's easy to do. But when you are trying to predict something, as with Tableau's forecasting feature, it seems to require date fields or it won't work. But I can forecast something without relying on date fields; maybe I want to predict that a branch has to close if it doesn't turn something around soon. I don't need dates to do that. For this reason, I'm using Alteryx for predictive modeling instead of Tableau. Overall, the only major frustration I have had so far is with Tableau Public. I first used Tableau Public when I was building capacity, and when there was a later release to download and you wanted to upgrade, all your work would have to be manually re-entered. I don't know how they can solve that. I was expecting that they might make a release for this upgrade path, where I can hit upgrade and it will install over whatever I have already. Otherwise, for now I think they are doing well, and I know they're still adding a lot of features. But it does sometimes make our work difficult for those of us who are building capacity and regularly moving people around; it means you have to keep learning all the time. Another small detail for improvement: when you draw bar charts, the default color could be something more neutral, like gray. Instead, the default is blue, and I don't exactly get why this is the case.
Manager at a tech services company with 1,001-5,000 employees
Real User
Dec 4, 2020
I'm not sure if the solution needs any improvements; it's the best solution we have here right now. The pricing is high. I'm using a student license; however, I know that even this license is very expensive. I've tried to get this product into our organization, but it's quite expensive and we don't have the internal budget. If you're looking at other kinds of data, for example, unstructured data, they could make it much easier to use. Tableau could create other features just for visualizing unstructured data. It's a beautiful solution when you've got frames and tables: it's structured. However, if you don't have that kind of structure in the data, it's quite difficult to use Tableau. I would say that any feature that opens the opportunity to work with unstructured data would be excellent. For example, we end up creating a lot of word clouds, and with unstructured data it just doesn't translate quite right. If you could use unstructured data to count the frequency of important words, to find which word is more important, that would be useful. I don't see Tableau doing this: counting the frequency of important words in a specific kind of text. It would also be great if there was statistical modeling for unstructured data.
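As a point of comparison, the word-frequency step behind a word cloud is straightforward to do outside Tableau before loading the counts in as structured data. This is a minimal sketch; the stop-word list and sample text are illustrative.

```python
from collections import Counter

# Sketch: count important words in free text, skipping a small
# (illustrative) stop-word list, as a pre-processing step for a word cloud.

STOP_WORDS = {"the", "a", "and", "of", "to", "is", "it"}

def word_frequencies(text, top_n=3):
    words = [w.strip(".,!?").lower() for w in text.split()]
    counts = Counter(w for w in words if w and w not in STOP_WORDS)
    return counts.most_common(top_n)

sample = "The data is the key. Data quality and data access drive value."
print(word_frequencies(sample))  # [('data', 3), ('key', 1), ('quality', 1)]
```

The resulting (word, count) pairs are exactly the kind of structured table Tableau handles well, which is why the gap the reviewer describes sits in the pre-processing rather than the visualization.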
Service Delivery Manager / Architect at a tech services company with 201-500 employees
Real User
Dec 3, 2020
Scalability for large amounts of data needs improvement, as does its performance. From a scheduling perspective, when we sync the desktop dashboard up to the server so that we can publish it as a web version in an accessible way, that publishing stalls and keeps on executing for hours. This can go on for eight to nine hours, but you have no indicator; you don't even see that it is processing. For example, there is no spinning wheel, and all I see is a black screen. The interface can be improved, in part because there is no indication that something is running or processing. I would like some kind of indication on the interface that something is processing. Technical support could be faster, or if the product has any limitations, they should openly communicate them. They could also just tell you that this product is intended for small volumes of data and perhaps even suggest another solution.
Performance and Business Intelligence Specialist at a transportation company with 1,001-5,000 employees
Real User
Nov 19, 2020
All of the BI tools have graphical interfaces, but when it comes to the learning environment, not every tool has everything. To be the best in the market, Tableau has to improve its user interface and also look into developing and implementing the best machine learning algorithms. Including data storage capabilities would be helpful. During the data crunching phase, it takes time for Tableau to connect, integrate, and download the data; in general, the ETL process takes a lot of time. Increasing the trial period to six months would allow people to better learn and assess the tool to determine whether it suits their needs. Given the price of BI tools, Tableau should consider giving scholarships to people so that they can learn how to work with the tool; it would help some of the people who lost their jobs during this pandemic. If users learn and become certified on Tableau, it would help get more people interested in the tool.
Technical Architect - Sr. Manager at Axtria - Ingenious Insights
Real User
2020-11-18T13:55:12Z
Nov 18, 2020
The data processing in Tableau is pathetic compared to Qlik. In Qlik, I can replace my ETL layer for an application; this can't be done in Tableau. The initial processing of data in Tableau takes a lot of effort. It would be good to have a feature that exports a particular visual, or just the data behind it, in a single click: one button on a visual that exports the relevant data to an Excel or CSV output.
Chief SAP - ICT (Digital & IT) at a energy/utilities company with 1,001-5,000 employees
Real User
2020-10-29T13:17:37Z
Oct 29, 2020
The licensing costs of Tableau are on the higher side; if they want more adoption across business divisions, they need more reasonable license pricing. Tableau is a standalone product, and that is a disadvantage. Because it is standalone, it has to extract data from other ERP systems, bespoke systems, and other data systems. If you have big data systems and other decision-making tools, and the data is being extracted into Tableau, it is dependent on many other platforms. In contrast, if you use SAP's vertical data systems and SAP's Data Hub, everything is vertically integrated: the whole data pipeline is vertically integrated, and there is a visualization screen right there as well, so you don't normally need a separate integration process or a data extraction solution. In the end, Tableau has two or three disadvantages: it is not a seamlessly integrated, end-to-end platform but purely a standalone reporting tool; the licensing cost is extremely high; and IT divisions are probably a little hesitant to use Tableau because separate training is required and separate skill sets are needed to develop everything. The cost of owning Tableau solutions is much higher compared to other analytical solutions.
Architect at a tech services company with 1,001-5,000 employees
Real User
2020-10-28T14:02:14Z
Oct 28, 2020
I am a BI consultant. I have worked on different reporting tools, such as Power BI and MicroStrategy. Compared to other tools, Tableau lags behind in handling huge enterprise-level data in terms of robust security and a single integrated metadata concept. When we connect to very large databases, I have sometimes found Tableau a little slow performance-wise. It could have a single metadata concept, like other tools, for reusing objects across multiple reports.
Tableau would be really good if we could have predefined templates. I was doing a POC with another, newer tool, Einstein Analytics, which has predefined templates already set up. These predefined templates do the heavy lifting for the initial dashboards, so we don't have to build them from scratch; our dashboards look really good, and the templates get us 20 to 30% of the way to the final look and feel. If Tableau worked on predefined templates, it would be very helpful to a lot of companies and would save time for developers. The pricing is a bit higher than the competition; they'll need to lower it to stay competitive. They also need to move more into machine learning and AI. In the POC that I'm doing with Einstein Analytics, that product is more into machine learning and AI, and Tableau is lagging as of now. If they want to have a long run in the market, they need to integrate machine learning and AI, and it has to be very robust.
Manager at a financial services firm with 1,001-5,000 employees
Real User
2020-07-15T07:11:33Z
Jul 15, 2020
Data cleansing and data transformation functionality need to be improved. Tableau is not a full-stack BI tool, like Sisense. Including this type of functionality would add flavor to the tool. The main point is that Tableau requires the data to be in a certain format for the end-user, in order for them to create charts. If it's not in a certain format, or in a certain structure, then the user will have to manipulate it. The charts in Tableau are quite limited.
Product Manager at TCG Digital Solutions Private Limited
Real User
2020-07-06T08:10:48Z
Jul 6, 2020
The solution requires a lot of user training before reports can be created. That can make things difficult and requires us to have Tableau specialists; it's difficult for a newbie to start developing reports. Tableau's querying, analytics, and development experience could be improved. The solution could also include an option to incorporate more open-source libraries. I know Tableau has a closed loop, so they might not want to provide that, but if they did have integration capabilities with open-source libraries, I think that would be great.
The cost of the solution should be improved. Reports should be downloadable as PDF files. Emails containing images of dashboards can be scheduled, but there is still demand for creating printable PDF snapshot views of dashboards. UPDATE - In fairness to Tableau, with the right design, dashboards that are downloadable can be created ad-hoc.
Licensing and pricing options could be made better so that more users would be able to use it. The biggest concern any organization has is its budget when trying to implement a new product. Tableau is an extremely powerful tool and hence expensive, but if there was a way to cut down the cost they would end up attracting more users.
Improvements can be made in template support. The workbook file structure is really hard to version control; if there were some sort of version-control support, particularly for workbooks, that would help big time. Another note is that the interactions within the UI are not fast enough, and in certain instances there have been issues with the intuitiveness of the tool, such as delays in configuring and achieving some specific effects. I have to say Tableau does have excellent and extensive online support.
Vice President Engineering Intellicloud at a university with 1,001-5,000 employees
Real User
2019-04-02T07:02:00Z
Apr 2, 2019
I have a lot of experience with the desktop version of Tableau. My recommendations for improvement for Tableau would be: * From the developer perspective, the data connection handling of the target data set is what most needs to be improved. * Tableau keeps evolving with each version. With Tableau 2019.2, they're again coming out with some more features. * Data preparation is an area where Tableau needs to do a lot of work. Every time with Tableau, you have to invest a lot of time preparing the data before you start using the visualizations. * Tableau doesn't perform well on big data processes. Suppose I am working with a file of 1 or 2 gigabytes; in that case, Tableau is really slow. Sometimes I feel that Tableau is too slow when you have a big data file.
Their training could be improved. I've been looking for ways in which we can start training more people on it, and I have found that other platforms have more accessible training than Tableau.
Program Manager at a non-profit with 1,001-5,000 employees
Real User
2018-09-25T09:23:00Z
Sep 25, 2018
I would like to be able to set the parameters in a more specific manner. I feel as if it's not a question of whether the solution is sufficient; it's whether we understand how to use it to its full productivity.
I would like them to include the Italian language, even if it's not a problem for me to use English, because the Quantrix modeler is only in English. I can also see there is Portuguese, Japanese, and Chinese, so why not Italian?
The use of this service in the desktop version is annoying due to the constant updates which lead to reinstalling the application. If they could give support with updates on the same downloaded version, it would be great.
Director, Data-Driven Innovation with 201-500 employees
User
2018-05-29T08:45:00Z
May 29, 2018
We would much appreciate an option for copying/moving objects between different pages and a possibility for teamwork when working on the same dashboards.
* Conditional formatting could be an interesting feature to provide to final users. It is a long-term request of our users. * The data preparation/blending options are very basic. They could be improved. * They could be more willing to hear customer/user suggestions.
Manager, BI & Analytics at Perceptive Analytics
Real User
2016-07-05T10:04:00Z
Jul 5, 2016
I would like to see the inclusion of a template to create a speedometer chart. I can understand that Tableau doesn’t have it as one of its default chart types because it’s not a good way to represent the data. Indeed that’s true, but speedometers are quite popular and once we had a client who was insistent on having highly-customizable speedometers and I had to spend a good amount of time to create them via multiple workarounds. In my experience, I've seen many customers who do not want to consider alternatives to speedometers. I’ll address these two points: * Speedometers/dial charts are a not-so-good way to represent data * I had to resort to multiple workarounds to create a speedometer in Tableau First, I’ll give you a few reasons as to why speedometers are not considered to be a good way to visualize data: * Low data-ink ratio: ‘Data’ here refers to the data that you want to show on your chart/graph and ‘ink’ refers to the aesthetic elements of the chart such as lines, colors, indicators or any other designs. A low data-ink ratio implies that the quantity of ‘ink’ on the chart is very high relative to the small quantity of ‘data’ that is present on the chart. What does a speedometer or a dial chart do? It shows you the current state (value) of any system. Therefore, the data shown by the chart is just one number. Let’s come to the ‘ink’ part. Needless to say, there is a lot of ‘ink’ on a speedometer chart – so many numbers all around the dial, the dial itself, a needle that points to the actual number etc. The fundamental principle of data visualization is to communicate information in the simplest way possible, without complicating things. Therefore, best practices in data visualization are aimed at reducing visual clutter because this will ensure that the viewer gets the message – the right message – quickly, without being distracted or confused by unnecessary elements. 
* Make perception difficult: The human brain compares lines better than it does angles; information in a linear structure is perceived more easily and quickly than in a radial one. Let's say I'm showing multiple gauges on the same screen. What's the purpose of visualizing data? It's to enable the user to derive insights, insights upon which decisions can be taken. The more accurate the insights, the better the decisions. So it's best that the visualization does everything that helps the user understand it in the easiest possible way. Hence, the recommended alternative to a dial chart is a bullet chart. * Occupy more space: Assume that there are 4 key performance indicators (KPIs) that I need to show on screen, and the user needs to know whether each KPI is above or below a pre-specified target. If I were to use dial charts, I'd create 4 dials, one for each KPI. On the other hand, if I were to use bullets, I'd create just one chart where the 4 KPIs are listed one below the other, and each one, in addition to showing its actual and target values, also shows by how much the actual exceeds or falls short of the target in a linear fashion. As real estate on user interfaces is at a premium, believe me, this is definitely better. Now, let me come to my situation where my client would not accept anything but a speedometer. As I've mentioned in the review, Tableau doesn't provide a speedometer template by default. When I was going through forums on the Internet, I saw that people usually used an image of a speedometer and put their data on top of that image, thereby creating speedometers in Tableau. This would not have worked in my case because my client wanted to show different bands (red, yellow, and green), and the number of bands and bandwidths varied within and between dials.
For example, one dial would have 2 red bands (one between 0 and 10 and the other between 90 and 100), 1 yellow band, and 1 green band, while another would have just one yellow band between 40 and 50 and no red or green bands. Also, these bands and bandwidths would be changed every month, and the client needed to be able to do this on their own. Therefore, using a static background image of a dial was out of the question. So, here's what I did: I created an Excel spreadsheet (let's call it data 1; used as one of the 2 data sources for the dial) in which the user could define the bands and bandwidths. The spreadsheet had a list of numbers from one to one hundred, and against each number, the user could specify the band (red/green/yellow) in which it falls. The other data source (data 2) was an Excel sheet containing the numbers to be indicated on the dials. Then, in Tableau, I created a chart with 2 pies, one on top of the other. Both pies had numbers from 1 to 100 along the border, providing the skeleton for the dial. The top pie used data 1 and had the red, yellow, and green bands spanning the numbers from 1 to 100. I then created a calculated field with an 'if' condition: if the number in data 2 matched the number in data 1, the field would have the value 'yes'; otherwise, it would have the value 'no'. This produces only 1 'yes' and 99 'no's because there is only 1 true match. I put this calculated field onto the 'Color' shelf and chose black for 'yes' and white for 'no'; this formed the content of the bottom pie. So the bottom pie had 99 white slices (which looked like one huge slice) and just 1 black slice (which looked like a needle). I made the top pie containing the red, yellow, and green bands more transparent, which gave the appearance of a needle pointing to the KPI value and indicating which band the number fell into, thereby enabling the client to gauge their performance.
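The one-match calculated field behind the "needle" can be sketched outside Tableau as well. This is a minimal, illustrative Python reconstruction of that logic; the KPI value used here is an invented example, not a figure from the review:

```python
from collections import Counter

# Stand-ins for the two Excel sources described above: "data 1" lists the
# numbers 1-100 that form the dial's skeleton; "data 2" holds the single
# KPI value the needle should point to (invented for illustration).
numbers = list(range(1, 101))
kpi_value = 73

# The calculated field: 'yes' for the one matching slice, 'no' for the rest.
needle = ["yes" if n == kpi_value else "no" for n in numbers]

counts = Counter(needle)
print(counts["yes"], counts["no"])  # 1 99
```

Coloring the single 'yes' slice black and the 99 'no' slices white is what makes the bottom pie read as a needle.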
When Salesforce bought the solution, they claimed that Einstein would be part of Tableau, but that has not materialized. Tableau's integration with Einstein Analytics could be better; it's not even up to par with other products. The ad hoc, self-service, and data analysis features are not working.
The solution’s pricing could be improved.
The product's features for cloud integration need improvement. They should revise licensing and pricing models to cater to smaller enterprises. Users must be able to customize and write their code similarly to one of its competitors. Many companies have in-house data science models for Twitter or Facebook based on predictive analysis. There is a possibility of integrating these models seamlessly into Tableau.
Tableau's data modeling, mining, and AI library features need improvement.
Tableau is excellent at visualizing data; however, I think improving the data preparation features would be a great addition. Activities like cleansing, reshaping, and wrangling extensive or complicated datasets can prove challenging within the Tableau environment, and the settings for working with complex datasets also need to be changed. In the next version, it would be good to add user-friendly resources for beginners, such as interactive tutorials and templates, to make Tableau even more accessible to a wider audience.
The product’s rendering could be better in terms of readability. When we put more information on a single screen, it gets compressed and superimposed in many places while scrolling. It could be improved.
A specific thing in Tableau is that I have looked at how to add pictures to the reports. I don't have the ability in Tableau to create a tooltip and see the picture of a piece of jewelry or watch that is a best seller.
Despite the number of available interface languages, I would like to see as many languages as possible represented in Tableau. Many companies really want to use this program, and it suits them better than any similar program, but a lack of understanding of English makes it difficult for them to decide to integrate this particular product. In addition to the existing set of common data types, Tableau Desktop has a wonderful way of combining data: relationships for multi-table analysis. This type of combination makes it very easy to work with the data source itself and does not overload the system. At the moment, this type of combination only exists for connections to databases. It would be nice if Tableau added this type of merging for data sources that are on Tableau Server/Tableau Cloud. Every three months, Tableau adds new features; there are always new features coming up.
When you're working on a dashboard, you can't select multiple components at a time and align them, so you have to go one by one. This is very cumbersome if you're floating, and it loses in comparison to Power BI, which does allow multiple selections. In the next release, I would like to see an enhancement of the prescriptive analytics features.
The charts need to be improved. The drawings and the visualization need to be more accurate. I would like to see the visualization improved.
When you create new fields in Tableau, you enter the formulas in a small window in the interface. This calculated-field editor could be more user-friendly; at this time it is limited and hard to understand at the beginning. The fields should be easier to use, as in Microsoft Excel. You can have a difficult time understanding what to do in the fields, and you end up doing trial and error to figure it out.
The development part should be better. We put a lot of effort in during development, and when we face any struggles, we have to find workaround solutions on the internet. It would be nice to have more workarounds and other options available. Every customer has different expectations, so sometimes it's hard to find the right solution.
Most of the problems in Tableau Online that I have noticed have to do with performance or weird, inexplicable bugs that I can't pin down. For example, you might try uploading some data and be left waiting for a long time without anything happening. These bugs always seem to happen when we perform big upgrades or do maintenance work, and we have had to send a lot of tickets for unexplained issues during these times. It doesn't seem to be a problem only for us, but also for customers all over the world, such as in Ireland, Western Europe, Eastern Europe, and the US. As for future features, I would like to see major upgrades in Bridge and the Flow tool, allowing us to do more data engineering work. I think it would give Tableau a big edge in the market to look into incorporating more data engineering tools into their product. Besides that, I would also like the charts to be more realistic and easier on the eyes.
There were a lot of dashboards everywhere in the organization; however, when the company wanted to reach the operational databases, they were not connected. The solution needs to improve its integration capabilities, and the performance and security could be better. Many people saw Tableau as a silver bullet, and it isn't; it's good for small things, however, not for an institutional way of doing things. I'd like to see better integration with SAP. I'd like an integrated ETL or some sort of data preparation capabilities.
We need big servers to perform the operations that we are doing. They should probably relook at its architecture. There are limitations to the data source that we are building. We can put only 32 tables in a data source, which means we have to transfer some of the workload to a database.
One thing I would want to change for Tableau is to have a lower-cost model; it's pretty high for enterprise deployment. In the next release, I would like the capability to call machine learning models in Python while I'm building a dashboard, so that a calculated value comes from a machine learning model running somewhere else, say, on Amazon. These tools give good outputs, like calculated fields, but today the outputs are not straightforward. In simple terms, I need machine learning on the fly, and that is not there.
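The "machine learning on the fly" the reviewer asks for usually means the dashboard handing feature rows to an externally hosted model and getting scores back. A minimal sketch of such a scoring function follows; the coefficients are hand-picked, hypothetical stand-ins for a model trained elsewhere, and nothing here is a Tableau API:

```python
# Hypothetical coefficients standing in for a model trained elsewhere
# (e.g., on a cloud service) that the dashboard would call remotely.
WEIGHTS = [0.4, 1.1]
BIAS = -0.5

def score(rows):
    """Score a batch of feature rows, as a dashboard's calculated
    field might request from an external model service."""
    return [round(sum(w * x for w, x in zip(WEIGHTS, row)) + BIAS, 3)
            for row in rows]

print(score([[1.0, 2.0], [0.0, 0.5]]))  # [2.1, 0.05]
```

In practice this function would sit behind an HTTP endpoint so that the visualization layer only ever sends rows and receives predictions.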
The price is definitely a point that can be improved, because smaller firms, like my banking firm, don't use Tableau because it's an expensive tool. If there were an option catering to smaller firms, that would be great, because Tableau does in fact help with a lot of different kinds of data sources. For instance, it lets you upload CSV and Excel files, whereas other tools that we currently use, such as Looker, do not let you upload Excel files for ad hoc analysis. So, price-wise, this is definitely something that could be catered toward smaller firms. Creating variables, that is, creating new fields in Tableau during analysis, actually adds columns to the data. It would be good to have an option: do you want it as a column added to the data set, or do you want it ad hoc in the visualization sheet? If you create a measure or a dimension, that creates a new column, but if you try to create a new filter directly on the visualization, it doesn't let you rename it; what you see is just the calculation that you put in there. If you wanted to create something without making it an extra column in the data set, you can't rename it to a more user-friendly short name. An improvement would be the ability to rename ad hoc creations when you create a mark or a filter on the visualization, since those don't really get added to the actual data fields.
They currently don't have a great Workday connector. Right now, Tableau can connect to more than 80 different types of databases or data sources, but it's challenging to connect with a few types, like Workday. So if they can come up with a better version or a connector for Workday, it will solve a lot of problems.
There is a lot more that can be done with Tableau than what is actually happening within Juniper. The company is not getting answers to its questions directly from the Tableau database, for example. Of course, Tableau can be extended to answer those questions. With so many tools coming up in the market, people have to continuously get educated in order to use some of the more advanced features. What's happening with Tableau is that, apart from the dashboard view and all the filtering that happens from a dashboard perspective, it doesn't seem to be very good at helping me understand trend insights. For example, if I saw that the average sales price for Product A was lower than the average sales price for Product B, I'm not saying that B is inferior to A or anything; I'm just noting what I found, and I cannot give more details. It doesn't go deeper into the analysis. I'd like more analysis to better understand what a trend might mean, not just a report that a trend is happening. Right now, Tableau is not so good at providing that extra bit of insight. Tableau data is used very often; from the quarterly business reviews onward, the executives have direct access to the Tableau dashboard, and more than anything else, they're able to do all this filtering. They could probably improve the user interface response times. When it comes to slicing and dicing data and viewing the results, it needs to be easier in general, as executives are using it and looking at it, and they are not very technical. When executives look at the Tableau dashboard, they want to know why, for example, Product A is bringing in less than Product B. Those kinds of key questions, which come from executives reviewing the Tableau data, need to be addressed in a simple-to-understand way. I think Tableau has to work a little more on the business insights aspect, where it communicates with the user and answers their questions.
That intelligence part needs to be developed in Tableau. Something great would be if, for example, as in Google, you could ask a question and it could feed back potential information. I don't want to compare everything to Google; however, it's so easy to find the answers you need in the way Google is set up. If Tableau could do something similar to showcase answers to questions, that would be ideal. It needs some sort of smart dashboard.
An advanced type of visualization is a bit tricky to create. It has something called a Calculated field, and that sometimes gets a bit difficult to use when you want to create an advanced type of visualization.
Tableau would be difficult to implement without training or the in-house technical support we have.
From a downside perspective, some of the more advanced modeling techniques are actually fairly difficult to do. In addition, I fundamentally disagree with the way you have to implement them, because you can get incorrect answers in some cases. One of the key challenges is that you never know whether the problem is how your developers developed it or the tool itself. We did find that once we got into more complex models, keeping objects that should tally the same way, but didn't, became more and more difficult. That was probably the big thing for me. I don't know enough about how the tool was developed to know whether that was because they didn't follow a recommended practice; it was probably the number one thing that I found frustrating. When we started to try to get into some very granular data sets with complex relationships, the performance degraded pretty quickly, to such an extent that we couldn't use it. We had to change what we were trying to do and manage the scope so that we could get what we wanted out of it, or reduce the scope of what we needed. It doesn't have a database behind it, per se, so while doing some of the more complicated things that you might otherwise do on a database, we started hitting some pretty significant challenges.
There should be a focus on in-memory data, which is the core concept of Tableau: they squeeze the data into memory, and because of that, we see performance issues on the dashboards. The architecture should be improved so that data can be better handled, as we see in tools on the market such as Domo, where everything is cloud-based. We did a POC comparing Tableau with Domo, and performance-wise the latter is much better; as such, the architecture should be improved to better handle the data. We are seeing a shift from Tableau to Power BI, toward which most users are gravitating. This owes itself to ease of use and users' familiarity with Excel; Power BI offers greater ease of use. For the most part, when comparing all the BI tools, one sees that they work in the same format, but if a single one must be chosen, one looks at where the data can be integrated best. Take real-time data, for example: I know they have the live connection, but they can still improve the data modeling space.
I have noticed that Tableau is not very compatible with ClickHouse. There's no direct connection to ClickHouse; you have to set up an ODBC connection. Tableau's performance takes a hit if you have huge data. The stability and scalability could be improved.
The data preparation could integrate better with Tableau.
The solution is integrated reasonably well but I'd like to see some custom connectors and more integration with different platforms.
The process of embedding the dashboards on external portals and websites could be improved. We also experienced challenges with integration with analytics. In an upcoming release, if the capabilities of Tableau Prep are improvised and expanded, that would be an added advantage.
There is a new ETL tool for Tableau, and it takes time to reach some level of experience with it. In Power BI, they have Power Query, and I find it easier to convert the information in Power Query with a single shortcut key. That's not an option in Tableau; you have to prepare your data, and it takes a lot of time to clean it. There's no mature ETL tool in Tableau, which is quite a negative for them. They need to offer a built-in ETL tool with nice, easy drag-and-drop functionality. There also needs to be a bit more integration capability.
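The kind of pre-load cleanup the reviewer describes doing by hand before Tableau can accept the data might look like this in pandas; the column names, values, and cleaning rules here are invented purely for illustration:

```python
import pandas as pd

# Invented messy input: untrimmed headers, mixed-case labels, formatted
# numbers, and missing values of the kind a built-in ETL tool would handle.
raw = pd.DataFrame({
    " Region ": ["north", "SOUTH", None, "East"],
    "Sales":    ["1,200", "950", "300", None],
})

clean = (
    raw.rename(columns=lambda c: c.strip().lower())   # tidy headers
       .assign(
           region=lambda d: d["region"].str.strip().str.title(),
           sales=lambda d: pd.to_numeric(
               d["sales"].str.replace(",", ""), errors="coerce"),
       )
       .dropna()                                      # drop incomplete rows
)
print(clean.to_dict("records"))
# [{'region': 'North', 'sales': 1200.0}, {'region': 'South', 'sales': 950.0}]
```

This is the sort of repetitive work that a drag-and-drop ETL step inside the BI tool would replace.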
I would like the solution to have certain features allowing the delivery of reports to the email. For example, publishing Pixel Perfect reports.
Its price should be improved; it is much higher than Power BI and QlikView. Programming is not easy in Tableau; for programming, you have to have a separate module. They should include programming directly in the web portion of Tableau Desktop so that people can write Python or JavaScript code for customizations instead of using a different module. Currently, Tableau Data Prep is a separate application that you have to purchase. It would be helpful if they could include Tableau Data Prep and programming languages such as R and Python in the next version. Tableau Public, the community version, doesn't allow you to save your work on your desktop; you can only upload it to the community. They should allow it.
The product could be improved with more features in data analytics. Tableau is not currently a good platform for handling built-in data science models in order to test, train, and run them; it's not currently an AI tool or a tool for machine learning. Right now it's more for non-expert users. If they could improve their analytical capabilities for data science tasks, it would be a better product. To carry out data science tasks now, we have to use Vertica for big data projects to discover and run machine learning models. It would be very good if they had their own machine learning capabilities built in. I'd like to see more features in data analytics, AI, and machine learning capabilities.
I have used Power BI as well as Tableau. There are a couple of interesting features that I like in Power BI that are not present in Tableau. For example, in Power BI, if I am looking at country-wise population, I can type and ask for the country that has the maximum population, and it will automatically answer that query. This kind of feature is not in Tableau. Similarly, for integrating with the latest ML algorithms, Power BI has decision trees and multiple other machine learning algorithms; the decision tree essentially visualizes the patterns in the data. We don't have such a feature in Tableau. If Tableau could integrate with machine learning algorithms and help us do visualizations, it would be a wonderful combination. Most people go for Tableau primarily for visualization purposes; however, in the data science industry, users want to do model building as well as tell a story. As of now, Tableau fulfills the requirements for visualization. If they can bring it to a level where I can use it for machine learning as well as visualization, it would be very helpful. Many people who want to do data science don't want to write code. Tableau is already a drag-and-drop tool, and if they can provide those options as well, it will be a powerful combination.
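The natural-language query described above ("which country has the maximum population?") reduces to a simple aggregation once the data is tabular. A tiny sketch, with made-up figures, of what such a Q&A feature computes under the hood:

```python
# Illustrative figures only (approximate populations in millions);
# not real data from the review.
populations = {"India": 1428, "China": 1426, "USA": 340}

# What a Q&A feature would do under the hood: find the arg-max.
largest = max(populations, key=populations.get)
print(largest, populations[largest])  # India 1428
```

The hard part of such a feature is parsing the free-text question, not the aggregation itself.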
With Tableau, when you're dealing with very large datasets, it can be slow so the performance is an area that can be improved. The security can be improved.
It would be nice if we could export more raw data. Currently, there is a limit as to how much data you can export.
An area needing improvement involves the complexity of the product when you need to alter a lot of parameters. Generally speaking, it's straightforward and very easy, and implementation problems can be dealt with by the client rather than a consultant. Let me give you some examples of things that could take long in a Tableau implementation. Suppose you have five different business areas in your company: marketing, supply chain, finance, HR, and procurement. Let us suppose that access to HR salaries is not company-wide but is limited to only a select number of people in HR, such as the manager or the director of the department. At the same time, I want people in the supply chain to be able to see and access different data from different areas. While this would not be technically difficult, it would be time-consuming if the businesses are very particular. There may be many policies involved in access authorization, data availability, and the like. This can involve a very strict security process using an outside identity provider. Instead of just logging in with your username and password, you may have different, more safe and secure technologies that need different providers to interface with Tableau. Depending on the need, this will be time-consuming. For instance, while I don't know how this would be in your country, suppose you have an identity provider in Brazil working with Tableau. If you go to Asia, you may sometimes have biometric identification that uses your hand or fingers. In that circumstance, they are going to send you a number or a code on your cellphone, requiring two steps: one to enter the bank and the other to withdraw your money. These are what we call outside identity providers, meaning different vendors or companies who manage the servers that manage identities. These would entail an integration between Tableau and these outside companies for security purposes.
This would involve them sending me files and me sending them back in order to authenticate the user on the Tableau server. This can be time-consuming because it involves or requires a different partner. Tableau is made for basic needs, such as requiring a username and a password to log in to the server, an unsophisticated architecture, or use of a single server instead of a cluster. If you have non-specific data security needs or you just want to analyze and sell data, that can take less than a day. But if you have technical servers, many interfaces, different providers, and more serious processes, it will be time-consuming. While Tableau does integrate with R Server and Python server, the integration process is slow and the information is integrated in a protracted fashion. Sometimes your data will vary: you may have a vector of data, or you may have a matrix of data. For some algorithms we do not use regular tabular data but a different data structure, and Tableau does not work with these different data structures. As such, interfacing with R Server and Python server, which host languages that are widely used in machine learning, all happens slowly. It does not work natively with a data matrix or data vector.
The price could be better. The overall scalability can also be improved. I would like to see more machine learning components to do predictive analytics. It should be simple for our customers to use. Tableau should include an automated machine learning feature in the next release.
An issue that is common to both Tableau and Power BI is with large datasets. When it comes to large datasets, the data should be extracted faster. Tableau should offer end-users a free desktop version where they can go in and practice. There are other solutions that offer this for free, such as Huawei, and the desktop version of Power BI is also free. People who want to learn visualization often don't have a proper tool in place and don't know how or where to go to learn. If you give them the tool to learn and let them explore, then when they want to go into production, people are able to purchase the license. A 14-day trial version is not enough time.
The integration with other programming languages, like Python, needs to be better. I know the capability is there; however, there needs to be better integration. There needs to be integration for machine learning and AI. That would help data analysts and data scientists quite a bit.
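For context on what the existing Python capability looks like: Tableau's TabPy bridge calls a Python function with column values passed as lists and expects one result back per row. Below is a minimal sketch of a function written to that contract; the function name and logic are illustrative, not part of any real deployment:

```python
def zscore(values):
    """TabPy-style function: takes a list of numbers (one per row)
    and returns a list of z-scores of the same length."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    std = variance ** 0.5
    if std == 0:
        return [0.0] * n
    return [(v - mean) / std for v in values]

# Once deployed to a TabPy server, a Tableau calculated field would invoke it
# via SCRIPT_REAL, along the lines of:
#   SCRIPT_REAL("return tabpy.query('zscore', _arg1)['response']", SUM([Sales]))
print(zscore([10, 20, 30]))
```

The friction the reviewer describes is largely in this hand-off: the function must be deployed and maintained on a separate server, and everything must be marshalled through list-per-column arguments.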
This solution has some features which really need to be improved. For example, the sorting feature: if we compare it with Qlik Sense, Qlik Sense has a direct sorting feature available to users, whereas in Tableau we have to go and create a parameter, make it dynamic, force users to click somewhere else on the filter, and then maybe you can sort it. Tableau really lags on sorting features. With performance tuning, it generates a pretty complex query when it is not required. We do not actually write 100 lines of code for a single KPI indicator, but when we run the performance tuning model, it will show 100-200 lines of generated code for a single KPI. That is not an optimized query. A query produced under performance tuning should be well optimized, but it does not seem to be.
It would be good if the server could be more stable, and I would like the technical service to be more reliable. I would like a better response time, without having to wait for a week just to get feedback.
It should offer better features for customization. It would be nice to have features such as border design.
Some of the functionality of the dashboard can be difficult to operate, and the color palettes are limited. They need to improve the icons and the filters because they look too old, resembling Excel from 1997. It would be helpful if the solution were less difficult to use.
I think predictive analytics is the main driver of business decisions, and hence Tableau should strengthen its ability to make predictions. The forecasting feature in Tableau, in my view, is too limited because it requires dates, but I should be able to predict the outcome of an event without having a date as part of the input. In situations where you are analyzing or using just one measure, such as Sales, Tableau does not create the header for you, and it is not straightforward to create it yourself. I would also like the ability to perform multiple pivots and create different variables. For example, if I have the regional population for six regions and branch offices, together with the number of clients per branch, all as a single record or observation, then I should be able to pivot them separately, resulting in Region, Population, Branch, and Clients.
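The pivoting scenario described above, one wide record holding both region populations and branch client counts, can be illustrated in plain Python. This sketch (with invented sample data and field names) shows the two separate long tables the reviewer wants Tableau to be able to produce:

```python
# One wide record mixing regional data and branch data (sample values invented).
wide = {
    "North_Population": 5_000_000, "South_Population": 3_200_000,
    "BranchA_Clients": 1200, "BranchB_Clients": 800,
}

def unpivot(record, suffix):
    """Pull the fields ending in `_suffix` out into (name, value) rows."""
    return [(key.removesuffix("_" + suffix), value)
            for key, value in record.items() if key.endswith("_" + suffix)]

# Two independent pivots of the same record, as the reviewer asks for:
regions = unpivot(wide, "Population")   # Region / Population rows
branches = unpivot(wide, "Clients")     # Branch / Clients rows
```

In Tableau today, a single pivot per data source is the norm, which is why records like this usually get reshaped upstream before they ever reach the tool.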
The Hyper Extract functionality is not as strong as that provided by Microsoft SQL. Tableau is not as strong as Oracle OBIEE in some regards.
With Tableau, there is a gap in its ability to handle very large-scale data. I would like it to be similar to the rest of the solutions, which can handle terabytes of data.
I attended a Tableau conference recently, and a quick improvement came to mind. When I am training people how to use it, I've come across situations where I've found it difficult to explain relationships. For example, when you want to blend data or when you want to show relationships, like when linking multiple tables; well, if you're an IT guy, that's easy. But if you are not an IT guy, you don't know anything about entity relationships, and it becomes a bit difficult to follow along. It takes me a long time to get people to understand, even up to the point where I feel that this is the lowest level I can go to in terms of explaining it. I realized that many people don't really have any experience or knowledge of relationships between objects, and it makes it hard for me to get my teaching across. So I suspect, and I think I made this recommendation, that Tableau could find an easier way to introduce relationships. For now, if you want to build relationships in Tableau, or even in Excel, you have things like Access modules and Sheets. But how do I know that I need to use one object with another for the relationship? And if you then put in a table, what do you do after that? You have to double-click, but people don't know that you have to double-click. I was hoping there's a way they can make that process a bit easier, though I don't know how they will do it. Perhaps when you load Tableau and connect to a data source, there would be a prompt that asks you if you want to link two tables together. So if you want to link two tables together, maybe you do A, B, C, D. That might help with the self-service idea. If you're talking about self-service, then it should be easy for people who do not have the time, or who do not have that IT background, to pick the data and use it correctly. In addition, and more generally, what I would like to see more support for is predictive analytics.
When you're doing descriptive analysis, Tableau is excellent, and it's easy to do. But when you are trying to predict something, like in Tableau's forecasting feature, it seems to require date fields, or it won't work. But I can forecast something without relying on date fields; maybe I want to predict that a branch has to close if it doesn't make something soon. I don't need dates to do that. For this reason, I'm using Alteryx for predictive modeling instead of Tableau. Overall, the only major frustration that I have had so far is with Tableau Public. I first used Tableau Public when I was building capacity, and when there was a later release to download and you wanted to upgrade, all your work would have to be manually re-entered. I don't know how they can solve that. I was expecting that they might make a release with this upgrade, and then I could hit upgrade and it would install over whatever I have already. Otherwise, for now I think they are doing well, and I know they're still adding a lot of features. But it does sometimes make our work difficult, for those of us who are building capacity and who are regularly changing people around. It means you have to keep learning all the time. Another small detail for improvement is that when you draw bar charts, the default color could be something more neutral, like gray. Instead, the default is blue, and I don't exactly get why this is the case.
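To the reviewer's point that a forecast need not hinge on a date field: any ordered numeric sequence can be extrapolated using row order as the axis. This is a minimal least-squares trend sketch in plain Python (the sales figures and the closure threshold are invented for illustration):

```python
def linear_forecast(values, steps_ahead=1):
    """Fit y = a*x + b by least squares over row order (no dates needed)
    and extrapolate `steps_ahead` rows past the end of the sequence."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope * (n - 1 + steps_ahead) + intercept

# Branch sales over six consecutive periods, with no dates attached:
sales = [100, 90, 80, 70, 60, 50]
projected = linear_forecast(sales, steps_ahead=2)  # -> 30.0
# e.g. flag the branch for review if the projection falls below a threshold
at_risk = projected < 40
```

This is exactly the kind of prediction the reviewer resorts to Alteryx for, since Tableau's built-in forecasting insists on a date dimension.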
I'm not sure if the solution needs any improvements. It's the best solution we have here right now. The pricing is high. I'm using a student license, however, I know that even this license is very expensive. I've tried to have this product in our organization, however, it's quite expensive. We don't have the internal budget. If you're looking for other kinds of data, for example, non-structured data, they could make it much easier to use this kind of data. Tableau could create other features just for data visualization and non-structured data. It's a beautiful solution when you've got frames and tables. It's structured. However, if you don't have this kind of information on the data, it's quite difficult to use Tableau. I would say that if you have any feature that opens the opportunity to work with non-structured data, it would be excellent. For example, we do end up creating a lot of word clouds. With unstructured data it just doesn't translate quite right. If you could use non-structured data to count the frequency of important words to find which word is more important, for example, that would be useful. I don't see Tableau doing this - counting the frequency of important words in a specific kind of text. It would also be great if there was statistical modeling for non-structured data.
Scalability for large amounts of data needs improvement, as does performance. From a scheduling perspective, when there is a sync-up of the desktop dashboard onto the server so that we can publish it as an accessible web version, that publishing can keep on executing for hours. This can go on for eight to nine hours, but you have no indicator; you don't even see that it is processing. For example, there is no spinning wheel, and all I see is a black screen. The interface can be improved, in part because there is no indication that something is running or processing. I would like some kind of indication on the interface that something is processing. Technical support could be faster, or if the product has any limitations, they should openly communicate them. They could also just tell you that this product is intended for small volumes of data and perhaps suggest another solution.
All of the BI tools have graphical interfaces, but when it comes to the learning environment, not every tool has everything. To be the best in the market, Tableau has to improve its user interface and also look into developing and implementing the best machine learning algorithms. Including data storage capabilities would be helpful. During the data crunching phase, it takes time for Tableau to connect, integrate, and download the data. In general, the ETL process takes a lot of time. Increasing the trial period to six months would allow people to better learn and assess the tool to determine whether it suits their needs. Given the price of BI tools, Tableau should consider giving scholarships to people so that they can learn how to work with the tool. It would help some of the people who lost their jobs during this pandemic. If users learn and become certified on Tableau, it would help to get more people interested in the tool.
The data processing in Tableau is pathetic compared to Qlik. In Qlik, I can replace my ETL layer for an application; this can't be done in Tableau. The initial processing of data in Tableau takes a lot of effort. It would be good to have a feature where a particular visual, or just the data behind it, can be exported in a single click: one button on a visual that exports the relevant data to Excel or a CSV output.
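The one-click export asked for above boils down to serializing the rows behind a visual to CSV. A minimal Python sketch of that step, using only the standard library (the field names and rows are invented):

```python
import csv
import io

def export_visual_data(rows, fieldnames):
    """Serialize the rows behind a visual to CSV text.

    In a real 'export' button this string would be written to a file
    or offered as a download; here it is just returned."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical data behind a bar chart:
rows = [{"Region": "North", "Sales": 100}, {"Region": "South", "Sales": 80}]
print(export_visual_data(rows, ["Region", "Sales"]))
```

The point of the feature request is not that this is hard, but that the user should not have to leave the visual to get at its underlying rows.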
The licensing costs of Tableau are on the higher side, and if you wanted more adaptability in usage across business divisions, you would need more reasonable pricing for Tableau licenses. Tableau is a standalone product; that is a disadvantage. Because it is standalone, it has to extract data from other ERP systems, bespoke systems, and other data systems. If you have big data systems and other informed decision-making tools, and the data is being extracted into Tableau, it is dependent on many other platforms. In contrast, if you use SAP's vertical data systems and you have SAP's Data Hub, then everything is vertically integrated: the whole data pipeline is vertically integrated, and there is a visualization screen right there as well. Therefore, you don't normally have to go through a separate integration process or need a data extraction solution. In the end, Tableau has two or three disadvantages. First, it is not a seamlessly integrated, end-to-end platform; it's purely a standalone reporting tool. On top of that, the licensing cost is extremely high. Thirdly, IT divisions are probably a little hesitant to use Tableau because separate training is required and separate skill sets are needed to develop everything. The cost of owning solutions from Tableau is much higher compared to other analytical solutions.
I am a BI consultant. I have worked on different reporting tools, such as Power BI and MicroStrategy. As compared to other tools, Tableau lags behind in handling huge enterprise-level data in terms of robust security and a single integrated metadata concept. When we connect to large or very big databases, I have sometimes found Tableau a little slow performance-wise. It could adopt a single metadata concept, like other tools, for the reusability of objects across multiple reports.
Tableau would be really good if we could have predefined templates. I was doing a POC with another, newer tool, Einstein Analytics. They have predefined templates already set up. These predefined templates do the heavy lifting for the initial dashboards; we don't have to build them from scratch. Our dashboards look really good, and 20 to 30% of the look and feel of the dashboard is complete with the predefined templates. If Tableau worked on predefined templates, that would be so helpful to a lot of companies, and it would save time for developers. The pricing is a bit higher than the competition; they'll need to lower it to stay competitive. They need to move more into machine learning and AI. Right now, in a POC that I'm doing with Einstein Analytics, they are more into machine learning and AI, and Tableau is lagging as of now. If they want a long run in the market, they need to integrate machine learning and AI, and it has to be very robust.
Data cleansing and data transformation functionality need to be improved. Tableau is not a full-stack BI tool, like Sisense. Including this type of functionality would add flavor to the tool. The main point is that Tableau requires the data to be in a certain format for the end-user, in order for them to create charts. If it's not in a certain format, or in a certain structure, then the user will have to manipulate it. The charts in Tableau are quite limited.
The solution requires a lot of user training before reports can be created. That can make things difficult and require us to have Tableau specialists. It's difficult for a newbie to start developing reports. Tableau queries and analytics, as well as development could be improved. The solution could also include an option to incorporate more open source libraries. I know Tableau has this closed loop so they might not want to provide that but if they did have integration capabilities with open-source libraries, I think that would be great.
The cost of the solution should be improved. Reports should be downloadable as PDF files. Emails containing images of dashboards can be scheduled, but there is still demand for creating printable PDF snapshot views of dashboards. UPDATE - In fairness to Tableau, with the right design, dashboards that are downloadable can be created ad-hoc.
I would like Tableau to handle geospatial data better in terms of multiple layers and shapefiles.
The SQL programming functionality needs to be improved.
Licensing and pricing options could be made better so that more users would be able to use it. The biggest concern any organization has is its budget when trying to implement a new product. Tableau is an extremely powerful tool and hence expensive, but if there was a way to cut down the cost they would end up attracting more users.
The performance could be better. At times, it can take up to one minute or more to open a workbook, which is very frustrating for the users.
Improvements can be made in template support. The workbook file structure is really hard to version control; if there were some sort of version control support, particularly for workbooks, that would help big time. Another note is that the interactions within the UI are not fast enough, and in certain instances there have been issues with the intuitiveness of the tool, such as delays in configuring and achieving some specific effects. I have to say Tableau does have excellent and extensive online support.
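On the version-control point: Tableau's .twb workbook format is XML under the hood, so one common stopgap (not a Tableau feature, just a workaround sketch) is to pretty-print the XML before committing, so that line-based diffs in git become readable:

```python
import xml.dom.minidom

def normalize_workbook_xml(twb_text):
    """Pretty-print workbook XML so line-based diffs are readable in git.

    Assumes the workbook is a .twb (plain XML); packaged .twbx files are
    zip archives and would need to be unpacked first."""
    dom = xml.dom.minidom.parseString(twb_text)
    pretty = dom.toprettyxml(indent="  ")
    # drop the blank lines that toprettyxml tends to emit
    return "\n".join(line for line in pretty.splitlines() if line.strip())

# A toy stand-in for workbook XML (real .twb files are far larger):
sample = "<workbook><datasources><datasource name='Sales'/></datasources></workbook>"
print(normalize_workbook_xml(sample))
```

A script like this could run as a git pre-commit filter, which is as close to "version control support" as the format currently allows.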
I have a lot of experience with the desktop version of Tableau. My recommendations for improvement would be: * From the developer perspective, the data connection handling of the target data set is what most needs to be improved. * Tableau keeps evolving with each version. With Tableau 2019.2, they're coming out with some more features. * Data preparation is where Tableau needs to do a lot of work. Every time with Tableau you have to invest a lot of time preparing the data before you start using the visualizations. * Tableau doesn't perform well on big data processes. Suppose I am working with a file of 1 or 2 gigabytes; in that case Tableau is really slow. Sometimes I feel that Tableau is too slow when you have a big data file.
To improve the next version, it is important to expand support for the tool in other languages. This includes internal handling and updates.
Their training. I've been looking for ways in which we can start training more people on it, and it has become clear that other platforms have more access to training than Tableau.
Sometimes it crashes because of the huge database. This could be fixed so that it works smoothly with large databases.
I would like to be able to set the parameters in a more specific manner. I feel as if it's not a question of whether the solution is sufficient; it's whether we understand how to use it to the best of its productivity.
* The enterprise features need improvements. * Improvements in schema security and row/column security need to be made.
We would like a report model, because currently there is no schema that we can create in the tool.
I would like them to include the Italian language, even if it's not a problem for me to use English, because the Quantrix modeler is only in English. I can also see there is Portuguese, Japanese, and Chinese, so why not Italian?
Using this service in the desktop version is annoying due to the constant updates, which lead to reinstalling the application. If they could deliver updates within the same downloaded version, it would be great.
We would much appreciate an option for copying/moving objects between different pages and the possibility of teamwork when working on the same dashboards.
* Conditional formatting could be an interesting feature to provide to final users. It is a long-term request of our users. * The data preparation/blending options are very basic. They could be improved. * They could be more willing to hear customer/user suggestions.
They need to improve the bar chart position and width.
I would like to see the inclusion of a template to create a speedometer chart. I can understand that Tableau doesn’t have it as one of its default chart types because it’s not a good way to represent the data. Indeed that’s true, but speedometers are quite popular, and once we had a client who was insistent on having highly customizable speedometers, and I had to spend a good amount of time creating them via multiple workarounds. In my experience, I've seen many customers who do not want to consider alternatives to speedometers. I’ll address these two points: * Speedometers/dial charts are a not-so-good way to represent data. * I had to resort to multiple workarounds to create a speedometer in Tableau. First, I’ll give you a few reasons as to why speedometers are not considered to be a good way to visualize data: * Low data-ink ratio: ‘Data’ here refers to the data that you want to show on your chart/graph, and ‘ink’ refers to the aesthetic elements of the chart such as lines, colors, indicators, or any other designs. A low data-ink ratio implies that the quantity of ‘ink’ on the chart is very high relative to the small quantity of ‘data’ that is present on the chart. What does a speedometer or a dial chart do? It shows you the current state (value) of any system. Therefore, the data shown by the chart is just one number. Let’s come to the ‘ink’ part. Needless to say, there is a lot of ‘ink’ on a speedometer chart – so many numbers all around the dial, the dial itself, a needle that points to the actual number, etc. The fundamental principle of data visualization is to communicate information in the simplest way possible, without complicating things. Therefore, best practices in data visualization are aimed at reducing visual clutter because this will ensure that the viewer gets the message – the right message – quickly, without being distracted or confused by unnecessary elements.
* Make perception difficult: The human brain compares lines better than it does angles – information in a linear structure is perceived more easily and quickly than that in a radial one. Let's say I’m showing multiple gauges on the same screen. What's the purpose of visualizing data? It's to enable the user to derive insights – insights upon which decisions can be taken. The more accurate the insights, the better the decisions. So, it's best that the visualization does everything that helps the user understand it in the easiest possible way. Hence, the recommended alternative to a dial chart is a bullet chart. * Occupy more space: Assume that there are 4 key process indicators (KPIs) that I need to show on screen, and the user needs to know whether each KPI is above or below a pre-specified target. If I were to use dial charts, I’d be creating 4 dials – one for each KPI. On the other hand, if I were to use bullets, I’d be creating just one chart where the 4 KPIs are listed one below the other, and each one, in addition to showing its actual and target values, also shows by how much the actual exceeds/falls short of the target in a linear fashion. As real estate on user interfaces is at a premium, believe me, this is definitely better. Now, let me come to my situation where my client would not accept anything but a speedometer. As I’ve mentioned in the review, Tableau doesn’t provide a speedometer template by default. When I was going through forums on the Internet, I saw that people usually used an image of a speedometer and put their data on top of that image, thereby creating speedometers in Tableau. This would not have worked in my case because my client wanted to show different bands (red, yellow, and green), and the number of bands and bandwidths varied within and between dials.
For example, one dial would have 2 red bands (one between 0 and 10 and the other between 90 and 100), 1 yellow band, and 1 green band, while another would have just one yellow band between 40 and 50 and no red or green bands. Also, these bands and bandwidths would be changed every month, and the client needed to be able to do this on their own. Therefore, using a static background image of a dial was out of the question. So, here’s what I did: I created an Excel spreadsheet (let’s call it data 1; used as one of the 2 data sources for the dial) in which the user would be able to define the bands and bandwidths. The spreadsheet had a list of numbers from one to a hundred, and against each number the user could specify the band (red/green/yellow) in which it falls. The other data source (data 2) was an Excel sheet containing the numbers to be indicated on the dials. Then, in Tableau, I created a chart which had 2 pies – one on top of the other. Both the pies had numbers from 1 to 100 along the border, providing the skeleton for the dial. The top pie used data 1 and had the red, yellow, and green bands spanning the numbers from 1 to 100. I then created a calculated field having an ‘if’ condition: if the number in data 2 matched the number in data 1, the field would have the value ‘yes’. Otherwise, it would have the value ‘no’. This produces only 1 ‘yes’ and 99 ‘no’s’ because there will be only 1 true match. I put this calculated field onto the ‘Color’ shelf and chose black for ‘yes’ and white for ‘no’ – this formed the content of the bottom pie. So the bottom pie had 99 white-colored slices (which looked like one huge slice) and just 1 black slice (which looked like a needle). I made the top pie containing the red, yellow, and green bands more transparent, and this gave the appearance of a needle pointing to the KPI value, also indicating into which band the number fell, thereby enabling the client to gauge their performance.
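The two-pie construction described above can be sketched in plain Python. This hypothetical sketch computes the two slice lists – the band colors for the top pie and the 99-white-plus-1-black needle for the bottom pie – from the same two inputs the spreadsheets held (the band definition here is invented):

```python
def build_dial_slices(bands, kpi_value):
    """bands: dict mapping each number 1..100 to 'red'/'yellow'/'green'
    (the user-editable band definition, i.e. data source 1).
    kpi_value: the number to indicate on the dial (data source 2).
    Returns (top_pie_colors, bottom_pie_colors) for the two overlaid pies."""
    top = [bands[n] for n in range(1, 101)]
    # the calculated field's 'if' condition: 'yes' (black) only where the
    # data-2 number matches a data-1 number, 'no' (white) everywhere else
    bottom = ["black" if n == kpi_value else "white" for n in range(1, 101)]
    return top, bottom

# Example band definition: red for 1-10 and 91-100, yellow 11-50, green 51-90.
bands = {n: ("red" if n <= 10 or n > 90 else "yellow" if n <= 50 else "green")
         for n in range(1, 101)}
top, bottom = build_dial_slices(bands, kpi_value=72)
# bottom has exactly one black 'needle' slice, at position 72
```

Because the band definition is just data, the client can repaint the dial every month by editing the spreadsheet, which is exactly what the static-background-image approach could not offer.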