Head of Information Systems Department at a government
It made data correlation between the agencies fast. Our expectation is to get real-time data collection systems in the maritime environment.
Pros and Cons
- "It provides easy integration with Office tools."
- "Our expectation is putting BI to work in real-time data collection systems in the maritime environment."
What is most valuable?
It provides easy integration with Office tools, and it is easy to understand the BI architecture.
How has it helped my organization?
Microsoft SQL Server BI made data correlation between the agencies so fast that the simple POCs for the dashboards led the decision-makers to migrate to our so-called enterprise architecture. Enterprise architecture here means the integration of ETL, CDC, DWH, reporting, and forecasting tools.
What needs improvement?
Our expectation is putting BI to work in real-time data collection systems in the maritime environment.
The Automatic Identification System (AIS) is a great source of data about ships from around the world. It streams kinematic, static, and some commercial data to maritime monitoring stations. Collecting and processing this data, and turning it into useful information, are key factors for our government entity. However, it is real-time data, which means the processing must be done in seconds for thousands of ships. We are pushing the boundaries of the Microsoft BI product right now and wish to see some stream data processing capabilities in the future.
For how long have I used the solution?
I have used this product for a year and a half.
What do I think about the stability of the solution?
The product was pretty stable, but when it came to collecting real-time data we encountered some data-dropping issues.
What do I think about the scalability of the solution?
Sometimes, the scalability becomes an issue; instead of horizontal scaling, we always need vertical scaling.
How are customer service and support?
The technical support in Turkey is very good.
Which solution did I use previously and why did I switch?
We were using some open-source BI tools. It was very difficult to get support for open-source and that is why we switched to Microsoft.
How was the initial setup?
The setup was okay because Microsoft integrated BI into the SQL Server product. Instead of using a separate product, you get the sense of using native add-on libraries for BI; it is part of the database process. Training is the key here.
What's my experience with pricing, setup cost, and licensing?
I don't want to speculate, but there is room for at least a 45% discount compared to the initial price, so bargain hard with Microsoft.
Which other solutions did I evaluate?
We evaluated other open-source alternatives.
What other advice do I have?
You will need trained personnel and powerful servers.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Consultant at a tech consulting company with 501-1,000 employees
My 30 tips for building a Microsoft BI solution, Part VI: Tips 26-30
This is the last part in my series of things I wished I knew about before starting a Microsoft BI project. I'll be taking my summer vacation now, so the blog will be quiet for the next month. After the break I will revise a couple of the tips based on feedback, so stay tuned.
#26: Decide how to source your data in Analysis Services and stick with it.
Ideally you will source your data from a correctly modeled star schema. Even then, you may need to massage the source data before feeding it into SSAS. There are two ways of accomplishing this: through views in the database, or through data source views (dimensional) or queries (tabular). Unless you are unable to create views in your database (e.g., when running against a production system), I would strongly suggest using them. This gives you a clean separation of logic and a layer of abstraction between the SSAS solution and the data source. It also means that clients connecting to the data warehouse directly will see the same data model as the SSAS solution, and migrating between different front-ends (such as dimensional and tabular) becomes much simpler. In my solutions I never connect to tables directly; I bind to views for everything and never implement any logic in the DSV or via queries.
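As a minimal sketch of the idea (the schema, table, and column names here are hypothetical), a presentation-layer view that SSAS binds to instead of the base table might look like this:

    -- Hypothetical presentation view: SSAS binds to this, not to dbo.FactSales directly.
    -- Business-friendly column names and light massaging live here, so the DSV /
    -- tabular model stays free of logic and other clients see the same model.
    CREATE VIEW pres.FactSales
    AS
    SELECT
        f.OrderDateKey              AS [Order Date Key],
        f.CustomerKey               AS [Customer Key],
        f.ProductKey                AS [Product Key],
        f.SalesAmount               AS [Sales Amount],
        ISNULL(f.DiscountAmount, 0) AS [Discount Amount]  -- example of light massaging
    FROM dbo.FactSales AS f;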
#27: Have some way of defining “current” time periods in your SSAS solution
Most SSAS solutions have a time dimension with dates, months, years, etc. In many ways it's the most important dimension in your solution, as it will be included in most reports and analyses and forms the basis for a lot of calculations (see previous tips). Having a notion of the current period in your time dimension will greatly improve the usability of your solution: reports will automatically be populated with the latest data without any user interaction. It can also simplify ad-hoc analysis by setting the default members to the most current date / month / year, so that when users do not put these on one of the axes, the query defaults to the most recent time period. There are a number of ways of implementing this, including calculated members and named sets (for dimensional) and calculations for tabular, and the internet is abundant with sample solutions. Some of them are fully automated (using VBA time functions) and some require someone to manually set the current period. I prefer the latter where possible, to avoid reports showing incorrect data if something went wrong in the ETL.
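One simple, manually controlled variant (the table, column, and parameter names below are hypothetical) is a flag on the date dimension that the ETL, or an operator, sets only after a successful load; an SSAS named set or default member can then be defined on top of the flagged row:

    -- Hypothetical: mark the current day in the date dimension after a successful load.
    -- @LoadedThroughDate is set manually or by the ETL framework.
    UPDATE dbo.DimDate SET IsCurrentDay = 0 WHERE IsCurrentDay = 1;

    UPDATE dbo.DimDate
    SET    IsCurrentDay = 1
    WHERE  FullDate = @LoadedThroughDate;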
#28: Create a testable solution
This is a really big topic, so I will emphasize what I have found most important. A BI solution has a lot of moving parts: your various source systems, your ETL pipeline, logic in the database, logic in your SSAS solution, and finally logic in your reporting solution. Errors happen in all of these layers, but your Integration Services solution is probably the most vulnerable part. Not only do technical errors occur; far more costly are logic errors where your numbers don't match what is expected.
Luckily there are a lot of things you can do to help identify when these errors occur. As mentioned in tips #6 and #7, you should use a framework. You should also design your solution to be unit testable. This boils down to creating lots of small packages that can be run in isolation rather than large, complex ones. Most importantly, you should create validation queries that compare the data you load in your ETL with the data in the source systems. How these queries are crafted varies from system to system, but a good starting point is comparing row counts, sums of measures (facts), and numbers of unique values.
The way I do it is to create the test before building anything. So if I am to load customers that have changed since X, I first create the test query for the source system (row counts, distinct values, etc.), then the query for the data warehouse together with a comparison query, and only then do I start building the actual integration. Ideally you will package this into an SSIS solution that logs the results into a table. This way you can use your validation logic both while developing the solution and once it's deployed. If you are running SQL Server 2012, you might want to look into the data tap feature of SSIS, which lets you inspect data flowing through your pipeline from the outside.
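As a minimal sketch of such a validation query (all object and parameter names are hypothetical), comparing a row count and a measure sum between the source system and the warehouse for one load window:

    -- Hypothetical validation query: compare source vs. warehouse for one load window.
    -- Log the result to a table so it can run during development and after deployment.
    SELECT
        src.RowCnt      AS SourceRows,
        dw.RowCnt       AS WarehouseRows,
        src.TotalAmount AS SourceAmount,
        dw.TotalAmount  AS WarehouseAmount,
        CASE WHEN src.RowCnt = dw.RowCnt AND src.TotalAmount = dw.TotalAmount
             THEN 'OK' ELSE 'MISMATCH' END AS Result
    FROM (SELECT COUNT(*) AS RowCnt, SUM(OrderAmount) AS TotalAmount
          FROM SourceDb.dbo.SalesOrders
          WHERE ModifiedDate >= @LoadWindowStart) AS src
    CROSS JOIN
         (SELECT COUNT(*) AS RowCnt, SUM(SalesAmount) AS TotalAmount
          FROM Dw.dbo.FactSales
          WHERE LoadDate >= @LoadWindowStart) AS dw;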
#29: Avoid the source if you are scaling for a large number of users
Building a BI solution to scale is another very large topic. If you have lots of data, you need to scale your ETL, database, and SSAS subsystems. But if you have lots of users (thousands), your bottleneck will probably be SSAS. Concurrently handling tens to hundreds of queries with acceptable performance is just not feasible, so the most effective thing is to avoid hitting the source as much as possible. I usually take a two-pronged approach. Firstly, I implement as much as possible as standard ("canned") reports that can be cached. Reporting Services really shines in these scenarios; it allows for flexible caching schemes that in most circumstances eliminate all trips to the data source. This will usually cover around 70-80% of requirements. Secondly, I deploy an ad-hoc cube specifically designed and tuned for exploratory reporting and analysis, which I talked about in tip #17. In addition, you need to consider your underlying infrastructure. Both SSRS and SSAS can be scaled up and out; for really large systems you will need to do both, even with the best of caching schemes.
#30: Stick with your naming standards
There are a lot of objects that need to be named in a solution, from the more technical objects such as database tables and SSIS packages to objects exposed to users such as SSAS dimensions and measures. The most important thing about naming conventions is not what they are, but that they are implemented. As I talked about in tip #24, changing a name can have far-reaching consequences. This is not just a matter of things breaking if you change them; consider all of the support functionality in the platform, such as logging, that relies on object names. Having meaningful, consistent names will make it a heck of a lot easier to get value out of this. So at the start of the project I would advise holding a "naming meeting" where you agree upon how you will name your objects. Should dimension tables be prefixed with Dim or Dim_? Should dimension names be plural (Customers) or singular (Customer)? And so on.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Consultant at a tech consulting company with 501-1,000 employees
My 30 tips for building a Microsoft BI solution, Part III: Tips 11-15
#11: Manage your own surrogate keys.
In SQL Server it is common to use an INT or BIGINT column set as IDENTITY to create unique, synthetic keys. The number is a sequence, and a new value is generated when we execute an insert. There are some issues with this. Quite often we need this value in our Integration Services solution for logging and for efficient loads of the data warehouse (there will be a separate tip on this). This means that sometimes we need the value before an insert and sometimes after. You can obtain the last value generated by issuing a SCOPE_IDENTITY command, but this requires an extra trip to the server per row flowing through your pipeline, and obtaining the value before an insert happens is not possible in a safe way. A better option is to generate the keys yourself through a script component. Google for "ssis surrogate key" and you will find a lot of examples.
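A common pattern (sketched here with hypothetical table and column names) is to read the current maximum key once before the data flow, store it in an SSIS variable, and let the script component hand out seed + 1, seed + 2, and so on as rows pass through, with no per-row round trips:

    -- Hypothetical seed query, run once in an Execute SQL Task before the data flow
    -- and stored in an SSIS variable; a script component then assigns
    -- seed + running row number to each incoming row.
    SELECT ISNULL(MAX(CustomerKey), 0) AS MaxCustomerKey
    FROM dbo.DimCustomer;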
#12: Excel should be your default front-end tool.
I know this is a little bit controversial. Some say Excel lacks the power of a "real" BI tool. Others say it writes inefficient queries. But hear me out. Firstly, if you look at where Microsoft is making investments in the BI stack, Excel is right up there at the top. Contrast that with what they are doing with PerformancePoint and Reporting Services and it's pretty clear that Excel is the most future-proof of the lot. Microsoft has added a lot of BI features over the last couple of releases and continues to expand Excel through new add-ins such as Data Explorer and GeoFlow. Additionally, the integration with SharePoint gets tighter and tighter. The Excel web client of SharePoint 2013 is pretty much on par with the fat Excel client when it comes to BI functionality, which means you can push the new features out to users who have not yet upgraded to the newer versions of Excel. When it comes to the efficiency with which Excel queries SSAS, a lot has become better, but being a general analysis tool it will never be able to optimize its queries the way you would if you wrote them specifically for a report.
Please note that I am saying "default", not "best". Of course there are better, purebred Business Intelligence front-ends out there; some of them even have superior integration with SSAS. But it's hard to beat the cost-value ratio of Excel if you are already running a Microsoft shop. If you add in the fact that many managers and knowledge workers already do a lot of work in Excel and know the tool well, the equation becomes even more attractive.
#13: Hug an infrastructure expert that knows BI workloads.
Like most IT solutions, Microsoft BI solutions are only as good as the hardware and server configurations they run on. Getting this right is very difficult and requires deep knowledge of operating systems, networks, physical hardware, security, and the software that is going to run on these foundations. To make matters worse, BI solutions have workloads that often differ fundamentally from line-of-business applications in the way they access system resources and services. If you work with a person who knows both of these aspects, you should give him or her a hug every day, because they are a rare breed. Typically, BI consultants know a lot about the characteristics of BI workloads but nothing about how to configure hardware and software to support them. Infrastructure consultants, on the other hand, know a lot about hardware and software but nothing about the specific ways BI solutions access them. Here are three examples.
First, Integration Services is mainly memory constrained. It is very efficient at processing data as a stream as long as there is enough memory for it. The instant it runs out of memory and starts swapping to disk, you will see a dramatic decrease in performance. So if you are doing heavy ETL, co-locating it with other memory-hungry services on the same infrastructure is probably a bad idea.
The second example is the way data is loaded and accessed in data warehouses. Unlike business systems, which often do random data access ("Open the customer card for Henry James"), data warehouse access is sequential: batches of transactions are loaded into the warehouse, and data is retrieved by reports and Analysis Services models in batches. This has a significant impact on how you should balance the hardware and configuration of your SQL Server database engine, and differs fundamentally from how you handle workloads from business applications.
The last example may sound extreme, but it is something I have encountered multiple times. When businesses outsource their infrastructure to a third party, they give up some control and knowledge in exchange for the ability to "focus on their core business". This is a good philosophy with real value. Unfortunately, if no one on the requesting side of the partnership knows what to ask for when ordering infrastructure for your BI project, what you get can be pretty far from what you need. Recently a client of mine made such a request for a SQL Server based data warehouse server. The hosting partner followed their SLA protocol and supplied a high-availability configuration with a mandatory full recovery model for all databases. You can imagine the exploding need for disk space for the transaction logs when loading batches of 20 million rows each night.
As these examples illustrate, it is critical for a successful BI implementation to have people with infrastructure competency on your BI team who also understand how BI solutions differ from "traditional" business solutions and can apply the right infrastructure configurations.
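As a concrete (and hypothetical) illustration of the last example: a batch-loaded data warehouse that is backed up after each nightly load is often better served by the simple recovery model, which keeps bulk loads from bloating the transaction log. Whether that is appropriate depends entirely on the recovery requirements agreed with the business.

    -- Hypothetical: switch a batch-loaded warehouse to the simple recovery model
    -- so nightly bulk loads do not bloat the transaction log (only acceptable when
    -- point-in-time recovery is not required and full backups follow each load).
    ALTER DATABASE MyDataWarehouse SET RECOVERY SIMPLE;

    -- Verify the current recovery model.
    SELECT name, recovery_model_desc
    FROM sys.databases
    WHERE name = 'MyDataWarehouse';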
#14: Use Team Foundation Server for your BI projects too.
A couple of years ago, putting Microsoft BI projects under source control was a painful experience where the benefits drowned in a myriad of technical issues. This has improved a lot. Most BI artifacts now integrate well with TFS, and BI teams can greatly benefit from all the functionality provided by the product, such as source control, issue tracking, and reporting. Especially for larger projects with multiple developers working against the same solution, TFS is the way to go in order to work effectively in parallel. As an added benefit, you will sleep better at night knowing that you can roll back that dodgy check-in you performed a couple of hours ago. With that said, there are still issues with the TFS integration: SSAS data source views are a constant worry, as are server and database roles. But all of this (including workarounds) is pretty well documented online.
#15: Enforce your attribute relationships.
This is mostly related to SSAS dimensional, but you should also keep it in mind when working with tabular. Attribute relationships define how the attributes of a dimension relate to (roll up into) each other. For example, products roll up into product subgroups, which in turn roll up into product groups. This is a consequence of the denormalization process many data warehouse models go through, where complex relationships are flattened out into wide dimension tables. These relationships should be defined in SSAS to boost general performance. The best-practice analyzer built into data tools makes sure you remember this with its blue squiggly lines. Usually it takes some trial and error before you get it right, but in the end you are able to process your dimension without those duplicate attribute key errors. If you still don't know what I am talking about, look it up online.
So far so good. Problems start arising when these attribute relationships are not enforced in your data source, typically a data warehouse. Continuing the example, over time you might get the same product subgroup referencing different product groups ("parents"). This is not allowed and will cause processing of the dimension to fail in SSAS (those pesky duplicate key errors). To handle this a bit more gracefully than simply leaving your cube(s) in an unprocessed state (with the angry phone calls that brings with it), you should enforce the relationship at the ETL level, in Integration Services. When loading a dimension, you should reject or handle rows where these relationships are violated and notify someone that this happened. The process should maintain the integrity of the model by assigning "violators" to a special member of the parent attribute that marks them as "suspect". In this way your cubes can still be processed while highlighting data that needs attention.
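A minimal sketch of such a check in the ETL (all names are hypothetical): find subgroups that reference more than one parent group, remap the offenders to a designated "suspect" parent so the cube can still be processed, and alert someone:

    -- Hypothetical check: subgroups referencing more than one product group
    -- violate the attribute relationship and would break SSAS dimension processing.
    SELECT ProductSubgroup
    FROM dbo.DimProduct
    GROUP BY ProductSubgroup
    HAVING COUNT(DISTINCT ProductGroup) > 1;

    -- Remap violators to a special 'Suspect' parent so the cube still processes,
    -- then notify someone that these rows need attention.
    UPDATE p
    SET    ProductGroup = 'Suspect'
    FROM   dbo.DimProduct AS p
    WHERE  p.ProductSubgroup IN (SELECT ProductSubgroup
                                 FROM dbo.DimProduct
                                 GROUP BY ProductSubgroup
                                 HAVING COUNT(DISTINCT ProductGroup) > 1);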
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Hi Peter!
Nice article. Now let's discuss points 11 to 15 in detail:
#11: I agree with you partially on this, because I don't see the need for creating a separate surrogate key in SSIS. My preference is to use the keys from the production tables; personally, I use the change table method to perform incremental loads. If a separate key is required in your data warehouse model, you can create one by reading the value from the source table or by loading a value into an SSIS variable and then assigning it to your table.
#12: I prefer to use Excel as a tool for quick data verification or number reconciliation by connecting to my cube. I know Microsoft has been investing a lot in Excel through Power Pivot and so on. But what about the future of "Power BI", which we hear is a new tool that will have the capabilities to become the number one BI tool for reporting? Personally, I don't think Excel can be used as an enterprise reporting tool.
#13: A rare thing to have. Another point: it is really hard to find a BI consultant who has experience not only in cube optimization but also in report and database optimization. If you have one of these, I call them a "real asset", because they will help you not only in OLAP but also in OLTP, in your SSIS, and in your reporting. I would suggest including at least one of these people in a BI project; it will save you time and money.
#14: I have been using TFS to keep my SSRS reports under source control, and it has been nice that it doesn't act up badly. But I do have reservations about keeping my SSIS packages in TFS, because it has happened to me multiple times that they got corrupted somehow; luckily I was not relying only on TFS, so I had the source backed up. Always have a backup strategy so that you can recover if your source control fails. Be prepared, because it can happen at any time.
#15: It is always good to define hierarchies and attribute relationships; whenever possible, define hierarchies. Remember that once you define a hierarchy, hide the underlying attribute so that it isn't duplicated in the reporting tool. For example, if you are using PerformancePoint, the end user might otherwise see the same attribute both inside the hierarchy and in the dimension, so set the attribute's visibility to hidden.
Designing a BI solution is an interesting job; with each development you will learn new things. Always plan your development and choose the right tools for your final solution. If you are unsure about something, it is better to discuss it with other consultants to pick the right product for your solution.
Regards,
Hasham Niaz
Managing partner at a tech vendor with 1-10 employees
Low cost, good reporting, and simple installation
Pros and Cons
- "In my opinion, Microsoft BI is a low-cost solution, and it could be an interesting solution for a tourism company."
- "I believe the price should be degressive, and we should have a lower, or better unit price if we have more users."
What is our primary use case?
I am a reseller for Sage business solutions and will start to sell Microsoft BI.
Microsoft BI is primarily used for sales, financial, and production reports.
What is most valuable?
I believe that it's a good solution.
It is not an expensive solution. The price is very important in the solution selection process. In my opinion, Microsoft BI is a low-cost solution, and it could be an interesting solution for a tourism company.
For how long have I used the solution?
I have been using Microsoft BI for a few months.
As a cloud solution, we are always working with the latest version.
What do I think about the scalability of the solution?
In our company, we have five users.
The price is an important consideration when upgrading or adding new users. In my opinion, it would be better for the price to be degressive by user count; currently, if you have five users, you simply multiply five by the unit price to get the total cost of the solution.
I believe the price should be degressive, and we should have a lower, or better unit price if we have more users.
How are customer service and support?
I haven't really needed Microsoft's technical support so far. We have not had any issues to resolve.
How was the initial setup?
The installation is not difficult; it is very simple.
We only need one IT manager to deploy this solution.
What's my experience with pricing, setup cost, and licensing?
Licensing fees are paid on a yearly basis.
What other advice do I have?
I would recommend this solution to others who are considering it, which is why I will be selling it.
I would rate Microsoft BI an eight out of ten.
Which deployment model are you using for this solution?
Public Cloud
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Head of Global Services Business Performance Management at a comms service provider with 10,001+ employees
Simple visualizations, frequent updates, and good support
Pros and Cons
- "The most valuable features of Microsoft BI are the variety of possibilities to connect to various data sources. The visualizations are easily done, have useful rollover functions, and there are continuous updates being made to the system. You can benefit from the various improvements."
- "I'm missing collaborations functionality to operate or to work connected with multiple people on a data source or on virtualization. There should be more collaborations functions, such as in Confluence. We haven't explored the solution sufficiently in this area, but at this time it doesn't look sufficient."
What is most valuable?
The most valuable features of Microsoft BI are the variety of possibilities for connecting to various data sources. The visualizations are easily done, have useful rollover functions, and there are continuous updates being made to the system, so you benefit from the various improvements.
What needs improvement?
I'm missing collaboration functionality to work together with multiple people on a data source or on a visualization. There should be more collaboration functions, such as in Confluence. We haven't explored the solution sufficiently in this area, but at this time it doesn't look sufficient.
I would want one platform that can be used for top management meetings, where you see and comment on the data. That would be a perfect combination: everybody has access and sees the status, the data, and the comments, and that would make life easier for us.
For how long have I used the solution?
I have been using Microsoft BI for approximately three years.
What do I think about the stability of the solution?
Microsoft BI is stable.
What do I think about the scalability of the solution?
I have found Microsoft BI to be scalable.
How are customer service and support?
The technical support from Microsoft has been good.
How was the initial setup?
The setup is good. Everybody can test and try the solution; it's not rocket science, and there is plenty of training and courses available. We decided to have a dedicated team in India that does nothing but Microsoft BI every day. It has been very effective.
What other advice do I have?
I rate Microsoft BI an eight out of ten.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Technology Solutions Professional at OrchidaSoft
Has good implementation features
Pros and Cons
- "The solution is easy to set up and implement."
- "I would like to see a change in the premium capacity."
What is most valuable?
Defining the most valuable features would take a long time. While I cannot point to a specific feature, I feel the solution provides a full range of implementation capabilities. This allows for integration, modeling, carrying out good transformations within the same platform, as well as visualization. Not many products include all of this, and one can start from scratch.
From the customer's perspective, one can implement in quick phases. A person can start out by creating a data set on a minuscule slice of the client's work and, working solely with a sales team, keep it agile, involving only three or four dashboards. At that point completion is possible and progress can be made to the next phase very quickly. We like the solution's ease of use, range of functionality, and ability to be quickly implemented.
What needs improvement?
I would like to see a change in the premium capacity. It is very costly, particularly for the Egyptian market, amounting to $5,000 per month; perhaps in the Gulf this would work well. The data flow from OnPrem gateways should also be enhanced, as we find it somewhat complicated and it does not always work. Regional pricing is the main issue.
The stability is okay, although there can sometimes be an issue with the connection when it comes to data OnPrem and the need to manage gateway communication and do troubleshooting. In brief, there are certain issues with OnPrem stability.
Although we feel the solution to be a dream, it would be great to see everything on Power BI services, obviating the need for Power BI Desktop. I hope to see such a Power BI implementation.
I feel like we don't have a very powerful ELT or ETL tool when it comes to power and data cleansing. The solution compares unfavorably with such products as Informatica in this regard.
For how long have I used the solution?
I have been using Microsoft BI for two years.
What do I think about the stability of the solution?
The stability is okay, although there are occasional connectivity problems when it comes to managed gateway communication of data OnPrem and troubleshooting. The OnPrem stability should be addressed.
What do I think about the scalability of the solution?
While the scalability is okay, there is an occasional need to add extra products, such as those involving Azure data analysis and Azure Analytics Services.
Big data would require the involvement of different products, Microsoft sign ups, for example. While we did not go for this, our technical teams are trying to get up to speed to have big data readiness.
I feel like the solution has a comparatively inferior ELT or ETL tool when it comes to power and data cleansing and compares unfavorably with Informatica. There is occasionally a need to involve other solutions, such as Informatica and Alteryx.
How are customer service and support?
While we have not made much use of Microsoft support, I did previously work with Microsoft's premier support and found it to be very good, overall.
How was the initial setup?
The solution is easy to set up and implement.
What's my experience with pricing, setup cost, and licensing?
The premium capacity is very costly in respect of the market in Egypt, amounting to $5,000 monthly. The regional pricing should be addressed.
Which other solutions did I evaluate?
The solution has an edge over others in its quick implementation. It is also very helpful to consider the Microsoft data platform on Azure; combining Power BI with the Microsoft data platform gives one a great edge over, say, Tableau.
Combining Power BI with the Microsoft data platform on Azure also provides increased familiarity. We are talking about a great ecosystem.
The solution is comparatively inferior to those offered by other companies in respect of ETL and ELT as these relate to power and data cleansing. It is not the best.
What other advice do I have?
My advice to someone looking to implement Power BI for his own organization is to take things step by step. He should initially refrain from taking on big projects, focusing instead on agility: start with the most needed dashboards, work on them, and gain experience. He will improve from one iteration to the next. One should become familiar with the details and with how to move data, including big data, and not remain stuck waiting for a big project to be implemented.
I rate Microsoft BI as an eight out of ten.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Analyst Operations with 1-10 employees
Displays information well, simple to install, and low maintenance
Pros and Cons
- "In the process of using data there is the Extract, Transform and Load (ETL). For extracting we are using another software, for transforming we are using our own coding in C#, and then we use Microsoft BI for loading the information. Microsoft BI displays information very well."
- "The solution could improve the extraction and transformation of data. For example, you transform the data and then send it to Microsoft BI without having to use your own API. We are only providing the API to Power BI, and then Power BI is doing the job."
What is our primary use case?
We were using Microsoft BI, but it could not do what we wanted it to do, so we moved to an open-source platform based on Apache Tika, Kafka, and a third solution. We ended up moving back to Power BI, but only to display the information, because displaying information with Power BI is better than with the open-source software. We are not using all aspects of the solution at the moment.
What is most valuable?
In the process of using data there is Extract, Transform, and Load (ETL). For extraction we use other software, for transformation we use our own C# code, and then we use Microsoft BI for loading and displaying the information. Microsoft BI displays information very well.
What needs improvement?
The solution could improve the extraction and transformation of data so that, for example, you could transform the data and then send it to Microsoft BI without having to use your own API. Currently, we only provide the API to Power BI, and then Power BI does the job.
In an upcoming release, Microsoft BI should increase the functionality of the solution.
For how long have I used the solution?
I have been using Microsoft BI for approximately six years.
What do I think about the stability of the solution?
We were satisfied with the stability of Microsoft BI.
What do I think about the scalability of the solution?
We have 30 users using the solution in my organization.
How are customer service and support?
Our company is in contact with Microsoft support, but I have not been in contact with them directly.
Which solution did I use previously and why did I switch?
I have used other solutions, such as Apache Tika and Kafka.
How was the initial setup?
The initial setup was straightforward.
What about the implementation team?
The solution only needs one person to do the maintenance.
What's my experience with pricing, setup cost, and licensing?
There are a few options available for purchasing a license. Typically the number of users you have will determine the price of the license. The more users you have the more you will pay.
What other advice do I have?
The solution has had a lot of changes over the years and it is very good. However, there is more work to be done on extract and transform functions for it to be done properly.
I rate Microsoft BI an eight out of ten.
Which deployment model are you using for this solution?
On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Industry 4.0 Manager at an engineering company with 10,001+ employees
Easy to use for a non-IT person and good for data modeling and working with a large amount of data
Pros and Cons
- "I like data modeling. You can connect with your database, which is quite useful for me. It is a good tool if you have a large amount of data and you want to gather different data and interconnect it. The Power Query functionality is quite an interesting feature. If you have a query in Excel, you can also copy your query and run it in Power BI. Its dashboard is also very nice and not complicated. You don't need to be a developer to be able to use it. I am not an IT guy, and it is quite easy to use for somebody who is not an IT person."
- "It is not the right tool to do deeper analysis or predictions. When you have some data and you want to do some deep analysis, there is no feature to help you with this."
What is most valuable?
I like data modeling. You can connect with your database, which is quite useful for me. It is a good tool if you have a large amount of data and you want to gather different data and interconnect it.
The Power Query functionality is quite an interesting feature. If you have a query in Excel, you can also copy your query and run it in Power BI.
Its dashboard is also very nice and not complicated. You don't need to be a developer to be able to use it. I am not an IT guy, and it is quite easy to use for somebody who is not an IT person.
What needs improvement?
It is not the right tool to do deeper analysis or predictions. When you have some data and you want to do some deep analysis, there is no feature to help you with this.
For how long have I used the solution?
I have been using this solution for one year.
What other advice do I have?
I would rate Microsoft BI an eight out of ten.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Thanks Peter for the great range of tips for using the Microsoft BI toolset. They are indeed a must-read for all developers and novice users of this great tool for businesses.