We primarily use the solution for making dashboards. We are also doing analytics on Microsoft BI.
Data Warehouse Engineer at a government with 1,001-5,000 employees
Easy to use and flexible but can't handle big data
Pros and Cons
- "The product is easy to use and simple to navigate."
- "The product doesn't support unstructured data. It doesn't support video, streaming, and strings of files."
What is our primary use case?
What is most valuable?
The solution is very stable.
The product is easy to use and simple to navigate.
The product has proven to be quite flexible.
It is quite scalable and it is doing the work that we want it to.
What needs improvement?
We'd like the solution to be more scalable.
We'd like the ability to manage it in the cloud, as a cloud-based solution. The product risks becoming obsolete, as it is not able to keep pace with big data. Big data is not something that BI can support; it cannot be scaled that far.
The product doesn't support unstructured data. It doesn't support video, streaming, and strings of files. Microsoft is aware of this.
For how long have I used the solution?
I've been using the solution for seven to eight years at this point. It's been a while.
Buyer's Guide
Microsoft Power BI
February 2025

Learn what your peers think about Microsoft Power BI. Get advice and tips from experienced pros sharing their opinions. Updated: February 2025.
837,501 professionals have used our research since 2012.
What do I think about the stability of the solution?
The solution is very stable. There are no bugs or glitches. It doesn't crash or freeze. It's reliable and the performance is very good.
What do I think about the scalability of the solution?
The solution is quite scalable, although, in our case, as we are not on the cloud, we are facing some issues.
We have more than ten people using the solution currently.
How are customer service and support?
I've never reached out to technical support in the past. I cannot speak to how helpful or responsive they are.
How was the initial setup?
I wasn't a part of the initial setup. That was something that was handled by another team. I can't speak to how long it took to deploy.
My understanding is that you only need two people to deploy the solution. Admins usually handle it.
What's my experience with pricing, setup cost, and licensing?
I don't have any insights into the licensing and payments required.
What other advice do I have?
We are using the latest version of the solution. I'm not sure of the exact version number.
I'd rate the solution at a seven out of ten. For us, it has met our expectations and we enjoy its capabilities.
I'd recommend this product to other users and organizations.
Which deployment model are you using for this solution?
On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Business Intelligence Manager at a tech services company with 51-200 employees
Good reporting, enables self-service BI, it's stable, and we are satisfied with the pricing
Pros and Cons
- "It's easy to create reports and it's easy to download the data from the corresponding report."
- "We would like improvements made to the paginated reports so that it produces quality similarly to SSRS."
What is our primary use case?
This solution is primarily used for business analytics. We generate reports so that the business analytics team can use them for their analysis.
How has it helped my organization?
Previously, we used Excel to perform the analysis. It was very tedious for them to get data into Excel, do the refresh, and generate the reports. Now, using Microsoft BI, they can connect directly with the data, it gets refreshed automatically, and they don't have to worry about refreshing it.
They get the data in the report section and they don't have to worry about report generation, data modeling, and other aspects.
What is most valuable?
The most valuable feature is the reporting. It's easy to create reports and it's easy to download the data from the corresponding report.
This product enables self-service BI.
What needs improvement?
The paginated report functionality, which was intended to replace SSRS, is not yet up to the mark. We would like improvements made to paginated reports so that they produce output of a quality similar to SSRS.
There is no automation for paginated reports, so it needs to be automated.
Ultimately, I would like to see all of the features from SSRS that are missing. That's ideally what I would like to have.
For how long have I used the solution?
I have been using Microsoft BI for the past 10 years.
What do I think about the stability of the solution?
This product is pretty stable.
What do I think about the scalability of the solution?
Scalability depends on the kind of reports and data we are looking at. It has multiple features based on the functionality and the requirements of the customer. It's not limited to enterprise customers; rather, it can be used by small, medium, and enterprise customers alike.
There is no limitation as such. So it's quite scalable.
I have worked at two different companies during this time, and as far as I know, between 1,500 and 2,000 people use this product.
How are customer service and technical support?
The quality of support depends on the Microsoft partner that you are working with.
We have not faced any significant issues, but when we have, we raised a bug with the Microsoft technical team and they have resolved things accordingly.
Which solution did I use previously and why did I switch?
This is the only tool that we are using for reporting. Previously, we used Crystal Reports.
What about the implementation team?
We deployed this product ourselves. We used the documentation that is available at Microsoft.com.
What's my experience with pricing, setup cost, and licensing?
The cost of licensing depends on the number of people that are going to be using the reports, and we are satisfied with it at $10 per user.
Which other solutions did I evaluate?
Our company focuses on Microsoft products, so we did not evaluate solutions by other vendors.
What other advice do I have?
My advice is that if you are looking for an aggregated level of data, using it for reporting, then it's better to use Power BI because everything will be in memory and it will be much faster.
I would rate this solution a nine out of ten.
Which deployment model are you using for this solution?
Private Cloud
Disclosure: My company has a business relationship with this vendor other than being a customer: partner
Consultant at a computer software company with 10,001+ employees
A scalable BI solution with useful visualization features
Pros and Cons
- "I think the visualization part is valuable."
- "Actionable insights could be better."
What is our primary use case?
All our operational dashboards are on Microsoft BI. Visualization is primarily what we use Microsoft Power BI for today.
We're in a position to explore all the underlying data. For example, your SLAs, how they're trending month on month, or how your backlog of tickets is going.
We look at all the respondent resolution SLAs or different priorities every month. If there's a dip somewhere, we're able to double click and then go to the actual client or the ticket, which has caused a problem.
You can go back and see if you need to do anything to recover from that situation. For example, if your SLA target is 95% and you're dipping to 94%, go back and see why you're dipping. If there are, let's say, too many incidents from a specific technology or a specific client, go back and see what you need to do to fix those things.
We're now looking to get to the next level with exploratory analytics. We want to go into what we call explanatory analytics, which analyzes the underlying data. Instead of waiting for something to fail, you come out and say, "Hey, these are some areas that are not working well, and you probably need to look at it."
We're trying to use Microsoft BI for what we call actionable insights. The tool should be able to build up and show you what the underlying data is telling you. For example, our SLAs may be trending at 95%, but since we run a shared service, there could be some clients where it's 100% and some clients where it's probably 85%. Those cases could point to a client-side problem or a client satisfaction issue.
Explanatory analytics can give you such exceptions automatically. Then you can go back and work on those clients to ensure that you pull your SLA back up from 85% to 95% and ensure that customer satisfaction doesn't dip.
What is most valuable?
I think the visualization part is valuable. It's also very easy to build new dashboards. It's fairly intuitive for people who understand the Microsoft Power BI tool.
We're fairly happy with the product in terms of both configuring the Microsoft BI dashboards and making changes to them. It's fairly easy to make changes.
What needs improvement?
Actionable insights could be better. I would like it to provide exceptional reports that you need to act upon to keep your operations or businesses going. That's something I would like to see.
On the visualization side, if there were better graphs and maps for visualizing data, like I've seen in other tools such as Tableau, it would be useful. They need very different ways of presenting information. If it's eye-catching, better than a pie chart or a bar graph, that's even better.
For how long have I used the solution?
I have been using Microsoft Power BI for about a year.
What do I think about the stability of the solution?
Microsoft BI is stable. We aren't faced with too much downtime. On a scale of five, I would probably rate it at 4.8 out of five throughout any given week.
What do I think about the scalability of the solution?
I think we have scaled up Microsoft BI fairly easily because it's on a cloud. We've added users. We added more dashboards from our different service lines, and we found it fairly easy to scale up.
How was the initial setup?
The initial setup is fairly easy and straightforward.
What other advice do I have?
I would definitely recommend Power BI from a visualization perspective. It's quick and easy to set up and scales up very well. If you've been using data on Excel sheets and converting them to graphs on PowerPoint, I think this is a tool that gives you almost a live visualization of what your operations are.
We use it for our day-to-day IT operations. I'm sure it can also be used to visualize other data like how many clients, how clients build up weekly, and the various stages of transitioning client needs into services. These things can be very easily developed on our Microsoft BI dashboard.
On a scale from one to ten, I would give Microsoft BI an eight.
Which deployment model are you using for this solution?
Public Cloud
Disclosure: My company has a business relationship with this vendor other than being a customer: Partner
Senior Consultant | Architect at DHL
We have the ability to connect separate services like CRM, NAV, Facebook, Exchange, etc. and make data integration and transformation based on these data connections in a few clicks.
What is most valuable?
First there was the wow effect when I saw the number of connectors available for this product/service. It is quite fantastic to have the ability to connect separate services like CRM, NAV, Facebook, Exchange, etc. and perform data integration and transformation based on these data connections in a few clicks. The ability to download new visuals is also quite nice. The effect of new graphics on higher management is magical, and this is good not only for a pre-sale/sale or up-sell, but also for making a good impression. Another great feature is, for example, collecting data from a specific folder on disk. If there are, say, 100 CSV files from other information systems, they can be automatically merged, analyzed, and transformed into a great graphical report.
How has it helped my organization?
Because the product is based on the ability to quickly produce BI insights and reports, it's really useful on a daily basis. Another improvement is definitely the ability to browse dashboards on a smartphone using an app, and another is the ability to quickly produce reports.
What needs improvement?
Definitely the connectors; there is always huge room for improvement in configurations, especially in the number of connectors. Also, the graphic design of the reports needs work, as layouts are still very strictly defined, and with the amount of output there isn't room for much design customization.
For how long have I used the solution?
I have used this solution since it was released (a few years ago), and I use it on a daily basis at work and at the university where I work as a researcher and am a PhD student. I use it on mobile, laptop, and tablet. It has its own app for viewing reports and dashboards, and another for creating and editing them.
What was my experience with deployment of the solution?
There have been no issues with the deployment.
What do I think about the stability of the solution?
There was no issue with stability.
What do I think about the scalability of the solution?
We have been able to scale it for our needs.
Which solution did I use previously and why did I switch?
Personally, I used Excel Services/Visio Services.
How was the initial setup?
Really straightforward. The most complex part is defining the data gateways between the cloud part and the on-premises part of the infrastructure. Another bigger task is to define the security model for all the datasets and reports, with the correct audience, data refresh, etc.
What was our ROI?
It's too soon to calculate. I think that, for a correct ROI value, you need to have it in place for more than three years. But it saves a lot of time, and not only developers' time, but also management time, etc. Things are easier when there are functionalities like "quick insights" for auto-creating visuals based on machine learning algorithms, or Q&A for using natural query language.
Which other solutions did I evaluate?
I played with Tableau. I see many similarities in Power BI to other products, so there is no reason to combine many different vendors/third parties to build such a complex BI solution.
What other advice do I have?
It's a tool for a new kind of business intelligence from Microsoft: a tool for quick modeling of data structures and for visualizing almost everything you have in mind at the moment. There is still a lot of room for improvement and a huge space for new functionalities, but it's a simple and great tool for everyday use.
You should get it and implement it. However, you should start with a trial version, and contact a partner who can provide a sales presentation with a live session (a CIE, for example) and show what the possibilities of Power BI are.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Consultant at a tech consulting company with 501-1,000 employees
My 30 tips for building a Microsoft BI solution, Part VI: Tips 26-30
This is the last part in my series of things I wished I knew about before starting a Microsoft BI project. I’ll be taking my summer vacation now so the blog will be quiet the next month. After the break I will revise a couple of the tips based on feedback so stay tuned.
#26: Decide how to source your data in Analysis Services and stick with it.
Ideally you will source your data from a correctly modeled star schema. Even then, you may need to massage the source data before feeding it into SSAS. There are two ways of accomplishing this: through views in the database, or through data source views (dimensional) or queries (tabular). Unless you are unable to create views in your database (running on a prod system, etc.), I would strongly suggest using them. This gives you a clean separation of logic and abstraction between the SSAS solution and the data source, meaning that clients connecting to the data warehouse directly will see the same data model as the SSAS solution. Migrating between different model types (like dimensional and tabular) also becomes much simpler. In my solutions I never connect to tables directly; I always bind to views for everything and never implement any logic in the DSV or via queries.
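To make the view-binding idea concrete, here is a minimal sketch using Python's built-in sqlite3 in place of SQL Server (the table, view, and column names are hypothetical). The point is the same: the cleanup logic lives once in the view, and every client sees the identical model.

```python
import sqlite3

# Illustration of tip #26: clients bind to a view, never to the raw table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE DimCustomer_raw (CustomerKey INTEGER, Name TEXT, Segment TEXT);
INSERT INTO DimCustomer_raw VALUES (1, 'Acme', NULL), (2, 'Globex', 'Retail');

-- The view is the only object SSAS, reports, and ad-hoc SQL see.
-- Massaging logic (here: defaulting NULL segments) is defined exactly once.
CREATE VIEW vDimCustomer AS
SELECT CustomerKey,
       Name,
       COALESCE(Segment, 'Unknown') AS Segment
FROM DimCustomer_raw;
""")

rows = con.execute("SELECT * FROM vDimCustomer ORDER BY CustomerKey").fetchall()
print(rows)  # [(1, 'Acme', 'Unknown'), (2, 'Globex', 'Retail')]
```

Because all consumers go through `vDimCustomer`, swapping the model type later (or renaming the underlying table) touches only the view definition.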
#27: Have some way of defining “current” time periods in your SSAS solution
Most SSAS solutions have a time dimension with dates, months, years, etc. In many ways it's the most important dimension in your solution, as it will be included in most reports and analyses and form the basis for a lot of calculations (see previous tips). Having a notion of the current period in your time dimension will greatly improve the usability of your solution: reports will automatically be populated with the latest data without any user interaction. It can also simplify ad-hoc analysis by setting the default members to the most current date/month/year, so that when users do not put these on one of the axes, it will default to the most recent time period. There are a number of ways of implementing this, including calculated members and named sets (for dimensional) and calculations (for tabular), and the internet is abundant with sample solutions. Some of them are fully automated (using VBA time functions) and some require someone to manually set the current period. I prefer the latter if possible, to avoid reports showing incorrect data if something went wrong in the ETL.
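The "manually set" variant can be sketched as follows (a hypothetical, simplified date dimension in plain Python; the attribute and function names are my own). The current-period flag is flipped only after a load has been verified, which is exactly why reports defaulting to the current member never show half-loaded data.

```python
# Sketch of tip #27: a manually maintained "current period" marker.
# An operator (or the last step of a verified ETL run) sets the flag;
# reports default to the member where is_current is True.
date_dim = [
    {"month": "2013-05", "is_current": False},
    {"month": "2013-06", "is_current": False},
]

def set_current_period(dim, month):
    """Mark exactly one member as current; call only after a good ETL run."""
    for row in dim:
        row["is_current"] = (row["month"] == month)

set_current_period(date_dim, "2013-06")
current = [r["month"] for r in date_dim if r["is_current"]]
print(current)  # ['2013-06']
```

In a real dimensional solution this flag would typically back a named set (e.g. a "Current Month" set) that reports reference instead of a hard-coded member.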
#28: Create a testable solution
This is a really big topic, so I will emphasize what I have found most important. A BI solution has a lot of moving parts: your various source systems, your ETL pipeline, logic in the database, logic in your SSAS solution, and finally logic in your reporting solution. Errors happen in all of these layers, but your Integration Services solution is probably the most vulnerable part. Not only do technical errors occur; far more costly are logic errors where your numbers don't match what is expected. Luckily, there are a lot of things you can do to help identify when these errors occur. As mentioned in tips #6 and #7, you should use a framework. You should also design your solution to be unit testable. This boils down to creating lots of small packages that can be run in isolation rather than large complex ones. Most importantly, you should create validation queries that compare the data you load in your ETL with data in the source systems. How these queries are crafted varies from system to system, but a good starting point would be comparisons of row counts, sums of measures (facts), and numbers of unique values. The way I do it is to create the test before building anything. So if I am to load customers that have changed since X, I first create the test query for the source system (row counts, distinct values, etc.), then the query for the data warehouse together with a comparison query, and finally I start building the actual integration. Ideally you will package this into an SSIS solution that logs the results into a table. This way you can utilize your validation logic both while developing the solution and once it's deployed. If you are running SQL Server 2012, you might want to look into the data tap features of SSIS that let you inspect data flowing through your pipeline from the outside.
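A minimal sketch of such a validation-and-logging step, using sqlite3 as a stand-in for the source system and warehouse (all table and column names here are hypothetical). It compares row counts and a measure sum and records the result in a log table, mirroring the "SSIS solution that logs results" approach:

```python
import sqlite3

# Sketch of tip #28: validation queries comparing source vs. warehouse,
# with results logged per check so they can run during dev and after deploy.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_sales (id INTEGER, amount REAL);
CREATE TABLE dw_sales  (sales_key INTEGER, amount REAL);
INSERT INTO src_sales VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO dw_sales  VALUES (1, 10.0), (2, 20.0);  -- one row went missing!
CREATE TABLE etl_validation_log (check_name TEXT, src REAL, dw REAL, ok INTEGER);
""")

def validate(check_name, src_sql, dw_sql):
    """Run the same aggregate against both systems and log whether they match."""
    src = con.execute(src_sql).fetchone()[0]
    dw = con.execute(dw_sql).fetchone()[0]
    con.execute("INSERT INTO etl_validation_log VALUES (?,?,?,?)",
                (check_name, src, dw, int(src == dw)))
    return src == dw

ok_rows = validate("row_count", "SELECT COUNT(*) FROM src_sales",
                   "SELECT COUNT(*) FROM dw_sales")
ok_sum = validate("amount_sum", "SELECT SUM(amount) FROM src_sales",
                  "SELECT SUM(amount) FROM dw_sales")
print(ok_rows, ok_sum)  # False False -- the missing row is caught, not shipped
```

Writing the `validate` calls before building the load (test-first, as described above) means the comparison is ready the moment the first rows flow through.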
#29: Avoid the source if you are scaling for a large number of users
Building a BI solution to scale is another very large topic. If you have lots of data, you need to scale your ETL, database, and SSAS subsystems. But if you have lots of users (thousands), your bottleneck will probably be SSAS. Concurrently handling tens to hundreds of queries with acceptable performance is just not feasible. The most effective thing is to avoid hitting the source as much as possible. I usually take a two-pronged approach. Firstly, I implement as much as possible as standard ("canned") reports that can be cached. Reporting Services really shines in these scenarios: it allows for flexible caching schemes that in most circumstances eliminate all trips to the data source. This will usually cover around 70-80% of requirements. Secondly, I deploy an ad-hoc cube specifically designed and tuned for exploratory reporting and analysis; I talked about this in tip #17. In addition, you need to consider your underlying infrastructure. Both SSRS and SSAS can be scaled up and out. For really large systems you will need to do both, even with the best of caching schemes.
#30: Stick with your naming standards
There are a lot of objects that need to be named in a solution, from the more technical objects such as database tables and SSIS packages to objects exposed to users such as SSAS dimensions and measures. The most important thing with naming conventions is not what they are, but that they are implemented. As I talked about in tip #24, changing a name can have far-reaching consequences. This is not just a matter of things breaking if you change them; consider all of the support functionality in the platform, such as logging, that utilizes object names. Having meaningful, consistent names will make it a heck of a lot easier to get value out of this. So at the start of the project I would advise having a "naming meeting" where you agree upon how you will name your objects. Should dimension tables be prefixed with Dim or Dim_? Should dimension names be plural (Customers) or singular (Customer), etc.?
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Consultant at a tech consulting company with 501-1,000 employees
My 30 tips for building a Microsoft BI solution, Part III: Tips 11-15
#11: Manage your own surrogate keys.
In SQL Server it is common to use an INT or BIGINT set as IDENTITY to create unique, synthetic keys. The number is a sequence and a new value is generated when we execute an insert. There are some issues with this. Quite often we need this value in our Integration Services solution to do logging and efficient loads of the data warehouse (there will be a separate tip on this). This means that sometimes we need the value before an insert and sometimes after. You can obtain the last value generated by issuing a SCOPE_IDENTITY command but this will require an extra trip to the server per row flowing through your pipeline. Obtaining the value before an insert happens is not possible in a safe way. A better option is to generate the keys yourself through a script component. Google for “ssis surrogate key” and you will find a lot of examples.
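What such a key generator boils down to, in any language, is a sketch like the one below (the class and seed value are hypothetical; in SSIS this logic would live in a script component). The generator is seeded once from the warehouse's current maximum key, then hands out keys in memory, so the key is known *before* the insert and no per-row round trip to the server is needed:

```python
# Sketch of tip #11: generating surrogate keys in the ETL pipeline instead
# of relying on an IDENTITY column and SCOPE_IDENTITY() round trips.
class SurrogateKeyGenerator:
    def __init__(self, current_max):
        # current_max would come from e.g. SELECT MAX(CustomerKey) FROM DimCustomer,
        # queried once at the start of the load.
        self._next = current_max + 1

    def next_key(self):
        key = self._next
        self._next += 1
        return key

gen = SurrogateKeyGenerator(current_max=1041)
incoming = ["Acme", "Globex", "Initech"]          # rows flowing through the pipeline
keyed = [(gen.next_key(), name) for name in incoming]
print(keyed)  # [(1042, 'Acme'), (1043, 'Globex'), (1044, 'Initech')]
```

The trade-off is that the load must be the only writer to the dimension while it runs; otherwise two generators could hand out overlapping ranges.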
#12: Excel should be your default front-end tool.
I know this is a little bit controversial. Some say Excel lacks the power of a "real" BI tool. Others say it writes inefficient queries. But hear me out. Firstly, if you look at where Microsoft is making investments in the BI stack, Excel is right up there at the top. Contrast that with what they are doing with PerformancePoint and Reporting Services, and it's pretty clear that Excel is the most future-proof of the lot. Microsoft has added a lot of BI features over the last couple of releases and continues to expand Excel through new add-ins such as data explorer and geoflow. Additionally, the integration with SharePoint gets tighter and tighter. The Excel web client of SharePoint 2013 is pretty much on par with the fat Excel client when it comes to BI functionality. This means that you can push out the new features to users who have not yet upgraded to the newer versions of Excel. When it comes to the efficiency with which Excel queries SSAS, a lot has improved, but being a general analysis tool, it will never be able to optimize its queries as you would if you wrote them specifically for a report. Please note that I am saying "default", not "best". Of course there are better, purebred Business Intelligence front-ends out there; some of them even have superior integration with SSAS. But it's hard to beat the cost-value ratio of Excel if you are already running a Microsoft shop. If you add in the fact that many managers and knowledge workers already do a lot of work in Excel and know the tool well, the equation becomes even more attractive.
#13: Hug an infrastructure expert that knows BI workloads.
Like most IT solutions, Microsoft BI solutions are only as good as the hardware and server configurations they run on. Getting this right is very difficult and requires deep knowledge of operating systems, networks, physical hardware, security, and the software that is going to run on these foundations. To make matters worse, BI solutions have workloads that often differ fundamentally from line-of-business applications in the way they access system resources and services. If you work with a person who knows both of these aspects, you should give him or her a hug every day, because they are a rare breed. Typically, BI consultants know a lot about the characteristics of BI workloads but nothing about how to configure hardware and software to support them. Infrastructure consultants, on the other hand, know a lot about hardware and software but nothing about the specific ways BI solutions access them. Here are three examples. First, Integration Services is mainly memory constrained. It is very efficient at processing data as a stream as long as there is enough memory; the instant it runs out of memory and starts swapping to disk, you will see a dramatic decrease in performance. So if you are doing heavy ETL, co-locating it with other memory-hungry services on the same infrastructure is probably a bad idea. The second example is the way data is loaded and accessed in data warehouses. Unlike business systems, which often do random data access ("Open the customer card for Henry James"), data warehouse access is sequential: batches of transactions are loaded into the warehouse, and data is retrieved by reports and Analysis Services models in batches. This has a significant impact on how you should balance the hardware and configuration of your SQL Server database engine, and it differs fundamentally from how you handle workloads from business applications. The last example may sound extreme but is something I have encountered multiple times.
When businesses outsource their infrastructure to a third party, they give up some control and knowledge in exchange for the ability to "focus on their core business". This is a good philosophy with real value. Unfortunately, if no one on the requesting side of this partnership knows what to ask for when ordering infrastructure for your BI project, what you get can be pretty far from what you need. Recently, a client of mine made such a request for a SQL Server-based data warehouse server. The hosting partner followed their SLA protocol and supplied a high-availability configuration with a mandatory full recovery model for all databases. You can imagine the exploding need for disk space for the transaction logs when loading batches of 20 million rows each night. As these examples illustrate, it is critical for a successful BI implementation to have people with infrastructure competency on your BI team who also understand how BI solutions differ from "traditional" business solutions and can apply the right infrastructure configurations.
#14: Use Team Foundation Server for your BI projects too.
A couple of years ago, putting Microsoft BI projects under source control was a painful experience where the benefits drowned in a myriad of technical issues. This has improved a lot. Most BI artifacts now integrate well with TFS, and BI teams can greatly benefit from the functionality provided by the product, such as source control, issue tracking, and reporting. Especially for larger projects with multiple developers working on the same solution, TFS is the way to go in order to work effectively in parallel. As an added benefit, you will sleep better at night knowing that you can roll back that dodgy check-in you performed a couple of hours ago. With that said, there are still issues with the TFS integration. SSAS data source views are a constant worry, as are server and database roles. But all of this (including workarounds) is pretty well documented online.
#15: Enforce your attribute relationships.
This is mostly related to SSAS dimensional, but you should also keep it in mind when working with tabular. Attribute relationships define how the attributes of a dimension relate to (roll up into) each other. For example, products would roll up into product subgroups, which would in turn roll up into product groups. This is a consequence of the denormalization process many data warehouse models go through, where complex relationships are flattened out into wide dimension tables. These relationships should be defined in SSAS to boost general performance. The best-practice analyzer built into data tools makes sure you remember this with its blue squiggly lines. Usually it takes some trial and error before you get it right, but in the end you are able to process your dimension without those duplicate attribute key errors. If you still don't know what I am talking about, look it up online. So far so good. Problems start arising when these attribute relationships are not enforced in your data source, typically a data warehouse. Continuing with the earlier example: over time you might get the same product subgroup referencing different product groups ("parents"). This is not allowed and will cause processing of the dimension to fail in SSAS (those pesky duplicate key errors). To handle this a bit more gracefully than simply leaving your cube(s) in an unprocessed state (with the angry phone calls this brings with it), you should enforce the relationship at the ETL level, in Integration Services. When loading a dimension, you should reject or handle cases where these relationships are violated and notify someone that this happened. The process should ensure that the integrity of the model is maintained by assigning "violators" to a special member of the parent attribute that marks them as "suspect". In this way your cubes can still be processed, while highlighting data that needs attention.
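The enforcement step described above can be sketched as follows (a simplified, hypothetical illustration; in practice this would be an SSIS transformation). A subgroup seen with two different parents is reassigned to a special "Suspect" parent so the dimension still processes, and the offending rows are collected for review:

```python
# Sketch of tip #15: enforcing the subgroup -> group attribute relationship
# during ETL. A subgroup with two different parents would fail SSAS dimension
# processing with duplicate key errors, so violators are parked under a
# special 'Suspect' parent and flagged for someone to investigate.
rows = [
    {"product": "A", "subgroup": "Bikes", "group": "Sports"},
    {"product": "B", "subgroup": "Bikes", "group": "Sports"},
    {"product": "C", "subgroup": "Bikes", "group": "Outdoor"},  # violation
]

def enforce_relationship(rows):
    first_parent, violators = {}, []
    for r in rows:
        # The first parent seen for a subgroup wins; later disagreements
        # are violations of the one-parent rule.
        parent = first_parent.setdefault(r["subgroup"], r["group"])
        if r["group"] != parent:
            violators.append(dict(r))   # preserved so someone can be notified
            r["group"] = "Suspect"      # keeps the dimension processable
    return violators

bad = enforce_relationship(rows)
print(len(bad), rows[2]["group"])  # 1 Suspect
```

The cube processes cleanly, and the "Suspect" member surfaces the bad data in every report instead of hiding it behind a failed processing run.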
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Hi Peter !
Nice article! Now let's discuss points 11 to 15 in detail:
#11: I partially agree with you on this, because I don't understand the need for creating a separate surrogate key for SSIS. My point is to use the keys from the production tables; personally, I use the Change Table method to perform incremental loads. If a separate key is required in your data warehouse model, you can create one using a combination of columns, by reading the value from the source table, or by loading a value into an SSIS variable and then assigning it to your table.
#12: I prefer to use Excel as a tool where I can perform quick data verification or number reconciliation by connecting to my cube. I know Microsoft has been investing a lot in Excel through Power Pivot and so on. But what about the future of "Power BI", which we hear is a new tool that will have the capabilities to become the number one BI tool for reporting? Personally, I don't think Excel can be used as an enterprise reporting tool.
#13: A rare thing to have. Another thing to add: it is really hard to find a BI consultant who has experience not only in cube optimization, but also in report and database optimization. If you have one of these, I call them a "real asset", because they will help you not only in OLAP, but also in OLTP, in your SSIS, and in your reporting. I suggest including at least one of these people in a BI project; it will actually save you time and money.
#14: I have been using TFS to keep my SSRS reports under source control, and it has been nice that it doesn't act up badly. But I do have reservations about keeping my SSIS packages in TFS, because it has happened to me multiple times that they somehow got corrupted; luckily, I don't rely on TFS alone, so I still had the source. Always have a backup strategy for recovering your work in case your source control fails. Be prepared, because it can happen at any time.
#15: It is always good to define hierarchies and attribute relationships; define hierarchies whenever possible. Remember, once you define a hierarchy, hide the underlying attribute so that it isn't duplicated in the reporting tool. For example, if you are using PerformancePoint, the end user might otherwise see the same attribute both inside the hierarchy and directly on the dimension. So set the attribute's visibility to hidden.
Designing a BI solution is an interesting job; with each project you will learn new things. Always plan your development and choose the right tools for your final solution. If you are unsure about something, it is better to discuss it with other consultants to pick the right product.
Regards,
Hasham Niaz
Consultant with 51-200 employees
Mapping Business Intelligence Developer’s Tools: Microsoft SQL server & SAP Netweaver BW
This post is not about Microsoft BI vs. SAP BI. NO. NO. NO.
Then what is it?
Well, I have been playing with SAP's Netweaver BW tools for the past three months as part of a Business Intelligence class that's about to conclude. I have also been working with Microsoft's SQL Server Business Intelligence tools. So I thought it would be FUN to map the SAP Netweaver BW tools (which I got to use in an academic capacity) to Microsoft's Business Intelligence tools (which is what I am currently working with) – so, here you go:
| | Tool in Microsoft BI | Tool in SAP Netweaver BW |
| --- | --- | --- |
| ETL (Extract, Transform, Load) | SQL Server Integration Services (SSIS) | SAP Netweaver BW: Data Warehousing Workbench |
| Cube | SQL Server Analysis Services – Multidimensional Mode (SSAS) | SAP Netweaver BW: Data Warehousing Workbench: Modeling |
| Report Design Tool and Reporting Layer (not an exhaustive list; third-party tools not included) | | Business Explorer (BEx) |
| Data Mining | Data Mining Projects in SQL Server Analysis Services | SAP Netweaver BW: Data Mining – Analysis Process Designer |
A note about SAP BusinessObjects: I mapped the tools in Microsoft BI to the tools I studied in my SAP class. Then I looked into the current state of the SAP world (I already know Microsoft's!) and learned that the SAP BI world comprises the tools in SAP Netweaver BW plus SAP BusinessObjects (BO). In the course I studied the following BusinessObjects components:
- Web Intelligence for ad-hoc query and reporting
- Crystal Reports for enterprise reporting
- Xcelsius (BO Dashboard) for Dashboard designing
For those interested, I am also mapping a few terms used in cube development in Microsoft BI and SAP Netweaver BW:
| Microsoft: SSAS Multidimensional Mode | SAP Netweaver BW |
| --- | --- |
| Cube | InfoCubes |
| Dimensions | Characteristics |
| Measures | Key Figures |
| Data Source Views (DSVs) | Data Source |
Note:
1) I have not mapped the tools in the self-service BI space.
2) This comparison is not meant for deciding between Microsoft BI and SAP Netweaver BW/SAP BusinessObjects – this post simply maps the tools available in each, so if you are an expert in, say, Microsoft BI, it will help you see what corresponding tools are available in the SAP Netweaver BW world. Consider it a starting guide for your research.
3) Note the date the post was written – product names may have changed since then. Refer to the official sites for the latest and greatest!
Thanks for reading.
This post was republished from Mapping Business Intelligence Developer’s Tools: Microsoft SQL server & SAP Netweaver BW
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Educator at a university with 11-50 employees
Great teaching tool for data transfers
Pros and Cons
- "Being able to transfer data with the Power Query feature is great for teaching with about one to two hundred rows of data. I am also able to see the relationship between multiple tables and create various sheets and dashboards."
- "The Power Query feature needs to be improved to handle large amounts of data. This feature is also slow for large data transfers making it better to use some kind of script."
What is our primary use case?
I have been a professor in a master's program at UISEK University for the past two years, where I teach students how to use these tools.
What is most valuable?
Being able to transfer data with the Power Query feature is great for teaching with about one to two hundred rows of data. I am also able to see the relationship between multiple tables and create various sheets and dashboards.
What needs improvement?
The Power Query feature needs to be improved to handle large amounts of data. It is also slow for large data transfers, making it better to use some kind of script instead.
Additionally, in the next release, I would like to be able to export things such as charts as a PDF.
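The "use some kind of script" workaround for large transfers could be as simple as the following sketch, which loads a big CSV into a database in batches so memory use stays flat (file, table, and column names are illustrative assumptions, not from the review):

```python
import csv
import sqlite3

# Sketch: load a large CSV into a database with a script instead of
# Power Query, inserting in fixed-size batches.

def load_csv(path, conn, table="sales", batch_size=10_000):
    cur = conn.cursor()
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join(header)
        placeholders = ", ".join("?" for _ in header)
        cur.execute(f"CREATE TABLE IF NOT EXISTS {table} ({cols})")
        batch = []
        for row in reader:
            batch.append(row)
            if len(batch) >= batch_size:      # flush a full batch
                cur.executemany(f"INSERT INTO {table} VALUES ({placeholders})", batch)
                batch.clear()
        if batch:                             # flush the remainder
            cur.executemany(f"INSERT INTO {table} VALUES ({placeholders})", batch)
    conn.commit()
```

Once loaded this way, Power BI can query the database directly instead of pulling the raw file through Power Query.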
For how long have I used the solution?
I have been using this solution for two years.
Disclosure: I am a real user, and this review is based on my own experience and opinions.

Thanks, Peter, for the great range of tips for using the Microsoft BI toolset. They are indeed a must-read for all developers and novice users of this great tool for businesses.