Business Intelligence Developer at a financial services firm with 5,001-10,000 employees
Self-service allows end-users to source information, but email-based subscription is needed
How has it helped my organization?
It cuts out the time spent sourcing some of this information, and a lot of things have been automated. Things that used to take close to a week are now done in one day.
What is most valuable?
It's really the self-service aspect of Power BI. We're on a lean structure here on the technical side, so we wanted to push some of those tasks back to end-users.
What needs improvement?
For our reporting services, it would be good to have the capability for email-based subscriptions. I want to be able to change the sender dynamically, without having to run through a workaround exercise.
I know for a fact that SAP can do this dynamic sending for subscriptions, and IBM Cognos can do it as well. Why Microsoft has declined to do it, so that you have to rely on an add-on to get it...
Overall, it has performed to our expectations. It's okay for what we want to use it for. But because they keep changing it, every now and then there's a major shift in versions, so we are trying to catch up.
What do I think about the stability of the solution?
I've used the IBM suite and I've used SAP, and Microsoft is fairly stable for me. And because there is a large community, for every issue or bug that you encounter there is a solution; just Google it, without having to go for special training as with SAS.
What do I think about the scalability of the solution?
From what I understand, it requires some level of expertise to even understand what it can do and how it can be used in a huge enterprise, so deployment at that scale is not so easy. But for a medium-sized organization, yes, it's scalable.
How was the initial setup?
It was straightforward. Very straightforward.
You can just go to YouTube; somebody has done it before. Because it has a GUI, you just click, click, click, next, next, and you're done. And YouTube helped a lot in deploying projects: SSAS projects, SSIS projects, and SSRS projects.
Which other solutions did I evaluate?
IBM Analytics. But the learning curve for IBM is what actually led us to pause the implementation of Cognos 10. Everything was pretty much difficult to do. Things that you could do in two clicks in Microsoft, you would be doing in five or ten clicks in IBM.
What other advice do I have?
When selecting a vendor, the most important criteria are affordability - because of the exchange rate in my country - and ease of use.
I would rate this solution a seven out of 10, because you can actually go from the common tools that you use, from your Excel, from your SharePoint... You can pretty much do everything with it. The little things that you can't do, you can customize it to do by writing code.
I would say that you should look past the limitations. Focus on the positives, because it's cheaper to acquire than most of the big five, and it's easy to use.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
BI Expert with 51-200 employees
We’ve Got The Power: “Power BI”, New Microsoft BI Suite Announced
Power BI: a new suite of Business Intelligence tools
Over the past few months, teams at Microsoft have made several new Business Intelligence tools available for preview; some only privately and some to the public. The entire suite will soon be available for either public preview or release under the new name: "Power BI". All of the components of Power BI are listed below, but the big news is a new hosted offering called "Power BI for Office 365" and "Power BI Sites". The announcement was made at the Worldwide Partner Conference this week. Users can sign up to be notified when the new offerings reach general availability, apparently in the very near future. I've had an opportunity to work with early, pre-released versions and it has been interesting to see the gaps being filled a little at a time. On the heels of the new suite, some of the names of existing products are also being changed. It's hard to have a conversation about the collection of Microsoft's "Power"/"Pivot"/"Point"-named tools and not get tongue-tied, but these changes bring more consistency.
Bottom line: this is good news and a promising step forward, especially for smaller businesses. Larger, enterprise customers should know that this move is consistent with Microsoft's "cloud first" philosophy and that these capabilities are being introduced through the Office 365/Azure platform, with connectivity required. Read the commentary on community leaders' sites below. I have no doubt that there will be a lot of discussion about this in the weeks to come, with more announcements from Microsoft in the near future.
Power BI for Office 365 and Power BI Sites
When Power View was released with SQL Server 2012 Enterprise and Business Intelligence Editions, it was available only when integrated with SharePoint 2010 Enterprise Edition. This was a good solution for enterprise customers, but it was complex and expensive for some to get started with. Power View was also offered only as a Silverlight application that wouldn't work on many mobile devices and web browsers. For this reason, Power View has really been viewed as a "Microsoft only" tool, and only for big companies with deep pockets and very capable IT support groups. Even the new Power View add-in for Excel 2013 ProPlus Edition requires Silverlight, which is not a show-stopper for most folks but a hindrance for multi-platform and tablet users. This all changes with the new offering, as the Power View visualization tool in the hosted product comes in three new flavors: a native Windows 8 app (runs on desktop, Surface RT & Pro), native iOS (targeting the iPad) and HTML5 (works on practically any newer device). This means that when you open a Power View report on your Surface or iPad, it can run as an installed app with all the cool pinch-zoom and gestures you've come to expect on a tablet device. For now, this is good news only for the cloud user, as no on-premises option is currently available. An interesting new addition will be the introduction of a semantic translation engine for natural language queries, initially for English.
Power Query
Formerly known as “Data Explorer”, this add-in for Excel 2013 allows you to discover and integrate data into Excel. Think of it as intelligent, personal ETL with specialized tools to pivot, transform and cleanse data obtained from web-based HTML tables and data feeds.
Power Map
This Excel 2013 ProPlus add-in, which was previously known as “GeoFlow”, uses advanced 3-D imaging to plot data points on a global rendering of Bing Maps. Each data point can be visualized as a column, stacked column or heat map point positioned using latitude & longitude, named map location or address just like you would in a Bing Maps search. You can plot literally thousands of points and then tour the map with the keyboard, mouse or touch gestures to zoom and navigate the globe. A tour can be created, recorded and then played back. Aside from the immediate cool factor of this imagery, this tool has many practical applications.
Power Pivot
The big reveal is that "PowerPivot" shall now be known as "Power Pivot". Note the space, added so that the name is consistent with the other applications. We all know and love this tool, an add-in for Excel 2010 and Excel 2013 ProPlus (two different versions with some different features) that allows large volumes of related, multi-table data to be imported into an in-memory semantic model with sophisticated calculations. On a well-equipped computer, this means that a model could contain tens of millions of rows that get neatly compressed into memory and can be scanned, queried and aggregated very quickly. Power Pivot models (stored as an Excel .xlsx file) can be uploaded to SharePoint, where they become a server-managed resource. A Power Pivot model can also be promoted to a server-hosted SSAS Tabular model, where data is not only managed and queried on an enterprise server but also takes on many of the features and capabilities of a classic SSAS multidimensional database. Whether a Power Pivot model is published to a SharePoint library or promoted to a full-fledged SSAS Tabular model, the data can be queried by any client tool as if it were an Analysis Services cube.
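To make "sophisticated calculations" a little more concrete, here is a minimal DAX sketch of the kind of measures such a model typically carries. The table, column and measure names (Sales, 'Date', SalesAmount) are hypothetical, not from any particular sample:

    Total Sales := SUM ( Sales[SalesAmount] )

    Sales Prior Year :=
    CALCULATE (
        [Total Sales],
        SAMEPERIODLASTYEAR ( 'Date'[Date] )  -- shifts the current filter context back one year
    )

    Sales YoY % :=
    DIVIDE ( [Total Sales] - [Sales Prior Year], [Sales Prior Year] )  -- DIVIDE returns BLANK on zero

Measures like these are evaluated against the compressed in-memory model, so they stay fast over millions of rows, and they travel with the model when it is promoted to a server-hosted SSAS Tabular database.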
Power View
For now, Power View in Excel 2013 ProPlus and Power View in SharePoint 2010 Enterprise and SharePoint 2013 Enterprise remain the same: the Silverlight-based drag-and-drop visual analytic tool. With the addition of SQL Server 2012 CU4, Power View in SharePoint can be used with SharePoint-published Power Pivot models, SSAS Tabular models and SSAS Multidimensional "cube" models. There has been no news yet about a non-Silverlight replacement for the on-premises version of Power View. The Microsoft teams and leadership have heard the requests and feedback, loud and clear, from the community, and we can only guess that more is in the works, but I make no forecast or assumptions about the eventual availability of an on-premises offering similar to Power BI for Office 365.
Additional thoughts and information from the community can be found at:
Chris Webb: Some Thoughts About Power BI
Andrew Brust: Microsoft Announces Power BI for Office 365
SQL Server Blog: Introducing Power BI for Office 365
Disclosure: I am a real user, and this review is based on my own experience and opinions.
BI Expert with 51-200 employees
Taking the Tabular Journey
A Getting-Started and Survival Guide for planning, designing and building Tabular Semantic Models with Microsoft SQL Server 2012 Analysis Services.
by Paul Turley
This post will be unique in that it will be a living document that will be updated and expanded over time. I will also post-as-I-go on the site about other things but this particular post will live for a while. I have a lot of good intentions – I know that about myself and I also know that the best way to get something done is to get it started – especially if I’m too busy with work and projects. If it’s important, the “completing” part can happen later. In the case of this post, I’ll take care of building it as I go, topic by topic. Heck, maybe it will never be “finished” but then are we ever really done with IT business solutions? I have been intending to get started on this topic for quite some time but in my very busy project schedule lately, didn’t have a concise message for a post – but I do have a lot to say about creating and using tabular models.
I’ve added some place-holder topic headers for some things that are on my mind. This list is inspired by a lot of the questions my consulting customers, students, IT staff members and business users have asked me on a regular basis. This will motivate me to come back and finish them and for you to come back and read them. I hope that you will post comments about your burning questions, issues and ideas for related topics to cover in this living post about tabular model design practices and recommendations.
Why Tabular?
SQL Server Analysis Services is a solid and mature platform that now serves as the foundation for two different implementations. Multidimensional models are especially suited for large volumes of dimensionally-structured data that have additive measure values which sum up along related dimensional attributes and hierarchies.
By design, tabular architecture is more flexible than multidimensional in a number of scenarios. Tabular works well with dimensional data structures, but it also works well in cases where the structure of the data doesn't resemble a traditional star or snowflake of fact and dimension tables. When I started using PowerPivot and tabular SSAS projects, I insisted on transforming data into star schemas like I've always done before building a cube. In many cases, I still do, because it's easier to design a predictable model that performs well and is easy for users to navigate. A dimensional model has order and discipline; however, the data is not always shaped this way, and it can take a lot of effort to force it into that structure.
Tabular is fast not only for additive, hierarchically structured data; in many cases it also works well with normalized and flattened data, as long as all the data fits into memory and the model is designed to support simple relationships and calculations that take advantage of the formula engine and the VertiPaq compression and query engine. It's actually pretty easy to make tabular do silly, inefficient things, but it's also not very hard to make it work really well (a short example follows this paragraph).
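As a small illustration of that last point, consider two hedged sketches (hypothetical Sales table) that return the same number but stress the engine very differently, depending on whether the filter can be handled by the VertiPaq storage engine or must be iterated row by row by the formula engine:

    -- Efficient: a plain column predicate the storage engine can apply during its scan
    Large Sales := CALCULATE ( SUM ( Sales[SalesAmount] ), Sales[SalesAmount] > 100 )

    -- Works, but heavier: FILTER iterates the entire Sales table row by row
    Large Sales Slow :=
    CALCULATE (
        SUM ( Sales[SalesAmount] ),
        FILTER ( Sales, Sales[SalesAmount] > 100 )
    )

Both are valid DAX; the first simply gives the engine more room to do what it is good at.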
James Serra has done a nice job of summarizing the differences between the two choices, highlighting the strengths and comparative weaknesses of each in his April 4 blog post titled SQL Server 2012: Multidimensional vs Tabular. James points out that tabular models can be faster and easier to design and deploy, and that they consistently perform well without needing a lot of extra attention for tuning and optimization. Honestly, there isn't that much to maintain, and a lot of the tricks we use to make cubes perform better (like measure group partitioning, aggregation design, strategic aggregation storage, usage-based optimization, proactive caching and cache-warming queries) are simply unnecessary. Most of these options don't really exist in the tabular world. We do have partitions in tabular models, but they're really just for ease of design.
What About Multidimensional – Will Tabular Replace It?
The fact is that multidimensional databases (which most casual SSAS users refer to as "cubes") will be supported for years to come. The base architecture for SSAS OLAP/UDM/Multidimensional is about 13 years old; Microsoft originally acquired the product code base from Panorama and then went on to enhance, and eventually rewrite, the engine over the years as it matured. In the view of many industry professionals, this is still the more complete and feature-rich product.
Both multidimensional and tabular have strengths and weaknesses today, and one is not clearly superior to the other. In many cases tabular performs better and its models are simpler to design and use, but the platform still lacks equivalent commands and advanced capabilities. In the near future, the tabular product may inherit all of the features of its predecessor and the choice may become clearer; or perhaps a hybrid product will emerge.
Isn’t a Tabular Model Just Another Name for a Cube?
No. …um, yes. …well, sort of. Here's the thing: the term "cube" has become a de facto term used by many to describe the general concept of a semantic model. Technically, the term "cube" defines a multidimensional structure that stores data in hierarchies of multi-level attributes and pre-calculated aggregate measure values at the intersection points between all those dimensions, and at strategic points between many of the level members in between. It's a cool concept and an even cooler technology, but most people who aren't close to this product don't understand all that. Users just know that it works somehow, but they're often confused by some of the fine points… like the difference between hierarchies and levels. One has an All member and one doesn't, but they both have all the other members. It makes sense when you understand the architecture, but it's just weird behavior for those who don't.
Since the tabular semantic model is actually Analysis Services with a single definition of object metadata, certain client tools will continue to treat the model as a cube, even though it technically isn't one. A tabular Analysis Services database contains some tables that serve the same purpose as measure groups in multidimensional semantic models. The rest of the tables are exposed as dimensions, in the same way that cube dimensions exist in multidimensional. If a table in a tabular model includes both measures and attribute fields, in certain client tools, like Excel, it will show up twice in the model: once as a measure group table and once as a dimension table.
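For example, a reporting tool can send a native DAX query to a tabular database, while an MDX client such as Excel addresses the very same tables as measure groups and dimensions. A sketch, again with hypothetical table and column names:

    EVALUATE
    SUMMARIZE (
        Sales,
        'Product'[Category],                       -- grouped like a dimension attribute
        "Total Sales", SUM ( Sales[SalesAmount] )  -- aggregated like a measure
    )
    ORDER BY 'Product'[Category]

The model itself doesn't change between the two access paths; only the query language and the client's mental picture of it do.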
(more to come)
Preparing Data for a Tabular Model
Data Modeling 101 for Tabular Models
Are There Rules for Tabular Model Design?
Tabular Model Design Checklist
What’s the Difference Between Calculated Columns & Measures?
What are the Naming Conventions for Tabular Model Objects?
What’s the Difference Between PowerPivot and Tabular Models?
How to Promote a Business-created PowerPivot Model to an IT-managed SSAS Tabular Model
Getting Started with DAX Calculations
DAX: Essential Concepts
DAX: Some of the Most Useful Functions
DAX: Some of the Most Interesting Functions
Using DAX to Solve real-World Business Scenarios
Do I Write MDX or DAX Queries to Report on Tabular Data?
Can I Use Reporting Services with Tabular & PowerPivot Models?
Do We Need to Have SharePoint to Use Tabular Models?
What Do You Teach Non-technical Business Users About PowerPivot and Tabular Models?
What’s the Best IT Tool for Reporting on Tabular Models?
What’s the Best Business User Tool for Browsing & Analyzing Business Data with Tabular Models?
Survival Tips for Using the Tabular Model Design Environment
How Do You Design a Tabular Model for a Large Volume of Data?
How Do You Secure a Tabular Model?
How to Deploy and Manage a Tabular Model SSAS Database
Tabular Model Common Errors and Remedies
Tabular Model, Workspace and Database Recovery Techniques
Scripting Tabular Model Measures
Simplifying and Automating Tabular Model Design Tasks
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Consultant at a tech services company with 51-200 employees
Report Builder … the Red-Headed Stepchild of Self-Service BI Tools?
A conversation on Twitter earlier this week has inspired me to pose this question: Is Report Builder the "Red-Headed Stepchild" when it comes to the Microsoft Self-Service BI toolset? Report Builder doesn't get much attention these days, for various reasons we'll discuss below.
(Note: The perspective of this blog entry is Self-Service BI so Report Builder is the reference – however, everything stated is applicable to full-fledged Reporting Services as well.)
Is Report Builder Deprecated?
Nope! Report Models, however, are deprecated. As of SQL Server 2012, you can no longer create or update Report Models (SMDL files). You can continue to use Report Models currently, but it is recommended to move away from them as time allows. Report Builder itself is at version 3.0 and is a mature product.
Self-Service BI Tools
The primary set of Microsoft Self-Service BI tools includes Excel (plus add-ins for data modeling such as PowerPivot), Power View, and Report Builder. Some people actually wouldn't include Report Builder in this list, but I do believe it has a valid place as one of the Self-Service BI tools (albeit with a smaller audience, used for specific purposes).
Drawbacks of Report Builder
- Learning curve / ease of use for the report designer. There are a significant number of properties and options. This offers significant control over the output; the cost of that control is ease of use, because all the options in Report Builder can be a bit overwhelming for the casual business user. However, it's not overly difficult to use for technically adept users who enjoy working with reporting tools and data.
- Limited interactivity. While there are some interactive features (such as drill-down, drill-through, sortable columns), each has to be explicitly defined by the report designer. Report Builder isn’t dynamically interactive like Power View or even Excel – rather, Report Builder is far more suitable for fully formatted reporting needs.
- Longer to develop. There are some things that can be done in Power View or Excel that are inherently more work to do in Report Builder. The first example that comes to mind is hierarchies: with Power View or Excel, you drop a hierarchy onto the rows and the up/down navigation just works; with Report Builder you'd have to set up what is shown vs. hidden and the toggle properties. This is not overly difficult to set up, but it could be frustrating for someone just getting started with Report Builder.
- GUI support is limited. There is a drag & drop graphical interface for SQL Server and Analysis Services data sources (plus a couple of others). This limited support leaves the report designer writing query syntax sometimes, which is obviously not the most user-friendly experience for non-technical folks. Currently, the nicest way for users to work with Report Builder is via a BISM data source (i.e., the data is stored in Analysis Services or PowerPivot). SQL Server (relational) can be OK for users to work with if the data sources are made understandable and easy to work with (e.g., with reporting views or stored procedures); this takes some effort from the IT Dept. or BI Center to make sure it's suitable for self-service.
Positives of Report Builder
- Native connectivity to many data sources, including BISM. The Microsoft BI framework is very different from other BI tools (such as Cognos or Business Objects) which require a metadata layer – i.e., a report model between the data source & the reporting tool. Microsoft permits tremendous flexibility here – in fact, you can natively send queries from Report Builder to a variety of databases including non-Microsoft. Power View is very limited in terms of accepted data sources, and Excel can be somewhat limited (unless you bring the data into an intermediary PowerPivot model first – PowerPivot offers great flexibility in this regard, but do realize you are storing the data redundantly).
- Significant formatting control. If you need a pixel-perfect, highly formatted report, Report Builder is the tool for you. Ironically, this is the inverse of the "learning curve / ease of use" drawback listed above. With Excel you can exercise a lot of control over the look & feel of your report (except Pivot Tables; you have to use formulas if you need to "break" out of the Pivot Table). Power View has some text size control and some color schemes to choose from, but overall offers minimal user control over formatting (after all, it's a data discovery tool meeting an entirely different need, and it is purposely trying to remain simple).
- Consistent RDL file format. If a business user starts a report in Report Builder and needs some help with it, one of the IT or BI folks can open the report using their tool of choice (BIDS or SSDT in Visual Studio), make some updates, and send it back to the user with the file format intact. The consistent format is also helpful if a report is being promoted from the Self-Service environment to the Corporate BI environment.
- Reusable elements. To facilitate Self-Service BI using Report Builder, things such as shared data sources (reusable data connections) are obvious, but there are others as well. Shared datasets (reusable queries) can be really helpful for handling common things like Dates and Geography. Report parts (reusable charts, graphs, tables) can be helpful for displaying commonly used elements. Images can also be stored centrally for reuse. Setting up reusable elements does take some effort from the IT Dept. or BI Center, but it can improve the Self-Service user's experience tremendously.
- Parameterization. Reports with a number of parameters (within reason of course) can sometimes be thought of as “guided ad-hoc analysis” because one report can yield many different combinations of results depending on parameter values. Report Builder handles parameterization well.
- Subscriptions and alerts. If you wish to have reports delivered to you at a predefined frequency or based upon a condition, then Reporting Services is the tool to make that happen.
- Export and RSS capabilities. Report Builder can export to many different file formats. It can also publish an RSS feed – this can be very useful for a business user to consume existing aggregated/calculated data that has been rendered by Report Builder without recalculating or reinventing anything. Utilizing published report data via RSS also helps with the elusive “one version of the truth” that’s a constant challenge.
- Integration with SharePoint. With a Report Builder report, you can view or edit the report directly from the SharePoint document library (with appropriate permissions of course). Alternatively, a Report Manager portal can be used (although it would be used in isolation from other BI tools and reports).
So, even though sometimes Report Builder seems to be the “Red-Headed Stepchild” I very much appreciate having the tool in our toolbox. I hope it’s alive and well for a long time.
Got any thoughts on this subject? Leave a comment … I’d love to hear your thoughts.
Disclosure: The company I work for is a Microsoft Partner
Disclosure: I am a real user, and this review is based on my own experience and opinions.
BI Expert with 51-200 employees
SSAS Tabular Models: The Good, the Bad, the Ugly & the Beautiful
After working on a couple of serious tabular model projects, I’ve been talking to other BI practitioners lately about their thoughts regarding tabular model technology and I’m finding that there is little consensus among all these industry experts about which option to recommend for new BI projects. At SolidQ, I work with some of the most experienced Business Intelligence experts in the industry. At conferences and in discussion groups, there has been a lot of discussion about the choice between Analysis Services multidimensional models and the newer tabular semantic model technology introduced in SQL Server 2012. There are pros and cons for either side but I think there are some important factors to consider. Among some of the most experienced Analysis Services consultants, I have heard that some are primarily using and recommending tabular for most new projects and others are arguing that tabular offers little significant value over the tried-and-true multidimensional cubes and MDX script solutions.
As is typical for a newer product, the usability experience for the model designer isn't perfect. In some ways it's just different and may take some getting used to, but in other ways there is clearly room for improvement. The question now is: do the advantages of tabular vs. multidimensional outweigh the somewhat rocky design experience? I'm not taking sides in this comparison but merely offering an objective analysis of the state of the tabular model option in SQL Server 2012 Analysis Services, Service Pack 1. I expect this product to go through changes, so for future reference, this information is for the released product at the time of this posting in June of 2013.
The Good
- The core VertiPaq (or xVelocity) query and aggregation engine is stable and reliable. Originally developed about five years ago and released with PowerPivot for Excel and SharePoint in SQL Server 2008 R2 over three years ago, this technology has proven to be ready for serious use.
- Under most conditions, for analytic reporting, data scanned and aggregated from an in-memory data structure performs faster than the other conventional options, including relational and multidimensional storage.
- Tabular can be less complex than multidimensional, OLAP SSAS. The core design and usage concepts are easier both for those who design models and for those who use them for analysis and reporting.
- Tabular models can be easier and faster to implement because the model structure is simpler and there may be fewer steps in the design process.
- DAX, the core calculation expression language for tabular models, is fairly easy to learn. Fundamental DAX expression concepts can be easier to understand than the equivalent MDX commands used in multidimensional modeling and calculations (see the short example after this list).
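To give a feel for that last claim, here is one classic calculation written in DAX against a hypothetical Sales table; the MDX equivalent would involve tuples and CurrentMember navigation that newcomers tend to find harder to read:

    Pct of All Sales :=
    DIVIDE (
        SUM ( Sales[SalesAmount] ),
        CALCULATE ( SUM ( Sales[SalesAmount] ), ALL ( Sales ) )  -- denominator ignores every filter on Sales
    )

Dropped next to any attribute on rows, this measure returns each member's share of the unfiltered total.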
The Bad
- Comparing features, tabular can be hard to justify against multidimensional. Traditional SSAS cubes still have more capabilities than tabular models, and for someone who already has OLAP skills and background, tabular design is a new learning curve.
- PowerPivot is a version-two product. As such, it's simple and fairly reliable in the majority of design scenarios. In some, mostly rare, cases, error handling and debugging capabilities aren't as mature and robust as in more tenured technologies like SSAS multidimensional or relational.
- Tabular SSAS is a version-one product. As such, it has a list of well-known design annoyances and features that could (and should) improve in the next product go-round.
- The recommended design patterns and best practices, both from Microsoft development and support teams and from members of the expert practitioner community, are still emerging.
- One model.bim file = one developer working at a time. This is a departure from the long-time SSAS project development environment, where every object (such as data sources, dimensions and cubes) was defined in an isolated file managed by the project; SSAS tabular manages all of the database objects in a single model definition file. The model designer treats the model as more of a black box than a solution comprised of independent objects. The fact remains, though, that most of the same objects we work with in multidimensional/cube projects are all defined as XML elements in this file. It can be opened and its properties manipulated, and there is a growing number of third-party tools providing enhancements. Regardless, it is one big project file that would need to be checked out in source code management as a single project asset.
- The tabular SSAS support community is thriving but still small. A core group of trusted professionals from all over the world are the loudest voices right now. They promote tabular solutions and provide active and collective support for one another.
- The DAX expression editor in PowerPivot and the Visual Studio-based SSDT designer is quirky. You have to know its strengths and weaknesses and be willing to use it in that context. It attempts to assist with code completion but, in the end, it doubles up brackets and duplicates, rather than replaces, the old code it suggests. No doubt the experience will get better as enhancements are developed and released, but we must live with a product today that is useful and reliable a lot of the time yet leaves plenty of opportunity for improvement.
- The entire tabular model must fit in memory. There's no caching, swapping or distributed processing option for large models. This means that very large models need very large hardware, and there is currently no scale-out option to distribute workloads or fail over. Got a lot of data? Get a lot of memory.
The Ugly
- After you get into serious data, with 30, 40 or 50 tables and some complexity in your model, the version-1.0 SSDT tabular model designer can be cumbersome and error-prone. I've seen it stop responding and throw errors when there were no errors. I will say that it's helpful and reliable most of the time, but on occasion, when it falls down, I often save and close my work, shut down Visual Studio altogether, and then fire it back up.
- My biggest peeve about the SSDT model designer is that all design work is performed while connected to the workspace database. This means that for every table and column rename, and for every single property setting change, the change is written to the model workspace database on the local SSAS tabular server, and only after the change has been committed is control given back to the model designer.
- Some key SSAS features are not currently supported. These include things like parent-child hierarchies, many-to-many relationships, cube actions and display folders. Actually, this is not entirely true; actions and display folders can be added using after-market tools (like BIDS Helper, available in the CodePlex library) and by hand-writing the XMLA script, but they are not currently supported through the SSDT model designer. There is simply a category of features that didn't find their way into this first version of the product. There are workarounds and methods to enable these capabilities, but they're not supported, or at least not to the same degree as they are in multidimensional SSAS.
The Beautiful
- There is no doubt that in-memory, tabular model technology is the promise of the future. It just makes sense. Several vendors have come to the same conclusion and are developing products following this paradigm. Oracle just made a big announcement about adding an in-memory column store to its upcoming 12c product.
- Data residing and processed in memory is faster than data residing on disk. This is what VertiPaq does; whether implemented as PowerPivot, an SSAS tabular model or a SQL Server columnstore, it works efficiently and elegantly without the complexities and overhead of indexes, partitions, file groups and other techniques typically used to optimize on-disk data stores.
- Even though tabular is fairly new, many useful and valuable features are supported today and work well.
- PowerPivot models upgrade seamlessly to tabular models. This provides a path for business users and IT professionals to author models in familiar tools (Excel or Visual Studio) and then promote them to a server-hosted environment.
- Tabular models are managed and stored by SQL Server Analysis Services! Although some components of the tabular engine and the designer are new and still have wrinkles to be ironed out, the core product is based on the solid and time-tested foundation of SSAS. This means that many of the features not implemented now will be available in future builds.
- Client applications that support SSAS multidimensional will also support tabular. In fact, any application built to work with SSAS cubes will natively work with PowerPivot and tabular as if it were a cube. This is because SSAS tabular uses the same data provider, which understands both MDX and DAX queries.
Disclosure: The company I work for is a Microsoft Gold Partner
Disclosure: I am a real user, and this review is based on my own experience and opinions.
BI Expert at a transportation company with 51-200 employees
Microsoft BI vs. SAP Business Objects
A quick look at this topic on other weblogs gives you a sense that most of them only cover very brief things, like the report refresh feature in BO or the cube feature in MS Analysis Services. I chose MS BI, and I want to share my reasons and opinions on why, and give you another quick but slightly deeper comparison of these two Business Intelligence platforms.
As we all know, both Business Objects and Microsoft are big companies working on BI solutions, and both have their own advantages. It isn't right to compare them simply in terms of which one is better; we have to check what our requirements are and then, based on those requirements, decide between MS and BO. A mindset like this can keep us from making religious decisions for or against a piece of software or technology.
In a BI architecture, first of all, we have the data-store level. By that I mean the storage of the raw data: not the staging area, OLAP cubes or universe data source, but the first place our data lands. It is important to know where your raw data lives and what type of storage is used to hold it. The file system, an Access or FoxPro database, a complex database solution like Oracle or SQL Server, or even a web service can be the home of our raw data. We have to check our tools against these sources and see which one gives us a smooth way to move the data through the ETL process to the destination. So let's take a look at what Business Objects gives us.
There is a Data Integration platform in Business Objects, but the problem is that you have to buy it separately, because it is not shipped with the BI system. With Microsoft SQL Server Enterprise, you have all the services and features needed for this part of the game; SSIS is the service SQL Server delivers for data extraction, integration and loading. Both products give you the ability to enhance the data quality and data cleansing portions of your integration phase, but when we get down to the details, things change a little in Microsoft's favor: the ability to use your .NET knowledge to write the complex parts of an ETL process gives you more room to think and do whatever you want, while on the BO side everything always looks simple and it's really not easy to handle complex situations. There are advantages and disadvantages to this. You can do many things with .NET code, but it can add complexity to your development, so you have to judge your own situation. If things look normal, either product could fit your needs; but if the situation is not stable and you have to be ready for future changes, it's better to take the power of SSIS and spend a little more development time today to create a powerful and easily changeable mechanism that will help you in the future. You can also do that with Business Objects Data Integration, but you will spend more money on the development and modification of ETL processes, because development cost in Business Objects solutions is always a nightmare for a project.
At this point we have a brief understanding of the ETL differences between the two vendors, so it's a good time to look back at the source database. Here is a very quick answer: if you mostly use MS products to store your transactional data, then take the decision and move to MS for a robust and compatible BI platform. Business Objects doesn't have a database system of its own; it has always used other database solutions to store data for its universes.
So guess what happens: from an administrator's perspective, performance tuning is somewhat problematic, because with other database systems underneath we have to use different techniques for each one. This is one of the areas where MS wins the competition, because when you use Microsoft platforms there are plenty of joint mechanisms for performance considerations.
Before SQL Server 2012, we had SSAS with its famous aggregated cubes. Because of the nature of SSAS in previous versions, we couldn't call it a semantic layer, and here is a little of the why. A semantic layer provides a translation between the underlying data store and business-level language (the business semantics that business users are familiar with). There was no actual translation in previous releases of SSAS, and a business user could have some difficulty understanding it. So in SSAS 2012 Microsoft changed its approach, from delivering a complex solution that end-users struggled to understand to a true semantic layer, like the one we have in Business Objects, called the Universe. From now on, MS BI users can use a powerful toolset like Microsoft Excel, with their existing knowledge, to interact with the semantic layer. What Microsoft does in the backyard is create aggregations in memory, so the performance of this approach is really high! I don't want to deep-dive into what Microsoft does in the backyard in this post; it will be one of my next topics (sounds like an advertisement, I know).
I talked about aggregations, so know that in BO there is no facility for aggregation tables; you have to work with DBAs to create aggregation tables manually and integrate them into the Universe.
One of the important aspects of a BI system is the learning curve of the solution. It was always the slogan of Business Objects that the learning curve is very low, and yes, for end-users it is not hard to interact with a Universe. BUT, and this is the problem of every BI platform that delivers a semantic layer, from Microsoft to BO or Cognos: it is very easy for a user to get the wrong answer, because everything sits behind the Universe or Semantic Model, and tracing from a report back to the base data is a non-trivial task. So be careful about letting users create whatever they want with their own knowledge; there should always be an IT professional observing the whole process. Never think of a fully out-of-the-box solution, because you will shortly find it on Mars, or your users may get the chance to take decisions based on wrong calculations and find their way to Mars again.
Another important aspect of a BI system is its cost. About Business Objects we can definitely say that it is expensive, and for sure Microsoft can be expensive too, but how can we decide? The answer is to compare the detailed parts; there are four main parts: the database, ETL, the semantic layer, and the reporting or user-interaction layer. If you choose to go with BO, you have to find heads for your data warehouse and database solution; Java, Tomcat or other J2EE professionals for the ETL and development phases; and BO-specific heads for Universe modeling, design and implementation. You will probably also need security administration; and if you want to integrate your Active Directory with this platform it is problematic, while integrating with other LDAP platforms is a nightmare, so be aware of these costs. The point of the Microsoft solution is that we can use our in-house knowledge, like .NET, SQL Server, SharePoint and Windows Server, and this knowledge is transferable to other skills. With BO, we need headcount dedicated to BO (Universe design, implementation, maintenance, security), and since BO skills are not transferable to other skills, those extra heads blow the project's budget. The Microsoft BI platform is a more manageable, more secure and less expensive solution; I see BO as a consultant's dream, an endless font of billable hours.
Conclusion
I decided to go with the Microsoft BI platform, but I would not suggest out of hand that anyone choose Microsoft. It really depends on the nature and scale of the project and on what you have done and which technologies you have used in the past. Still, a quick look gives the idea that Microsoft's platform looks more robust and coherent across its different parts, so it can be a very good and convenient choice, and perhaps after the release of SQL Server 2012 and its BI Semantic Model the answer is easier and more defensible than before.
I would also like to hear about your experience with either of these solutions.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
In my experience, one area that can get left behind is the distribution of reports. I've yet to find an instance where a company doesn't need to share information with an external party. Having an environment where distribution mechanisms are managed in a single place reduces risk. Here again we find that the MS stack can place more restrictions on end users.
Head of Internal Control at a tech company with 51-200 employees
Intuitive design, good support, and reliable
Pros and Cons
- "The solution is very intuitive, you do not need to have too much programming knowledge to use it. Advanced Excel users can use it very easily."
- "The solution could improve by providing more free visuals. A lot of them are very nice but you have to purchase them."
What is our primary use case?
We are using Microsoft BI for various reports, such as overall cost management, credit overview, and sales.
What is most valuable?
The solution is very intuitive, you do not need to have too much programming knowledge to use it. Advanced Excel users can use it very easily.
What needs improvement?
The solution could improve by providing more free visuals. A lot of them are very nice but you have to purchase them.
For how long have I used the solution?
I have been using Microsoft BI for approximately four years.
What do I think about the stability of the solution?
The solution has been stable. Everything has been running smoothly.
What do I think about the scalability of the solution?
Microsoft BI is scalable.
How are customer service and support?
The support has been good.
Which solution did I use previously and why did I switch?
We previously used Cognos.
What about the implementation team?
We used a consultant for the implementation of the solution.
What's my experience with pricing, setup cost, and licensing?
The solution is affordable.
What other advice do I have?
I rate Microsoft BI a nine out of ten.
Which deployment model are you using for this solution?
Private Cloud
Disclosure: My company has a business relationship with this vendor other than being a customer: partner
Solutions Architect at a tech services company with 11-50 employees
Easy to learn with good data preparation features and good stability
Pros and Cons
- "As it was my first time using Power BI, I can say that it was fairly easy to learn, especially if you already know BI-type tools."
- "The one thing that I noticed specifically was the graphical features, and some of the analytical features. They were stronger on the Tableau side."
What is our primary use case?
There were two systems that the client was looking at creating some BI visualizations for. There was a system called ICE, which takes in automated calls, for service calls. Calls go into voicemail if there's not an agent to take the call, and the system tracks statistics like how long a client was on hold, whether an agent actually took the call, whether the person hung up, et cetera. There are all of these different statuses, and it captures all of those statistics. The product was used primarily to take in that data.
The other system was an ITSM system, which was the ticketing system. The idea was to combine those two data sets and produce a merged analytical set of reports showing when the call volumes occurred, what the performance rate was, the nature of the calls, and things like that.
What is most valuable?
As it was my first time using Power BI, I can say that it was fairly easy to learn, especially if you already know BI-type tools.
The data preparation features were great. I much preferred the Power BI with the Power Query type features - at least for the datasets that we were working with. They were helpful in preparing the data.
What needs improvement?
The one thing that I noticed specifically was the graphical features, and some of the analytical features. They were stronger on the Tableau side.
There are two areas where the solution could be enhanced. One is natural language. You can get it through third parties, however, it would be nice to have it built in within Power BI, where a graph could be automatically explained in whatever language, so that somebody can just read through the explanation and have the graphic as an extra for exploring. That's something I was looking at.
The other is explain data, which is more of a lineage-type feature. My understanding is that Tableau has this feature and Power BI doesn't. I was looking at a comparison of graphs and noticed Tableau had this explain-data feature, which gives some lineage on where data has come from; Power BI doesn't have that, or, if it does, it's not as well developed as Tableau's.
For how long have I used the solution?
I used the solution a few months ago. I'm semi-retired, however, I did have a two-month contract and did some work with it. On the project, I did work a lot with Tableau, however, I also used Power BI, and would like to continue to use it more.
What do I think about the stability of the solution?
It was a short project on the Power BI side. I didn't really identify any glitches. Certainly, over time things might have come up, however, I enjoy experimenting and playing around with things. The one definite thing that I noticed, as I do art with Tableau, some really neat art stuff, is that when I tried to replicate it with Power BI, the quality just wasn't there.
What do I think about the scalability of the solution?
With ITSM, there's a lot of data that they collect on that side. That could be an issue, scalability. It wasn't moved to production yet. I basically produced a report as there were some issues with the ICE data, the voicemail system, and data collection. They had some issues that they had to solve. However, they were collecting data on a daily basis. You get thousands and thousands of calls on a daily basis. It depends on how they want to store that data historically. There may be some performance issues and scalability issues, however, for me, I can't really address that, as I didn't see that side. There's always a potential for trouble.
How are customer service and technical support?
It was a really short project for me. I was really there just to analyze those two data sources and produce a report for them, with some examples, using Power BI. I was able to do that without getting tech support involved.
Which solution did I use previously and why did I switch?
I've used other solutions, such as Tableau and Power Query. Tableau from a graphical point of view is much better than Power BI, however, from a data manipulation standpoint and things like that, I prefer Power BI and Power Query to prep the data.
Tableau got bought out by Salesforce, which is a cloud-based company, so they're all going to push everything to the cloud. I'm sure they'll incorporate all these new features.
How was the initial setup?
I downloaded the solution from their website.
The server side was already set up; therefore, I don't know much about the initial setup process. As a contractor, I came in just on the development side, which was just downloading the Power BI tool as a client. It was fairly easy. That's the nice thing: it's easy to set up. It is similar to Tableau, in that Tableau is also fairly easy to set up. It is all built towards this whole idea of self-service.
What other advice do I have?
I'm just a customer.
As I downloaded the solution right off the website, I assume it was the latest version. I don't know the version number off-hand.
I would do the development on-premises.
It's easy to set up, however, regarding scalability and high-end analytical features, Tableau is still better on that. It's always important to keep up, as technology's changing all the time. They're always adding new features. There are no negatives in exploring these tools. In the end, you just want some visualizations that help you make decisions. Both Tableau and Power BI tools are great for that. However, if you want advanced stuff, then it's just about experimenting and seeing what will help you solve your problems.
I'd rate the solution at an eight out of ten.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Entrepreneurs who run small businesses have another reason to smile, or to keep smiling. However, doesn't it seem like other clients will be locked out of using this great product in the making? My reasoning: Power BI is designed for compatibility with the Azure and Office 365 platforms, and there are many businesses across the globe that use platforms other than these two. Does that mean they will be locked out due to compatibility issues? If so, the platforms Power BI supports will limit its use to some extent, making this a con.