Business Application Group Manager at OBASE
Real User
Scalable with various licensing options and offers good technical support
Pros and Cons
  • "Technical support is quite good."
  • "The licensing needs improvement. There needs to be a middle option between Pro and Premium versions."

What is our primary use case?

We primarily sell Power BI licenses. 

Mostly, we focus on the retail industry. Usually, we use Power BI reports for dashboards. We also provide our customers with some data warehouse reporting. Basically, we do long-term categorization of stock and inventory numbers and sales figures so that customers can compare sales and stock against inventory numbers.

We tend to deal with physical inventory reports. We tend not to deal with the solution for financial operations.

What is most valuable?

The solution has different licensing tiers.

The product can scale if you need it to.

Technical support is quite good.

With the cloud deployment, there's no setup required.

The product works well for small or mid-level organizations.

What needs improvement?

The solution's stability could be improved. In the retail industry, issues with information updating and data uploading mean clients may end up pricing items incorrectly, as they could accidentally base decisions on old or wrong data.

The licensing needs improvement. There needs to be a middle option between Pro and Premium versions. It could also be less expensive.

For how long have I used the solution?

While we have experience with Microsoft tools for the past ten years, I personally have had experience with Microsoft BI for the past two or so years.

Buyer's Guide
Microsoft Power BI
November 2024
Learn what your peers think about Microsoft Power BI. Get advice and tips from experienced pros sharing their opinions. Updated: November 2024.
823,875 professionals have used our research since 2012.

What do I think about the stability of the solution?

The solution is not really stable. I've faced a lot of limitations, especially in terms of some dataflow updates. We have faced some exceptions. Right now, we have created a case with Microsoft in order to solve some problems we've been having, as there is currently some sort of system fault.

While the platform is a little more stable now, the big problem is the latency issues around customer updates.

In a recent project in Sweden, we faced an exception in Power BI on the update path. Whenever we refreshed the dimensions and the fact tables, we got a message saying we couldn't upload the data. After 20 minutes we tried to refresh again. We didn't change anything in our reports or platform, or anything related to infrastructure or schema. We didn't touch anything. And yet, the data updated successfully. That leads us to ask: what was the problem 20 minutes ago? We still don't know the answer.

That is why we need more stability in the update service. We make business decisions based on this data. If the data is not up to date, we may take the wrong path.

This is very important in the retail industry. For example, in the grocery field, with vegetables, fruits, or other products, retailers need to decide, based on demand, whether to increase or lower prices on short notice. They can't make the correct decisions if the numbers don't update.

What do I think about the scalability of the solution?

The scalability potential actually depends on your license. Microsoft provides three licenses: one is absolutely free, another is called Pro, and the third is Premium (or Premium per user). There is a huge difference between the Pro license and the Premium license. There is no need to scale the Pro license or the other license models.

When you run reports on-premises, you can scale out or scale up, depending on your existing resources. However, the software-as-a-service (SaaS) version comes with Microsoft's capacity units and Microsoft resources.

How are customer service and support?

We have an ASFP agreement with Microsoft. As a Gold Partner, we find Microsoft pretty good at responding and helping us resolve issues.

If a partner does not have an ASFP agreement, they tend to have to go to forums or try to Google answers to their problems.

We are aware that sometimes when we make changes to BI, the solutions might be complex. There might be SQL involved, which needs to be taken into account, for example. We may have to look into other resources and other tools to see if there are interactions that are the issue. However, Microsoft is quite helpful as we work through issues. We're quite satisfied with their level of support.

Which solution did I use previously and why did I switch?

We also work with and sell MicroStrategy in Turkey; 95% of the MicroStrategy projects there have been completed by our company. It is a direct competitor to Power BI. My company commonly does consulting for small and medium markets, and for those we recommend Power BI. Whenever we talk about enterprise-level solutions, the product we use is MicroStrategy.

How was the initial setup?

There isn't much of an installation process, as the solution is on the cloud.

What's my experience with pricing, setup cost, and licensing?

There is a free license; however, it is limited. The Premium license and the Pro license functionalities are very different. Whenever we talk with our customers, they ask, "Should we choose Premium?" We say, "No, you don't need to, as it's too expensive; it is much more expensive than the Pro license." However, the Pro license functionality is not enough for some customers. There is nothing in the middle between Pro and Premium. We need something in between.

The Pro license may be suitable for SMBs (small and medium-sized businesses). The Premium version is ideal for companies that need to scale up and out.

There also may be some additional costs that can drive up the price.

What other advice do I have?

I prefer this product and suggest Power BI to all customers. However, if Power BI is a new idea for a company, we make sure to demo with company-specific data and analytics to show how quickly the solution can analyze everything. That said, it's not for everybody. If a company tried to put all of its reporting expectations into Power BI, it would not meet them.

It's not a standalone solution. You need other items in your toolbelt. You need, for example, something that can handle raw data, you need warehousing, et cetera.

In general, I would rate the solution eight out of ten.

Which deployment model are you using for this solution?

Public Cloud
Disclosure: My company has a business relationship with this vendor other than being a customer: Reseller
PeerSpot user
BI Consultant at a manufacturing company with 1,001-5,000 employees
Real User
Easy to use, good inquiry and visual gallery features, and the community is strong
Pros and Cons
  • "This is a self-service BI tool, so it doesn't require a huge amount of technical training, knowledge, or a background in BI in order to induct new people into it."
  • "A lot of people work on Excel, so being able to export an Excel file as part of a subscription would be an improvement."

What is our primary use case?

We primarily use Power BI for data science solutions.

How has it helped my organization?

This is a self-service BI tool, so it doesn't require a huge amount of technical training, knowledge, or a background in BI in order to induct new people into it. This is true even for business people who want to do a little bit of self-service. The whole thing has become quite simplified, as compared to what it was.

What is most valuable?

There are many good features in this product.

The most valuable features are a combination of the DAX and the inquiry capability, plus the visual gallery that we have. I think that these three things, or the combination of them, is the best that we have found.

What needs improvement?

I would like to have the capability to schedule exports in different versions. For example, if I want to schedule a subscription and attach an Excel file, we don't have this capability right now. A lot of people work on Excel, so being able to export an Excel file as part of a subscription would be an improvement.
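Until such a subscription option exists, the attachment can be produced outside Power BI. The sketch below is a hypothetical workaround (all names are illustrative, and it assumes the report rows have already been extracted from the dataset): it serializes the rows to a CSV file, which Excel opens directly, and attaches it to a mail message that a scheduler could send.

```python
import csv
import io
from email.message import EmailMessage

def build_report_email(rows, to_addr, filename="report.csv"):
    """Build a mail message with the rows attached as CSV (opens in Excel)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)

    msg = EmailMessage()
    msg["Subject"] = "Scheduled report"
    msg["To"] = to_addr
    msg.set_content("The scheduled report is attached.")
    msg.add_attachment(buf.getvalue().encode("utf-8"),
                       maintype="text", subtype="csv", filename=filename)
    return msg

msg = build_report_email([{"region": "North", "sales": 100}], "manager@example.com")
```

A real deployment would hand `msg` to `smtplib.SMTP.send_message` on a schedule; that part is omitted here.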

For how long have I used the solution?

We have been working with Power BI for more than five years.

What do I think about the stability of the solution?

Overall, it is pretty stable. The cloud version is definitely very stable, and I have found very few flaws or areas for improvement. We have been using it for more than five years and we have every intention to continue with it.

What do I think about the scalability of the solution?

It has been pretty nice for us, in terms of scalability. There hasn't really been a problem, as such. We have approximately 300 users.

How are customer service and technical support?

Technical support from Microsoft gets a big thumbs up from me. They have blogs, a support page, and the community is very active. Most of the answers we need, we find them there. 

With this amount of information available, we don't need to use technical support extensively. When we do post something on the blog or in the community, the response that we get is pretty prompt.

The last time we utilized technical support was when we were investigating creating a subscription with an Excel file attached to it. We found out that it isn't yet possible with the current version.

How was the initial setup?

The initial setup is straightforward, to say the least. It took just a few days to deploy.

What's my experience with pricing, setup cost, and licensing?

The pricing is very reasonable compared to any other tool.

What other advice do I have?

My advice for anybody who is considering this tool is that if you want to move to self-service BI, this is probably the best tool available right now. The self-service is pretty good in terms of user acceptance, usage, and simplicity. It is something that everybody wants, and Power BI does really well in that area.

The biggest lesson that I have learned from using Power BI is that if the data is not according to what you want, that really isn't a problem. It has been a problem in the past, especially where the data is not in the form or the shape that you need, but with Power BI, that problem can be solved very easily.

I would rate this solution an eight out of ten.

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Microsoft Azure
Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
Consultant at a consultancy with 11-50 employees
Consultant
Valuable dashboard and automated reporting features but should have better integration with RPA and Tableau platforms
Pros and Cons
  • "There was a lot of manual work involved with Excel, whereas once we moved on to Microsoft Power BI, it was a cleaner dashboard and it was faster too."
  • "I think because I'm moving more into RPA, I'd definitely like to see integration across the RPA, Power BI, or Tableau platforms, because that integration could then make automation and mundane reporting much faster."

What is our primary use case?

The primary use case was mainly to create dashboards and present them for reporting purposes. We use Microsoft Power BI to create dashboards, provide insights, and use its various analytical and insight functions to send to either the departmental manager or the general manager of marketing and sales or retail ops.

How has it helped my organization?

In short, we had to create sales reports. The sales reports needed to be emailed to the different sales managers all across New Zealand. The Excel file data had to be filtered manually, whereas in Power BI you can set restrictions. For example, if you have North, South, East, and West branches, you can set corresponding restrictions while still allowing the national sales manager to view the entire data set without having to do a lot of filtering.

It is time-saving for the person who creates the report and also a lot of that is automated, whereas with Excel you've got to keep working on it and keep filtering it all the time.
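The effect of those restrictions can be pictured with a small sketch. The role and branch names below are made up, and this only simulates what Power BI row-level security does declaratively: each regional role sees only its own rows, while the national role sees everything.

```python
# Hypothetical sales rows and a role-to-branch mapping (None = unrestricted).
SALES = [
    {"branch": "North", "amount": 100},
    {"branch": "South", "amount": 200},
    {"branch": "East",  "amount": 50},
    {"branch": "West",  "amount": 75},
]
ROLES = {
    "north_manager":    {"North"},
    "national_manager": None,
}

def visible_rows(role):
    """Return only the rows the given role is allowed to see."""
    allowed = ROLES[role]
    return [r for r in SALES if allowed is None or r["branch"] in allowed]

print(len(visible_rows("north_manager")))     # 1
print(len(visible_rows("national_manager")))  # 4
```

In Power BI itself this filtering is defined once as a role with a DAX filter rule, so no report author has to remember to apply it manually.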

What is most valuable?

Our most valuable features are the dashboard and the reporting feature because, before that, our organization used to only use Excel. There was a lot of manual work involved with Excel, whereas once we moved on to Microsoft Power BI, it was a cleaner dashboard and it was faster too. The end user also had an opportunity to alter the reporting as per requirements with Power BI.

What needs improvement?

I haven't experienced any issues with it right now, so I can't really advise on that part.

I think because I'm moving more into RPA, I'd definitely like to see integration across the RPA, Power BI, or Tableau platforms because that integration could then make automation and mundane reporting much faster.

You'll always require some more complex reports that the analyst could work on, but the very standard sales reports could then be automated very quickly using Power BI and RPA together.

For how long have I used the solution?

I've been using this solution for one year.

What do I think about the stability of the solution?

Stability is pretty good. I've been using other services from Microsoft and the support also has been really good so it's a very stable platform and the support services are excellent.

What do I think about the scalability of the solution?

I've been using it for a year and it's only across my single department so I've not yet had to scale it to a very large level.

Across my department, there are two users. Across other departments at my company, there are many more. Most users are commercial or insights analysts.

Across our organization, Power BI is not extensively used, although I'm not privy to every department and their plans. I think it's being used moderately, but could be used more.

How are customer service and technical support?

With Microsoft Power BI, if there is ever a query, you can find support online; there's a lot of help available. What I generally meant is that I have been using Microsoft Office products through an Office 365 business subscription, and Microsoft's support in general has always been really good. They usually come back the very same day and try to solve the problem. This is not something I've experienced with other platforms in the past.

Which solution did I use previously and why did I switch?

We hadn't previously used another solution. We were just using Excel and decided to go to a much better platform.

How was the initial setup?

The setup was straightforward for me because as part of an organization, I didn't have to set it up myself. It was set up via the organization, through the IT department, so it was pretty straightforward.

What about the implementation team?

I adopted it in my team, but it was already being implemented across other parts of the company. I just brought it into my department, so I wouldn't be able to tell the time frame for the actual implementation strategy. For my team alone, it probably took me around two months or so.

Which other solutions did I evaluate?

The other departments in my company were using Power BI, so it would not make sense to get another solution when they already had Power BI in other parts of the organization.

What other advice do I have?

Microsoft Power BI is definitely a good product, and Microsoft has good integration across a good suite of products. If you take UiPath's recent release, for example, I'm not sure it integrates with Power BI yet, but it definitely integrates across the Microsoft Office applications, so I think it's only a matter of time before the majority of Microsoft Office products start integrating with UiPath. That would take analysis, reporting, and automation to the next level, and we could free up time for the general reporter.

I would rate this solution as seven out of ten because it's easy to use and there is a lot of support online. I think Microsoft Office itself has a lot of support being provided. In addition, I like its features and it's growing much faster than some of the other analytical tools out there, although it started off later.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
Business Intelligence Architect at a tech services company
Real User
Data integration enabled my organization to easily pool data from our varied systems into one data warehouse.
Pros and Cons
  • "Data integration enabled my organization to easily pool data from our varied systems into one data warehouse."
  • "SSMS & SSRS."

What is most valuable?

Power Query, Power Pivot, Power View, and SQL Server Data Tools, comprising SSIS, SSAS, and SSRS.

How has it helped my organization?

Data integration enabled my organization to easily pool data from our varied systems into one data warehouse.

SSRS improved our report delivery timings and the Power BI feature empowered the data users.

Power View improved the visualization of our reporting and dashboards.

What needs improvement?

SSMS & SSRS.

For how long have I used the solution?

For over five years.

What do I think about the stability of the solution?

Yes.

What do I think about the scalability of the solution?

Yes: the SSRS reporting portal cannot run on tablets, iPads, and other mobile devices. However, SQL 2016 ships with this capability.

How are customer service and technical support?

Microsoft has a lot of products, so some are given little attention, and VS is one of them. I would say their technical support for the BI suite is average.

Which solution did I use previously and why did I switch?

No.

How was the initial setup?

The initial setup was fairly simple and straightforward, I must say.

What's my experience with pricing, setup cost, and licensing?

To others, I would advise that Microsoft BI products are rightly priced and the licensing is easy to follow and to fulfill.

Which other solutions did I evaluate?

Yes, Tableau.

What other advice do I have?

For users intending to use the product in an exclusively Microsoft environment, this product will meet most, if not all, of your requirements.

If there is a need for integration with other vendors' products, precautionary measures must be taken.

SQL 2016 brings support for mobile BI, so it would be great to give it a try.

If you are comfortable with MS Excel spreadsheets, then Power BI is your thing, as it has the same look and feel as Excel; in fact, it is embedded in Excel.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
Senior Consultant | Architect at DHL
Consultant
We have the ability to connect separate services like CRM, NAV, Facebook, Exchange, etc. and make data integration and transformation based on these data connections in a few clicks.

What is most valuable?

Firstly, there was the wow effect when I saw the number of connectors available for this product/service. It is quite fantastic to have the ability to connect separate services like CRM, NAV, Facebook, Exchange, etc., and to do data integration and transformation based on these data connections in a few clicks. The ability to download new visuals is also quite nice. The effect of new graphics on higher management is magical, and this is good not only for a pre-sale, sale, or up-sale, but also for making a good impression. Another great feature is, for example, collecting data from a specific folder on disk: if there are, say, 100 CSV files from other information systems, they can be automatically merged, analyzed, and transformed into a great graphical report.

How has it helped my organization?

Because the product is based on the ability to quickly produce BI insights and reports, it's really useful on a daily basis. Another improvement is definitely the possibility of browsing dashboards on a smartphone using an app.

What needs improvement?

Definitely the connectors: there is always room for improvement in their configuration, and in the number of connectors available. Also, the graphic design of the reports needs work, as the layouts are still very strictly defined, and with the amount of output there isn't room for much design customization.

For how long have I used the solution?

I have used this solution since it was released (a few years ago), and I use it on a daily basis at work and at university, where I work as a researcher and am a PhD student. I use it on mobile, laptop, and tablet as well. It has its own app for viewing reports and dashboards, and another for creating and editing them.

What was my experience with deployment of the solution?

There have been no issues with the deployment.

What do I think about the stability of the solution?

There was no issue with stability.

What do I think about the scalability of the solution?

We have been able to scale it for our needs.

Which solution did I use previously and why did I switch?

Personally, I used Excel Services/Visio Services.

How was the initial setup?

Really straightforward. The most complex part is defining the data gateways between the cloud part and the on-premises part of the infrastructure. Another bigger task is defining the security model for all the datasets and reports, with the correct audience, data refresh, etc.

What was our ROI?

It's too soon to calculate. I think that for a correct ROI value, you need to have it in place for more than three years. But it saves a lot of time, and not only developers' time but also management time. Things are easier when there are functionalities like "quick insights," which auto-generates insights from your data using machine-learning algorithms, or Q&A, which lets you use natural query language.

Which other solutions did I evaluate?

I played with Tableau. I see many similarities to other products in Power BI, so there is no reason to combine many different vendors/third parties to build such a complex BI solution.

What other advice do I have?

It's a tool for a new kind of business intelligence from Microsoft: a tool for quickly modeling data structures and for visualizing almost everything you have in mind at the moment. There is still a lot of room for improvement, and there is also huge space for new functionalities. But it's a simple and great tool for everyday use.

You should get it and implement it. However, start with a trial version, and contact a partner who can provide a sales presentation with a live session (a CIE, for example) and show what the possibilities of Power BI are.


Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user7845 - PeerSpot reviewer
Consultant at a tech consulting company with 501-1,000 employees
Consultant
My 30 tips for building a Microsoft BI solution, Part V: Tips 21-25
I might just get all 30 done before summer vacation :)

#21: Avoid using discretization buckets for your dimension attributes

Discretization buckets let you group numerical attributes into ranges. Say you have a customer dimension that includes the age of the customer; you can use this feature to group customers into age clusters such as 0-5, 6-10, and so on. While you can tweak how the algorithm creates groups, and even provide naming templates for them, you still have relatively limited control. Worst-case scenario: a grouping that is referenced in a report is removed or changed by the algorithm. A better way of grouping these attributes is to do it yourself, either in the data source view or in a view in the database (there will be a separate tip on this). This way you have complete control over the distribution of values into groups and over the naming of the groups.
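In a database view that hand-rolled grouping would be a CASE expression; the Python sketch below (the bucket boundaries and labels are made up) shows the equivalent logic. Because the boundaries and names are fixed by you, no algorithm can rename or drop a group that a report depends on.

```python
# Explicit, hand-defined age buckets: (low, high, label). Hypothetical ranges.
AGE_BUCKETS = [(0, 5, "0-5"), (6, 10, "6-10"), (11, 17, "11-17"), (18, 200, "18+")]

def age_group(age):
    """Map an age to its fixed, report-safe group name."""
    for low, high, label in AGE_BUCKETS:
        if low <= age <= high:
            return label
    return "Unknown"

print(age_group(4))   # 0-5
print(age_group(30))  # 18+
```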

#22: Do not build a SSAS solution directly on top of your source system

SSAS has a couple of features that enable it to source data directly from the normalized data model typically found in business applications such as ERP systems. For instance, you can "fake" a star schema through queries in the data source view. You can also use proactive caching to eliminate any ETL needed to populate your cube with data. This all sounds very tempting, but unfortunately I have never seen it work in reality. Unless you are working with a very small source system with impeccable data quality and few simultaneous users, you should avoid the temptation, for all the usual reasons: proactive caching will stress your source system, data quality will most likely be an issue, integrating new data sources will be nearly impossible, etc. There is a reason BI projects spend 70-80% of their time modelling and integrating data.

#23: Deploy SSAS cubes with the deployment tool

If you are working with multiple environments (dev/test/prod), do not use the deployment functionality of Visual Studio to deploy to another environment. This will overwrite partitions and roles that may differ between the environments. Use the Deployment Wizard instead.

#24: Remember that your SSAS cubes are a single point of failure

Keep in mind that most client tools do not cope well with changes to SSAS data models. Any renames or removals you do in the model will most likely cause clients that reference those entities to fail. Make sure you test all your reports against the changed model before deploying it to production. Also, if you allow ad-hoc access to your SSAS solution be aware that users may have created reports that you do not know about. Query logging may help you a little here (it gives you an indication of which attribute hierarchies are in use). The best way to avoid all of this is to thoughtfully design your cube and the naming of your SSAS objects so that there is no need to change or remove anything in the first place.

#25: Avoid “real time”

“Real time” means different things to different people. Some interpret it as “simultaneous to an event occurring,” while others have more leeway and various levels of tolerance for delays. I prefer the term “latency”: how old can the data in the BI solution get before it needs to be refreshed? The lowest latency I have ever implemented is two hours. That is hours, not minutes. I know this does not sound very impressive, but it is honestly the best I have been able to do at a reasonable cost. When doing “real time” you need to consider a lot of factors: partitioning, changes to dimensions, ROLAP vs. MOLAP / DirectQuery vs. xVelocity, source system access, how to administer it, etc. These things add up quickly to the point where the value simply does not justify the cost.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user7845 - PeerSpot reviewer
Consultant at a tech consulting company with 501-1,000 employees
Consultant
My 30 tips for building a Microsoft BI solution, Part IV: Tips 16-20
A note about the SSAS tips: Most tips are valid for both dimensional and tabular models. I try to note where they are not.

#16: Implement reporting dimensions in your SSAS solution

Reporting dimensions are constructs you use to make the data model more flexible for reporting purposes. They usually also simplify the management and implementation of common calculation scenarios. Here are two examples:

  • A common request from users is the ability to select which measure to display for a given report in Excel through a normal filter. This is not possible with normal measures/calculations. The solution is to create a measure dimension with one member for each measure. Expose a single measure in your measure group (I frequently use “Value”) and assign the correct measure to it in your MDX script / DAX calculation based on the member selected in the measure dimension. The most frequently used measure should be the default member for this dimension. By doing this you not only give users what they want, but you also simplify a lot of calculation logic, as in the next example.
  • Almost all data models require various date-related calculations such as year to date, same period last year, etc. It is not uncommon to have more than thirty such calculations. To manage this effectively, create a separate date calculation dimension with one member for each calculation, and base your time calculations on what is selected in that dimension. If you implemented the construct in the previous example, this can be done generically for all measures in your measure dimension. For tabular models this can be done in DAX; for dimensional models, use the time intelligence wizard to get you started.
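The mechanics of the measure-dimension pattern can be sketched outside MDX/DAX. In the hypothetical Python below, a single exposed measure called "Value" resolves to a concrete calculation based on the selected member of the measure dimension, with the most frequently used measure as the default member; the names and formulas are illustrative only.

```python
# Hypothetical measure dimension: member name -> calculation over a fact row.
MEASURES = {
    "Sales Amount": lambda row: row["qty"] * row["price"],
    "Quantity":     lambda row: row["qty"],
}
DEFAULT_MEMBER = "Sales Amount"  # most frequently used measure

def value(rows, selected_member=DEFAULT_MEMBER):
    """The single exposed 'Value' measure: resolved by the selected member."""
    calc = MEASURES[selected_member]
    return sum(calc(r) for r in rows)

rows = [{"qty": 2, "price": 10.0}, {"qty": 1, "price": 5.0}]
print(value(rows))                # 25.0 (default member: Sales Amount)
print(value(rows, "Quantity"))    # 3
```

The payoff is the same as in the cube: any generic logic (such as the date calculations above) is written once against "Value" and works for every member of the measure dimension.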

#17: Consider creating separate ad-hoc and reporting cubes

Analysis Services data models can become very complex. Fifteen to twenty dimensions connected to five to ten fact tables is not uncommon. Additionally, various analysis and reporting constructs (such as a time calculation dimension) can make a model difficult for end users to understand. There are a couple of features that help reduce this complexity, such as perspectives, role security, and default members (at least for dimensional), but often the complexity is so ingrained in the model that it is difficult to simplify by just hiding measures/attributes/dimensions from users. This is especially true if you use a “reporting cube,” which I talked about in tip #16. You also need to consider the performance aspect of exposing a large, complex model to end-user ad-hoc queries. This can go very wrong, very quickly. So my advice is to consider creating a separate model for end users to query directly. This model may reduce complexity in a variety of ways:

  • Coarser grain (Ex: Monthly numbers not daily).
  • Less data (Ex: Only last two years, not since the beginning of time).
  • Fewer dimensions and facts.
  • Be targeted at a specific business process (use perspectives if this is the only thing you need).
  • Simpler or omitted reporting dimensions.

Ideally your ad-hoc model should run on its own hardware. Obviously this will add both investment and operational costs to your project but will be well worth it when the alternative is an unresponsive model.

#18: Learn .NET

A surprisingly high number of BI consultants I have met over the years do not know how to write code. I am not talking about HTML or SQL here, but “real” code in a programming language. While we mostly use graphical interfaces when we build BI solutions, the underlying logic is still based on programming principles. If you don’t get these, you will be far less productive with the graphical toolset. More importantly, .NET is widely used in Microsoft-based solutions as “glue,” or to extend the functionality of the core products. This is especially true for SSIS projects, where you quite frequently have to implement logic in scripts written in C# or VB.NET, but it also applies to most components in the MS BI stack. They all have rich APIs that can be used to extend their functionality and integrate them into solutions.

#19: Design your solution to utilize Data Quality Services

I have yet to encounter an organization where data quality has not been an issue. Even if you have a single data source, you will probably run into problems with data quality. Data quality is a complex subject: it's expensive to monitor and expensive to fix, so you might as well be proactive from the get-go. Data Quality Services is available in the BI and Enterprise editions of SQL Server. It allows you to define rules for data quality and monitor your data for conformance to those rules. It even comes with SSIS components, so you can integrate it with your overall ETL process. Include this in the design stage of your ETL solution, because implementing it in hindsight will be quite costly, as it directly affects the data flow of your solution.
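The kind of rule DQS lets you define can be pictured as a set of predicates checked during ETL. The sketch below is illustrative only (the field names and rules are made up, and this is not the DQS API): each rule flags non-conforming rows before they reach the warehouse.

```python
# Hypothetical data-quality rules: rule name -> predicate over a row.
RULES = {
    "email_present": lambda r: bool(r.get("email")),
    "age_in_range":  lambda r: 0 <= r.get("age", -1) <= 120,
}

def screen(rows):
    """Return (row_index, rule_name) pairs for every rule a row violates."""
    failures = []
    for i, row in enumerate(rows):
        for name, check in RULES.items():
            if not check(row):
                failures.append((i, name))
    return failures

rows = [{"email": "a@b.com", "age": 30}, {"email": "", "age": 150}]
print(screen(rows))  # [(1, 'email_present'), (1, 'age_in_range')]
```

Whether the failures are logged, quarantined, or block the load is exactly the kind of policy decision that should be made at design time, not retrofitted.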

#20: Avoid SSAS unknown members

Aside from the slight overhead they cause during processing, having unknown members means that your underlying data has issues. Fix them at the source, not in the data model.
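The rows that would fall under the unknown member are typically fact rows whose keys have no match in a dimension. A query along these lines, with hypothetical table and column names, surfaces them so they can be fixed at the source:

```sql
-- Hypothetical names: find fact rows whose ProductKey has no match
-- in the dimension; these are the rows that would end up under the
-- SSAS unknown member.
SELECT f.ProductKey, COUNT(*) AS OrphanRows
FROM dbo.FactSales AS f
LEFT JOIN dbo.DimProduct AS d
    ON d.ProductKey = f.ProductKey
WHERE d.ProductKey IS NULL
GROUP BY f.ProductKey;
```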

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user7845 - PeerSpot reviewer
Consultant at a tech consulting company with 501-1,000 employees
Consultant
My 30 tips for building a Microsoft BI solution, Part II: Tips 6-10

#6: Use a framework for your Integration Services solution(s) because data is evil

I know how it is. You may have started your ETL project using the SQL Server import / export wizard, or you may have done a point integration of a couple of tables through data tools. You might even have built an entire solution from the ground up and been pretty sure that you thought of everything. You most likely have not. Data is a tricky thing. So tricky, in fact, that over the years I have built up an almost paranoid distrust of it. The only sure thing I can say is that it will change (both intentionally and unintentionally) over time, and your meticulously crafted solution will fail. The best-case scenario is that it simply stops working. The worst-case scenario is that the errors do not cause a technical failure but quietly perform faulty insert / update / delete operations against your data warehouse for months. This is not discovered until you have a very angry business manager on the line who has been doing erroneous reporting up the corporate chain for months. Unfortunately, this is also the most likely scenario. A good framework should have functionality for recording data lineage (what has changed) and the ability to gracefully handle technical errors. It won't prevent these kinds of errors from happening, but it will help you recover from them a lot faster. For inspiration, read The Data Warehouse ETL Toolkit.
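The two framework capabilities named in the tip, recording data lineage and handling bad rows gracefully, can be sketched in miniature. The names and the in-memory "tables" below are illustrative only, not any particular framework's API:

```python
# Hedged sketch of two framework ideas: record a lineage hash per row
# (so unintended source changes are detectable later) and quarantine
# bad rows instead of failing, or silently loading, the whole batch.
import hashlib

def row_hash(row):
    """Stable fingerprint of a source row, stored for lineage."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def load(rows, validate):
    target, quarantine = [], []
    for row in rows:
        try:
            validate(row)
            target.append({**row, "lineage_hash": row_hash(row)})
        except ValueError as err:
            # Bad data is recorded for later analysis, not inserted.
            quarantine.append({"row": row, "error": str(err)})
    return target, quarantine

def validate(row):
    if row["qty"] < 0:
        raise ValueError("negative quantity")

good, bad = load([{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": -1}], validate)
```

The point is not this particular code but that the recording and quarantining logic lives in the framework once, instead of being reinvented (or forgotten) in every package.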

#7: Use a framework for your Integration Services solution(s) to maintain control and boost productivity

Integration Services is a powerful ETL tool that can handle almost any data integration challenge you throw at it. To achieve this, it has to be very flexible. Like many of Microsoft's products, it is very developer-oriented. The issue with this is that there are as many ways of solving a problem as there are Business Intelligence consultants on a project. By implementing an SSIS framework (and sticking with it!) you ensure that the solution handles similar problems in similar ways. So when the lead developer gets hit by that proverbial bus, you can put another consultant on the project who only needs to be trained on the framework to be productive. A framework will also boost productivity: the up-front effort of coding it, setting it up and forcing your team to use it is dwarfed by the benefits of templates, code reuse and shared functionality. Again, read The Data Warehouse ETL Toolkit for inspiration.

#8: Test and retest your calculations.

Get into the habit of testing your MDX and DAX calculations as early as possible, ideally as soon as you finish each calculation, scope statement, etc. Both MDX and DAX get complicated really fast, and unless you are a Chris Webb you will lose track pretty quickly of dependencies and why numbers turn out as they do. Test your statements in isolation and the solution as a whole, and verify that everything works correctly. These calculations can also have a severe performance impact, so remember to clear the Analysis Services cache and do before-and-after testing (even if you have a cache warmer). Note that clearing the cache means different things in tabular and multidimensional models, as outlined here.
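For a multidimensional instance, clearing the cache is done with an XMLA ClearCache command executed against the server, for example from a Management Studio XMLA query window. The database ID below is a placeholder; the object scope can also be narrowed to a cube or measure group:

```xml
<!-- DatabaseID is a placeholder for your own SSAS database. -->
<ClearCache xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyOlapDatabase</DatabaseID>
  </Object>
</ClearCache>
```

Run this before each timing run so the before-and-after numbers compare cold-cache to cold-cache.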

#9: Partition your data and align it from the ground up.

Note that you need the Enterprise edition of SQL Server for most of this. If you have large data sets, you should design your solution from the ground up to utilize partitioning. You will see dramatic performance benefits from aligning your partitions all the way from your SSIS process to your Analysis Services cubes / tabular models. Alignment means that if you partition your relational fact table by month and year, you should do the same for your Analysis Services measure group / tabular table. Your SSIS solution should also be partition-aware, so it can maximize throughput by exploiting your partitioning scheme.
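On the relational side, month-and-year alignment might look like the following T-SQL, with hypothetical names and boundary values. The same yyyymm boundaries would then be mirrored in the SSAS measure-group or tabular partitions and in the SSIS load packages:

```sql
-- Hypothetical monthly partitioning: the same yyyymm boundaries
-- should be mirrored in the SSAS partitions and the SSIS loads.
CREATE PARTITION FUNCTION pfMonth (int)
AS RANGE RIGHT FOR VALUES (202401, 202402, 202403);

CREATE PARTITION SCHEME psMonth
AS PARTITION pfMonth ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactSales (
    MonthKey   int   NOT NULL,  -- yyyymm partitioning key
    ProductKey int   NOT NULL,
    Amount     money NOT NULL
) ON psMonth (MonthKey);
```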

#10: Avoid using the built-in Excel provider in Integration Services.

I feel a bit sorry for the Excel provider. People seeing it will think: "Obviously I can integrate Excel data with my SSIS solution; it's an MS product, and MS knows that much of our data is in Excel." The problem is that Excel files are inherently unstructured, so for all but the simplest workbooks the provider will struggle to figure out what data to read. Work around this by either exporting your Excel data to flat files or looking at third-party providers.
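The payoff of the flat-file workaround is that you declare the column types yourself instead of relying on the provider's guessing. A minimal sketch, with an illustrative schema and inline CSV data:

```python
# Hedged sketch of typed flat-file ingestion: once the workbook is
# exported to CSV, the schema below is explicit and bad values are
# caught per row. Column names and types are illustrative only.
import csv
import io

SCHEMA = {"sku": str, "qty": int, "price": float}

def read_typed(csv_text):
    rows, errors = [], []
    # start=2: line 1 of the file is the header row.
    for lineno, raw in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        try:
            rows.append({col: cast(raw[col]) for col, cast in SCHEMA.items()})
        except (ValueError, KeyError) as err:
            errors.append((lineno, str(err)))
    return rows, errors

data = "sku,qty,price\nA1,3,9.90\nB2,three,1.00\n"
rows, errors = read_typed(data)
```

Here the non-numeric quantity on line 3 is reported with its line number rather than silently coerced, which is exactly what the Excel provider's type sensing cannot guarantee.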

Disclosure: I am a real user, and this review is based on my own experience and opinions.
PeerSpot user
it_user4014 - PeerSpot reviewer
Developer at a tech consulting company with 51-200 employees
Consultant

Hi Peter !

Let's discuss points 6 to 10 here:

#6: I totally agree with you: never trust the assumption that data will arrive in the format the ETL team expects. There is always a possibility of wrong data types, bad data, switched columns, all kinds of bad data appearing in the source, so as an ETL developer you need to put data validation checks in place for each and every case you have in mind. Even then, you might miss some cases. The good thing about MS SQL Server 2012 is that it now provides the TRY_CAST function, which can be used to avoid casting errors. A carefully designed framework is handy to have; ETL developers then only need to learn the framework, so invest in building one that can be used across multiple ETL projects. I strongly agree with your point that data is evil; sometimes it is very hard to load even a single file that has all these kinds of bad-data validation errors.
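For reference, T-SQL's TRY_CAST returns NULL instead of raising an error when a value cannot be converted (for example, `SELECT TRY_CAST('abc' AS int)` returns NULL). A minimal Python analogue of that behavior, useful in any hand-rolled validation layer:

```python
# A minimal analogue of T-SQL's TRY_CAST: return None instead of
# raising when the value cannot be converted, so one bad value does
# not abort the whole load.
def try_cast(value, target_type):
    try:
        return target_type(value)
    except (TypeError, ValueError):
        return None

print(try_cast("42", int))   # 42
print(try_cast("abc", int))  # None
```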

#7: Definitely. By having a framework, you save the time spent writing the same piece of code again and again. While designing your ETL, be aware of the data types you are using. Some people see only a slight difference between the float, decimal and numeric data types, but if you have been writing ETL solutions you know what kind of mess picking the wrong one creates; the same goes for the date and datetime data types.
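The float versus decimal distinction is easy to demonstrate: binary floating point cannot represent many decimal fractions exactly, which is precisely the kind of silent discrepancy that hurts monetary columns in ETL. A quick illustration in Python, whose float and Decimal types behave analogously to SQL's float and decimal:

```python
# Binary floats cannot represent 0.1 or 0.2 exactly, so their sum
# is not exactly 0.3; a decimal type stores the values exactly.
from decimal import Decimal

float_sum = 0.1 + 0.2                      # 0.30000000000000004
exact_sum = Decimal("0.1") + Decimal("0.2")

float_ok = (float_sum == 0.3)              # False
exact_ok = (exact_sum == Decimal("0.3"))   # True
```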

#8: MDX calculations need to be tested again and again, which is called regression testing. In all the years I have been building end-to-end BI solutions involving complex ETLs, it has been nearly impossible for QA agents to identify problems in calculations. So when you assign someone the task of verifying an MDX calculation, or just verifying the BI dashboard output, make sure they have enough knowledge of data analysis: they should be proficient enough to query the database, browse the cube and perform cross data verification. As a BI consultant, I invest a lot of time in training my QA agents to perform this regression testing.

#9: Partitioning is always good practice when you are sure that the data influx is going to run into billions of rows. But if you are designing a BI solution for an organization that will not have that much data under analysis, you may skip partitioning.

#10: Strongly recommended. The built-in Excel provider will drive you crazy very quickly with its data-type-sensing behavior. You can try to turn it off by setting the Type Guess property to 0, but there are so many other problems with the Excel provider; it always tries to sense the data type of each source column.

One thing I need to mention: a carefully designed ETL with a customized logging process can save you tons of time when analyzing the cause of a data failure. And it is always good to have an ETL logging process whose output can be shared with your client as well.
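The shape of such a customized log can be sketched simply: one record per package run with timing, row counts and any failure cause. Everything below (names, the in-memory record, the toy step) is illustrative, not a specific logging framework:

```python
# Hedged sketch of a per-run ETL log record: timing, row counts and
# the failure cause are captured whether or not the step succeeds.
import datetime

def run_with_logging(package_name, rows, step):
    log = {
        "package": package_name,
        "started": datetime.datetime.now(datetime.timezone.utc),
        "rows_in": len(rows),
        "rows_out": 0,
        "error": None,
    }
    try:
        log["rows_out"] = len([step(r) for r in rows])
    except Exception as err:
        log["error"] = str(err)  # failure cause is captured, not lost
    log["finished"] = datetime.datetime.now(datetime.timezone.utc)
    return log

log = run_with_logging("LoadSales", [1, 2, 3], lambda r: r * 2)
```

A table of such records, one row per run, is exactly the kind of artifact that can be handed to the client when a load fails.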

Regards,
Hasham Niaz

Buyer's Guide
Download our free Microsoft Power BI Report and get advice and tips from experienced pros sharing their opinions.
Updated: November 2024