I've used it for work I did with three clients in the Brazilian banking sector:
Bank Bradesco, B&B Bank, and the Regional Bank of Brazil.
Consultant at a tech consulting company with 501-1,000 employees
My 30 tips for building a Microsoft BI solution, Part IV: Tips 16-20
A note about the SSAS tips: Most tips are valid for both dimensional and tabular models. I try to note where they are not.
#16: Implement reporting dimensions in your SSAS solution
Reporting dimensions are constructs you use to make the data model more flexible for reporting purposes. They usually also simplify the management and implementation of common calculation scenarios. Here are two examples:
- A common request from users is the ability to select which measure to display for a given report in Excel through a normal filter. This is not possible with normal measures / calculations. The solution is to create a measure dimension with one member for each measure. Expose a single measure in your measure group (I frequently use “Value”) and assign the correct measure to it in your MDX script / DAX calculation based on the member selected in the measure dimension. The most frequently used measure should be the default member of this dimension. By doing this you not only give the users what they want, but you also simplify a lot of calculation logic, such as in the next example.
- Almost all data models require various date-related calculations such as year to date, same period last year, etc. It is not uncommon to have more than thirty such calculations. To manage this effectively, create a separate date calculation dimension with one member for each calculation, and do your time-based calculations based on what is selected in that dimension. If you implemented the construct in the previous example, this can be done generically for all measures in your measure dimension. A sketch of how this can look for a tabular model follows below; for dimensional models, use the time intelligence wizard to get you started.
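The date calculation dimension itself is just a small (often disconnected) table with one row per calculation; the table and member names below are hypothetical, and the actual switching logic still lives in your DAX measures or MDX script. This is only a sketch of the idea:

```sql
-- Hypothetical date calculation dimension: one row per time calculation.
-- The member the user selects drives the switch logic in DAX or MDX.
CREATE TABLE dbo.DimDateCalculation (
    DateCalculationKey  int          NOT NULL PRIMARY KEY,
    DateCalculationName nvarchar(50) NOT NULL,
    SortOrder           int          NOT NULL
);

INSERT INTO dbo.DimDateCalculation (DateCalculationKey, DateCalculationName, SortOrder)
VALUES (1, N'Current Period',        1),  -- default member
       (2, N'Year To Date',          2),
       (3, N'Same Period Last Year', 3),
       (4, N'Year Over Year %',      4);
```

Combined with the measure dimension from the first example, a single generic “Value” measure can then resolve both which base measure and which time calculation the user has selected.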
#17: Consider creating separate ad-hoc and reporting cubes
Analysis Services data models can become very complex. Fifteen to twenty dimensions connected to five to ten fact tables is not uncommon. Additionally, various analysis and reporting constructs (such as a time calculation dimension) can make a model difficult for end users to understand. There are a couple of features that help reduce this complexity, such as perspectives, security roles and default members (at least for dimensional), but often the complexity is so ingrained in the model that it is difficult to simplify just by hiding measures, attributes or dimensions from users. This is especially true if you use the reporting dimensions I talked about in tip #16. You also need to consider the performance aspect of exposing a large, complex model to end-user ad-hoc queries; this can very quickly go very wrong. So my advice is to consider creating a separate model for end users to query directly. This model may reduce complexity in a variety of ways:
- Coarser grain (Ex: Monthly numbers not daily).
- Less data (Ex: Only last two years, not since the beginning of time).
- Fewer dimensions and facts.
- Be targeted at a specific business process (use perspectives if this is the only thing you need).
- Simpler or omitted reporting dimensions.
Ideally your ad-hoc model should run on its own hardware. Obviously this will add both investment and operational costs to your project but will be well worth it when the alternative is an unresponsive model.
#18: Learn .NET
A surprisingly high number of BI consultants I have met over the years do not know how to write code. I am not talking about HTML or SQL here but “real” code in a programming language. While we mostly use graphical interfaces when we build BI solutions, the underlying logic is still based on programming principles. If you don’t get these, you will be far less productive with the graphical toolset. More importantly, .NET is widely used in Microsoft-based solutions as “glue” or to extend the functionality of the core products. This is especially true for SSIS projects, where you quite frequently have to implement logic in scripts written in C# or VB.NET, but it also applies to most components in the MS BI stack. They all have rich APIs that can be used to extend their functionality and integrate them into solutions.
#19: Design your solution to utilize Data Quality Services
I have yet to encounter an organization where data quality has not been an issue. Even if you have a single data source, you will probably run into problems with data quality. Data quality is a complex subject: it’s expensive to monitor and expensive to fix, so you might as well be proactive from the get-go. Data Quality Services is available in the BI and Enterprise editions of SQL Server. It allows you to define rules for data quality and monitor your data for conformance to those rules. It even comes with SSIS components so you can integrate it with your overall ETL process. Include it in the design stage of your ETL solution, because adding it in hindsight will be quite costly as it directly affects the data flow of your solution.
#20: Avoid SSAS unknown members
Aside from the slight overhead they cause during processing, unknown members mean that your underlying data has issues. Fix those issues in the source data, not in the data model.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Consultant at a tech consulting company with 501-1,000 employees
My 30 tips for building a Microsoft BI solution, Part II: Tips 6-10
#6: Use a framework for your Integration Services solution(s) because data is evil
I know how it is. You may have started your ETL project with the SQL Server Import/Export Wizard, or you may have done a point integration of a couple of tables through Data Tools. You might even have built an entire solution from the ground up and been pretty sure you thought of everything. You most likely have not. Data is a tricky thing, so tricky in fact that over the years I have built up an almost paranoid distrust of it. The only sure thing I can say is that it will change (both intentionally and unintentionally) over time, and your meticulously crafted solution will fail. The best-case scenario is that it simply stops working. The worst-case scenario is that the errors do not cause a technical failure but quietly perform faulty insert / update / delete operations against your data warehouse for months. This is not discovered until you have a very angry business manager on the line who has been doing erroneous reporting up the corporate chain for months, and unfortunately this is also the most likely scenario. A good framework should have functionality for recording data lineage (what has changed) and the ability to gracefully handle technical errors. It won’t prevent these kinds of errors from happening, but it will help you recover from them a lot faster. For inspiration, read The Data Warehouse ETL Toolkit.
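As a rough illustration (not taken from the book, and with all names hypothetical), such a framework often revolves around a couple of audit tables that every package writes to:

```sql
-- Illustrative audit tables for an SSIS framework; all names are hypothetical.
CREATE TABLE dbo.ETLPackageRun (
    PackageRunID int           IDENTITY(1,1) PRIMARY KEY,
    PackageName  nvarchar(200) NOT NULL,
    StartedAt    datetime2     NOT NULL DEFAULT SYSDATETIME(),
    FinishedAt   datetime2     NULL,
    RunStatus    nvarchar(20)  NOT NULL DEFAULT N'Running'  -- Running / Succeeded / Failed
);

-- One row per target table per run: the "what changed" lineage trail.
CREATE TABLE dbo.ETLRowLineage (
    PackageRunID int           NOT NULL REFERENCES dbo.ETLPackageRun (PackageRunID),
    TargetTable  nvarchar(200) NOT NULL,
    RowsInserted int           NOT NULL DEFAULT 0,
    RowsUpdated  int           NOT NULL DEFAULT 0,
    RowsDeleted  int           NOT NULL DEFAULT 0
);
```

Stamping each loaded row with the PackageRunID then gives you the lineage trail you need when a silent error surfaces months later.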
#7: Use a framework for your Integration Services solution(s) to maintain control and boost productivity
Integration Services is a powerful ETL tool that can handle almost any data integration challenge you throw at it. To achieve this it has to be very flexible, and like many of Microsoft’s products it is very developer-oriented. The issue with this is that there are as many ways of solving a problem as there are Business Intelligence consultants on a project. By implementing an SSIS framework (and sticking with it!) you ensure that the solution handles similar problems in similar ways. So when the lead developer gets hit by that bus, you can put another consultant on the project who only needs to be trained on the framework to be productive. A framework will also boost productivity: the up-front effort of coding it, setting it up and forcing your team to use it is dwarfed by the benefits of templates, code reuse and shared functionality. Again, read The Data Warehouse ETL Toolkit for inspiration.
#8: Test and retest your calculations.
Get into the habit of testing your MDX and DAX calculations as early as possible, ideally as soon as you finish a calculation, scope statement, etc. Both MDX and DAX get complicated really fast, and unless you are a Chris Webb you will quickly lose track of dependencies and why numbers turn out the way they do. Test your statements in isolation and the solution as a whole, and verify that everything works correctly. These things can also have a severe performance impact, so remember to clear the Analysis Services cache and do before-and-after testing (even if you have a cache warmer). Note that clearing the cache means different things for tabular and dimensional models.
#9: Partition your data and align it from the ground up.
Note that you need the Enterprise edition of SQL Server for most of this. If you have large data sets, you should design your solution from the ground up to utilize partitioning. You will see dramatic performance benefits from aligning your partitions all the way from your SSIS process to your Analysis Services cubes / tabular models. Alignment means that if you partition your relational fact table by month and year, you should do the same for your Analysis Services measure group / tabular table. Your SSIS solution should also be partition-aware so it can maximize throughput by exploiting your partitioning scheme.
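Here is a minimal T-SQL sketch of monthly partitioning on the relational side (boundary values and object names are placeholders only); the SSAS measure group or tabular table would then get one partition per month on the same boundaries.

```sql
-- Hypothetical monthly partitioning for a relational fact table.
CREATE PARTITION FUNCTION pfFactByMonth (date)
AS RANGE RIGHT FOR VALUES ('2024-01-01', '2024-02-01', '2024-03-01');

CREATE PARTITION SCHEME psFactByMonth
AS PARTITION pfFactByMonth ALL TO ([PRIMARY]);

-- The fact table is created on the partition scheme, keyed by the date column.
CREATE TABLE dbo.FactSales (
    OrderDate   date          NOT NULL,
    CustomerKey int           NOT NULL,
    Amount      decimal(18,2) NOT NULL
) ON psFactByMonth (OrderDate);
```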
#10: Avoid using the built-in Excel provider in Integration Services.
I feel a bit sorry for the Excel provider. It knows that people seeing it will think, “Obviously I can integrate Excel data with my SSIS solution; it’s an MS product, and MS knows that much of our data is in Excel.” The problem is that Excel files are inherently unstructured, so for all but the simplest workbooks the provider will struggle to figure out what data to read. Work around this by either exporting your Excel data to flat files or looking at third-party providers.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
BI Expert at a transportation company with 51-200 employees
Microsoft BI vs. SAP Business Objects
A quick look at other weblogs covering this topic gives you the sense that most of them only discuss very brief things, like the report refresh feature in BO or the cube feature in MS Analysis Services. I chose MS BI, and I want to share my reasons and opinions on why I chose it and give you another quick but slightly deeper comparison of these two Business Intelligence platforms.
As we all know, both Business Objects and Microsoft are big companies working on BI solutions, and both have their own advantages. It makes little sense to compare them simply in terms of which one is better; we have to check what our requirements are and then, based on those requirements, decide between MS and BO. An approach like this helps us avoid religious decisions for or against a piece of software or technology.
In a BI architecture, first of all we have the data store level. I mean the storage of the raw data, not the staging area, OLAP cubes or universe data source, but the original place where our data lives. It is important to know where your raw data is and what type of storage is used to hold it: a file system, an Access or FoxPro database, a full database solution like Oracle or SQL Server, or even a web service can be the home of our raw data. We have to check our tools against these sources and see which one gives us a smooth way to move the data through the ETL process to its destination. So let's take a look at what Business Objects gives us.
Business Objects does have a Data Integration platform, but the problem is that you have to buy it separately because it is not shipped with the BI system. With Microsoft SQL Server Enterprise you get all the services and features needed for this part of the game; SSIS is the service SQL Server delivers for data extraction, integration and loading. Both products give you the ability to enhance data quality and handle the data cleansing portion of your integration phase, but when we get down to the details things tilt a little toward Microsoft, because the ability to use your .NET knowledge to write the complex parts of the ETL process gives you more room to do whatever you need. On the BO side everything looks simple, but it is really not easy to handle complex situations with it. There are advantages and disadvantages here. You can do many things with .NET code, but it can add complexity to your development, so you have to judge your own situation. If things look straightforward, either product could fit your need; but if the situation is not stable and you have to be ready for future changes, it is better to take the power of SSIS and spend a little more development time today to create a powerful and easily changeable mechanism that will help you later. You can also do that with Business Objects Data Integration, but you will spend more money on developing and changing ETL processes, because development cost in Business Objects solutions is always a nightmare for a project.
At this point we have a brief understanding of the differences in the ETL process between the two vendors, so it's a good time to look back at the source database. Here the answer is very quick: if you mostly use MS products to store your transactional data, then make your decision and move to MS for a robust and compatible BI platform. Business Objects doesn't have a database system of its own; it has always relied on other database solutions to store the data behind its universes.
So guess what happens: from an administrator's perspective, performance tuning becomes somewhat problematic. Since we have to use other database systems, we have to use different techniques for each of them. This is one of the areas where MS wins the competition, because when you use Microsoft platforms there are lots of shared mechanisms for handling performance.
Before SQL Server 2012, we had SSAS with its famous aggregated cubes. Because of the nature of SSAS in those versions, we couldn't really call it a semantic layer; here is a little of the why. A semantic layer provides a translation between the underlying data store and business-level language (the business semantics that business users are familiar with). There was no real translation in previous releases of SSAS, and it could be difficult for a business user to understand. So in SSAS 2012 Microsoft changed its approach, moving from delivering a complex solution to end users toward a true semantic layer, like the Universe in Business Objects. From now on, MS BI users can take a powerful toolset like Microsoft Excel and use their existing knowledge to interact with the semantic layer. What Microsoft does in the background is create aggregations in memory, so the performance of this approach is really high. I don't want to deep dive into what Microsoft does behind the scenes in this post, but it will be one of my next topics (sounds like an advertisement).
Since I mentioned aggregations: know that BO has no facility for aggregation tables, so you have to work with DBAs to create aggregation tables manually and integrate them into the Universe.
One of the important aspects of a BI system is its learning curve. It was always the slogan of Business Objects that the learning curve is very low, and yes, for end users it is not hard to interact with the Universe. BUT, and this applies to every BI platform that delivers a semantic layer, from Microsoft to BO to Cognos, it is very easy for a user to get the wrong answer, because everything is hidden behind the Universe or semantic model, and tracing a report back to the base data is a non-trivial task. So be careful about letting users create whatever they want based on their own knowledge; there should always be an IT professional observing the whole process. Never think of it as a fully out-of-the-box solution, or your users may end up taking decisions based on wrong calculations and find their way to Mars.
Another important aspect of a BI system is its cost. About Business Objects we can definitely say that it is expensive, and for sure Microsoft can be expensive too, so how do we decide? The answer is to compare the detailed parts. There are four main parts: database, ETL, semantic layer, and reporting (the user interaction layer). If you choose to go with BO, you have to find heads for your data warehouse and database solution, Java, Tomcat or other J2EE professionals for the ETL and development phases, and BO-specific heads for Universe modeling, design and implementation. You probably also need security administration; integrating Active Directory with the platform is problematic, and integrating with other LDAP platforms is a nightmare, so be aware of these costs. The point of the Microsoft solution is that we can use our in-house knowledge of .NET, SQL Server, SharePoint and Windows Server, and that knowledge transfers to other work. With BO you need headcount dedicated to BO (Universe design, implementation, maintenance, security), and since BO skills are not transferable, those extra heads blow the project's budget. The Microsoft BI platform is a more manageable, more secure and less expensive solution; I see BO as a consultant's dream, an endless font of billable hours.
Conclusion
I decided to go with the Microsoft BI platform, but I would not blindly suggest that anyone choose Microsoft up front. It really depends on the nature and scale of the project and what technologies you have used in the past. Still, even a quick look shows that Microsoft's platform is more robust and coherent across its different parts, so it can be a very good and convenient choice, and with the release of SQL Server 2012 and its BI Semantic Model the answer is easier and more defensible than before.
I would also like to hear about your experience with either of these solutions.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
In my experience, one area that can get left behind is the distribution of reports. I've yet to find an instance where a company doesn't need to share information with an external party. Having an environment where distribution mechanisms are managed in a single place reduces risk. Here again we find that the MS stack can place more restrictions on end users.
BI Expert at a financial services firm with 1,001-5,000 employees
Fast and user friendly
Pros and Cons
- "You don't need much support with Microsoft Power BI because it has such a large base of users who can answer your questions on their forums. There are also many video tutorials and webinars available online that offer solutions to whatever problems you may have."
- "Microsoft is behind IBM when it comes to security features."
What is our primary use case?
What is most valuable?
It's very fast and user-friendly.
What needs improvement?
Microsoft is behind IBM when it comes to security features.
For how long have I used the solution?
I've used Microsoft Power BI for about three years with different clients.
How are customer service and support?
You don't need much support with Microsoft Power BI because it has such a large base of users who can answer your questions on their forums. There are also many video tutorials and webinars available online that offer solutions to whatever problems you may have.
Which solution did I use previously and why did I switch?
I use Cognos Analytics 11 for some projects, but many of my clients prefer Microsoft Power BI because it is less expensive.
Which deployment model are you using for this solution?
On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.
System Specialist at a government with 201-500 employees
Robust and low-maintenance with a good interface
Pros and Cons
- "It's a powerful platform."
- "The reporting could be a bit better."
What is our primary use case?
We primarily use the solution internally and also on our websites.
What is most valuable?
The interface is great.
The DAX functions and the integration with Azure have been great.
It's a powerful platform.
The product is robust and stable.
You do not have to maintain the product. It's very low-maintenance.
Technical support is helpful.
What needs improvement?
I can't speak to any missing features. It has everything we need.
The reporting could be a bit better. It would be nice if there was just a bit more flexibility.
What do I think about the stability of the solution?
The solution is stable and reliable. The performance is good. It's not a product that crashes or freezes. There aren't bugs or glitches to deal with.
It's robust.
What do I think about the scalability of the solution?
We've never attempted to scale. I'm not sure what the process is like. I can't speak to if it's easy or difficult to do.
We have five people in our department that directly use Microsoft BI. They are professors.
While it is currently being used extensively, I cannot say that we have plans to increase usage.
How are customer service and support?
We've been in touch with technical support in the past. They've always been good and we are quite happy with the level of support that they offer us.
Which solution did I use previously and why did I switch?
I have experience with IBM Cognos products before Power BI. It's very difficult to compare products as there are so many different functional, decent products.
How was the initial setup?
I didn't handle the deployment process. I can't speak to what the deployment was like.
There is no maintenance required.
What about the implementation team?
I'm not sure if we used consultants or integrators. I cannot recall what the process was like.
What's my experience with pricing, setup cost, and licensing?
I can't speak to the licensing. I don't have any information related to the exact costs.
Which other solutions did I evaluate?
We have looked at, for example, Tableau and QlikUp. At the moment, Power BI just seems to fit our needs the best.
What other advice do I have?
We are just a customer and an end-user.
I'd advise people to use it, but know your needs and understand what you want. Do your own homework and make sure that Power BI is the right suite for your needs. It's important to do some research before jumping in.
I'd rate the solution at an eight out of ten.
Which deployment model are you using for this solution?
Private Cloud
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Marketing and Relationship Manager | Customer Journey at Ipiranga
Stable product with good technical support
Pros and Cons
- "The stability is good."
- "Areas for improvement would be the construction of reports and the dashboard budget."
What is our primary use case?
My primary use case is to turn data into analysis.
What needs improvement?
Areas for improvement would be the construction of reports and the dashboard budget.
What do I think about the stability of the solution?
The stability is good.
How are customer service and support?
Technical support is good.
How was the initial setup?
The initial setup was complex.
What's my experience with pricing, setup cost, and licensing?
Licenses are available on a monthly basis.
What other advice do I have?
I would rate this solution as nine out of ten.
Which deployment model are you using for this solution?
On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.
MCSE at a financial services firm with 501-1,000 employees
Helpful support, plenty of useful features, and stable
Pros and Cons
- "I like all the tools in Microsoft BI because they all have different purposes. There are a lot of people using Microsoft BI, and everybody is using it. The users have adapted well to the technology."
- "I have a little problem with Synapse Analytics because it's very new. It's a product that has many capabilities, but it is not very well developed. I have to see the list of permissions from one object. I can't do this in Synapse. I have to search on the web for a solution because I don't have the tool for it."
What is our primary use case?
We use Microsoft BI in our data warehouse for analytics. We use it for many purposes.
What is most valuable?
I like all the tools in Microsoft BI because they all have different purposes. There are a lot of people using Microsoft BI, and everybody is using it. The users have adapted well to the technology.
There are plenty of visualization tools, a lot of functionality, and documentation.
What needs improvement?
I have a little problem with Synapse Analytics because it's very new. It's a product that has many capabilities, but it is not very well developed. I have to see the list of permissions from one object. I can't do this in Synapse. I have to search on the web for a solution because I don't have the tool for it.
The engine of Synapse Analytics and the ability to export the data that has been published to the web could improve.
For how long have I used the solution?
I have used Microsoft BI within the past 12 months. I have been working with Microsoft BI tools for approximately 20 years.
What do I think about the stability of the solution?
Microsoft BI is stable.
What do I think about the scalability of the solution?
We have approximately 30 people using the solution in my organization. This includes experts that make reports.
How are customer service and support?
The support is good. We created approximately two or three cases, and they had the solution for us, or they had meetings with us to find a solution.
How was the initial setup?
The initial setup of Microsoft BI is easy.
What's my experience with pricing, setup cost, and licensing?
You might not have to purchase a license for a server, for users, or for the professional version at all; I think the free version can satisfy many customers' needs.
What other advice do I have?
Synapse Analytics is the engine of the data that Power BI consumes.
I would advise those that use the solution to check all the license modes that the tool has, because it's very confusing sometimes. You cannot just buy the product; you need to know who needs it, who will use it, and who will be publishing. There are many aspects to figure out, and when you do, it will be better for you.
I rate Microsoft BI an eight out of ten.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Founder & CEO at a tech services company with 1-10 employees
Effective dashboard data transfer, stable, and scalable
Pros and Cons
- "There is a feature in Microsoft BI that allows JSON data to SQL on the dashboards effectively and quickly."
- "I have found using Microsoft BI is not easy. For example, trying to use the auto-refresh feature is not simple."
What is our primary use case?
We provide improved dashboards for our own customers by using Power BI.
What is most valuable?
There is a feature in Microsoft BI that gets JSON data into SQL and onto the dashboards effectively and quickly.
What needs improvement?
I have found using Microsoft BI is not easy. For example, trying to use the auto-refresh feature is not simple.
If you want to use Microsoft BI as an embedded solution, you need to do many things in the Azure, BI, and 365 portals. It can be complicated. This whole procedure should be simplified.
For how long have I used the solution?
I have been using Microsoft BI for approximately two months.
What do I think about the stability of the solution?
The stability of Microsoft BI is good.
What do I think about the scalability of the solution?
I have not scaled the solution very much but I know it is highly scalable.
We have approximately five dashboard creators in my organization working on the solution.
How are customer service and support?
The customer service was very nice and responsive; when I opened a ticket, they called me and emailed me, but they could not solve the problem we had.
We have three technicians and one of them is dedicated to the maintenance of Microsoft BI.
Which solution did I use previously and why did I switch?
I have previously used another solution.
How was the initial setup?
The initial setup is very complex.
What about the implementation team?
I used a consultant company to help me do the implementation and it took approximately one month.
What's my experience with pricing, setup cost, and licensing?
In the beginning, the solution cost approximately $1,000 per month; we have a Microsoft startup subscription. There is a subscription pack for startup companies, which we are able to have for one year.
My customers will increase their usage of the solution, and there will be additional costs for that usage.
Which other solutions did I evaluate?
I have evaluated other solutions before choosing Microsoft BI, such as Tableau. I have found that Tableau does not have a solution that can be embedded. The dashboard should be inside of our applications.
What other advice do I have?
If someone wants an embedded solution then Microsoft BI is very good because it works on many types of data, such as JSON.
I rate Microsoft BI an eight out of ten.
Which deployment model are you using for this solution?
Public Cloud
If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?
Amazon Web Services (AWS)
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Hi Peter!
Let's discuss points 6 to 10 here:
#6: I totally agree with you; never trust the assumption that data will arrive in the format the ETL team expects. There is always the possibility of wrong data types, bad data, switched data, all kinds of things appearing as source data, so as an ETL developer you need to put data validation checks in place for each and every case you have in mind, and you might still miss some. The good thing about MS SQL Server 2012 is that it now provides the TRY_CAST function, which can be used to avoid casting errors. A carefully designed framework is handy to have, since ETL developers then only need to learn the framework, so invest in building a framework that can be used across multiple ETL projects. I strongly agree with your point that data is evil; sometimes it is very hard to load even a single file that contains all of these bad data validation errors.
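For example, TRY_CAST (introduced in SQL Server 2012) returns NULL instead of raising an error when a value cannot be converted, so bad rows can be redirected instead of failing the whole load:

```sql
-- TRY_CAST returns NULL instead of raising an error when the conversion fails.
SELECT TRY_CAST('2024-02-30' AS date)  AS BadDate,     -- NULL: no such calendar date
       TRY_CAST('123.45'     AS money) AS GoodAmount,  -- 123.45
       TRY_CAST('abc'        AS int)   AS BadInt;      -- NULL: not a number
```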
#7: Definitely. By having a framework you can save time by not writing the same piece of code again and again. While designing your ETL, be careful about the data types you are using. To some people there is only a slight difference between the Float, Decimal and Numeric data types, but if you have been writing ETL solutions you know what kind of mess it creates if you don't pick the right data type; the same goes for the Date and DateTime data types.
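A small illustration of the float versus decimal point (the values are arbitrary): float is an approximate binary type, so repeated additions can drift away from the exact value, while decimal/numeric stays exact at its declared scale:

```sql
-- float is an approximate (binary) type; decimal/numeric is exact at its declared scale.
DECLARE @f float         = 0.1,
        @d decimal(10,2) = 0.1;

IF @f + @f + @f = 0.3
    PRINT 'float: 0.1 + 0.1 + 0.1 equals 0.3';
ELSE
    PRINT 'float: 0.1 + 0.1 + 0.1 does NOT equal 0.3';    -- this branch runs

IF @d + @d + @d = 0.3
    PRINT 'decimal: 0.1 + 0.1 + 0.1 equals 0.3';          -- this branch runs
ELSE
    PRINT 'decimal: 0.1 + 0.1 + 0.1 does NOT equal 0.3';
```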
#8: MDX calculations need to be tested again and again, which is called regression testing. In all the years I have been building end-to-end BI solutions, which involve writing complex ETLs, it has been nearly impossible for QA agents to identify problems in calculations. So when you assign someone the task of verifying MDX calculations, or just verifying the BI dashboard output, make sure they have enough knowledge of data analysis: they should be proficient enough to query the database, browse the cube, and perform cross-verification of the data. As a BI consultant I invest a lot of time in training my QA agents to be able to perform this regression testing.
#9: Partitioning is always a good practice when you are sure the data influx is going to run into billions of rows. But if you are designing a BI solution for an organization that will not have that much data under analysis, you can skip partitioning.
#10: Strongly recommended. The built-in Excel provider will drive you crazy very quickly with its data type guessing; you can try to work around it by adjusting the TypeGuessRows setting, but there are so many problems with the Excel provider because it always tries to sense the data type of each source column.
One thing I need to mention: a carefully designed ETL with a customized logging process can save you tons of time when analyzing the cause of a data failure. It is also good to have an ETL logging process whose output can be shared with your client.
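As a simple hypothetical example of the kind of logging that pays off (the table and column names are invented for illustration), a shared error log turns failure analysis into a single query rather than a trawl through job history:

```sql
-- Hypothetical shared error log that every package writes to on failure.
CREATE TABLE dbo.ETLErrorLog (
    ErrorLogID   int IDENTITY(1,1) PRIMARY KEY,
    PackageName  nvarchar(200)  NOT NULL,
    TaskName     nvarchar(200)  NOT NULL,
    ErrorMessage nvarchar(4000) NOT NULL,
    LoggedAt     datetime2      NOT NULL DEFAULT SYSDATETIME()
);

-- What failed in the last 24 hours, most recent first?
SELECT PackageName, TaskName, ErrorMessage, LoggedAt
FROM   dbo.ETLErrorLog
WHERE  LoggedAt >= DATEADD(DAY, -1, SYSDATETIME())
ORDER BY LoggedAt DESC;
```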
Regards,
Hasham Niaz