I have been consulting on this solution, combined with SQL Server, since 2005. At that time, the majority of my consulting work shifted from Active Directory and C++ to SQL Server and SharePoint.
This solution is a workflow operating system with many metadata services. Incoming information automatically triggers actions; the specific action is determined by the information itself, which is used to compute a complex answer that results in the action.
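As a rough illustration of that idea (a hypothetical sketch only, not SharePoint's actual workflow engine), metadata-driven triggering means an item's metadata is evaluated against rules, and the matching rule decides the action:

```python
# Hypothetical sketch of metadata-driven action triggering:
# an incoming item carries metadata, and rules map that
# metadata to the action that should run next.

def route(item: dict) -> str:
    """Pick an action based on the item's metadata."""
    if item.get("type") == "invoice" and item.get("amount", 0) > 10_000:
        return "escalate-for-approval"
    if item.get("type") == "invoice":
        return "auto-approve"
    return "file-for-review"

print(route({"type": "invoice", "amount": 20_000}))  # -> escalate-for-approval
print(route({"type": "invoice", "amount": 500}))     # -> auto-approve
```

The rule names and fields here are invented for illustration; in a real workflow system the rules would live in configuration rather than code.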
I would like a simpler, more cost-effective way of connecting data sources with workflows and BI or data mining tools. There are different tools for data mining and for data evaluation, but you have to be a skilled programmer to tie them together. There is no simple, low-cost method for doing this, given that development time is a cost factor.
There are some automated solutions for this task, such as Team Foundation Server, which is built on SharePoint. These tools can learn which specific errors are being made, using data mining techniques, and they are able to target those errors for correction. Having this capability built in, and customizable for the customer, would be of great interest.
I would like to see support in Visual Studio for connecting to SharePoint, with a wizard for connecting data processes to iHubs, such as an analysis server or a data mining model, to an output, along with a smart way of creating workflows. Microsoft will tell you that they already have this for SharePoint Online, called "Flow", but it is not customer compatible.
Since 2003.
It is stable once the hotfixes or service pack have been applied. This has been the case for every release since 2003. If you take a release directly to the customer, then implementation is almost always a big mess for them.
This solution is extremely scalable. It is a highly performance-optimized web service; you just have to install it correctly and then add each machine to the farm with the proper permissions. That is one of SharePoint's biggest strengths.
Technical support is extremely well developed with Microsoft. It's just that you have to pay for it, so it is not for someone without Software Assurance.
The setup of this solution is complex. There are several SharePoint deployment architecture scenarios, and sometimes the C-level decision-makers underestimate their complexity. You have to know SharePoint very well.
For this type of solution, it is not wise to buy it without Software Assurance. It depends on the customer, but most are using an agreement that covers four to ten free incidents per year. You really need that, and it's well-invested money.
When comparing this solution to other workflow operating systems like Oracle or BP Logix, I give this solution a ten out of ten.
My advice for anyone implementing this solution is to first try everything that you want to do in a virtual environment, with people who know how SharePoint is programmed. You need to understand the psychology of business users, because most of them omit essential steps when they are creating the business process model. They are used to doing things in their head, but the machine is not aware of everything that they know, so some steps are missed.
Ideally, you want to buy a bunch of Post-it notes and test your processes manually, playing through different scenarios. You have to tune the business processes. I have seen projects fail because the debug phase of the business process design was not thorough.
This solution is useful for optimizing common business processes, such as writing an invoice. For any organization with more than one person in it, if they are trying to organize things so that people in the company know what others are doing, then this solution is a good fit.
While this workflow operating system is better than others on the market, it is uncomfortable and expensive to really implement what you need.
I would rate this solution an eight out of ten.
Hi Henry,
What you described about SharePoint enlightened me about what I can use it for.
Consider the situation where the QMS Manual contains the functional procedures for each organisational function. Documents and records are linked to the indexed functional File Plan, against each functional activity's document requirements. Each activity has inputs, outputs, requirements, and a document number linking it to the index (file plan).
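That activity-to-file-plan linkage could be modelled roughly like this (an illustrative sketch with hypothetical names, not tied to any SharePoint API or real QMS schema):

```python
from dataclasses import dataclass, field

# Hypothetical model of the structure described above: each activity
# carries inputs, outputs, requirements, and a document number that
# keys into the indexed file plan.

@dataclass
class Activity:
    name: str
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    requirements: list = field(default_factory=list)
    document_number: str = ""  # key into the file plan index

# The file plan acts as an index from document number to document metadata.
file_plan = {
    "QMS-001": {"title": "Invoice procedure", "location": "/records/finance"},
}

activity = Activity(
    name="Write invoice",
    inputs=["purchase order"],
    outputs=["invoice"],
    requirements=["documented procedure"],
    document_number="QMS-001",
)

# Resolve the activity's controlled document via the index.
doc = file_plan[activity.document_number]
print(doc["title"])  # -> Invoice procedure
```

The point of the sketch is only that the document number is the single link between an activity and its controlled document, which is what makes central monitoring possible.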
I have the view that a properly integrated repository, defined by who is granted access and who is denied it, might help with the central monitoring and control of documents and records. End users can pull documents and records to administer job activities and send them down the process channels to the reporting end.
Do I have the correct view?