More flexibility in writing queries and accommodating additional facilities would be beneficial. The complexity of handling messages that need decoding and contain different character sets should be addressed. Microsoft should improve networking and programming flexibility and be more open to other vendors. There should also be broader support for third-party tools.
Azure Stream Analytics is challenging to customize because it's not very flexible. It's good for quickly setting up and implementing solutions, but for building complex data pipelines and engineering tasks, you need more flexible tools like Databricks.
Azure Stream Analytics was not meeting our company's expectations because it was tedious to change a job or its queries; whenever I needed to change something, I had to stop the entire stream processing for the changes to take effect. Those issues were concerning, but I think many of them have since been resolved with the help of Microsoft Fabric. The remaining challenge was that the streaming analytics area of Azure Stream Analytics could not meet our company's expectations, making it a component where improvements are required.
One area that could use improvement is the handling of data validation. Currently, there is a review process, but sometimes the validation fails even before the job is executed. This results in wasted time as we have to rerun the job to identify the failure. It would be beneficial to have better error handling and early detection mechanisms in place. Additionally, there should be improved support for data joining and ensuring that customer matching is accurate. It's crucial to address these issues and add enhancements on top of the existing solution.
Manager | Advisory PI | Data & Analytics at Ernst & Young
Real User
Sep 19, 2022
I'm not sure if there are any areas that are lacking. The initial setup is complex. It should be easier for new users who may not have much Azure experience.
The product could be improved by providing more detailed analytics. For example, a graph to identify past and current users. Additionally, UI and UX testing could be supported on this solution.
I haven't come across missing items. It does what I need it to do. The pricing is a little bit high. The UI should be a little bit better from a usability perspective. The endpoint, if you are outsourcing to a third party, should have easier APIs. I'd like to have more destination sources available to us.
Associate Principal Analyst at a computer software company with 10,001+ employees
Real User
Sep 24, 2021
With Azure specifically, the drawback is that it is a very Azure-specific product. You can't connect it to external things outside of Azure. For example, Spark or Databricks can be used in any cloud, including AWS. This product doesn't work that way; it's very Azure-specific. It's not a hybrid solution and it's not a cloud-agnostic solution where you can put it on other clouds. We had some connections that we wanted to make with AWS, which we couldn't do with this, so we had to use something else for that. Early in the process, we had some issues with stability. You also cannot join streams of data with each other; real-time-to-real-time joins between one stream and another are not possible. You can only join your stream with static data from your Azure storage.
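As a minimal sketch of the join pattern described above as the supported one, a stream joined with static reference data rather than with another stream, a Stream Analytics query might look like the following; the input aliases telemetry and devices, the output alias joinedOutput, and the field names are hypothetical and used only for illustration:

    -- Assumed setup: 'telemetry' is a stream input, 'devices' is a
    -- reference-data input (for example, a file in Azure Blob Storage),
    -- and 'joinedOutput' is a configured output.
    -- Joins against reference data do not require a time window.
    SELECT
        t.deviceId,
        t.temperature,
        d.location
    INTO
        joinedOutput
    FROM
        telemetry t
    JOIN
        devices d
        ON t.deviceId = d.deviceId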
Business Architect Expert at a government with 51-200 employees
Real User
Mar 2, 2021
While it depends on the business scenario, in some cases AWS offers better features. It's hard to speak to missing features as it really depends on the business case; in general, it has all the features a typical company might need. The solution needs to be marketed better. Developers should be pushed or enticed to use the solution more to make it better known on the market; it needs more of a presence. The solution offers a free trial, however, it is too short. You can't really properly test it before you have to start paying. They need to give companies a longer period of time to try it out risk-free. Also, the functionality is very limited. If you want to do a POC, you need the solution to offer more flexibility. Right now, you get a 14-day window, and that's not enough for a proper test.
The collection and analysis of historical data could be better. We use historical data and an assimilating algorithm to give us insights into the entire business process. We can collect all the historical data periodically to get insights into current business trends. For example, which area is getting emptied most of the time or which area is getting underutilized, and so on.
Collaboration Consultant at a tech services company with 201-500 employees
Consultant
Oct 11, 2020
It is not complex, but it requires some development skills. When the data is sent from Azure Stream Analytics to Power BI, I don't have access to modify the data. I can't customize or edit the data or run queries on it; all queries need to be done in Azure Stream Analytics.
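As a rough illustration of doing all the query work in Azure Stream Analytics before the results reach Power BI, the shaping can be expressed in the Stream Analytics query itself; the input alias telemetry, the output alias powerbiOutput, and the field names below are hypothetical:

    -- Assumed setup: 'telemetry' is a stream input carrying an 'eventTime'
    -- field, and 'powerbiOutput' is an output configured for a Power BI
    -- dataset. The query averages readings per device over a five-minute
    -- tumbling window, since the data cannot be reshaped later in Power BI.
    SELECT
        deviceId,
        AVG(temperature) AS avgTemperature,
        System.Timestamp() AS windowEnd
    INTO
        powerbiOutput
    FROM
        telemetry TIMESTAMP BY eventTime
    GROUP BY
        deviceId,
        TumblingWindow(minute, 5)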
BI Developer at a tech services company with 51-200 employees
Real User
Sep 22, 2020
There are some improvements that could be made, first of all in pricing, because right now the pricing is a bit unclear. It's hard to tell how much of that is a local issue, but you can't figure out how prices are calculated or what the proprietary part of the cost is. Another area that could be improved is that if something does go wrong, it's very hard to investigate what caused it and why. Logging is available, but it lacks detail and doesn't provide much information.
There may be some issues when connecting with Microsoft Power BI because we are providing the input and output commands, and there's a chance of it being delayed while connecting.
Azure Stream Analytics is a robust real-time analytics service designed for critical business workloads. Users can build an end-to-end serverless streaming pipeline in minutes. Using SQL, they can go from zero to production with a few clicks, easily extensible with custom code and built-in machine learning capabilities for the most advanced scenarios.
Azure Stream Analytics has the ability to analyze and accurately process high volumes of...
Easier scalability and more detailed job monitoring features would be helpful. Another area for improvement is data ingestion.
The solution's query language should be more comprehensive. Also, its features for event imports and its architecture need enhancement.
I would like to have a dedicated contact at Microsoft. Also, the price is high.
We would like to have something that includes the desktops, and perhaps the main system, so we can protect our systems before a threat happens.