In terms of integrating Amazon Kinesis with other tools, I would say that we don't do anything directly; we have our own jobs that consume the data. There are multiple ways to read from Amazon Kinesis. If you have a traditional job, you can use SDK-based libraries to send or store data, and if you want to consume data, you can do it directly, for example with JAR-based libraries that read from the stream. If you use Redshift or Amazon Kinesis Data Firehose, you can send the data straight to S3 and consume it later. So there are multiple options for consuming data, but we are currently doing it via a script rather than sending it directly to S3 or some other database. I recommend the tool to others. If you have to deal with records larger than one MB, then you have to work out your own process for how you want to get the data where it needs to go. If I just put the raw data on the stream, the job has to combine it afterwards, which can be tricky. Another way is to put only the metadata on the stream and store the actual data somewhere else, such as in SQL or another store, and reference the locations later. You also need to make sure you don't have multiple consumers reading from the same data stream too heavily; otherwise, you can get throughput errors. I rate the tool a nine out of ten.
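The workaround described above for payloads above the per-record limit, streaming a small pointer while parking the real data elsewhere, can be sketched roughly as follows with boto3. This is only an illustration of that claim-check idea under assumed names (the stream, bucket, and key layout are placeholders), not the reviewer's actual pipeline.

```python
import json
import uuid

import boto3

kinesis = boto3.client("kinesis")
s3 = boto3.client("s3")

MAX_RECORD_BYTES = 1_000_000  # a single Kinesis record's data blob is capped at roughly 1 MB


def send_event(payload: bytes, partition_key: str) -> None:
    """Send small payloads directly; park large ones in S3 and stream only a pointer."""
    if len(payload) <= MAX_RECORD_BYTES:
        kinesis.put_record(
            StreamName="example-stream",  # hypothetical stream name
            Data=payload,
            PartitionKey=partition_key,
        )
        return

    # Claim-check pattern: store the oversized payload in S3 and stream only its location.
    key = f"oversized/{uuid.uuid4()}.bin"  # hypothetical key layout
    s3.put_object(Bucket="example-bucket", Key=key, Body=payload)
    pointer = json.dumps({"s3_bucket": "example-bucket", "s3_key": key}).encode()
    kinesis.put_record(
        StreamName="example-stream",
        Data=pointer,
        PartitionKey=partition_key,
    )
```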
Senior Engineering Consultant at ASSURANCE IQ, INC.
Real User
Top 5
2024-06-24T07:46:44Z
Jun 24, 2024
Regarding use cases, the only caveat with Amazon Kinesis is that if you already have another queuing system or streaming storage in place, like Kafka or any other already-built streaming system, you don't have to invest in Amazon Kinesis. If you don't have anything, you can easily start with Amazon Kinesis. For logs and for handling the streaming part, and especially if you already use many AWS products and also need to send data to S3 or any other storage or cloud tool, it is a very good fit. I rate the tool a nine out of ten.
Amazon Kinesis is an AWS-managed service, just like S3 or EC2. We don't have to deploy it; it is just there, and we spin it up. You go to the AWS service page, click on Kinesis, and then create a stream by clicking Create and entering a name. I would not recommend Amazon Kinesis to other users. Users can choose a cheaper alternative, such as any other queuing system or in-house Kafka if they have a Kafka team. Amazon Kinesis provides near real-time reads and writes, but its cost is too high; users can choose another option that provides the same functionality at a lower cost. With Amazon Kinesis, you have to run a consumer that reads from the stream. AWS provides the Kinesis Client Library (KCL) for this, and that library uses DynamoDB for checkpointing. For example, if you have one day of data in Amazon Kinesis and started reading from 12 AM yesterday, the KCL tracks how far it has read by checkpointing in DynamoDB. So you get charged for a DynamoDB table out of the box, along with Amazon Kinesis. That DynamoDB table also costs a lot, which should not be the case; it only holds the reads and writes the KCL makes for checkpointing, so its cost should be minimal, but that's not the case. The consumer is also not optimal for efficient reads and writes, which further increases the cost, because both Amazon Kinesis and DynamoDB come into the picture. Overall, I rate the solution a five or six out of ten.
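For context on what "running a consumer" involves: the Kinesis Client Library leases shards and checkpoints its read position in a DynamoDB table it creates, which is where the extra DynamoDB charge mentioned above comes from. The sketch below, using plain boto3 rather than the KCL and a placeholder stream name, shows the bare minimum of reading a single shard with no checkpointing at all.

```python
import time

import boto3

kinesis = boto3.client("kinesis")
STREAM = "example-stream"  # hypothetical stream name

# Read from the first shard only; the KCL would instead lease every shard
# and checkpoint its position per shard in a DynamoDB table.
shard_id = kinesis.list_shards(StreamName=STREAM)["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM,
    ShardId=shard_id,
    ShardIteratorType="TRIM_HORIZON",  # start from the oldest retained record
)["ShardIterator"]

while iterator:
    resp = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in resp["Records"]:
        print(record["SequenceNumber"], record["Data"][:80])
    iterator = resp.get("NextShardIterator")
    time.sleep(1)  # stay under the per-shard GetRecords call limit
```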
I would recommend using Kinesis. Again, it depends on the use cases. One specific advantage of Kinesis is how straightforward it is, and it offers a lot of integration possibilities. So, for those wanting to use it with Amazon Connect, it is a good solution. Overall, I would rate it an eight out of ten because of its ease of use: I like how quickly we can get the configurations done, which makes it pretty straightforward and stable.
There is a separate team in my company that looks after real-time data analytics in our organization, which I don't know much about. As a DevOps engineer, I take care of the cloud, and because of this, I know why my company uses the product. The most valuable feature of the product for our company's data processing needs is that it operates in real time. The simplicity of the services offered by the product, and the way I can use them, is very smooth and easy to understand. The tool also provides good storage and the ability to increase capacity on demand, which I think is the best fit for our company. Integrating Amazon Kinesis with other AWS services has helped our analytics workflow run smoothly; it is very easy to integrate with other AWS services, and the analytics side of Amazon Kinesis is also very good. I recommend the product to those who plan to use it. Those who are new to data streaming and want to start with a new product can go for Amazon Kinesis, as it is very easy to set up; the installation part in particular is very easy to handle. Amazon Kinesis is the first option others should consider since it is easy to set up. I rate the tool an eight out of ten.
I would definitely recommend using the solution. It's a great service, and it can be used wherever it's applicable in their model and architecture. Overall, I would rate the solution an eight out of ten.
Senior Data Engineer Consultant at a tech company with 201-500 employees
Real User
Top 10
2023-03-01T14:40:00Z
Mar 1, 2023
I give the solution a nine out of ten. Amazon Kinesis is easy to use and configure, especially in the beginning. The solution is stable, and I have not encountered any issues with it, nor am I aware of any. The solution is effective, and I don't see any missing features in Amazon Kinesis. I haven't spent a lot of time in the interface, as I have only configured it once. If any changes need to be made, I simply adjust Amazon Kinesis and it works. I only go into Amazon Kinesis if a new data stream needs to be added or the throughput needs to be increased, and this doesn't happen very often. Depending on the requirements, if there is a need to stream data and access it in real time, then I would consider Amazon Kinesis. However, if there is no need for real-time data access, then I would look at cheaper options. Products such as Redshift, Snowflake, and BigQuery are adding built-in streaming functionality to their databases, and depending on the case, this may be an option to consider. It also depends on the target; sometimes it is better to use the mechanisms available in the target tool. If we want to have the data on a stream or in some kind of hot storage, then I would consider Amazon Kinesis in that case.
Nowadays, my company works with AWS, Snowflake, Redshift, Amazon Kinesis, Firehose, Aurora, and Athena. In the future, my company plans to work with SAP HANA. My rating for Amazon Kinesis is six out of ten. My company is a user of Amazon Kinesis.
To someone who would like to implement it, I would simply say: don't shove giant bricks in. Data has to be reasonably sized; a single Kinesis message is measured in kilobytes, not megabytes. It's not meant for gigantic payloads; there is a different strategy for streaming those. I'd rate it at least a nine out of ten. It was very close to perfect.
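One way to honor that sizing advice, shown here only as a hedged sketch, is to split anything large into kilobyte-scale chunks that each carry an index so a consumer can reassemble them; the chunk size and envelope fields below are arbitrary choices, not anything Kinesis prescribes.

```python
import base64
import json

CHUNK_BYTES = 256 * 1024  # keep each record in the kilobyte range, well under the ~1 MB cap


def to_chunked_records(blob: bytes, object_id: str) -> list[dict]:
    """Split one large blob into small, individually indexed Kinesis records."""
    chunks = [blob[i:i + CHUNK_BYTES] for i in range(0, len(blob), CHUNK_BYTES)]
    return [
        {
            "Data": json.dumps({
                "object_id": object_id,        # lets the consumer group the parts back together
                "part": idx,
                "total_parts": len(chunks),
                "payload": base64.b64encode(chunk).decode(),
            }).encode(),
            "PartitionKey": object_id,         # same key keeps all parts on one shard, in order
        }
        for idx, chunk in enumerate(chunks)
    ]
```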
Chief Technology Officer at a tech services company with 51-200 employees
Real User
2021-08-25T19:54:29Z
Aug 25, 2021
The question of whether I would recommend Amazon Kinesis over Azure Event Hub is tricky. While both have their advantages and I consider them almost equal, we felt the latter was better suited to our environment, which is why we went with it. The data transfer policies and associated costs on the Amazon side were the deciding factors for me. I rate Amazon Kinesis an eight or nine out of ten.
Senior Software Engineer at a tech services company with 501-1,000 employees
Real User
2020-12-21T13:55:00Z
Dec 21, 2020
With my limited exposure to Kinesis, and despite the pain points and probably not using it properly, we did see that it was successful. Having said all that, and considering the pain points we went through, on a scale of one to ten I would give Kinesis an eight.
I have a lot of experience with Kinesis and data analytics, including networking, in the Amazon AWS environment. I work as a big data architect and design all of our environments in AWS. On a scale from one to ten, I would rate the solution a six. It's pretty good, and great for big environments; however, you do need to be well versed in the product to set it up.
Senior Engineering Consultant at a tech services company with 201-500 employees
Real User
2020-10-28T15:29:13Z
Oct 28, 2020
My advice for anybody who is implementing this product is to start by reading through the Amazon documentation, as well as going through some videos on YouTube or Pluralsight, just to get a high-level idea of what's going on. Then start experimenting and trying to figure out how it works. From there, try to figure out your optimal sharding strategy: how many shards you need within the stream and how you want to partition the data within it. After that, you need to look at your production and consumption rates on the stream, that is, how much data you are putting onto the stream and at what rate, and make sure you also look at the rate at which you are consuming data off the stream. The ideal situation is to be able to consume data faster than you produce it, because then you're able to stay in control; if you're not able to do that, you can get overwhelmed. The biggest lesson I learned from using this product is that it's a whole new world of processing big data. I come from a traditional data warehousing background where everything is batch-oriented, so this is a whole new ball game in terms of how to process data. It's a new mechanism for harnessing the power of data. A traditional data warehouse could not analyze, for example, what is going on in real time on a racing car; it's not scalable and it's not going to work. However, something like this is dynamic and big enough to handle that kind of application. This is a pretty good product, albeit I don't have much to compare it with. That said, I don't have any problems with it. It's done what it's asked and it's easy to use. I would rate this solution a nine out of ten.
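To make the shard-sizing advice concrete, here is a rough, hedged calculation based on the published per-shard limits for provisioned streams (about 1 MB/s or 1,000 records/s for writes and 2 MB/s for reads); the traffic figures are invented purely for illustration.

```python
import math

# Approximate per-shard limits for Kinesis Data Streams in provisioned mode.
WRITE_MB_PER_SEC = 1.0
WRITE_RECORDS_PER_SEC = 1_000
READ_MB_PER_SEC = 2.0


def shards_needed(in_mb_s: float, in_records_s: float, out_mb_s: float) -> int:
    """Pick enough shards to cover the worst of the write and read requirements."""
    return max(
        math.ceil(in_mb_s / WRITE_MB_PER_SEC),
        math.ceil(in_records_s / WRITE_RECORDS_PER_SEC),
        math.ceil(out_mb_s / READ_MB_PER_SEC),
        1,
    )


# Hypothetical workload: 6 MB/s and 4,500 records/s produced, 10 MB/s consumed.
print(shards_needed(6.0, 4_500, 10.0))  # -> 6 shards
```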
My advice to anyone thinking about Amazon Kinesis is that if they have clickstream or any streaming data that varies from megabytes to gigabytes, they can definitely go for Amazon Kinesis. If they want to do data processing, or batch or streaming analytics, they can choose Amazon Kinesis. And if you want to enable database stream events in Amazon DynamoDB, then you can definitely go for Amazon Kinesis. I don't see any better option for these than Amazon Kinesis. You can also use Amazon Kinesis Data Analytics to detect anomalies before you process the data; that's one more nice touch. The first things you need to determine are the source, the throughput of the data, and the latency you want. On a scale of one to ten, I would rate Amazon Kinesis a nine.
IT Linux Administrator and Cloud Architect at Gateway Gulf
Real User
2020-10-25T23:39:00Z
Oct 25, 2020
Kinesis has the best of Amazon: the data streaming, process building, data analytics, and real-time data handling are all very good. The output and monitoring are easy, and it has good performance. I would rate it a nine out of ten.
My recommendation for Data Streams is to do a deep dive into the documentation before implementing, to avoid what we did at the beginning: you try to process record by record, or push record by record into Kinesis, and then realize that it is not cost-effective or even efficient. You need to know that you should aggregate your data before you push it into Kinesis, so reading up on the best practices for using Kinesis is definitely something I would recommend to anyone. For Kinesis Analytics, I was actually surprised at how easy it is to use an application with such power. I would say that with a trial, users will realize that for such a fairly complex application as Kinesis Analytics, it is something you can get going very quickly with minimal resources, and it gives you a lot of value for specific use cases. On a scale of one to ten, I would give Amazon Kinesis a nine. I don't have much to complain about with Kinesis.
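The aggregation point above can be illustrated with a hedged boto3 sketch: rather than one PutRecord call per event, events are buffered and flushed in PutRecords batches (the Kinesis Producer Library goes further by packing several events into a single record). The stream name and event shape are placeholders, and a real producer would also retry the individual entries that come back as failed.

```python
import boto3

kinesis = boto3.client("kinesis")
STREAM = "example-stream"   # hypothetical stream name
BATCH_SIZE = 500            # PutRecords accepts at most 500 records per call


def flush(buffer: list[dict]) -> None:
    """Send buffered events in batched PutRecords calls instead of one call per event."""
    for start in range(0, len(buffer), BATCH_SIZE):
        batch = buffer[start:start + BATCH_SIZE]
        resp = kinesis.put_records(StreamName=STREAM, Records=batch)
        if resp["FailedRecordCount"]:
            # A real producer would retry only the failed entries; this sketch just reports them.
            print(f"{resp['FailedRecordCount']} records were throttled or failed")


events = [{"Data": f"event-{i}".encode(), "PartitionKey": str(i % 8)} for i in range(2_000)]
flush(events)
```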
Senior Software Engineer at a computer software company with 201-500 employees
Real User
2020-10-20T04:19:00Z
Oct 20, 2020
If you want to use a streaming solution, you need to evaluate your needs. If your needs are really performance-based, maybe you should go with Kafka, but for near real-time performance, I would recommend Amazon Kinesis. If you need more than one destination for the data that you are ingesting into the stream, you will need to use Amazon Kinesis Data Streams rather than Firehose. If you only want to integrate from one point to another, then Kinesis Firehose is a considerably cheaper option and is much easier to configure. From using Kinesis, I have learned a lot about the asynchronous way of processing data; we had always had a more sequential way of doing things. On a scale from one to ten, I would give this solution a rating of eight.
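To illustrate why Firehose is the simpler choice for a single point-to-point integration: on the producer side it is just a PutRecord against a delivery stream, and Firehose buffers and lands the data in the configured destination (S3, for instance) with no consumer code to operate. The delivery-stream name and payload below are hypothetical and assume the delivery stream already exists with an S3 destination.

```python
import json

import boto3

firehose = boto3.client("firehose")

# Firehose buffers these records and writes them to the configured S3 bucket on its own;
# there is no shard management and no consumer application to run.
event = {"user_id": 42, "action": "page_view"}            # hypothetical payload
firehose.put_record(
    DeliveryStreamName="clickstream-to-s3",                # hypothetical, pre-created delivery stream
    Record={"Data": (json.dumps(event) + "\n").encode()},  # newline keeps the S3 objects line-delimited
)
```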
Principal Data Engineer at a transportation company with 1,001-5,000 employees
Real User
2020-10-15T11:35:04Z
Oct 15, 2020
It's nice to deploy this with the Amazon goodness of CloudFormation, or with Terraform, to have it all deployed in a repeatable way. I know that it's easy to go into the console and do it manually, but it's best to do infrastructure as code, in particular with Kinesis. I would rate this solution a nine out of ten.
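As a hedged sketch of that infrastructure-as-code advice in Python, the AWS CDK can declare the stream and synthesize a CloudFormation template from it; the stream name, shard count, and retention period below are illustrative values, not a recommendation.

```python
from aws_cdk import App, Duration, Stack
from aws_cdk import aws_kinesis as kinesis
from constructs import Construct


class StreamingStack(Stack):
    """Declares the stream in code so every environment gets the same configuration."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        kinesis.Stream(
            self,
            "ClickStream",
            stream_name="clickstream-events",      # hypothetical name
            shard_count=2,                         # illustrative sizing
            retention_period=Duration.hours(48),
        )


app = App()
StreamingStack(app, "KinesisStreamingStack")
app.synth()  # emits the CloudFormation template for a repeatable deployment
```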
Overall, I rate the solution an eight out of ten.
I rate Amazon Kinesis a seven out of ten.
Overall, I would rate Amazon Kinesis a seven out of ten.
It's important to think about how you are going to fix the endpoints that connect to your Kinesis streams. I would rate this solution a nine out of ten.