Defining simple expressions and queries can be slightly complicated due to DynamoDB's unique JSON syntax. This can be confusing, especially for beginners.
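A minimal sketch of the expression syntax this review finds confusing, assuming a hypothetical "Users" table with partition key "userId" and a "score" attribute: attribute-name aliases begin with "#" and value placeholders with ":", which is the part that most often trips up beginners.

```python
# Hypothetical table/attribute names; the placeholder forms ("#", ":")
# are the real DynamoDB expression syntax.
def build_query_kwargs(user_id, min_score):
    """Build the kwargs a boto3 Table.query call would take."""
    return {
        "KeyConditionExpression": "userId = :uid",
        "FilterExpression": "#s >= :min",
        # "#s" aliases the attribute name "score" (aliases are required
        # when a real name collides with a DynamoDB reserved word).
        "ExpressionAttributeNames": {"#s": "score"},
        "ExpressionAttributeValues": {":uid": user_id, ":min": min_score},
    }

kwargs = build_query_kwargs("u-123", 50)
```

The indirection through two separate placeholder maps, rather than inlining values, is what makes even simple filters feel verbose.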
The process of making a query could be optimized. It can be difficult if you don't design the data model properly at the beginning of the project. Looking up a field is more expensive in DynamoDB than in a relational database, so the solution should offer a better way to handle such queries. Counting all your data, or the number of fields you have, is also easier in a relational database than in a non-relational one.
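The cost difference described here can be made concrete. The sketch below (hypothetical attribute names) shows the read plan a lookup implies: a lookup by the partition key is a cheap Query, while a lookup by any other field forces a full-table Scan whose filter is applied only after every item has been read and billed.

```python
def plan_lookup(field, value, key_attr="customerId"):
    """Return the kind of read DynamoDB must perform for a lookup.

    Attribute names are hypothetical. Only the partition key supports
    an index lookup; any other field falls back to a Scan, which is
    the cost trap the review describes.
    """
    if field == key_attr:
        return {
            "operation": "Query",  # index lookup, reads only matching items
            "KeyConditionExpression": f"{key_attr} = :v",
            "ExpressionAttributeValues": {":v": value},
        }
    return {
        "operation": "Scan",  # reads (and bills for) every item in the table
        "FilterExpression": f"{field} = :v",
        "ExpressionAttributeValues": {":v": value},
    }

cheap = plan_lookup("customerId", "c-42")
costly = plan_lookup("status", "shipped")
```

This is why the data model has to be designed around access patterns up front: a field you will search on needs to be a key or a secondary-index key from the start.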
Senior Engineering Consultant at ASSURANCE IQ, INC.
Real User
Top 5
Jun 24, 2024
Previously, when my company stored values larger than 64 KB, we ran into problems and errors, and at that point we had to find a different storage system or modify ours so that no value exceeded 64 KB. That size limit is the main thing I would improve in Amazon DynamoDB. The tool is a key-value store where the key can be long, but the value was originally capped at 64 KB; the cap was later raised to about 400 KB, which is still a limitation I don't like. If the tool offered an additional 10 MB per item, it would be easier to use.
Solutions Architect at a tech services company with 501-1,000 employees
Real User
Top 20
May 1, 2024
The response time for data queries should be under a second. Frequently accessed data, such as parameter and password lookups, should not require normalization. It would also be beneficial if the product were made open source.
There are a few areas of improvement. In future releases, I would like a feature that lets us store information about public holidays or weekends. When customers call during those closed periods, we could use DynamoDB to trigger an automatic message. It could say something like, "We're currently closed due to a holiday. Please call back during our regular working hours." So this would eliminate the need for agents to manually inform customers. With a holiday calendar stored in a DynamoDB table, we could write a Lambda function to check the date. If it's a UK holiday, for example, the system could automatically play the message.
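The holiday-calendar idea above can be sketched in a few lines. Here the set of holiday dates stands in for rows read from a hypothetical DynamoDB calendar table; a Lambda function triggered by the call platform would run this check and play the returned message, with None meaning the call is routed to an agent as usual.

```python
from datetime import date

CLOSED_MESSAGE = ("We're currently closed due to a holiday. "
                  "Please call back during our regular working hours.")

def greeting_for(today, holidays):
    """Return the automatic message to play, or None if agents are
    available. `holidays` represents ISO dates fetched from a
    hypothetical DynamoDB holiday-calendar table."""
    if today.isoformat() in holidays or today.weekday() >= 5:  # Sat/Sun
        return CLOSED_MESSAGE
    return None

# Example: Christmas Day 2024 stored in the calendar table.
msg = greeting_for(date(2024, 12, 25), {"2024-12-25"})
```

The point of keeping the calendar in a table rather than in code is that the operations team can add a holiday without a redeploy.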
We are currently consuming this data and storing it in the database through code, then retrieving those details. Everything is based on the EAP, so there is no need to go to the particular application DB and search for the ID. Amazon DynamoDB does not currently offer any import functionality, although users have requested this feature. For instance, if you have a large dataset stored in Excel and you wish to migrate it to DynamoDB, there is no direct import option available through the portal. This limitation can be overcome with third-party tools such as AMP, which facilitates importing data into DynamoDB. Having a native import option, whether through browsing or local file uploads, would significantly improve the efficiency of data migration, enabling users to swiftly transfer large volumes of data into DynamoDB.
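In the absence of a console upload, a DIY import is a short script. The sketch below parses rows (for example, a spreadsheet exported from Excel as CSV) and groups them into chunks of 25, the BatchWriteItem per-request limit; each chunk would then be written with boto3's batch writer. Column names are whatever the spreadsheet contains.

```python
import csv
import io

def to_batches(csv_text, batch_size=25):
    """Parse CSV rows and group them into write batches of 25,
    DynamoDB's BatchWriteItem per-request limit. Each batch would
    then be sent to the table (e.g. via boto3's batch_writer)."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

# 30 hypothetical rows -> two batches (25 + 5).
sample = "id,name\n" + "\n".join(f"{i},row{i}" for i in range(30))
batches = to_batches(sample)
```

This is essentially what the requested portal feature would do behind a file picker.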
They could provide more information or training programs to give engineers knowledge of how the database's components correspond to those of relational databases from popular vendors.
Maybe the documentation could be improved a bit. Sometimes, it's a little confusing, and people can easily be mistaken about DynamoDB. That's the main thing, the documentation could be a little more explicit or a little more elaborate. But that's it. It's a key-value system, and it works well for what it is. The only thing I can see that could be improved is the documentation.
The product allows us to query for items in the UI. Sometimes when we query through the UI, it takes a long time to get the results. I would like it if the results were faster.
Database Architect at a transportation company with 1,001-5,000 employees
Real User
Top 10
May 24, 2023
The solution's interface is the biggest challenge, because to access DynamoDB you need an AWS account: you have to be logged in to the AWS console and can only make changes from there. Other database services, whether Redshift, Redshift Spectrum, or Athena, can be connected through an external client without logging in to the console. That restriction is why, as of today, I've limited access to four people who can make changes in the product, monitor it, or log in directly to DynamoDB to check what has been configured and what is or isn't working. If it were accessible like any other database, I could have given read permission to more people, and it would have been easier to maintain. As of now, I'm using it as a configuration DB rather than for transactions or storage; for those purposes, I depend on Redshift.
The design patterns and the documentation for this solution could be improved. In a future release, we would like to see an improvement of the data push options as we sometimes experience blockers when moving data.
I'd like to see better integration with Cognito. It has the integration, but I'd like to see a little more ease of setup. If you have multiple customers and you want the database to enforce who can see what, you can treat DynamoDB so that each row has permissions. You can set this up, but it's a little more of a science project to make Cognito and DynamoDB work well to do protection of individual rows. So I'd like that to be more wizard or easy to set up. Documentation and examples can always get better.
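The "science project" this review describes does have a supported mechanism: an IAM policy using the `dynamodb:LeadingKeys` condition key, which restricts each Cognito identity to items whose partition key equals its own identity ID. Below is that policy expressed as a Python dict; the table name "CustomerData" is hypothetical, while the condition key and the `${cognito-identity.amazonaws.com:sub}` substitution are the real mechanism.

```python
import json

# Row-level protection: each Cognito identity may only touch items
# whose partition key matches its identity ID. Table name is
# hypothetical; condition key and variable substitution are real.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:Query"],
        "Resource": "arn:aws:dynamodb:*:*:table/CustomerData",
        "Condition": {
            "ForAllValues:StringEquals": {
                # Partition key of every item read must equal the
                # caller's Cognito identity ID.
                "dynamodb:LeadingKeys": ["${cognito-identity.amazonaws.com:sub}"]
            }
        },
    }],
}

policy_json = json.dumps(policy, indent=2)
```

Wiring this to a Cognito identity pool role is the multi-step part the review wants turned into a wizard.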
The documentation is not good enough and can be improved. There is a lot of information, but it is old, and it is hard to find anything specific. The documentation should be updated the way Google does for Firebase. The service is also not easy to manage. For example, uploading a certificate and resources to our GraphQL databases is too difficult because there's no user interface; you need to drop into the terminal.
There are some issues: if we missed something or somehow failed to store the data, it was quite difficult to get that data back. If some data was lost during transmission, there was no alternative way to recover that packet, back it up, or re-collect the data for a specific device. Caching is also a problem; it simply isn't there. In my experience, DynamoDB works like UDP: whatever is lost is gone, and there is no mechanism to re-collect that data. I would also like to see video and audio buffers in DynamoDB.
Principal at a computer software company with 11-50 employees
Real User
Aug 24, 2021
Amazon DynamoDB could improve by being more robust and having a better user interface and data management. Additionally, it has some limited functionality compared to other solutions, such as MongoDB. In an upcoming release, it would be beneficial to show spatial data on the interface; that would be a very important metric for our company.
Engineering Intern at a tech services company with 51-200 employees
Real User
Apr 1, 2021
Currently, there is no option for a scheduled refresh in this solution. We want the data to be populated into DynamoDB on a timely basis. Currently, you have to go to the DynamoDB table and hit the refresh button to populate it with the new data. If you have connected DynamoDB to a BI application for creating visualizations with charts, graphs, or other things, you would want it to get updated as per the schedule so that you have updated visualizations in your BI application.
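The scheduled refresh this review asks for can be approximated today with an EventBridge rule invoking a Lambda function on a cron schedule. The sketch below assumes a hypothetical upstream feed (`fetch_source_rows`) and write helper (`write_to_dynamodb`); only the row-shaping step is concrete.

```python
def shape_items(rows, refreshed_at):
    """Turn source rows into DynamoDB items, stamping each with the
    refresh time so the BI layer can display data freshness."""
    return [{"pk": str(r["id"]), "refreshed_at": refreshed_at, **r} for r in rows]

def handler(event, context):
    """Lambda entry point wired to an EventBridge schedule, e.g.
    rate(1 hour). fetch_source_rows and write_to_dynamodb are
    hypothetical stand-ins for the upstream feed and a boto3
    batch write."""
    rows = fetch_source_rows()
    items = shape_items(rows, event["time"])
    write_to_dynamodb(items)
    return {"written": len(items)}

items = shape_items([{"id": 1, "value": 10}], "2024-01-01T00:00:00Z")
```

With this in place, the BI visualizations would pick up fresh rows on every scheduled run instead of waiting for a manual refresh.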
Expert Solution Principal at a tech services company with 10,001+ employees
Real User
Sep 15, 2019
Querying data on the solution is quite limited, but this is like any other NoSQL database; it's the most common criticism of NoSQL databases in general.
Amazon DynamoDB is a scalable NoSQL database valued for its speed and cost efficiency, adept in handling unstructured data and delivering fast data retrieval without predefined schemas. Amazon DynamoDB is recognized for seamless integration with AWS services and its ability to accommodate large datasets. It provides powerful performance with automatic scaling, JSON document storage, and requires no external configurations. Users appreciate the predictable performance and ease of use, although...
The setup cost could be reduced. But overall, the tool works smoothly.
We use the document database. The primary key is quite slow. The free tier is quite hard to use.
The solution could be cheaper.
The solution's efficiency and performance should be better than those of other databases.
Amazon DynamoDB has a very complex configuration if you go very advanced.
The solution's backup and restore could be improved to support batch operations.
The solution has size limitations. It also needs to be more user-friendly.
If you have no prior experience with this type of non-relational database, the syntaxes, implementation, or understanding may be difficult.