Sometimes we encounter errors, and the error messages from DMS are very ambiguous and sometimes misleading. For example, if we connect to an incorrect instance with incorrect setups, it will not say the connection failed. Instead, it says the DMS task failed with an unclear error message like 'subtask AWS failed', which does not help. The connection should fail if we are connecting to the wrong instance, but the endpoint connection still succeeds.
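As an illustration of the kind of pre-flight check that can surface a misconfigured endpoint before a task runs, here is a minimal boto3 sketch; the ARNs are placeholders, it only exercises the standard DMS connection-test API, and it may not reproduce the reviewer's exact failure.

```python
# Hedged sketch: test endpoint connectivity before starting a DMS task, so a
# wrong instance/endpoint pairing shows up as a failed connection test rather
# than an opaque task error. ARNs below are placeholders.
import time
import boto3

dms = boto3.client("dms", region_name="us-east-1")

REPLICATION_INSTANCE_ARN = "arn:aws:dms:us-east-1:123456789012:rep:EXAMPLE"
ENDPOINT_ARN = "arn:aws:dms:us-east-1:123456789012:endpoint:EXAMPLE"

# Kick off a connection test between the replication instance and the endpoint.
dms.test_connection(
    ReplicationInstanceArn=REPLICATION_INSTANCE_ARN,
    EndpointArn=ENDPOINT_ARN,
)

# Poll until the test finishes; only start the task if the status is "successful".
while True:
    conns = dms.describe_connections(
        Filters=[{"Name": "endpoint-arn", "Values": [ENDPOINT_ARN]}]
    )["Connections"]
    status = conns[0]["Status"] if conns else "unknown"
    if status != "testing":
        break
    time.sleep(10)

print(f"Endpoint connection test status: {status}")
```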
Professional Service Consultant - Data Systems [Oracle, OCI, AWS, Azure, Databricks] at Self-employed
Real User
Top 20
Sep 6, 2024
The product could be enhanced to intelligently cover more scenarios and prevent crashes or failures. Improved dashboard capabilities for monitoring jobs, and the ability to resume migrations from the last successful point instead of starting over, would significantly enhance the user experience.
Business Intelligence Manager & Data Analytics (Retail Business) at a retailer with 1,001-5,000 employees
Real User
Top 5
Jul 3, 2024
In my experience, AWS Database Migration Service performs well for its primary purpose of data migration. One area that could be improved is its support for non-AWS file formats, particularly when integrating data from sources stored outside of AWS. For example, handling AWS Data Lakes, Delta Lake, or Hadoop file formats stored in S3 requires extensive configuration and isn't always straightforward. It would be beneficial to streamline this process to ensure smoother migrations from non-AWS environments to AWS. As for additional functionality, I think enhancing support for these external file formats and simplifying configuration steps would be valuable improvements for the service.
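To illustrate the configuration overhead the reviewer describes, here is a hedged boto3 sketch of registering an S3-hosted source with DMS; the bucket, IAM role, and table definition are hypothetical, and real Delta Lake or Hadoop layouts would need additional handling beyond this.

```python
# Hedged sketch: the kind of configuration an S3-hosted source typically needs
# in DMS. DMS reads S3 sources as files described by an ExternalTableDefinition;
# bucket, role ARN, and schema below are hypothetical placeholders.
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

external_table_definition = {
    "TableCount": "1",
    "Tables": [{
        "TableName": "orders",
        "TablePath": "analytics/orders/",
        "TableOwner": "analytics",
        "TableColumns": [
            {"ColumnName": "order_id", "ColumnType": "INT8",
             "ColumnNullable": "false", "ColumnIsPk": "true"},
            {"ColumnName": "amount", "ColumnType": "NUMERIC",
             "ColumnPrecision": "10", "ColumnScale": "2"},
        ],
        "TableColumnsTotal": "2",
    }],
}

dms.create_endpoint(
    EndpointIdentifier="s3-source-example",
    EndpointType="source",
    EngineName="s3",
    S3Settings={
        "ServiceAccessRoleArn": "arn:aws:iam::123456789012:role/dms-s3-access",  # placeholder
        "BucketName": "example-data-lake",
        "ExternalTableDefinition": json.dumps(external_table_definition),
        "CsvDelimiter": ",",
        "CsvRowDelimiter": "\n",
    },
)
```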
Database Engineer at a computer software company with 51-200 employees
Real User
Top 10
Apr 30, 2024
At our company, we sometimes feel the migration processes are not manageable. On a few occasions, the data-capture processes need to be reinitialized. Platform-specific issues, like memory leakage and a lack of process fine-tuning, arise because the solution is serverless. My company can initially set up the migration processes to control how the data chunks or memory get used for the data-capture process. If a migration process ever freezes in AWS Database Migration Service, it needs to be restarted from the beginning, the uploaded data needs to be removed, and some verifications need to be implemented. Because it is a vendor-managed solution, there are some drawbacks the user cannot control.
Development Team Manager/Chief Solutions Architect at a consultancy with 11-50 employees
Real User
Top 10
Jan 15, 2024
The cost is a concern. We use DMS because of its simplicity, but the price could definitely be more competitive. So, in my opinion, some potential areas for improvement are price and possibly supporting Oracle Autonomous Database (ADB) on AWS, as it's a powerful option. Oracle ADB on AWS would significantly reduce our migration workload. Price and lack of ADB support are the main downsides of DMS for us right now. In fact, about 60% of our monthly AWS costs go towards database services.
Data Engineering Manager at a consumer goods company with 201-500 employees
Real User
Top 10
Dec 29, 2023
The solution could offer more tuning options so that latency can be brought down to fifteen seconds. DMS is not the go-to choice when it comes to data streaming, and the major reason is the latency issues, because it's file-based and not message-based. If DMS could offer such a solution, it would have the potential to replace Kafka as well, since DMS is more of a one-click setup in comparison to Kafka.
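For context on the streaming comparison, DMS can already publish change data to a message-broker target; the sketch below is a minimal, hypothetical Kafka target endpoint (broker and topic are placeholders) and does not change the file-based latency behaviour the reviewer is describing.

```python
# Hedged sketch: DMS can write change data to a Kafka target endpoint, which is
# the closest it gets today to message-based streaming. Broker and topic values
# are placeholders.
import boto3

dms = boto3.client("dms", region_name="us-east-1")

dms.create_endpoint(
    EndpointIdentifier="kafka-target-example",
    EndpointType="target",
    EngineName="kafka",
    KafkaSettings={
        "Broker": "b-1.example-cluster.kafka.us-east-1.amazonaws.com:9092",  # placeholder
        "Topic": "dms-cdc-events",
        "MessageFormat": "json",
    },
)
```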
Sr Director / Architect at a computer software company with 201-500 employees
Real User
Top 5
Nov 13, 2023
Some of our processes use JSON coding. It could be more user-friendly, like a drag-and-drop interface or a studio-like experience. Additionally, it offers vertical and horizontal scaling mechanisms. There were some issues, particularly when the migration process ran for an extended time. The live replication has a delay of two minutes, which can be an issue. AWS has some languages for passing parameters in configurations; when running a pipeline, batch, or warehouse application, we pass many different parameters as configurations.
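To make the JSON-based parameter passing concrete, here is a hedged sketch of creating a task where both the table mappings and the task settings are supplied as JSON documents; all ARNs are placeholders and the settings shown are a small illustrative subset.

```python
# Hedged sketch: DMS task configuration is passed as JSON documents
# (TableMappings and ReplicationTaskSettings), which is the "JSON coding" style
# of parameter passing the review describes. ARNs are placeholders.
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

task_settings = {
    "TargetMetadata": {"SupportLobs": True, "LimitedSizeLobMode": True, "LobMaxSize": 32},
    "Logging": {"EnableLogging": True},
}

table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-all",
        "object-locator": {"schema-name": "%", "table-name": "%"},
        "rule-action": "include",
    }]
}

dms.create_replication_task(
    ReplicationTaskIdentifier="example-task",
    SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SRC",   # placeholder
    TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TGT",   # placeholder
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INST",  # placeholder
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps(table_mappings),
    ReplicationTaskSettings=json.dumps(task_settings),
)
```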
As a solution, it would be better if more platforms came with direct compatibility, like connecting to different data sources. The basic problem I faced the most was while transferring and reading data from Excel. At one point, all the components I had declared in my scripts could handle it, but after some patching happened, it could no longer support that, and we again needed to update the version of Excel. So it's not a plug-in type of setup where you have built the solution and are confident that it will work; this is an area for improvement. If we look at Microsoft products, most things look or are plug-and-play. For example, if you are using storage as a service, we need to go through CLI commands. Those types of things are not as easy as using a Microsoft product, like DDoS; that level of easiness is not there.
Software architect at a computer software company with 51-200 employees
Real User
Top 20
Jun 16, 2023
The product's performance could be a little better. It is good in terms of the data, but the first load is difficult; after that, the synchronization is fine. We saw a few failures because of bandwidth the first time the data was loaded. The performance could be better, but only for that initial load; we haven't seen any performance-related issues after the initial setup.
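As an example of the knobs that affect that first full load, here is a hedged sketch of adjusting the full-load task settings; the values are illustrative only, and network bandwidth outside DMS would still be a factor.

```python
# Hedged sketch: the initial full load can be tuned through task settings such
# as the number of parallel sub-tasks and the commit rate. Values are
# illustrative, not a recommendation; the task ARN is a placeholder.
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

full_load_settings = {
    "FullLoadSettings": {
        "MaxFullLoadSubTasks": 8,          # how many tables load in parallel
        "CommitRate": 10000,               # rows per commit during full load
        "TargetTablePrepMode": "DO_NOTHING",
    }
}

dms.modify_replication_task(
    ReplicationTaskArn="arn:aws:dms:us-east-1:123456789012:task:EXAMPLE",  # placeholder
    ReplicationTaskSettings=json.dumps(full_load_settings),
)
```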
AWS Database Migration Service is a huge product, and it takes a great amount of effort to reverse engineer what they do on the backend. It would be better if they provided more troubleshooting information in the moment. Currently, if something goes wrong, you get a message that says one thing that has nothing to do with the root cause, and it can be misleading. You aren't even sure which part broke: there could be no connectivity to the source database or the target database, any of those channels could break, and it becomes very hard to troubleshoot.
Senior Database Administrator at Overonix Technologies
Real User
Oct 4, 2022
One area where AWS DMS can improve is its conversion of data types. For example, Oracle has a data type called RAW, but PostgreSQL has no such thing, so AWS DMS doesn't know what type I want to use when migrating from Oracle to PostgreSQL. When performing the migration, AWS DMS changed the RAW data type to the bytea data type, which isn't what I wanted. If I want to manually map the RAW data type in the Oracle database to something else, like VARCHAR in the PostgreSQL database, AWS DMS doesn't seem to have this functionality. It would be great if I could control the data type conversions manually instead of having them done automatically.

Another area that has proven difficult for me is the AWS Schema Conversion Tool, which is a free, cross-platform app that they offer as part of AWS DMS. I was under the impression that I would first have to use this tool to convert from one database to another and then use AWS DMS, but when I used it, some of the tasks didn't work correctly. To my surprise, when I skipped the Schema Conversion Tool and went ahead with the migration in AWS DMS, it transferred everything automatically and it was all correct. So I am not sure what the point of the Schema Conversion Tool is, because the default functionality of AWS DMS seemed to transfer and convert the databases fine without it.

There is also room for improvement from a support perspective. It is sometimes necessary to contact their support team when there is something you don't understand, and when I wrote a support ticket, they simply weren't able to help. Yet, when our company contacted the manager of our reseller, they were able to set up a meeting with an Amazon specialist for DMS, and with one call, all our questions were answered. I think their email support team could be much better compared with their personal support team.
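For reference, DMS table mappings do include a column-level "change-data-type" transformation rule; the sketch below shows what such a manual override might look like, with hypothetical schema, table, and column names, and it is an open question whether it covers the reviewer's specific RAW-to-VARCHAR case.

```python
# Hedged sketch of the kind of per-column type override the reviewer is asking
# for, expressed as a DMS table-mapping transformation rule ("change-data-type").
# Schema, table, and column names are hypothetical.
import json

table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-hr",
            "object-locator": {"schema-name": "HR", "table-name": "%"},
            "rule-action": "include",
        },
        {
            "rule-type": "transformation",
            "rule-id": "2",
            "rule-name": "raw-to-string",
            "rule-target": "column",
            "object-locator": {
                "schema-name": "HR",
                "table-name": "EMPLOYEES",
                "column-name": "BADGE_RAW",   # hypothetical RAW column
            },
            "rule-action": "change-data-type",
            "data-type": {"type": "string", "length": 64},
        },
    ]
}

# This JSON string would be passed as TableMappings when creating the task.
print(json.dumps(table_mappings, indent=2))
```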
Infrastructure Lead at a computer software company with 51-200 employees
MSP
Sep 12, 2022
We had challenges working with the database as it was a different kind of exit; it has blobs and other types of storage, which caused issues. It would help if they had some functionality where, if I want to start a new job from a specific point in time, it automatically picks up from where it left off rather than making people worry about the exact job number and the timing. If that could be automated, it would be really helpful.
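Related to picking up from a point in time, here is a hedged sketch of the task-start options DMS exposes (resume processing, or start CDC from a timestamp); the ARN is a placeholder, and whether this removes the manual bookkeeping the reviewer describes depends on the source engine and setup.

```python
# Hedged sketch: a DMS task can be resumed, or CDC can be started from an
# explicit timestamp, instead of rerunning everything from scratch. The task
# ARN is a placeholder.
from datetime import datetime, timezone
import boto3

dms = boto3.client("dms", region_name="us-east-1")
TASK_ARN = "arn:aws:dms:us-east-1:123456789012:task:EXAMPLE"  # placeholder


def resume_from_last_stop(task_arn: str) -> None:
    """Resume change processing from where the task previously stopped."""
    dms.start_replication_task(
        ReplicationTaskArn=task_arn,
        StartReplicationTaskType="resume-processing",
    )


def start_cdc_from_timestamp(task_arn: str, when: datetime) -> None:
    """Start CDC from an explicit point in time rather than a job/checkpoint number."""
    dms.start_replication_task(
        ReplicationTaskArn=task_arn,
        StartReplicationTaskType="start-replication",
        CdcStartTime=when,
    )


resume_from_last_stop(TASK_ARN)
# start_cdc_from_timestamp(TASK_ARN, datetime(2024, 1, 15, tzinfo=timezone.utc))
```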
What needs improvement in AWS Database Migration Service is that it lacks a log file validation feature. If the solution could provide more details about a particular transaction, that would be helpful. The stability of AWS Database Migration Service for online CDC records also needs to be improved.
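For comparison, DMS does expose a row-level data validation option in task settings, which is distinct from the log-file validation the reviewer asks for; the sketch below shows how it might be enabled and inspected, with a placeholder task ARN.

```python
# Hedged sketch: enabling DMS row-level data validation in task settings and
# reading per-table validation state. This is not log-file validation; it is
# the built-in row comparison between source and target.
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")
TASK_ARN = "arn:aws:dms:us-east-1:123456789012:task:EXAMPLE"  # placeholder

validation_settings = {
    "ValidationSettings": {
        "EnableValidation": True,
        "ThreadCount": 5,
    }
}

dms.modify_replication_task(
    ReplicationTaskArn=TASK_ARN,
    ReplicationTaskSettings=json.dumps(validation_settings),
)

# Validation results can then be inspected per table.
stats = dms.describe_table_statistics(ReplicationTaskArn=TASK_ARN)
for table in stats["TableStatistics"]:
    print(table["TableName"], table.get("ValidationState"))
```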
Our organization works with both client data migration and the cost governance side. Years ago, the price was nominal and acceptable for a client to do a migration. Now, pricing is challenging, especially with versions continually changing. We have Azure and GCP in place, and Amazon provides a hybrid solution, so people are accustomed to adopting all these technologies. Cost is the only factor that is challenging. AWS DMS supports six or seven flavors of RDS; in the next release, I would like support extended to the other databases as well. There is also a need for extra features that are available in open source. For example, for Postgres, we have limited admin features available. If those were standard, it would be very helpful for the database team as well as the migration team.
The price is expensive for a person or student who wants to learn how to use the solution. For students, AWS provides free access for a year. I would like to see the company provide the same access to individuals who are trying to learn the solution on their own to pursue a particular career that requires the knowledge.
Data platform architect at S&P Global Market Intelligence
Real User
Feb 16, 2022
There's a lot of room for improvement in AWS Database Migration Service, for example, more endpoints supported, more control and transparency over the product and how we get things done, and better operational support.
Net Full-Stack developer at a tech services company with 201-500 employees
Real User
Oct 21, 2021
Database Migration Service could be more integrated. I think that it makes sense to add integration to these functions. For example, AWS Glue has a feature called Orchestrator to create data flows, and that's more straightforward. But it's not easy to do the same things with Database Migration Service.
I think that Amazon needs to improve the migration scenarios after analytics. We need more migration tools or more specific tools for migration and licenses. It's a very complicated scenario because, in some cases, we need specific licenses to create new instances, and some instances are very expensive. That's a very manual scenario.
Digital Services & Engagement Senior Manager at a insurance company with 10,001+ employees
Real User
Nov 2, 2020
There is something where AWS Database Migration can be improved. Many of the application teams don't want to invest in a migration; they don't want to use the Database Migration Service. They want us to export a database dump or backup and then load it into RDS, and there is no set mold for it beyond the database itself. So migrating and pushing the data from on-premises to the AWS cloud is a big challenge, and a few more services from AWS would be helpful. For example, we are currently using ILDB internet tools to move data from on-premises to the AWS cloud. I need a few more services that would really help me move the master data.
It would be helpful if the bandwidth could be independent of the network or if we could have dedicated bandwidth for this product. In terms of dedicated bandwidth, if it could support Excel or prioritize Excel based on the data it is going to transfer, that would be better.
Lead Oracle Developer and DevOps Engineer at Versent
Real User
Aug 19, 2019
There's some functionality that we're waiting on, like the problem scheduler. It's not yet supported in the current product. The solution could use schedule linking. I'm keen to get that from the solution in the future.
AWS Database Migration Service, also known as AWS DMS, is a cloud service that facilitates the migration of relational databases, NoSQL databases, data warehouses, and other types of data stores. The product can be used to migrate users' data into the AWS Cloud or between combinations of on-premises and cloud setups. The solution allows migration between a wide variety of source and target endpoints; the only requirement is that one of the endpoints has to be an AWS service. AWS DMS cannot, therefore, be used to migrate between two on-premises databases.
We would like to see some improvement in the performance of large scale procedures, such as when we migrate from Oracle to csSQL.
The performance of data migration could be smoother.
This solution is compatible with only AWS. I cannot use this solution with AWS and other cloud services like Azure or Google Cloud.
The pricing can be better and it should be more competitive, so I would like to see an improvement there.