Sometimes it's difficult to set it up, such as when naming your bucket. The naming conventions require unique global names, which can be cumbersome. An improvement could be associating the naming with personal accounts, allowing more familiar or desired names without conflicting with global conventions.
The service could improve by offering mobile backups, which would be great since using Google Backup is costly. The complexity of the initial setup due to partitioning data over five GB could also be simplified.
Storage Administrator at an insurance company with 501-1,000 employees
Real User
Top 5
Aug 16, 2024
There's a lot of complexity, but that's unavoidable because it needs to be versatile. You have to be able to get the data in many different ways. For example, if you need to give just one file to someone outside the office, you can create a temporary pre-signed URL for them to access it. That takes a little bit of fiddling with the settings, but it's not really a disadvantage. It's just part of the system. There's no real way to make that easier. The more secure something is, the more complex it will be.
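For readers unfamiliar with the pre-signed URL workflow this reviewer describes, a minimal boto3 sketch follows; the bucket and object key are placeholders, and the expiry is just an example value.

    import boto3

    # Create an S3 client using the default credential chain.
    s3 = boto3.client("s3")

    # Generate a temporary, pre-signed URL for a single object.
    # Anyone holding this URL can download the object until it expires.
    url = s3.generate_presigned_url(
        ClientMethod="get_object",
        Params={"Bucket": "example-bucket", "Key": "reports/quarterly.pdf"},
        ExpiresIn=3600,  # link is valid for one hour
    )
    print(url)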
From a pricing perspective, there’s still room for improvement, especially for smaller or startup entities that are very cost-conscious. If Amazon could repackage their S3 products to be more affordable and appealing for these smaller users, it could attract more customers, particularly at the lower end of the market.
The solution could be cheaper. EC2 has a feature called get instance screenshot. It would be good if Amazon S3 had an API to get screenshots or to stream video continuously. There are cases where we want to see what is happening in real time instead of connecting to Amazon EC2. It would be good if we could watch it like a video stream.
The search option must be enabled in the UI to filter and retrieve files. The filter option must be provided within the directory. Users must be able to filter files based on the file name.
The product should enable users to create their own shortcuts. It should allow more customization. Users should be able to place their shortcuts on the dashboard. It will make things easier.
Recently, I was trying to clean up the data because I have a lot of old, unused objects and images. I am trying to identify the files with the largest sizes and determine whether I need to keep them or delete them. I didn't find the user interface very friendly. Beyond 399 or 999 objects, the user interface would not let me use the sort option. I had to use various keywords and search phrases to shortlist the results so that I got fewer than 399 objects and could use the sort option. That area could be considered for improvement.
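As a workaround for the console's sort limits, the same inventory can be pulled with a short boto3 script; this is only a sketch, and the bucket name is a placeholder.

    import boto3

    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")

    objects = []
    # Page through every object in the bucket (the console caps what it will sort).
    for page in paginator.paginate(Bucket="example-bucket"):
        objects.extend(page.get("Contents", []))

    # Sort by size, largest first, and show the top 20 candidates for cleanup.
    for obj in sorted(objects, key=lambda o: o["Size"], reverse=True)[:20]:
        print(f'{obj["Size"]:>15,}  {obj["Key"]}')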
The solution's cost should be improved. The solution's performance needs to increase for small objects. Amazon S3 is excellent for large projects, but its performance can be comparatively slower for fewer small objects.
The free web and DB hosting could be improved. It could be faster when you download huge files from the S3 bucket to your local system. If the S3 bucket availability and performance could be improved, we would consider hosting our database over there.
Sr. Full Stack Java Developer at JPMorgan Chase & Co.
Real User
Top 10
Feb 17, 2023
Many customers use this tool. We use it for enterprise application data, where we store all cloud-native application production data. You can upload any amount of data and access it from anywhere to deploy the applications. However, the tool needs to improve its performance; clients can reach more end users if the overall process is faster. The solution could be a perfect tool if it added more S3 storage classes, managed the authorized users of a bucket, and gave access to storage usage and activity trends.
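On the access-management point this reviewer raises, per-bucket authorization is already possible with bucket policies; the sketch below is a minimal boto3 example, and the account ID, user, and bucket names are placeholders.

    import json
    import boto3

    s3 = boto3.client("s3")
    bucket = "example-bucket"

    # Grant a single IAM user read/write access to this bucket only.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowOneUser",
                "Effect": "Allow",
                "Principal": {"AWS": "arn:aws:iam::123456789012:user/example-analyst"},
                "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
            }
        ],
    }

    s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))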
We are already utilizing cloud services, and Amazon provides us with an abundance of options. As we move forward, we must ensure that we are obtaining the best value for our money when it comes to pricing; the price of Amazon S3 has room for improvement. The current setup is not ideal, as it requires us to log into multiple dashboards and technical services. To improve our centralized monitoring, we need a single centralized system that can provide access to all services. The stability of the solution also has room for improvement.
Lead Software Engineer at a tech services company with 1,001-5,000 employees
Real User
Top 5
Jan 16, 2023
The query size in Amazon S3 has room for improvement because it's limited. Querying on Amazon S3 is also expensive with Amazon Athena, so that's another area for improvement in the solution.
My company is in several regions with some offices in the Middle East, Europe, and Asia. Sometimes we have to concentrate networks to a certain location, like the EU, for example. Therefore, if we set S3 in the EU region from the Middle East or Africa, the latency is an issue and we find it hard to connect to that S3. If we could have a much faster connection, it would be better. While the price is pretty cheap, it could always be less costly.
I would like a batch downloading option added to the AWS console. If we have a bucket in Amazon S3 and for example, we have 50 files that we want to download locally on our system, we need to download them one by one. We can batch-download the files with the command, but from the AWS console, we cannot select all the files and download them into a folder. An option to delete multiple files at one time would be a nice feature to be added to the AWS console because we currently have to specify the individual file names. For example, if five files are present, and we want to delete them, then each file name has to be entered separately.
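Until the console offers these batch operations, they are straightforward from code; a rough boto3 sketch follows, in which the bucket, prefix, and local folder are placeholders.

    import os
    import boto3

    s3 = boto3.client("s3")
    bucket = "example-bucket"
    prefix = "exports/"          # only objects under this prefix
    local_dir = "downloads"
    os.makedirs(local_dir, exist_ok=True)

    keys = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        keys.extend(obj["Key"] for obj in page.get("Contents", []))

    # Batch download: one call per object, but no manual clicking in the console.
    for key in keys:
        s3.download_file(bucket, key, os.path.join(local_dir, os.path.basename(key)))

    # Batch delete: up to 1,000 keys per request, no per-file name entry.
    if keys:
        s3.delete_objects(
            Bucket=bucket,
            Delete={"Objects": [{"Key": k} for k in keys[:1000]]},
        )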
CEO - Founder / Principal Data Scientist / Principal AI Architect at Kanayma LLC
Real User
Nov 25, 2022
I would like to see the API for beginners improved because it is hard to understand. It is based on the idea of a container when most programmers just understand directories and subdirectories. For a beginner, the API is a bit confusing.
Solution Architect, DevOps Engineer at sonne technology
Real User
Top 10
Nov 7, 2022
The solution's pricing should be modified to automatically expand storage based on prior usage. For example, our monthly usage is very low so our storage allotment should be increased by 200%.
It is difficult to retrieve files in an emergency because the public cloud has data limits for immediate downloads so they often take twelve hours. We need better, immediate control of our content. The cloud and on-premises solutions should sync all data. Infrastructure services could be improved to provide free education related to the solution's features. For example, developers who do not have experience building games could learn how to do so with free support.
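For archived objects, the retrieval delay depends on the restore tier that is requested. Assuming the data sits in a Glacier storage class, the boto3 sketch below requests an expedited restore; the bucket and key are placeholders, and expedited retrievals carry additional cost.

    import boto3

    s3 = boto3.client("s3")

    # Ask S3 to restore an archived (Glacier) object for 2 days,
    # using the Expedited tier instead of the default Standard tier.
    s3.restore_object(
        Bucket="example-bucket",
        Key="backups/site-archive.tar.gz",
        RestoreRequest={
            "Days": 2,
            "GlacierJobParameters": {"Tier": "Expedited"},
        },
    )

    # Poll the object's metadata; the Restore header reports progress.
    head = s3.head_object(Bucket="example-bucket", Key="backups/site-archive.tar.gz")
    print(head.get("Restore", "restore not yet requested"))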
Senior Technical Manager -Information Technology at a computer software company with 5,001-10,000 employees
Real User
Sep 14, 2022
Amazon S3 could improve the load balancers. They are very basic compared to other solutions in the industry, such as F5, and they should support analysis up to a certain level. The basic features need some improvement; users should not have to turn to third-party solutions, because Amazon S3 should provide those features itself. At present, the feature set does not satisfy most of the requirements of an enterprise-class organization. Security could also improve in Amazon S3; I am not seeing any value in the solution's security.
Senior Software and Cloud Engineer at Velocis Technologies LLC
Real User
Jul 19, 2022
Regarding S3, S2, and E2, all I'd recommend is for pricing to be lower since we can easily overuse them. For example, if a certain resource is left unchecked, you can easily use a lot of it and drive up costs.
Presales Consultant - Solution Architect at Hewlett Packard Enterprise
Real User
Apr 5, 2022
Maybe it could be a bit cheaper. There are many people using it or willing to use it, and the data size is increasing. It is reasonable at the beginning, but it starts to get a bit expensive as the data size grows dramatically. It is getting more expensive.
CTO at a tech services company with 11-50 employees
Real User
Jan 20, 2022
The solution could improve by offering a simpler way to read and understand the detailed size of a file in the CLI. Today, the details are presented in a rather raw way. You have to write some scripts and make some modifications to properly see the size of a given bucket or the size of a file.
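Until the CLI output improves, a few lines of boto3 can summarize sizes directly; this is only a sketch, and the bucket and object key are placeholders.

    import boto3

    s3 = boto3.client("s3")
    bucket = "example-bucket"

    # Total size and object count for the whole bucket.
    total_bytes = 0
    total_objects = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            total_bytes += obj["Size"]
            total_objects += 1

    print(f"{bucket}: {total_objects} objects, {total_bytes / 1024**3:.2f} GiB")

    # Size of a single object, without downloading it.
    size = s3.head_object(Bucket=bucket, Key="logs/app.log")["ContentLength"]
    print(f"logs/app.log: {size:,} bytes")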
The pricing and licensing are pretty complex. It needs to be simplified. They have no development environment. That means that even during the development time, we are paying. The initial setup can be a bit complex. It would be helpful if there were more templates available.
System Administrator at a tech services company with 11-50 employees
Real User
Nov 7, 2021
We need the mapping to Windows. That feature is needed in S3. It exists in the Azure cloud, where we can map the cloud storage like a drive on our Windows PC, but Amazon is not providing that feature for S3. In the next release, I would like to see mapping on the Windows machine. In Windows, there are drives and volumes, and we need to be able to map Amazon S3 storage to them. That is usually available in Azure, and I would like to have it in Amazon as well.
Senior Software Engineer at a tech services company with 501-1,000 employees
Real User
Jun 27, 2021
When we need to find the queries, there is no user interface; there is only a textual format, which is bulky. If we need some data, we use Athena for queries and get the results. When comparing this solution to others, such as DataDog, they provide a wonderful UI, user-friendly dashboards, and many other features that this solution is lacking. We have found that the query takes too much time to process, and it is quite difficult to retrieve the data.
The only pain point is migrating the lifecycle policy, because a newbie cannot easily create a lifecycle policy. This is because there are multiple, additional ways to do it. If I want to store data for one month, there is the option to move it to Glacier, and with Glacier you have additional policies for one year or four years. The option will not always be there, so you would need to use a UI process for it.
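For reference, the kind of lifecycle policy this reviewer describes can also be created from code rather than through the UI. The boto3 sketch below transitions objects to Glacier after 30 days and expires them after a year; the bucket name, prefix, and day counts are placeholder assumptions.

    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_lifecycle_configuration(
        Bucket="example-bucket",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-old-data",
                    "Status": "Enabled",
                    "Filter": {"Prefix": "archive/"},
                    # After 30 days, move matching objects to the Glacier storage class.
                    "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                    # After one year, delete them entirely.
                    "Expiration": {"Days": 365},
                }
            ]
        },
    )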
IT-Services Manager & Solution Architect at Stratis
Real User
Apr 24, 2021
I don't really have any pain points with this product. There isn't really anything to complain about. It does what we need it to do. Technical support could have a faster response time. They are a bit slow right now.
Whatever enhancement they could include in terms of object storage and limitations would be an improvement. There is a five-terabyte size limit, and in today's world, with data sizes doubling all the time, anything extra would be helpful. We really need a very huge file system, a big data system, to accommodate all this content.
Manager, IT Infrastructure and Data Center at Asian Paints
Real User
Jan 7, 2020
Overall, I don't think there's anything that needs to be improved. The console could be improved - it's not very user-friendly for non-technical guys and the cost has to improve. It does not give us a clear picture of the total cost. On a scale of 1 to 10, I'd rate it an 8.
I would like to see translations or descriptive context for the options. It is difficult for me, as a consultant, to explain the science, the configuration processes, or why I am using this much power or this size to the customer. It takes some effort to describe how you arrived at that sizing. We have some problems with connectivity; in my country, most of the issues we have are with stable connections rather than with a stable platform. In the next release, I think it would be good to have wizards that integrate with specific applications, for example, a one-touch configuration in Pagemaker. It would mean that you don't need to activate S3 and then do a configuration on the page; a single wizard would handle all of the necessary applications.
Technical Director at a healthcare company with 5,001-10,000 employees
Real User
Sep 8, 2019
Some of the areas that could be improved are the dashboard and richer functionality. Because there are so many services offered by Amazon, they can do anything. If we were to add anything, it wouldn't be anything inside of it, but services on top of it. There is a concern with security. In one of our main use cases, we prescreen, but we have to create a gateway or layer on top of it to access the data in that particular case. Because the user accesses the data, they have to be authenticated before doing so. It means that the users of our systems need to gain access to this data. Amazon allows access with a mechanism called a Presigned URL. We need to share files, so we upload the file and request a link from Amazon, which allows you to share it with anyone. The link is signed, which is the reason it is called Presigned. However, this signing is not compliant with certain regulations. In our company, we are concerned with privacy regulations. In the United States, there is a law and regulation called HIPAA, which governs how to keep patients' data private and how to protect it. Amazon S3 is eligible for HIPAA compliance, but not with the Presigned URL. This is very important, and because we cannot use the Presigned URL, we have to build a layer on top of Amazon S3. As a result, we lose performance, availability, and some of the benefits of Amazon S3. A feature that should be included is a HIPAA-compliant solution for the Presigned URL.
Amazon Simple Storage Service is storage for the Internet. It is designed to make web-scale computing easier for developers.
Amazon S3 has a simple web services interface that you can use to store and retrieve any amount of data, at any time, from anywhere on the web. It gives any developer access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of web sites. The service aims to maximize benefits of scale and...
In terms of security, I struggled with setting permissions and access control initially.
The practice of protecting data could be more streamlined or mandatory. Enhancements in security features might be beneficial.
Sometimes while using S3, I've wished for certain features. One thing would be automatic replication between buckets, some feature like that.
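Replication between buckets does exist today as S3 Cross-Region/Same-Region Replication, though it takes some setup. Below is a rough boto3 sketch, assuming both buckets already have versioning enabled and a replication IAM role exists; every name and ARN shown is a placeholder.

    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_replication(
        Bucket="example-source-bucket",
        ReplicationConfiguration={
            # IAM role that S3 assumes to copy objects; it must already exist.
            "Role": "arn:aws:iam::123456789012:role/example-s3-replication-role",
            "Rules": [
                {
                    "ID": "replicate-everything",
                    "Status": "Enabled",
                    "Priority": 1,
                    "Filter": {"Prefix": ""},  # empty prefix = replicate all objects
                    "DeleteMarkerReplication": {"Status": "Disabled"},
                    "Destination": {"Bucket": "arn:aws:s3:::example-destination-bucket"},
                }
            ],
        },
    )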
The tool needs to improve its flexibility in support.
No direct connector is available to connect Amazon S3 with other tools like MicroStrategy. So, I have to use a third party and connect to it.
The UI should be more user-friendly. I rate the UI a five out of ten. The solution must improve its UI.
We would like some improvement in the cost of storage via this solution, as it currently has a very high price point.
Amazon S3 could improve by having more frequent updates.
You need to fully understand the solution's best practices to get the most out of its security features.
The solution requires you to buy and use Amazon services. It could offer more automated functionality.
File versioning could be improved and I'd like to see some additional security features.
Amazon S3 could improve by providing some type of storage notification.
Amazon S3 needs to simplify the backup features.
We would like Amazon to reduce the price of storage.
Amazon S3 could improve by being more secure.
The security model can be improved as it is a bit confusing. The access speed could be faster.