NetApp Cloud Backup is our primary DR solution for our file service at Hitachi Energy, where we run multiple services on on-premises hosting. Like any company, we have a file service, which is what users see on their computers through NetDrive, and it is backed by NetApp systems globally, along with some Windows file servers, of course. Mostly, it is NetApp systems, so it is SMB file services from NetApp. We have 65 on-premises locations globally, and because of that global infrastructure, each location has its own NetApp system backing NetDrive. The whole setup is essentially for the DR side, and that is how the tool supports us.
NetApp Cloud Backup, which is NetApp software running virtual volumes in Microsoft Azure, is interconnected with all 65 of our on-premises locations. On a daily basis, the mapped drives backed by NetApp on-premises are replicated straight into Azure Cloud. Our DR strategy is that whenever we have a disaster scenario at any of our 65 on-premises locations, as soon as we detect it, we use Cloud Backup and the NetApp software to activate the replicated data sitting in Azure Cloud. That lets us continue serving users with the file service because the data temporarily runs from Azure Cloud; Azure is not where we run the data primarily, it only holds the DR copy.
For data integrity, SnapMirror sits on top of the snapshots; that is how the tool ensures data integrity. We have to respect the SLAs, RTOs, and RPOs we have agreed upon within the company. The snapshots give us a definitive timestamp of the data as it existed on-premises at a given moment: we schedule snapshots at an exact hour each day and keep the last seven days, so we know exactly what data we had yesterday, or the day before, and at what time. As soon as a snapshot is taken and completed locally on-premises, the SnapMirror feature kicks in and starts replicating that snapshotted data to the cloud volume, so all the data from that particular point in time ends up in the cloud. At the same time, the on-premises storage is still running and continues to serve clients. The data keeps changing between the set intervals while replication to the cloud is in progress, but that doesn't matter, because we can define the timestamp that marks the point from which we want the data in the cloud (the sketch at the end of this review illustrates the workflow).
Because the product is highly scalable and highly available worldwide, it lowers the cost we would otherwise have for our DR environment. Without a product like this, we would have to look at alternatives, maybe from Microsoft, or company-built solutions such as regional data centers that would have to be provisioned and kept online. We would also have to spend money on hardware support and the infrastructure to host storage for a DR location that is highly available all the time. In case something goes wrong on-premises, those regional data centers would actually have to be workable and sustain the workload of factory users accessing the file services.
It is a lower cost for us as a company because this is the secondary data, not the primary data. It is not cheap, but it is nowhere near as expensive as it would be if you had to build a similar solution with your own tools and your own hosting. By running the tool in the cloud, we have everything embedded in the same software from the same company, NetApp, together with the infrastructure and architecture we already have, and that allows us to deliver a seamless file service, and file service recoverability, to end users globally at a good price.
The challenges I faced with the product during the integration process were on the networking side. You have to architect it really well so that the data is accessible when browsing on-premises and also when you switch over to the backup in the cloud, so the network was one of the challenges, together with the firewalling part. Generally, though, the product was straightforward, both technically and contractually, considering the costs from the beginning.
The only protection feature I am aware of is one that scans the data on the storage, whether it is in the cloud or on-premises, and it is something we do not really use. Leveraging AI with NetApp is not something I have seen. The ransomware recovery feature may well be efficient, but beyond taking snapshots and trying to scan the storage while those snapshots run, I don't know how it actually scans the storage: based on which repositories, which antimalware lists, or which vendor. Those are all gray areas, even though it is a feature presented by NetApp.
I recommend the tool to others. I rate the tool a nine out of ten.
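As a concrete illustration of the snapshot-plus-SnapMirror workflow described in this review, here is a minimal sketch using NetApp's ONTAP REST API. This is not the reviewer's actual configuration: the cluster address, credentials, SVM and volume names are hypothetical, and the endpoints and payloads should be verified against NetApp's ONTAP REST API documentation.

```python
import requests

# Hypothetical cluster endpoint and credentials; SnapMirror relationships are
# normally created and managed from the destination (cloud/DR) side.
API = "https://dr-cluster.example.com/api"

session = requests.Session()
session.auth = ("admin", "password")  # placeholder credentials
session.verify = False  # lab-only; use proper TLS verification in production

# 1. Daily snapshots with seven-day retention on the source volume, so each of
#    the last seven days has a definitive, recoverable timestamp.
session.post(f"{API}/storage/snapshot-policies", json={
    "name": "daily_keep7",
    "svm": {"name": "svm_fileservice"},
    "copies": [{"count": 7, "schedule": {"name": "daily"}}],
}).raise_for_status()

# 2. A SnapMirror relationship from the on-premises volume to the cloud (DR)
#    volume: once a local snapshot completes, its contents are replicated to
#    Azure while the on-premises volume keeps serving clients.
session.post(f"{API}/snapmirror/relationships", json={
    "source": {"path": "svm_fileservice:vol_netdrive"},
    "destination": {"path": "svm_azure_dr:vol_netdrive_dr"},
}).raise_for_status()

def activate_dr(relationship_uuid: str) -> None:
    """On a disaster at a site, break the mirror so the replicated copy in
    Azure becomes writable and can serve the file service directly."""
    session.patch(
        f"{API}/snapmirror/relationships/{relationship_uuid}",
        json={"state": "broken_off"},
    ).raise_for_status()
```

The division of labor is the point: the snapshot policy fixes which timestamps are recoverable, SnapMirror moves each completed snapshot into Azure, and a single break operation is what turns the passive cloud copy into the active file service during a disaster.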
Service Manager at a tech services company with 11-50 employees
Reseller
Feb 14, 2022
I would highly recommend this solution to others. When choosing a solution, you need to find the one that best suits your use cases. The environment plays a big role when it comes to the setup and what you'd like to achieve. I rate NetApp Cloud Backup a seven out of ten.
Co-founder & Chief Architect at Prescriptive Data Solutions
Reseller
Dec 10, 2021
Smaller companies should be able to maintain this product easily with only one admin, but larger enterprises will require teams of up to twenty people. NetApp is our go-to storage platform because of its flexibility, adaptability, and the different types of protocols it provides. I would rate this solution as ten out of ten.