We use it for our daily data processing, and there is a batch job that executes it. The process involves more than ten servers and systems; some of them are on a mobile network, some are on ONTAP networks, and they run on a mix of systems, with everything coming through a Microsoft service. It's not a continuous process: when the accounting system requires it, a batch job processes the data and stores it in Datalab. In GCP there's an engine called Dataflow, and you put your processing scripts into it. The job runs automatically every day around midnight; it gets executed, processes the data, and writes the results back to the database.
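To make the workflow described above concrete, here is a minimal sketch of what such a nightly batch pipeline could look like on Dataflow with the Apache Beam Python SDK. The project ID, bucket, table name, and CSV layout are illustrative placeholders, not details from the review.

```python
# Hypothetical nightly batch pipeline on Dataflow (Apache Beam Python SDK).
# All names below (project, bucket, table, CSV layout) are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="example-project",                # placeholder GCP project
    region="us-central1",
    temp_location="gs://example-bucket/tmp",  # placeholder staging bucket
)

def parse_record(line):
    # Turn one CSV line from the upstream systems into a BigQuery row.
    source, amount = line.split(",")
    return {"source": source, "amount": float(amount)}

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadDailyExport" >> beam.io.ReadFromText("gs://example-bucket/daily/*.csv")
        | "ParseRecords" >> beam.Map(parse_record)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:accounting.daily_results",
            schema="source:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
# The around-midnight schedule itself would come from something like
# Cloud Scheduler or a cron-triggered launcher, not from the pipeline code.
```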
Senior Cloud Solution Architect at Integrated Technology Solution Group (ITSG)
Real User
Top 5
Jan 18, 2024
Our main use cases involve transferring workloads from AWS and Univision to Google Cloud Datalab. Before this migration, we used Google Datalab with Looker and maintained separate tables for research and development scenarios. Currently, working with AWS and Univision, we are focused on the migration: defining dataset tables and moving the data to the Google-hosted platform.
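As an illustration of the "defining dataset tables and moving data" step, here is a hedged sketch using the google-cloud-bigquery client. The project, dataset, table, schema, and export path are assumptions for the example, not details from this review.

```python
# Sketch of defining a target dataset and table and loading exported files,
# using the google-cloud-bigquery client. Names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project ID

# Create the target dataset if it does not already exist.
dataset = bigquery.Dataset("example-project.migrated_rnd")
dataset.location = "US"
client.create_dataset(dataset, exists_ok=True)

# Define a table schema matching the source system's export.
schema = [
    bigquery.SchemaField("experiment_id", "STRING"),
    bigquery.SchemaField("result", "FLOAT"),
]
table = bigquery.Table("example-project.migrated_rnd.results", schema=schema)
client.create_table(table, exists_ok=True)

# Load exported CSV files from Cloud Storage into the new table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/results-*.csv",  # placeholder export path
    table.reference,
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
    ),
)
load_job.result()  # wait for the load to finish
```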
Project Manager - Data Science at Mettler-Toledo International Inc.
Real User
Top 10
Nov 1, 2023
We're currently provisioning some ML models in Vertex AI. The sales team will likely start using the other resources in a month or two. When we deployed the model on-premises, our global team faced time-lag issues. On Google Cloud, there is a facility to select different locations, making the application available on servers almost anywhere; that's why we shifted to the cloud. It's a data science platform, and we use various resources from Google Cloud depending on project permissions.
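For context, a deployment with an explicit region choice might look like the following sketch using the google-cloud-aiplatform SDK. The project ID, region, model resource name, machine type, and instance fields are placeholder assumptions, not details from this review.

```python
# Sketch of deploying an already-registered model to a regional Vertex AI
# endpoint. All identifiers below are placeholders.
from google.cloud import aiplatform

# Initialising with a specific region keeps serving close to the team using it.
aiplatform.init(project="example-project", location="europe-west1")

# Look up a model previously uploaded to the Vertex AI Model Registry.
model = aiplatform.Model(
    "projects/example-project/locations/europe-west1/models/1234567890"
)

# Deploy it to an endpoint in that region.
endpoint = model.deploy(machine_type="n1-standard-4", min_replica_count=1)

# Online prediction against the regional endpoint.
prediction = endpoint.predict(instances=[{"feature_a": 1.0, "feature_b": 2.5}])
print(prediction.predictions)
```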
Cloud Datalab is a powerful interactive tool created to explore, analyze, transform and visualize data and build machine learning models on Google Cloud Platform. It runs on Google Compute Engine and connects to multiple cloud services easily so you can focus on your data science tasks.
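As a small illustration of that notebook workflow, the following sketch runs a BigQuery query from a Datalab cell, assuming the google.datalab.bigquery helper library that ships with Datalab and a public BigQuery dataset; the query itself is only an example.

```python
# Minimal sketch of an analysis cell inside a Datalab notebook, using the
# bundled google.datalab.bigquery module against a public dataset.
import google.datalab.bigquery as bq

# Run a SQL query on BigQuery and pull the results into a pandas DataFrame.
query = bq.Query(
    "SELECT name, SUM(number) AS total "
    "FROM `bigquery-public-data.usa_names.usa_1910_2013` "
    "GROUP BY name ORDER BY total DESC LIMIT 10"
)
df = query.execute().result().to_dataframe()
print(df)
```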
The solution is really useful. It’s an easy way to get information. I use it as a reference for analytics, sourcing information, and research.
We are using this solution to help manage personnel and to see if everyone is in the right place.