We use it for our daily data processing, and there's a batch job that executes it. The process involves more than ten servers or systems. Some of them are on a mobile network, some are ONTAP networks, and everything comes through the Microsoft service. It's not a continuous process; when the accounting system requires it, a batch job processes the data and stores it in Datalab.
In GCP, there's an engine called Dataflow. You need to put your processing scripts into it. The process runs automatically every day around midnight: the job gets executed, processes the data, and writes the results back to the database.
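As an illustration only, a daily Dataflow batch job of this kind might look like the minimal Apache Beam (Python SDK) sketch below. The project, bucket, and table names are hypothetical placeholders, not our actual setup, and the pipeline assumes the destination BigQuery table already exists.

```python
# Minimal sketch of a daily Dataflow batch job (Apache Beam Python SDK).
# Project, bucket, and table names are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",              # placeholder project ID
        region="us-central1",
        temp_location="gs://my-bucket/temp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read raw records" >> beam.io.ReadFromText("gs://my-bucket/daily/*.json")
            | "Parse JSON" >> beam.Map(json.loads)
            | "Keep valid rows" >> beam.Filter(lambda row: row.get("amount") is not None)
            | "Write to BigQuery" >> beam.io.WriteToBigQuery(
                "my-project:accounting.daily_processed",  # assumes this table exists
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The nightly trigger itself would come from whatever scheduler sits in front of the job; the script above only describes the processing step.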
The dashboards are good. When you want to show data in a dashboard, such as data coming from the pipelines, end users can do it themselves. You can only put limited data (JSON) into Datalab.
For example, if you want to show captions on a bar chart, you need to provide that information to the data engine and specify the relation in GCP. It will then be displayed automatically in the dashboard. That is the advantage of using GCP.
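As a rough sketch of what specifying that relation can look like in practice, limited JSON data might be loaded into BigQuery with an explicit schema so a dashboard tool can resolve the field names and types automatically. The dataset, table, and bucket names below are hypothetical placeholders.

```python
# Sketch: loading newline-delimited JSON into BigQuery with an explicit schema,
# so a dashboard can pick up field names and types automatically.
# Project, bucket, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    schema=[
        bigquery.SchemaField("caption", "STRING"),      # label shown on the chart
        bigquery.SchemaField("amount", "NUMERIC"),
        bigquery.SchemaField("processed_at", "TIMESTAMP"),
    ],
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/daily/output.json",
    "my-project.reporting.daily_summary",
    job_config=job_config,
)
load_job.result()  # wait for the load to finish
```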
I believe GCP is more costly than other cloud platforms because of this end-user feature. I have worked with Azure Cloud as well, and Google Cloud's dashboards provide more meaningful data visualization for the end user.
There are a lot of AI features as well. One AI feature I found is auto-completion. GCP also uses AI for recording and data management in Dataproc, maintaining logs. You just need to specify where to store the logs, either in the cloud or in a data directory. It's very easy.
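For the log handling on Dataproc, that choice is essentially a cluster property saying where the logs should go. A rough sketch with the Dataproc Python client is below; the project, region, and cluster names are placeholders, and the exact property keys may differ depending on the Dataproc version.

```python
# Sketch: creating a Dataproc cluster whose job driver logs go to Cloud Logging.
# Project, region, and cluster names are placeholders; property keys may vary by version.
from google.cloud import dataproc_v1

project_id = "my-project"
region = "us-central1"

cluster_client = dataproc_v1.ClusterControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

cluster = {
    "project_id": project_id,
    "cluster_name": "daily-batch-cluster",
    "config": {
        "software_config": {
            "properties": {
                # Send job driver logs to Cloud Logging instead of only local files.
                "dataproc:dataproc.logging.stackdriver.job.driver.enable": "true",
            }
        }
    },
}

operation = cluster_client.create_cluster(
    request={"project_id": project_id, "region": region, "cluster": cluster}
)
operation.result()  # wait for cluster creation to complete
```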
However, there are limitations with GCP's AI. You need to configure it based on the limits of the nodes. If the workload goes beyond the limit, it should move to the next node, but that doesn't always happen properly. It could be due to a wrong configuration, but the workload isn't immediately moved to a separate node; you need to restart the server and reconnect it to another node.
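Those node limits are set up front, for example as worker caps on the Dataflow job. The sketch below shows where that configuration lives; the values are illustrative, not what we run in production.

```python
# Sketch: capping Dataflow worker nodes and enabling throughput-based autoscaling.
# Values are illustrative; actual limits depend on the workload and project quota.
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                      # placeholder project ID
    region="us-central1",
    autoscaling_algorithm="THROUGHPUT_BASED",  # let Dataflow add workers under load
    max_num_workers=10,                        # hard cap; work beyond this has to wait
    machine_type="n1-standard-4",
)
```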