While loading data into Snowflake, I encountered an issue with the key due to the file's large size and a record count in the billions. Loading the data with a Python script was taking too long, so I decided to explore other options. That is when I discovered Matillion ETL, which I had not heard of before. I learned more about it and used several of its features, including Matillion Data Loader, to load the data into Snowflake. With Matillion ETL, I was able to load around 770 million records in just five to ten minutes. This was a successful use case, and I have since used Matillion ETL to load data from Amazon S3 into Snowflake and for other data-loading tasks, including connectivity to on-premises servers and different cloud platforms.
I have used both the on-premises and cloud deployments of this solution.
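For context on the S3-to-Snowflake pattern mentioned above: the usual reason bulk tools outperform a row-by-row Python script is that they stage files and issue Snowflake's COPY INTO command, which loads staged files in parallel. The sketch below is illustrative only, not the exact setup from my use case; the table name, stage name, and file-format options are hypothetical, and the commented-out connection code assumes the `snowflake-connector-python` package.

```python
# Illustrative sketch of bulk-loading staged S3 files into Snowflake
# via COPY INTO, which parallelizes across files far better than
# inserting rows one at a time from Python.
# NOTE: "orders" and "s3_orders_stage" are hypothetical names.

def build_copy_sql(table: str, stage: str, file_type: str = "CSV") -> str:
    """Build a COPY INTO statement for files already staged in S3."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (TYPE = {file_type} SKIP_HEADER = 1) "
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )

if __name__ == "__main__":
    sql = build_copy_sql("orders", "s3_orders_stage")
    print(sql)
    # To execute for real (requires snowflake-connector-python
    # and valid credentials):
    # import snowflake.connector
    # with snowflake.connector.connect(
    #     account="...", user="...", password="...",
    #     warehouse="...", database="...", schema="...",
    # ) as conn:
    #     conn.cursor().execute(sql)
```

Staging the files first and letting the warehouse do the load is what tools like Matillion automate; the speedup over a plain Python insert loop comes from that design, not from the script itself.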