I have a zipped CSV file stored in an Amazon S3 bucket.
End goal: Copy CSV file into table in Snowflake Database.
I understand an easy solution is to download the ZIP file from S3, unzip it locally, upload the extracted CSV back into my S3 bucket, and then copy it into the table using Snowflake SQL. However, the data I want to load into Snowflake is extremely large, so I am looking for a more time-efficient option.
I am hoping to do one of the following using Alteryx:
1) Convert the ZIP file to GZIP (since Snowflake supports the decompression of GZIP files), or
2) Unzip the file while remaining within the S3 network, to avoid the time-consuming download/upload of the file to my local computer.
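To illustrate what I mean by option 1: something like the sketch below, which converts each member of a ZIP archive into a separate GZIP payload in memory. The function name is my own, and the S3 transfer itself is only hinted at in the usage note; this is a sketch of the idea, not a tested pipeline.

```python
import gzip
import io
import zipfile


def zip_to_gzip_members(zip_bytes: bytes) -> dict[str, bytes]:
    """Re-compress each file inside a ZIP archive as GZIP.

    Takes the raw bytes of a ZIP archive and returns a mapping of
    '<member name>.gz' -> GZIP-compressed bytes, which Snowflake can
    decompress on ingest.
    """
    out = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            with zf.open(name) as src:
                out[name + ".gz"] = gzip.compress(src.read())
    return out
```

If this were run on an EC2 instance in the same region as the bucket (rather than on my local machine), the bytes could presumably be fetched with boto3's `get_object`, converted, and written back with `put_object`, so the data never leaves the AWS network; Snowflake's `COPY INTO` can then load the `.gz` files directly.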
Does anybody know any solution to either one of these options? Thank you!