Hi Team,
I have a source SAP table with 30 years of data, and I need to load this data into Snowflake, either using parallel sessions or by splitting it date-wise within a single workflow.
Can anyone help with how to build this?
Hi @Basheer
I don't know of any way to have parallel output to Snowflake. The Amp engine only supports parallel output for certain output types and Snowflake isn't one of them.
One thing you can look at is Snowflake Bulk. Though I haven't used this method myself, I have used SQL Server Bulk, which has a transaction size parameter that effectively writes and commits in chunks.
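For the date-wise splitting idea, here is a minimal sketch (not Alteryx- or Snowflake-specific) of how you might break a 30-year range into per-year chunks and build one extraction predicate per chunk. The column name `POSTING_DATE` and the date range are placeholders for illustration only:

```python
from datetime import date

def yearly_chunks(start: date, end: date):
    """Split the inclusive range [start, end] into per-year
    (chunk_start, chunk_end) pairs, clipped to the overall range."""
    chunks = []
    year = start.year
    while year <= end.year:
        chunk_start = max(start, date(year, 1, 1))
        chunk_end = min(end, date(year, 12, 31))
        chunks.append((chunk_start, chunk_end))
        year += 1
    return chunks

# Build one WHERE clause per chunk; each chunk can then be run as
# its own workflow pass (or commit batch) so no single load has to
# move all 30 years at once. POSTING_DATE is a hypothetical column.
predicates = [
    f"POSTING_DATE BETWEEN '{lo}' AND '{hi}'"
    for lo, hi in yearly_chunks(date(1994, 1, 1), date(2023, 12, 31))
]
```

You could drive these predicates from a batch macro or an orchestration loop, loading one year at a time and committing after each chunk.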
Dan