Hi Experts,
I need some input: I am building a workflow to update and insert data from an Oracle DB into a Snowflake DB using the In-DB tools. Please help me with this.
Thanks,
Shasank
https://community.alteryx.com/t5/user/viewprofilepage/user-id/2690 Can you help me with the above question?
I have the same issue, any help would be welcome...
In my case, despite the ALTER statements I set in Alteryx's Pre-SQL, I am still getting errors due to timestamp formats:
expecting TIMESTAMP_NTZ(9) but got TIMESTAMP_LTZ(9)
@Tharashasank @fpinchon Can you please try setting both TIMESTAMP_TYPE_MAPPING and CLIENT_TIMESTAMP_TYPE_MAPPING to TIMESTAMP_NTZ? That should do the trick!
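For reference, both parameters can be set with ALTER SESSION statements in the connection's Pre-SQL. This is a sketch, assuming the target table's timestamp columns are TIMESTAMP_NTZ:

```sql
-- Map the TIMESTAMP alias to TIMESTAMP_NTZ on the server side
ALTER SESSION SET TIMESTAMP_TYPE_MAPPING = TIMESTAMP_NTZ;
-- Treat timestamps sent by the client (Alteryx/ODBC) as TIMESTAMP_NTZ too
ALTER SESSION SET CLIENT_TIMESTAMP_TYPE_MAPPING = TIMESTAMP_NTZ;
```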
What worked for me was to add the following as a Pre-Create SQL statement:
alter session set client_timestamp_type_mapping=timestamp_ntz;
It was not super intuitive, but now it works wonderfully!
I captured my whole experience here:
http://insightsthroughdata.com/how-to-load-data-in-bulk-to-snowflake-with-alteryx/
Cheers,
Frederic
@fpinchon @DiganP Do you know how I can resolve this issue when using the Data Stream In tool? I am getting a similar error message, but there is no option available to run Pre-Create SQL. I have tried all 3 creation modes but always get the error:
Error: Data Stream In (6): DataWrap2ODBC::SendBatch: SQL compilation error:
Expression type does not match column data type, expecting TIMESTAMP_NTZ(9) but got TIMESTAMP_LTZ(9) for column....
I have found a workaround using the DateTime tool to convert each date and datetime field to a string; they are then read as varchar in Snowflake. It works, but it is not scalable, as these fields would need to be specified for each different table.
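If loading the dates as varchar is acceptable, one option is to cast them back to TIMESTAMP_NTZ inside Snowflake after the load. A sketch, with hypothetical table and column names:

```sql
-- Hypothetical names: my_table with created_at (VARCHAR holding an ISO-8601 string)
ALTER TABLE my_table ADD COLUMN created_at_ts TIMESTAMP_NTZ;
-- Convert the string column into the new TIMESTAMP_NTZ column
UPDATE my_table SET created_at_ts = TO_TIMESTAMP_NTZ(created_at);
```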
Why do you have to use Data Stream In? I have not encountered that situation before...