Alteryx Designer Knowledge Base


Load Data into Snowflake Data Warehouse Using Alteryx


As many of you know, Snowflake is an admin-free cloud data warehouse used for data analytics. In this article, I am going to show you how to load data into Snowflake using Alteryx database connections. The first step is to have your Snowflake instance up and running with the warehouse and database created. The next step is to create the database table you want to load data into. You can either do this manually:
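For the manual route, the table can be created ahead of time in a Snowflake worksheet. A minimal sketch (the warehouse, database, schema, table, and column names below are placeholders, not from this article):

```sql
-- Placeholder names; substitute your own warehouse/database/schema.
USE WAREHOUSE MY_WH;
USE DATABASE MY_DB;
USE SCHEMA PUBLIC;

CREATE TABLE CUSTOMERS (
    CUSTOMER_ID  INTEGER,
    NAME         VARCHAR(100),
    SIGNUP_DATE  DATE
);
```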




Or use the power of Alteryx to load the data!



If you are using Alteryx to load the data, read on! The first step is to download the Snowflake ODBC driver. After installing the driver, go to the connection dropdown > Other Databases > ODBC within the Output Data tool. This will open the ODBC Connection window:




Click ODBC Admin > System DSN > Add… > scroll down to the SnowflakeDSIIDriver and click Finish.




Populate the Snowflake Configuration Dialog with the correct credentials and click OK.




Note: The Role field specifies the default role for sessions initiated by the driver. The specified role should be one that has already been granted to the user; if it does not match any of the user's roles, sessions initiated by the driver will have no role initially. A role can always be set from within the session.
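Setting a role from within the session, as the note mentions, is a one-line statement. A sketch, assuming a hypothetical role named ANALYST has been granted to the connecting user:

```sql
-- ANALYST is a placeholder; it must already be granted to the user.
USE ROLE ANALYST;
```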


Tracing controls how detailed the driver logs are. I like to use 6 for detailed tracing.

0 = Disable tracing

1 = Fatal only error tracing

2 = Error tracing

3 = Warning tracing

4 = Info tracing

5 = Debug tracing

6 = Detailed tracing


The Snowflake DSN should now appear under System DSN. Go back to the ODBC Connection screen, pick the correct Data Source Name (DSN), enter your user name and password, enter the name of the table you want to append to or create, and you should be good to go.


You can now bring in data from various sources, transform it using the power of Alteryx, and output it to the Snowflake database!


Some tips:

If you ever encounter the error “expecting TIMESTAMP_NTZ(9) but got TIMESTAMP_LTZ(9)” when trying to write to a Snowflake data warehouse, the solution is to add ALTER SESSION SET CLIENT_TIMESTAMP_TYPE_MAPPING = TIMESTAMP_NTZ; to the Pre-Create SQL Statement.
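The statement goes into the Output Data tool's Pre-Create SQL Statement box verbatim:

```sql
-- Map client timestamps to TIMESTAMP_NTZ for this session only.
ALTER SESSION SET CLIENT_TIMESTAMP_TYPE_MAPPING = TIMESTAMP_NTZ;
```

Because it is an ALTER SESSION, it only affects the connection Alteryx opens for that write; it does not change any account-level setting.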

11 - Bolide

Note: Even though my ODBC connection specified a database and a schema, I had to specify these values either in the table name (i.e., SCHEMANAME.TABLE_NAME) and/or in a pre-create statement (i.e., 'use database DatabaseName;') in order to get things ironed out.
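A sketch of the pre-create approach the note describes (database and schema names are placeholders):

```sql
-- Set the session context before the table is created/written.
USE DATABASE MY_DB;
USE SCHEMA MY_SCHEMA;
-- Alternatively, skip these and fully qualify the table name in the
-- Output tool instead, e.g. MY_DB.MY_SCHEMA.MY_TABLE.
```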

Alteryx Partner

Now if only there was a Bulk Loader in Alteryx for Snowflake like Redshift has...

Alteryx Partner

Ye asked, and ye receiveth!

Alteryx Partner

Yes, it has arrived. I didn't ask for it to require an S3 account though, because by definition Snowflake users don't want S3 when they have a staging area in Snowflake ...

8 - Asteroid

We're evaluating Snowflake right now, but not having great luck with loading data using Alteryx.  The Bulk Load isn't really an option for me because I am not in a position to have an external S3 environment.  We've tried using the staging area in Snowflake and can get this to work, but it requires the data to be in UTF-8.  This doesn't work for us because the data would then be different from our source systems.  We've tried using the ODBC connection with Alteryx.  This gets us around the UTF-8 requirement that the CSV option has.  But the load times are horribly slow: it inserts the data in small batches, so we wind up burning through credits and have to wait forever for our data to be loaded.  Has anyone had any better luck getting data loaded into Snowflake?

Alteryx Partner
@jonriddle As mentioned here, I put together a quick guide to loading in bulk using Snowflake's internal staging area. Not super easy, but it works... I have it running in my operations flawlessly...
8 - Asteroid

Loading data using Gallery fails. I get this message:


The job XXXXXX did not complete successfully and resulted in the following errors:

  • Error running PreSQL on "XXXXX": No active warehouse selected in the current session. Select an active warehouse with the 'use warehouse' command. (Tool Id: 3)

But this Pre SQL command works like a charm in Designer on the desktop:


use database. schema ;


Wondering why Gallery is throwing this error. The DSN on my desktop matches the Gallery, but is it handled differently?
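One common cause of "No active warehouse selected" is that the session opened on the server has no default warehouse, even when the desktop DSN does. A hedged sketch of a Pre SQL statement that sets the full context explicitly (all names are placeholders):

```sql
USE WAREHOUSE MY_WH;   -- resolves "No active warehouse selected"
USE DATABASE MY_DB;
USE SCHEMA MY_SCHEMA;
```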

8 - Asteroid

But it takes forever to load data: 50,000 rows took 30 mins.

Alteryx Partner
@srikant96: you HAVE to use the staging area or S3 to load into Snowflake; there is just no way you want to go direct... 50,000 records will then take 3 seconds...
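The staging-area route follows Snowflake's PUT/COPY pattern. A rough sketch, assuming a CSV file exported from Alteryx and placeholder names throughout (this is the general technique, not the exact guide linked above):

```sql
-- Upload the local file to the table's internal stage.
PUT file:///tmp/customers.csv @%CUSTOMERS;

-- Bulk-load the staged file into the table.
COPY INTO CUSTOMERS
  FROM @%CUSTOMERS
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```

COPY INTO loads files in parallel from the stage, which is why it is orders of magnitude faster than row-by-row ODBC inserts.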
8 - Asteroid

I figured that out after quite a struggle. Thanks (the admins never told me about the bulk feature).

Alteryx Partner

"If you ever encounter the error “expecting TIMESTAMP_NTZ(9) but got TIMESTAMP_LTZ(9)” when trying to write to a Snowflake data warehouse, the solution is to add ALTER SESSION SET CLIENT_TIMESTAMP_TYPE_MAPPING = TIMESTAMP_NTZ; to the Pre-Create SQL Statement."


It seems like my problem has a solution, but I am unfamiliar with Pre-Create SQL Statements. Where would I find this in Alteryx? Do you have instructions for how to do this?




Alteryx Partner
@Rafter It is really simple: in your Output Data tool, just click the highlighted ellipsis (...) to insert the code that will tell Snowflake the incoming format:
8 - Asteroid

Select your Output tool on your workflow to show the Configuration pane.  Click the ellipsis for the Pre Create SQL Statement option to enter this code.



5 - Atom


How do you schedule event-based dependencies?  For example, I use Alteryx to load the "raw" files into a staging area.  In a separate workflow, I read from that raw area and transform the table(s) into a dimensionally modeled area.  I have tried the CReW macros; however, they do not seem to recognize the activity from S3 into the staging area, thus running the dimensional jobs too soon.  Is there another option/method?


Thanks so much!!



Alteryx Partner

Version 2020.3 of Alteryx now offers a Snowflake bulk loader through the internal stage! It works great and is MUCH easier to configure than through S3. Enjoy!

Here is the help page: