Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.

Having trouble writing workflow to Snowflake

kpontarollo
8 - Asteroid

Hello all!

 

I have been running an Alteryx workflow (Designer) that writes 2.8 million rows to Snowflake. Previously, I was not using bulk upload, and it was taking 24 hours just to get to 75%.

 

Now when I do a Snowflake bulk load, it gets to 100% quickly, but the query never finishes.

 

 

I attached an image of what I am seeing; it has been stuck at this view for about 35 minutes. Any insight as to what I can do, or why this is happening, would be helpful.

 

8 REPLIES
apathetichell
18 - Pollux

1) Turn off AMP.

2) Are you staging locally via bulk load, or using S3?

3) Can you try Data Stream In vs. Output Data?

4) Do you see the query ID in your Snowflake UI? What does it say there?

cmcclellan
13 - Pulsar

So in your Output Data tool you're using bulk load to Snowflake?

I use my own macro (that I developed back when Alteryx didn't use internal staging).  Snowflake should load 2.8 million records in a few seconds if you've done it the right way, but a single INSERT for each record will take an amazingly long time (and high cost) because that's how Snowflake works.
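To make the contrast concrete, here is a minimal sketch of the two statement shapes (the table and column names `MY_TABLE`, `id`, `amount` are hypothetical; real loaders would also use bind parameters rather than string formatting):

```python
# Sketch: per-row INSERTs vs. one batched INSERT statement.
# Each per-row statement is a separate round trip to Snowflake;
# the batched form ships all values in a single statement.

def per_row_inserts(rows):
    """One INSERT statement per row -- one call/query each."""
    return [
        f"INSERT INTO MY_TABLE (id, amount) VALUES ({r[0]}, {r[1]})"
        for r in rows
    ]

def single_multi_row_insert(rows):
    """One INSERT statement carrying every row -- one call total."""
    values = ", ".join(f"({r[0]}, {r[1]})" for r in rows)
    return f"INSERT INTO MY_TABLE (id, amount) VALUES {values}"

rows = [(1, 9.99), (2, 19.99), (3, 4.50)]
print(len(per_row_inserts(rows)))                      # 3 statements
print(single_multi_row_insert(rows).count("INSERT"))   # 1 statement
```

At 2.8 million rows, the per-row pattern means millions of round trips, which is why a non-bulk load can run for a day while a bulk load finishes in seconds.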

 

 

apathetichell
18 - Pollux

We should probably ask what's going on upstream. Is there something this is waiting on?

kpontarollo
8 - Asteroid

Hi! @apathetichell thanks for the help!

 

1. I tried turning off AMP, and it still appears to have the same problem.

2. I am staging locally... I am the only person in my company who uses Alteryx, so there is no S3.

3. When you say Data Stream In... do you mean to then output to Snowflake with what?

4. In Snowflake, I can see the table name, but there is 0 data in it.

 

 

Any further insight is greatly appreciated! This has been driving me nuts!

kpontarollo
8 - Asteroid

Nothing that it's waiting on; it's just an Alteryx DB to Snowflake.

kpontarollo
8 - Asteroid

What is a "single insert" and how would I know if I am mistakenly doing that?

apathetichell
18 - Pollux

A single insert is non-bulk-loaded Snowflake data, i.e. each row is a distinct API call/query. If you limit your sample size to 100 and set up logging in your 64-bit ODBC driver, you will be able to open your logs and see either one call to write your data (with one INSERT statement containing all values) or 100 separate INSERT statements.
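A quick way to apply that check is to count INSERT statements in the trace file. A minimal sketch, assuming the trace is plain text that echoes the executed statements (the path and exact log format depend on your ODBC driver's tracing settings, so adjust accordingly):

```python
# Sketch: count INSERT statements in an ODBC trace file to distinguish
# bulk loading from row-by-row inserts. The trace path and format are
# assumptions -- check how your driver writes its log.
from pathlib import Path

def count_inserts(trace_path):
    """Count case-insensitive occurrences of 'INSERT INTO' in the trace."""
    text = Path(trace_path).read_text(errors="ignore")
    return text.upper().count("INSERT INTO")

# With a 100-row sample: ~100 hits suggests row-by-row inserts;
# one hit (or none, with a COPY INTO instead) suggests a bulk path.
```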

 

cmcclellan
13 - Pulsar

Can you post more details about your Output Data tool configuration please?

 

In Snowflake terms, you can't stage locally (as far as I know); there's only internal staging or external staging.
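For reference, an internal-stage bulk load is a two-step pattern in Snowflake SQL: `PUT` uploads a local file to a stage, then `COPY INTO` loads it in bulk. A minimal sketch (the file path, table name, and file format options are hypothetical):

```sql
-- Upload a local file to the table's internal stage, then bulk load it.
PUT file:///tmp/mydata.csv @%MY_TABLE;
COPY INTO MY_TABLE
  FROM @%MY_TABLE
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```

The Alteryx bulk loader automates roughly this pattern behind the scenes, which is why "staging locally" in the Output Data tool still ends up using Snowflake's internal staging.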

 
