
SOLVED

Problems loading data using Snowflake Bulk Loading Tool

danielkresina
9 - Comet

I've been trying to use the new Snowflake bulk loading utility in Alteryx.

 

I'm using a US East 1 instance of AWS for Snowflake and my S3 bucket.  I'm using a simple workflow in Alteryx that takes a single column/value of data and tries to bulk insert it into a new table in Snowflake.  When I run it, Alteryx errors out with the message below stating that the COPY failed due to a SQL compilation error.

 

    Output Data (6)    The COPY failed with error: SQL compilation error:
    Object '"DB1.PUBLIC.TEST4"' does not exist.
   

If I log into Snowflake separately, I can confirm that the table was successfully created and is sitting there empty, so the error message does not appear to be correct.
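Note the quoting in the error text: the entire dotted name is wrapped in one pair of quotes, and Snowflake takes a quoted identifier literally, so it looks for a single table named DB1.PUBLIC.TEST4 rather than table TEST4 in schema PUBLIC of database DB1.  A minimal sketch of that behavior with the Python connector (snowflake-connector-python); the object names come from the error above, and the account/credentials are placeholders:

    # Sketch: how Snowflake resolves quoted vs. unquoted dotted names.
    # Account and credentials below are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="MY_USER",          # placeholder
        password="MY_PASSWORD",  # placeholder
        account="my_account",    # placeholder
        database="DB1",
        schema="PUBLIC",
    )
    cur = conn.cursor()

    # Unquoted dotted name: parsed as database.schema.table -- resolves fine.
    cur.execute("SELECT COUNT(*) FROM DB1.PUBLIC.TEST4")

    # Quoting the WHOLE dotted name makes Snowflake look for one table
    # literally called 'DB1.PUBLIC.TEST4', which reproduces the same
    # "does not exist" compilation error.
    try:
        cur.execute('SELECT COUNT(*) FROM "DB1.PUBLIC.TEST4"')
    except snowflake.connector.errors.ProgrammingError as err:
        print(err)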

 

There is no encryption enabled on my S3 bucket, and I'm using the default endpoint choice in the bulk loading tool configuration.  I tried picking different endpoints, as well as changing the Table/Field name SQL style in Alteryx's output tool to "Quoted" and "None", and nothing changes the error.  Has anyone else encountered this?

 

Here is my Alteryx output tool configuration:

[Screenshot attached: Capture.JPG]

 

I confirmed that it is not a permission issue in Snowflake: I am able to insert data into this table using the same user ID via SQL in the Snowflake console.

 

If someone has gotten the new Snowflake bulk loader to work correctly, could you share a screenshot of the settings you used so I can try to replicate them?  Thanks!

 

11 REPLIES
fpinchon
8 - Asteroid

I am running into the same issue...

I tried us-east-1 (N. Virginia) and us-west-2 (Oregon), but I keep running into the same error again and again:

Error: Output Data (5): The COPY failed with error: We were unable to remove some files from S3 after the copy operation!10Error from AWS: Access Denied
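The "unable to remove some files" part suggests the keys can write to the bucket but not delete from it, since the bulk loader cleans up its staged files after the COPY.  A quick boto3 sketch to verify delete rights on the staging bucket (bucket name, key, and credentials are placeholders):

    # Check that the AWS keys the bulk loader uses can both write and
    # delete objects in the staging bucket. All names are placeholders.
    import boto3

    s3 = boto3.client(
        "s3",
        aws_access_key_id="ACCESS_KEY",      # placeholder
        aws_secret_access_key="SECRET_KEY",  # placeholder
    )

    bucket = "my-staging-bucket"             # placeholder
    key = "alteryx-permission-test.txt"

    s3.put_object(Bucket=bucket, Key=key, Body=b"test")  # needs s3:PutObject
    s3.delete_object(Bucket=bucket, Key=key)             # needs s3:DeleteObject
    print("Put and delete both succeeded; cleanup should work.")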

 

Online help is not very useful:

https://help.alteryx.com/2018.3/DataSources/Snowflake.htm?Highlight=bulk

danielkresina
9 - Comet

We finally figured out the issue.

 

With every other database I've ever connected to in Alteryx, you need to use a fully qualified table name in your output tool (i.e., <DATABASE>.<SCHEMA>.<TABLE>).  Apparently, since the Snowflake bulk connector includes the database and schema names in the configuration of the connection, you can only write to that one schema anyway, so when configuring your output tool you have to state the TABLE name alone, without any database or schema in front of it.
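The equivalent pattern, sketched with the Python connector to make the resolution explicit (credentials are placeholders, and the stage name is hypothetical):

    # Database and schema are fixed on the connection, just like the
    # Alteryx DSN, so the COPY targets the bare table name.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="MY_USER",          # placeholder
        password="MY_PASSWORD",  # placeholder
        account="my_account",    # placeholder
        database="DB1",          # set once per connection...
        schema="PUBLIC",         # ...so every statement resolves here
    )
    cur = conn.cursor()

    # Bare table name only -- no DB1.PUBLIC. prefix in front of it.
    cur.execute("COPY INTO TEST4 FROM @my_s3_stage")  # @my_s3_stage is hypothetical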

 

When I made that change, everything worked.

 

Going forward, it would be nice if Alteryx could make this connector work like the others, since this approach forces you to set up a whole new connection for each schema that you want to write to.  It may be a limitation in the Snowflake driver...

danielkresina
9 - Comet

@fpinchon, I've begun receiving the same error for some reason.  This was working fine for me last month, and now that I've returned to use it, I get the error about being unable to remove files.

 

What was your solution to this error?

fpinchon
8 - Asteroid

I found out then that us-east-1 was actually spelled differently by AWS than the default spelling in the Alteryx connector:

us.east-1

 

Perhaps they corrected it?

srikant96
9 - Comet

I did not see this solve the problem on my end; I still get the bizarre error:

 

Error: Output Data (7): The COPY failed with error: We were unable to remove some files from S3 after the copy operation!10Error from AWS: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.

srikant96
9 - Comet

I had this issue.  I resolved the errors, but the table is still empty.

 

The default endpoint worked fine for the Amazon S3 Upload tool, and the Amazon S3 Download tool is fine too.

 

But the Alteryx output-to-table tool incorrectly coded the region as us-west-1 instead of the default us-west-2, so I changed the default endpoint to s3-us-west-2.amazonaws.com (only in the output tool in which you enter the DSN entry, etc., via ODBC).
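If you're not sure which region a bucket actually lives in, you can ask S3 directly and derive the endpoint from the answer.  A sketch with boto3 (the bucket name is a placeholder):

    # Look up the staging bucket's real region and build the dashed
    # endpoint form used in this thread. Bucket name is a placeholder.
    import boto3

    s3 = boto3.client("s3")
    resp = s3.get_bucket_location(Bucket="my-staging-bucket")

    # S3 quirk: get_bucket_location returns None for us-east-1.
    region = resp["LocationConstraint"] or "us-east-1"

    # us-east-1 uses the plain endpoint; other regions use the dashed form.
    endpoint = "s3.amazonaws.com" if region == "us-east-1" else f"s3-{region}.amazonaws.com"
    print(region, endpoint)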

 

However, the bad news now is NO DATA... I tried many tables, no luck.

srikant96
9 - Comet

I was able to get over this error by setting the endpoint in the Alteryx output tool to 's3-us-west-2.amazonaws.com'.

 

The default actually created an incorrect string:

 

snowbl:DSN=DEV-SNOWFLAKE;UID=xxxxx;PWD=__EncPwd1__;Bucket=yyyyy;Access=zzzzzzz;Secret=bbbvvvdd;URL=s3.amazonaws.com;Region=us-east-1;Signing=V4|||table_name

 

I changed the endpoint to s3-us-west-2.amazonaws.com

 

and the URL changed to URL=s3-us-west-2.amazonaws.com;Region=us-west-2;

 

It works fine.  But bad luck... still no data in the table.

amti
7 - Meteor

Hi,

 

I'm using Alteryx version 2020.4.xxx, and I'm pretty sure this worked for me before.

I have tried the bulk load to SF with the fully qualified name and without it, and it doesn't work either way.

I use local staging, so no bucket name or location issues for me.

The problem is that the bulk loader seems to first perform a SELECT * FROM the table, and it fails because, obviously, the table doesn't exist.  This also happens when "Create table" is selected instead of the "Overwrite table (drop)" option.

It's not a permissions issue, because I'm using the same user, without admin rights, on that schema.  I also tried to create a dummy table with just one field, but then it can't recreate the table with the input fields...

I don't know why it doesn't continue on the SELECT failure and just create the table with the fields coming from the stream, as I would expect.
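One workaround sketch (not the Alteryx internals): pre-create the target table before the workflow runs, with columns matching the incoming stream, so the bulk loader's initial SELECT finds something.  As noted above, a one-field dummy table wasn't enough, so the column list has to mirror your stream; all names and types below are hypothetical placeholders:

    # Pre-create the target table so the loader's initial SELECT succeeds.
    # Credentials, table name, and column definitions are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="MY_USER",          # placeholder
        password="MY_PASSWORD",  # placeholder
        account="my_account",    # placeholder
        database="MY_DB",        # placeholder
        schema="MY_SCHEMA",      # placeholder
    )
    conn.cursor().execute(
        # IF NOT EXISTS makes this safe to run before every load; the
        # columns must match the fields coming from the Alteryx stream.
        "CREATE TABLE IF NOT EXISTS MY_TABLE (ID NUMBER, VALUE VARCHAR)"
    )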

 

Has anybody had this issue of "Object doesn't exist or it's not authorized"?

This is an app, but I am able to run the debug as a workflow, and the issue is the same.

I also noticed that my bulk loader doesn't show the Chunk Size and the last fields in the context menu... Is this normal?

aka
5 - Atom

Hey @danielkresina, I am facing the same issue.  I have the file present in S3, which is also the Snowflake staging area.  I am getting an error that the table does not exist, when actually it is present.  us-east-1 is the region that I am loading into.
