It would be really helpful to have a bulk load 'output' tool for Snowflake, with functionality similar to what is available with the Redshift bulk loader. Currently it takes a really long time to insert via ODBC, or you would have to write a custom solution to get this to work.
This article explains the general steps but some of the manual steps outlined would have to be automated to arrive at a solution that is entirely encapsulated within a workflow.
@BradW Are there any plans to be able to use Snowflake's built-in staging area? I don't think it's uncommon to have folks like me that have Snowflake without access to another S3 bucket. We're trying to use Snowflake so we don't have to mess with AWS directly. Love the direction Alteryx is going in though!
Thanks for the feedback! There aren't currently plans to use Snowflake's built-in staging area, but that's exactly the type of feedback we're hoping for, so that we can add it to Snowflake bulk if there is enough demand!
I absolutely agree that there should be an option to use the internal Snowflake stage. Otherwise users are forced to have a separate Amazon S3 account (which Snowflake by itself does not require) and pay for those additional storage and miscellaneous costs. Thanks!
I agree on the built-in staging as well. Table stages are very simple to code, since each table in Snowflake has its own; to use one, you just add @% in front of the table name...
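To illustrate how little is involved (a hedged sketch: the table name MY_TABLE, the local file path, and the file format settings are all hypothetical placeholders), loading through a table stage is essentially two statements:

```sql
-- Upload a local file to the table's built-in stage (@%MY_TABLE).
-- No external S3 bucket is needed; the stage comes with the table.
PUT file:///tmp/my_data.csv @%MY_TABLE;

-- Load the staged file into the table. When FROM is omitted,
-- COPY INTO defaults to the table's own stage.
COPY INTO MY_TABLE
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```

A connector using this path would only need the Snowflake credentials the user already has, with no separate AWS configuration.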
+1 on utilizing the built-in Snowflake stage. Having to provision a separate S3 bucket is just an additional step before an end-user can get started, and isn't always possible due to internal controls or fiscal constraints. Please! :)
Another vote for built-in snowflake stage
One more vote for built-in snowflake staging. Is there an expected timeline for when the non-beta version will be released?
Snowflake Bulk on AWS has been released. Please start a new Idea thread for further bulk functionality requests. I do see that a lot of people are asking for bulk via built-in staging, which we'll look into, but it would also be very helpful to track that as a separate request!
Thanks for all of the feedback!
This is definitely progress. But S3 is so finicky for analysts like me that a version of the connector using Snowflake's internal staging area is really a MUST HAVE.
I am experiencing config issues with the new S3 connector, and I am not alone:
Is there a way to provide better help than this:
or at least some visibility into the script that the connector is generating?
Another ask would be to make the data connection editable! There are so many fields to input, and once you press Save, the connection can no longer be edited. A real nightmare when trying to debug that connection by trying different combinations of settings...