I have several questions about connecting to Snowflake from Alteryx.
1. Bulk loader without primary key field
Would it be possible to upload data into Snowflake using the bulk loader when the data does not include the primary key field?
Background: the target table has a primary key column defined with AUTOINCREMENT in Snowflake.
2. Insert new data into snowflake using bulk load
It seems like "Update: Insert if new" is not available for bulk load. In that case, "Append Existing" is probably the best option. If there are any suggestions, please let me know!
3. Modifying data by filling in values
What is the best way to modify data in Snowflake?
Some rows are missing values, and I would like to fill them in. Would it be better to delete the affected rows and append the corrected data? Or modify the data with a SQL statement in the Alteryx Output tool? Or clone the original table, create a modified table, and replace the original?
4. Normal Input/Output tools vs. In-Database connection tools
What are the advantages/disadvantages of using the In-Database connection tools?
1) Yes, I'm pretty sure you can do this. In the file format options you can specify that the PK field is not in the file, and Snowflake will generate the value. I can dig out some details from an old project if you need more info 🙂
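Roughly like this, as a sketch (the table, stage, and column names here are made up for illustration):

```sql
-- Hypothetical target table: the PK is generated by Snowflake
CREATE TABLE orders (
    order_id  INTEGER AUTOINCREMENT PRIMARY KEY,
    customer  VARCHAR,
    amount    NUMBER(10,2)
);

-- The bulk load lists only the non-PK columns;
-- Snowflake fills order_id from the autoincrement sequence
COPY INTO orders (customer, amount)
FROM @my_stage/orders.csv
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```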
2) I don't think Snowflake supports UPSERT when loading. Usually I've bulk loaded into an empty staging table and then used SQL to achieve the same thing.
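The staging-table pattern can be sketched with a MERGE, which emulates "Update: Insert if new" in one set-based statement (again, the table and column names are assumptions):

```sql
-- Step 1: bulk load the new data into an empty staging table (fast).
-- Step 2: MERGE the staging table into the target:
--   matching keys are updated, new keys are inserted.
MERGE INTO orders AS t
USING orders_stage AS s
    ON t.order_id = s.order_id
WHEN MATCHED THEN
    UPDATE SET t.customer = s.customer, t.amount = s.amount
WHEN NOT MATCHED THEN
    INSERT (order_id, customer, amount)
    VALUES (s.order_id, s.customer, s.amount);
```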
3) Depends on your situation. Snowflake isn't designed for single-record maintenance: you can do it, but don't try it for a large number of records and expect good performance. It can take seconds to bulk load millions of rows, but hours to edit a thousand records individually. If you understand how Snowflake works, you can design a really fast workflow.
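For the fill-in-missing-values question specifically, one bulk-friendly approach is to stage only the corrected rows and apply them in a single set-based UPDATE rather than row by row (hypothetical names again):

```sql
-- fixes_stage holds one row per correction (key + new value),
-- bulk loaded in one COPY INTO. A single UPDATE then applies
-- all corrections at once instead of thousands of point updates.
UPDATE orders t
SET amount = s.amount
FROM fixes_stage s
WHERE t.order_id = s.order_id
  AND t.amount IS NULL;
```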
4) The same as with every other database: In-DB means the data stays in the database rather than streaming through Alteryx. It's VERY fast and efficient, and if you're just transforming data (not loading or unloading), In-DB will make things really fast.
Regarding @cmcclellan's point 3: "It can take seconds to bulk load millions of rows, but hours to edit a thousand records individually. If you understand how Snowflake works then you can design a really fast workflow"
This is a very interesting insight! Snowflake is optimised for bulk operations and analytics, not for row-by-row transactional work.