This seems to be a driver issue, so you could try different drivers, or Snowflake may release one in the future that handles these characters. For troubleshooting, you could test which characters are throwing the error and remove those. In terms of reading into Alteryx, there's no real workaround for connecting directly to Snowflake, since that depends on the driver. Alteryx itself can handle Chinese characters and emojis, so if there's another way to read from Snowflake without losing those characters, they can definitely be read into Alteryx.
We needed to replace the delimiter used in the csv file for bulk loading.
You can load the data either way; leave it as is, but make sure to remove the delimiter character from your data before bulk loading. If you're using the regular Output Data tool with Snowflake bulk loading, the delimiter is the character 0x01, and you can remove it with: regex_replace([_CurrentField_],"\x01",""). If you're using the bulk loading macro, the delimiter is whatever you picked (I was using "|"), so it would be regex_replace([_CurrentField_],"\|","") — note the backslash, since "|" is a regex metacharacter and must be escaped.
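To illustrate what those expressions do, here is a small Python sketch of the same cleanup, stripping the delimiter character from each field before the file is handed to the bulk loader. The function name `strip_delimiter` is just for illustration, not part of Alteryx or Snowflake:

```python
import re

def strip_delimiter(value: str, delim: str) -> str:
    """Remove every occurrence of the bulk-load delimiter from a field.

    re.escape handles metacharacters like "|", mirroring the need to
    write "\\|" in the Alteryx regex_replace expression.
    """
    return re.sub(re.escape(delim), "", value)

# 0x01 delimiter used by the Output Data tool's bulk load
print(strip_delimiter("hel\x01lo", "\x01"))  # hello
# "|" delimiter chosen in the bulk loading macro
print(strip_delimiter("a|b|c", "|"))         # abc
```

The same idea applies to any delimiter you pick: escape it, then replace it with an empty string in every text field.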
Then, to read the data back in, convert it to binary in the query: Select TO_BINARY(<database>."PUBLIC"."<Table>"."<Column>",'UTF-8') As Content_Binary From <database>."PUBLIC"."<Table>"
Then use the Blob Convert tool to convert the binary field back to text.
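The reason this round trip preserves the characters is that the raw UTF-8 bytes pass through untouched; only the final decode reinterprets them as text. A minimal Python sketch of the same idea (this stands in for TO_BINARY on the way out and Blob Convert on the way back, it is not Snowflake or Alteryx code):

```python
# Text containing characters the driver mangles: Chinese plus an emoji.
text = "\u4f60\u597d \U0001f600"

# Like TO_BINARY(column, 'UTF-8'): text becomes raw bytes.
binary = text.encode("utf-8")

# Like the Blob Convert tool: bytes decoded back to text.
restored = binary.decode("utf-8")

print(restored == text)  # True — nothing was lost in the round trip
```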