I need some help writing a table to Azure Databricks. I've been able to write smaller tables using the In-DB tool, but there appears to be a 2 GB size limit within Databricks that is preventing me from writing larger tables (I receive an "Error from Databricks" message). Is there a workaround?

I recently downloaded version 2023.1.1.5 and started experimenting with the Databricks Delta Lake Bulk Loader (Avro) for writing, but without much luck (this could be due to my Shared Key being incorrect). If I can't write directly to Databricks, is there an intermediate destination I could write my large tables to from Alteryx, such as Blob Storage or something along those lines? Any help would be much appreciated!
Thanks!