Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.

Databricks Data Stream In Tool found bug

ChristinaFPE
7 - Meteor

I'm trying to write to a Databricks table. I've created the In-DB connection, and made a successful connection. I'm able to use the Connect In-DB tool to read the table I want. I can write to the table using the Write Data In-DB tool. However, when I try to write to the table using the Data Stream In tool, I get an error saying I've found a bug. Why can I write with the Write Data In-DB tool but not the Data Stream In tool?

 

I receive the same "found bug" error no matter how I configure the Data Stream In tool. I've tried all three drop-down options: "Create Temporary Table", "Create New Table", and "Overwrite Table (Drop)". For "Create New Table" and "Overwrite Table (Drop)", I have tried entering just the table name, the schema.table name, and the database.schema.table name.

 

(Attached screenshot: FoundBug.png)

apathetichell
19 - Altair

Yeah - they'll ask you to enable trace logs. What happens if you delete the Write Data In-DB tool, so you are only using Data Stream In?

Mwoody_05
7 - Meteor

Hi there, no matter what I do, I keep getting the error "Invalid Object Name". I'm using the Data Stream In tool.

Do I need to put https:// in front of the Host or HTTP Path?

It says I have a successful connection, but I don't know what's wrong because the error is so vague.

Thank you in advance


apathetichell
19 - Altair
Mwoody_05
7 - Meteor

Hi, what data source are you using, 'Databricks Unity Catalog' or 'Databricks'?

I only see the CSV option under the Databricks data source.

 

(Attached screenshot: Screenshot 2024-08-06 142946.png)

apathetichell
19 - Altair

Databricks - have you googled here? I've written ad nauseam on the setup that works for me, on various threads here. 1) You need to set up your connection, so scroll down to Databricks in the connections list. 2) For writing, you will need to configure your token/password AGAIN under the connection arrow on the right. I do not use DCM. I only take data in via In-DB, and I only write via Data Stream In. Your use cases may be vastly different. I bulk write locally, but I believe that S3 writing is more efficient. Message me with more information on your setup if you have any questions - it's definitely more efficient than posting here.
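If it helps to sanity-check the connection outside Alteryx first, here's a minimal pyodbc sketch, assuming the Simba Spark ODBC driver is installed. The host, HTTP Path, and token below are placeholders, not a real workspace - swap in your own values:

```python
# Build a Simba Spark ODBC connection string for Databricks.
# Every value here is a placeholder.
conn_str = (
    "Driver=Simba Spark ODBC Driver;"
    "Host=dbc-xxxxxxxx.cloud.databricks.com;"  # no https:// prefix on the host
    "Port=443;"
    "HTTPPath=/sql/1.0/warehouses/abc123;"     # your SQL warehouse HTTP path
    "SSL=1;"
    "ThriftTransport=2;"                       # HTTP transport
    "AuthMech=3;"                              # username/password auth
    "UID=token;"                               # literal word "token" for PAT auth
    "PWD=dapi-xxxxxxxx;"                       # personal access token
)

# Once the placeholders are filled in, test it with:
# import pyodbc
# with pyodbc.connect(conn_str, autocommit=True) as conn:
#     print(conn.cursor().execute("SELECT 1").fetchone())
```

Note the Host goes in without a scheme - putting https:// in front of it is a common cause of "unable to reach endpoint" style errors.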

Luipiu
6 - Meteoroid

Does anyone have a solution to this problem?

 

My error is more like: Error: Data Stream In (3): Error from Databricks: Unable to reach URL Endpt dbc-xxxxxxxx.cloud.databricks.com; HTTP response code said error

 

I enabled logging for the Simba Spark ODBC driver and found this error: ERROR 6604 Simba::ODBC::Connection::SQLSetConnectAttr: [Simba][ODBC] (11470) Transactions are not supported

 

Then I tried adding a server-side property, autocommit=true, in the Advanced Options of the ODBC driver, but I always catch the same error.
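From my reading of the Simba driver docs (so this may be wrong): server-side properties are Spark session settings passed with an `SSP_` prefix, while autocommit is an ODBC connection attribute, so setting it as a server-side property is probably ignored. A hypothetical DSN fragment for comparison:

```ini
; Hypothetical Simba Spark DSN fragment (odbc.ini) -- placeholder name
[DatabricksDSN]
; Server-side Spark properties use the SSP_ prefix, e.g.:
SSP_spark.sql.session.timeZone=UTC
; autocommit is not a Spark property, so passing it as a
; server-side property likely has no effect on this error
```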

 

Thanks

 

apathetichell
19 - Altair

Can you access your tables In-DB? Try reading In-DB first, before writing. Can you also confirm that 'Enable CTAS translation' is unchecked? Can you test in the 64-bit ODBC Administrator? Is it the connection? Can you share your settings there?

Luipiu
6 - Meteoroid

Thanks for your answer

 

We tried several settings, CTAS translation enabled or not. We can write to Databricks using the Write Data In-DB tool, and it works correctly, but when I need to read from a file I have to use Data Stream In. We encountered the same problem described by Christina, but with a different error message. Perhaps it's an ODBC driver issue; the ODBC log says "Transactions are not supported".

Furthermore, using the standard Input/Output tools with an ODBC DSN, we catch the error "Not Supported".

apathetichell
19 - Altair

Do you have a Write Data In-DB after your Data Stream In, in the same process? I vaguely remember hitting some issues when using both of these in the same process/workflow. Can you test whether your Data Stream In works by itself? Can you post your settings for Data Stream In? How are you configuring the table type (if it's permanent, switch to temp; if it's temp, switch to permanent)? Can you confirm that your table name is set up correctly?

Luipiu
6 - Meteoroid

Hi, we catch the same error with both the Databricks and Databricks Unity Catalog data sources; only the error message changes.

Using the Databricks connector, which uses a DSN, we tried several ODBC settings without success - everything enabled/disabled, some options enabled, and so on.

The workflow is very simple: an Input Data tool followed by the Data Stream In tool.

 
