
Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

DSN/ODBC connection

Shaaz
9 - Comet

I have a workflow that is connected to a PostgreSQL server using a standard ODBC connection, as highlighted below in yellow.

Shabaz_0-1646389831055.png

 

How do I change the workflow from ODBC to Bulk?

Will the existing ODBC connection still work?

Kindly help me with the steps to change my workflow from ODBC to Bulk.

6 REPLIES 6
TheOC
16 - Nebula
16 - Nebula

Hi @Shaaz 

There's a little bit of documentation here:
https://help.alteryx.com/20214/designer/configure-posgresql-bulk-connection-writing-data

But by the sounds of it, you can use the same connection for bulk as for ODBC; just select Bulk instead of ODBC.


Cheers,
TheOC
apathetichell
20 - Arcturus

Does that work with In-DB? I don't see an option for bulk write configuration on my Greenplum In-DB setup.

danilang
19 - Altair
19 - Altair

Hi @apathetichell 

 

The bulk option is only available on Output Data tools, except for Teradata, which also has a bulk import option for some reason. The bulk setting is implemented on the Alteryx side and has to do with the way the data is written. In non-bulk mode, Designer checks the primary key for inserts and updates on a record-by-record basis; this is how Designer determines whether to insert or update any specific record, but it slows the writing process down since the records cannot be batched. In bulk mode, the records are sent in batches, which can be thousands of times faster. The drawback is that you have to determine yourself which records can be updated and which can be inserted. If you get this step wrong, the database will throw PK-related errors, which will stop your workflow.
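Designer's engine isn't scriptable here, but the difference described above can be sketched in plain Python with sqlite3 (an illustrative analogy, not Alteryx's actual implementation): non-bulk mode checks the primary key one record at a time before deciding to insert or update, while bulk mode sends the whole batch at once.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
rows = [(i, f"v{i}") for i in range(5000)]

# Non-bulk style: check the PK for every record, then insert or update.
start = time.perf_counter()
for pk, val in rows:
    exists = conn.execute("SELECT 1 FROM t WHERE id = ?", (pk,)).fetchone()
    if exists:
        conn.execute("UPDATE t SET val = ? WHERE id = ?", (val, pk))
    else:
        conn.execute("INSERT INTO t (id, val) VALUES (?, ?)", (pk, val))
row_by_row = time.perf_counter() - start

conn.execute("DELETE FROM t")

# Bulk style: send the whole batch in one call. A PK conflict here would
# raise an IntegrityError and abort, analogous to the PK errors that stop
# a workflow when bulk mode gets the insert/update split wrong.
start = time.perf_counter()
conn.executemany("INSERT INTO t (id, val) VALUES (?, ?)", rows)
bulk = time.perf_counter() - start

print(f"row-by-row: {row_by_row:.4f}s, bulk: {bulk:.4f}s")
```

The batched path skips the per-record existence check entirely, which is where the speedup comes from.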

 

The bulk option doesn't apply the same way to In-DB processes. The Data Stream In tool either creates a temp table, creates a new table, or overwrites (drops) an existing table, so these writes can always be batched. The only issue arises with the overwrite option, if the data you're streaming in doesn't satisfy the primary key on the target table. The Data Stream Out tool performs a read operation, which is always batched. All other operations are In-DB and managed by the backend database.

 

Dan

Shaaz
9 - Comet

Hi, thanks...

I made a successful connection to Postgres using the Bulk option.

However, when I run the workflow, it completes without any errors, but upon checking the DB table, there are zero records.

Any idea how to fix this?

Shaaz
9 - Comet

I made the change below in the workflow and it worked:

The column order and column sizes in Alteryx should be in sync with the DB table.
