Hi,
I have a workflow that pulls data from a SQL database and loads it into PostgreSQL.
I am using the Input Data tool for the SQL source and the Output Data tool for PostgreSQL. The data is only around 8K to 9K records, yet the run takes 25 to 28 minutes, which seems far too slow. I know it is hard to suggest anything without seeing the workflow, but does anyone have tips on what I should check?
I appreciate any help you can provide.
Are you using Greenplum as your connection? Are you using the PostgreSQL bulk loader? https://help.alteryx.com/20231/designer/configure-postgresql-bulk-connection-writing-data
Is your Postgres instance hosted in the cloud? Is there a VPN in between, and are there any VPN traffic issues? I'd start by looking into the bulk writer. Some types of databases require an API call per row (Redshift, I'm looking in your direction), and loads can take an eternity if you aren't using a bulk loader, while others (Databricks) can be quite fast...
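To illustrate the per-row vs. bulk difference described above, here is a minimal sketch. The table name, column names, and connection string are hypothetical placeholders; the commented-out psycopg2 call shows how the same buffer would feed a single COPY, which is roughly what a bulk loader does under the hood.

```python
import csv
import io

# Stand-in for the 8-9K records being moved (values are made up).
rows = [(1, "alpha"), (2, "beta"), (3, "gamma")]

# Slow pattern: one INSERT statement, often one network round trip, per row.
insert_stmts = [
    "INSERT INTO staging_table (id, name) VALUES (%s, %s)" for _ in rows
]

# Fast pattern: stream every row as CSV into one COPY payload.
buf = io.StringIO()
csv.writer(buf).writerows(rows)
buf.seek(0)

# With psycopg2 (hypothetical connection details), the whole buffer
# would be sent to the server in a single COPY command:
# conn = psycopg2.connect("dbname=mydb user=loader")
# with conn.cursor() as cur:
#     cur.copy_expert("COPY staging_table (id, name) FROM STDIN WITH CSV", buf)
# conn.commit()

print(len(insert_stmts))            # round trips needed the slow way
print(buf.getvalue().count("\n"))   # same rows, but one COPY payload
```

The point is not the exact API: whether it is Alteryx's bulk connection or COPY in a script, collapsing thousands of per-row calls into one streamed load is usually where a 25-minute run becomes seconds.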
Also, tell us how your ODBC connection is configured; more information about your ODBC setup would help, because PostgreSQL is one of the fastest databases out there.
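For reference, a DSN entry for the psqlODBC driver in odbc.ini typically looks something like the sketch below; the host, database, and user names here are placeholders, and the exact driver name depends on what is installed:

```ini
[postgres_dsn]
Description = Example PostgreSQL DSN (placeholder values)
Driver      = PostgreSQL Unicode
Servername  = db.example.com
Port        = 5432
Database    = mydb
Username    = loader
```

Checking which driver version the DSN points at, and whether it connects over a VPN or direct link, is a quick way to rule the ODBC layer in or out.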
It may also be a network problem.