PostgreSQL very slow write operations
Hi,
I have a workflow that pulls data from a SQL Server source and loads it into PostgreSQL. I am using the Input Data Tool for the SQL Server source and the Output Data Tool for the PostgreSQL destination. The data is only around 8K to 9K rows, yet the load takes 25 to 28 minutes, which can't be right. I know it is hard to suggest anything without seeing the workflow, but does anyone have tips on what I should check?
I appreciate any help you can provide.
Are you using Greenplum as your connection? Are you using the PostgreSQL bulk loader? https://help.alteryx.com/20231/designer/configure-postgresql-bulk-connection-writing-data
Is your Postgres instance hosted in the cloud? Is there a VPN in between, and if so, are there any VPN traffic issues? I'd start by looking into the bulk writer. Some types of databases require an API call per row (Redshift, I'm looking in your direction), and writes can take an eternity if you aren't using a bulk loader, while others (Databricks) can be quite fast.
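To illustrate why the bulk loader matters so much: a plain INSERT path pays one round trip per row, while a bulk load ships all rows in a single COPY stream. This is only a sketch outside of Alteryx, assuming psycopg2 as the Python driver; the table name in the comment is hypothetical. The function below just builds the tab-separated text format that PostgreSQL's COPY FROM STDIN expects:

```python
import io

def rows_to_copy_buffer(rows):
    """Serialize rows into the tab-separated text format that
    PostgreSQL's COPY ... FROM STDIN expects. Sending one buffer in a
    single round trip is what makes bulk loading fast, versus one
    network round trip (and often one transaction) per row with
    row-by-row INSERTs."""
    buf = io.StringIO()
    for row in rows:
        # \N is COPY's text representation of NULL; escape embedded tabs
        fields = ("\\N" if v is None else str(v).replace("\t", "\\t")
                  for v in row)
        buf.write("\t".join(fields) + "\n")
    buf.seek(0)
    return buf

# With psycopg2 (assumed driver, hypothetical table), the buffer would
# then be loaded in one shot:
#   cur.copy_expert("COPY target_table FROM STDIN", rows_to_copy_buffer(rows))
```

At 8–9K rows, one COPY stream typically completes in seconds, which is why the row-by-row path is the first suspect here.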
Well, tell us how your ODBC connection is configured, and share more detail about it, because PostgreSQL is one of the fastest databases out there for writes.
It may also be a network problem.
Thanks for your reply.
I am using PostgreSQL (ODBC). Please see the two attached files:
1. SQL source OLEDB connection
2. PostgreSQL destination ODBC connection details
I know the bulk load is much faster. However, I don't have permission to use it.
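If the bulk loader is off the table, the next-best lever is cutting round trips by batching: send many rows per statement and commit per batch instead of per row. A minimal, driver-agnostic sketch of the chunking step (batch size and the commented table/column names are illustrative assumptions, not from the thread):

```python
def chunked(rows, size):
    """Yield successive batches of up to `size` rows, so each batch can
    be sent as one multi-row INSERT and one commit instead of
    thousands of single-row transactions."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # trailing partial batch
        yield batch

# Usage with a DB-API cursor (psycopg2 assumed, hypothetical table):
#   for batch in chunked(rows, 1000):
#       cur.executemany("INSERT INTO target_table VALUES (%s, %s)", batch)
#       conn.commit()
```

If your version of the Output Data Tool exposes a transaction/commit size setting, raising it achieves the same effect without any code.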
