Delete rows from a database using Alteryx (looking for a fast method)
Hello,
I am working with a workflow to find duplicates in a database, and now I am trying to delete those duplicates directly from Alteryx.
I know there is a Pre-SQL statement option to run a query, so I have created my query and then added a control parameter to build a macro.
I connect my workflow to the macro to delete the rows. It works, but it is very, very slow.
To give you an idea, I have 500 rows to delete, and it is still running after 1 hour (34% complete).
So I would like to know if there is a faster method, because this one is not usable.
And that is with only 500 rows; I can't even imagine millions of rows.
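Each macro iteration essentially runs a single-row delete along these lines (the table and column names here are just placeholders, not my real ones):
DELETE FROM my_table
WHERE id = '12345';
So it fires one statement per row to delete.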
Thanks for your help.
I'm not quite seeing what you are doing, but it sounds a bit off. Can you sketch out more of what you are doing? Are you deleting the rows in a SQL database? What format is it? I don't know what the macro is for.
Yes, I want to delete specific rows in a SQL database, using a WHERE condition on an ID to choose the rows that I want to delete.
That's why I am using a macro to replace the ID in the query with a control parameter.
I am using exactly the method found in this post:
OK - post your Action tool configuration screens with samples of the values you are feeding in.
Can you clarify what kind of SQL DB you are using?
I am using a Postgres DB. I have attached some screenshots of my workflow.
As I said, it works, but the execution is very, very slow.
My recommendation for this type of processing is to create a stream with the to-be-deleted IDs and insert them into a database table, then use that table to power a SQL delete-with-join as the Post-SQL statement to remove the required records. This becomes a SQL bulk delete, which performs much more efficiently. Here is an example of a MySQL delete statement:
DELETE c
FROM Customer c
JOIN CustomerDeleteTemp ct
  ON c.customerID = ct.customerID;
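If the target is Postgres, as in this thread, the equivalent bulk delete uses DELETE ... USING instead of a JOIN; a sketch with the same placeholder table and column names:
DELETE FROM Customer c
USING CustomerDeleteTemp ct
WHERE c.customerID = ct.customerID;
Either way, the database removes all matching rows in a single statement instead of one round trip per ID.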
This video might help.
This is the simple workflow. The Text Input has the to-be-deleted IDs, and the Output Data has the Post-SQL.
I have a table of 2300 jobs, and I would like to delete 461 of them. They are identified by a filter (isreplacejobs = 1).
The list of jobs to delete goes to the Control Container "tempDelete" input to start the process, and also goes to a Select inside the container, which reduces the fields to just jobid. This creates a table called "tempdelete" with a single field, jobid (from the Select).
Once that table is created, the Control Container is satisfied, and its output starts the next container, "delete jobs". The list of jobs is sent to a Select Records tool that selects 0 rows, so no records pass through. The Count counts zero, which creates a field called Count.
The "jobs then tempdelete" Output (ODBC) creates a new table called "tempdelete", overwriting the previous tempdelete that held the jobids. But before that overwrite happens, the Pre-SQL deletes the records from the jobs table that match the IDs in tempdelete.jobid. tl;dr: delete the jobs by ID and erase your tracks.
Note: this is safe for the workflow. You might need to change tempdelete to something like tempdeletejobs to avoid conflicts with anything external.
