Hello,
I have an output of 35k rows from one database table that I need to use as input for filtering another database table. In the past I handled this by manually pasting the values into an SQL query and searching on them, but back then my data was under 5k rows. Now the process is very time-consuming, especially since Alteryx only lets me paste 1k values at a time for searching. Is there a more efficient way to handle this?
@noopurjain Perform your database search in batches with the help of a batch macro.
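For concreteness, here is a minimal Python sketch of that batching idea outside of Alteryx, using the standard-library sqlite3 module. The database file example.db, table target_table, and column key_col are placeholder names for illustration, not anything from the original post.

```python
import sqlite3

def fetch_in_batches(conn, keys, batch_size=1000):
    """Query the target table one batch of keys at a time, which is
    roughly what an Alteryx batch macro does on each iteration."""
    results = []
    for start in range(0, len(keys), batch_size):
        batch = keys[start:start + batch_size]
        # One "?" placeholder per key in this batch. Note: some older
        # SQLite builds cap bound parameters at 999, so lower
        # batch_size if the driver complains.
        placeholders = ",".join("?" for _ in batch)
        sql = f"SELECT * FROM target_table WHERE key_col IN ({placeholders})"
        results.extend(conn.execute(sql, batch).fetchall())
    return results

conn = sqlite3.connect("example.db")  # placeholder database
matching_rows = fetch_in_batches(conn, keys=list(range(35000)))
```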
so 35K rows isn't much. If this is being matched against a gigantic DB (say, nine figures of rows or more), your best option is to transform your 35K rows into the value list of an IN statement (so yup: your key IN (term1, term2, ..., term35000)) and dynamically feed this to your DB via a Dynamic Input In-DB tool. This lets you return only the matching values from your main DB without creating a temporary table and joining.
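A minimal sketch of this single-IN-statement approach, using the same placeholder names as above and assuming the keys are integers so they can be inlined into the SQL text safely:

```python
import sqlite3

def fetch_with_single_in(conn, keys):
    """Build one big IN (...) list from all the keys and push the
    filter down to the database in a single query, analogous to
    feeding a constructed WHERE clause to a Dynamic Input tool."""
    # int() keeps the inlined values safe to embed in the SQL text;
    # string keys would need proper quoting/escaping instead.
    in_list = ",".join(str(int(k)) for k in keys)
    sql = f"SELECT * FROM target_table WHERE key_col IN ({in_list})"
    return conn.execute(sql).fetchall()

conn = sqlite3.connect("example.db")  # placeholder database
matching_rows = fetch_with_single_in(conn, keys=list(range(35000)))
```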
As above, generating an IN statement and passing it to the DB is going to be efficient. I'm not sure if there's a step in between that you're missing, though. Are you using the In-DB tools for this? It sounds like you may be trying to paste the search terms straight into an Input Tool; you don't need to paste values anywhere.
As @apathetichell mentioned, the IN statement will be more performant than a simple Data Stream In > Join type operation. The Data Stream In will create a temp table, after which you can push the processing built with the In-DB tools down onto the DB engine.
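For comparison, here is a sketch of the temp-table-and-join route that Data Stream In takes, again with the placeholder sqlite3 names from above:

```python
import sqlite3

def fetch_via_temp_table(conn, keys):
    """Load the keys into a temporary table and join against it,
    which is roughly the work that Data Stream In followed by an
    In-DB Join pushes down to the database."""
    conn.execute("CREATE TEMP TABLE IF NOT EXISTS key_list (key_col INTEGER)")
    conn.executemany("INSERT INTO key_list VALUES (?)", ((k,) for k in keys))
    sql = (
        "SELECT t.* FROM target_table AS t "
        "JOIN key_list AS k ON t.key_col = k.key_col"
    )
    return conn.execute(sql).fetchall()

conn = sqlite3.connect("example.db")  # placeholder database
matching_rows = fetch_via_temp_table(conn, keys=list(range(35000)))
```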
The solution mentioned by @binuacs would also work, using a batch macro and a Dynamic Input tool.