Hello,
I want to read a database table that has millions of records. I am trying to read that table in batches (sets of 1,000) so that I can avoid a physical memory error. Here is an outline of the workflow:
1. Input data (retrieving the data from database table having high volume of rows)
2. Output will be again writing to different database
I want to build a workflow that reads records in batches of 1,000, adds/updates them into a different database, and loops until all records are updated. Can you share a sample workflow?
@pawan_bagdiya
Does your data have a primary key? If so, we can do something like the query below, then update with a Batch Macro:
SELECT TOP 10000 ...
WHERE id > @maxid
ORDER BY id
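To show how the keyset-pagination query above drives a batch loop, here is a minimal sketch in Python. It uses in-memory SQLite and a hypothetical `records` table purely for illustration; the table name, column names, and batch size are assumptions, and in practice the source and target would be real database connections.

```python
import sqlite3

BATCH_SIZE = 1000  # records per batch, as in the question

# Hypothetical source and target databases (in-memory SQLite for the sketch).
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

src.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, value TEXT)")
src.executemany(
    "INSERT INTO records (id, value) VALUES (?, ?)",
    [(i, f"row-{i}") for i in range(1, 2501)],  # 2,500 sample rows
)
dst.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, value TEXT)")

max_id = 0  # plays the role of @maxid in the SQL above
while True:
    # Keyset pagination: fetch the next batch strictly after the last seen id.
    rows = src.execute(
        "SELECT id, value FROM records WHERE id > ? ORDER BY id LIMIT ?",
        (max_id, BATCH_SIZE),
    ).fetchall()
    if not rows:
        break  # all records processed

    # Upsert the batch into the target database (add new rows, update existing).
    dst.executemany(
        "INSERT INTO records (id, value) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET value = excluded.value",
        rows,
    )
    dst.commit()
    max_id = rows[-1][0]  # advance the cursor to the last id in the batch
```

Because each batch filters on `id > max_id` with an `ORDER BY id`, memory use stays bounded at one batch regardless of table size, and the loop terminates when a fetch returns no rows.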