Thanks for your reply. Currently I have set the Record ID initial value to 1 in the designer workflow, and it is used as the unique key when inserting records into the target table. The workflow runs whenever data needs to be loaded into the target table.
Each time we run the workflow, we need the Record ID to increment not from the configured initial value but from the max Record ID of the last run. Since I am new to Alteryx, I may not be finding the right approach.
Check out the attached. Personally, I'd first determine the max RecordID from the existing database, then append that to your new data stream. If you create another new RecordID that begins incrementing from 1, you can then simply add it to the max RecordID from the old set.
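The "offset by the existing max" idea above can be sketched in Python with pandas standing in for the Alteryx tools (the table and column names here are illustrative, not from the actual workflow):

```python
import pandas as pd

# Existing target table with previously assigned RecordIDs.
existing = pd.DataFrame({"RecordID": [1, 2, 3], "value": ["a", "b", "c"]})

# New rows to load, not yet keyed.
new_rows = pd.DataFrame({"value": ["d", "e"]})

# 1) Determine the max RecordID already in the target table.
max_id = existing["RecordID"].max()

# 2) Generate a fresh RecordID starting from 1 (like the Record ID tool)...
new_rows = new_rows.copy()
new_rows["RecordID"] = range(1, len(new_rows) + 1)

# 3) ...then add the existing max so the sequence continues across runs.
new_rows["RecordID"] += max_id

print(new_rows["RecordID"].tolist())  # [4, 5]
```

In the Alteryx workflow this corresponds to a Summarize (max) on the existing table, an Append Fields to the new stream, a Record ID tool, and a Formula tool to add the two together.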
Thanks for the info. Now I am facing another challenge. The Record ID I am generating from the max of another lookup table has to be used in two data flows. That is, I have a JOIN condition that splits the data into two pipelines (flows):
one flow writes the data into TABLE_A, and the other flow gets some data from another table and also writes to TABLE_A.
Now I want unique Record IDs to be generated across both flows, since they write to the same target table.
Please see the attached screenshot of the workflow. Data comes from two source tables, which I join first. Next I append the MAX(Seq) from the other table to flow one, and that works fine. But I can't do the same MAX(Seq) append to the second data flow (highlighted in red in the screenshot),
as that would regenerate the same Record IDs (max + 1, max + 2, ...) in the second flow, which I don't want.
Please guide me on how this can be achieved another way.
Hi @SMANE, if I understand correctly, you should be able to take the max ID from data stream 1 (after its IDs are assigned) and append that to data stream 2 in a similar fashion, creating separate unique IDs that continue to increment.
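A minimal sketch of this chaining, again with pandas standing in for the Alteryx tools. The seed value and table contents are made up for illustration; the point is that stream 2 continues from the max assigned in stream 1, not from the original lookup max, so the two flows never overlap in TABLE_A:

```python
import pandas as pd

lookup_max = 10  # MAX(Seq) appended from the lookup table (illustrative value)

stream1 = pd.DataFrame({"value": ["a", "b", "c"]})
stream2 = pd.DataFrame({"value": ["d", "e"]})

# Flow 1: new IDs continue from the lookup max.
stream1["RecordID"] = [lookup_max + i for i in range(1, len(stream1) + 1)]

# Flow 2: continue from the max already assigned in flow 1, so both
# flows produce distinct IDs even though they target the same table.
stream2["RecordID"] = [stream1["RecordID"].max() + i
                       for i in range(1, len(stream2) + 1)]

print(stream1["RecordID"].tolist())  # [11, 12, 13]
print(stream2["RecordID"].tolist())  # [14, 15]
```

In the workflow that means a second Summarize (max) on flow 1's keyed output, appended to flow 2 before its Record ID and Formula tools.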