Hi everyone,
I am running a workflow where the processed data is stored in a .yxdb Alteryx database. When I check the output, I can see it only holds 140 rows, and as soon as a new week of data is added, one week drops off the back, so I am losing historical data. I also checked the current size of the database file: it is 21.7 MB, nowhere near the 2 GB limit (I am on a 64-bit system).
@Deba93 are you able to show your Output Data tool configuration? Is there any chance you have 140 entered in the 'Max Records Per File'?
@DataNath I have attached the screenshot of the configuration
@Deba93 If you want to keep the historic data you need to have the yxdb as an input as well and union the new data to it prior to the output tool. Otherwise it's just overwriting the file.
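The read-union-write pattern described above can be sketched in plain Python (the file name and field names here are hypothetical stand-ins; in Alteryx this would be an Input Data tool reading the .yxdb, a Union tool, and then the Output Data tool, not code):

```python
import csv
import os

def append_week(new_rows, path):
    """Read the existing file, union in this week's rows, and write
    everything back. Without the read step, the write would simply
    overwrite the file and discard the history."""
    existing = []
    if os.path.exists(path):
        with open(path, newline="") as f:
            existing = list(csv.DictReader(f))
    combined = existing + new_rows
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["week", "value"])
        writer.writeheader()
        writer.writerows(combined)
    return len(combined)
```

Each weekly run grows the file by one week's rows instead of replacing it.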
Hmm, the output configuration looks fine. Is there any chance you can attach, or at least screenshot, your flow? I'm wondering if there are any filters or sampling tools upstream affecting your dataset, as this shouldn't be happening otherwise.
@Luke_C I forgot to mention, I am already doing that: I bring the historical data in as an input, process the current week's data, remove duplicates, and combine them back into the same .yxdb.
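One thing worth checking in a flow like that is the key columns used in the de-duplication step: if the Unique tool keys on columns that don't distinguish weeks, it can silently discard historical rows. This is purely an illustration, not a diagnosis of this workflow; the column names below are hypothetical:

```python
def dedupe(rows, keys):
    """Keep only the first row seen for each combination of key columns,
    mimicking a first-match unique step."""
    seen, out = set(), []
    for row in rows:
        k = tuple(row[c] for c in keys)
        if k not in seen:
            seen.add(k)
            out.append(row)
    return out

history = [{"id": 1, "week": "2023-W01", "value": 5},
           {"id": 1, "week": "2023-W02", "value": 7}]
new = [{"id": 1, "week": "2023-W03", "value": 9}]

# Keying on "id" alone collapses every week for the same id into one row:
narrow = dedupe(history + new, ["id"])        # only 1 row survives
# Keying on both "id" and "week" keeps one row per id per week:
wide = dedupe(history + new, ["id", "week"])  # all 3 rows survive
```

If the real unique keys behave like the narrow case, that could explain why older weeks disappear as new ones arrive.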