I'm stuck on an issue while working with time series data. I've divided the data into 10-minute slots.
Now I need to create a carryover file: the records in the last slot of each day have to be skipped and saved to that file. The next day, when the workflow runs on the new data, the records in the carryover file will be appended to the rest of the data. The particular problem I'm facing is separating these records. Since the last record can occur at any time, there is no fixed cutoff for it, and the number of transactions in the day's last slot is variable as well.
(The reason for skipping these records is that, for each transaction in each slot, the state (free/waiting) has to be taken from the immediately following transaction. The last transaction of the day has no record to look forward to, hence the need to save that slot's transactions and append them to the next day's data.)
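To illustrate the look-ahead problem described above, here is a minimal pandas sketch (the column names `txn_id` and `state` are assumptions for illustration, not from the actual workflow): each row takes its state from the next row, so the final row of the day has nothing to look at until the next day's data arrives.

```python
import pandas as pd

# Toy transaction log; the last transaction of the day is row 3.
df = pd.DataFrame({
    "txn_id": [1, 2, 3],
    "state":  ["free", "waiting", "free"],
})

# Each transaction takes the state of the immediately following one.
# shift(-1) leaves the last row as NaN: it has no look-ahead record,
# which is exactly why that slot must be carried over to the next day.
df["next_state"] = df["state"].shift(-1)
```

Running this, `next_state` is filled for every row except the last, which stays missing until the carryover records are joined with the next day's data.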
PS: The data is not necessarily from a single day. Data from multiple days can be present, but the interval will only ever be 24 hours.
I've attached a dummy workflow for better understanding.
But we can't select 3 (or any fixed number) in the Sample tool, because the number of transactions in the last slot isn't fixed; it can be any number. The three transactions in the last slot of the dummy workflow are just for reference.
Add a Date field and find the max slot for each date, then join back to your original data. The L output gives everything except the last slot for each day; the J output gives the last slot for each day, which you can save to a table to be read the next day.
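The max-slot-and-join approach above can be sketched in pandas (a rough equivalent of the Alteryx tools; the column name `timestamp` is an assumption, and the two return values correspond to the L and J join outputs):

```python
import pandas as pd

def split_carryover(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Split off the last 10-minute slot of each date as a carryover set."""
    df = df.sort_values("timestamp").reset_index(drop=True)
    df["date"] = df["timestamp"].dt.date
    # Assign each record to a 10-minute slot within its day.
    df["slot"] = df["timestamp"].dt.floor("10min")
    # Find the max (i.e. last) slot per date and compare each record to it.
    last_slot = df.groupby("date")["slot"].transform("max")
    carryover = df[df["slot"] == last_slot]   # analogous to the J output
    rest = df[df["slot"] != last_slot]        # analogous to the L output
    return rest.drop(columns=["date"]), carryover.drop(columns=["date"])
```

The next day, `carryover` would be concatenated in front of the new day's records (e.g. `pd.concat([carryover, new_data])`) before the look-ahead state is computed.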
Hi @danilang! Thanks for the guidance; I figured it out. The solution is similar to the one you suggested, except that I generated sample rows for the last 10 minutes: 10 minutes = 600 seconds, so 600 rows, one per second, and then applied logic similar to the one you suggested.
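In pandas the same last-10-minutes window can be checked with a direct timestamp comparison rather than generating 600 per-second rows as in the Alteryx workflow; a minimal sketch, again assuming a `timestamp` column:

```python
import pandas as pd

def flag_last_10_minutes(df: pd.DataFrame) -> pd.DataFrame:
    """Flag records in the 600 seconds ending at each day's last record."""
    df = df.copy()
    df["date"] = df["timestamp"].dt.date
    # Last recorded timestamp per date, broadcast back to every row.
    last_ts = df.groupby("date")["timestamp"].transform("max")
    # The carryover window is the 600 seconds leading up to that record.
    window_start = last_ts - pd.Timedelta(seconds=600)
    df["carryover"] = df["timestamp"] > window_start
    return df.drop(columns=["date"])
```

Rows flagged `carryover=True` would be written to the carryover file and appended to the next day's data, mirroring the accepted solution.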