This one is more of a "stream of consciousness" workflow. I didn't plan it out; I just started throwing tools onto the canvas. Certainly not the recommended way of building out a workflow 🙂. I'm sure it could be tightened up quite a bit.
I randomly generated a larger test dataset with 2,600 unique client IDs and 636K input records (with the workflow adding 300K records to fill in the blanks), and then ran each configuration 30 times on the same dataset.
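In case anyone wants to build a similar test set, something along these lines would do it. This is just a rough Python/pandas sketch, not the actual generator I used, and the column names, date range, and value distribution are placeholders rather than the real schema:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

n_clients = 2_600      # unique client IDs
n_records = 636_000    # input records before the workflow fills in gaps

# Placeholder schema: client ID, date, numeric amount.
client_ids = [f"C{i:05d}" for i in range(n_clients)]
dates = pd.date_range("2023-01-01", "2023-12-31", freq="D")

df = pd.DataFrame({
    "ClientID": rng.choice(client_ids, size=n_records),
    "Date": rng.choice(dates, size=n_records),
    "Amount": rng.normal(100, 25, size=n_records).round(2),
})

df.to_csv("test_input.csv", index=False)
print(df.shape)
```

The gaps that the workflow later fills with ~300K records come from clients simply not having a row for every date in the range.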
Even at that much larger dataset size, the workflow ran in < 5 sec, and the Data Cleansing tool added less than 1 sec to the average run time. I was expecting a larger difference. It would be interesting to see how that plays out with an even larger dataset.
Moving the Formula tool didn't make a measurable difference in run time, although the measurement isn't very precise.
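For what it's worth, the timing approach is just "run each configuration repeatedly and average the wall-clock time." A minimal sketch of that kind of harness is below; the `run_workflow` function is a stand-in for however you actually kick off a run (e.g. the command-line engine), not the method I used:

```python
import statistics
import time


def run_workflow(path: str) -> None:
    """Placeholder: swap in however the workflow is actually executed."""
    time.sleep(0.1)  # stand-in for the real run


def benchmark(path: str, runs: int = 30) -> tuple[float, float]:
    """Run the workflow `runs` times and return mean and stdev of wall-clock seconds."""
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        run_workflow(path)
        times.append(time.perf_counter() - start)
    return statistics.mean(times), statistics.stdev(times)


for config in ["with_cleansing.yxmd", "without_cleansing.yxmd"]:
    mean_s, stdev_s = benchmark(config)
    print(f"{config}: {mean_s:.2f}s ± {stdev_s:.2f}s over 30 runs")
```

Reporting the standard deviation alongside the mean is what makes me say the measurement isn't very precise: the run-to-run spread was on the same order as the differences between configurations.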