Took me too long to figure out how best to minimise the number of records and stop it running away. Nice solution though, only minimal changes to the macro for part 2 - I think you could probably wrap both into one.
Day 11 cracked! Had a lot of fun with this one - we had a very similar problem back in 2022 (I believe?) so knew the approach I wanted to take right away. It's nice to catch myself spotting these patterns and knowing what I need to do, and often how to optimise, which I did off the bat for Part 1, and that set me up nicely for Part 2! Unfortunately I spent a good chunk of time debugging Part 2 only to realise I'd set the wrong aggregation on a single field - once I sorted that it was plain sailing, and both parts take <1s.
If you use the approach of aggregating, i.e. summing, counts of possible paths to a point, you can just extend this for part 2 and add a field that increases by one when you hit dac or fft - the important part here that I messed up was that you need to GROUP BY this count and exit your macro when you hit svr and this = 2. I was originally summing them, which gave me something not so great!
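To make the pitfall concrete, here is a small Python sketch (with made-up path counts, not real puzzle data) showing the difference between grouping by the checkpoint-flag field and summing it along with the path counts:

```python
from collections import defaultdict

# Hypothetical rows arriving at node "svr": two routes that passed both
# checkpoints (flag = 2) and one that passed only one of them (flag = 1).
# Each tuple is (node, checkpoint_flag, path_count).
arrivals = [("svr", 2, 3), ("svr", 2, 5), ("svr", 1, 4)]

# Correct: group by (node, flag) and sum only the path counts.
grouped = defaultdict(int)
for node, flag, paths in arrivals:
    grouped[(node, flag)] += paths

valid = grouped[("svr", 2)]  # paths that hit both checkpoints
print(valid)                 # → 8

# The bug described above: summing the flag column as well would merge the
# flag=1 and flag=2 rows into one and corrupt the checkpoint count.
```

The flag is an identity of the row, not a quantity, so it belongs in the group-by key rather than in the sum.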
I learnt very quickly that brute force was not going to work for part two, when my macro got stuck at just 12 iterations! I made a couple of adjustments to my original solution, but was able to adapt most of my part one to work for part two.
Another day where parts one and two overlapped for me nicely, here's how I approached it.
Part 1 & 2
1) Step one was simply to create a mapping table. In one column I wanted the "current position" and in the other column the corresponding "next position" / potential connections.
2) Obviously the exact value I filtered to differs slightly between part one and part two, but the basic principle is the same. I separated out the starting value from all other connections. With this starting value I then needed to create a couple of placeholder columns:
- A count, which will contain the "Number of Paths" possible to get to a given connection.
- A way to keep track of whether or not I had passed "dac" and "fft", so these both got a flag column, each set to 0 to begin with.
3) Now comes the macro. Into one anchor I fed the current position (aka starting point); this was my iterative anchor. In the other input anchor I had all the remaining connections that I could then join on. Each iteration starts with a join between the "current position" and the mapping table. This will "duplicate" my input data if the next connection has multiple further connections itself.
4) This step is a simple one: if the "Next Position" is either "dac" or "fft", then I need to update the flag to represent this.
5) In an effort to prevent the data from growing out of control, I decided I ideally wanted just one record per "Next Position", keeping track of the number of potential paths to get to that point. To do this I could simply group by and sum these fields (+ the flag fields too).
6) Finally, if the next position is "out" then I can exit the loop; otherwise, go back around again.
7) Once the macro has finished running, the workflow is basically complete. For part one, I just summed the "Number of Paths"; for part two, I filtered to records where both flags had been triggered (set to 1), then performed the same sum!
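The seven steps above can be sketched in Python as a count-based breadth-first walk. The `rules` dictionary here is invented sample data standing in for the mapping table, not the real puzzle input:

```python
from collections import defaultdict

# Hypothetical mapping table: current position -> possible next positions.
rules = {
    "svr": ["aaa", "fft"],
    "aaa": ["fft", "dac"],
    "fft": ["dac", "out"],
    "dac": ["out"],
}

def count_paths(start, require_both=False):
    # One record per (position, dac_flag, fft_flag), carrying a path count.
    frontier = {(start, 0, 0): 1}
    finished = defaultdict(int)
    while frontier:
        nxt = defaultdict(int)
        for (pos, dac, fft), paths in frontier.items():
            for dest in rules.get(pos, []):          # join to the mapping table
                d = 1 if dest == "dac" else dac      # update the flags
                f = 1 if dest == "fft" else fft
                if dest == "out":
                    finished[(d, f)] += paths        # exit the loop
                else:
                    nxt[(dest, d, f)] += paths       # group by + sum keeps rows small
        frontier = nxt
    if require_both:
        return finished[(1, 1)]      # part 2: both flags triggered
    return sum(finished.values())    # part 1: total number of paths

print(count_paths("svr"))        # all paths from svr to out
print(count_paths("svr", True))  # only the paths visiting both dac and fft
```

The group-by on `(position, flags)` is what keeps the record count bounded by the number of distinct states instead of the number of distinct chains.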
Technically Day 12, but for some reason I was having trouble posting this, so here goes take 2! Either way, Day 11 is in the books! What a nice and calm treat after what was such a chaotic Day 10 puzzle! Part 2 required some testing, but in the end we made it through!
Like I said above, this problem was a much calmer one compared to what we were given yesterday and honestly, it's exactly what the doctor ordered, mostly. Part 2 did come with some challenges for sure, but it was much easier to work through overall.
Part 1 was a breeze. I was able to grasp the problem pretty quickly and start right away. I mean, after 10 days of this challenge I am having dreams of iterative macros😂. The plan was simple: find all the new locations that 'you' can go to, create the 'rules universe' of remaining pairs, and keep joining ends to new starts one by one until there are no new endpoints or the destination 'out' has been reached. This macro was built with the intention of preserving the destination chain, meaning that all records outputted actually write out the chain they represent. It runs like the Flash and got me super excited, as I thought I could just do the exact same process for Part 2, switch 'you' to 'svr', and then filter for strings that contain 'fft' and 'dac'. Boy, was I wrong!🤣
So, much like most of you I'm sure, I did attempt that method and was met with an absolute monstrosity of iterations that were on a mission to crash my Designer. I took a peek at the input text in more detail and realized that there were certain 'hubs' that can lead to tons of other destinations, and svr manages to activate pretty much every one of them🤣. In order to potentially cut things down, I tried going from svr to fft, just to see if I would be able to break it down that way, and it still took forever.

I knew I needed to reduce the number of rows being iterated on, but I was struggling to come up with a way to group the data up. I knew I would somehow need to stop tracking the individual chains, as the combinations of all the hubs were what was messing me up, but I had two problems with this. First, how would I know if fft or dac was reached? The solution was actually hidden in the second attempt I had just made: the logic of going from svr to fft, then fft to dac, and so on and so forth (as well as the converse, svr to dac and dac to fft) makes logical sense, it just fails to compute given the current build. But if I went in a modular format, I wouldn't need to track the string to see if I hit fft, because the first macro ends at fft.

My second fear was that loops would exist in the data. For example, these rules could be pretty bad if left unchecked: aaa -> bbb, bbb -> ccc, ccc -> aaa. I actually test for this in my P1 macro in the very first filter tool! I decided to playtest whether or not it was actually worth worrying about by flagging the tool and looking at the macro logs to see if any records were being caught through the first few iterations. To my surprise/disappointment/excitement, it wasn't catching any of these instances, and with that green light, I was ready to scrap the concept of tracking each unique chain and look to consolidate my iteration candidates in some way.
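For anyone wanting to pre-check their own input for loops outside Alteryx, the worry described above is a standard directed-cycle test. A minimal Python sketch, using an invented rules set that contains exactly the kind of loop mentioned:

```python
# Hypothetical rules containing the loop described above.
rules = {"aaa": ["bbb"], "bbb": ["ccc"], "ccc": ["aaa", "out"]}

def has_cycle(rules):
    WHITE, GREY, BLACK = 0, 1, 2  # unvisited / on the current path / finished
    color = {}

    def visit(node):
        if color.get(node, WHITE) == GREY:
            return True           # back-edge: we looped onto our own path
        if color.get(node, WHITE) == BLACK:
            return False          # already fully explored, no cycle via here
        color[node] = GREY
        if any(visit(dest) for dest in rules.get(node, [])):
            return True
        color[node] = BLACK
        return False

    return any(visit(node) for node in list(rules))

print(has_cycle(rules))  # → True for the aaa -> bbb -> ccc -> aaa loop
```

If this returns False for the real input, the chain-tracking safeguard can be dropped safely, which is exactly the green light the post describes.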
From there, I realized that if I switched to counts, I could reduce the pool by orders of magnitude. The concept relies on the fact that I no longer need to know what each string is; I just need to know how many times a link in the chain is reached. From there, I added an interface element so I could declare the desired destination dynamically instead of making 3 near-identical copies, and I was off to the races. When performing my final test, I was delighted to find that no paths from svr --> dac --> fft --> out even exist, so my solution only needs to calculate the number of paths for svr --> fft --> dac --> out!
This problem was a classic case of test vs prod environment, and why efficiency matters when building things to run at scale. My Part 1 macro technically works for Part 2, if I wanted to wait 1 year for my answer! Only one more day until we're all done!