Discussion thread for day 11 of the Advent of Code - https://adventofcode.com/2025/day/11
Feel a lot better about that one after yesterday.
Part 1 was straightforward. Trace the path. I expected a twist in the real data. In particular, I was on the lookout for loops but there really didn’t seem to be anything.
Part 2 was the familiar "now do it with too much data."
The secret here is pruning and merging. You have to track whether you’ve visited dac and/or fft, but once you allow for that, two paths that get to the same place don’t have to be tracked separately; they can just be tracked as one path that counts double. Merging paths that reach the same point takes the execution from functionally infinite to seconds.
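The merging idea above can be sketched in Python. This is a minimal sketch, not anyone's actual macro: the graph is a hypothetical adjacency dict, the node names (svr, out, dac, fft) come from the thread, and it assumes the map has no loops, as the earlier post observed.

```python
from collections import defaultdict

def count_paths(graph, start="svr", goal="out", must_visit=("dac", "fft")):
    """Count start->goal paths that pass through every node in must_visit.

    Instead of storing each path, states are merged: two partial paths at
    the same node that have seen the same must_visit nodes so far are
    indistinguishable, so they collapse into one entry with a count.
    """
    a, b = must_visit
    # state: (node, seen_a, seen_b) -> number of partial paths in that state
    frontier = {(start, start == a, start == b): 1}
    total = 0
    while frontier:
        nxt = defaultdict(int)
        for (node, seen_a, seen_b), count in frontier.items():
            for neigh in graph.get(node, ()):
                sa = seen_a or neigh == a
                sb = seen_b or neigh == b
                if neigh == goal:
                    if sa and sb:
                        total += count  # only paths that met both checkpoints
                else:
                    nxt[(neigh, sa, sb)] += count  # merge into one counted state
        frontier = nxt
    return total
```

The state space is bounded by (nodes × 4) regardless of how many raw paths exist, which is what turns "functionally infinite" into seconds.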
Final, Macro pasted into primary for picture
Slightly different approach to the problem from @ScottLewis, but same general insights:
Macro:
Happy Solving, y'all! Now back to Day 10 for me
One day to go.
My WF is configured based on what I already know about the dataset: all paths follow svr=>...=>fft=>...=>dac=>...=>out. You don't need to memorize every path; just summarize them by the current node and sum up the number of paths in each iteration.
Nice and simple one today... assuming you know how to optimize part 2...
Today's problem seemed like one where the data volume would explode, so I designed my workflow to avoid that, and I got the answer on the first try.
Solved, I will be back to Day10.
It was easier than I thought at first.
Part 2 macro:
Visualization:
fft:
Maybe this one suited me but it's probably the quickest I've completed any challenge this year.
Part 1: Starting with "you", simply find the next move and iterate until you find "out". I initially failed to read the question and realise we only cared about paths starting with "you", and accidentally got the right answer for the test data by summarising into unique ending combinations.
Part 2: Very similar approach, but with different start and end points (SVR and OUT), and you need to keep track of whether you meet two other points along the way, in any order (DAC and FFT). This was the part designed to catch you out, as it quickly balloons into loads of common points, so I added a counter to keep track of how many I had. Ran pretty much instantaneously for the data set.
Day 11
Took me too long to figure out how to best minimise the number of records and stop it running away. Nice solution though - only minimal changes to the macro for part 2. I think you could probably wrap both into one.
I had to refer to the solution from @Hub119 for P2, since my macro for P1 would just die on P2.
Then eventually I managed to use one macro to do both parts.
Day 11 cracked! Had a lot of fun with this one - we had a very similar problem back in 2022 (I believe?), so I knew the approach I wanted to take right away. It's nice to catch myself spotting these patterns and knowing what I need to do, and often how to optimise, which I did off the bat for Part 1, and that set me up nicely for Part 2! Unfortunately I spent a good chunk of time debugging Part 2 only to realise I'd set the wrong aggregation on a single field - once I sorted that it was plain sailing, and both parts take <1s.
I learnt very quickly that brute force was not going to work for part two, when my macro got stuck at just 12 iterations! I made a couple of adjustments to my original solution, but was able to adapt most of my part one to work for part two.
Technically Day 12, but for some reason I was having trouble posting this so here goes take 2! Either way, Day 11 is in the books! What a nice and calm treat after what was such a chaotic Day 10 puzzle! Part 2 required some testing, but in the end we made it through!
My Solutions:
Part 1 Macro
Part 2 Macro
My Reflection:
D11 P1 is similar to Day 7 P2, plus another few steps for P2.
Macro for P2
Part 1 was no problem. Part 2 I needed a hint.
Day 10 - I have to come back to.
My Day 11 solutions. Tried to keep everything simple!
Here's my part 1 solution.
Pt1 only. Still have to learn about optimization to try Pt2 - my macro still refuses to take action.
Here's my solulu