Advent of Code 2024 Day 9 (BaseA Style)
Discussion thread for day 9 of the Advent of Code - https://adventofcode.com/2024/day/9
- Labels:
- Advent of Code
Part 1:
Split the input into one row per block, then create a freeID in ascending order and a fileID in descending order. Verify that the free position ID is smaller than the file's position ID before swapping.
Part 2:
Use the data from before the split into rows (keeping the run sizes makes them easier to compare later).
Check files one by one from the bottom; a file moves if a free span is big enough and its position is earlier.
Generate an extra row for the leftover if the free span is larger than the file.
The loop runs for 10k iterations and took about 40 minutes.
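The original solution is an Alteryx workflow, but the pairing idea translates directly to code. Here is a minimal Python sketch (an illustration, not the poster's implementation): expand the disk map into blocks, list free positions ascending and file blocks descending, and swap while the free position is still smaller.

```python
# Minimal Python sketch of the Part 1 idea above (illustration only,
# not the original Alteryx workflow): pair ascending free positions
# with descending file positions while the free slot comes first.
def part1_checksum(disk_map: str) -> int:
    blocks = []  # one entry per block: file ID, or None for free space
    for i, ch in enumerate(disk_map):
        blocks += [i // 2 if i % 2 == 0 else None] * int(ch)

    free = [i for i, b in enumerate(blocks) if b is None]             # ascending
    files = [i for i, b in enumerate(blocks) if b is not None][::-1]  # descending

    for f, d in zip(free, files):
        if f >= d:  # the free slot is past the file block: fully compacted
            break
        blocks[f], blocks[d] = blocks[d], None

    return sum(i * b for i, b in enumerate(blocks) if b is not None)

print(part1_checksum("2333133121414131402"))  # AoC example input -> 1928
```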
There are many optimizations that can be made here to shorten the time needed for Part 2 (I have only used a few, but have a V2 that I'm working on). The current workflow runs in ~20 minutes:
V2 (Part 2 runs in ~8 minutes):
| File Data | File Size |
|-----------|-----------|
| 0         | 5         |
| .         | 2         |
| 1         | 3         |
| 2         | 1         |
New iterative macro:
It seems like there is opportunity for further optimization, but this is what I have for now.
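For reference, the table's size-based representation is easy to reproduce in code. A hedged Python sketch follows; the disk map "52301" is a made-up input chosen only because it reproduces the rows above.

```python
# Sketch of the size-based representation in the table above: parse the
# dense disk map into (label, size) runs rather than one row per block,
# so free spans and files can be compared by size directly.
def parse_runs(disk_map: str):
    runs = []
    for i, ch in enumerate(disk_map):
        size = int(ch)
        if size == 0:
            continue  # zero-length runs contribute no blocks
        runs.append((str(i // 2) if i % 2 == 0 else ".", size))
    return runs

# "52301" is a hypothetical input chosen to reproduce the table rows:
print(parse_runs("52301"))  # [('0', 5), ('.', 2), ('1', 3), ('2', 1)]
```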
Part 1 was simpler than I thought at first glance. I started building an iterative macro, but after an hour I realized it wasn't needed.
For Part 2, I just built the required logic into an iterative macro. No special trick is applied, and it took 11 minutes.
I suppose there are plenty of tricks to speed it up, but I can't come up with any for now.
My solution.
Part2 workflow
The approach for Part 2 is different from Part 1. (I tried to reuse the Part 1 approach, but could not find a way.)
Not sophisticated (it took 48:47), but it worked anyway.
Part2 macro
Part 2 was a tough challenge. My workflow runs in 9 minutes for Parts 1 and 2.
Solved!
Partway through, I realized that my logic was incorrect, so I had to remake the workflow I had created several times.
Part 1: no macro.
I solved it by breaking the disk map down into rows, reversing the order of the free space, and joining the two by row position. However, since there is more free space than there are file blocks, the tail end comes out a bit strange, so the key point was how to deal with that.
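In code terms, that row-join trick might look like the following hedged Python sketch (an illustration, not the original workflow): keep only the first N positions, where N is the total number of file blocks, and fill each free row from the file blocks taken in reverse; truncating at N handles the strange tail.

```python
# Sketch of the row-join idea above: keep only the first N positions
# (N = total file blocks) and fill each free row from the file blocks
# taken in reverse; truncating at N absorbs the excess free space.
def compact(blocks):
    n = sum(b is not None for b in blocks)
    rev_files = iter([b for b in reversed(blocks) if b is not None])
    return [b if b is not None else next(rev_files) for b in blocks[:n]]

# Disk map "12345" expanded into blocks (None = free space):
blocks = [0, None, None, 1, 1, 1, None, None, None, None, 2, 2, 2, 2, 2]
print(compact(blocks))  # [0, 2, 2, 1, 1, 1, 2, 2, 2]
```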
 
Part 2:
Breaking it down row by row would be a lot of work, so I calculated the size of each file and each free span and stuffed the files one by one into free space, working back from the highest file ID. There are about 10,000 file IDs in total, so I simply repeated this 10,000 times. My workflow took 8 minutes for Parts 1 and 2.
The key point is that after a file is moved, contiguous free space is left behind; if you don't combine those free spans properly, you can't pack later files as intended. This was the last thing I noticed, and I ended up recreating the macro I had built at the beginning (at first I thought it would be enough to just track the free space in the loop, but it turned out that contiguous free space can't be combined unless the file runs are tracked as well).
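The free-space merging point translates to code roughly as follows. This is a hedged Python sketch (illustrative, not the poster's macro): it moves files highest-ID-first on (label, size) runs and coalesces adjacent free runs after every move, which is exactly the step the post says cannot be skipped.

```python
# Hedged Python sketch of the point above (illustration only, not the
# original Alteryx macro): work on (label, size) runs, move each file
# highest-ID-first into the leftmost free run that fits, and merge
# adjacent free runs after every move so later files can still fit.
def part2_checksum(disk_map: str) -> int:
    runs = []  # [label, size]; label is a file ID, or None for free space
    for i, ch in enumerate(disk_map):
        if int(ch):
            runs.append([i // 2 if i % 2 == 0 else None, int(ch)])

    max_fid = max(r[0] for r in runs if r[0] is not None)
    for fid in range(max_fid, -1, -1):
        src = next(i for i, r in enumerate(runs) if r[0] == fid)
        size = runs[src][1]
        for dst in range(src):
            if runs[dst][0] is None and runs[dst][1] >= size:
                runs[src][0] = None                 # vacate the old spot
                leftover = runs[dst][1] - size
                runs[dst] = [fid, size]
                if leftover:
                    runs.insert(dst + 1, [None, leftover])
                break
        merged = []                                 # coalesce adjacent free runs
        for r in runs:
            if merged and r[0] is None and merged[-1][0] is None:
                merged[-1][1] += r[1]
            else:
                merged.append(r)
        runs = merged

    total, pos = 0, 0                               # checksum: sum of pos * fid
    for label, size in runs:
        if label is not None:
            total += label * (pos * size + size * (size - 1) // 2)
        pos += size
    return total

print(part2_checksum("2333133121414131402"))  # AoC example input -> 2858
```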
Day 9. More in the spoiler.
For Part 2, the macro had to make sure an entire block could move left, not just its last item. I brute forced it and it took 45 minutes.
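As a hedged illustration of that whole-block check (Python, not the original macro): a file may move only if some contiguous free span to its left can hold every block of the file, which a single left-to-right scan can verify.

```python
# Sketch of the whole-block check above: a file may move left only if
# a contiguous free span before it can hold every block of the file.
def leftmost_fit(blocks, start, size):
    """Start index of the leftmost free span before `start` that holds
    `size` blocks, or None if no such span exists."""
    run = 0
    for i in range(start):
        run = run + 1 if blocks[i] is None else 0
        if run == size:
            return i - size + 1
    return None

print(leftmost_fit([0, None, None, 1, 1], start=3, size=2))  # -> 1
```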
Finally finished day 9! Part 1 was fine. I explored a few different routes for Part 2 and got there after a bit of nonsense.
