This felt much easier than the last few days. Both parts can present an optimization challenge depending on your approach.
Over the last few days I didn't have time to do the challenge because of work and a year-end party, but today I did it! Today's challenge is certainly easier than the ones from a few days ago.
The obvious way to do this would, as expected, explode the number of records, so I had to do it a different way.
P1 was straightforward. P2 required some ugly brute force.
P2: Generate the min/max for each path by row. Then do a calc to determine whether y_min <= [Row-1:y_max] (outputting 1 if so) for the 14 rows possible. Filter to the rows where that check returns 0, plug the magic row back into the original logic to determine the x,y coordinate, then apply the final calculation.
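For anyone who wants to see the idea outside of Alteryx, here is a rough Python sketch of a similar row-by-row interval scan (not the exact calc above); it assumes the sensor/beacon pairs are already parsed and that Part 2 uses the standard 0-4,000,000 search range.

```python
# Hypothetical sketch of a row-by-row interval scan for Part 2.
# `sensors` is assumed to be a list of (sx, sy, bx, by) tuples.
def find_tuning_frequency(sensors, limit=4_000_000):
    # Precompute each sensor's Manhattan radius to its nearest beacon.
    radii = [(sx, sy, abs(sx - bx) + abs(sy - by)) for sx, sy, bx, by in sensors]
    for y in range(limit + 1):
        # Coverage interval (x_min, x_max) of each sensor on this row.
        spans = []
        for sx, sy, r in radii:
            half = r - abs(sy - y)
            if half >= 0:
                spans.append((sx - half, sx + half))
        spans.sort()
        # Sweep the merged intervals looking for an uncovered column.
        x = 0
        for lo, hi in spans:
            if lo > x:
                return x * 4_000_000 + y   # tuning frequency = x * 4,000,000 + y
            x = max(x, hi + 1)
        if x <= limit:
            return x * 4_000_000 + y
    return None
```

The early return inside the sweep fires as soon as a row has an uncovered column, which is the same "find the magic row, then the x coordinate" idea as the flow described above.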
Managed to catch up on Day 15! Part 2 was really fun to try and figure out/optimise after realising I couldn't scaffold a 4m*4m grid to check against. Following a couple of ideas that still led to huge blow-ups and cancelling a couple of workflows after 15-20 minutes, I decided to try using spatial for this, which actually ended up running in 0.2 seconds! The entire flow for P1+P2 takes ~17s.
It was difficult to resolve the performance issue for Part 2.
So - took a completely different approach to part 1 vs. part 2.
Very quickly realised (i.e. after running for an hour) that solving part 2 by using a brute-force array would not work.
And then the outer call looks like this.
To make this scale - switched to Spatial since this is really an intersection of squares exercise.
To explain this:
The top piece creates a combined polygon of the coverage diamonds for every sensor.
The bottom piece creates the bounding box.
Then you subtract the top from the bottom, which tells you what pieces remain.
Split those into individual polygons.
- The example test had thin sliver-type polygons - so I did a bit more work to see if I could fit in a 1x1 beacon - but this wasn't needed on the final data.
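For comparison, the same union-and-subtract idea can be sketched in Python with shapely, which is planar, so the lat/long scaling trick below isn't needed there. The `sensors` list of (x, y, Manhattan radius) tuples and the 0-4,000,000 bounding box are assumptions, not part of the original workflow.

```python
# Hypothetical shapely sketch of the approach above: union the sensor
# diamonds, subtract them from the bounding box, and read the centroid
# of the piece that could actually hold a 1x1 beacon.
from shapely.geometry import Polygon, box
from shapely.ops import unary_union

def find_beacon(sensors, limit=4_000_000):
    diamonds = [
        Polygon([(sx + r, sy), (sx, sy + r), (sx - r, sy), (sx, sy - r)])
        for sx, sy, r in sensors
    ]
    coverage = unary_union(diamonds)          # combined polygon of all diamonds
    bounding = box(0, 0, limit, limit)        # the search area
    leftover = bounding.difference(coverage)  # pieces the sensors don't cover
    # Split into individual polygons and keep one wide and tall enough for a
    # 1x1 beacon (a simplified version of the sliver check mentioned above).
    pieces = list(leftover.geoms) if leftover.geom_type == "MultiPolygon" else [leftover]
    target = next(
        p for p in pieces
        if p.bounds[2] - p.bounds[0] >= 1 and p.bounds[3] - p.bounds[1] >= 1
    )
    x, y = round(target.centroid.x), round(target.centroid.y)
    return x * 4_000_000 + y
```

Rounding the centroid recovers the integer coordinate, matching the "tiny box with the centroid in the middle" result shown further down.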
Things to watch out for:
- Use X & Y as floating point long & lat
- Scale these values up/down to fit within the range -2 to 2. This allows you to use the Alteryx spatial tools, which only accept Earth coordinates, and staying within -2 to 2 reduces curvature issues, since the spatial tools work with geography (a spherical Earth) rather than flat-plane geometry.
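A minimal sketch of the scaling in that last bullet, assuming the puzzle's 0-4,000,000 coordinate range (the function names are just illustrative, not the actual formula tool expressions):

```python
LIMIT = 4_000_000

def to_earth(v):
    # Map 0..4,000,000 onto -2..2 so values stay near the equator / prime
    # meridian, where curvature distortion in the spatial tools is smallest.
    return v / LIMIT * 4 - 2

def from_earth(s):
    # Invert the scaling to recover the original puzzle coordinate.
    return round((s + 2) / 4 * LIMIT)
```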
Step 1: Overlapping sensor areas:
Step 2: Merge into one poly:
Step 3: Subtract from Bounding Box:
Step 4: Final Result (tiny box with the centroid in the middle)
My workflow is slow and had over 2 billion records, but it worked well enough.
Although I got the answer via spatial, I still want to find another method. It takes less than 10 seconds in total.
and multiply by the required 4000000, then add [y]