Amigos,
I'm trying to create a distance matrix (see example).
The distance matrix is ~29k x ~29k.
I've run my workflow multiple times, and it gets stuck at 50% on the 'Measure Distance' tool.
The workflow ran for me when I used a set of only 120 lat/long points. The workflow also ran for me when I used my dataset, but limited the input to 5 rows.
Any suggestions on getting the workflow to complete?
It runs for me. Are you possibly running out of disk space? Does it just hang there at 50% forever?
@dmccandless One quick way to reduce the dataset would be to filter out any records where Store Point and Source_Store Point are the same. Sure, it's a quick zero-distance calculation, but you already know the answer is zero, so there's no need to do the work.
0.2s worth of a reduction in the workflow you provided using my machine, but every little bit may help here.
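Outside of Alteryx, the same self-pair filter is a one-line boolean mask. This is just a hedged sketch of the idea (the column names `Store` and `Source_Store` are assumptions, not taken from the actual workflow):

```python
import pandas as pd

# Hypothetical appended-pairs table: every store matched with every source store.
pairs = pd.DataFrame({
    "Store":        ["A", "A", "B", "B"],
    "Source_Store": ["A", "B", "A", "B"],
})

# Drop self-pairs before measuring distance: Store == Source_Store is always 0.
pairs = pairs[pairs["Store"] != pairs["Source_Store"]]
```

In Alteryx terms this is a Filter tool placed upstream of Measure Distance, with the condition `[Store] != [Source_Store]`.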
Do you need EVERY point or can you provide a maximum radius where if the distance is greater than a specified number it's not calculated?
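One cheap way to apply a maximum radius is to pre-filter pairs on straight-line (haversine) distance before running the expensive drive-time lookup; straight-line distance never exceeds drive distance, so no pair inside the real cutoff gets dropped. A minimal sketch, assuming a cutoff in miles (the `MAX_MILES` value is an illustrative assumption):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    # Great-circle distance between two lat/long points, in miles.
    r = 3958.8  # mean Earth radius in miles
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Assumed cutoff: only pairs within this straight-line radius go on to the
# drive-time calculation; everything else is discarded up front.
MAX_MILES = 100

def within_radius(lat1, lon1, lat2, lon2, cutoff=MAX_MILES):
    return haversine_miles(lat1, lon1, lat2, lon2) <= cutoff
```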
Neil,
I'm calculating the Tom Tom drive time.
I've set my max radius drive time at 99 minutes.
I'll see how that helps.
JPoz, I can run the attached workflow fine - I'm having trouble with my 29k x 29k production data set.
@CharlieS , Alas, even with your clever solution, I'm still getting stuck at 50% on the Measure Distance tool :(
Any suggestions?
Is 29k points just too many?
So, in one full day, my 32 GB RAM machine went only from 50 to 51% complete on the Measure Distance tool.
That leads me to believe this is just a performance issue, and I've simply got too much data.
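The arithmetic backs that up: the pairwise workload grows with the square of the point count, so going from the 120-point test to the full ~29k-point set isn't a 240x increase in work, it's tens of thousands of times more pairs:

```python
# Pairwise distance work grows quadratically with the number of points.
points_small = 120
points_full = 29_000

pairs_small = points_small ** 2   # 14,400 pairs in the test run
pairs_full = points_full ** 2     # 841,000,000 pairs in the full run

ratio = pairs_full // pairs_small # the full set is roughly 58,000x more work
```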
I've got an idea - let me know if you think this would help, @CharlieS.
The union has three inputs.
What if I turned the bottom input into a .yxdb or Calgary DB and turned the input of the Measure Distance tool into a .yxdb or Calgary DB?
Do you think that would help? Or would it make no difference to substitute those DBs for all the prior steps they currently feed from?