Hi all,
I have a set of data consisting of relatively large numbers, and I am trying to aggregate them using the Summarize tool. The issue is that I seem to get inconsistent results every time I run the workflow.
For example: [screenshot of the differing run results omitted]
I understand that the double data type has about 15 digits of precision, which I initially took to mean 15 significant figures. However, based on my run results, that does not seem to be the case.
Does 15 digits of precision mean that any digit beyond the 15th is effectively unreliable? Are inconsistent results expected with large numbers, and does that mean I will not be able to reproduce exactly the same results between runs?
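A minimal Python sketch of what I mean (not the Summarize tool itself, and the numbers below are made up purely to show the effect): floating-point addition is not associative, so summing the same doubles in a different order can change the low-order digits, which is one way identical data could produce slightly different totals between runs.

# Floating-point addition is not associative: the order of the adds matters.
values = [1e16] + [1.0] * 20   # made-up numbers chosen to show the effect

# Left to right: each 1.0 is rounded away against 1e16 (the gap between
# adjacent doubles near 1e16 is 2.0, so adding 1.0 has no effect).
left_to_right = sum(values)

# Summing the small values first preserves them.
small_first = values[0] + sum(values[1:])

print(left_to_right)                  # 1e+16
print(small_first)                    # 1.000000000000002e+16
print(left_to_right == small_first)   # False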
[Edit: I saw a related post here: Big numbers and Alteryx. Not sure if it has anything to do with my issue.]
@apathetichell in your experience, do problems with Double only arise when there are decimal values? If all of our values are integers, should we still expect "challenges"? See attached workflow.
Chris
@ChrisTX - until you get to the limit of what a double can hold exactly, ints should be fine: every integer up to that limit has an exact double representation (as a fraction it is simply X/1, with nothing lost to rounding), so X should come back unchanged. At least that's my hypothesis. I try not to use Double in general.
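If it helps, a quick Python check of that limit (illustrative only): a double can represent every whole number exactly up to 2**53 = 9,007,199,254,740,992; past that point, consecutive integers start to collide, so integer data below that limit should survive a Double intact.

LIMIT = 2 ** 53   # largest range where every whole number is exact in a double

# Below the limit, integer arithmetic in doubles is exact.
print(float(LIMIT - 1) + 1.0 == float(LIMIT))   # True

# At the limit, adjacent integers start mapping to the same double.
print(float(LIMIT) + 1.0 == float(LIMIT))       # True -- 2**53 + 1 is not representable
print(float(LIMIT) + 2.0 == float(LIMIT + 2))   # True -- the spacing is now 2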