Hi all,
I have a dataset of relatively large numbers that I am aggregating with the Summarize tool. The issue is that I get slightly different results every time I run the workflow.
For example,
- 1st run: 5,973,734,380,328,240
- 2nd run: 5,973,734,380,328,330
- 3rd run: 5,973,734,380,328,260
I understand that the double data type has about 15 significant decimal digits of precision, and my initial understanding was that this means the results should agree to 15 significant figures. However, my runs above only agree to roughly the first 13 digits, so that does not seem to be the whole story.
Does 15 digits of precision actually mean that any digit beyond the 15th is effectively random? Are inconsistent results like this expected with large numbers, and does this mean I will not be able to reproduce exactly the same result between runs?
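For what it's worth, here is a quick sketch I tried in plain Python (outside Alteryx, so only an analogy for whatever Summarize does internally). It shows that the order in which doubles are added can by itself change the trailing digits of a sum, because each intermediate addition rounds to the nearest representable double:

```python
import math

# The same values summed in a different order can give a different
# double-precision result, because each intermediate addition rounds.
a = [1e16, 1.0, 1.0, -1e16]
b = [1.0, 1.0, 1e16, -1e16]   # identical values, different order

print(sum(a))        # 0.0 -- each 1.0 is absorbed when added to 1e16
print(sum(b))        # 2.0 -- the 1.0s combine first, so they survive
print(math.fsum(a))  # 2.0 -- exact summation is order-independent
print(math.fsum(b))  # 2.0
```

If the Summarize tool partitions or threads the sum differently between runs, the addition order would change, which could explain run-to-run differences in the trailing digits even though the input data is identical. That is just my guess, though.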
[Edit: I saw a related post here: Big numbers and Alteryx. Not sure if it has anything to do with my issue.]