Hi all,
This is a general discussion question as opposed to a specific workflow of mine question.
My question: is Alteryx or Excel more accurate when it comes to lengthy decimal calculations? For example, I'm pulling in data from Excel in which I'm recreating calculations, and I know the answer I'm supposed to arrive at. The data has fairly lengthy decimal figures (e.g., 0.0176601994161032). In Excel, that number multiplied by 33.1 is
0.584552600673016
whereas in Alteryx, multiplying those same two numbers gives me 0.584552600673009. A very, very minor difference, but these differences add up over 1M calculations: the gap between the Excel output and the Alteryx output comes to about 0.1-0.25%, which is significant in my field.
Basically, in this scenario, which number is more accurate?
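For reference, here is what a straight IEEE 754 double multiplication of these two numbers looks like, as a quick Python sketch (the last printed digit can vary by platform, but both Excel and the Alteryx Double type work in 64-bit doubles):

```python
from decimal import Decimal

a = 0.0176601994161032
b = 33.1

# Plain IEEE 754 double multiplication, the same arithmetic Excel and
# the Alteryx Double type perform under the hood:
print(a * b)  # approx. 0.584552600673016

# The exact decimal product, for comparison:
print(Decimal("0.0176601994161032") * Decimal("33.1"))  # 0.58455260067301592

# The full binary value actually stored in the double result:
print(Decimal(a * b))
```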
Hi @WestonProhaska,
I've tried to replicate your example and it returned the expected result.
By any chance, are you limiting your number by changing the data type to fixed decimal?
Best,
Fernando Vizcaino
I think this generally depends on how the values are stored. For this level of precision, a fixed decimal field (or these values scaled up and stored in an Int64 field) would be more reliable than a float/double, which is subject to floating-point encoding.
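To put some rough numbers on that, here's a quick Python sketch contrasting the three storage options (the exact printed digits may vary in the last place):

```python
from decimal import Decimal
import struct

x = "0.0176601994161032"

# Fixed-decimal storage keeps every digit exactly:
print(Decimal(x))         # 0.0176601994161032

# A 64-bit double keeps roughly 15-17 significant digits:
print(Decimal(float(x)))  # the exact binary value actually stored

# A 32-bit float keeps only about 7 significant digits:
print(struct.unpack("f", struct.pack("f", float(x)))[0])
```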
Outputting as a string works, sure, but I have multiple outputs from formulas built on top of each other, so they can't be strings at any point in the middle of my workflow.
The data seems to slowly degrade in precision as it goes on. My Excel input is data provided to me in which the decimal places seem to be arbitrarily cut off, e.g., one value goes to 19 places and another record goes to 13.
@WestonProhaska wrote: Outputting as a string works, sure, but I have multiple outputs from formulas built on top of each other, so they can't be strings at any point in the middle of my workflow.
It shouldn't be too much of a problem to convert back and forth in the middle of a workflow if that's what needs to be done for the sake of accuracy. There are lots of tools to convert, and we can help with that. 🙂
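Just to show that the string round trip itself is lossless: in Alteryx you'd use ToString()/ToNumber(), and this quick Python sketch makes the same point with repr()/float():

```python
# Round-tripping a double through a string is lossless as long as the
# string carries 17 significant digits (Python's repr() guarantees this):
x = 0.0176601994161032 * 33.1
s = repr(x)           # string form, safe to carry between formula steps
assert float(s) == x  # converting back recovers the identical double
print(s)
```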
@WestonProhaska wrote: The data seems to slowly degrade in precision as it goes on. My Excel input is data provided to me in which the decimal places seem to be arbitrarily cut off, e.g., one value goes to 19 places and another record goes to 13.
Are you talking about the Excel file, or the Alteryx workflow? If this is the Excel file (as read in Excel), then maybe Excel isn't the best format for storing/sharing these values.
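For what it's worth, Excel stores numbers as 64-bit doubles, so anything beyond roughly 15 significant digits is already gone by the time the file is saved. A quick Python sketch with a hypothetical 19-digit value (the value itself is made up for illustration):

```python
# Hypothetical 19-significant-digit value a client might type into Excel:
typed = "0.0176601994161032195"

# Excel stores it as a 64-bit double, so the trailing digits are lost
# before Alteryx ever reads the file:
stored = float(typed)
print(repr(stored))  # only ~15-17 significant digits survive
```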
1. @CharlieS I can show you a picture of what I mean. The data has to stay consistently numeric for these computations to run. Would it be best to read everything in as a string and then convert to numeric before the calcs?
2. @CharlieS It's Excel. It's data provided by clients, so storing it in something different is not an option unless we convert it ourselves.
@WestonProhaska You may want to keep in mind that the Formula tool (and any other expressions, for that matter) performs floating-point calculations as doubles. This means that any fixed decimal or large Int64 value used in a floating-point operation will be converted to a double before the operation is completed, and then converted back to the defined data type once the result is achieved. This can cause some loss of precision or rounding that you may not expect.
Another thing to keep in mind is that computers are not capable of exact fractional math for most decimal values, due to the way those values have to be converted and stored in binary. One of our developers has written an excellent article detailing this, and there are a number of conversations on the Community discussing some of the quirks encountered when performing floating-point math, especially around rounding values.
Article explaining floating point numbers and precision:
A few quick community discussions that cover similar topics:
https://community.alteryx.com/t5/Alteryx-Designer-Discussions/Alteryx-Odd-Behavior/td-p/89086
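To make the round trip described above concrete, here's a rough Python sketch (the 16-place fixed-decimal width and the half-up rounding are illustrative assumptions, not Alteryx's documented behavior):

```python
from decimal import Decimal, ROUND_HALF_UP

# A value declared as a fixed decimal with 16 places (assumed width):
fixed_in = Decimal("0.0176601994161032")

# Step 1: the expression engine widens it to a 64-bit double.
as_double = float(fixed_in)

# Step 2: the floating-point operation itself happens in doubles.
raw = as_double * 33.1

# Step 3: the result is narrowed back to the declared fixed-decimal
# type, rounding away whatever the double could not represent.
fixed_out = Decimal(raw).quantize(Decimal("1e-16"), rounding=ROUND_HALF_UP)
print(fixed_out)  # compare against the exact product 0.58455260067301592
```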
@KevinP Thank you. From my understanding, then, Alteryx will be more accurate in its output, or no?