Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

Selecting the correct data type.

mdwood
5 - Atom

I am using Longitude and Latitude to create spatial points to match up with US census shapefiles. I have done this in a different workflow, and I was able to get it to work by converting the Lon and Lat fields to the Double data type. However, when I do the same for this data set, which is the exact same type of file, just much bigger, it doesn't work. The conversion errors reported are "Lon is not a valid number" and "Lat is not a valid number."

My first thought was to try some other data types, so I tried Int64, Float, and Fixed Decimal, but all of these also return errors. Fixed Decimal reports that it has truncated some of the data because the value was too large. Float returns the same errors as Double. Finally, Int64 flags a number of individual Lon and Lat values, saying that information was lost in translation, until the field conversion error limit is reached.

So what other data types should I try? Is there one that makes more sense for Lon and Lat? Is there something I could change about how the data is input that would make it convert correctly? I have checked against the other workflow I got working and I can't find anything that would distinguish that one from this new one.
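I was also thinking of trying to isolate the bad rows with a Filter tool test along these lines, assuming the fields are still strings named Lon and Lat (just a sketch, I have not verified it against this file):

!REGEX_Match([Lon], '-?\d+(\.\d+)?') OR
!REGEX_Match([Lat], '-?\d+(\.\d+)?')

REGEX_Match tests the whole string, so anything that is not a plain signed decimal should come out on the True side of the filter.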

2 REPLIES
KaneG
Alteryx Alumni (Retired)

Hi @mdwood,

Double is the correct format. The errors being thrown indicate that there may be some funny-looking data in those columns somewhere.

Try the following in a Filter tool:

REGEX_CountMatches([Field1], '[[:alpha:]]') > 0 ||
REGEX_CountMatches(ReplaceChar([Field1], '-.', ''), '[[:punct:]]') > 0

That will separate out any records that contain alphabetic characters, or punctuation other than a decimal point or dash. Those records should give an indication of what is causing the problem.
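If your version of Designer supports the optional ToNumber arguments, an alternative is to let the parser itself do the checking. The extra flags below are meant to suppress the conversion errors and return Null instead of 0 for anything that can't be read as a number (same generic [Field1] as above; swap in your Lon and Lat fields):

IsNull(ToNumber([Field1], 1, 1))

Any record where that returns True is one that the Double conversion will reject.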

mdwood
5 - Atom

There were some problems in the data. I misunderstood the meaning of the conversion error: I thought it meant that all of the data was being read incorrectly, when in reality it was just 22 rows out of the roughly 7 million in the file. When I ran the whole workflow, the file worked as it should with the Double data type.
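For anyone who wants to build the points without letting a handful of bad rows throw errors, a guarded Formula tool expression along these lines should also work (a sketch assuming the fields are named Lon and Lat, an output field of type SpatialObj, and a Designer version with the optional ToNumber arguments; note that ST_CreatePoint takes longitude first):

IIF(IsNull(ToNumber([Lon], 1, 1)) OR IsNull(ToNumber([Lat], 1, 1)),
    Null(),
    ST_CreatePoint(ToNumber([Lon]), ToNumber([Lat])))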
