It would be awesome if Alteryx could accept a Lat/Long pair, tell you which time zone that location is in, and support time analysis from there.
This relates to a conversation that we were having at Inspire with @KatieH @BenG @AdamR and @NickJ
As you bring in data - if you could right-click on a column and say "this is a lat/long", then from that point onwards Alteryx could automatically do interesting things with it, like treating it as a piece of spatial info.
This is part of a broad category that @JayB referred to in his keynote of ways that we can eliminate or streamline the cleanup jobs that we do as BI professionals so that we can spend more time on analysis.
This is very true, but even if I set these coordinates to Lat/Long, Alteryx doesn't seem to have a time-zone capability that can take that Lat/Long and tell me which time zone it is in. I think that would be an extremely valuable tool!
:-) Great point. Once Alteryx is told "this is a lat/long", there should be a whole bunch of enrichments you can apply to the data very quickly and easily.
@Nick_Yarbrough This could be done using the Google Maps API, which has a timezone lookup based on lat/long:
A Download tool could be written that takes the latitude and longitude from Alteryx, sends them to the API, gets a response, and processes the result. I have only skimmed the docs and have not yet written to the Google API, but I know an API key is needed. This might be a central API key (one key to rule them all) or a key that the user enters outside the macro as a parameter. The second method seems preferable.
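As a rough sketch of what that Download tool would send per record, here is the request URL the Google Time Zone API expects (the endpoint and parameter names are taken from Google's docs; `YOUR_KEY` is a placeholder for whatever key the user supplies):

```python
# Sketch of the per-record lookup the Download tool would perform.
# Assumes the Google Time Zone API endpoint; the key is user-supplied.
from urllib.parse import urlencode

GOOGLE_TZ_ENDPOINT = "https://maps.googleapis.com/maps/api/timezone/json"

def build_timezone_request(lat, lng, unix_timestamp, api_key):
    """Build the request URL for one latitude/longitude record."""
    params = urlencode({
        "location": f"{lat},{lng}",      # comma-separated lat,long
        "timestamp": unix_timestamp,     # seconds since the Unix epoch
        "key": api_key,
    })
    return f"{GOOGLE_TZ_ENDPOINT}?{params}"

print(build_timezone_request(51.5074, -0.1278, 0, "YOUR_KEY"))
```

The macro would feed each of these URLs to the Download tool and parse the JSON that comes back.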
The advantage of this is that Alteryx would not need a local lookup of timezones. That lookup would have to be a horrible set of polygons defining each timezone, so maintaining that becomes a headache. With a remote lookup, you're "leveraging big data", which is, apparently, rather hip now.
I have a demo macro now that uses the Google API. Google's API will give you 2500 calls a day for free, and you have to pay after that. Let me know if you would like to see it.
Thanks! That's a solution I hadn't thought of; I would definitely be interested in taking a look!
Here's a quick screenshot of what pops out of the Google API, and hence the macro (more message text follows the picture)...
Screenshot of Output
I have included a pass-through ID field, so that the user can knit the results back into the source dataset.
The above test Latitude and Longitude values are randomly generated degree values, and hence could be out at sea. The Google API returns ZERO_RESULTS when that happens. As there is a lot of sea on the planet, there are a lot of empty results coming back, but if you are using actual lat and long values that you know are on land, you will not get this noise in your data.
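The status handling can be sketched like this (a hypothetical parser; the `status`, `timeZoneId`, `rawOffset`, and `dstOffset` fields are from Google's documented response shape):

```python
# Hypothetical handler for one API response body (already parsed from JSON).
# ZERO_RESULTS means the point has no timezone, e.g. it landed at sea.
def parse_timezone_response(body: dict):
    """Return (timeZoneId, total UTC offset in seconds), or None at sea."""
    status = body.get("status")
    if status == "ZERO_RESULTS":
        return None                      # pass-through record, no timezone
    if status != "OK":
        raise ValueError(f"API error: {status}")
    # rawOffset is the standard offset; dstOffset adds daylight saving.
    return body["timeZoneId"], body["rawOffset"] + body["dstOffset"]
```

Records that come back `None` can simply be left blank when knitting the results back onto the source dataset via the pass-through ID.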
The UnixTimestamp might be a bit cryptic. This is the number of seconds since 00:00:00 01-JAN-1970 UTC. You can convert a DateTime to UnixTimestamp using this formula...
DateTimeDiff([MyUTCDateTime], '1970-01-01', 'seconds')
You just have to make sure you convert the inbound DateTime object to UTC before you work out the Unix time, and then feed that Unix time to the macro.
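The same conversion in Python terms, for anyone checking their numbers outside Alteryx (a minimal sketch, assuming the input DateTime has already been converted to UTC):

```python
# Unix timestamp = seconds since 1970-01-01 00:00:00 UTC.
from datetime import datetime, timezone

def to_unix_timestamp(dt_utc: datetime) -> int:
    """Seconds since the epoch for a datetime already expressed in UTC."""
    return int(dt_utc.replace(tzinfo=timezone.utc).timestamp())

print(to_unix_timestamp(datetime(1970, 1, 1)))  # 0 -- the epoch itself
```

This matches the `DateTimeDiff([MyUTCDateTime], '1970-01-01', 'seconds')` formula above.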
Performance is surprisingly good, not because of anything I have done, but because of Google's heavy-duty API infrastructure. 100 records take about 1.5 to 2 seconds, and that's on my scrubby old machine with Sky domestic "broadband".
Let me know what you think,