If you import the file as plain text using the Input Tool, you can then use the Formula Tool to update the text within the KML file. Even better, you can pull in all KML files using the Input Tool and a wildcard character, attach the file name to each of the records, and then output dynamically as well.
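Outside of Alteryx, the same idea can be sketched in Python: read every KML file matched by a wildcard as plain text and tag each record with its source file name, mirroring the Input Tool's file-name option. The `data/*.kml` pattern and record layout here are illustrative assumptions, not from the original post.

```python
import glob
import os

def read_kml_as_text(pattern):
    """Read all files matching the wildcard pattern as plain text lines."""
    records = []
    for path in glob.glob(pattern):
        name = os.path.basename(path)
        with open(path, encoding="utf-8") as f:
            for line in f:
                # Keep the file name alongside each text record, like
                # attaching the file name in the Input Tool.
                records.append({"file": name, "text": line.rstrip("\n")})
    return records

# e.g. records = read_kml_as_text("data/*.kml")
```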
I created a write-up on the Knowledge Sharing site that has a bit more explanation as well as an example module to download, check it out!
This same methodology can be used to recover a CSV with import errors caused by an extra delimiter. Normally, if you select "treat errors as warnings" the CSV will still process; however, the row with the extra delimiter will be blank. If you set the delimiter in the Input Tool to "\0" and then use the Parse Tool right after, it's just a matter of cleaning to recover the once-lost information.
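A minimal Python sketch of that recovery idea: read each row as one undelimited field, then split it yourself, so rows with an extra delimiter are set aside for cleanup instead of coming through blank. The delimiter and expected field count are assumptions for illustration.

```python
def recover_rows(lines, delimiter=",", expected_fields=3):
    """Split rows manually; separate well-formed rows from suspect ones."""
    good, suspect = [], []
    for line in lines:
        parts = line.rstrip("\n").split(delimiter)
        if len(parts) == expected_fields:
            good.append(parts)
        else:
            # Rows with extra (or missing) delimiters land here for
            # manual cleaning rather than being silently dropped.
            suspect.append(parts)
    return good, suspect
```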
I use this all the time when I receive dirty data from clients.
Hi Chad, thanks for your help and the example; it works very smoothly.
One remark for other people trying to do the same, though: please check the "Field Length" under the CSV input options. A KML polygon is easily more than the standard 254 characters, so with the default settings it would get truncated.
I've also used this trick when there are too many delimiters, but instead of parsing, I use the "REGEX_CountMatches" formula function to count all the delimiters in each row, a Summarize tool to see how many different values I have, and then a Filter to pull out the records where there are too many (or too few) delimiters. Now I can fix the problems if they're obvious, or go back to the data supplier if they're not -- with specific problems, not just "your data file is dirty"!
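The same count/summarize/filter flow can be sketched in Python: count the delimiters in each row, take the most common count as the expected one, and pull out the rows that disagree. Treating the most common count as correct is my assumption here, and the sample rows are invented.

```python
from collections import Counter

def flag_bad_rows(lines, delimiter=","):
    """Return rows whose delimiter count differs from the most common count."""
    counts = [line.count(delimiter) for line in lines]
    # Assume the most frequent delimiter count is the correct one.
    expected = Counter(counts).most_common(1)[0][0]
    return [line for line, n in zip(lines, counts) if n != expected]
```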