I'm trying to load a large JSON file (about 100 MB), but for some reason it only reads about 4% of the total data, giving me a few thousand records.
Is there a limit to the size of JSON file that Alteryx can read, or could it be the way the file was encoded?
For example, could the encoding introduce breaks within the dataset?
Due to the size of the data, it can't be read as a CSV file.
But when I ran a smaller dataset, reading it in CSV format gives 8,031 records, whereas inputting it as a JSON file gives only 5,155 records (shown in screenshot).
Maybe a Python script to read the JSON file would help too?
Any ideas would be very much appreciated. Thanks!
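Since you mentioned Python: one common cause of a partial read is that the file is actually newline-delimited JSON (one object per line) rather than a single JSON array, and the reader stops at the first record boundary it doesn't expect. Below is a minimal standard-library sketch that tries to handle both layouts; the function name `load_json_records` and the assumption about your file's structure are mine, not from your post:

```python
import json

def load_json_records(path):
    """Load records from a JSON file that may be either a single
    top-level array/object or newline-delimited JSON (JSON Lines)."""
    with open(path, encoding="utf-8") as f:
        text = f.read()
    try:
        # Case 1: the whole file is one JSON document
        data = json.loads(text)
        return data if isinstance(data, list) else [data]
    except json.JSONDecodeError:
        # Case 2: fall back to JSON Lines -- parse each non-empty
        # line as its own JSON object
        return [json.loads(line) for line in text.splitlines() if line.strip()]
```

If the record counts from this function match your CSV counts, the issue is the file layout rather than an Alteryx size limit, and you could feed the parsed records back into your workflow (e.g. via the Python tool).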
@parisli have a look at this article which may help:
This post will help.
There's a simple free macro that works with Designer and handles JSON data sets of any size very easily.