hi @harshal98 I believe this is simply a memory limit on your machine. Here I tested with 2.5 GB of data in Alteryx and imported it into the Python tool the same way you did:
However, if you check Task Manager, it is definitely not just 2.5 GB of data anymore, because Python has to hold the whole dataset uncompressed in RAM.
Honestly, my recommendation would be to cleanse the data as much as you can before importing it into the Python tool. Sadly, Python has a much smaller limit on data size than Alteryx (it is RAM-dependent), since it does not compress the data the way the Alteryx engine does.
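To make "cleanse before importing" concrete, here's a minimal pandas sketch of shrinking a DataFrame's memory footprint by downcasting numeric columns and converting low-cardinality strings to categoricals. The `shrink_df` helper and the sample columns are my own illustration, not anything from your workflow:

```python
import numpy as np
import pandas as pd

def shrink_df(df):
    """Return a copy of df with numeric columns downcast and
    low-cardinality string columns converted to category dtype."""
    out = df.copy()
    for col in out.select_dtypes(include="integer").columns:
        out[col] = pd.to_numeric(out[col], downcast="integer")
    for col in out.select_dtypes(include="float").columns:
        out[col] = pd.to_numeric(out[col], downcast="float")
    for col in out.select_dtypes(include="object").columns:
        # Only worthwhile when values repeat a lot
        if out[col].nunique() < len(out) * 0.5:
            out[col] = out[col].astype("category")
    return out

df = pd.DataFrame({
    "id": np.arange(100_000),                                   # int64 -> int32
    "score": np.random.rand(100_000),                           # float64 -> float32
    "region": np.random.choice(["N", "S", "E", "W"], 100_000),  # object -> category
})
before = df.memory_usage(deep=True).sum()
after = shrink_df(df).memory_usage(deep=True).sum()
print(f"{before:,} bytes -> {after:,} bytes")
```

On data with lots of repeated strings this alone can cut RAM usage by well over half, which buys you headroom inside the Python tool.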
The only other thing I can think of is splitting the data into multiple chunks, feeding each one into the Python tool separately, and then looping inside the tool to run your code on each chunk in turn. Because you overwrite the 'df' on each iteration, only one chunk sits in memory at a time, which should reduce the memory required.
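A minimal sketch of that chunk-and-loop pattern, using pandas' `chunksize` streaming as a stand-in (inside the Python tool you would instead read each input connection, e.g. with `Alteryx.read("#1")`, `Alteryx.read("#2")`, and so on; the column name `value` and the per-chunk sum are just placeholders for whatever code you are trying to run):

```python
import io
import pandas as pd

# Stand-in data source; in the Python tool each chunk would come
# from a separate input connection instead of a CSV stream.
csv = io.StringIO("value\n" + "\n".join(str(i) for i in range(10)))

total = 0
for chunk in pd.read_csv(csv, chunksize=3):  # only one chunk in RAM at a time
    total += chunk["value"].sum()            # replace with your own per-chunk logic
print(total)
```

Each pass through the loop replaces the previous chunk's DataFrame, so peak memory is roughly one chunk rather than the full dataset.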