Good morning all -- I have a workflow that is designed to do the following:
- Use a Directory Tool to find the most recently delivered .gz file to a network drive
- Use a Dynamic Input tool to extract the data from the .csv zipped inside the .gz file
- Perform some transformations and load the results into a data warehouse.
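For anyone who wants a concrete picture of the first two steps, here's a minimal Python sketch of the same logic (find the newest .gz on the share, stream the .csv out of it). The directory and output paths are placeholders, not my actual network drive, and this is just a stand-in for the Directory Tool + Dynamic Input combination, not how Alteryx does it internally:

```python
# Stand-in for the Directory Tool + Dynamic Input steps: pick the most
# recently modified .gz and stream-decompress its contents to a .csv.
import gzip
import shutil
from pathlib import Path

def extract_newest_gz(source_dir: str, dest_csv: str) -> Path:
    """Decompress the newest .gz in source_dir to dest_csv; return its path."""
    newest = max(Path(source_dir).glob("*.gz"), key=lambda p: p.stat().st_mtime)
    with gzip.open(newest, "rb") as src, open(dest_csv, "wb") as dst:
        # Stream in 1 MB chunks so the 10 GB+ file never sits in memory.
        shutil.copyfileobj(src, dst, length=1024 * 1024)
    return newest
```

Streaming the decompression like this is also the workaround I fall back on (extract first, feed the .csv to the workflow), since it only needs disk space for the output file itself.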
This workflow had been working flawlessly until recently, when the .gz files started containing CSVs over 10 GB or so in size (unzipped).
I get an error on the Dynamic Input tool: "Insufficient Disk space - extracting this file will result in less than 10% (or less than 10 GB) of free disk space in your Alteryx temp folder."
This appears to be a different type of error from most of the other threads and Alteryx team member posts about Temp memory management, and I'm at a loss. I can extract the .csv from the .gz file manually and run the workflow using the .csv as an input instead, but I'd prefer to keep using the .gz input if possible.
The error appears immediately when the workflow is executed, so it seems there's an upfront comparison between the assumed size of the data and my available disk space. But since I can extract the file manually and use it as input, it almost seems as though that assumed size is wrong.
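One possible reason the size estimate could be off: the gzip format's trailer stores the uncompressed length only modulo 2^32 (the ISIZE field in RFC 1952), so any estimate read from it is unreliable for files over 4 GB. Whether the Dynamic Input tool actually uses ISIZE for its disk-space check is purely my assumption, but it's cheap to inspect the field and compare it against the real extracted size:

```python
# Read the ISIZE trailer of a .gz file: the uncompressed size modulo 2**32
# (RFC 1952). For a 10 GB file this wraps, so a tool relying on it would
# misjudge the true extracted size.
import struct

def gzip_isize(path: str) -> int:
    """Return the ISIZE field (uncompressed length mod 2**32) of a .gz file."""
    with open(path, "rb") as f:
        f.seek(-4, 2)  # ISIZE is the last 4 bytes, little-endian
        return struct.unpack("<I", f.read(4))[0]
```

If ISIZE disagrees wildly with the actual unzipped size, that would at least explain a bogus free-space comparison, though it wouldn't by itself fix the error.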
From what I understand, there isn't a Windows-managed "Temp folder size" so I'm not sure what the reference to <10GB or <10% could even really mean, if not overall disk space.
Any ideas on how I can overcome this, or else "adjust" the amount of data allowed to reside in the Temp folder?
Some possibly helpful detail:
- The option in the Dynamic Input file template to extract files >2GB is checked
- There are a few blocking tools in the workflow (Unique and Auto Field), but even running the data input section by itself, with no other tools on the canvas, triggers the error