I currently have a workflow that calls an API to run a report, which returns a URL pointing to the report as a CSV. When I push that URL back through the Download tool and output to a string, I get the following error: "Error in libCURL Data Callback: Data too long to fit in field. Write to a file instead."
I tried downloading to a temporary file, but I'm not sure how to access that data afterwards, since it needs further manipulation. I also tried downloading to a folder on my computer, but received an access denied error; I'm unsure whether my syntax was incorrect or there is an issue with my settings.
I would also like to store this workflow on my Alteryx Server and schedule daily refreshes, so I'm not sure whether the download-to-a-folder technique would work there.
Any feedback is appreciated. Thanks!
Create a local file in a folder where you have access: in a Text Input tool, put the URL and the full path with the file name, point the Download tool's output option at that path field, and it will work.
On the server you will have to replace it with a folder using a UNC path, e.g. \\servidor\folder\file name.???, and the Alteryx Server service user will need access to that folder.
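To illustrate the same idea outside of Alteryx (fetch the URL and stream it to a full path the process can write to), here is a minimal Python sketch; the report URL and UNC path below are placeholders, not real values from this workflow:

```python
import requests

# Placeholder values: replace with the real report URL and a folder that the
# workflow (or the Alteryx Server service account) can write to.
report_url = "https://example.com/report.csv"
output_path = r"\\servidor\folder\report.csv"

# Stream the response to disk in chunks so the full report never has to
# fit in a single in-memory field.
with requests.get(report_url, stream=True, timeout=60) as response:
    response.raise_for_status()
    with open(output_path, "wb") as f:
        for chunk in response.iter_content(chunk_size=8192):
            f.write(chunk)
```

If writing to that path fails here too, the problem is folder permissions rather than the Download tool's configuration.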
@ryannschuessler I would try the temporary file path. I've set up a simple example using a CSV from data.gov.
After your Download tool, use a Dynamic Input tool to read your file. To set up the Dynamic Input, you may need to run the workflow first. The Download tool saves the CSV to a .tmp file in your temp space and puts the full path in the DownloadTempFile field, as shown in the pic. You can copy that path into the Source Template section of your Dynamic Input.
You can also see I configured the Dynamic Input to use the DownloadTempFile field and to change the entire file path, so it will pick up the newly downloaded file whenever the workflow is run. If the Dynamic Input tool complains about not being configured, try running the workflow anyway to see if it resolves itself.
Hope that helps!
Hey, @ryannschuessler. I love the replies from geraldo and Patrick. If the Download tool is too difficult for this scenario, I'd take the approach of using the Python tool since downloading files with Python is fairly straightforward in many cases. I recommend using the requests library.
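If you go that route, here is a minimal sketch of what the Python tool code could look like using requests and pandas; the report URL is a placeholder, and the Alteryx.write call in the comment only applies when the code runs inside the Alteryx Python tool:

```python
import io

import pandas as pd
import requests

# Placeholder URL: replace with the report URL returned by your API call.
report_url = "https://example.com/report.csv"

response = requests.get(report_url, timeout=60)
response.raise_for_status()

# Parse the CSV body straight into a DataFrame for further manipulation.
df = pd.read_csv(io.StringIO(response.text))
print(df.head())

# Inside the Alteryx Python tool, the DataFrame can then be passed downstream:
#   from ayx import Alteryx
#   Alteryx.write(df, 1)
```

Because the CSV is parsed in memory here, there is no string-field length limit to worry about, and the rest of the workflow can pick up the data from the Python tool's output anchor.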