Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

Download tool won't extract URLs inputted in text file

zaina1498
7 - Meteor

Hi, I'm unsure why the Download tool won't download the URLs properly, extract them, and output them. Is there anything I can do to fix this? Why might this issue be occurring? I've attached the workflow below; any help would be greatly appreciated!

21 REPLIES
TheOC
15 - Aurora

@atcodedog05 

Yeah! Very useful. I figured out this was the issue on a previous problem I worked on: Postman was allowing the requests to go through, but they weren't going through in Alteryx. Postman automatically adds some headers, such as User-Agent (in their case, the value is postman x.x).

Worth noting if you're using the download tool!
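To make the header point concrete, here is a minimal plain-Python sketch of what "adding a User-Agent" means (the URL is just a placeholder, and in the Download tool itself the equivalent is adding a User-Agent row on the Headers tab):

```python
import urllib.request

# Some servers reject requests that carry no User-Agent header, which is why
# the same URL can work in Postman (it adds one automatically) but fail
# elsewhere. Setting the header explicitly usually fixes it.
url = "https://example.com/page"  # placeholder URL for illustration
req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})

# The header is now attached to the request before it is ever sent.
print(req.get_header("User-agent"))  # → Mozilla/5.0
```

The request is only constructed here, not sent, so the sketch runs without a network connection.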


Bulien
zaina1498
7 - Meteor

Hi @TheOC ,

 

Would greatly appreciate it if you could! I have Alteryx Designer 2020.3 x64.

TheOC
15 - Aurora

@zaina1498  Hi,

I am on the same version, and weirdly I do not get this issue. There are only a few other things I can think of, but I'm not sure at this point.

Does the machine you're running it on have a stable internet connection?

Is a firewall potentially stopping the requests?


Can you troubleshoot by running my attached workflow? It runs for just one of the rows, to see whether that works.

As you can see from the screenshot, my workflow is a success and displays a result:

TheOC_0-1602176361791.png

 


Bulien
zaina1498
7 - Meteor

Hi @TheOC,

 

This is my work laptop, so I'm not sure about potential firewalls. And yes, your new workflow does run for me for the one link, but if I increase the count I get the same error. It's so frustrating that this keeps happening. Do you think I could extract the same info I need with the Python tool? I'm not sure how to use that tool because I've never used it before.

TheOC
15 - Aurora

This one is weird to me, to be honest!

I have put a filter on, and as far as my side of things goes, none of the download headers are null:

TheOC_0-1602179065716.png

TheOC_1-1602179077542.png

 


If you go into the Sample tool in the last workflow and change it to 50, or 100, does it still work?


Bulien
zaina1498
7 - Meteor

Nope! It didn't even get through 30. Do you think I could extract the same info I need with the Python tool? I'm not sure how to use that tool because I've never used it before.

TheOC
15 - Aurora

You could definitely get the data from the Python tool, but I suspect you will run into the same issue. Just to check, do you happen to know how much RAM your machine has? And have you been fine running big workflows before?


Bulien
zaina1498
7 - Meteor

I have 16 GB of RAM, and I have run other workflows before which also go through a lot of URLs like this one, and there didn't seem to be a problem with that. Would you be able to explain how I would implement the same thing using the Python tool? Again, thank you so much for your help!

TheOC
15 - Aurora

@zaina1498  Okay, so that rules out pretty much everything I can think of as to why it would run on my machine and not yours. And not a problem at all, happy to help!

As for the Python tool: I haven't touched it myself yet, but as far as I understand it just runs Python scripts. You would be looking at something like Beautiful Soup, run for each URL brought in.
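For a rough idea of that approach, here is a minimal sketch of the kind of script you might put in the Python tool (assuming Beautiful Soup is available in its bundled environment, which is worth checking). In the real tool you would read the URL column with Alteryx.read and fetch each page over the network; parsing is shown on a small inline document so the sketch runs offline:

```python
from bs4 import BeautifulSoup

# Stand-in for a downloaded page; in the Python tool you would fetch this
# per URL (e.g. with requests, remembering a User-Agent header).
html = """
<html><body>
  <h1>Example page</h1>
  <a href="https://example.com/a">first link</a>
  <a href="https://example.com/b">second link</a>
</body></html>
"""

# Parse the document and pull out the pieces you need.
soup = BeautifulSoup(html, "html.parser")
title = soup.h1.get_text()
links = [a["href"] for a in soup.find_all("a")]

print(title)  # → Example page
print(links)  # → ['https://example.com/a', 'https://example.com/b']
```

The extracted fields would then be written back to the workflow with Alteryx.write.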

What I would like to try first is a batch macro, running through the URLs iteratively, just to see if it works or gives any clues.
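The batch-macro idea, expressed in plain Python, is simply to process one URL at a time and record failures instead of letting one bad request stop the whole run. This sketch uses a hypothetical fetch function as a stand-in for the real download call so it runs without a network connection:

```python
def fetch(url):
    # Stand-in for a real download (e.g. urllib/requests); raises on failure.
    if "bad" in url:
        raise ConnectionError(f"could not reach {url}")
    return f"<html>payload from {url}</html>"

urls = ["https://example.com/1", "https://example.com/bad", "https://example.com/2"]

results, errors = {}, {}
for url in urls:
    try:
        results[url] = fetch(url)
    except ConnectionError as exc:
        errors[url] = str(exc)  # log the failure and keep going

print(len(results), len(errors))  # → 2 1
```

If the whole run fails only past a certain row count, the per-URL error log is what points at the culprit.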

Please see the attached workflow with a macro; you may need to import the macro.

 

TheOC_0-1602180675068.png

 

If you have to bring in the macro, it should keep both connectors from the original file, and the only thing that needs changing is the control parameter in the Questions tab:

TheOC_1-1602180710894.png


Give that a whirl, hopefully it works! I've tested it on my side and it works fine!

 


Bulien
zaina1498
7 - Meteor

This ran for me perfectly! It took some time but finished running successfully. Thank you so much! You were a huge help today!!
