
Alteryx Designer Desktop Discussions

SOLVED

Using Download Tool to get file from site with login

Troy
8 - Asteroid

I'm not having much luck going to a web site and getting past the login form using the Download tool.  See the attached picture of the Payload tab of the Download tool configuration.

 

Process:

1.  Go to the website (in this case it is freshbooks.com)

2.  Log in with credentials (I can't seem to get past this point)

3.  Get the report data from the page and feed it into the process

 

Any suggestions would be appreciated, as well as tips on how to troubleshoot the Download tool.


Thanks!

8 REPLIES
AndrewW
11 - Bolide

I had to do something similar last week and ended up writing a simple script using Curl.exe and the Run Command tool instead of the Download tool.
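
The general shape was something like this (placeholder URLs, fields, and file names, not the exact script): first a POST to the login page that saves the session cookie, then a second request that reuses the cookie to pull the file.

curl -c cookies.txt -d "username=<user>&password=<pass>" https://<site>/login
curl -b cookies.txt -o report.csv https://<site>/path/to/report

The Run Command tool then just calls the batch file that wraps these two lines.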

michael_treadwell
ACE Emeritus

You will more than likely have to utilize the FreshBooks API https://www.freshbooks.com/developers to authenticate and access information from your account.

 

The best way that I have found to troubleshoot the Download tool is using Fiddler http://www.telerik.com/fiddler to monitor the requests it sends.
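
As a rough illustration only (the exact host, endpoint, and auth scheme depend on which version of the FreshBooks API you use, so treat everything in angle brackets as a placeholder), an authenticated API request from curl looks something like:

curl -H "Authorization: Bearer <access_token>" -H "Accept: application/json" https://<api_host>/<reports_endpoint>

In the Download tool, the equivalent is adding the Authorization and Accept headers on the Headers tab and putting the API URL in the URL field.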

Troy
8 - Asteroid

Thanks for this; after lots of tinkering I got it to work. This is more flexible, and I can't see why it wouldn't work on any web site.

 

For the next person, here is my cURL script:

 

curl -c cookies.txt -X POST -F username=<Your_UserName> -F password=<Your_Password> https://<URL of login page> -H "Accept: application/json" --next -X GET https://<URL>/internal/export_Report_EstimatedBilling?format=excel --output text.xls

 

which does the following:

1. Goes to the URL of the login page and posts the user name and password

2. Goes to the report and gets the XLS file

3. Saves the file to text.xls

 

I'm sure there are many ways to do this, but I put the above in a batch file, added a pause for debugging, and called it with the Run Command tool in my workflow (see the sketch below).
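
For reference, the batch file is nothing more than the command above plus a pause so the window stays open long enough to read any curl errors (paths and fields are placeholders):

@echo off
REM single curl call: log in, then (--next) fetch the report
curl -c cookies.txt -X POST -F username=<Your_UserName> -F password=<Your_Password> https://<URL of login page> -H "Accept: application/json" --next -X GET https://<URL>/internal/export_Report_EstimatedBilling?format=excel --output text.xls
pause

The Run Command tool then points at this .bat file.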

 

pratapganpam
5 - Atom

Hi,

 

I am new to Alteryx and have completed my core certification. I require the same kind of help.

 

Process:

Step 1: Log in to a website with my credentials.

Step 2: Search with a 'tag word' in one particular search field on that page.

Step 3: Compare the results with the other tag name and select the link.

Step 4: Open the links and process the output in Alteryx.

 

Please elaborate on the steps involved, or possibly share a small workflow that shows how to deal with the login and how to open the links that are part of the page.

 

Thank you in advance

Adam12
5 - Atom

Troy, can you upload your Curl Command workflow? Thanks!

Adam12
5 - Atom

(For #cdgogan.)

 

I believe I posted this when I was trying to get a WF working that would download multiple CSV files from an Amazon S3 site into Designer for cleanup, then load them to tables in SQL Server. I do have that up and running now, though it's never been 'easy' or without versioning issues. My solution did not include cURL and required hard-coding my desktop login credentials (rather than AD) in the Server version of the WF, since the cleanup work happens using my desktop's memory. The WF itself is simple, but getting all the configurations just right is another matter.

 

I'm attaching a sample WF that includes the tools needed to download from an Amazon S3 URL, load the CSV to desktop temp space, clean up the file as needed, and then load it to the DB. I won't be surprised if additional Q&A is needed for you to get it working with your own solution.

velisetty
6 - Meteoroid

@Troy - can you please upload a sample workflow? 

Which tool are you using to write the script?

Troy
8 - Asteroid

See the attached picture, but here are the full details (obscured with <FIELD> where needed):

 

Output: blank

Command: C:\Users\<USER>\AppData\Local\Apps\cURL\bin\curl

Command Arguments:  -c cookies.txt -X POST -F username=<USERNAME> -F password=<PWD> https://<COMPANY>.freshbooks.com -H "Accept: application/json" --next -o ..\Input\download-freshbooks.csv "https://<COMPANY>.freshbooks.com/internal/export_Report_EstimatedBilling?project_status=_all&date_start=01/01/13&group_by=team&billed_filter=&submit=submit&format=csv"

 

This goes to the first URL, submits USERNAME and PWD, and then goes to the (--next) page that returns a CSV file.

 

I didn't use a tool to get the URL other than Chrome Inspect.  I navigated to the website and got the URL from the reporting page I needed.  It took some trial and error.

 

Hope this helps some.
