The Run Command Tool is a great way to take your workflow to the next level of efficiency. It allows you to interact with the command line directly, just as you would if you accessed it manually and typed in a command. That's great, because sometimes we have a lot of important work to do in the command line.
So we’re now downloading all the network-shared documents we want thanks to instructions posted on our Knowledge Base, and we’re on our way to mastering FTP in Alteryx. But what if we want to take it a step further? A lot of our users rely on FTP as a drop zone for datasets that are generated periodically (e.g. weekly, monthly, or quarterly datasets). We should then be able to schedule a workflow to coincide with those updates, automatically select the most recent dataset, crank out all the sweet data blending and analytics we have in our scheduled workflow, and proceed with the rest of our lives, right? Right. We can do just that, and with a little work up front, you can automate your FTP download and analysis to run while you’re enjoying the finer things in life. Here’s how in v10.1:
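Outside Designer, the core of that "grab the newest drop" step is small. Here's a minimal Python sketch, assuming a hypothetical host and a date-stamped filename pattern like sales_YYYY-MM-DD.csv; your server and naming convention will differ.

```python
# Sketch of the scheduled-download idea: connect to an FTP drop zone and
# grab the most recent dataset. The host, folder, and filename pattern
# are illustrative assumptions, not a real server.
import re
from ftplib import FTP

def latest_file(names, pattern=r"sales_(\d{4}-\d{2}-\d{2})\.csv"):
    """Return the name whose embedded ISO date is newest, or None."""
    dated = [(m.group(1), n) for n in names
             if (m := re.fullmatch(pattern, n))]
    return max(dated)[1] if dated else None

def download_latest(host, user, password, folder="/dropzone"):
    with FTP(host) as ftp:                 # connect and authenticate
        ftp.login(user, password)
        ftp.cwd(folder)
        name = latest_file(ftp.nlst())     # newest file by date in its name
        if name:
            with open(name, "wb") as f:
                ftp.retrbinary(f"RETR {name}", f.write)
        return name

print(latest_file(["sales_2016-01-04.csv", "sales_2016-02-01.csv"]))
```

Scheduling this (or the equivalent workflow) to run just after the drop zone updates is what closes the loop.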
The following steps detail how to obtain a client ID, client secret, and refresh token that can be used for authentication with Google-related tools.
1. Open the Google Developers' Console
2. Log in with the Google account associated with the data you would like to analyze
3. Create a new project by clicking the My Project dropdown (top-left corner) and selecting Create project (top-right corner of the pop-up)
4. Enter a Project name of your choosing and click Create
5. If you have not already enabled the Google API you will be working with, you can do so by navigating back to the webpage we started on, the Console Dashboard, and clicking Enable API:
For Google Analytics:
Other popular APIs >> Analytics API
For Google Drive:
G Suite APIs >> Drive API
6. After you've confirmed that your API is enabled you can obtain API credentials by returning to the Console and clicking on Credentials in the left-hand navigation pane next to the key icon
7. Click on the Create Credentials dropdown and select OAuth client ID:
8. Select the Web application radio button and add https://developers.google.com/oauthplayground as an Authorized redirect URI before clicking Create
9. At this stage, a pop up should appear where you can copy and save your Client ID and Client Secret
You can also find your Client ID and Client Secret by returning to the Developers' Console >> Credentials and clicking the name of the app you just created:
10. Go to https://developers.google.com/oauthplayground
11. Click on the gear icon in the top-right corner of the page, check the box for Use your own OAuth credentials, enter the client ID and client secret from step 9, and click Close
12. Copy/paste the respective scopes into the Input your own scopes field and click Authorize APIs
For Google Analytics:
For Google Sheets:
https://www.googleapis.com/auth/drive, https://www.googleapis.com/auth/drive.appdata, https://www.googleapis.com/auth/drive.readonly, etc.
13. Click Allow
14. Click Exchange authorization code for tokens and save the Refresh token
15. Test the authorization by sending a request for an available operation from List possible operations
16. If successful, the client ID, client secret, and refresh token that you obtained in the prior steps can now be used for authentication with the Google-related tools
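For the curious, the token exchange the Playground performs in the Exchange authorization code for tokens step can also be scripted. This Python sketch only builds the request against Google's standard OAuth 2.0 token endpoint; the credential strings are placeholders, not real secrets.

```python
# Build the HTTP request that exchanges a refresh token for a fresh
# access token (what the OAuth Playground does behind the scenes).
# The credential values below are placeholders.
from urllib.parse import urlencode
from urllib.request import Request

TOKEN_URL = "https://oauth2.googleapis.com/token"

def build_refresh_request(client_id, client_secret, refresh_token):
    body = urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",   # tells Google which OAuth flow this is
    }).encode()
    return Request(TOKEN_URL, data=body,
                   headers={"Content-Type": "application/x-www-form-urlencoded"})

req = build_refresh_request("CLIENT_ID", "CLIENT_SECRET", "REFRESH_TOKEN")
print(req.full_url)
```

Sending the request with urllib.request.urlopen returns a JSON body containing an access_token; the client ID, client secret, and refresh token themselves are what the Google tools ask you to save.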
Web scraping, the process of extracting information (usually tabulated) from websites, is an extremely useful way to gather web-hosted data that isn't supplied via APIs. In many cases, if the data you are looking for is stand-alone or captured completely on one page (no need for dynamic API queries), scraping is even faster than developing a direct API connection.
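As a taste of what that looks like outside any particular tool, here's a minimal Python sketch that pulls a simple HTML table into rows; the HTML snippet is a made-up sample page, not a real site.

```python
# Pull a simple HTML table into rows: the core move of scraping
# tabulated web data. The HTML string is a made-up sample page.
from html.parser import HTMLParser

class TableParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows, self.row, self.in_cell = [], None, False
    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self.row = []                 # start collecting a new row
        elif tag in ("td", "th"):
            self.in_cell = True
    def handle_endtag(self, tag):
        if tag == "tr" and self.row is not None:
            self.rows.append(self.row)    # row complete
            self.row = None
        elif tag in ("td", "th"):
            self.in_cell = False
    def handle_data(self, data):
        if self.in_cell:
            self.row.append(data.strip())

html = ("<table><tr><th>City</th><th>Pop</th></tr>"
        "<tr><td>Oslo</td><td>700000</td></tr></table>")
p = TableParser()
p.feed(html)
print(p.rows)    # [['City', 'Pop'], ['Oslo', '700000']]
```

In practice the HTML would come from a Download-Tool-style HTTP request rather than a hard-coded string.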
Fact: workflows are the best. Look it up. They're all about getting things done and, with hundreds of tools and the ability to integrate external processes, there's no shortage of things you can get done. We know that some areas of analytics require a little extra firepower, however, and that's why you can leverage your workflows in apps and macros for added functionality.
API connections give access to many web-based applications, database systems, or programs by exposing objects or actions to a developer in an abstracted format that can easily be integrated into another program.
Connecting to Google Analytics is becoming more and more popular. There are a few things you need in order to use the Google Analytics macro: a Google account (e.g., Gmail) and authorized access to an existing Google Analytics account. This article will help you get the rest of the way.
Hello #AlteryxAddicts,

tl;dr - Alteryx Partners leveraging the Designer API via .NET should not upgrade to Alteryx Designer 11.8, 2018.1, or 2018.2, because doing so will render API integration inoperable. The issue will be addressed in Alteryx Designer 2018.3 (expected GA end of August 2018), at which time users can safely upgrade.

FAQ

What Happened? We uncovered a previously unknown issue that renders the Designer API from .NET inoperable for Alteryx Designer 11.8, 2018.1, and 2018.2. (Note: the API code itself is sound; the issue is 100% attributable to licensing.)

What is the Designer API? This notice is specifically for APIs accessed via the .NET assemblies such as "Allocate.Net.dll" and "AlteryxAPI.Net.dll"; documentation can be found on page 1 of [installdirectory]\Alteryx\APIs\readme.pdf.

Why did this happen? In our work to improve the Alteryx Designer licensing experience, we inadvertently broke the ability to license the API in Designer when using the .NET assemblies. The issue was identified in a timely manner and is expected to be resolved in Alteryx Designer 2018.3.

What will this problem look like? The user will receive a message that the API is not licensed.

Who's Impacted? Anyone leveraging the Alteryx API on Designer 11.8, 2018.1, and/or 2018.2.

Who's Not Impacted? Anyone leveraging the Alteryx Designer API in Alteryx 11.7 and older. Alteryx Designer APIs accessed via C++ or the CLI were not impacted and will continue to work as expected.

When will the issue be fixed? Alteryx Designer 2018.3 (expected GA end of August 2018).

Next Steps? If you are using 11.8, 2018.1, or 2018.2 and the Alteryx API, this issue directly impacts you. We recommend downgrading to 11.7 until 2018.3 is released; otherwise, wait until 2018.3 is released and use API integration with it. Note: you can also stay on Alteryx 11.7 or older without interruption until you choose to upgrade to 2018.3 (or a future release).

Questions? Contact email@example.com

Alter Everything!
-- Alteryx Product Extensibility team
In a workflow, not too far, far away...
Structured data has vanished. In its absence, the sinister Dirty Data Order has risen from the ashes of the Server and will not rest until Data Analytics have been destroyed.
With the support of the Alteryx Engineers, Solutions Engineer Tony Moses leads a brave RESISTANCE. He is desperate to find structured data and gain its help in restoring blending, joining and analytics to the galaxy.
Tony has sent his most daring Community Leader, Matt DeSimone, on a secret mission to Jakku, where an old ally has discovered a clue to the structured data's whereabouts....
Welcome to the Star Wars universe!
Ever wanted to know the most important details of your favorite characters from Star Wars? Me too!
Our generous friends, Paul Hallett and team, have given us the Star Wars API - the world's first quantified and programmatically-accessible store of Star Wars data.
After hours of watching films and trawling through content online, Paul presents us with all the People, Films, Species, Starships, Vehicles and Planets from Star Wars.
The data is formatted in JSON and exposed through a REST implementation that allows us to programmatically collect and measure it.
Now, how was I able to retrieve this treasure of information via Alteryx? Easy! I've built a REST API connection using the Download Tool to pull information based on a user-entered query in an Alteryx Application (attached as v2018.1 Star Wars.yxwz).
Normally, once you've retrieved JSON-formatted data, structuring and parsing it would be a nightmare! With Alteryx, this is just one tool away. The JSON Parse Tool allows you to identify the JSON field, in this case our download data field, and easily extract Name and Value columns. From there it's some simple formatting and using the reporting tools to present a nice clean composer file (.pcxml).
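For comparison, here's what that retrieve-and-flatten step looks like in plain Python. The payload is a trimmed sample in SWAPI's format rather than a live response, and the flatten helper mimics the Name/Value output of the JSON Parse Tool.

```python
# Flatten a SWAPI-style JSON payload into Name/Value pairs, the same
# shape the JSON Parse Tool produces. The payload is a trimmed sample
# in SWAPI's format, not a live API response.
import json

sample = ('{"name": "Luke Skywalker", "height": "172", '
          '"films": ["A New Hope", "The Empire Strikes Back"]}')

def flatten(obj, prefix=""):
    """Yield (Name, Value) pairs with dotted/indexed paths."""
    if isinstance(obj, dict):
        for k, v in obj.items():
            yield from flatten(v, f"{prefix}.{k}" if prefix else k)
    elif isinstance(obj, list):
        for i, v in enumerate(obj):
            yield from flatten(v, f"{prefix}.{i}" if prefix else str(i))
    else:
        yield prefix, obj

pairs = dict(flatten(json.loads(sample)))
print(pairs["films.0"])   # A New Hope
```

A real run would swap the hard-coded sample for the body of an HTTP GET against the API's people endpoint.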
Man, if only the Rebels could process information as fast as Alteryx, they wouldn't have had to send poor R2 to find Obi-Wan.
I'll be bringing you, the Alteryx Community, updates of the app with each new movie release!
I hope you enjoy the API and may the Force be with you!
If you haven’t used the Run Command Tool just yet, that’s great. It means that whatever your analyses required, basic Designer functionality had it covered. But in spite of how great the Designer is, it just can’t do everything. There is a utility on your computer that can do just about anything, however, and it’s the command line. The Run Command Tool pairs the two into a dynamic tag-team duo that can wrestle all the computation you could need into one integrated Designer workflow:
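Conceptually, the tool's cycle is: write the data out, invoke an external command, read the results back in. A plain-Python sketch of that cycle (the sort command is just an arbitrary stand-in):

```python
# Mimic the Run Command Tool's cycle: write data to disk, invoke an
# external command on it, and read the results back in. The sorting
# command here is an illustrative stand-in for any CLI utility.
import csv, os, subprocess, sys, tempfile

rows = [["b", "2"], ["a", "1"]]

with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "in.csv")
    with open(src, "w", newline="") as f:
        csv.writer(f).writerows(rows)            # 1. write the data out

    # 2. run an external command (here: Python itself sorting the file)
    out = subprocess.run(
        [sys.executable, "-c",
         "import sys; print(''.join(sorted(open(sys.argv[1]))), end='')",
         src],
        capture_output=True, text=True, check=True)

    result = list(csv.reader(out.stdout.splitlines()))   # 3. read back in

print(result)    # [['a', '1'], ['b', '2']]
```

In a workflow, the Run Command Tool handles all three steps for you; the external command can be anything the command line can run.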
Question Does Alteryx support web crawling?
Yes. In Alteryx you can look at a web page, find embedded links (e.g. using regular expressions), and add them to a queue of "links to visit". Then continue visiting and adding indefinitely, while also extracting various other tidbits of interest from each page visited.
In a Text Input Tool, enter URLs to crawl. Alteryx can take the URLs from a data stream (a database where we have all of the URLs we want to crawl) and iteratively repeat the process of connecting and getting the code beneath that URL:
Use the Download Tool and point it to a web address:
Alteryx returns the whole content available for that URL:
The attached v10.0 workflow allows you to connect to Wikipedia and "crawl" the content of that URL. The content can be saved, parsed, etc. Additional functionality may be added to create a very powerful crawling engine.
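Stripped to its essentials, the visit/extract/queue loop described above looks like this Python sketch; the pages here are canned HTML samples, and a real crawler would fetch each URL over HTTP instead.

```python
# Minimal crawl loop: pull links out of a page with a regex and feed
# them back into a queue of URLs to visit. fetch() returns canned HTML
# here; a real crawler would issue an HTTP GET instead.
import re
from collections import deque

PAGES = {  # stand-in for the web: URL -> HTML
    "https://example.com/": '<a href="https://example.com/a">A</a>',
    "https://example.com/a": '<a href="https://example.com/">home</a>',
}

def fetch(url):
    return PAGES.get(url, "")

def crawl(seed, limit=10):
    queue, seen = deque([seed]), set()
    while queue and len(seen) < limit:
        url = queue.popleft()
        if url in seen:
            continue                       # skip pages already visited
        seen.add(url)
        # extract embedded links with a regular expression
        for link in re.findall(r'href="([^"]+)"', fetch(url)):
            queue.append(link)
    return seen

print(sorted(crawl("https://example.com/")))
```

The `limit` cap and the `seen` set are what keep "continue indefinitely" from becoming "continue forever on the same two pages".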
Question Can you wait X seconds between processing each row in Alteryx?
Yes! Thanks to Inviso for creating the Inviso Macro Pack and posting it on their blog here.
The "Wait a Second" macro lets you wait X number of seconds before processing each row in the dataset.
One application is contacting an API with multiple requests. The WaitASecond macro pauses between requests, giving the API long enough to process multiple rows without issue.
It can also be used to scrape sites without putting heavy loads on their servers; Inviso has a sample of scraping the Alteryx Community (see Insights to the Alteryx Community).
As you can see, the part of the flow that runs through the WaitASecond tool gets NOW timestamps that are 5 seconds apart, whereas the bottom stream, which does not run through the WaitASecond tool, gets the same timestamp on every row.
There are essentially two macros:
The first one assigns a unique id to each record and then uses that ID for the batch macro.
The batch macro has a "Command tool" that runs a ping that waits x seconds before timing out (it pings 184.108.40.206; if that address exists on your network, this won't work).
The macro can be downloaded here (InvisoMacros.zip).
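The macro's effect boils down to sleeping a fixed interval before each record. A minimal Python equivalent (the 0.1-second delay is just for demonstration):

```python
# Pause a fixed interval before handling each record, e.g. to stay
# under an API's rate limit. The 0.1-second delay is only for demo.
import time

def process_with_delay(rows, seconds, handle):
    out = []
    for row in rows:
        time.sleep(seconds)      # wait before each row, like WaitASecond
        out.append(handle(row))
    return out

start = time.monotonic()
result = process_with_delay([1, 2, 3], 0.1, lambda r: r * 2)
elapsed = time.monotonic() - start
print(result)    # [2, 4, 6]
```

Three rows at 0.1 seconds each guarantees at least 0.3 seconds of elapsed time, which is exactly the spacing the NOW-timestamp screenshot illustrates (at a 5-second setting).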
To do your best data blending, you need the flexibility to connect to as many data stores as possible. No puzzle reveals a complete picture without all the pieces in place, and the same adage holds true in analytics. While we’re proud to boast a list of supported input file formats and data platforms that may even be large enough for database storage itself, in the ever-expanding world of data you unfortunately just can’t catch them all. Enter the Download Tool. In addition to FTP access, this tool can web scrape or transfer data via API (check your data source – there’s almost always an API!), giving you access to even the most secluded data stores. With the examples compiled below, and the wealth of data accessible on the web, you can turn nearly any analytical puzzle into the Mona Lisa: