There are some things that we would like to do with the download tool that currently are not possible:
1. Use client-side certificates to sign requests. This is a requirement for us in a project where we interact with the API of our customer's financial system: they provide us with a certificate, and it is used to sign our requests along with other authentication. Currently we have to use the external command tool to execute a PowerShell script that uses Invoke-RestMethod for this interaction. I would much prefer not to have more tools in the chain.
Client certificates are described in the TLS 1.0 specification: https://tools.ietf.org/html/rfc2246 (page 43), and I believe them to be supported by cURL.
2. Multipart form-data. We have a number of workflows where we need to send multipart form-data as part of a POST request. As this is not supported by the Download Tool, we have again used the external command tool to execute Invoke-RestMethod or Invoke-WebRequest in PowerShell. I don't know whether modifying the Download Tool would be the best approach here, versus having either a dedicated tool for multipart form-data or an HTTP-POST-specific tool that could handle it. What I envisage is something like the Formula tool, with the ability to add an arbitrary number of fields, where we could use a formula to output a value from a column directly, synthesise a new value, or enter a static value. The tool would then compose these into parts separated by boundaries and calculate the Content-Length for the HTTP request. (Both scenarios are sketched below.)
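For reference, here is roughly what our PowerShell scripts do today, translated into a minimal sketch with Python's requests library; the endpoints, file names, and form fields are hypothetical, and requests handles the part boundaries and Content-Length automatically:

```python
# Minimal sketch of both requests the Download tool can't make today,
# using Python's "requests" library. All URLs, file names, and form
# fields below are hypothetical placeholders.
import requests

# 1. Client-side certificate: the cert/key pair is presented during
#    the TLS handshake (the equivalent of curl's --cert/--key).
resp = requests.get(
    "https://api.example.com/accounts",           # hypothetical endpoint
    cert=("client.crt", "client.key"),            # PEM certificate + key
    headers={"Authorization": "Bearer <token>"},  # plus other auth
)

# 2. Multipart form-data POST: requests generates the boundaries
#    between the parts and the Content-Length header itself.
with open("report.csv", "rb") as f:
    resp = requests.post(
        "https://api.example.com/upload",         # hypothetical endpoint
        data={"description": "monthly report"},   # plain form fields
        files={"file": ("report.csv", f, "text/csv")},  # file part
    )
print(resp.status_code)
```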
This has probably been mentioned before, but in case it hasn't....
The dynamic input tool is useful for bringing in multiple files / tabs, but quickly stops being fit for purpose if schemas / fields differ even slightly. The common solution is to then use a dynamic input tool inside a batch macro and set this macro to 'Auto Configure by Name', so that it waits for all files to be run and then can output knowing what it has received.
It's a pain to create these batch macros for relatively straightforward and regular processes - would it be possible to have this 'Auto Configure by Name' as an option directly in the Dynamic Input tool, removing the need for a batch macro?
This has probably been mentioned before, but in case it hasn't....
Right now, if the Dynamic Input tool skips a file (which it often does!), it just raises a warning and continues processing. Whilst continuing is often useful, could the tool offer an option to 'error if files are skipped'?
Right now it is easy to miss that this is happening, and in production / on Server you may want the process to stop.
With the new Intelligence Suite there is much heavier use of blob files, and we would like to be able to bring them in as a regular input instead of having to use non-standard tools like Image, Report Text, or a combination of Directory/Blob or Input/Download to pull in images, etc. I would like to see the standard Input tool capable of bringing in blob files as well.
Please upgrade the "curl.exe" that is packaged with Designer from 7.15 to 7.55 or greater to allow for the -k flag. Also, please allow the -k functionality in the Alteryx Download tool.
(TLS) By default, every SSL connection curl makes is verified to be secure. This option allows curl to proceed and operate even for server connections otherwise considered insecure.
The server connection is verified by making sure the server's certificate contains the right name and verifies successfully using the cert store.
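To illustrate what -k does, here is the same behaviour as a minimal Python requests sketch (hypothetical URL); verify=False is the requests equivalent of curl's -k / --insecure:

```python
# What curl's -k / --insecure flag does, expressed with Python requests:
# skip verification of the server's certificate chain and hostname.
import requests
import urllib3

# Silence the warning requests emits for unverified HTTPS requests.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

# Hypothetical internal endpoint with a self-signed certificate; only
# use this for servers you already trust.
resp = requests.get("https://internal.example.com/data", verify=False)
print(resp.status_code)
```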
Please enhance the Dynamic Select tool to allow the data type to be changed dynamically too, whether via formula or via an update from an Action tool in a macro. If you've ever wanted to mass-change types or adjust precision in a macro, you're currently forced to use a Multi-Field Formula tool. It would be rather helpful and appreciated.
We need color coding in the SQL editor window for Input tools. We are always having to pull our code out of there and copy it into a Teradata window so it is easier to read/troubleshoot. This would save us some time and some hassle and would improve the Alteryx user experience. (I think you've used a couple of my ideas already. This one is a good one too.)
When training people on the use of Action tools, something I always have to hit on is that when you are telling the tool which piece of the XML you are adjusting, it's hard to tell what you have selected, and super easy to accidentally select something else.
When you initially select the action to take, it's highlighted in this nice blue color. However, it still doesn't feel like you have actually selected anything or told the Action tool what to do, since it's so easy to click on any other one of these actions.
A slightly different problem is that an action that has been previously configured shows in just a light grey color, so it can be easy to accidentally change your settings because you may not realize it's already set up.
Here is a recent community post that sort of outlines a few of these problems.
Who needs a 1073741823-character string anyway? No one, or close enough to no one. But if you are creating some fancy new fields in the Formula tool, just cranking along, and then you see that your **bleep** data stream is 9 GB for nine rows of data, you find yourself wondering what on earth is going on. (1073741823 is 2^30 - 1, so each row is dragging around a roughly gigabyte-sized string field.) And then you walk your way down the workflow for a while, finding the slots where the default 1073741823 size got set and changing them to non-insane sized strings, until your data flow is more like 64 KB and your workflow runs in 3 seconds instead of 30 seconds.
Please set the default size for string fields in the Formula tool to a non-insane value that won't need to be changed in 99.99999% of use cases. Thank you.
We want to control the order of execution of objects in an Alteryx workflow, but right now we have ONLY Block Until Done, which is not the right choice in many cases.
Could we have a container (say, a Sequence Container), put a piece of logic in each container, and control the flow by connecting the containers?
That way we could control the execution order.
It might look something like the mock-up below.
The R tool has AlteryxProgress() and AlteryxMessage() functions for generating notifications in the Results window (https://help.alteryx.com/current/designer/r-tool); however, the Python tool does not. Since I'm writing more Python code than R code, I'd like to have similar functionality available in the Python tool, e.g. an Alteryx.Progress() function and an Alteryx.Message() function.
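A rough sketch of how the proposed functions might be used inside the Python tool, mirroring the R equivalents; Alteryx.Progress() and Alteryx.Message() are the hypothetical additions, while Alteryx.read()/Alteryx.write() are the existing ayx API:

```python
# Hypothetical sketch of the proposed API inside the Python tool.
# Alteryx.read()/Alteryx.write() exist today; Alteryx.Progress() and
# Alteryx.Message() are the proposed additions, mirroring the R tool's
# AlteryxProgress()/AlteryxMessage().
from ayx import Alteryx

df = Alteryx.read("#1")                  # existing: read input anchor #1

for i in range(len(df)):
    ...                                   # some slow per-row work here
    pct = int(100 * (i + 1) / len(df))
    Alteryx.Progress(pct)                 # proposed: update progress bar

Alteryx.Message("Scoring finished")       # proposed: Results-window note
Alteryx.write(df, 1)                      # existing: write output anchor 1
```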
While Alteryx allows for a proxy username and password in the settings, these are not passed properly to an NTLM proxy. Support for NTLM authentication would be incredibly useful for a number of corporations who utilize this firewall setup.
We currently have to either download via Python or cURL through batch commands called by Alteryx. Since Alteryx uses a cURL back-end, this should be a fairly simple addition to the existing download tool by allowing a selection of proxy server, port, and authentication method in addition to the proxy username and password. This could be done either in the tool itself or in User Settings.
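As an illustration of the current workaround, here is a minimal Python requests sketch (hypothetical proxy host, port, and credentials); requests can pass basic credentials in the proxy URL, but an NTLM-only proxy typically requires routing through a local relay such as Cntlm:

```python
# Sketch of the current workaround with Python requests; proxy host,
# port, and credentials are hypothetical.
import requests

# Basic proxy credentials can be embedded in the proxy URL...
proxies = {
    "http":  "http://user:password@proxy.corp.example.com:8080",
    "https": "http://user:password@proxy.corp.example.com:8080",
}

# ...but for an NTLM-only proxy the usual trick is a local relay such
# as Cntlm, which speaks NTLM upstream and plain HTTP locally:
# proxies = {"http": "http://127.0.0.1:3128",
#            "https": "http://127.0.0.1:3128"}

resp = requests.get("https://api.example.com/data", proxies=proxies)
print(resp.status_code)
```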
I've seen this question before and have run into it myself. I'd like to see a new tool that would allow a developer (of a workflow) to choose a path of logic based upon criteria known only during the execution of a module.
IF [Left Input] record count < 10,000 THEN Path 1 (e.g. use a Calgary join)
ELSE Path 2 (e.g. use a standard join)
When we try to call an external web site from the Alteryx Designer Download tool, our company proxy server fails the authentication because Alteryx uses basic login/password authentication. This has happened with multiple applications that need to interact with external partners. We would like to request an enhancement enabling Alteryx to authenticate using Kerberos or NTLM.
I was very happy to see the bulk loader introduced for Snowflake in the last release. This bulk loader is specifically available for Snowflake environments hosted on AWS, but it does not provide functionality for environments using Azure. As Snowflake continues to build momentum, I imagine this will be a common request. Is there something in the pipeline to add this functionality?
For an interim solution, we will be working toward developing some generic scripts/snowsql to mimic that bulk load, but ultimately we'd love to have this as part of the tool.
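For what it's worth, the interim approach we have in mind looks roughly like the following sketch using the Snowflake Python connector; the account, container, table, and SAS token are all hypothetical, and real scripts would also need error handling and file management:

```python
# Rough sketch of the interim bulk load: stage files in Azure Blob
# storage, then COPY them into Snowflake via the Python connector.
# Account, credentials, container, and table names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="loader", password="...",
    warehouse="LOAD_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# External stage pointing at the Azure container holding the files.
cur.execute("""
    CREATE OR REPLACE STAGE azure_stage
      URL = 'azure://myaccount.blob.core.windows.net/staging'
      CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Bulk load everything currently staged in the container.
cur.execute("COPY INTO ANALYTICS.PUBLIC.SALES FROM @azure_stage")
conn.close()
```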
Idea: Prompt the user to find a missing macro instead of the current UX of a question mark icon.
Issue: When a macro referenced in a workflow is missing, there is no way to a) know what the name of the macro was (assuming you were lazy like me and didn't document it with a comment) and b) find the macro so you can get back to business.
When this happens to me now, I have to go to the XML view, search for macros, and cycle through them until I find the one that's missing. Then I have to either copy the macro back into that location or manually edit the workflow XML. Not cool, man.
Solution: When a macro is missing, the image below at the right should be shown. In the properties window, a file browse tool should allow the user to find the macro.
The Alteryx.Flexnetoperations.com license management site needs major work.
On the View Licenses page it shows all licenses going back several years. A basic need is to show only licenses which haven't expired, but that is not an option. You cannot even sort on the expiration column, while you can sort on most other columns.
The simplest need is to see a list of my current active license users - but I don't see a way to do that.
I tried an "Advanced Search" and chose expiration date after 2019-10-29 and none of my licenses which expire in 2020 appear - I get a blank list.
Similarly, on the Administer Machines page you cannot filter out expired licenses, or even filter on the licenses column (which doesn't sort either).
The help link on the page doesn't bring you to help specific to that page, but to the general activation help front page. After several clicks I found this page:
But the help is incomplete (it doesn't list machine types or the difference between Active and Inactive).
Also, there is no export capability; copying and pasting into Excel is a formatting headache, as it brings in check boxes.
Lots of room for improvement here.
P.S. I understand that work is being done on this, but an ETA would be greatly appreciated.
The Download tool allows for encrypted SFTP connections, but I recently discovered (the hard way) that Alteryx's capabilities here are incomplete and the algorithms not fully up to date. Just adding an updated algorithm or two to the four available for message authentication would bring it up to date.
As back story, our firm has onboarded a new SFTP server, and all of a sudden my Alteryx SFTP workflows didn't work when I pointed them at the new server. After going back and forth extensively with the helpful folks at Alteryx, we discovered there's a gap in Alteryx's current capabilities.
Basically, the Alteryx Download tool can use the old algorithm and half of the new one, and half of the new one is like having half a bridge.
Up until 2017, SHA-1 was the most common hash used for cryptographic signing. Since then it has slowly been supplanted by SHA-2.
Alteryx can use SHA-2 for key exchanges, but not for message authentication (the HMAC algorithm). The internet seems to swear up and down that the old SHA-1 algorithm works just fine for message authentication, but I don't have the luxury of caring about that. All I know is that as of March 2019 the SFTP server I have to connect with has deprecated Alteryx's SHA-1 algorithm as being too out of date and only allows the new SHA-2 message authentication.
Alteryx CAN use the up-to-date SHA-2 for key exchange (GOOD, halfway there!) but can only use old ways of doing message authentication that do NOT include SHA-2 (NOT GOOD!). Please add the updated SHA-2 algorithms (hmac-sha2-512, hmac-sha2-256) to the HMAC mix too!
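To make the gap concrete, here is a minimal sketch with paramiko (a Python SSH/SFTP library) of a session that, like our new server, only accepts SHA-2 HMACs; the host and credentials are hypothetical:

```python
# Minimal paramiko sketch: restrict the SFTP session to SHA-2 HMACs,
# which is effectively what the new server enforces. A client limited
# to hmac-sha1 cannot negotiate this. Host/credentials are hypothetical.
import paramiko

transport = paramiko.Transport(("sftp.example.com", 22))

# Offer only the SHA-2 message-authentication algorithms (must be set
# before connecting).
opts = transport.get_security_options()
opts.macs = ("hmac-sha2-256", "hmac-sha2-512")

transport.connect(username="user", password="...")
sftp = paramiko.SFTPClient.from_transport(transport)
print(sftp.listdir("."))
transport.close()
```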
At the moment, if part of your Python code takes more than 30 seconds to run, Jupyter times out and Alteryx cancels the workflow. This makes the Python tool unusable for anything intensive; the timeout should be removed by default or be configurable per workflow.
I've made this idea as none of the solutions in these threads feel satisfactory:
Figuring out who is using custom macros, and governing the macroverse in general, is not an easy task currently.
I have started shipping Alteryx logs to Splunk to see what can be learned. One thing I would love to be able to do is understand which workflows are using a particular macro, or any custom macros for that matter. As it stands right now, I do not believe there is a simple way to do this by parsing the log entries. If, instead of just saying 'Tool Id 420', it said 'Tool Id 420 [Macro Name]', that would be very helpful. And it would be even *better* if the logging could flag out-of-the-box macros vs custom macros. You could have a system-level setting to include/exclude macro names.
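In the meantime, the closest workaround I've found is to scan the workflow XML rather than the logs; a minimal sketch, assuming the workflows live under a known directory and that macro tools carry the macro path on their node's EngineSettings element (attribute name inferred from workflows I've inspected, not documented API):

```python
# Workaround sketch: scan workflow XML for macro references, since the
# engine logs only report the numeric Tool Id. Directory path and the
# EngineSettings "Macro" attribute are assumptions.
import glob
import xml.etree.ElementTree as ET

for path in glob.glob(r"C:\Workflows\**\*.yxmd", recursive=True):
    tree = ET.parse(path)
    for node in tree.iter("Node"):
        settings = node.find("EngineSettings")
        if settings is not None and settings.get("Macro"):
            print(f"{path}: Tool Id {node.get('ToolID')} -> "
                  f"{settings.get('Macro')}")
```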
Thanks for listening.