
Alteryx Designer Desktop Ideas

Share your Designer Desktop product ideas - we're listening!
Submitting an Idea?

Be sure to review our Idea Submission Guidelines for more information!


Featured Ideas

The Download tool is great and easy to use. However, if a request hits a connection problem, the whole workflow errors out. An option to not error out, to skip failed records, and to retry failed records would be A LIFE CHANGER. Currently I work around this with a Python tool that makes multi-threaded requests, which has proven time-consuming.
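As a sketch of the requested behavior (plain Python, stdlib only; the injectable `fetch` callable and the tuple shapes are illustrative assumptions, not the Download tool's actual API):

```python
import time
import urllib.request

def fetch(url, timeout=10):
    """Default fetcher: return (status_code, body); raises on connection problems."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.status, resp.read()

def download_all(urls, fetch=fetch, retries=3, backoff=1.0):
    """Fetch each URL; retry failures with exponential backoff and
    skip (rather than abort on) records that still fail."""
    results, failed = [], []
    for url in urls:
        for attempt in range(1, retries + 1):
            try:
                results.append((url, *fetch(url)))
                break
            except Exception as err:
                if attempt == retries:
                    failed.append((url, str(err)))   # skip instead of erroring out
                else:
                    time.sleep(backoff * 2 ** (attempt - 1))
    return results, failed
```

The key point is the two output streams: successes and skipped failures, mirroring how a non-fatal Download tool could expose an error output anchor.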

I'm converting some old macros that were built in 8.6 for use in 10.6 and found something that could potentially be changed about the Alteryx .xlsx drivers. Specifically this refers to the field types passed from a Dynamic Input tool when the data fields are blank when using an .xlsx file as the data source template. 

 

When the new .xlsx file has fields populated, it works fine. If data is not populated in the new file, the field types are converted to Double[8]. This doesn't cause a problem for the Dynamic Input itself, but it can cause problems for downstream tools. In the second case below, the field names are retained but a downstream Join tool errors when trying to join the now-Double field to a string (rather than returning no results from the join output, as desired). This also happens when a single field is empty: only that field is converted to Double[8]. When the legacy .xlsx drivers are used, the field types from the data source template are retained.

 

[Screenshots: file source template vs. file returned upon running - XLSX Drivers - Populated Values.png, XLSX Drivers - Empty Values.png]

 

There are other solutions for this scenario, such as using a Select tool with a loaded .yxft file of the correct field types, or selecting the legacy .xlsx drivers from the File Format dropdown when configuring the Dynamic Input. However, I thought this is something that could be improved in the Alteryx .xlsx drivers.

Hi all, 

 

I'm trying my best to think of the most secure way to do this, and struggling within Alteryx with the Download tool in its current form.

 

I am using an internal API Manager to retrieve data, but this particular API requires additional header values for a username and password beyond my standard OAuth2 flow to the API Manager. I can run this locally, but to save it to our network as a workflow, or ideally run it from Gallery, I should not be leaving credentials in plain text anywhere, where anyone looking at the workflow or the underlying XML could grab them. Surely this is quite an easy one to mask. Or could it be made more dynamic, retrieving credentials from a key vault, e.g. Azure Key Vault?

 

Can we add masking to the Download Tool Header Values?
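A common interim pattern is to resolve credentials at run time from environment variables (or a vault SDK such as Azure's `azure-keyvault-secrets`), so the workflow XML only ever contains variable names. A minimal sketch; the variable names and header keys (`API_USER`, `X-Username`, etc.) are hypothetical placeholders:

```python
import os

def build_auth_headers(user_var="API_USER", pw_var="API_PASSWORD"):
    """Resolve credentials at run time so the workflow XML only ever
    contains the environment-variable *names*, never the values."""
    user = os.environ.get(user_var)
    password = os.environ.get(pw_var)
    if user is None or password is None:
        raise RuntimeError(f"Set {user_var} and {pw_var} before running")
    return {"X-Username": user, "X-Password": password}
```

The headers built here would then be fed into the request instead of hard-coding them in the tool's configuration.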

 

Thanks,

Ciaran

Hello gurus - 

 

I think it would be an important safety valve if, at application start-up time, duplicate macros found on the search path (i.e., https://help.alteryx.com/current/server/install-custom-tools) generated a warning to the user. I know that if you have the same macro in the same folder you can get a warning at load time, but that warning doesn't seem to propagate across the different tiers of the macro loading path. As a result, developers can find themselves with difficult-to-diagnose behavior where the tool seems confused about which macro metadata to use. I also imagine a developer could end up not using the version of a macro they were expecting, unless they go to the Workflow tab for every custom macro on their canvas.

 

Thank you for attending my TED talk on the upsides of providing warnings at startup of duplicate macros in different folder locations.  
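Until such a warning exists, the check can be approximated outside Designer by scanning the macro folders yourself. A sketch, assuming macros are `.yxmc` files and that you know your configured search paths:

```python
import os
from collections import defaultdict

def find_duplicate_macros(search_paths):
    """Walk every folder on the macro search path and group .yxmc files
    by file name; any name seen in more than one place is ambiguous."""
    seen = defaultdict(list)
    for root_dir in search_paths:
        for dirpath, _dirnames, filenames in os.walk(root_dir):
            for name in filenames:
                if name.lower().endswith(".yxmc"):
                    seen[name.lower()].append(os.path.join(dirpath, name))
    return {name: paths for name, paths in seen.items() if len(paths) > 1}
```

Running this over your macro repositories before a release would surface exactly the ambiguity the startup warning is asking for.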

 

 

 

This suggestion is particularly relevant for macros and custom tools created with the Python SDK, but I think it can apply to other tools as well.

 

When searching for tools in Alteryx, I can easily find tools I want fairly quickly.  However, I often don't know which tool category it is in, which can sometimes slow me down (it is sometimes faster/easier for me to go to the tool category, rather than search for the tool I want).

As a quick example, I just installed the Word Cloud tool that @NeilR shared here: https://community.alteryx.com/t5/Dev-Space/Python-Tool-Challenge-BUILD-a-Python-tool-and-win-a-prize... .  I was able to find the tool really easily using search once it was installed, but in order to find the tool category, I either had to unzip the .yxi file and find out where it was, or click around through the tool categories until I found it (it was in the Reporting tools, which makes a lot of sense).

 

Could we add something either to the search window or to the description/config of tools which calls out where a given tool is in the Tool Palette?

It would be great to make R tool in Alteryx closer in interface to, let's say, RStudio. By this I mean - can we please have code auto completions, color highlighting of formulas/dataset names, and other useful interface details that make coding easier?

Hello AlteryxDevs - 

 

Back when I did more coding, some ORMs could return the natively generated primary key for newly created rows; this could be really useful when you wanted to create a parent/child relationship, or needed to pass the value on to another process.

 

As it stands now, the mechanism to achieve this in Alteryx is kind of clunky; all I have been able to figure out is the following:

 

1) Block until done 1.

1a) Create parent record. Hopefully it has an identifying characteristic that can be attached to. 

2) Block until done 2.  

2a) Use a dynamic select to go get the parent record and get the id generated by the database. 

3) Block until done 3. 

3a) Append your primary key found in 2a.  Create your children records. 

 

I mean it works.  But it is clunky, not graceful, and does not give you any control over the transaction, though that is kind of a more complicated feature request. 
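For contrast, here is what the graceful version looks like in plain database code: the key comes straight back from the insert, with no "block until done and re-query" dance. A minimal sketch using SQLite's `lastrowid`; other databases expose the same idea via `RETURNING`, `OUTPUT`, or `SCOPE_IDENTITY()`:

```python
import sqlite3

# In-memory example database with a parent/child relationship.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE child (id INTEGER PRIMARY KEY, parent_id INTEGER, name TEXT)")

cur = conn.execute("INSERT INTO parent (name) VALUES (?)", ("order-1",))
parent_id = cur.lastrowid  # the generated key comes straight back, no re-query
conn.execute("INSERT INTO child (parent_id, name) VALUES (?, ?)", (parent_id, "line-1"))
```

Exposing the equivalent of `lastrowid` on the Output Data tool is essentially what this idea is asking for.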

 

Thanks!

 

Give me the ability to select a date range that limits the available selections for a date tool.  

The limitations should include:

Future dates only

Past dates only

Dates between [startdate] and [enddate]

Future/Previous # years

Future/Previous # months

Future/Previous # weeks

Future/Previous # days
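The limitations above boil down to a single validation rule per mode. A sketch of that rule set (mode names and the `today` override are my own, for illustration; weeks, months, and years follow the same pattern as days):

```python
from datetime import date, timedelta

def in_range(d, mode, start=None, end=None, n=0, today=None):
    """Return True if date d satisfies the selected limitation."""
    today = today or date.today()
    if mode == "future":
        return d > today
    if mode == "past":
        return d < today
    if mode == "between":
        return start <= d <= end
    if mode == "next_days":
        return today < d <= today + timedelta(days=n)
    if mode == "past_days":
        return today - timedelta(days=n) <= d < today
    raise ValueError(f"unknown mode: {mode}")
```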

I would like the ability to run/call a module from within another module. I've thought of two solutions:

Simple - Add an option like the one that currently exists in the Analytic App properties, "On Success - Run Another Analytic App", except this option would Run Another Module.

Complex - Create a new tool that would have a single input that would accept a list of filepaths to Alteryx modules. The modules would be run sequentially (module 2 run once module 1 was finished).
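The complex option can already be approximated outside Designer with the command-line engine runner (Alteryx Server ships `AlteryxEngineCmd.exe`; the exact path and availability depend on your install, so treat that name as an assumption). A sketch of the sequential runner:

```python
import subprocess

def run_sequentially(runner_exe, workflow_paths):
    """Run each workflow/module with the given engine executable, one after
    another; stop at the first failure and report how far we got."""
    for i, path in enumerate(workflow_paths):
        result = subprocess.run([runner_exe, path], capture_output=True, text=True)
        if result.returncode != 0:
            return i, result.stderr  # index of the module that failed
    return len(workflow_paths), ""
```

Module 2 only starts once module 1's process has exited, which is exactly the sequencing the new tool would provide.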

Cheers,

John Hollingsworth

Many times, we come across scenarios where the Formula tool fails due to a change in the data type of the input fields.
For instance, a numerical calculation would fail, or would not give the correct result, if the data type of a field was changed for some reason (from double to string, for example).
In such cases, we might have to change the data type in a Select tool, or add ToNumber() to the field's expression in the Formula tool, to make it work.

My proposal is a Formula tool dynamic enough to identify the purpose of an expression and either apply ToNumber() during execution, or convert the data type of the fields to match the requirements of the expressions in the formula.
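The proposed behavior amounts to coercing operands before the arithmetic runs. A sketch of that coercion (function names are my own; this is not Alteryx's formula engine):

```python
def to_number(value):
    """A forgiving ToNumber(): pass numbers through, parse numeric-looking
    text to float, and return None for anything non-numeric."""
    if isinstance(value, (int, float)):
        return value
    try:
        return float(str(value).strip())
    except (TypeError, ValueError):
        return None

def safe_multiply(a, b):
    """A 'dynamic' formula: coerce both operands before the arithmetic."""
    a, b = to_number(a), to_number(b)
    if a is None or b is None:
        return None
    return a * b
```

With this in place, a field silently changing from double to string no longer breaks the calculation; it just flows through the coercion.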

 

 

1) Add cURL Options support from within the Download Tool

https://community.alteryx.com/t5/Alteryx-Knowledge-Base/File-Transfer-Protocol-FTP-Download/tac-p/41...

 

The Download tool allows adding a Header and Payload (data), which are key components of the cURL command structure. However, there is no way to include any of the cURL options in the Download tool. Most of the 'solutions' found so far abandon the Download tool entirely and run the cURL executable through the Run Command tool, but that introduces many other issues, such as sending passwords through the Run Command tool.

 

With the variation and volume of connection requests that are being funneled through the Download tool, users are really looking for it to have the flexibility to do what they need it to do with the Options that they need to send to cURL.
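To make the gap concrete: anyone using the Run Command workaround today has to assemble the cURL invocation themselves. The flags below (`--retry`, `--fail`, `-H`) are standard cURL options; the helper itself is just an illustration of what "Header + Payload + Options" would look like as one configuration:

```python
def build_curl_args(url, headers=None, options=None):
    """Assemble a cURL argument list: headers plus arbitrary cURL options
    that the Download tool cannot pass today."""
    args = ["curl", "--silent", "--show-error"]
    for key, value in (headers or {}).items():
        args += ["-H", f"{key}: {value}"]
    for opt, value in (options or {}).items():
        args.append(opt)
        if value is not None:       # flag-style options carry no value
            args.append(str(value))
    args.append(url)
    return args
```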

 

2) Update cURL version used by Alteryx Download tool to a recent version that allows passing host keys

https://community.alteryx.com/t5/Alteryx-Designer-Discussions/Download-Tool-with-SFTP-and-multiple-K...

 

cURL added the option to pass host keys starting in version 7.17.1 (available 10/29/2007). The option is --hostpubmd5 <md5>:

 

https://curl.haxx.se/docs/manpage.html

 

And it appears that the cURL shipped with Alteryx is 7.15.1 (from 12/7/2005):

 

PS C:\Program Files\Alteryx\bin\RuntimeData\Analytic_Apps> .\curl -V
curl 7.15.1 (i586-pc-mingw32msvc) libcurl/7.15.1 zlib/1.2.2
Protocols: tftp ftp gopher telnet dict ldap http file
Features: Largefile NTLM SSPI libz

 

Thanks for considering!

 

Cameron

Currently pip is the package manager within Designer. Unfortunately, this doesn't fit our requirements as data scientists. We prefer conda for the following reasons:

  1. conda also manages non-Python library dependencies. This way conda can manage R packages as well, which comes in quite handy (even though not all packages from the CRAN repository are available).
  2. conda provides a very simple way of creating conda envs (similar to virtualenv, but with conda one can also install and manage pip packages, whereas virtualenv cannot install conda packages!) to isolate required packages (with specific versions) used in a workflow (e.g. for a Python tool in Designer).

 

So I would like to have conda instead of, or in addition to, pip, and to create conda envs where I install the packages I need for a specific task within my workflow. Moreover, if you are considering an R Jupyter notebook capability (like the Python tool), it could be beneficial to switch from pip to conda to manage packages in both worlds.
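For reference, the workflow being asked for looks roughly like this on the command line (env and package names are placeholders; `conda run` needs conda 4.6+):

```shell
# Create an isolated environment for one workflow's Python tool
conda create --name my_workflow_env python=3.8 pandas

# conda can also manage R packages from the same tool
conda install --name my_workflow_env -c conda-forge r-base r-dplyr

# pip packages can still be installed inside a conda env
conda run --name my_workflow_env pip install some-pip-only-package
```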

Problem:

The Dynamic Input tool depends on a template file to validate the input data before processing it. A mismatch in the schema throws an error, causing delays in troubleshooting.

 

Solution:

It would be great if this tool were enhanced so users could input or edit text directly, without any template file (similar to the text editor in the Macro Input tool).

 

Give me the ability to show/hide, enable/disable user interface tools via a control parameter.

Figuring out who is using custom macros and/or governing the macroverse is not an easy task currently.  

 

I have started shipping Alteryx logs to Splunk to see what could be learned. One thing I would love to be able to do is understand which workflows use a particular macro, or any custom macros for that matter. As it stands, I do not believe there is a simple way to do this by parsing the log entries. If, instead of just saying 'Tool Id 420', a line said 'Tool Id 420 [Macro Name]', that would be very helpful. It would be even *better* if the logging could flag out-of-the-box macros vs. custom macros. You could have a system-level setting to include/exclude macro names.
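To show how little the log format would need to change: with the bracketed macro name present, the Splunk-side (or Python-side) parse becomes a one-line regex. The enriched line format is of course hypothetical, this is the idea being requested:

```python
import re

# Matches "Tool Id 420" with an optional "[Macro Name]" suffix.
LINE = re.compile(r"Tool Id (\d+)(?: \[(?P<macro>[^\]]+)\])?")

def macros_in_log(lines):
    """Collect macro names mentioned in (hypothetically enriched) log lines."""
    found = set()
    for line in lines:
        m = LINE.search(line)
        if m and m.group("macro"):
            found.add(m.group("macro"))
    return found
```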

 

Thanks for listening.

 

brian 

I would love the ability to double-click an unnamed tab and rename it, for 'temp' workflows.

 

eg - "New Workflow*" to "working on macro update"... 

 

Reason:

  1. When Designer crashes, it is a huge pain to go through autosaves named "New Workflow*" to find the one you need.
  2. I work on a lot of projects at once, pull bits of code out to work on a small subset, then get distracted and have to move to another project. With multiple windows and tabs open, it gets confusing with 10 'New Workflow' tabs open.
  3. It allows better organization of open tabs - you can drag tabs into groups and into order, so you know where to start from last time.

 

 

 

 

Hi!

 

Can you please add a tool that stops the flow? And I don't mean the "Cancel running workflow on error" option.

Today you can stop the flow on error with the Message tool, but not every condition you want to stop for is an error.

 

E.g., I can use the Block Until Done tool to check whether there are files in a specific folder, and if not, I want to stop the flow from reading any more inputs.

 

Today I have to build workarounds; an easy stop-if tool would be much appreciated!

This could be an option on the message tool (a check box, like the Transient option).
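Inside a Python tool, the closest workaround today is to raise deliberately when the stop condition holds, since an uncaught exception halts the run. A sketch (the exception and helper names are my own):

```python
class StopWorkflow(Exception):
    """Raised to halt the run deliberately, not because of a data error."""

def stop_if(condition, message):
    """A tiny 'stop-if tool': abort the whole run when condition holds."""
    if condition:
        raise StopWorkflow(message)
```

For example, `stop_if(not input_files, "no input files found")` after the folder check would end the run before any downstream inputs are read; the native tool being requested would do this without surfacing as an error.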

 

Cheers,

EJ

All the Interface tools should be able to be populated, and have selections assigned, from dynamic values.

Right now we can populate a set of values for a Tree/List/Dropdown.

But we cannot select some of those values by default while the app loads.

Other controls, like Text Box / Numeric Up-Down / Check Box / Radio Button, should also be controllable by values from a database.

For example, if I have a set of three radio buttons, one of them should be selected based on my database values while the workflow loads.

 

 

Hello,

please remove the hard limit of 5 outputs from the Python tool, if possible.

It would be very helpful to be able to forward any number of tables, in any format, with different columns in each.
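Until the limit is lifted, one workaround is to multiplex many tables through a single output by tagging each row with its table id, then splitting the stream back apart downstream. A sketch using plain dict rows (the `__table__` tag name is arbitrary); because each row carries its own keys, the tables can have different columns:

```python
def bundle(tables):
    """Tag every row with its table id so many tables share one output."""
    rows = []
    for name, table in tables.items():
        for row in table:
            rows.append({"__table__": name, **row})
    return rows

def unbundle(rows):
    """Split a tagged stream back into per-table lists downstream."""
    tables = {}
    for row in rows:
        name = row["__table__"]
        tables.setdefault(name, []).append(
            {k: v for k, v in row.items() if k != "__table__"})
    return tables
```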

 

Best regards,

I think it would be extremely helpful to have an in-DB Detour, so you could filter on a user's selection without having to pull the data out of the database and then put it back in for more processing. This would be useful when you have a large dataset and don't want to pull the whole thing out of the DB because it would take a long time - for example, filtering a large dataset by a specific state or region chosen by the user. The Detour in the Developer tools actually seems like it would do the job; it just needs to connect to the In-DB tools.
