
Alteryx Designer Desktop Ideas

Share your Designer Desktop product ideas - we're listening!
Submitting an Idea?

Be sure to review our Idea Submission Guidelines for more information!


Featured Ideas

When passing a data connection to the Dynamic Input Tool as a string and using the 'Change Entire File Path' option, the password parameter of the connection string is not encrypted and is displayed in the metadata source information.

 

We have since changed our macro that was using this method, but wanted to raise awareness of this situation. I suggest that the same procedure used to encrypt the password in all other connection methods be called if the workflow is configured to pass a password through the input as a string.
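For illustration only, a minimal sketch of the kind of masking being suggested, done in plain Python rather than Alteryx's actual encryption routine (the connection string, field names, and masking approach are assumptions, not how Designer stores credentials):

```python
import re

def mask_connection_string(conn_str: str) -> str:
    """Replace the PWD= value in an ODBC-style connection string with asterisks
    before the string is surfaced in metadata or logs (illustrative only)."""
    return re.sub(r"(PWD=)[^;]*", r"\1********", conn_str, flags=re.IGNORECASE)

# Hypothetical connection string passed to the Dynamic Input tool as text
conn = "odbc:DSN=Sales;UID=svc_user;PWD=SuperSecret!"
print(mask_connection_string(conn))  # odbc:DSN=Sales;UID=svc_user;PWD=********
```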

From a user standpoint, there's no good reason for these locations not to be linked.

Would love to see the Workflow - Configuration > Runtime > Record Limit for All Inputs option extended to Dynamic Input tools.

[Screenshot: Record Limit for Dynamic Input]

Hi All,

 

This is a fairly straightforward request. I'd like to be able to pass interface tool values through to workflow events the same way I would pass them through to a tool in the workflow (%Question.<tool name>%). One use case for this is that we are calling a workflow and passing in an ID, and if this workflow fails, I'd like to trigger an event that calls back to the application and says that this specific workflow for this ID failed.

 

The temporary solution is to have the workflow write to a temp file and have the event reference that temp file, but this is clunky and risky if there are parallel runs occurring. 
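For context, a hedged sketch of that temp-file workaround (the file location, naming scheme, and field name are assumptions): giving each run a uniquely named file avoids parallel runs overwriting each other, but it also shows why the workaround is clunky, since a static event configuration has no way to know the unique name.

```python
import os
import tempfile
import uuid

def write_event_context(workflow_id: str) -> str:
    """Write the ID for this run to a uniquely named temp file so that
    concurrent runs do not clobber each other's context (illustrative only)."""
    path = os.path.join(tempfile.gettempdir(), f"run_context_{uuid.uuid4().hex}.txt")
    with open(path, "w", encoding="utf-8") as f:
        f.write(workflow_id)
    # The event's command line would have to be pointed at this path, which a
    # static event configuration cannot do -- hence the request for
    # %Question.<tool name>% support in events.
    return path

print(write_event_context("ORDER-12345"))
```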

 

Best,

devKev

A client contacted us recently asking if there was a way to view the log file of a scheduled process during runtime.  The use case is so that if there is a runaway process, they will be able to identify at what point the process gets hung up.  

Currently, when a scheduled job is running (and logging is enabled), the log file is locked for use.  

Thanks!

Provide the ability to leverage data "Connectors" as an option in the "Dynamic Input" tool.

With SSIS, you can use precedence constraints so that no downstream flow runs until one or more upstream flows complete.  A simple connector should allow you to do this.  Right now, I have my workflows in containers and have to disable/enable the different containers, which can be time consuming.  Below is a better definition:

 

Precedence constraints link executables, containers, and tasks in packages in a control flow, and specify conditions that determine whether executables run. An executable can be a For Loop, Foreach Loop, or Sequence container; a task; or an event handler. Event handlers also use precedence constraints to link their executables into a control flow.
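To make the requested behaviour concrete, a minimal sketch of a precedence constraint expressed outside Designer (the engine path and workflow file names are assumptions for illustration): run the downstream workflow only if the upstream one finishes successfully.

```python
import subprocess

# Hypothetical paths -- adjust to your installation and workflows
ENGINE = r"C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe"
UPSTREAM = r"C:\Workflows\load_parents.yxmd"
DOWNSTREAM = r"C:\Workflows\load_children.yxmd"

def run(workflow: str) -> int:
    """Run a workflow with the command-line engine and return its exit code."""
    return subprocess.run([ENGINE, workflow]).returncode

# Precedence constraint: only start the downstream flow on upstream success
if run(UPSTREAM) == 0:
    run(DOWNSTREAM)
else:
    print("Upstream flow failed; downstream flow was not started.")
```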

If you have a complex SQL query with a number of dynamic substitutions (e.g. Update WHERE Clause, Replace a Specific String), it would be nice to be able to optionally output the SQL that is being executed. This would be particularly useful for debugging.
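A small illustration of the kind of output that would help (the query and the substitution are hypothetical): after the dynamic replacement is applied, emit the SQL that will actually run so it can be inspected.

```python
BASE_QUERY = "SELECT order_id, amount FROM orders WHERE region = 'PLACEHOLDER'"

def build_query(region: str, debug: bool = False) -> str:
    """Apply a 'Replace a Specific String' style substitution and optionally
    print the SQL that will actually be executed (illustrative only)."""
    sql = BASE_QUERY.replace("'PLACEHOLDER'", f"'{region}'")
    if debug:
        print(f"Executing SQL: {sql}")
    return sql

build_query("EMEA", debug=True)
```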

I've been using Events a fair bit recently to run batches through cmd.exe and to call Alteryx modules.

Unfortunately, the default is that the events are named by when the action occurs and what is entered in the Command line.

When you've got multiple events, this can become a problem -- see below:

 

[Screenshot: Events tab with all events called the same thing]

 

It would be great if there was the ability to assign custom names to each event.

It looks like I should be able to do this by directly editing the YXMD -- there's a <Description> tag for each event -- but it doesn't seem to work.
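A hedged sketch of what that XML edit might look like, assuming events appear as <Event> elements with a <Description> child as described above (the actual .yxmd schema may differ, which could be why the edit has no visible effect in Designer):

```python
import xml.etree.ElementTree as ET

# Hypothetical workflow file
path = r"C:\Workflows\my_workflow.yxmd"

tree = ET.parse(path)
root = tree.getroot()

# Give each event a readable name instead of the auto-generated one
for i, event in enumerate(root.iter("Event"), start=1):
    desc = event.find("Description")
    if desc is not None:
        desc.text = f"Post-run batch #{i}"

tree.write(path, encoding="utf-8", xml_declaration=True)
```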

Implement a way to loop within a workflow without resorting to macros.  Although macros do generally solve the issue, I find them confusing and non-intuitive.

 

I would suggest looping through the use of two new tools: a StartLoop tool and an EndLoop tool.

 

The start loop would have two (or more) input anchors.  One anchor would be for the initial input and the other(s) for additional iterative inputs.  The start loop would hold all iterative inputs until the original inputs have passed the gate, and would then resubmit them in the order they were returned to the start loop.

 

The end loop would have three output anchors.  One anchor would be for data exiting the loop upon reaching the exit condition.  Another anchor would be for the iterative (return) data; note that transformations can be performed on the data BEFORE it re-enters the loop.  The third would be an "overloop" exit anchor, for any data that failed to meet the exit condition within the (configurable) maximum iteration expression.  The data from the overloop anchor could then be dealt with as required by the business rules for the unsatisfied data after being output from the EndLoop tool.

 

The primary configurations would be on the EndLoop tool, where you would indicate the exit condition and the maximum iteration expression.  The tool would also create an iteration counter field.  As part of the configuration you could have a check box to "retain iteration count field on exit".   If checked, the field would be maintained.   If not checked, the field would be dropped for the data as it exits the loop.

 

This would make looping a bit more intuitive, and it would be graphically self-documenting as well.  Worth a mention at least.
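To make the proposed semantics concrete, a minimal sketch in plain Python of how StartLoop/EndLoop might behave (the function and field names are invented for illustration, not part of any existing tool):

```python
def run_loop(records, transform, exit_condition, max_iterations=100):
    """Iterate records through `transform` until they meet `exit_condition`.
    Returns (exited, overloop): rows that satisfied the condition, and rows
    that never did within `max_iterations` (the proposed overloop anchor)."""
    exited, overloop = [], []
    pending = [dict(row, iteration=0) for row in records]
    while pending:
        next_round = []
        for row in pending:
            if exit_condition(row):
                exited.append(row)                 # exit anchor
            elif row["iteration"] >= max_iterations:
                overloop.append(row)               # overloop anchor
            else:
                updated = transform(dict(row))     # transform before re-entry
                updated["iteration"] = row["iteration"] + 1
                next_round.append(updated)
        pending = next_round
    return exited, overloop

# Example: keep doubling 'value' until it reaches 100, give up after 5 passes
rows = [{"value": 3}, {"value": 40}, {"value": 1}]
done, leftover = run_loop(
    rows,
    transform=lambda r: {**r, "value": r["value"] * 2},
    exit_condition=lambda r: r["value"] >= 100,
    max_iterations=5,
)
print(done, leftover)
```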

I just had to put an Idea in the Developer category when really it should have gone into a User Interface category.

Using the Download Tool, when doing a PUT operation, the tool adds a header "Transfer-Encoding: chunked".  The tool adds this silently in the background.

 

This caused me a huge headache, as the PUT was a file transfer to Azure Blob Storage, which was not chunked.  At the time of writing, Azure Blob Storage does not support chunked transfer.  Effectively, my file transfer was erroring, but it appeared that I had configured the request correctly.  I only found the problem by downloading Fiddler and sniffing the HTTPS traffic.

 

Azure can use SharedKey authorization.  This is similar to OAuth1, in that the client (Alteryx) has to sign the message and the headers sent, so that the server can compute the same signature on receipt and confirm that the message was not tampered with.  Alteryx is effectively "tampering with the message" (benignly) by adding headers.  To my mind, the Download tool should not add any headers unless it is clear that it is doing so.

 

If the tool adds any headers automatically, I would suggest that they are declared somewhere.  They could either be included in the Headers tab, so that they could be overwritten, or they could have an "auto-headers" tab to themselves.  I think showing them in the Headers tab would be preferable from the user's viewpoint, as the user could immediately see them alongside other headers and override one by blanking it if they need to.
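For comparison, a hedged sketch of a non-chunked PUT to Blob Storage using the requests library (the SAS URL and file name are placeholders; authorizing with a SAS token sidesteps SharedKey signing, which would otherwise have to cover every header actually sent):

```python
import requests

# Placeholder SAS URL -- substitute a real container/blob path and SAS token
url = "https://myaccount.blob.core.windows.net/container/report.csv?sv=...&sig=..."

with open("report.csv", "rb") as f:
    payload = f.read()  # a body of known length makes requests send Content-Length,
                        # not Transfer-Encoding: chunked

resp = requests.put(
    url,
    data=payload,
    headers={"x-ms-blob-type": "BlockBlob"},  # required by the Blob service
)
resp.raise_for_status()
```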

Hi Alteryx, can you please create tools for polls, fillable forms, or approval processes? I know we have some analytic app tools, but could we have something like Google Forms, where we can easily create forms, collect the responses as data outputs, and get email notifications for those forms and approvals, etc.?

It would be awesome if I could re-display the user's selections to them before I continue with the remainder of the workflow in an analytic app. That way, I could collect all of the UI inputs, validate the values provided, and then re-display the selections/options/text to the user so they can confirm that they are correct and wish to continue, or stop the processing and make changes via the already-open UI without having to re-enter everything from scratch.

 

Then, when someone selects something that's potentially harmful or very time consuming, I can confirm their selections and alert them to potential issues.

Passing Access and Secret Keys to connect to AWS S3 poses a security risk. It would be great if the Amazon Redshift Bulk Connection tool were enhanced to include an authentication option that uses a native IAM role or group instead of keys.
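As a short illustration of the keyless pattern being requested, this is how it already works with boto3 (the bucket name is hypothetical): when no access/secret keys are supplied, the SDK falls back to the environment, which on EC2 or in a container means the attached IAM role or instance profile.

```python
import boto3

# No access key / secret key passed in -- boto3 resolves credentials from the
# environment, e.g. the IAM role attached to the instance or container.
s3 = boto3.client("s3")

response = s3.list_objects_v2(Bucket="my-redshift-staging-bucket")  # hypothetical bucket
for obj in response.get("Contents", []):
    print(obj["Key"])
```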

Add a button to the zoom tools toolbar that zooms the workspace to all selected tools.

Add a button to the zoom tools toolbar that zooms the workspace to all tools.

We use the test tool extensively in our App and Macro authoring. It would be very useful to have the ability to label the tested values so that when the Test tool writes to the output window it's more descriptive.
Example:
Currently this is the report
Error: Market (1): Tool #235: The test "The count of Columns does not match the count of Labels." failed: TargetNumRecords(#2)==102, NumRecords(#1)==106
 
 
A more useful report would be
Error: Market (1): Tool #235: The test "The count of Columns does not match the count of Labels." failed: Columns (#2)==102, Labels (#1)==106

The Alteryx icon on the Windows taskbar should change, or change color, while a module is running and again when it has finished running. I got this idea from Teradata SQL Assistant: when the query is finished, the little icon (which is a pair of sunglasses) changes from red (running) to green (complete).

Hello gurus - 

 

Pretty much every coding framework supports this.  If we really want Alteryx to embrace no-code, we've got to have some ability to control commits/rollbacks across transactions.  As it stands currently, it is pretty easy to write out parent records, fail to write out the children, and wind up with a database state that makes the end users very sad.
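A minimal sketch of the commit/rollback behaviour being asked for, using sqlite3 as a stand-in database (the table and column names are invented for illustration): the parent and child writes either both commit or both roll back.

```python
import sqlite3

conn = sqlite3.connect("orders.db")
conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, customer TEXT)")
conn.execute("CREATE TABLE IF NOT EXISTS order_lines (order_id INTEGER, sku TEXT, qty INTEGER)")
try:
    with conn:  # one transaction: commit on success, roll back on any error
        cur = conn.execute("INSERT INTO orders (customer) VALUES (?)", ("ACME",))
        conn.execute(
            "INSERT INTO order_lines (order_id, sku, qty) VALUES (?, ?, ?)",
            (cur.lastrowid, "WIDGET-1", 5),
        )
except sqlite3.Error as exc:
    # The parent insert is rolled back too, so no half-written state remains
    print(f"Write failed, nothing committed: {exc}")
finally:
    conn.close()
```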

 

Thanks!

 

brian

When reading and writing large data frames to/from a Python script in Alteryx, it seems that there are limitations to the SQLite component of the tool. Given that this selection is recommended only when the user is having issues in the Python tool, why is the option selected by default? A colleague and I spent a couple of hours trying to work through an issue with importing a data frame larger than 1000x1000; once we found this option (SQLite override) and unchecked it, the data was written back to Alteryx without any problems.

 

Hint provided by the tool, "This changes the intermediate data format between Alteryx and Jupyter from yxdb to SQLite. Use only if running into issues. See help for more details."

 

[Screenshot: SQLite override is the default selection]

Error message provided by the tool

[Screenshot: error message]

After unchecking the option the workflow ran without any errors.

 

Recommendation: the Python tool should default to the SQLite override being unchecked.
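For anyone hitting the same wall, a minimal sketch of the transfer in question inside the Python tool, assuming the standard ayx helpers (the connection numbers and frame size are illustrative); with the SQLite override unchecked, the yxdb path handled a frame wider than 1000 columns without the error above.

```python
from ayx import Alteryx
import numpy as np
import pandas as pd

# Read the incoming connection as a pandas DataFrame
df = Alteryx.read("#1")

# Example of a wide intermediate result (larger than 1000 x 1000)
wide = pd.DataFrame(np.random.rand(1200, 1200),
                    columns=[f"col_{i}" for i in range(1200)])

# Write both frames back out to the tool's output anchors
Alteryx.write(df, 1)
Alteryx.write(wide, 2)
```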

 
