The Product Idea boards have gotten an update to better integrate them with our Product team's idea cycle! However, this update does have a few unique behaviors; if you have any questions about them, check out our FAQ.

Alteryx Designer Desktop Ideas

Share your Designer Desktop product ideas - we're listening!
Submitting an Idea?

Be sure to review our Idea Submission Guidelines for more information!


Featured Ideas

Hi Alteryx Devs - 

 

It would be *really tight* to have a drop-down interface tool that supports auto-completion based on an ODBC connection to a table/column, or on an AJAX call.  I recently had a situation wherein we needed to give users the ability to select an address and then run a workflow.  But the truth is, our address data is terrible; what I really needed was to let users start typing the address, give them a list of choices to pick from, have them pick the correct (but usually wrongly formatted) address, and then send that value into the workflow. 

 

I could not find a decent way to give a Gallery user a reliable way to pick an address from our list, so I eventually wound up writing an AJAX piece to handle the auto-completion, capture the user input, and post to a service that would in turn interact with Gallery through the API, get the response, and send it back to the calling page, and back to the user.  That is a significant amount of work to put into auto-completion, an exceedingly common web operation.  
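For illustration, here is a minimal sketch of the kind of auto-completion service I mean: a small Python endpoint that queries an address table over ODBC and returns suggestions as JSON. The DSN, table, and column names are hypothetical placeholders.

```python
# Sketch of an address auto-completion endpoint (hypothetical names).
import pyodbc
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/autocomplete")
def autocomplete():
    term = request.args.get("q", "")
    conn = pyodbc.connect("DSN=address_db")  # hypothetical ODBC DSN
    cursor = conn.cursor()
    # Return up to 10 addresses starting with what the user has typed so far.
    cursor.execute(
        "SELECT TOP 10 full_address FROM addresses WHERE full_address LIKE ?",
        term + "%",
    )
    suggestions = [row[0] for row in cursor.fetchall()]
    conn.close()
    return jsonify(suggestions)

if __name__ == "__main__":
    app.run()
```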

 

This would make a lot of Gallery operations flow so much more naturally.  

 

Thanks for listening! 

brian

Hello,

 

I had a business case requiring a cost-effective and quick storage solution for real-time online survey data sourced from customers.  A MongoDB instance would fit the need, so I quickly spun up a cluster on MongoDB Atlas.  Atlas was launched by MongoDB in 2016 as a database-as-a-service deployed on AWS, and all Atlas instances require TLS/SSL to connect.  Currently, the Alteryx MongoDB connector does not support TLS/SSL connections and doesn't work against Atlas.  So I was left with a breakdown in my plan that would require manual intervention before ingesting data into Alteryx (not ideal).
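For reference, connecting to an Atlas cluster over TLS from Python is a one-liner with pymongo, which is essentially what the connector would need to do under the hood. The SRV URI and credentials below are hypothetical placeholders.

```python
# Sketch of the TLS/SSL connection the MongoDB connector lacks (pymongo).
from pymongo import MongoClient

# Atlas SRV URIs enable TLS by default; shown explicitly for clarity.
client = MongoClient(
    "mongodb+srv://user:password@cluster0.example.mongodb.net/surveys",
    tls=True,
)
print(client.surveys.responses.find_one())
```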

 

Please consider expanding this functionality to all connectors.  I am building Alteryx out in my agency as a data platform that handles sensitive customer information (name, address, email, etc.).  Most tools I use to connect to secure servers today support this type of connection, so this should be a priority for Alteryx to resolve. 

 

Thanks,

Mike Schock

 

 

 

 

In roughly all versions of Alteryx Designer, you can use the Annotation tab to rename a tool.  This is awesome for execution in Designer, because you can then easily search for certain tool names, better document your workflow, and see the custom tool name in the Workflow Results.

However, when log files are generated (whether via email, the AlteryxGallery settings, or an AlteryxEngineCmd command), each tool is recorded using only its default name of "ToolId ToolNumber", which is not particularly descriptive and makes these log files harder to parse in the case of an error.

 

Having the custom names show up in these log files would go a long way towards improving log readability for enterprise systems, and would be an amazing feature add/fix.  For users who prefer that the default format be shown, this could be treated as a request to ADD the renamed label alongside the existing format.  E.g., an "Input Data 1" tool that I have renamed to "Load business Excel File" could be shown in the log as:

 

00:00:0.003 - ToolId 1 - Load business Excel File: 1 record was read from file. Finished in 00:00:0.004
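A line in that format would also be trivial to parse mechanically; a rough sketch (the exact layout above is of course up to the Alteryx team):

```python
# Sketch: pulling the custom tool name out of the proposed log format.
import re

line = ("00:00:0.003 - ToolId 1 - Load business Excel File: "
        "1 record was read from file. Finished in 00:00:0.004")
m = re.match(r"^(?P<time>[\d:.]+) - ToolId (?P<id>\d+) - (?P<name>[^:]+): ", line)
if m:
    print(m.group("id"), "->", m.group("name"))  # 1 -> Load business Excel File
```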

Please upgrade the "curl.exe" that are packaged with Designer from 7.15 to 7.55 or greater to allow for -k flags. Also please allow the -k functionality for the Atleryx Download tool.  

 

-k, --insecure

(TLS) By default, every SSL connection curl makes is verified to be secure. This option allows curl to proceed and operate even for server connections otherwise considered insecure.

The server connection is verified by making sure the server's certificate contains the right name and verifies successfully using the cert store.
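For comparison, the same "proceed anyway" behavior in Python's requests library is the verify=False flag; this is roughly what a -k option on the Download tool would do. The URL below is a hypothetical placeholder, and skipping verification is only appropriate for servers you already trust (e.g. internal hosts with self-signed certificates).

```python
# Sketch of "-k"-style behavior: skip TLS certificate verification.
import requests
import urllib3

# Suppress the warning that requests emits for unverified HTTPS calls.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

response = requests.get("https://internal-host.example.com/data", verify=False)
print(response.status_code)
```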

 

Regards,

John Colgan

I know this has been posted before, but the posts are fairly old, and I have just confirmed with Support that it is still an issue.  Seems to be a pretty basic request, so I'm putting it out there again under this new heading.


The issue is that if you have data in a field, and that data is separated by a new line (\n), it will show up fine in a Browse tool or pretty much any other output (database file, Office document file, etc.). But if you try to use the Table tool under Reporting, it ignores the line break and strings the data together.


Example:

The field data looks like this in a browse or most other outputs:

Hello, my name is 

Michael Barone

and I love

Alteryx

 

But when I try to pull this field into a Table Tool, it shows up like this:
Hello, my name is Michael Barone and I love Alteryx

 

Putting this out here again in hopes that it gets lots and lots of stars so it gets put on the road map!!

 

We have Alteryx running in AWS, which seems to be a common setup. Our AWS instances are set up with IAM roles, which has been one of the security measures applied in order to finally allow our enterprise company to permit some development in the cloud. IT will not allow the sharing of access keys to connect to S3.

  • We would like to use the AWS S3 tools from the Connectors palette, as the AWS CLI has limited ability to handle/report exceptions or issues with any detail. At the moment, we are limited in what goes into production because we are using the CLI for what we can.
  • Ideally, an option would be added to the S3 tools allowing the user to select IAM roles rather than key access. Refer to the attached screenshot, and see the sketch below.
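For what it's worth, this is exactly how boto3 behaves: when no access keys are supplied, it falls back to the instance's IAM role via the metadata service. A minimal sketch (bucket and key names are hypothetical):

```python
# Sketch: S3 access via the EC2 instance's IAM role, no access keys.
import boto3

# No aws_access_key_id / aws_secret_access_key arguments: on an EC2
# instance with an attached IAM role, boto3 obtains temporary
# credentials from the instance metadata service automatically.
s3 = boto3.client("s3")
s3.download_file("my-bucket", "input/data.csv", "data.csv")
```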

Whenever I add an interface tool, it would be useful if a question constant were automatically added to the constants list, just like the 4 engine constants and any user constants, so that tools like the Formula and Filter tools could use it. This would be identical to how user constants behave currently. Here is the before and after for visual effect:

 

BEFORE:

Capture.PNG

 

 

AFTER:

Capture2.PNG

 

Please enhance the Input tool with a feature to test whether a file is there, and another to allow the workflow to pause for a definable period if the input file is locked by another user, then retry opening.  The pause time-frame would be definable in N seconds, and the number of iterations it cycles through should also be definable, so you can limit how many attempts it makes to open a file.
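A rough sketch of the retry behavior I have in mind, with both the pause and the attempt cap definable (the path and defaults are placeholders):

```python
# Sketch: wait for an input file to exist and be openable, pausing
# between a capped number of attempts.
import os
import time

def open_when_available(path, pause_seconds=30, max_attempts=10):
    for _ in range(max_attempts):
        if os.path.exists(path):
            try:
                return open(path, "rb")
            except OSError:
                pass  # locked by another user; fall through and retry
        time.sleep(pause_seconds)
    return None  # caller decides: raise an error or end quietly

handle = open_when_available(r"\\share\daily\input.csv")
```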

 

File presence should be something we could use to control workflow processing.  

 

A use case would be a process that runs periodically and looks to see whether a file is there; if so, it opens and processes the file.  If the file is not there, the workflow goes to sleep for a definable period before trying again, or simply ends without attempting to run any downstream tools that might otherwise throw "errors" trying to process a null stream.

 

An extension of this idea and use case would be a separate tool that could evaluate a condition (a null stream, field content, or a file-not-found condition) and terminate the process without raising an error indicator, or perhaps be configurable so you could choose whether or not an error occurs.

 

Using this latter idea, we would have an enhanced Input tool that can pass a value downstream or generate a null data stream to the next tool.  That next tool can then evaluate a condition, like a Filter tool (which may be a null stream, a file-not-found indicator, or some other condition), and terminate processing per its configuration, either with or without indicating a failure, according to the wishes of the user.  I have had times when a file was not there and I just wanted the workflow to stop without throwing errors; other times I may want it to error out so I am prompted to investigate.  In other scenarios, my data goes through a filter or two, no data passes the last filter, and downstream tools still run and generally cause a failure because they have no data to act on.  I don't want that; it may be perfectly valid that on a Sunday or holiday no data passes the filters.

 

Having meandered through this, I will sum up: the ideal is to enhance the Input tool to test file presence and pass that info on to another tool that can evaluate it and control the workflow run accordingly.  As a separate tool, it could be applied to a wider variety of scenarios and test a broader scope of conditions to decide whether to proceed or terminate the workflow.

 

This functionality would allow the user to select (through a highlight box or Ctrl+click) only the tools in a workflow they want to run; tools that are not selected would be skipped. The idea is similar to the new "add selected tools to a new tool container", but it would run them instead. 

 

I know the conventional wisdom is to either put everything you don't want run into a tool container and disable it, or to just copy/paste the tools you want run into a blank workflow. However, for very large workflows, it is very time-consuming to disable a dozen or more containers only to re-enable them shortly afterwards, especially if those containers have to be created just to isolate the tools that need to be run. Overall, this would be a quality-of-life improvement that could save the user some time, especially with large or cumbersome workflows.

Hey there,

 

The performance profiling option on the "Runtime" tab is very helpful for identifying bottlenecks in a long-running workflow.  However, this option (along with the entire "Runtime" tab) is missing if I change the workflow into a macro.

 

Given that the only way to build relatively complex dependent chain jobs is to wrap them in dummy batch macros (using a macro like a sub-procedure with flow-of-control on the master canvas), most of our work is done in macros, so it would be helpful to be able to performance-profile them during testing.

Hello,

As of today, only English is available. But it's hard to convince French customers with French-language data to buy the AIS if it cannot work with their data.

Best regards,

Simon

Getting simple information from a workflow log, such as the name, run start date/time, and run end date/time, is far more complex than it should be. Ideally the log would contain, as separate and distinctly labelled line items: the workflow path & name, the start date/time, the end date/time, and potentially the run time (to save having to do a calculation). An overall module status would also be of use: if there was an Error in the run, the overall status is Error; if there was a Warning, the overall status is Warning; otherwise Success.

 

Parsing out the workflow name and start date/time is challenge enough, but then having to parse out the run time, convert it to a time, and add it to the start date/time to get the end date/time makes retrieving basic monitoring information far more complex than it should be.
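To make the pain concrete, here is roughly what that parsing looks like today, assuming hypothetical line formats ("Started running at: ..." and "Finished in N seconds"); the actual wording varies by log source.

```python
# Sketch: reconstructing the end time from a start time and a run time.
import re
from datetime import datetime, timedelta

log = open("workflow.log").read()

start = datetime.strptime(
    re.search(r"Started running at: (.+)", log).group(1).strip(),
    "%m/%d/%Y %H:%M:%S",  # assumed timestamp format
)
seconds = float(re.search(r"Finished in ([\d.]+) seconds", log).group(1))
print("Start:", start, "End:", start + timedelta(seconds=seconds))
```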

For the Output tool with the Microsoft Excel (*.xlsx) file format (the non-Legacy one), there is no "Delete Data & Append" option like the one the Legacy and 97-2003 Excel formats have. 

 

Having Delete Data & Append for the most recent version of Excel would be very beneficial. Without it, there does not appear to be a way to update an existing Excel sheet from an Alteryx workflow while preserving the formatting within the sheet. The Overwrite/Drop option removes all formatting. 
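As a point of comparison, this is doable outside Alteryx with openpyxl, which rewrites cell values while leaving the sheet's formatting alone; a sketch of the "delete data & append" behavior (file, sheet, and rows are hypothetical):

```python
# Sketch: clear a sheet's data rows and write fresh ones, preserving
# the formatting already applied to the cells.
from openpyxl import load_workbook

wb = load_workbook("report.xlsx")
ws = wb["Data"]

# Blank out old values below the header row; styles stay on the cells.
for row in ws.iter_rows(min_row=2):
    for cell in row:
        cell.value = None

# Append the refreshed rows.
for r, record in enumerate([("2024-01-01", 42), ("2024-01-02", 57)], start=2):
    for c, value in enumerate(record, start=1):
        ws.cell(row=r, column=c, value=value)

wb.save("report.xlsx")
```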

 

I have a workflow refreshing an Excel sheet daily and then emailing it to a distribution list at the end of the workflow. Unfortunately, right now I have to use the 97-2003 format to preserve the formatting of the Excel sheet when it is automatically refreshed and emailed each day. 

 

Can you please assess adding this option? Thanks!

I love the new Custom Format option on the DateTime tool in Alteryx 11.0; this makes working with dates SO MUCH easier... BUT it would be great if you could update an existing field rather than having to create a new column (e.g. DateTime_Out) and then use a Select tool to put it back in place of the original date field.

 

 

Datetime.png

It would be great if we could set the default size of the window presented to the user upon running an Analytic App. Better yet, there could be an option to have it dynamically sized (auto-sized to the number of input fields required).

Please add a configuration to the Redshift bulk load to use EITHER access keys or an IAM EC2 role for access. 

 

We should not have to specify access keys when we are in an IAM-enabled environment.
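For reference, Redshift's COPY command already supports both styles; the sketch below shows the IAM_ROLE form the bulk loader could emit instead of embedded keys (connection details, bucket, and role ARN are hypothetical placeholders).

```python
# Sketch: Redshift bulk load using an IAM role instead of access keys.
import psycopg2

conn = psycopg2.connect(
    host="cluster.example.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="loader", password="...",
)
with conn.cursor() as cur:
    # Instead of CREDENTIALS 'aws_access_key_id=...;aws_secret_access_key=...'
    # let the cluster assume an attached role:
    cur.execute("""
        COPY staging.events
        FROM 's3://my-bucket/events/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoader'
        CSV
    """)
conn.commit()
```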

 

Thanks

Currently, when creating a table in Oracle from Alteryx, there is a lot of "magic" that happens under the hood in converting Alteryx data types to Oracle data types.

For example, Fixed Decimal creates NUMBER, String creates CHAR, and V_String creates VARCHAR.

It would be great to have an option to review the Oracle data types in the Output Data tool when writing to Oracle. This would improve efficiency when using Alteryx to create Oracle tables. 

See the picture for an example of what would display in the output configuration. 

Data type idea.png

One of the most common causes of admin trauma for our central Alteryx Gallery team is dealing with drivers that may be missing from the server, from a particular worker, or from a designer's machine.

 

What we're looking for is for the Alteryx team to maintain a packaged set of drivers as a single installer, which we can download from the same location as the Alteryx Designer / Server versions.

 

This would allow us to have one version of all drivers across ALL Designer clients, as well as on our workers and servers.

 

CC: @rijuthav @jithinmony @HengHe @RajK @ydmuley @revathi @Deeksha @MPistone @Ari_Fuller @Arianna_Fuller @JoshKushner @samnelson @avinashbonu @Sunder_Sriram @Rahul_Thakur @Rahul_Singh

When building an Alteryx macro, one of the tough parts is that the input data you put into the Macro Input is used for testing, but you cannot set its type.

 

So, for example: I want to test with the value 1, and Alteryx automatically assumes this is a Byte.

However, 1 is just a convenient test value; I need this field to be an Int64.

 

Can we have the option to strongly type macro inputs? This would give advanced users the ability to control the type on Macro Inputs and avoid this sort of issue with test data implicitly defining the type.

 

Note: this is similar to the idea here:

https://community.alteryx.com/t5/Alteryx-Product-Ideas/Set-data-type-in-Text-Input-tool/idc-p/115209...

 

I would like to specify two points on a map and have Alteryx create a spatial object that represents the best route from one point to another given some parameters such as quickest route, or shortest route.