
Alteryx Designer Ideas

Share your Designer product ideas - we're listening!

Featured Ideas

If a tool fails, there should be a way to customise the error message. Currently, the only way to do this is to log all messages to a file, read that file with another workflow, and then customise the messages there (see "Alteryx workflow error handling" on the Alteryx Community). A more convenient, built-in solution should let us:

- Find/replace parts of a message.

- Specify which tools' messages to modify.

- Change the message type.

- Change the order of the messages in the Results window, to prioritise the critical ones.

- Pick which messages cannot be hidden behind "xxx more errors not displayed".

 

This would especially help with macros, as sometimes a specific tool fails within a macro and produces a message that is anything but user-friendly.
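Until something like this is built in, the log-file workaround above can be scripted. Here is a minimal sketch of post-processing an engine log, assuming the workflow writes its messages to a file; the log path and the message mapping are hypothetical examples, not part of the product.

# Minimal sketch: post-process an Alteryx engine log and rewrite selected tool
# messages. The paths and the mapping below are hypothetical.
replacements = {
    "Tool #1: Traceback (most recent call last):": "Tool #1: the Python tool received no input records.",
}

with open(r"C:\Temp\workflow.log", encoding="utf-8") as src:
    lines = src.readlines()

cleaned = []
for line in lines:
    for raw, friendly in replacements.items():
        if raw in line:
            line = line.replace(raw, friendly)
    cleaned.append(line)

with open(r"C:\Temp\workflow_clean.log", "w", encoding="utf-8") as dst:
    dst.writelines(cleaned)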

Here's a sample of what you get if no records are read into a Python tool:

 

Error: CReW SHA256 (4): Tool #1: Traceback (most recent call last):
  File "D:\Engine_10804_3513901e8d4d4ab48a13c314a18fd1f9_\2f1b1eb4701e445775092128efe77f76\workbook.py", line 7, in <module>
    df = Alteryx.read('#1')
  File "C:\Program Files\Alteryx\bin\Miniconda3\envs\DesignerBaseTools_venv\lib\site-packages\ayx\export.py", line 35, in read
    return _CachedData_(debug=debug).read(incoming_connection_name, **kwargs)
  File "C:\Program Files\Alteryx\bin\Miniconda3\envs\DesignerBaseTools_venv\lib\site-packages\ayx\CachedData.py", line 306, in read
    data = db.getData()
  File "C:\Program Files\Alteryx\bin\Miniconda3\envs\DesignerBaseTools_venv\lib\site-packages\ayx\Datafiles.py", line 500, in getData
    data = self.connection.read_nparrays()
RuntimeError: DataWrap2WrigleyDb::GoRecord: Attempt to seek past the end of the file

 

I've fixed this in my macro by forcing a DUMMY record into the Python tool (and deleting it on the back end). It would be much nicer to have error handling that prevents the issue. Even a configuration option for what to do when there is no input would simplify things.
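For anyone hitting the same thing, a sketch of guarding the read inside the Python tool is below; the expected column list is an assumption and the fallback schema would need to match your macro.

# Sketch: guard Alteryx.read() against an empty incoming connection inside the
# Designer Python tool. EXPECTED_COLUMNS is a hypothetical schema.
import pandas as pd
from ayx import Alteryx

EXPECTED_COLUMNS = ["Path", "SHA256"]

try:
    df = Alteryx.read("#1")
except RuntimeError:
    # No records reached the tool: fall back to an empty frame with the schema.
    df = pd.DataFrame(columns=EXPECTED_COLUMNS)

if not df.empty:
    pass  # real processing goes here

Alteryx.write(df, 1)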

 

This error condition potentially affects every Python tool created.

 

Cheers,

 

Mark

When switching modes, the Python tool sometimes restarts and loses all of the code:

[Attached GIF: Python Tool Bug.gif]

Hi,

I want to control the order of execution of objects in an Alteryx workflow, but right now we ONLY have Block Until Done, which is not the right choice in many cases.

Could we have a container (say, a Sequence Container), put a piece of logic in each container, and control execution by connecting the containers?
That way we could control the execution order.
It might look something like the image below.



I would love to be able to see the actual curl statement that is executed as part of the Download tool. Maybe a debug switch could be added which produces one extra output field containing the curl statement itself? This would greatly enhance the ability to debug when things aren't working as expected with the Download tool.
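In the meantime, the equivalent curl command can be reconstructed by hand from whatever you feed the Download tool. A rough sketch, where the URL, headers and payload are hypothetical placeholders:

# Sketch: rebuild an approximate curl command from the URL, headers and payload
# that would be passed to the Download tool. All values are hypothetical.
import shlex

def build_curl(url, headers, payload=None):
    parts = ["curl", "-X", "POST" if payload else "GET", shlex.quote(url)]
    for name, value in headers.items():
        parts += ["-H", shlex.quote(f"{name}: {value}")]
    if payload:
        parts += ["--data", shlex.quote(payload)]
    return " ".join(parts)

print(build_curl("https://example.com/api/v1/items",
                 {"Authorization": "Bearer <token>"},
                 '{"query": "test"}'))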

I've seen this question before and have run into it myself. I'd like to see a new tool that would allow a workflow developer to choose a path of logic based on criteria known only during the execution of a module. For example:

 

IF [LEFT INPUT record count] < 10,000 THEN
    Path 1 (e.g. use a Calgary Join)
ELSE
    Path 2 (e.g. use a standard Join)
ENDIF
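A hedged sketch of the same branch-on-record-count idea, expressed in Python with two ordinary pandas merges standing in for the Calgary and standard joins:

# Sketch of branching on record count. The 10,000-row threshold mirrors the
# pseudocode above; both branches are plain pandas joins used as stand-ins.
import pandas as pd

def conditional_join(left, right, key):
    if len(left) < 10_000:
        # Path 1: small input - an indexed lookup join.
        return left.join(right.set_index(key), on=key, how="inner")
    # Path 2: larger input - a standard merge.
    return left.merge(right, on=key, how="inner")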

 

Thanks,

 

Mark

I would like to suggest adding a configuration option to the Block Until Done tool which allows the user to prioritise the release of a data stream across multiple Block Until Done tools in the same module.

 

In the example below, the objective is to update multiple sheets in a single Excel workbook. Each sheet comes from a different data stream that cannot be unioned with the others, so the usual approach of filtering a single stream and feeding the filter outputs into multiple Block Until Done tools is impossible.

 

What I would like is a configuration where Block Until Done #2 will not allow its data stream to pass through until Block Until Done #1 is complete, then Block Until Done #3 will not pass its stream until Block Until Done #2 is complete, and so forth through all the Block Until Done instances.
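For comparison, the strict ordering being requested is roughly what you get outside Designer when sheets are written one after another with pandas; a minimal sketch, where the workbook name and the per-sheet data are hypothetical:

# Sketch: write several data streams to one workbook strictly in sequence,
# which is the ordering guarantee the idea asks Block Until Done to provide.
import pandas as pd

streams = {
    "Summary": pd.DataFrame({"total": [42]}),
    "Detail": pd.DataFrame({"id": [1, 2], "value": [10, 32]}),
}

with pd.ExcelWriter("report.xlsx", engine="openpyxl") as writer:
    for sheet_name, df in streams.items():   # written one after another
        df.to_excel(writer, sheet_name=sheet_name, index=False)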

 

[Attached image: ScubaGeek_0-1654554889263.png]

 

We need color coding in the SQL editor window for Input tools. We are always having to pull our code out of there and copy it into a Teradata window so it is easier to read/troubleshoot. This would save us some time and hassle and would improve the Alteryx user experience. (I think you've used a couple of my ideas already. This one is a good one too.)

[Attached images: 2018-10-18_7-49-16.jpg, 2018-10-18_7-50-52.jpg]

 

The Dynamic Input tool will not accept inputs with different record layouts. The "brute force" solution is to use a standard Input tool for each file separately and then combine them with a Union tool, which accepts files with different record layouts and issues warnings. Please enhance the Dynamic Input tool (or perhaps add a new tool) to combine the Dynamic Input functionality with the Union tool's more laid-back, inclusive approach. Thank you.

This has probably been mentioned before, but in case it hasn't....

 

Right now, if the Dynamic Input tool skips a file (which it often does!), it just raises a warning and continues processing. While being able to continue processing is still useful, could an option be built into the tool to 'error if files are skipped'?

 

Right now it is easy to miss that this is happening, and in production / on Server you may want the process to stop instead.
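As a sketch of the behaviour being asked for, a fail-fast check over an expected file list (the folder and file names are hypothetical) looks like this:

# Sketch of "error if files are skipped": fail fast when any expected input
# file is missing instead of warning and carrying on.
from pathlib import Path

expected = [Path("data/jan.csv"), Path("data/feb.csv")]

missing = [p for p in expected if not p.exists()]
if missing:
    raise FileNotFoundError(f"{len(missing)} input file(s) would be skipped: {missing}")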

 

Thanks,

 

Andy 

 

 

This has probably been mentioned before, but in case it hasn't....

 

The Dynamic Input tool is useful for bringing in multiple files / tabs, but it quickly stops being fit for purpose if schemas / fields differ even slightly. The common solution is to put the Dynamic Input tool inside a batch macro and set the macro output to 'Auto Configure by Name', so that it waits for all files to run and then outputs knowing what it has received.

 

It's a pain to create these batch macros for relatively straightforward and regular processes - would it be possible to have 'Auto Configure by Name' as an option directly in the Dynamic Input tool, removing the need for a batch macro?
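For reference, the 'configure by name' behaviour is what a column-wise union gives you in pandas; a minimal sketch, with a hypothetical input folder:

# Sketch of "auto configure by name" across files whose schemas differ slightly:
# pd.concat unions columns by name and fills the gaps with nulls.
import glob
import pandas as pd

frames = [pd.read_csv(path) for path in glob.glob("input/*.csv")]
combined = pd.concat(frames, ignore_index=True, sort=False)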

 

Thanks,

 

Andy 

 

 

Idea: Prompt the user to find a missing macro instead of the current UX of a question mark icon.

 

Issue: When a macro referenced in a workflow is missing, there is no way to a) know the name of the missing macro (assuming you were lazy like me and didn't document it with a comment) and b) find the macro so you can get back to business.

 

When this happens to me now, I have to go to the XML view, search for macros, and cycle through them until I find the one that's missing. Then I have to either copy the macro back into that location or manually edit the workflow XML. Not cool, man.

 

Solution: When a macro is missing, the image below on the right should be shown. In the Properties window, a file browse control should allow the user to find the macro.

 

 

 

[Attached image: FindMissingMacro.png]

 

 

 

 

 

Who needs a 1073741823-character string anyway? No one, or close enough to no one. But if you are creating some fancy new fields in the Formula tool and just cranking along, and then you see that your **bleep** data stream is 9 GB for nine rows of data, you find yourself wondering what the hell is going on. Then you walk your way down the workflow for a while, finding the spots where the default 1073741823 size got set and changing them to non-insane sized strings, and then your data flow is more like 64 KB and your workflow runs in 3 seconds instead of 30.

 

Please set the default string size in the Formula tool to a non-insane value that won't need to be changed in 99.99999% of use cases. Thank you.

 

 

When building API calls within Alteryx there are a few common steps required:

1) Build the URI for the API call (base URL plus any query parameters).

2) Deal with authentication; for example, basic authentication requires taking a key and secret, base64 encoding them, and passing the result into the tool.

3) Parse the results out and process them downstream.

 

For this idea I am specifically focusing on step 3 (although it would also be great to have common authentication methods built into the Download tool for step 2!).
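As an aside on step 2, the manual work today boils down to something like the following sketch; the key and secret are placeholders:

# Aside on step 2: building a Basic authentication header from a key and secret,
# which currently has to be done by hand before the Download tool.
import base64

key, secret = "my_key", "my_secret"   # hypothetical credentials
token = base64.b64encode(f"{key}:{secret}".encode()).decode()
headers = {"Authorization": f"Basic {token}"}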

 

There are common steps required to parse out the results, such as using a Filter tool (to check for a 200 response), JSON Parse, Text To Columns and then Cross Tab to get the results into a readable format. These will all be familiar to anyone who has worked with APIs:

[Attached image: cgoodman3_2-1616585073736.png]

 

This is all fine for a regular user to quickly add and configure. However, there is no validation that the JSON result is as expected, which means that when the API call is embedded in a batch macro or analytic app it can easily fail.

One example of a failure I've recently come across is where the output JSON doesn't contain all fields (name:value pairs), depending on the response. For example, using the UK Companies House API and looking at the ceased-to-act field at this endpoint - https://developer-specs.company-information.service.gov.uk/companies-house-public-data-api/resources... - the field only appears in the results if a person has actually ceased to act. This matters if you have downstream tools such as a formula to create an [Active] field:

IF ISNull([ceased_to_act]) THEN "Active" ELSE "Ceased to Act" ENDIF

However, without modification the macro / app will error whenever results are returned that do not contain this field.

 

A workaround is to add the CReW Ensure Fields macro, or to union onto a list of fields, to ensure that the ceased-to-act field is present in the output for every API call. But looking at some other tools, it would be good if an expected schema could be built into the Download tool to do this automatically.
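A minimal sketch of that kind of schema guard, applied after JSON Parse; the expected field names are assumptions based on the Companies House example:

# Sketch of an "expected schema" guard: make sure every field the downstream
# tools rely on exists, adding nulls where the API omitted it.
import pandas as pd

EXPECTED_FIELDS = ["name", "appointed_on", "ceased_to_act"]   # assumed schema

def ensure_fields(df):
    for field in EXPECTED_FIELDS:
        if field not in df.columns:
            df[field] = None   # same effect as the CReW Ensure Fields macro
    return df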

 

For example in Power Automate this is achieved as follows:

 

[Attached image: cgoodman3_1-1616584699689.png]

 

I am a big advocate of not making things unnecessarily complicated, so I would categorise this as an ease-of-use feature: it improves the experience of working with APIs within Alteryx and makes APIs (and the many integrations that are API-based) accessible to as many users as possible.

 

 

 

 

It would be really great if we added single sign-on (SSO) support to the Download tool. Many users face this issue when trying to download data from a web URL. In some cases the URL verifies the sign-on and then redirects to a link from which the data can be downloaded. Currently the Download tool fails to handle SSO or SiteMinder authentication.

Hello all,

As of today, you can use the Dynamic Select tool with two options:

- by type (you can dynamically select all fields, all date fields, etc.)
- by formula

I suggest two easy improvements:

- from a field list: you connect a list of field names (a "Field name" field) to a second input anchor and those fields are selected
- from a flow: you connect a data stream to a second input anchor and the fields it has in common with the main input are selected (see the sketch below)
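A minimal sketch of the second suggestion in pandas terms, with hypothetical streams:

# Sketch of the "from flow" suggestion: keep only the fields the main stream
# has in common with a second, reference stream.
import pandas as pd

main = pd.DataFrame({"id": [1], "name": ["a"], "extra": [0]})
reference = pd.DataFrame(columns=["id", "name"])

common = [c for c in main.columns if c in reference.columns]
selected = main[common]   # same effect as a dynamic "common fields" select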

Best regards,

Simon

In the normal Output Data tool, when the file type is CSV, it is possible to select a custom delimiter. It would be great to have the same option in the Azure Data Lake output tool, so that, for example, you can write a pipe-delimited file to your ADLS storage account.
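The requested behaviour is a one-argument change in pandas terms, as in this sketch with hypothetical data and file name:

# Sketch: writing a pipe-delimited file, which is what the idea asks the Azure
# Data Lake output tool to support.
import pandas as pd

df = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})
df.to_csv("output.psv", sep="|", index=False)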

Hello All,

As of today, Alteryx uses the proxy settings set in the Windows Network and Internet settings: "Server pulls the proxy settings displayed in Engine > Proxy from the Windows internet settings for the user logged into the machine. If there are no proxy settings for the user logged into the machine, Engine > Proxy isn't available within the System Settings menu." You can then override the credentials (but not the address) in the system settings and also in the user settings.

The issue: in many organizations there are several proxies for different use cases, and by default it can happen that access to APIs is blocked by these proxies. The user, who is not an admin, cannot change the Windows settings... and even if IT does it, it impacts the whole system, including other software, and can lead to security failures.


[Attached image: image.png]
What I suggest:

- the ability to change the credentials AND the address

- multi-level settings for both credentials and address:
    default: Windows settings
    System Settings
    User Settings
    Workflow Settings
    Download tool settings
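For illustration, a per-call override at the lowest level of that hierarchy would look like the following sketch with the requests library; the proxy address and credentials are placeholders:

# Sketch of a per-download proxy override: address and credentials chosen for
# one call rather than inherited from the Windows/system settings.
import requests

proxies = {
    "https": "http://user:password@proxy-api.example.com:8080",   # hypothetical
}
response = requests.get("https://api.example.com/data", proxies=proxies, timeout=30)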

 

Best Regards,

Simon

 

The Alteryx.Flexnetoperations.com license management site needs major work.

 

On the View Licenses page it shows all licenses going back several years. A basic need is to show only licenses which haven't expired, but that is not an option. You cannot even sort on the expiration column, while you can sort on most other columns.

 

The simplest need is to see a list of my current active license users - but I don't see a way to do that.

 

I tried an "Advanced Search" and chose an expiration date after 2019-10-29, and none of my licenses which expire in 2020 appear - I get a blank list.

 

Similarly, on the Administer Machines page you cannot filter to hide expired licenses, or even filter on the licenses column (which doesn't sort either).

 

The help link on the page doesn't bring you to help specific to that page, but to the general activation help front page. After several clicks I found this page:

 

https://help.alteryx.com/licensing/current/Administer/AdministerMachines.htm?tocpath=Administer%7C__...

 

But the help is incomplete (it doesn't list the machine types or the difference between Active and Inactive).

 

Also, there is no export capability - copying and pasting into Excel is a formatting headache as it brings in checkboxes.

 

Lots of room for improvement here.

 

Cheers,

Bob

 

P.S. I understand that work is being done on this, but an ETA would be greatly appreciated.

 

It would be great to have the functionality below in Alteryx.

Once a workflow is built in Alteryx, a button click would generate the SQL code for a specific database platform, such as SQL Server, which could then be run in external editors such as SQL Server Management Studio. Thanks.
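Purely as an illustration of the output envisioned, a trivial sketch of turning a workflow description (here a hypothetical Input + Filter + Select) into T-SQL:

# Illustration only: generate SQL from a toy description of a workflow.
# The tool list, table and field names are all hypothetical.
tools = [
    {"tool": "Input", "table": "dbo.Sales"},
    {"tool": "Filter", "condition": "Region = 'EMEA'"},
    {"tool": "Select", "fields": ["OrderId", "Amount"]},
]

fields = ", ".join(next(t["fields"] for t in tools if t["tool"] == "Select"))
table = next(t["table"] for t in tools if t["tool"] == "Input")
where = next(t["condition"] for t in tools if t["tool"] == "Filter")

print(f"SELECT {fields}\nFROM {table}\nWHERE {where};")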
