
Alteryx Designer Ideas


Featured Ideas

Hello all,

In the help, we can read the following:
https://help.alteryx.com/current/designer/write-data-db-tool

Update/Delete is currently only supported for SQL Server ODBC connections.

 

I don't know about you, but while SQL Server is widely used for transactional workloads, in analytics... well... I have only used it once in several dozen contexts!

Maybe it would be cool to make it work on many more databases?
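For context, here is a minimal sketch of the kind of statement that currently has to be scripted by hand for non-SQL-Server sources (e.g. from a Python tool); the DSN, table and key names are hypothetical, and this is only an illustration of what the requested Update option would do:

```python
import pyodbc

# Hypothetical ODBC DSN pointing at a non-SQL-Server analytics database.
conn = pyodbc.connect("DSN=analytics_dwh;UID=simon;PWD=***")
cur = conn.cursor()

# The requested Update/Delete support would issue statements equivalent to
# this, keyed on the table's primary key field.
cur.execute("UPDATE sales SET amount = ? WHERE sale_id = ?", 129.99, 42)
conn.commit()
conn.close()
```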

Best regards,

Simon

When building API calls within Alteryx there are a few common steps required:

1) Build out the URI for the API call (base URL plus any query parameters)

2) Deal with authentication; for example, basic authentication requires taking a key and secret, Base64 encoding them and passing this into the tool

3) Parse the results out and process these downstream

 

For this idea I am specifically focusing on step 3 (though it would also be great to have common authentication methods built into the Download tool for step 2!).
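As an aside on step 2, this is roughly the boilerplate a built-in option would remove; a minimal Python sketch with a placeholder key and secret (not the Download tool's own mechanism):

```python
import base64

# Placeholders for the API key and secret.
key, secret = "my_key", "my_secret"

# Basic authentication: Base64-encode "key:secret" and pass the result to the
# Download tool as the value of an "Authorization" header.
token = base64.b64encode(f"{key}:{secret}".encode()).decode()
authorization_header = f"Basic {token}"
print(authorization_header)  # e.g. "Basic bXlfa2V5Om15X3NlY3JldA=="
```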

 

There are common steps required to parse out the results, such as using a Filter (to check for a 200 response), JSON Parse, Text to Columns and then Cross Tab to get the results into a readable format. These are all steps anyone who has worked with APIs will be familiar with:

[screenshot: cgoodman3_2-1616585073736.png]

 

This is all fine for a regular user to quickly add in and configure. However, there is no validation that the JSON result is as expected, which means that when the API call is embedded in a batch macro or analytic app it can easily fail.

One example of a failure which I've recently come across is where the output JSON doesn't contain all fields (name:value pairs) in every response. For example, using the UK Companies House API and looking at the ceased to act field at this endpoint - https://developer-specs.company-information.service.gov.uk/companies-house-public-data-api/resources... - the ceased_to_act field only appears in the results if a person has actually ceased to act. This matters if you have downstream tools such as a formula creating a field [Active]:

IF ISNull([ceased_to_act]) THEN "Active" ELSE "Ceased to Act" ENDIF

However, without modification the macro/app will error if any results are returned that do not contain this field.

 

A workaround is to add in the CReW Ensure Fields macro, or to union on a list of fields, to ensure that the ceased_to_act field is present in the output for all API calls. But looking at some other tools, it would be good if an expected schema could be built into the Download tool to do this automatically.
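To make the "expected schema" idea concrete, here is a minimal sketch of what "ensure these fields exist" means, using Python and a hypothetical list of expected fields (this is not the macro's actual logic):

```python
import json

# Hypothetical expected schema for the officers endpoint; ceased_to_act is
# optional in the raw response, so we guarantee it exists downstream.
EXPECTED_FIELDS = ["name", "appointed_on", "ceased_to_act"]

def ensure_fields(record: dict) -> dict:
    """Return the record with every expected field present (missing ones as None)."""
    return {field: record.get(field) for field in EXPECTED_FIELDS}

raw = json.loads('{"name": "A PERSON", "appointed_on": "2019-01-01"}')
print(ensure_fields(raw))
# {'name': 'A PERSON', 'appointed_on': '2019-01-01', 'ceased_to_act': None}
```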

 

For example in Power Automate this is achieved as follows:

 

[screenshot: cgoodman3_1-1616584699689.png]

 

I am a big advocate of not making things unnecessarily complicated. Therefore I would categorise this as an ease-of-use feature to improve the experience of working with APIs within Alteryx and make APIs (as loads of integrations are API-based) accessible to as many users as possible.

 

 

 

 

Hello all,

As of today, you can use the Dynamic Select tool with two options:

- by type (you can dynamically select all fields, all date fields, etc.)
- by formula

I suggest two easy improvements:

- from a field list: you connect a list of field names to a second input that carries a "Field name" field
- from a flow: you connect a data stream to a second input and the fields common to both streams are selected (see the sketch below)
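To illustrate the second suggestion, a minimal pandas sketch of the "common fields" behaviour (the data frames are made up; this only shows the behaviour, not how the tool would be built):

```python
import pandas as pd

# Main data stream and a second "template" stream (both hypothetical).
main = pd.DataFrame({"id": [1, 2], "name": ["a", "b"], "debug_flag": [0, 1]})
template = pd.DataFrame(columns=["id", "name", "amount"])

# "From flow": keep only the fields the two streams have in common,
# preserving the main stream's column order.
common = [c for c in main.columns if c in template.columns]
print(main[common])
```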

Best regards,

Simon

It would be very helpful if we could pick from a list of installed Calgary datasets in the dropdown menu:

We currently have the ability to choose Geocoder/Drivetime/Allocate datasets, which are typically stored in the .ini files, but we don't currently have the ability to choose Calgary datasets.

[screenshot: jarrod_1-1628862282636.png]

 

 

Hi all,

 

The Publish to Tableau Server tool is great... but it requires a username and password. If you are using AD, there is a chance that your users don't have a password. In that case, you probably have a technical user that you share across the team. This is not an ideal situation and you lose the governance around the data.

 

Fortunately, there is an easy workaround: you can leverage personal access token authentication: https://help.tableau.com/v2019.4/server/en-us/security_personal_access_tokens.htm

 

The advantage of this method is that it logs in with your user and your data source is uploaded under your name. This still uses the Tableau REST API, so the changes needed in the current macro are MINOR.

 

Changes needed in the current macro:

 

1 - Add an authentication method parameter with two choices: Username/Password; Personal Access Token

2 - If Personal Access Token is selected, add two parameters: Token_Name and Token_Value

3 - In the TableauServer.Login supporting macro, improve the formula (13) to change the payload based on the user's selection. If Username/Password, keep it as is; otherwise use the syntax here: https://help.tableau.com/current/api/rest_api/en-us/REST/rest_api_concepts_auth.htm#make-a-sign-in-r... (see the payload sketch below).
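For reference, a minimal sketch of what the personal access token sign-in payload looks like against the Tableau REST API; the server URL, site, token name and secret are placeholders, and the API version depends on your server release:

```python
import requests

server = "https://tableau.example.com"   # placeholder server
url = f"{server}/api/3.6/auth/signin"    # 3.6 corresponds to Server 2019.4; adjust as needed

# JSON payload for a personal-access-token sign-in (instead of name/password).
payload = {
    "credentials": {
        "personalAccessTokenName": "my-token-name",      # Token_Name parameter
        "personalAccessTokenSecret": "my-token-secret",  # Token_Value parameter
        "site": {"contentUrl": "MySite"},
    }
}

resp = requests.post(url, json=payload, headers={"Accept": "application/json"})
resp.raise_for_status()
print(resp.json()["credentials"]["token"])  # session token used for subsequent calls
```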

 

 

This is quite a straightforward change, but it could help a lot of companies using Alteryx.

Can you please implement these changes to strengthen this tool?

 

Thanks a lot,

I personally think it would work better to tab from 'Select Column' to 'Enter Expression Here' rather than to the 'Functions' list, as people who are tabbing probably want to start typing the formula immediately rather than going through functions, fields, etc.

 

[screenshot: joe_lipski_1-1589790965393.png]

 

 

We need a way to highlight connection lines, whether that means right-clicking and selecting a color or something similar; just having the lines become black and BOLD doesn't cut it, and it's not easy on the eyes. If I could click a line/connector and make it bright green, that would be ideal, and then I could see where it connects better when zooming out.

[screenshot: mbogusz_0-1612895197406.png]

 

Regarding a change to the Designer UI in 2021.2: in the filter box, I used to be able to click a little X in the corner to immediately clear the filter or sort that had been applied. That X is gone as of 2021.2; now I must navigate to the last cascade of the menu to reach the word Clear and click on it to clear the filter.

This feels like another very tiny move in the wrong direction. These small UI changes add 2 or 3 steps and slow down the diagnostic/navigation process when moving around the Results Grid in the Browse tool, or at any other point in the flow where the Results Grid is used.

Can the X in the top level of the Filter/Sort box in the Results Grid be restored in 2021.2?

 

[screenshot: Drvt6713_0-1626981721300.png]

Related to submission:

Small fix for the UI in the Results Grid (or Browse Tool)

Small Keyboard fix for the browse tool's filter

Referencing the previous idea: Inputs/Output should have the option to read/write a compressed file (ZIP or GZIP)

 

This idea has been implemented for inputting .zip files. However, we still need to use the Run Command workaround for outputs. It's very common for users to want to output their .csv, .xlsx, or .pdf files to a .zip. The functionality would also need to extend to the Gallery.
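Until this exists natively, the workaround amounts to something like the following; a sketch of what the Run Command / scripted step does, with placeholder file names:

```python
import zipfile

# Compress the files the workflow just wrote (placeholder paths) into one archive,
# which is what an "output to .zip" option on the Output Data tool would do for us.
outputs = ["report.csv", "summary.xlsx", "invoice.pdf"]

with zipfile.ZipFile("outputs.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for path in outputs:
        zf.write(path)
```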

 

See the following links for people that are looking for this type of functionality:

https://community.alteryx.com/t5/Alteryx-Server-Discussions/Download-Multiple-Outputs-from-the-Galle...

https://community.alteryx.com/t5/Alteryx-Designer-Discussions/Output-files-to-ZIP/td-p/163502

https://community.alteryx.com/t5/Alteryx-Designer-Discussions/Zip-files/td-p/151456

 

Feel free to merge this idea with the previous one for continuity.

It would be great if the Formula tool could extend IntelliSense to the select column box. For example, I could start typing in the select column box and it would whittle the list of fields down. Let's suppose I wanted to update field 79A; I could type in 7 and it might show something like:

7

17

27

37

70

71

79A

79B.

 

So if I then typed in 79, it would further reduce the list to:

79A

79B

 

And I could select 79A.
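A minimal sketch of the matching behaviour being described, just to illustrate the idea (this is not Designer's actual IntelliSense):

```python
# Field names from the example above.
fields = ["7", "17", "27", "37", "70", "71", "79A", "79B"]

def suggest(typed: str) -> list[str]:
    """Return fields whose name contains what has been typed so far."""
    return [f for f in fields if typed.lower() in f.lower()]

print(suggest("7"))   # ['7', '17', '27', '37', '70', '71', '79A', '79B']
print(suggest("79"))  # ['79A', '79B']
```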

 

[screenshot: patrick_digan_0-1614186078945.png]

 

When inputting a CSV file via the Azure Data Lake File Storage tool, the default behaviour is for the first row to be interpreted as data.

 

When reading the same file locally using the File Input tool, the default behaviour is for the first row to be interpreted as headers.

 

Since the majority of files will include headers on the first row, it would be helpful to have the "First row contains field names" option selected by default in the Azure Data Lake File Storage tool, and this would also bring the defaults of this tool in line with the standard File Input tool.

 

Illustration below showing the issue:

[screenshot: jamielaird_0-1620852719680.png]

 

When you render out to an Excel file, the Excel file is created as a new file. You cannot render to an existing Excel file.

I'd like to see this functionality. I have a client who has a workbook with multiple formatted sheets, and they'd like to render an additional sheet of formatted data out from Alteryx into the existing workbook.
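As an illustration of the gap, appending a sheet to an existing workbook is straightforward outside the Render tool; a sketch with openpyxl (file name, sheet name and data are placeholders), though it of course loses the Render tool's report formatting:

```python
from openpyxl import load_workbook

# Open the client's existing, formatted workbook (placeholder path) and add
# a new sheet for the Alteryx output without touching the other sheets.
wb = load_workbook("client_report.xlsx")
ws = wb.create_sheet("Alteryx Output")

rows = [("Region", "Sales"), ("North", 1200), ("South", 950)]  # sample data
for row in rows:
    ws.append(row)

wb.save("client_report.xlsx")
```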

I often need to create a record ID that automatically increments but is grouped by a specific field. I currently do it with the Multi-Row Formula tool using [Row-1:ID]+1, because there is no group-by option in the Record ID tool.

 

Also, sometimes I need to start at 0 but the Multi-Row Formula tool doesn't allow this so I have to use a Formula tool right after to subtract 1.

 

So adding a group-by option to the Record ID tool would save the user from using the Multi-Row Formula tool for this, and would allow starting at any value.
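For illustration, the behaviour being asked for is essentially this (a pandas sketch with made-up data; the starting value is configurable):

```python
import pandas as pd

df = pd.DataFrame({"customer": ["A", "A", "B", "A", "B"]})

start = 0  # the requested "start at any value" option
# Grouped record ID: an incrementing counter that restarts for each customer.
df["RecordID"] = df.groupby("customer").cumcount() + start

print(df)
#   customer  RecordID
# 0        A         0
# 1        A         1
# 2        B         0
# 3        A         2
# 4        B         1
```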

I love this tool, but I think it would be improved by including an option to create a column per delimiting character. This could be added in the number-of-columns selector box. In the case where one row has more delimiters than another, null columns can be created. Without this option you have to use Regex to count the delimiters, select the max, embed the Text to Columns tool in a macro, and then pass the max column count as a parameter. It would be nice to resolve all of this in the main tool.
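A minimal sketch of the requested behaviour (one column per delimiter, with shorter rows padded with nulls), using pandas on made-up data:

```python
import pandas as pd

df = pd.DataFrame({"codes": ["a|b|c", "d|e", "f"]})

# expand=True creates as many columns as the row with the most delimiters
# needs; shorter rows are padded with null (None) values.
split = df["codes"].str.split("|", expand=True)
print(split)
#    0     1     2
# 0  a     b     c
# 1  d     e  None
# 2  f  None  None
```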

 

Thanks, nick

After using the Text to Columns tool, I generally find myself using a Select tool to get rid of the original field that I split up. Could an option be added in the config to automatically delete this field once it is split to columns?

Cleanse Macro

Given a choice between the delivered macro and the CReW macro, I’ll choose the CReW macro for both speed and functionality.  Wikipedia says, “Data cleansing or data cleaning is the process of detecting and correcting (or removing) corrupt or inaccurate records from a record set, table, or database and refers to identifying incomplete, incorrect, inaccurate or irrelevant parts of the data and then replacing, modifying, or deleting the dirty or coarse data.”  If Alteryx were to convert the macro to a true tool, here is my feature request list:

Performance:

  • AMP compatible – Fast!
  • Faster than the CReW macro for deleting empty fields/rows
  • Resolve the time it takes to load the tool (current macro versions are slow); HTML-based tools are faster.

Feature Enhancement:

  • Allow selection of fields based on data type
  • Include incoming/outgoing SELECT functionality
  • Allow for PREFIX functionality (like multi-field formula), but NOT default
  • Read incoming metadata to provide color coding of fields to indicate where potential problems exist (e.g. NULL, Whitespace) – part of browse everywhere currently
  • Allow for Nulls to convert to 0/blank, or 0/blank to convert to Null
  • When removing punctuation, provide for exceptions (e.g. keep the numeric set of negative sign, comma and period) – a small sketch of these two follows this list
  • Include HTML tag removal
  • Support internationalization (character sets)
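To make those two requests concrete, a small Python sketch of roughly what they would do (not the macro's or a future tool's actual implementation; field names and data are made up):

```python
import re
import pandas as pd

df = pd.DataFrame({"amount": ["1,234.50", None, "-9.99"], "note": ["ok!!", None, "n/a"]})

# Nulls to blank (or to 0 for numeric-style fields), as requested above.
df["note"] = df["note"].fillna("")

# Remove punctuation but keep an exception set (negative sign, comma, period)
# so numeric strings like "-1,234.50" survive the cleanse.
keep = "-,."
pattern = re.compile(rf"[^\w\s{re.escape(keep)}]")
df["amount"] = df["amount"].fillna("0").map(lambda s: pattern.sub("", s))

print(df)
```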

Going the extra mile:

  • Display, or optionally output, cleanup metrics. How dirty was my data? Potentially, allow for an ERROR to stop the workflow if garbage is detected.
  • Optional:  Detect outliers in numeric data.  I’ve got an outlier detection macro that we can review, but while you are passing all of the data for numeric values, explaining or tagging outliers would be useful.  Could be a box-whisker on numeric values maybe?
    • Make outlier actionable
      • Identify in data (new field indicator)
      • Remove
      • Modify/Impute
    • Test/Preview against metadata (pre-run): see what the incoming/outgoing results would be on all of the metadata before I run the workflow.
    • camelCase:  https://en.wikipedia.org/wiki/Camel_case
    • Identify/Replace unknown values (e.g. N/A, Not Applicable, #) with Null() or other?
    • Identify/Remove duplicate values within a cell
    • See also:  https://en.wikipedia.org/wiki/Data_cleansing
    • Option to point to a “personal” dictionary for spelling or validation
    • Provide “smart” annotation on tool

Similar to the thoughts in this idea, it would be great if the parenthesis matching functionality could be added to the formula tool as well.

Hi all,

 

The Salesforce Input tool is great... but it has some really bad limitations when it comes to reports.

I think there are two main limitations:

 

A - It can only consume 2,000 rows due to the REST API limitation. There are plenty of articles about this in the community.

B - Long strings, such as text comments, are cut off after a certain number of characters.

 

Thanks to this great article - https://community.alteryx.com/t5/Alteryx-Designer-Discussions/Salesforce-Input-Tool-amp-Going-Beyond... - I had the idea of going through a CSV file export and then importing the data into Alteryx.

I've done it using two consecutive Download tools. The first download is used to get the session id and the second to export a report to a CSV file in the temp folder. This temp file can then be read using a Dynamic Input tool.
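Roughly, the two Download tool calls do something like the following. This is only a sketch: the first step is the standard Salesforce OAuth username-password flow, while the report export URL pattern in the second step is an assumption and would need to match whatever the article above uses; all credentials and ids are placeholders.

```python
import requests

# Step 1: obtain a session token (OAuth username-password flow; placeholders throughout).
auth = requests.post(
    "https://login.salesforce.com/services/oauth2/token",
    data={
        "grant_type": "password",
        "client_id": "CLIENT_ID",
        "client_secret": "CLIENT_SECRET",
        "username": "user@example.com",
        "password": "password+security_token",
    },
).json()
token, instance = auth["access_token"], auth["instance_url"]

# Step 2: download the report as CSV. The exact export URL/parameters are an
# assumption here (hypothetical report id), not a documented guarantee.
report_id = "00OXXXXXXXXXXXX"
csv_bytes = requests.get(
    f"{instance}/{report_id}?export=1&enc=UTF-8&xf=csv",
    headers={"Authorization": f"Bearer {token}"},
).content

with open("report.csv", "wb") as f:
    f.write(csv_bytes)
```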

 

Long story short, I think Alteryx should upgrade the Salesforce connector to make it more robust and usable. Using the export-to-CSV approach should enable Alteryx to be fully compatible with Salesforce reports.

 

Regards,

What about allowing us to maintain the same active license / activation info on 2 devices simultaneously, but automatically deactivating the license on the other device when the program is used? Almost all software these days allows users to do full installation and activation on secondary devices, but restricts the use of the software to a number of active devices at any point in time. The current process of juggling a clunky transfer process (and temporary "demo" installs when that fails) just to be able to work on an office desktop during the day and a laptop at night / home is a brutal restriction on users.

 

This has been brought up many times and is always ignored. The current license approach is no longer a simple inconvenience now that we are living in the age of Covid. The ability to easily move between computers is a necessity in order for us to manage the constant unpredictable work arrangements of our modern world. Please address this issue.

 

 

The "Manage Data Connections" tool is fantastic to save credentials alongside the connection without having to worry when you save the workflow that you've embedded a password. 

 

Imagine if there were a similar utility to handle credentials/environment variables.

 

  • I could create an entry, give it a description, a username, and an encrypted password stored in my options, then refer to that for configurations/values throughout my workflows. 
    • Tableau credentials in the publish to tableau macro
    • Sharepoint Credentials in the sharepoint list connector
  • When my password changes I only have to change it in one place
  • If I hand off the workflow to another user, I don't have to worry about scanning the XML to make sure I'm not passing them my password
  • When a user opens my workflow and doesn't have a corresponding entry in their credentials manager, they would be prompted, using my description, to add it.
  • Entries could be exported and shared as well (with passwords scrubbed)

Example entry - Tableau:

Alias: Tableau Prod
Description: Tableau Production Server
UserID: JPhillips
Password: *********

Then when configuring a tool you could put in something like [Tableau Prod].[Password] and it would read in the value.

 

Or maybe for SharePoint:

 

Alias: TeamSP
Description: Team SharePoint location
UserID: JPhillips
Password: *********
URL: http://sharepoint.com/myteam

 

Or perhaps for a team file location:

Alias: TeamFiles
Description: Root directory for team files
Path: \\server.net\myteam\filesgohere

 

Any of these values could be referenced in tool configurations, formulas, or macro inputs by specifying the alias and field.
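As an existing analogue of what's being asked for (not an Alteryx feature), the Python keyring library stores a secret once under an alias and lets any script look it up later without embedding it; a minimal sketch with hypothetical alias and user names:

```python
import keyring

# One-time setup: store the password under an alias + user id (hypothetical values),
# the way the proposed credentials-manager entry would.
keyring.set_password("Tableau Prod", "JPhillips", "s3cret")

# Later, any workflow/script resolves the equivalent of [Tableau Prod].[Password]
# at run time instead of carrying the secret in its XML.
password = keyring.get_password("Tableau Prod", "JPhillips")
print(password is not None)
```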

[screenshot: 1-3-2019 12-43-52 PM.png]
