
Alteryx Designer Ideas

Share your Designer product ideas - we're listening!

Featured Ideas

When inputting a CSV file via the Azure Data Lake File Storage tool, the default behaviour is for the first row to be interpreted as data.

 

When reading the same file locally using the File Input tool, the default behaviour is for the first row to be interpreted as headers.

 

Since the majority of files will include headers on the first row, it would be helpful to have the "First row contains field names" option selected by default in the Azure Data Lake File Storage tool. This would also bring the tool's defaults in line with the standard File Input tool.
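For comparison, here are the two defaults expressed in pandas terms (illustrative only; the file name is a placeholder):

```python
import pandas as pd

# File Input tool default: the first row becomes the field names.
with_headers = pd.read_csv("sales.csv")        # header=0 is the default

# Azure Data Lake File Storage tool default: the first row is treated as
# data, so the columns get generated names (0, 1, 2, ...).
as_data = pd.read_csv("sales.csv", header=None)
```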

 

Illustration showing the issue: [screenshot attached]

 

Hi all,

 

The Salesforce Input tool is great, but it has some serious limitations when it comes to reports.

I think there are two main limitations:

 

A - It can only consume 2,000 rows due to the REST API limitation. There are plenty of articles about this in the community.

B - Long strings, such as text comments, are cut off after a certain number of characters.

 

Thanks to this great article: https://community.alteryx.com/t5/Alteryx-Designer-Discussions/Salesforce-Input-Tool-amp-Going-Beyond... I had the idea of going through a CSV file export and then importing the data into Alteryx.

I've done it using two consecutive Download tools. The first Download tool gets the session ID, and the second exports a report to a CSV in the temp folder. This temp file can then be read using a Dynamic Input tool.
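For anyone curious, a minimal Python sketch of the same two-step pattern (the OAuth credentials, report ID, and file name are placeholders, and the report-export URL is the undocumented classic trick described in the article above):

```python
import requests

# Step 1: obtain a session id, here via the OAuth username/password flow.
# All credentials below are placeholders.
auth = requests.post(
    "https://login.salesforce.com/services/oauth2/token",
    data={
        "grant_type": "password",
        "client_id": "<consumer_key>",
        "client_secret": "<consumer_secret>",
        "username": "<user>",
        "password": "<password_plus_security_token>",
    },
).json()

# Step 2: export the report as CSV via the classic (undocumented) export URL.
# "00Oxx0000000000" is a placeholder report id.
report = requests.get(
    f"{auth['instance_url']}/00Oxx0000000000?export=1&enc=UTF-8&xf=csv",
    cookies={"sid": auth["access_token"]},
)

with open("report.csv", "wb") as f:  # temp file, read back with a Dynamic Input
    f.write(report.content)
```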

 

Long story short, I think Alteryx should upgrade the Salesforce connector to make it more robust and usable. Using the export-to-CSV feature would enable Alteryx to be fully compatible with Salesforce reports.

 

Regards,

Hi all,

 

The Publish to Tableau Server tool is great, but it requires a username and password. If you are using AD, there is a chance that your users don't have a password. In that case, you probably have a technical user that you share across the team. This is not an ideal situation, and you lose the governance around the data.

 

Fortunately, there is an easy workaround. You can leverage personal access token authentication: https://help.tableau.com/v2019.4/server/en-us/security_personal_access_tokens.htm

 

The advantage of this method is that it logs in with your user, and your data source is uploaded under your name. This still uses the Tableau REST API, so the changes needed in the current macro are MINOR.

 

Changes to make in the current macro:

 

1 - Add an authentication method parameter with two choices: Username/Password and Personal Token.

2 - If Personal Token is selected, add two parameters: Token_Name and Token_Value.

3 - In the TableauServer.Login supporting macro, improve the formula (13) to change the payload based on the user's selection. If Username/Password, keep it as is. Otherwise use the syntax described here: https://help.tableau.com/current/api/rest_api/en-us/REST/rest_api_concepts_auth.htm#make-a-sign-in-r... (a sketch of the resulting sign-in request is shown below).
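For illustration, here is a minimal sketch of the personal-access-token sign-in against the Tableau REST API, assuming API version 3.6 (Tableau 2019.4, when PATs were introduced) and placeholder server, site, and token values:

```python
import requests

SERVER = "https://tableau.example.com"  # placeholder server URL

payload = {
    "credentials": {
        "personalAccessTokenName": "<token-name>",      # placeholder
        "personalAccessTokenSecret": "<token-secret>",  # placeholder
        "site": {"contentUrl": ""},                     # "" = the Default site
    }
}

resp = requests.post(
    f"{SERVER}/api/3.6/auth/signin",
    json=payload,
    headers={"Accept": "application/json"},
)
resp.raise_for_status()

# The returned token goes into the X-Tableau-Auth header of later calls,
# exactly as with the username/password sign-in.
auth_token = resp.json()["credentials"]["token"]
```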

 

 

This is quite a straightforward change but could help a lot of companies using Alteryx.

Can you please implement these changes to strengthen this tool?

 

Thanks a lot,

The Azure Machine Learning Training and Scoring Tools seem like a great way to improve the Azure ML process.

Introducing: The Azure Machine Learning Training and Scoring Tools 

We tried to use these tools but can't log in to Azure ML correctly. We have several tenant IDs, and the login goes to another tenant used for Office 365, not the Azure ML one.

====================== <Error Message> ==========================================================
Error: Azure ML Training (367): UserErrorException:
    Message: You are currently logged-in to 55f0a...-.............................................. tenant. You don't have access to d846a...-............................................. subscription, please check if it is in this tenant. All the subscriptions that you have access to in this tenant are =
 [SubscriptionInfo(subscription_name='Microsoft Azure Enterprise', subscription_id='754c5...-...........................')].
 Please refer to aka.ms/aml-notebook-auth for different authentication mechanisms in azureml-sdk.
    InnerException None
    ErrorResponse
=======================================================================================================

Microsoft states that the tenant needs to be specified if we have access to multiple tenants.

Set up authentication for Azure Machine Learning resources and workflows 
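For reference, this is roughly how the azureml-sdk lets a caller pin the tenant today (subscription, workspace, and tenant values are placeholders):

```python
from azureml.core import Workspace
from azureml.core.authentication import InteractiveLoginAuthentication

# Pin the tenant explicitly so the login does not default to the Office 365 tenant.
auth = InteractiveLoginAuthentication(tenant_id="<azure-ml-tenant-id>")  # placeholder

ws = Workspace(
    subscription_id="<subscription-id>",   # placeholder
    resource_group="<resource-group>",     # placeholder
    workspace_name="<workspace-name>",     # placeholder
    auth=auth,
)
```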

Could you add Tenant ID to the Azure credentials so that we can use this tool?


We store valuable data in our MS Teams sites (which are SharePoint folders behind the scenes). Currently, there is no way to connect to SharePoint directly (only if I sync SharePoint to my local drive, which is problematic and doesn't work on Alteryx Server).

 

My recommendation is to have a SharePoint connector that works on both Desktop and Server.

 

Thanks!

https://community.alteryx.com/t5/Alteryx-Designer-Discussions/Limited-access-to-the-table-contents-i...

 

Please follow the link above.

 

We are seeing huge demand from users to support this feature.

 

As of version 2019.4, Alteryx can't connect to a table if complete read access is NOT granted on it.

Other data connector tools handle this; Alteryx does not.

 

Please consider this as a feature request.

Currently Alteryx does not support writing to SharePoint document libraries.

However, workarounds sometimes succeed and sometimes fail.

Please see the attachment, where we ran into an issue.

See this link for additional information.

https://community.alteryx.com/t5/Data-Sources/Connect-to-and-Publish-Back-to-SharePoint-Online/m-p/4...

 

We need official support for reading and writing to SharePoint document libraries.

It's an important output target, and it will become more so as Alteryx enhances its reporting capabilities.

 

Would it be possible to update the Salesforce Input tool to support API version 49 or later?

Changes were made to the way recurring events are handled in the Salesforce Lightning update, and the current Salesforce input connector does not include all events when extracting.

It would be great to have the functionality below in Alteryx.

A workflow is built in Alteryx, and a button click in Alteryx can be used to generate SQL code that can be run on a specific database platform, such as SQL Server, in external editors such as SQL Server Management Studio. Thanks.

Alteryx Server was recently updated to allow TLS-mediated connections to the MongoDB persistence layer. This allowed us to switch from the embedded MongoDB to a highly-available MongoDB Atlas cluster. To our surprise, when we went to edit our workflows that make use of the persistence layer's data (Server Usage Report, etc.) to hit the new Atlas cluster, we found that the MongoDB Input tool does not support TLS connections. This absolutely needs to change. Based on organizational constraints, Atlas is our only option for an HA persistence layer, so we absolutely have to have TLS support in the MongoDB Input tool. There is no other way for us to natively query our Server persistence layer in Designer. Please bring the MongoDB Input tool into alignment with the MongoDB connections that are supported by Alteryx Server.
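For context, this is all the Input tool would need to support, expressed as a pymongo sketch (the hostname, credentials, and database name are placeholders):

```python
from pymongo import MongoClient

# A TLS connection to an Atlas cluster; mongodb+srv URIs imply TLS, and
# tls=True makes the requirement explicit.
client = MongoClient(
    "mongodb+srv://user:password@cluster0.example.mongodb.net/AlteryxService",
    tls=True,
)

db = client["AlteryxService"]       # Server persistence database (placeholder)
print(db.list_collection_names())   # e.g. the usage/result collections
```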

 

It would be wonderful for Alteryx to be able to connect to and query OData feeds natively, rather than using a 3rd-party driver or custom macro.   

 

OData querying is supported by quite a few familiar products, including Excel, Power BI, SSIS/SSRS, FME (Safe Software), Tableau, and many others. The protocol is also used to publish feeds from Microsoft Dynamics and SharePoint, as well as many of the 10,000 publicly available government datasets with APIs (especially those hosted by Socrata).

 

I didn't see this in the Idea section, but questions and workarounds have been discussed in the community a few times (11/15, 3/18, 4/18), and the suggestions seem to be to buy the $400-600 ODBC driver from CDATA (or ZappySys), use a VBA script in Excel to trigger a refresh, or create my own Alteryx connector macro (great series, by the way, though most of it was beyond my understanding!).

   

While I'm not opposed to paying, kludging, or learning to program, each of those is just one more thing to build/buy, install, maintain, and break at the most inconvenient time 🙂

 

Thanks,
Chadd

 

OData Overview:

OData (Open Data Protocol) is an ISO/IEC-approved OASIS standard that defines a set of best practices for building and consuming RESTful APIs. OData helps you focus on your business logic while building RESTful APIs without having to worry about the various approaches to define request and response headers, status codes, HTTP methods, URL conventions, media types, payload formats, query options, etc. OData also provides guidance for tracking changes, defining functions/actions for reusable procedures, and sending asynchronous/batch requests. OData RESTful APIs are easy to consume. The OData metadata, a machine-readable description of the data model of the APIs, enables the creation of powerful generic client proxies and tools.

More info at http://odata.org
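For a feel of what a native connector would wrap, here is a query against the public OData reference service (Northwind) using the standard $filter/$select/$top query options:

```python
import requests

# The public OData V4 reference service hosted at odata.org.
base = "https://services.odata.org/V4/Northwind/Northwind.svc"

resp = requests.get(
    f"{base}/Customers",
    params={
        "$filter": "Country eq 'Germany'",
        "$select": "CustomerID,CompanyName,City",
        "$top": "5",
    },
    headers={"Accept": "application/json"},
)
resp.raise_for_status()

# OData V4 JSON responses wrap the rows in a "value" array.
for row in resp.json()["value"]:
    print(row["CustomerID"], row["CompanyName"], row["City"])
```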

In the Python 3.6/3.8 versions of the SF Input tool, the business name of an object is returned (e.g., Quote). In the now-deprecated macro-based version of SF Input, the technical name was returned (e.g., Quote, Quote__c, SBQQ_Quote__c).

 

With the Python Input tools, there are multiple occurrences of "Quote" to select from in the SF Input tool. This is confusing and leads to guessing which object is the right one.

 

See attached screenshots.

 

My proposal is to add an option to the SF Input tool to allow the workflow developer to choose whether technical or business names should be returned.
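For illustration, both names are already available from the Salesforce describe call, so the tool could surface either; a simple-salesforce sketch with placeholder credentials:

```python
from simple_salesforce import Salesforce

# Placeholder credentials.
sf = Salesforce(
    username="<user>",
    password="<password>",
    security_token="<security-token>",
)

# describe() returns both the business label and the technical API name
# for every object, so an option to show either is feasible.
for obj in sf.describe()["sobjects"]:
    print(obj["label"], "->", obj["name"])   # e.g. "Quote" -> "SBQQ__Quote__c"
```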

While Alteryx allows for a proxy username and password in the settings, these are not passed properly to an NTLM proxy. Support for NTLM authentication would be incredibly useful for the many corporations that utilize this firewall setup.

 

We currently have to download either via Python or via cURL through batch commands called by Alteryx. Since Alteryx uses a cURL back-end, this should be a fairly simple addition to the existing Download tool: allow selection of a proxy server, port, and authentication method in addition to the proxy username and password. This could be done either in the tool itself or in User Settings.
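Since Alteryx sits on a cURL back-end, the requested behaviour maps directly onto libcurl's proxy options; a pycurl sketch with a hypothetical proxy host and placeholder credentials:

```python
from io import BytesIO

import pycurl

buf = BytesIO()
c = pycurl.Curl()
c.setopt(pycurl.URL, "https://example.com/data")        # placeholder URL
c.setopt(pycurl.PROXY, "http://proxy.corp.local:8080")  # hypothetical proxy
c.setopt(pycurl.PROXYUSERPWD, "DOMAIN\\user:password")  # placeholder credentials
c.setopt(pycurl.PROXYAUTH, pycurl.HTTPAUTH_NTLM)        # NTLM proxy authentication
c.setopt(pycurl.WRITEDATA, buf)
c.perform()
c.close()

print(buf.getvalue()[:200])
```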

I would like to see the Publish to Tableau Server tool updated to include the option of authenticating with a Personal Access Token in addition to Username/Password. The user would be able to toggle the login method and provide the necessary credentials for that method.


TIBCO Data Virtualization is a data virtualization product focused on creating a virtual data store that consolidates data from throughout the enterprise. It can be accessed via a SQL query engine and has a variety of supported connectors, including an ODBC driver.

 

This data source can be connected to via ODBC in Alteryx today, but error messaging is unclear/unhelpful, and attempting to use the Visual Query Builder causes Alteryx to crash.

 

Adding TIBCO Data Virtualization as a supported ODBC connection would empower business users to leverage this product and easily utilize this enterprise data store, enhancing the value of the Alteryx platform as a consumer of this data.

Please update the Publish to Tableau Server connector tool to support Tableau's Ask Data feature. The data source must be recognized as an extract on Tableau Server in order for the Ask Data feature to work. Currently, all data sources published using version 2.0 of the connector tool are recognized as live data sources. The workaround is cumbersome and requires multiple copies of data sources to be created and managed.


I have recently added an Azure Data Lake Storage Gen2 (v2) account. The Azure input/output connectors do not work with this version of the Azure Data Lake.

 

It appears that Alteryx adds ".azuredatalakestore.net" to the file path. This works for v1, but it is not valid for v2.
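For reference, the two generations use different endpoints and URI schemes, so the Gen1 suffix cannot simply be reused (account and container names below are placeholders):

```python
# ADLS Gen1: the suffix the tool appends today is correct here.
gen1_path = "adl://<account>.azuredatalakestore.net/folder/file.csv"

# ADLS Gen2: a different endpoint and URI scheme, so appending
# ".azuredatalakestore.net" produces an invalid path.
gen2_path = "abfss://<container>@<account>.dfs.core.windows.net/folder/file.csv"
```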

 

Are there any plans to configure a connector for Azure Data Lake v2?


At our organization we are required to change our passwords every few months, which forces a change to my Tableau Server password. How does this relate to Alteryx? Well, every 90 days I have to change my password in the Publish to Tableau Server tool for all of my workflows. This is quite a cumbersome process that could be eliminated with AD.

 

If you dislike manually changing your password for each workflow that uses this tool, then "star" this post!

 

 

 


Hey all,

 

The Join tool currently does not allow case-insensitive joins, but the Find Replace tool does. Additionally, even if both sides look identical, the Join tool will not join "Sean's house" to "Sean's house" because of the non-letter character in the middle. Finally, if one side is a string(2) and the other is a vString(200), then even with a single identical character on both sides you get uncertain outcomes unless you force the type.

 

Could you please consider amending the Join tool to include three new options or capabilities:

- Case insensitive join

- Allow full Unicode character set in join

- Full match across text types (irrespective of string size) - this would allow a string(2) value to match a string(100) value as long as the string(100) value contains only the same 2 characters as the string(2) value

 

That would remove a load of work from every text join that's being done on every canvas we build.
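For comparison, the usual workaround today is the normalize-then-join pattern one builds with a Formula tool before the Join; sketched here in pandas with toy data:

```python
import pandas as pd

left = pd.DataFrame({"key": ["Sean's house", "ABC"]})
right = pd.DataFrame({"key": ["SEAN'S HOUSE", "abc"], "value": [1, 2]})

# Normalize case (and type) on both sides before joining.
left["_k"] = left["key"].astype(str).str.casefold()
right["_k"] = right["key"].astype(str).str.casefold()

joined = left.merge(right, on="_k", suffixes=("_l", "_r")).drop(columns="_k")
print(joined)
```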

 

Thank you 

Sean

 

 

Hi,

A lot of companies now are deploying on both AWS and Microsoft Azure.

Alteryx supports AWS S3 object storage out of the box; it would be important to support Microsoft Azure Blob storage as part of the native Alteryx product as well.

Cheers,

Adrian
