Alteryx Analytics Cloud Product Ideas

Share your Alteryx Analytics Cloud product ideas, including Designer Cloud, Intelligence Suite and more - we're listening!

We need a custom viewer role so that a user can only use connections shared with them, but cannot re-share those connections with others. In our case, the admin will set up the connections and users will just use them. Users should not be able to create or share connections. This would improve connection security and control access to data.

In our organization we would like the export path to the data lake to be enforced, so that users cannot export to any location other than the ones designated by the application admins.

Allow a connection to a geocoding system, like USPS or Google, that lets you join and run a demographics dataset through it to have longitude and latitude added to the output for mapping. I can see a lot of uses for this, especially in the marketing and advertising sector.

When outputting a Trifacta-formatted date, like a derived year (year(mydate)) or a date formatted as mmddyyyy, the format or transformation should be maintained when it is published to a SQL table. Currently, a formatted or transformed date is written to the SQL table in the input format, and all transformation and formatting is lost.

When creating a destination to a SQL table on a SQL server, allow setting the field length rather than using the Trifacta output default of varchar(1024). This would make maintaining the table easier.
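A minimal sketch of the kind of per-column control being asked for, using pandas + SQLAlchemy purely for illustration (the connection string, table, and column names are hypothetical):

```python
import pandas as pd
from sqlalchemy import create_engine, types

# Hypothetical SQL Server connection string.
engine = create_engine(
    "mssql+pyodbc://user:pass@myserver/mydb?driver=ODBC+Driver+17+for+SQL+Server"
)

df = pd.DataFrame({"region": ["EU", "US"], "order_year": [2020, 2021]})

# Instead of defaulting every text column to varchar(1024), let the
# publisher accept an explicit type/length per column:
df.to_sql(
    "orders_summary",
    engine,
    if_exists="replace",
    index=False,
    dtype={
        "region": types.VARCHAR(50),    # explicit field length
        "order_year": types.INTEGER(),  # derived year kept as a number
    },
)
```

Declaring the output type per column would also address the date point above, since a derived year or formatted date would keep its transformed type in the published table.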

I'm looking for a way to discover which datasets, recipes, or outputs are taking up the most time and resources.

It would also be nice to view this over time.


An example would be something like the Unity3D profiler:

https://docs.unity3d.com/uploads/Main/profiler-window-layout.png

This is for a video game engine, but I hope the system could be similar.

In this profiler you can see which resources (RAM, CPU, GPU) are being used, and by which character/object in your video game.

Similarly, it would be nice to see which database is being used by which flow in Trifacta.

Currently, scheduling is allowed only at the instance level.

The request is to be able to allow scheduling for a particular user, instead of for all instance users.

When migrating flows from one environment to another, currently one must click on each flow to export it, and then import it into the other environment. It would be great to do this in bulk by ticking multiple flows to export, by exporting an entire folder, or by exporting a plan.

It would be great if you could expand the metadata selection so it is not limited to two elements (row number and file path), but could potentially add a date timestamp (e.g. $datecreated) to be used in recipes.

I would like the ability to specify a billing project for BigQuery as part of run options. Currently, data queried from BigQuery is associated with the project from which a Dataprep flow is run, with no way to change it. Customers we work with in multi-project environments need the flexibility to align queries to specific projects for cost and usage attribution.

Additionally, for customers on flat-rate BigQuery pricing, a selectable billing project will allow users to move queries to projects under different reservations for workload balancing and/or performance tuning.
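For reference, the requested behavior matches how the BigQuery client libraries already work: the project a client is created with is the billing project for its queries, regardless of which project the queried tables live in. A minimal sketch in Python; the project IDs are made up:

```python
from google.cloud import bigquery

# The client's project is the billing project for queries it runs.
client = bigquery.Client(project="analytics-billing-project")  # hypothetical ID

sql = """
    SELECT order_id, amount
    FROM `data-lake-project.sales.orders`  -- data lives in another project
"""
job = client.query(sql)  # billed and attributed to analytics-billing-project
for row in job.result():
    print(row.order_id, row.amount)
```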

When using a SQL statement with a WITH query expression, I get the following error: "No select statement found." I was told that WITH statements are not supported at the moment.

Why this should be changed:

  1. WITH statements are very important for structuring long and complex SQL scripts and for reducing heavily nested (unreadable) SQL.
  2. We have a lot of scripts that we want to migrate, but we are stuck, as it would take too much time and effort to rewrite them. The same applies to moving the logic into Dataprep recipes. (A minimal example is sketched after this list.)
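For illustration, here is the kind of WITH query expression that currently fails as custom SQL, even though BigQuery standard SQL runs it directly. A minimal sketch in Python with the standard client; the table name is hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()

# A simple WITH query expression (CTE). Pasting this as custom SQL in
# Dataprep currently fails with "No select statement found", although
# BigQuery itself accepts it.
sql = """
    WITH recent_orders AS (
        SELECT customer_id, amount
        FROM `my-project.sales.orders`  -- hypothetical table
        WHERE order_date >= '2021-01-01'
    )
    SELECT customer_id, SUM(amount) AS total
    FROM recent_orders
    GROUP BY customer_id
"""
for row in client.query(sql).result():
    print(row.customer_id, row.total)
```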


Best regards

Marcel



Details about the syntax:

https://cloud.google.com/bigquery/docs/reference/standard-sql/query-syntax

Related customer questions

  • https://community.trifacta.com/s/question/0D53j00006OatIdCAJ/is-it-possible-to-use-cte-common-table-expressions-when-importing-a-data-set-using-custom-sql-i-am-specifically-trying-to-use-with-statements


The current syntax for the WORKDAY function is workday(date1, numDays, [array_holiday]), and array_holiday can't come from a column in a table. When there are unpredictable non-trading days, such as typhoon weather, we always need to go in and change the public holidays in the recipe. We would prefer the holidays to come from a column in a table, so we could just import and update the table when needed.
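As a sketch of the requested behavior: NumPy's business-day functions already accept a holiday list as an array, which could be loaded from a maintained table rather than hard-coded in the recipe. The file and column names below are hypothetical:

```python
import numpy as np
import pandas as pd

# Holidays maintained as a column in a table: just re-import/update the
# table when an unplanned non-trading day (e.g. a typhoon day) is announced.
holidays = (
    pd.read_csv("holidays.csv")["holiday_date"]  # hypothetical table/column
      .to_numpy(dtype="datetime64[D]")
)

dates = np.array(["2021-09-30", "2021-10-01"], dtype="datetime64[D]")

# Equivalent of workday(date1, numDays, [array_holiday]) with the holiday
# list coming from the imported column instead of a literal in the recipe.
print(np.busday_offset(dates, 5, roll="forward", holidays=holidays))
```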

When I use Dataprep all day, I would prefer to have a dark theme to protect my eyes! If we could switch between a dark and a light theme, that would be the best solution!

:-)

PS : Sorry for my bad English, I'm a French user.

When I'm in the Flow workspace, if I click on a recipe, the steps display in the floating box on the right.

Unfortunately, I can't select and copy the steps. :-/

I'm forced to load the recipe to copy the steps before pasting them into another recipe. I think everyone would save time by being able to copy steps directly from the Flow view!

:-)

PS : Sorry for my bad English, I'm a French user.

Why must I open all the recipes to reload each sample?

For example:

I make flows with many recipes (between 60 and 100 - a real case for me).

On Monday, I make a lot of modifications to the "data cleaning" at the start of the data wrangling chain!

On Tuesday, when I try to open the other recipes, I get a warning message: "your sample needs to be updated"!

=> If I had a button "update all samples of the flow", I would run it on Monday before going to sleep, and on Tuesday I could work with a smile!


PS : Sorry for my bad English, I'm a French user :-)

When you select columns to apply a function or transformation, the available selection methods are:

  • Multiple
  • All
  • Range
  • Advanced

But it is not possible to select columns with a "RegEx match" method on the column names.

It would be much easier!
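For comparison, a regex-based column selection of this kind is a one-liner in pandas; a minimal sketch with hypothetical column names:

```python
import pandas as pd

df = pd.DataFrame({
    "sales_2020": [1, 2],
    "sales_2021": [3, 4],
    "region": ["EU", "US"],
})

# Select every column whose name matches a pattern -- the "RegEx match"
# selection method requested above.
sales_cols = df.filter(regex=r"^sales_\d{4}$")
print(sales_cols.columns.tolist())  # ['sales_2020', 'sales_2021']
```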

The current NIST/NSA standard is SHA-2.

As a data wrangler, I would like to be able to hash a column's data using the SHA-256 hashing algorithm.
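A minimal sketch of what this would do, using Python's standard library; the column name is hypothetical:

```python
import hashlib
import pandas as pd

df = pd.DataFrame({"email": ["alice@example.com", "bob@example.com"]})

# SHA-256 (a SHA-2 family function) yields a fixed 64-character hex digest.
df["email_sha256"] = df["email"].map(
    lambda v: hashlib.sha256(v.encode("utf-8")).hexdigest()
)
print(df["email_sha256"].iloc[0])
```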

There is no explicit sharing functionality for user macros between users or groups of users. Currently, the closest thing is users gain access to an owner’s macros within a flow once the flow is shared with them. It would be good to be able to share macros in the same way that users can share flows.

Users have asked for the ability to create new versions of recipes so that they can collaborate safely. There is also a need to keep an audit history of changes. Trifacta has recipe-level history, but that does not fulfil the whole use case of version control.

Read data from Google Drive.