Allow for more than one job to be deleted at a time.
Current NIST/NSA standard is SHA-2.
As a data wrangler, I would like to be able to hash a column's data using the SHA-256 hashing algorithm.
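As an illustration of the requested transform, hashing a column's values with SHA-256 can be sketched with the Python standard library (the column data and function name here are hypothetical, not part of the product):

```python
import hashlib

def sha256_column(values):
    """Return the SHA-256 hex digest of each value in a column."""
    return [hashlib.sha256(str(v).encode("utf-8")).hexdigest() for v in values]

# Hypothetical column of email addresses to be pseudonymized.
emails = ["alice@example.com", "bob@example.com"]
hashed = sha256_column(emails)  # each digest is a 64-character hex string
```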
Hello,
I have noticed that the AACP is upgraded at a fast pace; however, I don't know when upgrades happen, and I sometimes only discover them when I visit the Release Notes page: https://help.alteryx.com/AAC/en/release-notes.html
An email notification listing the changes would be appreciated.
Best regards,
Simon
Hello,
Right now, there are a few filters available at the top of the window.
Among the ones I would like:
- Support status (e.g. supported only), which would hide the "(Early Preview)" entries.
- Features (Import and Publish versus Import only).
I would like the ability to specify a billing project for BigQuery as part of run options. Currently, data queried from BigQuery is associated with the project from which a Dataprep flow is run, with no way to change it. Customers we work with in multi-project environments need the flexibility to align queries to specific projects for cost and usage attribution.
Additionally, for customers on flat-rate BigQuery pricing, a selectable billing project will allow users to move queries to projects under different reservations for workload balancing and/or performance tuning.
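The BigQuery REST API already scopes a query job to the project named in the request path, and that project is the one billed, independently of where the data lives. A minimal sketch of how a run-option billing project would map onto the public `jobs.query` endpoint (the helper function is illustrative, not part of Dataprep):

```python
def bigquery_query_url(billing_project: str) -> str:
    """Build the jobs.query endpoint URL; the project in the path is the
    one billed for the query, regardless of which project owns the data."""
    return (
        "https://bigquery.googleapis.com/bigquery/v2/"
        f"projects/{billing_project}/queries"
    )
```

Exposing the billing project as a run option would amount to letting the user choose the project substituted into this path.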
We can migrate flows from one environment to another using the Trifacta APIs:
1. Export the flow from the source environment and import it into the target.
2. Rename the flow.
3. Share the flow with the appropriate users for the environment.
4. Change the inputs and outputs of the flow.
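The steps above can be sketched against a Trifacta-style v4 REST API. The endpoint paths below are assumptions modeled on the public Trifacta API documentation (flow package export/import); verify them against your deployment's API reference before use:

```python
# Sketch of the flow-migration steps; paths are assumptions modeled on the
# Trifacta v4 API and may differ per deployment/version.

def export_flow_url(base: str, flow_id: int) -> str:
    # Step 1a: GET this URL to export the flow as a package (source env).
    return f"{base}/v4/flows/{flow_id}/package"

def import_flow_url(base: str) -> str:
    # Step 1b: POST the exported package here to import it (target env).
    return f"{base}/v4/flows/package"

def rename_flow_url(base: str, flow_id: int) -> str:
    # Step 2: PATCH the flow resource with its new name.
    return f"{base}/v4/flows/{flow_id}"
```

Sharing the flow and rewiring its inputs/outputs (steps 3 and 4) would follow the same pattern against the corresponding resources.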
Create a connector to Mavenlink.
We at Grupo Boticário currently have 13k Dataprep licenses and are close to our official internal launch. We have noticed a recurring request for a translation of the tool. Since this would enable more users to adopt it in their day-to-day work, I would like to formalize and reinforce the importance of our request for a Brazilian Portuguese translation, as well as ask for a forecast of when this improvement might arrive.
Currently, if data needs to be brought in from tables residing in different databases of the same cluster, we have to create n connections for n databases.
Since the databases are in the same cluster, it should be possible to access all of them with a single connection; otherwise the connection list gets long and messy.
Currently there is support for parameterizing variables in custom SQL datasets in Dataprep. However, it requires that the tables using this feature have the same table structure. This request is to allow the same functionality with tables that have different structures.
Example:
Table A
dev.animals.dogs
name | height | weight
Table B
dev.animals.cats
name | isFriendly
We would like to have one custom SQL dataset where we just say
SELECT * FROM dev.animals.[typeOfAnimal]
where typeOfAnimal is the parameterized variable, with a default of dogs.
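The substitution this request describes can be sketched as simple template resolution with a per-variable default (the helper below is purely illustrative of the desired behavior, not Dataprep's implementation):

```python
def resolve_custom_sql(template: str, params: dict, defaults: dict) -> str:
    """Substitute [var] placeholders in a custom SQL template,
    falling back to a default when no value is supplied."""
    sql = template
    for name, default in defaults.items():
        sql = sql.replace(f"[{name}]", params.get(name, default))
    return sql

query = resolve_custom_sql(
    "SELECT * FROM dev.animals.[typeOfAnimal]",
    params={},                           # nothing supplied -> default used
    defaults={"typeOfAnimal": "dogs"},
)
# query == "SELECT * FROM dev.animals.dogs"
```

With `params={"typeOfAnimal": "cats"}` the same dataset would resolve to the cats table, even though its columns differ from the dogs table, which is exactly the relaxation being requested.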
Users should be able to create multiple connections to the data lake. Currently, a user needs to add a new data lake path and browse it in order to import data.
In our organization we would like the export path to the data lake to be enforced, so that users cannot export to any location other than those designated by the application admins.
Users onboarded to Trifacta cannot be deleted from the GUI, only via the API. In the GUI, users can only be disabled, but they still count toward the licensed user count. Please allow users to be deleted from the GUI.
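Until then, the API workaround can be sketched as a DELETE against the people resource. The path below is an assumption modeled on the Trifacta v4 `people` endpoint; check your deployment's API reference, and note the request is only built here, not sent:

```python
import urllib.request

def delete_user_request(base_url: str, person_id: int, token: str):
    """Build (but do not send) a DELETE request for a user account.
    The /v4/people/{id} path is modeled on the Trifacta v4 API -- an
    assumption; verify it against your version's API docs."""
    return urllib.request.Request(
        f"{base_url}/v4/people/{person_id}",
        method="DELETE",
        headers={"Authorization": f"Bearer {token}"},
    )

req = delete_user_request("https://example.trifacta.net", 42, "TOKEN")
```

Sending it would be `urllib.request.urlopen(req)` once the URL and token are confirmed for your environment.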
If a flow is shared between multiple editors and someone makes changes to it, there should be a way to see all the changes made to that flow by different users. For example, a trigger could notify users as soon as someone changes a recipe, or we could extract that change information about the flow or the job. I have attached a snippet of the data that would be useful to us.
At the moment, the only edit history visible to Trifacta users is within each recipe. Some actions are performed within flows rather than recipes, e.g. recipe creation, deletion, or taking a union. Such actions are not covered by the edit history, but for compliance and troubleshooting purposes it is important for users to know when these actions were taken and by whom. Would it be possible to add an edit history on flows as well as within recipes?
As of now, once a user deletes a flow, it is no longer visible to anyone, even though it is only soft-deleted in the database. Please add an option for admins to see all deleted flows and recover them if required, so that if someone deletes a flow by mistake, an admin can retrieve it. This should work via checkboxes so that, for a folder, all of its flows can be recovered at once; folder recovery should likewise restore every flow in the folder.
When outputting a Trifacta-formatted date, such as a derived year (year(mydate)) or a date formatted as mmddyyyy, the format or transformation should be maintained when it is published to a SQL table. Currently, a formatted or transformed date is written to the SQL table in the input format, and all transformation and formatting is lost.
When creating a destination to a SQL table on a SQL server, allow the field length to be set instead of using the Trifacta output default of varchar(1024). This would make maintaining the table easier.
Hi team,
We would need a page where a user can manage all the email notifications they receive from all their flows (success and failure).
Thank you
I started migrating some processes from Desktop to Cloud, but I miss the customizable email functionality in Cloud. Is it possible?
I would like to send a customized email after a successful execution, but I couldn't find an option to customize the email or attach a file.
If it is not possible, are there plans to implement this soon?