
Alteryx Server Ideas

Share your Server product ideas - we're listening!

Featured Ideas

We are using the DCM handler feature released in Alteryx 24.1 to help us manage our SDLC. We use the migration tool to move workflows from a dev to a prod environment, and the handler then manages the DCM connections between those two environments. In the dev Alteryx Server all connections point to dev sources, and in the production environment all connections point to prod sources. This works well for relational databases because we simply set up both environments identically at the database, schema, and table level.

 

We have run into a problem with S3 buckets: because bucket names are globally unique, we can't switch an S3 dev environment to an S3 prod environment the same way. The S3 tool is wired to look for a specific bucket name. If the DCM handler could also map dev bucket names to prod bucket names and switch them at run time, that would solve my problem.

 

In the meantime, we have to adjust the workflow migration tool to make those changes when the workflow moves from dev to prod.
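
To give a sense of that extra step, here is a minimal sketch of the substitution the migration process effectively has to perform; the bucket names and workflow file below are hypothetical placeholders, not our real configuration:

```python
# Minimal sketch (hypothetical bucket names and file path): before publishing a
# migrated workflow to the prod Gallery, rewrite any dev S3 bucket names that
# the S3 tools reference inside the workflow XML.
from pathlib import Path

BUCKET_MAP = {
    "my-data-lake-dev": "my-data-lake-prod",
    "my-exports-dev": "my-exports-prod",
}

def remap_buckets(workflow_path: str) -> None:
    """Replace dev bucket names with their prod equivalents in a .yxmd/.yxwz file."""
    path = Path(workflow_path)
    xml = path.read_text(encoding="utf-8")
    for dev_bucket, prod_bucket in BUCKET_MAP.items():
        xml = xml.replace(dev_bucket, prod_bucket)
    path.write_text(xml, encoding="utf-8")

remap_buckets("MigratedWorkflow.yxmd")
```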

 

Thank you,

 

Treyson Marks


When a workflow errors, no output files are shown on the results page, even when the file was saved successfully.

 

However, I sometimes output a check file (listing what needs to be updated).

 

Current workaround: save the file somewhere else (such as a network drive or SharePoint) so the user can find it.

 

For download-only output, every error has to be handled properly, because any error hides the download option for the file.
Errors can be caused by missing data or other reasons, and that is exactly why I need to output a file so the user can update the source somewhere.

 

Suggestion: always show the output options, even when the workflow has errors.
Then I can just use a Message tool to raise an error with the detailed reason and still let users download the file for checking.

 


Hi all,

Currently it is possible to delete a workflow in the Gallery even though it still has an active schedule in the background.
As a result, the schedules page becomes unavailable for everyone in the company, and nobody is able to check their schedule list.

Possible solution:

Implement a check that tests whether an active schedule exists and informs the user that an "in use" workflow cannot be deleted.
Also show a link to the schedule in the information window so the user has the option to delete the schedule first.

 

KR


Our organization heavily leverages Active Directory for all data connections, and we manage access to Alteryx Gallery Collections with Active Directory. If I want to use DCM and share a connection, I can only specify an individual or a custom group of individuals. This works against our efforts to manage security and access using Active Directory. It would be incredibly beneficial for Alteryx DCM to support connection sharing via Active Directory groups. An alternative would be to allow Active Directory groups to be added to custom user groups.

We do NOT want all users to be able to share their DCM connections. Currently the only way to prevent this is to disable the option in each User Profile, because it defaults to enabled for all new users.

Admins should be able to control this and all user profile setting defaults by role. 

For example, new Artisans should all have access to create Collections and Schedules, but with DCM sharing disabled.


I did not find a better category, but this issue relates to Data Connection Manager and credential setup with key files in a Server and Desktop environment.

 

Issue: when using key file credentials in DCM, a Key File Path needs to be specified. This path cannot be a UNC path or a mapped drive; it has to be a local drive, at least for Google BigQuery with the latest Simba ODBC driver. This prevents centralized credential management and defeats the purpose of DCM. I understand this may be a driver limitation, but it is still an issue that I think Alteryx could address.

 

Suggestion: since the key file is a simple text (JSON) file, it could be embedded into the connection itself (i.e. uploaded via the DCM interface). That way it can be centrally distributed to local computers when the DCM connections are synchronized from the Server (or vice versa). I would also store it encoded for information security reasons.
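
As a rough illustration of the idea (hypothetical paths, with base64 used only as an example encoding), the embedded key could be materialized as a local file on whichever machine runs the workflow:

```python
# Rough illustration only: encode a service-account JSON key for embedding in a
# DCM connection, then restore it to a local path at run time. Paths are hypothetical.
import base64
from pathlib import Path

def encode_key_file(key_path: str) -> str:
    """Return the key file contents as a base64 string suitable for embedding."""
    return base64.b64encode(Path(key_path).read_bytes()).decode("ascii")

def restore_key_file(encoded: str, local_path: str) -> None:
    """Decode the embedded key and write it out for the ODBC driver to use."""
    Path(local_path).write_bytes(base64.b64decode(encoded))

encoded = encode_key_file("service-account.json")
restore_key_file(encoded, "C:/Temp/service-account.json")
```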

 

Bonus: tying into this, it would be really good if Alteryx Desktop automatically synced Server DCM connections the first time it connects to Alteryx Server after launch (e.g. when opening a Server workflow), so they stay up to date automatically.

Currently Validation runs follow these two rules:

  1. The "Validation Run" option is enabled by default for all newly uploaded workflows.
  2. The validation process runs using the local user account, even when the workflow requires specific workflow credentials to execute - often resulting in access or permission errors that would not occur in a real run.
 

 

 

To improve flexibility and reduce false flags created by the validation run, I'd like to propose the following enhancements:

  1. User-Configurable Default for Validation Runs
    Allow a Designer user to disable the 'validate and run' option by default in the User Settings.
  2. Customisable Validation Run As Functionality
    Provide the option for a user to specify who the validation run should run as, rather than always defaulting to the local user.

 

These changes would make validation runs more reliable for users who rely on workflow credentials on their Server.

My Gallery has more than 100 notifications, and it is hard to click through them one by one.
I would like a "mark all as read" button for this.

 

If possible, split messages into two types, admin and system, where:

1. admin (for communication)

2. system (granting user access, sharing workflows, etc.)



 

 

Hello,

Right now, we can manage DCM connections through the API.

 


However, a connection is an object composed of:
- a data source (like my SharePoint tenant or a specific PostgreSQL database)
- credentials (my user/password or a token)

When creating a DCM connection, credentials and a data source are required, but you can't create (or even get) them independently through the API.

So basically what I would like:
- management of data sources
- management of connections (and when creating one, either creating a new data source or referring to an existing one)
- management of credentials


I feel like we're halfway across the bridge here, with an almost awesome solution that isn't quite finished.
NB: I understand we can't retrieve credential secrets, but being able to create them would be great.
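
To make the request concrete, here is a purely hypothetical sketch of what independent data source / credential / connection endpoints could look like; none of these paths or payloads exist today, and the URLs, field names, and token handling are all assumptions:

```python
# Hypothetical sketch only: these DCM endpoints do not exist in the current
# V3 API; paths, payloads, and field names are assumptions for discussion.
import requests

BASE = "https://myserver/webapi/v3"            # placeholder Server URL
HEADERS = {"Authorization": "Bearer <token>"}  # token acquisition omitted

# 1. Create a data source on its own
datasource = requests.post(f"{BASE}/dcm/datasources", headers=HEADERS, json={
    "type": "postgresql",
    "host": "db.example.com",
    "port": 5432,
    "database": "analytics",
}).json()

# 2. Create a credential on its own (the secret would be write-only)
credential = requests.post(f"{BASE}/dcm/credentials", headers=HEADERS, json={
    "method": "username_password",
    "username": "svc_alteryx",
    "password": "********",
}).json()

# 3. Create a connection by referencing the two existing objects
connection = requests.post(f"{BASE}/dcm/connections", headers=HEADERS, json={
    "name": "Analytics DB (prod)",
    "dataSourceId": datasource["id"],
    "credentialId": credential["id"],
}).json()
```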

Best regards,

Simon


During a recent project, I explored the Alteryx Server Download a Workflow Package API Endpoint, which includes an optional versionId parameter. The documentation suggests that this parameter should allow users to retrieve specific, prior versions of a workflow package by supplying the appropriate version ID. However, in practice, I found that regardless of which valid prior version ID was provided, the API always returned the current published version of the workflow.
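
For illustration, this is roughly how we exercised the endpoint (a sketch with placeholder IDs and server URL, assuming the documented route is GET /v3/workflows/{workflowId}/package with versionId as a query parameter):

```python
# Sketch of the call we tested; the server URL and IDs are placeholders, and the
# route shown is our reading of the documented Download a Workflow Package endpoint.
import requests

BASE = "https://myserver/webapi/v3"
HEADERS = {"Authorization": "Bearer <token>"}

workflow_id = "661f0000000000000000abcd"   # placeholder workflow ID
version_id = "2"                            # a valid prior version

resp = requests.get(
    f"{BASE}/workflows/{workflow_id}/package",
    headers=HEADERS,
    params={"versionId": version_id},
)
resp.raise_for_status()

# Expected: the package for version 2; observed: always the current published version.
with open("workflow_v2.yxzp", "wb") as f:
    f.write(resp.content)
```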

This behavior presented a challenge for our automation use case, where we aimed to implement version control and retrieval solely through Alteryx Server, rather than relying on external tools like Git. After thorough testing and validation, I shared these findings with Zach Hamilton and Marty Moravec at Inspire 2025, who were able to confirm the behavior.

I appreciate the continued enhancements and support the Alteryx team provides for the Server platform and its APIs. Resolving this issue would greatly benefit users who depend on robust version management and automation capabilities within Alteryx Server. Thank you for considering this feedback, and I look forward to future improvements in this area.


As the Alteryx Server administrator, I would like to see this functionality restored.

See attachment

Hi Alteryx experts!

 

There were some ideas similar to this one, but none quite like it, and they are really old, so I'm revamping the idea after some recent struggles and the many questions we get on the Server discussion board!

 

Workflow events are nice and helpful, but they require the user to add them to every single workflow.

 

The Gallery admin also struggles to know when a schedule fails. 

 

There was the Server Usage Report before, but now most server admins have no idea when a schedule fails.

 

There are many ways of managing schedules and failed jobs (MongoDB, logs, events), but it would be nice to add a simple option to notify a user on each schedule. It would be even better if there were another option to enable this for all schedules globally!

 

This idea could be combined with this one from @fjablo 

https://community.alteryx.com/t5/Alteryx-Server-Ideas/Notification-from-the-Alteryx-Server-when-a-sc...

 


 

Let me know what you all think!

Best,

Fernando Vizcaino


The current Enterprise Utility Workflow does not have a match/lookup in place between the two environments for user IDs. As a result, the migration workflow doesn't work without adding a couple of extra API calls, and when we upgrade to a new version of the utility, I have to add this back in.

 

It would be nice if the workflow would come with this already configured.

Add the entered parameters to the results panel. It would help with debugging and auditing.
Sometimes users save the file in a different way, and the file name is not directly related to the interactive question
(e.g. Question: "SAP FBL5N Data", Input: "EXPORT (2).XLSX").
When an error happens, I need both the source file and the parameters.
If this idea were implemented, I would only need a screenshot of the results, which is faster, rather than asking the user to re-run the workflow to capture and share the parameter setup.


Hello,

 

Could you make it possible for curators to administer DCM connections created by other users?

 

Currently if one of the curators creates a connection, then only that person can modify it and give other people access to it.

 

Therefore if the person who created the connection isn't available then nobody can be given access to the connection.

 

Any curator can manage legacy Gallery connections, so to move away from legacy connections and use DCM connections instead, we need the same administration capabilities for DCM connections.

 

Thanks.

The V3 jobs API endpoint is woefully lacking in usefulness. The current endpoint only offers a GET jobs/{jobid} method, which is of limited value because a database admin must query the database to get a list of all job IDs. To add insult to injury, this method is limited to the user whose job is running or queued.

 

These are the new features I am proposing:

1. GET jobs/list: This method must be callable by all users. A status parameter of none (the default, returning the full list), running, or queued will return jobs with the appropriate status. The job ID of each running or queued job and the worker it is running on must be included in the result set.

2. GET jobs/{ownerid}: This method must be callable by all users. As with GET jobs/list above, the result set must include the job ID of each running or queued job and the worker it is running on.

3. DELETE jobs/{jobid}: This method must be callable by the person who scheduled the job, the owner of the workflow, or a curator. It is the equivalent of a curator cancelling a job on the Server Admin page (#!/admin/jobs). All three of these people have a vested interest in the running or queued jobs on the server and must be able to cancel them.

4. POST jobs/reassign/{jobid}/{new_job_tag}: This method is restricted to curators and applies to any job in a queued state. It allows a curator to reassign a job to another job tag, or to the first available worker, for reasons determined by the curator.

 

This is an enhancement I am proposing:
1. GET jobs/{jobid}: This method must be callable by all users, allowing any user to get the details of any running or queued job.

 

Logging requirements

All DELETE or POST calls must be logged, and the log entries purged based on the Persistence option "Delete queue and results after (days)".
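
As a concrete illustration, here is a sketch of how the proposed methods might be called; these endpoints do not exist today, and every path, parameter, and response field shown is an assumption made for discussion:

```python
# Purely illustrative sketch of the proposed jobs methods; nothing here exists
# in the current V3 API, and all paths and field names are assumptions.
import requests

BASE = "https://myserver/webapi/v3"            # hypothetical Server URL
HEADERS = {"Authorization": "Bearer <token>"}  # token acquisition omitted

# Proposed GET jobs/list: callable by all users, optional status filter
running = requests.get(f"{BASE}/jobs/list", headers=HEADERS,
                       params={"status": "running"}).json()

# Proposed GET jobs/{ownerid}: jobs belonging to a given owner
owner_jobs = requests.get(f"{BASE}/jobs/5e7f00000000000000001234",
                          headers=HEADERS).json()

# Proposed DELETE jobs/{jobid}: cancel a running or queued job
job_id = running[0]["jobId"]                   # field name assumed
requests.delete(f"{BASE}/jobs/{job_id}", headers=HEADERS)

# Proposed POST jobs/reassign/{jobid}/{new_job_tag}: curator-only reassignment
requests.post(f"{BASE}/jobs/reassign/{job_id}/high-memory-worker",
              headers=HEADERS)
```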

 

 


When an app has multiple tabs across the top but its content extends below the visible page, the user scrolls down to complete the fields and clicks the 'Next' button at the bottom.

This takes the user to the next tab but leaves the view at the bottom of the page.

It would be very useful for this 'Next' button to jump back to the top of the page.

Commas save lives! With large numbers, it's difficult to see what number was actually entered without commas present, and the wrong number can easily be entered as a result:

(Screenshot: a large number entered without thousands separators.)

 

 

Is this number:

  • 100,000
  • 1,000,000
  • 10,000,000

It's hard to say without taking a really close look!

As the single user who uploads workflows, My Workspace ends up being very crowded. I move workflows to Collections so I can share them, but in My Workspace I cannot see which Collections they are associated with. It would be nice if there were a column showing the associated Collections, as well as some different types of category classification. I would want something like a development stage (in progress, UAT, deployed, retired), and, if a workflow is retired, a way to remove it from my general workspace area.

 

 

Hello all,

Right now, we can choose either MongoDB or Microsoft SQL Server as the backend. I would suggest adding PostgreSQL. Why?

- it's open source
- it's reliable
- it's free
- it works well in many environments
- it's popular
- it's already used as the backend for Tableau, Qlik, and many others
- it adheres to the SQL standard and doesn't have many proprietary quirks

Best regards,

Simon
