
Engine Works

Under the hood of Alteryx: tips, tricks and how-tos.
DavidSta, Alteryx

Alteryx’s Workflow Migration Enterprise Utility Package offers a comprehensive suite of tools designed to seamlessly transition your workflows from a sandbox to a production environment. This package covers the entire migration process, including auditing, approval, and collection management.

 

In this blog post, I will demonstrate how Alteryx’s capabilities can be leveraged to enhance this process with effective data source management. In other words, you’ll learn how to change your data connections during the release process.

 

Why is Data Source Management Important?

 

As you gain more experience with Alteryx, you may start thinking about how to operationalize your workflow processes. From a governance perspective, your company might restrict access to production datasets, limiting you to test data during development. This means you need to manage different data sources: one set while developing a workflow and another when running it live on your production server.

 

Manually changing connections before publishing is an option, but Alteryx excels in automation. This is why I want to introduce you to a user-friendly toolset that automates data source changes. Instead of manually updating potentially hundreds of workflows, you can define your mapping once, and all your workflows will be updated automatically.

 

Find & Replace Your Data Sources

 

The key is to have a mapping table that translates your test data sources to production data sources. This can include filenames, UNC paths, DSN names, or even DCM connection IDs (for DCM, you can also use the 2024.1 Server Feature “DCM Connection Handling”).
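As a minimal illustration, such a mapping table can be thought of as simple find/replace pairs. The entries below are hypothetical examples (the paths, DSN names, and DCM IDs are invented), and in the actual macro the table is defined outside the tool and injected at runtime:

```python
# Hypothetical mapping from test (sandbox) values to production values.
# All entries are illustrative examples, not real connections.
DATA_SOURCE_MAPPING = {
    r"\\sandbox-fs\data\customers.yxdb": r"\\prod-fs\data\customers.yxdb",  # UNC path
    "DSN=SalesTest": "DSN=SalesProd",                                       # ODBC DSN name
    "a1b2c3d4-test-dcm-id": "e5f6a7b8-prod-dcm-id",                         # DCM connection ID
}

def apply_mapping(xml_text: str, mapping: dict[str, str]) -> str:
    """Replace every test data source in a workflow's XML with its production counterpart."""
    for test_value, prod_value in mapping.items():
        xml_text = xml_text.replace(test_value, prod_value)
    return xml_text
```

Because the mapping is plain text, the same table can cover filenames, UNC paths, DSNs, and DCM IDs without any special casing.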

 
 

[Image: DataSourceMapping.png]

 

Additionally, you need to provide a list of YXZP file paths.
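If your packages all live under one folder, that list can be generated rather than typed by hand. A small sketch (the folder layout is an assumption):

```python
from pathlib import Path

def collect_yxzp_paths(folder: str) -> list[str]:
    """Return all YXZP package paths below a folder (recursive, case-insensitive)."""
    return sorted(
        str(p) for p in Path(folder).rglob("*")
        if p.is_file() and p.suffix.lower() == ".yxzp"
    )
```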

 

[Image: FilePath.png]

 

The macro takes each workflow package and applies the data source mapping. A great feature to note is that you can integrate this directly into the Workflow Migration Enterprise Utility Package.

 

[Image: EntUtilitiesOrg.png]

 

Currently, the Workflow Migration Enterprise Utility Package downloads the workflow from the sandbox and stores it temporarily. This is then mapped with the workflows to be migrated and finally uploaded to your production environment. You can easily incorporate the Data Source Migration Tool within this process.

 

After downloading the workflows and storing them in a temporary folder, the process updates all the workflow files with the new data sources before uploading them to your production environment.

 

[Image: EntUtilitiesUpdated.png]

 

The Engine Under the Hood

 

[Image: MacroOverview.png]

 

The migration process consists of three main steps:

 

1. Reading and extracting the provided YXZP archives.

2. Applying the data source mapping to all workflows, macros, and apps.

3. Repackaging the YXZP archives with the updated files.

 

Step 1: Read & Extract the YXZP

 

This step involves providing a path to a YXZP archive and performing a basic test to ensure the migration workflow won’t fail later. It checks that the file path is not empty and that the file is a YXZP. If these criteria are not met, the process stops immediately.
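The two checks described above can be sketched as a small validation function (the function name and return shape are my own; the macro implements the equivalent logic with standard Alteryx tools):

```python
from pathlib import Path

def validate_package_path(path: str) -> tuple[bool, str]:
    """Pre-flight checks before migration starts: non-empty path, YXZP extension, file exists."""
    if not path or not path.strip():
        return False, "File path is empty"
    p = Path(path)
    if p.suffix.lower() != ".yxzp":
        return False, f"Not a YXZP package: {path}"
    if not p.is_file():
        return False, f"File does not exist: {path}"
    return True, "OK"
```

Failing any check stops the process for that package immediately, before anything is extracted.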

 

Once the test is passed, a Python tool extracts the YXZP archive and stores all contained files in a temporary location on your hard disk. Each YXZP also carries an archive comment used by some older Designer versions; this is extracted as well so it can be re-applied during repackaging.
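Since a YXZP is essentially a standard ZIP archive, this step can be sketched with Python's standard library. This is a simplified stand-in for the macro's Python tool, with a hypothetical function name:

```python
import tempfile
import zipfile

def extract_yxzp(yxzp_path: str) -> tuple[str, bytes]:
    """Extract a YXZP (a ZIP archive) to a temp folder and capture its archive comment."""
    temp_dir = tempfile.mkdtemp(prefix="yxzp_")
    with zipfile.ZipFile(yxzp_path) as zf:
        comment = zf.comment          # kept so it can be re-attached in step 3
        zf.extractall(temp_dir)
    return temp_dir, comment
```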

 

Step 2: Apply Data Source Mapping & Update Workflow Files

 

After extracting the files, the next step is to search for all workflows, macros, and apps. Since their configuration is stored as XML, they can be read with the Dynamic Input Tool. The data source mapping is then applied with a Find Replace Tool by injecting the mapping table defined outside the macro.
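In code form, the step amounts to walking the temporary folder, rewriting each workflow file, and saving it back. A sketch under the assumption that workflows, macros, and apps use the usual Alteryx extensions (.yxmd, .yxmc, .yxwz):

```python
from pathlib import Path

# Alteryx file types: .yxmd (workflow), .yxmc (macro), .yxwz (analytic app)
WORKFLOW_EXTENSIONS = {".yxmd", ".yxmc", ".yxwz"}

def update_workflow_files(temp_dir: str, mapping: dict[str, str]) -> int:
    """Apply the data source mapping in place; return how many files were changed."""
    updated = 0
    for path in Path(temp_dir).rglob("*"):
        if path.suffix.lower() not in WORKFLOW_EXTENSIONS:
            continue
        xml_text = path.read_text(encoding="utf-8")
        new_text = xml_text
        for test_value, prod_value in mapping.items():
            new_text = new_text.replace(test_value, prod_value)
        if new_text != xml_text:
            path.write_text(new_text, encoding="utf-8")
            updated += 1
    return updated
```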

 

Keep in mind that the Find Replace Tool performs plain text matching. Replacing a filename may therefore also affect other values within the workflow, such as comments or other character strings that happen to contain the same text, so choose mapping values that are specific enough to avoid false matches. The Output Data Tool then overwrites the files in the temporary directory.

 

Step 3: Update the YXZP

 

In the final step, all updated files need to be repackaged into a YXZP. Again, a Python tool is used. It takes all updated workflows, macros, and apps, including all other dependencies from the original YXZP archive, and repackages them into a new YXZP archive. The comment extracted in the first step is also added.
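Again treating the YXZP as a ZIP archive, the repackaging and cleanup described above can be sketched like this (the function name is hypothetical):

```python
import os
import shutil
import zipfile

def repackage_yxzp(temp_dir: str, out_path: str, comment: bytes = b"") -> str:
    """Zip the updated files back into a YXZP and re-attach the original archive comment."""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(temp_dir):
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, temp_dir))  # keep paths relative
        zf.comment = comment          # comment captured during extraction
    shutil.rmtree(temp_dir)           # clean up the temporary working folder
    return out_path
```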

 

After creating the archive, the temporary directory is cleaned up to avoid wasting resources. The updated YXZP is now ready to be uploaded to your production environment.

 

Conclusion

 

By following this sample workflow, you can see how straightforward it is to migrate your data connections from test to production environments using Alteryx. While this is a preliminary draft and may not cover all edge cases, it serves as a solid foundation. I encourage you to test it thoroughly and provide feedback for further refinement.

 

You can find the sample macro here: Data Source Migration Macro