
Alteryx Designer Desktop Ideas


Allow Last Run metadata to persist

This is similar to a prior idea, now marked complete: "Allow macro metadata to persist until next run". I tried the check-box solution and still have the same issue, running V11.

 

What we NEED is for tools that derive columns, such as CrossTab, to retain metadata from the most recent run and pass that metadata downstream for further tools and development.

 

Use case:

I have several CrossTabs. Before V11, I could run the flow once to push metadata downstream, then add or modify tools downstream, and the derived fields from the CrossTabs stayed available in those tools to be recognized and referenced as I added more tools and logic. Now, in V11, I am finding that if I click on a tool or add a tool downstream, the metadata for the derived columns disappears.

 

I attached pics to illustrate: I have 6 CrossTabs and decided I needed to add a Summarize downstream. I had to run the flow to get the metadata populated, which is normal, and I added the first Summarize. Then I inserted another Summarize, and immediately the derived-column metadata was lost on all paths after the CrossTabs. So I ended up having to re-run the flow 5 more times, once for each Summarize tool added, and then 5 more times to adjust column names in Selects after the downstream Joins.

 

I end up wasting a lot of time re-running a test file large enough to feed all the variety of data necessary to generate every column, between most edits or new tool adds. What used to take ~5 minutes now takes ~35.
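
For anyone wondering why the re-runs are unavoidable: a CrossTab's output columns come from the values in the data itself, so the schema cannot be known without data in the pipe. A rough pandas analogy, purely illustrative and not Alteryx internals:

    # Why CrossTab-style output columns are only knowable at run time:
    # the output schema is derived from the *values* in the pivot column.
    import pandas as pd

    sample = pd.DataFrame({
        "id":     [1, 1, 2],
        "metric": ["height", "weight", "height"],  # values become columns
        "value":  [70, 150, 64],
    })

    wide = sample.pivot_table(index="id", columns="metric", values="value")
    print(list(wide.columns))  # ['height', 'weight']

    # If the test file happens to contain no 'weight' rows, that column
    # simply never exists downstream -- the schema depends on the data.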

 

I recall seeing and discussing this issue previously and hoped the check box would resolve it, but it does not fix the issue.

 

We see a similar issue for tools downstream of other tools where the columns are derived or uncertain until that tool runs, such as Transpose, Join, and Union. I recall some discussion at user groups and in the community, but the only reference of seeming relevance I found this morning is the one I mentioned above.

 

12 Comments
fharper
12 - Quasar

I forgot this earlier, as I was focused on the one aspect, but some of the prior discussion was also on the fact that some tools, like Decision Tree and a few others, will lose their settings/selections when you open a flow and click on the tool before running it. When a flow uses these tools, you have to run it enough to push metadata to the tool before you can open the tool to edit its configuration without a total loss of configuration. So this is kind of the same issue: metadata does not persist in the workflow.

 

I feel we need to have the configurations of all tools, based on the last run, persist: not just until the next run during a single session, but saved when the flow is closed, so that when it is re-opened we can pick up where we left off without re-runs.
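
Conceptually the ask is small: snapshot each tool's last-run output schema alongside the workflow file and reload it on open. A hypothetical sketch of that behavior (names like tool_id and the sidecar file are illustrative, not Alteryx code):

    import json
    from pathlib import Path

    def save_schemas(workflow_path: str, schemas: dict[str, list[str]]) -> None:
        """schemas maps tool_id -> output column names from the last run."""
        sidecar = Path(workflow_path + ".lastrun.json")
        sidecar.write_text(json.dumps(schemas, indent=2))

    def load_schemas(workflow_path: str) -> dict[str, list[str]]:
        sidecar = Path(workflow_path + ".lastrun.json")
        return json.loads(sidecar.read_text()) if sidecar.exists() else {}

    # On save/close: save_schemas("flow.yxmd", {"CrossTab_3": ["id", "height", "weight"]})
    # On open: configure downstream tools against load_schemas("flow.yxmd").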

Hollingsworth
12 - Quasar

Great idea!

 

I wholeheartedly endorse this idea. I understand why the canvas is re-evaluated after each change, but there needs to be a way for tools that change field structure at runtime to cache the state from their previous run.
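
To make the suggestion concrete, the fallback during canvas re-evaluation could look something like this hypothetical sketch (illustrative names, not Alteryx's actual architecture):

    # Cache each data-dependent tool's output schema from its last run and
    # fall back to it whenever re-evaluation has no data to infer from.
    _last_run_schema: dict[str, list[str]] = {}  # tool_id -> output columns

    def evaluate_schema(tool_id: str, live_columns: list[str] | None) -> list[str]:
        if live_columns is not None:           # a real run produced metadata
            _last_run_schema[tool_id] = live_columns
            return live_columns
        # No data in the pipe (canvas edit / auto-configure pass):
        # reuse the last run's schema instead of blanking it out.
        return _last_run_schema.get(tool_id, [])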

fharper
12 - Quasar

Update, thanks to some help from Alteryx's Paul Treece.

I had another instance of metadata disappearing without my touching a single thing on the canvas, after a few minutes. After doing a chat with Paul and showing him the issues via WebEx, he educated me on a User Setting on the Advanced tab called "Disable Auto Configure". By default the system will auto-refresh your canvas periodically. When it does, the auto-configure executes, which pushes metadata associated with known fields downstream; but for fields derived in tools like CrossTab, there is no data in the pipe to populate metadata from, so the last run's metadata is lost. By checking this setting you can stop that auto-configure from happening.

 

Just remember that while it is checked, if you modify a Select or other tool to exclude or include a field, the metadata associated with the change will not push downstream as you would normally see. When I am working on a flow with these types of tools, I turn auto-configure off, run the flow to force the metadata, and then do my work. When done, I turn it back on.

 

My original post sought a way to allow metadata to persist, and this is a usable workaround, but the original goal remains. This setting shows the pieces exist to accomplish the goal. We just want the metadata from the last run for derived data to persist until the next run forces a change, so we don't have to manually flip things back and forth while we work.

chrisha
11 - Bolide

This issue has become a big problem for us recently. We have some large workflows with a custom macro at the very end where we need to do some field selection. Since there are some Transposes and CrossTabs involved in the workflow, whenever we open the workflow file in order to run it, the field mapping is lost. That's problematic because we do not want to run the same workflow multiple times (it takes a very long time).

 

Is there a fix or more reliable workaround available yet?

patrick_digan
17 - Castor

@chrisha @fharper I've recently discovered that my metadata will always flow properly, with one exception (and thankfully there's an easy workaround). I finally figured out that the one instance where I was having metadata issues was related to the Cache Dataset macro. Essentially, metadata will stop flowing if you have a standard macro (Cache Dataset in this case) with a disabled tool connected as an input, like this:

[Screenshot: a disabled tool connected as an input to the Cache Dataset macro]

 

Even with the "output fields change" box checked in the macro, there is an issue with the metadata downstream in this setup. The simple solution is to delete the line connected to the Cache Dataset macro:

[Screenshot: the same workflow with the connection to the Cache Dataset macro removed]

 

Everything still works the same, AND my downstream metadata doesn't have any problems. This issue was so pervasive for me because I use Cache Dataset a lot, but it would affect any macro.

 

Hope that helps! 

fharper
12 - Quasar

Patrick, I think that is a unique, isolated example. In my experience, and I have verified this with others, tools that dynamically determine downstream metadata will lose the record layout/metadata. It is tied to the refresh/auto-configure function. When you open a flow and make changes, the auto-configure feature will attempt to assess and propagate the impact of those changes downstream until it hits a CrossTab, which dynamically determines its metadata based on field configuration and the data in the stream. Without running data in the stream to generate the metadata, the tool drops what it had and leaves you with nothing, and downstream tools are impacted accordingly. Thus you have to keep re-running the flow each time you edit an impacting tool.

 

As mentioned above, you can turn off auto-configure and mitigate the issue, but then you lose the benefits of auto-configure on all other aspects of editing.

 

I've never used the Cache Dataset macro, so I may be missing something, but your example seems limited to that tool with a disabled tool connected as an input. None of my examples involve either. Am I wrong there?

 

Another thought on basically the same issue, but with a different spin or tangent, is the Select tool. The system dynamically alters data types based on the input data in the stream. If you have an Input tool followed by a Select and other tools, the Select will dynamically determine the data types on the initial execution, and if the data has all nulls in a field you intend and expect to be a string, the Select will default to Double as the data type. If you change the data type in the Select, save, and rerun, it will reset back to Double. If the data had a record where the field had a string value the first time, it would have defaulted to a V_String data type; but if on the next run there were all null values, or values with numeric content, that iteration would change the data type again. And if you have downstream tools like a Join or Formula using the field, they will break, because now you are joining a string to a double or using it inappropriately in a formula.

 

Need to have the ability to check a box to fix the meta-data in the tool. It is the same kind of need to retain the metadata.
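
As a rough analogy for the unstable inference (pandas here, not Alteryx itself): the same column can come out as a different type depending on which values happen to be present in a given run, and pinning the type up front is the equivalent of the requested check box:

    import io
    import pandas as pd

    all_null  = "id,code\n1,\n2,\n"          # 'code' is empty on every row
    with_text = "id,code\n1,A1\n2,B2\n"      # 'code' has string values

    print(pd.read_csv(io.StringIO(all_null))["code"].dtype)   # float64
    print(pd.read_csv(io.StringIO(with_text))["code"].dtype)  # object

    # Declaring the type up front keeps it stable regardless of the data:
    print(pd.read_csv(io.StringIO(all_null), dtype={"code": "string"})["code"].dtype)  # string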

fharper
12 - Quasar

When I say "Need to have the ability to check a box to fix the meta-data in the tool," I mean fix in place, a.k.a. retain. Sorry for the confusing phraseology.

 

fharper
12 - Quasar

This problem remains an issue for anyone with large flows using CrossTabs and similar tools where metadata will change according to the data flowing in the stream. Because Designer knows the metadata can change, it blanks out the metadata when a change is made to an upstream tool; and if one opens that tool first, you lose not only the outgoing metadata but the incoming metadata as well.

 

I turn off the auto-refresh feature to minimize the issue, but this is not a complete solution, and it is like tying one hand behind your back for downstream editing.

 

Has anyone from Alteryx looked at this?

Community_Admin
Alteryx
Status changed to: Inactive
 
Community_Admin
Alteryx

The status of this idea has been changed to 'Inactive'. This status indicates that:

 

1. The idea has not had activity in the form of likes or comments in over a year.

2. The idea has not reached ten likes.

3. The idea is still in the 'New Idea' status. 

 

However, this doesn't mean your idea won't be implemented! The Community can still like and comment on this idea. With enough renewed interest, this idea can be brought back into the 'New Idea' status. 

 

Thank you for contributing to the Alteryx Community and the Alteryx Product Idea Boards!