Hello All,
Does anyone have a macro or an idea for selecting "Dynamic or Unknown Fields" in the Unique tool?
My requirement is to pass any number of fields and get unique records; my input fields will change every time.
Please help.
Regards,
Vp
Hi @vepp
The solution from @JosephSerpis is close, but it fails if you include field types other than numeric, because the Crosstab tool needs to be reconfigured whenever the Values column contains anything other than a number. You can get around this by joining back to the original data on RecordID instead of using a Crosstab. The J output then has the unique records from the original data set, and the R output contains the duplicates.
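For anyone who thinks more easily in code, here is the same idea sketched in Python/pandas rather than Alteryx tools. The data and column names are made up for illustration; the point is that once each row has a RecordID, you never need to rebuild the schema with a Crosstab, you just join back on the ID:

```python
import pandas as pd

# Hypothetical sample data; names and values are invented.
df = pd.DataFrame({
    "Name": ["Ann", "Bob", "Ann", "Cy"],
    "City": ["NY", "LA", "NY", "SF"],
})
df.insert(0, "RecordID", range(len(df)))  # plays the RecordID tool's role

# Dedupe across all data fields, whatever they happen to be,
# keeping the first RecordID of each distinct row.
value_cols = [c for c in df.columns if c != "RecordID"]
first_ids = df.drop_duplicates(subset=value_cols)["RecordID"]

# Join back to the original data on RecordID instead of a Crosstab:
unique_rows = df[df["RecordID"].isin(first_ids)]       # the J output
duplicate_rows = df[~df["RecordID"].isin(first_ids)]   # the R output
```

Because the join is on RecordID only, the original column names and types pass through untouched, which is what makes this robust to any input schema.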
Dan
Hi @vepp, I came up with an alternative solution that keeps the fields in the Unique tool the same but can handle the number of fields decreasing or increasing. Let me know what you think!
Hi Vepp,
The macro below will summarise on all fields and return your data in its original schema, so you can use it in the middle of your workflow with no issues. It should be exactly what you're looking for. If this solves your issue, please mark the solution as correct; if not, let me know!
Jonathan
Hi @Jonathan-Sherman
Since you're using a Crosstab to rebuild your data instead of joining back to the original data on the record IDs, you'll have problems if the original column names contain spaces or non-alphanumeric characters.
Input Data
Output Data
I'm validating a large migration of data from SQL Server to Snowflake and, of course, I want the validation to be as automated as possible. Most of what I wanted to achieve is done, but I thought that if I could just do this one more thing I would be really satisfied with the process.
Within a batch macro I query a table from both databases, remove duplicate records, and count the records. I also union the two datasets together, remove duplicates, and count the records. The idea is that I have an exact match if all three counts are the same. I'm feeding this batch macro a large number of tables to check (all of them different). None of the usual tools for removing duplicates (Unique, Sum, Tile, Multi-Field Formula) could handle the dynamic nature of what I needed to accomplish.
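The three-count check above can be condensed into a few lines of pandas, as a sketch of the logic rather than the batch macro itself (the helper name and the sample frames are hypothetical):

```python
import pandas as pd

def counts_match(src: pd.DataFrame, dst: pd.DataFrame) -> bool:
    """Hypothetical helper mirroring the batch-macro check:
    dedupe each side, dedupe their union, and compare the three
    counts. All three equal implies both sides hold the same
    set of distinct rows."""
    a = src.drop_duplicates()
    b = dst.drop_duplicates()
    u = pd.concat([a, b]).drop_duplicates()
    return len(a) == len(b) == len(u)

# Invented example: same distinct rows, different duplication.
sql_side = pd.DataFrame({"x": [1, 2, 2]})
snow_side = pd.DataFrame({"x": [2, 1]})
```

If either side has a row the other lacks, the union's distinct count exceeds at least one of the per-side counts, so the check fails, which is exactly the property the validation relies on.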
This post helped me achieve that. THANKS!!! ~DD
Hi Dan,
I ran into a situation where the concept wasn't giving me the results I expected.
I isolated a sample to test with. I would expect it to consider these records all unique, but because there are duplicate name-value pairs across records, the workflow pattern determines a few of the records to be duplicates.
I think we may need to add a Summarize tool after the Transpose to concatenate all of the values together for a given record ID.
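The suggested fix can be illustrated in pandas (an invented two-record example, not the actual macro): after transposing to (RecordID, Name, Value) pairs, concatenate every Name=Value pair per RecordID so the dedupe key describes the whole record, not individual pairs.

```python
import pandas as pd

# Two records that share the same *values* but in different fields.
# Pair-level dedupe would wrongly overlap them; a concatenated
# per-record key keeps them distinct.
df = pd.DataFrame({
    "RecordID": [1, 2],
    "A": ["x", "y"],
    "B": ["y", "x"],
})

# melt = the Transpose step
long = df.melt(id_vars="RecordID", var_name="Name", value_name="Value")
long = long.sort_values(["RecordID", "Name"])  # stable pair order

# groupby + join = the proposed Summarize concatenation
keys = (long.assign(pair=long["Name"] + "=" + long["Value"])
            .groupby("RecordID")["pair"]
            .apply(",".join))
```

Record 1's key is "A=x,B=y" and record 2's is "A=y,B=x": different keys, so both records survive deduplication, even though every individual name-value pair appears somewhere in the other record.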
What do you think?
~Data Dynamite