Hello,
has anyone ever noticed some workflows using over 0.5GB of memory upon loading?
This is before running the workflow. For some reason, any change locks the interface for up to a minute, and it almost feels like there is a memory leak somewhere, as RAM usage goes up with every change.
Has anyone experienced anything similar?
Hi @marco_zara,
When you open a workflow, Alteryx refreshes the metadata throughout the whole flow. If your workflow is quite sizeable, this could be why you see a spike in resource usage on opening.
Sam 🙂
In this case it's a pretty small one (attached to the first post); the interface uses more memory than all the input files combined, peaking at over 6GB.
Hi @marco_zara,
It seems your workflow file had filled up with cached metadata in the raw XML. I noticed the file size was quite big for such a small flow, so I opened it in Notepad++ and just cleared out all the <MetaInfo> tags.
Opens much faster now.
Sam 🙂
Thank you.
Is there any guide on how to do the cleaning manually?
Hey @marco_zara,
If there is, I haven't come across it. If you need to do it again, I just cleared out everything between (and including) the <MetaInfo> tags.
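If you'd rather not do it by hand in a text editor, the same cleanup can be scripted. Here's a minimal sketch that assumes the workflow is a plain XML file (as .yxmd files are) and that the metadata lives in elements literally tagged `MetaInfo`; the function name is just for illustration:

```python
# Sketch: strip every <MetaInfo> element from a workflow's XML.
# Assumes the file parses as standard XML with un-namespaced tags.
import xml.etree.ElementTree as ET

def strip_metainfo(xml_text: str) -> str:
    """Return the workflow XML with all <MetaInfo> elements removed."""
    root = ET.fromstring(xml_text)
    # ElementTree has no parent pointers, so walk every element
    # and remove any MetaInfo children it holds.
    for parent in root.iter():
        for child in list(parent):
            if child.tag == "MetaInfo":
                parent.remove(child)
    return ET.tostring(root, encoding="unicode")
```

Make a backup copy of the workflow first; Designer will simply rebuild the metadata the next time the flow runs.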
Sam 🙂
I wish there was a flush icon to cleanup meta tags and refresh from scratch...
Hey @Atabarezz,
Not a button, but here's an analytic app I have written that will do it for you.
Sam 🙂
This should be built into the software soon. Great work!
I would like to dig further into why this workflow takes so long to load with the meta info present, but not without it.
Would it be possible for you to send me the yxdb files used in the three input tools?