I have a workflow that takes several Excel files as input; the total size of the files is roughly 1 GB. First thing in the day I had no problem running the workflow, but about 8 hours later the cached load of one of the files returned 24k records instead of the 70k it should have loaded. In the same workflow, a table provided as manual input and cached along with the rest returned no records at all after executing the stream.
The failure occurred without any warning from the program that the load was incorrect, so an erroneous result could have been passed to the client with no way to catch the error.
Do you have any idea why this could happen, and is there any way to avoid or at least detect it? Could keeping a heavy cache alive for many hours cause it to become corrupted?
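In the meantime, the only mitigation I can think of is an explicit row-count check after each load, so a truncated cache fails loudly instead of flowing downstream. A minimal sketch in Python with pandas (the file name, expected counts, and function names are just examples, not my actual workflow):

```python
import pandas as pd

def row_count_ok(actual: int, expected: int, tolerance: float = 0.01) -> bool:
    """True if the loaded row count is within `tolerance` (as a fraction)
    of the expected count for that file."""
    return abs(actual - expected) <= expected * tolerance

def load_checked(path: str, expected: int) -> pd.DataFrame:
    """Load an Excel file and refuse to continue on a suspicious row count."""
    df = pd.read_excel(path)
    if not row_count_ok(len(df), expected):
        raise ValueError(
            f"{path}: loaded {len(df)} rows, expected ~{expected} -- "
            "possible truncated or corrupted cache"
        )
    return df

# Example: load_checked("sales.xlsx", expected=70_000) would raise
# if only 24k records came back, instead of silently passing them on.
```

This only catches gross truncation like the 70k-to-24k case, though; I would still like to understand the root cause.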
Thank you very much.