Hi all:
I'm running a workflow that grabs 7 Excel files and does a lot of merging, unioning, filtering, etc., then finally dumps the data to a Tableau server.
I'd say that 40% of the people here are doing the same thing.
My problem is that the 'data engineering' part takes about 30 seconds, but reading from SharePoint takes about 45 seconds. I've turned off the section that dumps the output to Tableau while I add some more data files from SharePoint.
Is there a way to read the Excel files on SharePoint once and hold them in memory, so that I can turn off the SharePoint tool during development and testing and not waste time in this phase of the project? In other words, read the data from the files once and only once for as long as I'm developing and testing.
The source Excel files change hourly based on people's input, so it's hard for me to confirm that my workflow output matches what the stakeholders are producing manually.
Thanks.
-prpatel.
Hey @prpatel,
While testing, you can use the 'Cache and Run' option. If you right-click one of the SharePoint Input tools and click 'Cache and Run', Alteryx saves a local .yxdb to your temp folder. When you re-run the workflow, that cached file is used instead of the live SharePoint read. The cache persists until you change the input tool's configuration or close the workflow.
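For anyone doing the same thing in code rather than in Designer, here's a minimal Python sketch of the same "read once, reuse a local copy" pattern. It assumes pandas (with pyarrow for parquet) is installed; fetch_excel_from_sharepoint() is a hypothetical helper that returns a DataFrame, since real SharePoint auth is out of scope here.

```python
# Minimal sketch of the "cache and run" idea outside Alteryx.
# Assumes pandas + pyarrow; fetch_excel_from_sharepoint() is hypothetical.
from pathlib import Path

import pandas as pd

CACHE_DIR = Path("cache")
CACHE_DIR.mkdir(exist_ok=True)

def cached_read(name: str, fetch) -> pd.DataFrame:
    """Fetch once from the slow source, then reuse the local copy on later runs."""
    local = CACHE_DIR / f"{name}.parquet"
    if local.exists():
        return pd.read_parquet(local)  # fast local read during dev/testing
    df = fetch()                       # slow SharePoint read, done only once
    df.to_parquet(local)               # persist for the next run
    return df

# Delete the cache folder whenever you want a fresh pull from SharePoint:
# orders = cached_read("orders", lambda: fetch_excel_from_sharepoint("orders.xlsx"))
```

Like the built-in cache, the local copy sticks around until you deliberately clear it, so stale data is a feature during development and a bug you avoid by deleting the folder before a production run.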
I do this with inputs that take a lot longer than 45 seconds: I put the input tools in one macro and the processing parts in another (or several). I use .yxdb files of sample inputs to drive the processing and to populate the schema.
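The same separation translates directly to code: keep the transform logic as a pure function of its inputs, so you can test it against saved sample files without ever touching SharePoint. In this sketch, process() and its column names are hypothetical stand-ins for the real merge/union/filter logic.

```python
# Sketch of separating I/O from processing, assuming the transform
# can be a pure function of DataFrames. Column names are hypothetical.
import pandas as pd

def process(orders: pd.DataFrame, users: pd.DataFrame) -> pd.DataFrame:
    """Pure transform: testable against saved samples, no SharePoint needed."""
    merged = orders.merge(users, on="user_id", how="inner")
    return merged[merged["amount"] > 0]

# During development, feed it saved samples instead of live SharePoint reads:
# result = process(pd.read_parquet("samples/orders.parquet"),
#                  pd.read_parquet("samples/users.parquet"))
```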