Whenever I go to update the file in my input tool, update a filter, or even update the file location for my output, Alteryx can take minutes to move to the next tool. Below is a snippet of what constantly happens: I select the filter tool to update the parameters, wait 3 minutes, update the parameter, select the input tool to update, wait 3 minutes, then select the input file I need. I have tried researching this, and the only thing that seems to work is restarting my computer. I run the same process multiple times a month, so I really need to address the root cause of this issue. I'm not sure whether this is an issue with my computer, so I've included the specifications of the computer I use.
Hello!
I can't say for certain what the root cause is, but something that helps me speed up workflow design, especially with larger workflows, is disabling Auto Configure.
By default this is enabled, and it automatically updates the metadata that feeds into the tools you configure on the canvas. Disabling it stops that from happening every time you click out of a tool you just updated; instead, you refresh the metadata manually by pressing F5 on your Windows machine.
You can find this setting in Designer Desktop under:
Options -> User Settings -> Edit User Settings -> General -> Disable Auto Configure
Also, is the spreadsheet on your machine or on a network share? If it's local, that shouldn't be an issue. One easy trick is to put a sacrificial Select tool right after the Input tool, make no changes, then right-click it and choose Cache and Run Workflow. When you're all done, go back and remove the tool if you want.
This type of issue in Alteryx usually comes from a large amount of metadata that Designer is trying to update on every click, hence the suggestion to turn off Auto Configure.
However, I see that you have a Summarize as the fourth tool, so I doubt you have that configured for many fields. If it is too much metadata, then the likely cause would be an Excel spreadsheet that thinks it contains 65k columns.
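If you want to check for that phantom-range problem outside Alteryx, here's a minimal sketch using Python's openpyxl (assuming you have Python available; the file name is a placeholder for your actual spreadsheet) that prints the used range Excel has recorded for each sheet:

```python
# Sketch: check whether an .xlsx file's recorded "used range" is inflated.
# Assumes openpyxl is installed; "input.xlsx" is a placeholder file name.
from openpyxl import load_workbook

wb = load_workbook("input.xlsx")
for ws in wb.worksheets:
    # max_row / max_column reflect the sheet's recorded dimensions, which
    # can be huge if stray formatting ever touched far-away cells.
    print(f"{ws.title}: {ws.max_row} rows x {ws.max_column} columns")
```

If this reports tens of thousands of columns for a file you know holds ~100 lines of data, that inflated range is a likely culprit for the metadata churn.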
Try isolating the issue. If you write your data to a local .yxdb file (truly local: on your actual machine, not reached through a OneDrive/Google Drive/Dropbox or other cloud/network path) and use that as your input, does the same issue happen? Swapping out the input file is a big step toward tracking this down.
Hello!
Tried this and no luck...
My data files are extremely small, at most 100 lines in an Excel or txt file. I run everything from my computer's C: drive.
One other thing to try is the Data Cleanse tool, set to remove null columns. Is there any chance you were reading in all the blank cells by accident? I'd set it to remove null rows while you're at it. My earlier idea about the Select tool and caching will still help, even if it only means you wait once for the files to be read; after that it should be faster. I agree that isn't a good permanent fix, but it could keep you moving forward.
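If you'd rather clean the source file itself before Alteryx ever reads it, here's a minimal sketch using pandas (assuming Python with pandas and openpyxl installed; the file names are placeholders) that mimics what Data Cleanse does for all-null rows and columns:

```python
# Sketch: strip fully-empty rows and columns from an Excel file
# before Alteryx reads it. File names below are placeholders.
import pandas as pd

df = pd.read_excel("input.xlsx")
df = df.dropna(axis=1, how="all")  # drop columns where every value is null
df = df.dropna(axis=0, how="all")  # drop rows where every value is null
df.to_excel("input_clean.xlsx", index=False)
```

Rewriting the file this way should also reset Excel's recorded used range to the real data size, which ties back to the phantom-columns theory above.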