Hi, I may have a requirement to move 1000 tables per day from an on-prem database to the cloud. Fact tables would need incremental updates only, but dimension tables would probably be full reloads. This is a unique case where no transformations at all will be required.
Given that 1000 is a lot of tables to update every day, I'd like to understand the optimal server configuration to get this job done as quickly as possible. Does anyone have a few ideas to test?
I've been through the "scaling a private server" links, but they focus more on "processing," and I don't think any processing needs to be done here, just I/O. Thanks for your thoughts.
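To make the shape of the job concrete, here's a rough sketch of what I have in mind: a worker pool fanning out over the table list, with each table tagged as incremental (facts) or full reload (dimensions). Everything here is a placeholder, the `copy_table` function, table names, and worker count are all made up for illustration; the real step would just be extract-and-upload with no transformation:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical catalog: table name -> load mode.
# Real names and the fact/dimension split would come from our metadata.
TABLES = {f"fact_{i}": "incremental" for i in range(800)}
TABLES.update({f"dim_{i}": "full" for i in range(200)})

def copy_table(name: str, mode: str) -> str:
    # Placeholder for the real work: read the table (or its changed rows)
    # from the on-prem source and write it to cloud storage. No transforms.
    return f"{name}:{mode}"

def run(workers: int = 16) -> list[str]:
    # Threads rather than processes, since the job is I/O-bound, not CPU-bound.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(copy_table, n, m) for n, m in TABLES.items()]
        return [f.result() for f in futures]

results = run()
print(len(results))  # one entry per table copied
```

The open question is really what `workers` and the server spec should be so that 1000 of these copies finish inside the daily window.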