Hi, I'm an intern and the firm doesn't want to purchase a server yet.
I connect to the DB with my login details (Input tool, Dynamic Input tool).
I created a workflow and shared it with other people, but everyone has to redo the connection because the Input tool errors out.
This is a problem every time, and I'm looking for a clever way to avoid this connection failure.
Besides editing the connection string in the XML, are there any smarter solutions for this kind of problem?
Thanks for your input!
DCM is the new-school way. The old-school way is to share the connection through a file, or to use a named connection that you set up via Manage In-DB Connections. The key here is that your users are responsible for a) setting up their own ODBC connection, and b) naming that connection the same across systems (like 'Snowflake').
You can automate the ODBC setup across machines via regedit, using .reg files imported by bat scripts (sketch below). But if users have their own usernames/passwords, they'd need to edit the .reg to include those.
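To make the "same name cross system" idea concrete, here's a rough sketch of what an exported user-level DSN looks like under HKEY_CURRENT_USER\Software\ODBC\ODBC.INI. The DSN name 'Snowflake', the driver path, and the server/database/warehouse values are all placeholders, and the exact value names depend on your ODBC driver, so export a working DSN from your own machine rather than hand-writing this:

    Windows Registry Editor Version 5.00

    ; Register the DSN name so it appears in the ODBC Data Source Administrator
    [HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\ODBC Data Sources]
    "Snowflake"="SnowflakeDSIIDriver"

    ; The DSN itself -- every value below is a placeholder to replace
    [HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\Snowflake]
    "Driver"="C:\\Path\\To\\Your\\Snowflake\\ODBC\\driver.dll"
    "Server"="youraccount.snowflakecomputing.com"
    "Database"="ANALYTICS"
    "Warehouse"="COMPUTE_WH"
    ; Each user swaps in their own username; leave the password out of the file
    "uid"="YOUR_USERNAME"

Since the DSN name is what the workflow references, keeping it identical ('Snowflake') on every machine is what lets the shared workflow resolve the connection.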
Thanks so much for your input.
I will be studying DCM and regedit. If you have any helpful links, please share them.
I know nothing about this site, but it showed up in a quick search on regedit: https://www.tenforums.com/tutorials/125696-export-import-registry-keys-windows.html. My recommendation is to create your ODBC connections on your machine, export the .reg files, and create a bat file for the import (sketch below). I manage some VMs and do this on setup...
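A minimal import script, assuming the export above is saved next to the bat file (the name odbc_snowflake.reg is a placeholder):

    @echo off
    :: Import the ODBC DSN exported from a working machine.
    :: %~dp0 expands to the folder this bat file lives in.
    reg import "%~dp0odbc_snowflake.reg"
    if errorlevel 1 (
        echo ODBC registry import failed.
        exit /b 1
    )
    echo ODBC DSN installed for %USERNAME%.

Because the keys live under HKEY_CURRENT_USER, each user has to run the import under their own account for the DSN to land in their hive.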
I'd also point out that, depending on your database, you may need additional connection settings for different users. I don't know whether Alteryx has an .xml/settings file that controls In-DB connections, but I'd expect it does somewhere.
If auth is SAML, the proper ODBC drivers are set up, and Okta groups are provisioned, some connection credentials will auto-populate based on each user's DB access. You'll still need to share the data source with the users via DCM, and they'll need to click "New Credential" before linking the data source and the credential for the data connection. But there's no need to distribute credentials or have users re-enter them. I can think of at least two data lakes/warehouses this works with.