Dynamically run sql queries based on tables found in .csv files
I have 12 folders, each containing a .csv file. Each .csv file may contain one or more database table names (in a column labeled FullTableName). I would like to read the 12 .csv files, dynamically pull the database table names from each file, and use them to build a SQL query that selects all records from each table. In addition, for each record extracted, I would like to add and populate a new column, FullTableName, with the FullTableName from which it originated.
Attached is a sample .csv that contains 6 SQL database/table names.
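Outside of Alteryx, the same idea can be sketched in plain Python. This is a minimal sketch, not the workflow from this thread: the folder layout, the `build_queries` helper, and the `root` path are all illustrative assumptions; only the FullTableName column name comes from the question above.

```python
import csv
import glob
import os

def build_queries(root="folders"):  # "folders" is a hypothetical root directory
    """Read every .csv under root, collect the FullTableName column,
    and build one SELECT per table that carries the source name along."""
    queries = []
    for path in glob.glob(os.path.join(root, "*", "*.csv")):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                table = row["FullTableName"]
                # Embed the table name as a literal column so every
                # extracted record keeps a pointer to its origin.
                queries.append(
                    f"SELECT '{table}' AS FullTableName, * FROM {table}"
                )
    return queries
```

Each generated query would then be run against the database, and the literal first column ties every record back to the table it came from.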
You can do that with the Dynamic Input tool:
How do you get the FullTableName from within the .csv file added to the results/output?
I changed the input location of the .csv, updated the ODBC connection, and ran the workflow. I expected the dbo.table name to be added as a column at the end of the results, but the results did not contain it. I am using v2019.4. Is that why it's not working?
Change the query to:
SELECT 'dbo.table' as tablename, * FROM dbo.table
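To see why this query pattern works, here is a small self-contained demonstration using an in-memory SQLite database. The `orders` table, its columns, and the row values are illustrative assumptions standing in for the real dbo.table; only the `SELECT '<name>' AS tablename, *` pattern is from the answer above.

```python
import sqlite3

# In-memory database standing in for the real ODBC connection (assumption).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.execute("INSERT INTO orders VALUES (1, 9.99), (2, 19.99)")

# Select the table name as a string literal alongside every record,
# exactly as in the suggested query.
rows = conn.execute(
    "SELECT 'orders' AS tablename, * FROM orders"
).fetchall()
# Every row now begins with the name of the table it came from.
```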
I originally used "select * from dbo.tablename", where dbo.tablename was replaced with the FullTableName from the .csv file. The workflow ran fine, but the output did not contain the table name applicable to each record. When I added 'dbo.tablename' as tablename and ran the workflow, I received an error (The file "" has a different schema than the 1st file in the set) on some of the files within the .csv. I have verified that the columns within each of the files are the same, so I don't understand why I only receive the error when I add the tablename to my output.
In this case you should use a batch import macro to read all csv files:
https://community.alteryx.com/t5/Alteryx-Designer-Knowledge-Base/The-Ultimate-Input-Data-Flowchart/t...
I actually found another solution to this problem. The Dynamic Input tool's Modify SQL Query option lets me replace a specific string AND pass a field to the output. I used the latter option to add the ServerName, DatabaseName, and TableName (which I had derived earlier in the process) to the output, which allowed me to associate the output records with the table from which they originated.
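For readers outside Alteryx, the "pass a field to the output" idea amounts to attaching origin metadata to every extracted record. A rough sketch of that pattern, assuming records are simple dicts; the `tag_records` helper and its parameter names are hypothetical, though the three metadata fields match the ones described in this solution:

```python
def tag_records(records, server, database, table):
    """Attach origin metadata (server, database, table) to each record,
    mirroring the fields passed through to the Output tool."""
    for rec in records:
        yield {**rec,
               "ServerName": server,
               "DatabaseName": database,
               "FullTableName": table}
```

Running the extract for each table through a tagger like this keeps the record-to-table association without relying on the SQL literal-column trick, which is what triggered the schema-mismatch error earlier in the thread.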
