
Alteryx Designer Desktop Discussions

Find answers, ask questions, and share expertise about Alteryx Designer Desktop and Intelligence Suite.
SOLVED

Help migrating MS Access database to PostgreSQL with a macro

Amanda_Breeze
8 - Asteroid

Hi folks,

Seeking your legendary help again.

 

This time I'm trying to migrate data from a Microsoft Access Database to PostgreSQL.

 

But here are the caveats:

  1. I want the workflow/macro to create the tables in the first instance.
  2. After all the tables are created and their data migrated, only certain tables will need to be updated from then on, so I need to tell the workflow which ones they are.
  3. I need to specify the schema manually – I have a text input for this.
  4. The macro needs to work with multiple databases with similar structures.

I started by creating a workflow that did this manually, but as this is a process we want to repeat, I want a macro to do the heavy lifting.

Amanda_Breeze_0-1603257575820.png

This isn't very efficient.


Trouble is, I'm not all that familiar with building macros, so I'm having trouble getting this to work. I've also searched for information on this but haven't found anything that matches the result I'm trying to achieve.


To determine which tables are to be updated, I've created a query that extracts the table names from the MSysObjects table and outputs a list for the macro to consume.
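For reference, a catalog query of that general shape might look like the sketch below (the original query isn't shared, so the exact filters are an assumption on my part; Type = 1 selects local tables in MSysObjects, and excluding names starting with MSys drops Access's own system tables). It's wrapped in a Python string here just for illustration:

```python
# Sketch of the kind of MSysObjects query described above; the exact
# Type/Flags filters are an assumption, since the real query isn't shared.
TABLE_LIST_QUERY = (
    "SELECT Name FROM MSysObjects "
    "WHERE Type = 1 AND Flags = 0 "   # local, user-created tables
    "AND NOT Name LIKE 'MSys*'"        # exclude Access system tables
)
print(TABLE_LIST_QUERY)
```

In the workflow, a query like this would sit in an Input Data tool pointed at the Access database, with its output feeding the list of table names into the macro.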


Unfortunately, I cannot share the workflow or macro with you but I've created a diagram (yes, it's in PowerPoint) which hopefully explains what I'm trying to achieve here.


My first attempts creating this macro kinda worked in that it created the tables but it just put the same data in all of them.

 

I tried using a Dynamic Input tool to get around this, but now it just won't work at all.

 

I'd greatly appreciate the help.

Amanda_Breeze_1-1603257650501.png

 

 

Kind regards,
Amanda

3 REPLIES
joshuaburkhow
ACE Emeritus

Hey @Amanda_Breeze 

 

So you're basically trying to implement this diagram? Am I correct to assume that you have already built out the batch macro shown in it?

 

"My first attempts creating this macro kinda worked in that it created the tables but it just put the same data in all of them." One thing you could check here is the Group By configuration on the batch macro.

Joshua Burkhow - Alteryx Ace | Global Alteryx Architect @PwC | Blogger @ AlterTricks
Amanda_Breeze
8 - Asteroid

Hi @joshuaburkhow, yes. But I'm not certain it's the best way to go about it. I can't seem to go back to the way it was because I've messed with it too much.

Amanda_Breeze
8 - Asteroid

So it turns out all I needed was another control parameter to control the input table name. 

I also had the output incorrectly configured (I had schema_table instead of schema.table, which wasn't visible here).

The solution is this:

 

Amanda_Breeze_0-1603330143264.png

  1. I replaced the dynamic input with a static one and used a control parameter instead of a macro input (to feed in the table names to copy over).
  2. The control parameter (Input Table name) is configured to update the table name value.
  3. The PostgreSQL DB output is configured with File - value="odbc:DSN= blah blah connection string|||schema.table", where
    • schema is updated from the text input as defined by the user in the workflow, and
    • table is updated from the control parameter (Output Table name), which is configured to Update Value with Formula: replace([Destination],"table",lowercase([#1])).

This (1) solved the connection problem and (2) made the data copy over into the correct table.
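The Update Value with Formula substitution above can be sketched outside Alteryx as plain string replacement (the DSN and schema below are hypothetical placeholders, not the real connection):

```python
def build_output_path(destination: str, table: str) -> str:
    """Mimic Alteryx's replace([Destination], "table", lowercase([#1])):
    swap the literal placeholder "table" for the lowercased table name."""
    return destination.replace("table", table.lower())

# Hypothetical DSN and schema, for illustration only.
dest = "odbc:DSN=MyDSN|||public.table"
print(build_output_path(dest, "Customers"))
# → odbc:DSN=MyDSN|||public.customers
```

Because the replacement is a plain text substitution, the placeholder word "table" should not appear anywhere else in the connection string, or it would be replaced too.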

I have found this takes a very long time to write to the database, though. Not sure if this is normal.

Just writing two tables, containing 25k and 36k rows respectively, took just over 20 minutes to run.

That'll be the next problem to solve.

Posting this here in case others encounter the same issue.

Thanks @joshuaburkhow for replying though.

 

 
