
SQL duplicate primary key error when there is not a duplicate

AaronPritchard
5 - Atom

Hi all,

 

I have a rather perplexing issue that is stonewalling me here, and I’m hoping you can give me some ideas...

 

I have a data stream from an Input Data tool (reading a SQL table) to an Output Data tool (writing to a SQL table). The output tool uses 'Delete Data & Append', and the target SQL table has a primary key.

 

When I run the workflow, it reports a duplicate primary key error. To me, this means the data stream either contains a duplicate key within itself or conflicts with the data already in the table. It can't be the latter, though, since I am using 'Delete Data & Append'.

 

I am sure the data stream doesn't have duplicates either, because the input data source utilises the same SQL primary key, and I have put a Unique tool on the key to make sure.
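For reference, this is the sort of check I ran on the SQL side to confirm it (table and column names here are placeholders, not the real ones):

SELECT KeyCol, COUNT(*) AS DupeCount
FROM dbo.SourceTable
GROUP BY KeyCol
HAVING COUNT(*) > 1;

Any rows returned would be duplicate keys in the source.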

 

I’ve tried recreating the primary key just in case there was some bug there, but no luck.

 

Any ideas? Thanks.

3 REPLIES
RishiK
Alteryx

@AaronPritchard what's the actual error message that comes up when you try to load into the table?

Could you add a Unique tool in your workflow before you load the table, to check the data?

 

I would also try writing to a new, empty table with the same constraints and primary key, and see if the error persists.
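For example, the test table could be something along these lines (hypothetical names, mirroring whatever your real key is):

CREATE TABLE dbo.MyTable_Test (
    KeyCol INT NOT NULL,
    SomeValue NVARCHAR(100) NULL,
    CONSTRAINT PK_MyTable_Test PRIMARY KEY CLUSTERED (KeyCol)
);

Point the Output tool at the test table with the same 'Delete Data & Append' settings, with nothing else writing to it, and see if the error still appears.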

AaronPritchard
5 - Atom

Hi @RishiK,

 

Error is as follows:

 

Error: Output Data (170): DataWrap2ODBC::SendBatch: [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Violation of PRIMARY KEY constraint 'PK_[REDACTED]'. Cannot insert duplicate key in object

 

 

I have added a Unique tool (on the same primary key), so I know the data stream contains no duplicates, and 'Delete Data & Append' should ensure there is no existing data in the table to conflict with.

 

 

However, I've had a shower thought...

I do have other Alteryx workflows that write to the same table on a schedule.

 

It is possible that the output tool does not truncate the table and insert the records in a single transaction. If so, the other workflows could be writing to the table after it is truncated but while this workflow is still streaming its data, which would of course cause a duplicate key on insertion.
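Roughly the interleaving I have in mind, assuming the delete and the inserts are committed separately (I haven't confirmed that's what the tool actually does; table and key names are placeholders):

-- This workflow:   DELETE FROM dbo.TargetTable;                       COMMIT;
-- Other workflow:  INSERT INTO dbo.TargetTable (KeyCol) VALUES (42);  COMMIT;
-- This workflow:   INSERT INTO dbo.TargetTable (KeyCol) VALUES (42);
--                  -> Violation of PRIMARY KEY constraint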

 

Does this sound like it could be the source of my issue?

 

I may just have to ensure the workflow schedules do not overlap.

 

Thanks.

RishiK
Alteryx

@AaronPritchard I think that might be the issue. I believe the Output tool effectively performs a Delete then Commit, followed by the Inserts and another Commit, rather than a single transaction.

Depending on what your data is, how about adding a Sequence to the table?
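As a rough sketch of that idea (hypothetical names; check it against your schema before using), a sequence-backed key on the SQL Server side could look like:

CREATE SEQUENCE dbo.MyTableSeq AS BIGINT START WITH 1 INCREMENT BY 1;

ALTER TABLE dbo.MyTable
    ADD CONSTRAINT DF_MyTable_KeyCol
    DEFAULT (NEXT VALUE FOR dbo.MyTableSeq) FOR KeyCol;

The key is then generated server-side at insert time, so two writers can't supply the same value.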

 

I'd advise not having overlapping schedules on the same table if possible. This can lead to other issues, such as table locking.
