We are currently testing Alteryx because we're about to migrate from SQL Server to MySQL.
However, we noticed that our data never arrives at the MySQL database in a complete state. Alteryx runs the workflow and reports that it has written all the records correctly, without any errors or warnings. Then we check the table in MySQL and there are always a few records missing.
We tested this on two separate machines, one using SQL Server as the data source, the other using a CSV file with 284 records. Only 214 of them arrived in the MySQL database. The MySQL server runs on localhost.
Writing with Alteryx to SQL Server works fine.
What could be the problem here, and why doesn't Alteryx report that there's a problem?
Hi Tobi,
I assume you're connecting with the MySQL ODBC driver. Is it 32-bit or 64-bit?
Have you checked the "Allow big result sets" and "Automatic reconnect" options in the driver configuration?
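You can also check which MySQL ODBC drivers are actually registered from outside Alteryx. This is just a sketch, assuming Python with pyodbc installed; note that pyodbc.drivers() only lists drivers registered for the bitness of the interpreter you run it from, so run it once from a 64-bit Python and once from a 32-bit one if you're unsure.

```python
import struct
import pyodbc

# Bitness of this Python process; pyodbc.drivers() only shows ODBC drivers
# registered for the same bitness (32-bit vs 64-bit).
print("Running as", struct.calcsize("P") * 8, "bit")

# List installed ODBC drivers and look for the MySQL one
for name in pyodbc.drivers():
    print(name)
```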
Please, try it and tell me...
Best,
_AG_
Does Alteryx say that it has written 284 rows to the DB? Is there anything else happening on the DB side, e.g. a trigger or a validation rule?
Maybe you could attach the files (data and workflow) and I can take a look.
Maybe a long shot here, but I'd also double- and triple-check that the fallout records aren't key violations or data type mismatches or something silly like that. Like I said, probably a long shot, but I've seen stranger things prevent transaction loads.
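If it helps, here's a rough way to see exactly which records fall out, so you can eyeball them for key or type issues. Just a sketch using Python with pyodbc; the file name, the "id" key column, the DSN and the table name are all placeholders you'd swap for your own.

```python
import csv
import pyodbc

# Read the key column from the source CSV (file name and "id" column are placeholders)
with open("source.csv", newline="", encoding="utf-8") as f:
    source_keys = {row["id"] for row in csv.DictReader(f)}

# Read the same key from the MySQL target (DSN and table name are placeholders)
conn = pyodbc.connect("DSN=mysql_test")
cur = conn.cursor()
cur.execute("SELECT id FROM target_table")
loaded_keys = {str(r[0]) for r in cur.fetchall()}

# Whatever is in the source but not in the target is a fallout record
missing = source_keys - loaded_keys
print(len(missing), "missing records:", sorted(missing)[:10])
```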
ODBC driver 64-bit.
I activated "Allow big result sets" and "Automatic reconnect" now, but it doesn't make a difference.
Thank you for your offer, maybe I'll put something together.
@Aaron:
Alteryx states the correct number of rows written. It's a completely fresh database without any triggers or validation rules.
@Coxta45:
I can't say for certain whether that's the case.
OK, so I've isolated the problem: it has to do with German umlauts and other special characters.
As soon as a record contains an umlaut, it doesn't get written to the MySQL table.
Any thoughts on how to solve this?
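For reference, the behaviour can presumably be reproduced outside Alteryx with a minimal sketch like the one below (assuming Python with pyodbc; the DSN, the table name and the CHARSET connection keyword are placeholders/assumptions, not our actual setup, so adjust them to your driver and schema).

```python
import pyodbc

# Connect through the same MySQL ODBC driver; CHARSET is a Connector/ODBC
# connection parameter (assumption -- adjust to your driver/DSN setup)
conn = pyodbc.connect("DSN=mysql_test;CHARSET=utf8")
cur = conn.cursor()

# One plain ASCII value, two with umlaut / sharp s
rows = ["Muenchen", "München", "Straße"]
for value in rows:
    cur.execute("INSERT INTO test_table (val) VALUES (?)", value)
conn.commit()

# If fewer than 3 rows come back, the special-character records were dropped
cur.execute("SELECT COUNT(*) FROM test_table")
print(cur.fetchone()[0])
```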
I don't know much about MySQL; I just thought I'd throw out the idea of investigating whether the data is based on a specific code page related to German.
I've had to input/output data with Chinese characters and ended up converting the code page first. You can do that within Alteryx... there is a Code Page setting on the Input tool for file-based sources, and there is also a function (under Conversion) for converting from/to a different code page.
A general starting point for understanding code pages is at https://en.wikipedia.org/wiki/Code_page
Hoping this might help...
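Outside Alteryx, the same re-encoding idea looks roughly like this in Python (a sketch only; the file names and the source encoding "cp1252" are placeholders, use whatever code page your source system actually writes):

```python
# Re-encode a CSV from a legacy code page to UTF-8 before loading it.
# File names and the source encoding ("cp1252") are placeholders -- use the
# code page your source data was actually written in.
with open("input.csv", "r", encoding="cp1252") as src, \
        open("input_utf8.csv", "w", encoding="utf-8") as dst:
    for line in src:
        dst.write(line)   # decoded with the source code page, written back out as UTF-8
```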
Thank you very much, that was the solution. With the code page "ISO 8859-2 Central Europe" the umlauts get read correctly and all the records are written to the database.
The only difficulty might be finding the right code pages for different data sources. And I find it a bit irritating that neither Alteryx nor MySQL reports that there's a problem; the records are just silently ignored.
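For anyone hitting the same thing: one way to narrow down which character set the MySQL side expects is to ask the server directly. A quick sketch (Python with pyodbc; the DSN and table name are placeholders):

```python
import pyodbc

# Ask MySQL which character sets it is actually using, so the Alteryx code
# page can be matched to it (DSN name is a placeholder)
conn = pyodbc.connect("DSN=mysql_test")
cur = conn.cursor()

cur.execute("SHOW VARIABLES LIKE 'character_set%'")
for name, value in cur.fetchall():
    print(name, "=", value)        # e.g. character_set_server = latin1 or utf8mb4

# Character set declared on a specific table (table name is a placeholder)
cur.execute("SHOW CREATE TABLE test_table")
print(cur.fetchone()[1])
```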