Alteryx Designer Knowledge Base

Definitive answers from Designer experts.

Error: "SQLPrepare: [Simba][Hardy] (80) Syntax or semantic analysis error thrown in server while executing query: Invalid column reference"


Environment Details

User is attempting to read data from Hadoop Hive and getting the following error: 
Input Data (1) Error SQLPrepare: [Simba][Hardy] (80) Syntax or semantic analysis error thrown in server while executing query. Error message from server: Error while compiling statement: FAILED: SemanticException [Error 10002]: Line 1:33 Invalid column reference 'a.storenum' 

Alternatively, user is attempting to write to Hadoop Hive and getting the following error: 

Output Data (2)    Unable to find field "test" in output record.


  • Alteryx Designer
    • All versions
  • Simba Hive ODBC Driver with DSN configured
    • Version 2.6 or greater


In the Visual Query Builder, the column names are prefixed with the table alias 'a': 


In the Output tool, you can see the table name precede the column name in the field map:



The resolutions below apply only if the error appears together with this change to the column names. 


Simba introduced a new feature in their driver that prefixes each column name with the table name (or alias) to ensure that all column names are unique when reading in data. As a result, Alteryx can no longer recognize the column names. It is a hidden feature that, if the user can't edit the query to avoid the error, must be turned off for each DSN (the setting is stored in the Windows Registry).  
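To illustrate the renaming (the column names here are hypothetical, not from a real table), the driver returns alias-qualified names such as 'a.storenum', which downstream tools no longer match against the plain names. A minimal sketch of stripping the qualifier back off:

```python
def strip_table_prefix(columns):
    """Drop a leading 'alias.' qualifier, e.g. 'a.storenum' -> 'storenum'.

    This mimics undoing the driver's unique-column-name feature; the
    column names below are hypothetical examples.
    """
    return [name.split(".", 1)[1] if "." in name else name for name in columns]

# The driver returns alias-qualified names like these:
renamed = ["a.storenum", "a.sales", "a.city"]
print(strip_table_prefix(renamed))  # ['storenum', 'sales', 'city']
```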


Solution A: Use Select * [...] in the query

1. In the Alteryx workflow, open the Input tool that is throwing the error.
2. Go to the SQL Editor tab.
3. Remove references to individual columns from the query and use the wildcard asterisk (*) instead to read in all columns in the table.

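As a sketch of step 3 (table and column names are hypothetical), the explicit column references are what trip the driver's renaming; the wildcard avoids naming any column:

```python
# Hypothetical failing query: the driver has renamed the columns to
# 'a.storenum', 'a.sales', ..., so these references no longer resolve.
failing_query = "SELECT a.storenum, a.sales, a.city FROM mydb.stores a"

# Replacing the column list with a wildcard reads every column without
# referencing any of them by name, so the renaming can't cause a mismatch.
working_query = "SELECT * FROM mydb.stores"
```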

Solution B: Add a Server Side Property

  1. Go to the Windows ODBC Data Source Administrator.
  2. Open the DSN that is being used to connect for editing.
  3. Click on Advanced Options.
  4. Click on Server Side Properties.
  5. Add the property and set it to false.
  6. Hit OK on all the windows to save the changes.

This is not a global setting; it must be set individually for each DSN. 

8 - Asteroid

Thanks for the solution. I went for the second option, and it solved my problem immediately.

Before I understood this behaviour and changed the setting, I had to remove the prefix from every table I brought in.


Hi @steven4320555 

So glad this helped!! 

8 - Asteroid

Hi Henriette,


I'm having a similar problem. I'm trying to write a simple table to Hadoop. The table is getting created fine but the content of the data stream is being considered as "?".


To greatly simplify the situation, I've chosen only 10 records and one field. The error message shown in the second graphic appeared when I tried to write the first five fields. The same error happens whether I'm writing one field or all of them. 


Do you have any insights? I've used this style of workflow in the past to write to Hadoop without any trouble.







10 records and I'm only trying to write the first column

The error message is similar to the one shown in this thread


Hi @Ken_Black 

The (?,?,?,?,?) are placeholders for the data, which is sent in batches for faster loading, so it is OK that they are showing up. 
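For readers unfamiliar with the notation, '?' placeholders are how parameterized batch inserts work across SQL APIs. A minimal sketch using Python's built-in sqlite3 module (table and values are hypothetical; sqlite3 stands in here for the ODBC driver):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stores (storenum INTEGER, sales REAL)")

rows = [(1, 100.0), (2, 250.5), (3, 75.25)]
# Each '?' is a placeholder; the driver binds one row's values per
# placeholder set and ships the rows as a batch -- the same mechanism
# behind the (?,?,?,?,?) seen in the Hive write query.
conn.executemany("INSERT INTO stores (storenum, sales) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM stores").fetchone()[0]
print(count)  # 3
```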


I would start by pulling some ODBC logs (see here for instructions) and see if you can find something in there. If not, you can reach out to Support to have them help you. 


Another thing you could check is the Advanced Options in the driver: we recommend not checking the "Use Native Query" option. Unchecking that option allows the driver to convert the query to HiveQL; otherwise the driver assumes the query is already in HiveQL, and you might get errors. 


I ran into that error message in a different context, but what I did to resolve it was to add another Server Side Property: