

Redshift bulk uploader error

aatalai
15 - Aurora

Hi all,

 

I'm trying to use the Redshift bulk uploader with my In-DB workflow (Data Stream In tool) and get this error when trying to create a temp table:

Error: Data Stream In (273): Error creating table "AYX2508292baf7b59a6ef0e4043763c29059daec4": [Redshift][ODBC Driver][Server]0A000:error: External table names must be qualified by an external schema

CREATE EXTERNAL TABLE "AYX2508292baf7b59a6ef0e4043763c29059daec4" ("Account" varchar(40),"Attribute" varchar(30),"Attribute_Value" varchar(100))ROW FORMAT DELIMITED
FIELDS TERMINATED BY '' LOCATION 's3://(URL removed)'

 

I get this error when trying to write to a specific table:

 

Error: Output Data (268): Error creating table ""dev"."public"."dim"": [Redshift][ODBC Driver][Server]0A000:error:  External table names must be qualified by an external schema
 
CREATE EXTERNAL TABLE "dev"."public"."dim" ("Account" varchar(40),"Attribute" varchar(30),"Attribute_Value" varchar(100))ROW FORMAT DELIMITED
FIELDS TERMINATED BY '' LOCATION 's3:/(URL removed)'

 

It works fine when I don't use the bulk uploader.

 

[Screenshot attached: Screenshot 2025-08-29 085218.png]

apathetichell
20 - Arcturus

Is this Redshift Spectrum? Alteryx notes:

To create Spectrum tables with the Output Data tool, specify both the schema and table name.

spectrum_schema.tablename

 

The error makes it seem like you need a two-part identifier for your new table. You should be able to see your schema on your Input Data/Connect In-DB tool (if you query one row and then use Dynamic Output In-DB to retrieve your query, you should see the internal format that your Redshift is using).
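If the target really is a Spectrum (external) table, the generated DDL has to be qualified by an external schema rather than by dev.public. A minimal sketch of what that could look like; the schema name, Glue database, IAM role ARN, and S3 path below are placeholders, not values from this thread:

-- Check which external (Spectrum) schemas already exist,
-- so the table name in Output Data can be prefixed with one of them:
SELECT schemaname, databasename
FROM svv_external_schemas;

-- If none exists yet, one has to be created first
-- (role ARN and Glue database are placeholders):
CREATE EXTERNAL SCHEMA spectrum_schema
FROM DATA CATALOG
DATABASE 'spectrum_db'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-spectrum-role'
CREATE EXTERNAL DATABASE IF NOT EXISTS;

-- The DDL from the error, but qualified by that external schema,
-- which is what "must be qualified by an external schema" is asking for:
CREATE EXTERNAL TABLE spectrum_schema.dim (
    "Account"         varchar(40),
    "Attribute"       varchar(30),
    "Attribute_Value" varchar(100)
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 's3://my-bucket/my-prefix/';

In the Output Data tool, the table name would then be entered as spectrum_schema.dim rather than dim or public.dim.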

 

I'd also check whether the data is being written to S3. I'm not sure how the implementation works on the Alteryx side, but it's quite possible that it writes to S3 first and maps the data into Redshift second, and that your error is occurring on the mapping step.
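For a regular (internal) Redshift table, that second step is typically a COPY from the staged S3 files. A rough sketch of what that looks like, with the table, bucket, prefix, and IAM role as placeholders:

-- COPY only loads regular Redshift tables; it cannot target an external table.
COPY public.dim
FROM 's3://my-bucket/alteryx-staging/dim/'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-copy-role'
CSV
IGNOREHEADER 1;

Since your errors show CREATE EXTERNAL TABLE statements instead, the write is being treated as a Spectrum/external table rather than going through this COPY path.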
