
SOLVED

AWS KMS Error While Uploading File to S3

pravinpande
6 - Meteoroid

I'm getting the below error when uploading files from SP to S3. Can someone help?

 

Error: Output Data (12): The COPY failed with error: Error from AWS: Requests specifying Server Side Encryption with AWS KMS managed keys require AWS Signature Version 4.

 

I ran the command below, but it doesn't help.

aws configure set default.s3.signature_version s3v4

 

3 REPLIES
Treyson
13 - Pulsar

Hello @pravinpande 

 

I hate to say this, but I think this is an AWS-specific error. Alteryx is simply returning whatever problem your AWS instance is having.

 

A quick Google leads me to this thread. It seems it's a problem with your signature version. I am not an AWS developer, but if you have one in your organization, it may be good to contact them, since I believe this is a security issue and I wouldn't feel right telling you to bypass anything your team has set in place.

 

https://forums.aws.amazon.com/thread.jspa?threadID=165286
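For reference, in case your AWS folks confirm it's the signature version: here is a minimal sketch of what forcing Signature Version 4 together with KMS server-side encryption looks like at the API level, assuming boto3 (e.g. from a Python tool) and treating the region, bucket, and file names as placeholders.

import boto3
from botocore.config import Config

# SSE with KMS-managed keys requires Signature Version 4, so pin it on the client.
s3 = boto3.client(
    "s3",
    region_name="us-east-1",  # placeholder: use your bucket's region
    config=Config(signature_version="s3v4"),
)

# Upload with KMS-managed server-side encryption (placeholder paths and names).
s3.upload_file(
    Filename=r"C:\data\myfile.csv",
    Bucket="my-bucket",
    Key="staging/myfile.csv",
    ExtraArgs={"ServerSideEncryption": "aws:kms"},
)

Whether that maps to a setting you can reach from the Output Data tool is something your AWS or Alteryx admins would know better than I would.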

Treyson Marks
Senior Analytics Engineer
pravinpande
6 - Meteoroid

I think I fixed it. I had to change the server-side encryption setting.
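For anyone who hits the same error later: I'm not claiming this is exactly what the Alteryx setting does under the hood, but at the S3 API level the difference between the encryption modes looks roughly like this (a boto3 sketch with placeholder names).

import boto3

s3 = boto3.client("s3")  # assumes credentials and region are already configured

# SSE with S3-managed keys (AES256) does not need Signature Version 4...
s3.put_object(Bucket="my-bucket", Key="test.csv", Body=b"hello",
              ServerSideEncryption="AES256")

# ...whereas SSE with KMS-managed keys does, which is what the error is about.
s3.put_object(Bucket="my-bucket", Key="test.csv", Body=b"hello",
              ServerSideEncryption="aws:kms")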

A totally different question, though, for the same mapping.

I am trying to load data from the most recent flat file into S3. I can find the most recent file with the Sample tool, but to load it I need to read it first. I am trying to use the Dynamic Input tool, but it needs an exact path and file name. I have the path, but the file name might change in the source; it will have the year and month as a prefix. Do you know how I can read it and load it to S3?


Treyson
13 - Pulsar

Let me see if I am understanding your question.

 

1) You want to upload the most recent file in a folder to your S3 instance.

2) To do this, you need Alteryx to pull the contents of that file and then write a brand new file to S3, potentially having the same name.

3) You want to use the directory tool to send the name of the most recent file into the dynamic input tool to query.

4) Do you want to name the file output to S3 as the same thing as it was in your local folder?

 

If I were building this process, there are a few steps I would take.

 

1) I would only want to load the newest file if I haven't previously loaded that file. This can be handled in a variety of ways; whichever you decide on will probably require a little bit of command-line knowledge. It's nothing crazy, but it is a bit outside the scope of what I would call a traditional Alteryx user's day-to-day.

      a) Use the S3 command line to query the files that currently exist on the S3 instance, check them against the files in your repository, and have the Dynamic Input upload any files that aren't already up there (there's a rough sketch of that check after this list). This would also let you query dynamically from there in the future if you used the S3 downloader in a batch macro and had it update the file you want to pull down. This article, written a few years ago, saved my bacon on a project.

 

      b) You could write a batch file that moves anything you've uploaded into another folder after the workflow run completes. That way you aren't deleting any local files, but you also won't potentially re-upload them.

 

2) If you want to avoid that and will always run this process manually, I would just build a process that parses out the names of the files in that folder, grabs the date piece to find the most recent one, and then sends only that record to the Dynamic Input (see the sketch right after this list). To dig into that parsing problem, I would need to see exactly what a few of the file names look like.
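To make 2) (and the S3 check from 1a) a little more concrete, here's a rough sketch of the logic in Python. I'm guessing at a file name pattern like 202001_sales.csv since I haven't seen your actual names, so treat the pattern, folder, bucket, and prefix as placeholders.

import os
import re
import boto3

source_dir = r"C:\data\incoming"   # placeholder: your local folder
bucket = "my-bucket"               # placeholder: your S3 bucket
prefix = "staging/"                # placeholder: target "folder" in S3

# Find the most recent local file, assuming names start with a YYYYMM prefix.
pattern = re.compile(r"^(\d{6})_.*\.csv$")   # e.g. 202001_sales.csv
candidates = [f for f in os.listdir(source_dir) if pattern.match(f)]
newest = max(candidates, key=lambda f: pattern.match(f).group(1))

# Only upload it if it isn't already out on S3 (the check from 1a).
s3 = boto3.client("s3")
response = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
already_there = {obj["Key"] for obj in response.get("Contents", [])}

if prefix + newest not in already_there:
    s3.upload_file(os.path.join(source_dir, newest), bucket, prefix + newest)

In Alteryx terms, the date-parsing half is basically a Directory tool feeding a sort/sample into the Dynamic Input; the S3 existence check is the part that needs the command line or a Python tool.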

 

Reading through this, I said I had a few steps, but I totally got side-tracked trying to find that link above. Let me know if this makes sense and how you want to proceed.

 

Treyson Marks
Senior Analytics Engineer