I'm using the Aimpoint Amazon S3 list tool to bring in multiple objects within an S3 bucket and am getting an access denied error. I know it's not my credentials or the file name, because when I test bringing in one file with the built-in S3 tool it works fine.
I'm wondering if the configuration syntax is different for the Aimpoint S3 tool and there's something I'm missing, because I know it's not access.
I've attached a picture of the configuration for both tools. In the bucket name box of the built-in tool I put my bucket name including the full path:
*****/*******/bucketname
then in the object name box I just put the object name without a path, and that runs fine.
However, the Aimpoint S3 tool doesn't allow you to put in the full path, so I put in just the name of the bucket,
and in the object name I put the object name, the same as I do in the other tool. I'm wondering if the issue is the syntax for the "object file path" box; everything works in the built-in S3 tool, so I know it's not my access.
Does anyone know what I can do to fix this?
Tagging a few Aimpoint folks to see what we can do to help - @PhilipMannering @BenMoss @Joe_Lipski
@AbdulBalogun have you got this working? Can you confirm what you mean by the full path "*****/*******/bucketname"?
I don't understand what you would be trying to put in front of the bucket name.
If you mean you want to look in a folder, then the folder should be in the object path and the bucket name should be just the bucket name.
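To illustrate, assuming the tool wraps boto3 underneath (the bucket and prefix names below are just placeholders), listing a "folder" is really just listing keys under a prefix:

import boto3

s3 = boto3.client("s3")

# Bucket name only, with no path in front of it; the "folder" goes into the key prefix.
response = s3.list_objects_v2(
    Bucket="my-bucket",
    Prefix="myfolder/subfolder/"
)

for obj in response.get("Contents", []):
    print(obj["Key"])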
Ben
@BenMoss - do you need both Object Name and Object Path? Isn't it Bucket/Object, where subfolders tie into the Object and the Bucket is replicated in the ARN?
Hi, I think this is my problem. The S3 connector works fine, but the file type is wrong (.gz), so I'm trying to use the Aimpoint Digital tool.
It strikes me that I haven't had to set an endpoint URL for the Aimpoint tool, which could be the problem; I'd need to connect to eu-west-2.
The tool is locked so I can't see if I can edit that.
I think there is some confusion here between the object URL (e.g. https://{{bucketname}}.s3.us-west-2.amazonaws.com/{{object}}) and the S3 URI (s3://{{bucketname}}/{{object}}). Using the boto3 package, I do not believe you need the object URL; the S3 URI (which ties in to the ARN) is sufficient for access.
Bucket names are globally unique, so I do not believe you need to specify the region for boto3 with an AWS bucket.
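A minimal sketch of that with boto3 (the bucket and key are placeholders): the client only needs the bucket name and the key from the S3 URI, not the full object URL or a region.

import boto3

s3 = boto3.client("s3")  # no region or endpoint URL set for a standard AWS bucket

# s3://my-bucket/folder/subfolder/file.csv.gz  ->  Bucket + Key
obj = s3.get_object(
    Bucket="my-bucket",
    Key="folder/subfolder/file.csv.gz"
)
data = obj["Body"].read()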
To update!
I had to whitelist the generic endpoint, then whitelist the regional endpoint that incorporates s3.eu-west-2.amazonaws.com, and it's going through! I can list files out fine.
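For anyone else hitting this, a rough sketch of pinning the client to that regional endpoint, assuming boto3 underneath (the bucket name is a placeholder and the endpoint is just the standard eu-west-2 one):

import boto3

s3 = boto3.client(
    "s3",
    region_name="eu-west-2",
    endpoint_url="https://s3.eu-west-2.amazonaws.com",  # the regional endpoint I had to whitelist
)

response = s3.list_objects_v2(Bucket="my-bucket")
for obj in response.get("Contents", []):
    print(obj["Key"])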
I'm currently troubleshooting a WinError 123 when I try to download a file. The error does not occur if the file sits directly in the bucket rather than under a path:
i.e. filename.csv.gz works, but folder/folder/folder/filename.csv.gz does not.
There are many special characters (. - : =) in the folder path and file name. Could that be it? How does the Aimpoint Digital tool handle special characters if I send the file path in?
Any clues?
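For reference, a minimal sketch of the kind of sanitising I'd expect to need, assuming the download goes through boto3's download_file and the local file name is derived from the object key (the bucket and key below are made up):

import re
import boto3

s3 = boto3.client("s3")

bucket = "my-bucket"
key = "folder.a/sub-folder/run=2023-01-01T00:00/filename.csv.gz"  # made-up key with . - : = characters

# ':' is not allowed in a Windows file name, which is a likely trigger for WinError 123
# ("The filename, directory name, or volume label syntax is incorrect"), so flatten the
# key into a single safe local name before writing it out.
safe_name = re.sub(r'[<>:"/\\|?*]', "_", key)

s3.download_file(bucket, key, safe_name)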