AWS S3 upload error
I am getting this error message when I'm running a workflow that is uploading information to an AWS S3 bucket:
Error: Amazon S3 Upload (130): Error from AWS: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.
I have run this workflow in the past without error; this only started happening recently. I don't believe it's a network or AWS issue, as I've checked both of those possibilities.
Does anyone have experience dealing with this error?
A little unsure - maybe it's a network issue?
Assuming this is the regular Amazon S3 Upload Tool (alteryx.com), have you tried the AWS tools offered by Aimpoint Digital on the Marketplace? AWS Tools by Aimpoint Digital | Alteryx Marketplace
It has completed when our network has been slower, so I don't think that's the issue. It is the normal S3 upload tool; I have not looked at the other tools before.
What's the file size? Can you authenticate into the AWS CLI locally, and if so, can you upload with an aws s3 cp {{filename}} {s3 bucket} command?
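If the CLI route is awkward, a rough Python equivalent with boto3 (not from this thread, and assuming credentials are already configured on the machine; the bucket and file names below are placeholders) would be:
import boto3
# Confirm the local credentials resolve to a real identity.
sts = boto3.client("sts")
print(sts.get_caller_identity()["Arn"])
# Then try a small test upload; bucket and key are placeholders.
s3 = boto3.client("s3")
s3.upload_file("test.csv", "my-bucket", "test/test.csv")
print("test upload ok")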
The file size is 8 GB. We can upload on the machine and locally through the tool with no issues. It doesn't fail all the time, just occasionally on bigger files.
This is probably a network timeout issue then, because of the file size. Any chance you can partition the file and upload in batches?
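If you end up scripting that, a minimal boto3 sketch of a chunked (multipart) upload could look like the following; the bucket name, paths, and part size are placeholders, not values from this workflow:
import boto3
from boto3.s3.transfer import TransferConfig
# Upload in 64 MB parts with a few parallel connections, so no single
# connection sits idle for the whole 8 GB transfer.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=4,
)
s3 = boto3.client("s3")
s3.upload_file("big_extract.csv", "my-bucket", "exports/big_extract.csv", Config=config)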
I looked into the new S3 tools, and can't get the configuration to work. I've filled in the AWS access key ID and secret access key, and the file paths, but I'm getting an error: "No module named 'boto3'". I've attached a picture of the base S3 tool since I don't want to post a picture of the keys, but I have filled in all the blank spaces and it isn't uploading. Do you know what configuration element I'm missing?
Run as admin. If that doesn't work, open a new workflow as admin, add a Text Input tool, add a Python tool, and in the Python tool add the following:
# installs the boto3 package into the Alteryx Python environment
from ayx import Package
Package.installPackages(['boto3'])
Run it, then re-run your S3 tool.
You need to download the boto3 Python package. @alexnajm - I'd recommend a different implementation here.
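Once the install succeeds, a quick way to confirm it from the same Python tool is a sketch like this (the bucket and file path are placeholders, and boto3 uses whatever AWS credentials are configured on the machine):
import boto3
# Confirms the package is now available in the Alteryx Python environment.
print(boto3.__version__)
# Optional: a small test upload; bucket and paths are placeholders.
s3 = boto3.client("s3")
s3.upload_file(r"C:\temp\output.csv", "my-bucket", "exports/output.csv")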
@apathetichell I trust your opinion, just offering up alternative solutions in case it alleviates the issue! I agree batches may be the way to go.
Tagging in @PhilipMannering, who I think worked on building these Marketplace tools, in case there's anything else I may be missing.
Uploading in batches isn't an option for me, but thank you for the options and help. I will try this solution, and look forward to the help on the S3 tool.
