Good morning, I have the following workflow:
What it does is make an initial connection to an SFTP folder that contains a bunch of files. The tools that follow then parse the dates out of the file names and filter down to the latest file I need for a particular client. The problem I have after this is storing it dynamically in an AWS S3 bucket. I want to schedule the workflow to pull the latest file automatically every day, without any human intervention. However, after doing a lot of research, I only see options where others have used a macro to get the filename from a Question and feed it into the Object section of the S3 tool. What I'm hoping for is a way to pass the dynamic name into the Object section of the Amazon S3 Upload tool from one of the Formula tools already in the workflow, so the file is saved dynamically. Do you know of a way of doing this without having to use a macro that asks a question?
Thanks for your input.
You will have to create a batch macro that uses a Control Parameter to update the Object. The Object name from your Formula tool would be fed into the Control Parameter, which would then update the Amazon S3 Upload tool inside the macro, and the macro would output the data back into the workflow.
This post isn't quite what you are trying to do, but the setup of the macro is very similar.
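For reference, the operation the macro ends up performing is just an upload whose object key is built at run time. Outside of Alteryx, a rough equivalent in Python with boto3 would look like the sketch below; the bucket name, key prefix, and local path are placeholders, not values from your workflow.

import datetime
import boto3

# Placeholder values - swap in your own bucket, prefix, and local file.
BUCKET = "my-client-bucket"
LOCAL_FILE = r"C:\temp\client_latest.csv"

# Build the object key dynamically (here with today's date), the same way
# the Formula tool builds the Object name upstream in the workflow.
today = datetime.date.today().strftime("%Y-%m-%d")
object_key = f"client_exports/client_latest_{today}.csv"

# boto3 resolves credentials from the standard AWS chain: environment
# variables, a shared credentials file, or an attached IAM role.
s3 = boto3.client("s3")
s3.upload_file(LOCAL_FILE, BUCKET, object_key)
print(f"Uploaded {LOCAL_FILE} to s3://{BUCKET}/{object_key}")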
Thanks Dan! That was what I was looking for, and I got it to work.
Hi kuoshihyang,
I'm facing the same challenge now in creating Amazon S3 uploads with a timestamp. I tried every example I could find on the Alteryx Community but couldn't get them to work. Any help would be appreciated.
In the Action tool I've used DateTimeToday() to update the file name value while uploading the output to the S3 bucket. Unfortunately, it still uses the Object name that I entered when configuring the Amazon S3 Upload tool.
I'm experiencing the same problem. What's happening for me is that I've got this loaded into a Gallery application.
When I run it manually on the Gallery, the Action tool will work properly using the "Update Value with Formula" selection to add the timestamp.
However, when I schedule the workflow to run automatically, it bypasses the "Update Value with Formula" action and just uploads with the placeholder object name I have in the S3 tool, so if it runs over and over it keeps overwriting the same file.
I will be experimenting with this today and will respond if I fix it.
I created this exact setup and it successfully created the file with the name that I want.
However, when I use the macro it is painfully slow; it just keeps running, even for a small amount of data.
When I use the connector directly, without the macro, it runs very fast. Is there some setting in the macro that I need to adjust? Again, I have it set up exactly as you outlined above.
I tried setting up the properties in the tools as described, but it doesn't work. The S3 object gets created with the same name that is in the S3 Upload tool. Can someone provide a working Alteryx workflow? Maybe I am missing some important property.
Hi, I hope that you are well.
I have been facing the same issue for several months and could not find any solution for uploading a bunch of files to AWS S3 and then scheduling the workflow on Alteryx Server. If you have any workflow for this issue or any guidance, I will be really grateful.
Please feel free to contact me if you have any information.
Kind regards, Sina
Hi, how are you doing?
I saw your post in the Community (This post) about how to configure an AWS S3 upload via Alteryx. Do you know how to set it up on Alteryx Server, or do you have any documentation for Alteryx Server in this regard?
Actually, I tried to transfer files from a file share to an S3 bucket. I could not do it with the Alteryx built-in tools, so I wrote Python code inside the workflow, which requires AWS credentials and the AWS CLI. It works fine on Windows, but on the Server it does not.
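If it helps, here is a minimal sketch of that transfer done with boto3 alone, so the Server machine does not need the AWS CLI installed; the share path, bucket, prefix, and credential setup are assumptions you would need to adapt to your environment.

from pathlib import Path
import boto3

# Placeholder values - adjust to your environment.
SHARE_PATH = Path(r"\\fileshare\exports")  # UNC path to the file share
BUCKET = "my-target-bucket"
PREFIX = "exports/"

# boto3 does not need the AWS CLI; credentials can come from environment
# variables, a shared credentials file, or an IAM role on the machine.
s3 = boto3.client("s3")

# Upload every file in the share folder, keeping the file name as the key.
for file_path in SHARE_PATH.glob("*"):
    if file_path.is_file():
        key = PREFIX + file_path.name
        s3.upload_file(str(file_path), BUCKET, key)
        print(f"Uploaded {file_path} -> s3://{BUCKET}/{key}")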
I know that you are really busy with your commitments; however, if you have any information, I will be genuinely grateful, as I have been blocked on this for a long time.
@sinarazi you're replying to a post which is 5 years old. Start a new one. Is your Server running on an EC2 instance? If so, grant the instance access to S3 via an IAM role and use the AWS CLI with that role.
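To illustrate that suggestion: with an IAM role attached to the EC2 instance, no credential file or access key is needed at all, because both the AWS CLI and boto3 pick the role up automatically from the instance metadata. A minimal sketch, assuming the role grants s3:PutObject on the target bucket (the file path, bucket, and key are placeholders):

import boto3

# No access keys are passed on purpose: on an EC2 instance with an IAM role
# attached, boto3 obtains temporary credentials from the instance metadata.
s3 = boto3.client("s3")

# Placeholder file, bucket, and key - the role must allow s3:PutObject here.
s3.upload_file(r"C:\temp\export.csv", "my-target-bucket", "exports/export.csv")

The AWS CLI equivalent, aws s3 cp C:\temp\export.csv s3://my-target-bucket/exports/export.csv, picks up the same role in the same way.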