Imagine this: you’ve got $1000 to invest every two weeks. You’re an early-stage investor, curious about the stock market but not keen on clunky spreadsheets or manual tracking. You want data. You want automation. You want to make smart decisions, fast.
That’s where this journey begins — using Alteryx Designer Cloud (on the Alteryx One platform) to build a fully automated, API-driven stock analysis system. In this blog, we’ll move from simple API configuration to dynamic workflows and finish with a fully automated investment tracker triggered via an API call.
🚀 Part 1: EASY — Getting Started with a Single API Call
To kick things off, I retrieved historical stock prices for Apple (AAPL) using the Financial Modeling Prep (FMP) API. The idea was to lay the groundwork: just one API call, just one dataset.
After grabbing an API key from the FMP website, I created a REST API connector in Alteryx Designer Cloud. The setup asked me for a name, base URL, and a few authentication details. I added the endpoint for historical prices, keyed in 'AAPL' as the stock symbol, and just like that, I had a dataset streaming in with Apple’s past performance.
It felt like plugging into a financial data firehose — and I hadn’t even started analyzing yet!



You will need to get the apikey value from the FMP site. Make sure to test your connection settings.
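Outside of Designer Cloud, the same single call can be sketched in a few lines of Python. The base URL and the `historical-price-full` path below are assumptions based on FMP's public documentation (the key travels as the `apikey` query parameter, just like in the connector setup), so double-check them against your account before relying on this.

```python
import json
import urllib.request

# Assumed FMP v3 base URL; verify against the FMP docs for your plan.
FMP_BASE = "https://financialmodelingprep.com/api/v3"

def build_historical_url(symbol: str, api_key: str) -> str:
    # historical-price-full is FMP's daily-history route; the key rides
    # along as the apikey query parameter, mirroring the connector fields.
    return f"{FMP_BASE}/historical-price-full/{symbol}?apikey={api_key}"

def fetch_historical(symbol: str, api_key: str) -> dict:
    # One GET, one JSON payload of past prices for the ticker.
    url = build_historical_url(symbol, api_key)
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)
```

Calling `fetch_historical("AAPL", "<your key>")` streams in the same dataset the REST API connector produced.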
⚙️ Part 2: INTERMEDIATE — Dynamic Workflow with Multiple HTTP GET Calls
Now that I had one stock working, I wanted more. I was tracking 10 companies and didn’t want to repeat the setup manually for each one. So, I built a workflow that could generate a set of HTTP GET requests — one for each stock symbol. It felt like assembling a tiny factory of API calls.
I created a table with my 10 target stock tickers and combined each with the base URL and authentication fields to generate a dataset. This dataset powered an HTTP task within an Alteryx Plan — each row told the HTTP task what to fetch. No extra config, no extra hassle.
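Conceptually, that request-builder dataset looks like the sketch below: one row per ticker, each carrying everything the HTTP task needs to fire a GET. The ticker list and the `YOUR_API_KEY` placeholder are illustrative, not the exact values from my workflow.

```python
# Illustrative ticker list; swap in your own 10 targets.
TICKERS = ["AAPL", "MSFT", "NVDA", "TSLA", "AMZN",
           "GOOGL", "META", "NFLX", "AMD", "INTC"]

BASE_URL = "https://financialmodelingprep.com/api/v3/historical-price-full"

def build_request_rows(tickers, api_key):
    """One row per ticker: the full URL and method the HTTP task needs."""
    return [
        {
            "symbol": t,
            "method": "GET",
            "url": f"{BASE_URL}/{t}?apikey={api_key}",
        }
        for t in tickers
    ]

request_rows = build_request_rows(TICKERS, "YOUR_API_KEY")
```

Each row of `request_rows` plays the role of one record in the published dataset that drives the HTTP task.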


Once the requests were done, I saved the JSON responses to a dataset and used the JSON Parse tool to wrangle the data into a usable table. I could now see how each stock performed day-by-day. Even better? I created logic to count how often the adjusted close was higher than the adjusted open. That became my investment signal — rank the companies, pick the top five, and allocate my $1000 every two weeks.
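For readers who want to see the flattening step outside the JSON Parse tool, here is a minimal sketch. It assumes the FMP response shape of a top-level `symbol` plus a `historical` array of daily records; field names such as `adjClose` are taken from FMP's documented payload, so verify them against a live response.

```python
import json

def flatten_response(payload):
    """Flatten one FMP response into day-level rows, mirroring JSON Parse."""
    symbol = payload.get("symbol", "")
    return [
        {
            "symbol": symbol,
            "date": day.get("date"),
            "open": day.get("open"),
            "adjClose": day.get("adjClose"),
        }
        for day in payload.get("historical", [])
    ]

# Tiny stand-in for one API response (values are made up for illustration).
sample = json.loads(
    '{"symbol": "AAPL", "historical":'
    ' [{"date": "2024-01-02", "open": 187.15, "adjClose": 185.64}]}'
)
rows = flatten_response(sample)
```

The resulting rows are exactly the tabular, day-by-day view the downstream ranking logic consumes.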

🧰 Workflow 1: Stock API Request Builder
This is where it all begins.
I created this workflow to generate a list of HTTP GET requests. Think of it as a factory that takes in a list of stock tickers — Apple, Tesla, Nvidia, etc. — and outputs all the ingredients needed for an API call: URL, headers, method, and any parameters.
This is key because instead of building 10 API requests manually, I build one smart dataset that tells the HTTP Task exactly what to do. This dataset gets published and passed into the Plan. It’s simple, modular, and sets the tone for scalable automation.

🌐 Workflow 2: JSON Parser & Investment Logic
Once the HTTP Task does its job and pulls all the data, I hand it off to Workflow 2.
This one’s all about parsing and decision-making. It takes in the JSON response from the API and uses the JSON Parse tool to flatten the structure into something tabular — easy to manipulate, easy to analyze.
The heart of this workflow? It evaluates how many days each stock had a higher adjusted close than adjusted open, basically measuring bullish momentum. I use this count to rank the stocks and then apply my budget ($1000) to invest evenly in the top 5.
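The ranking-and-allocation logic can be sketched in plain Python. The `adjOpen` and `adjClose` field names mirror the "adjusted open/close" description above and are assumptions; adapt them to whatever columns your parsed dataset actually exposes.

```python
from collections import Counter

BUDGET = 1000.0  # dollars to deploy every two weeks
TOP_N = 5

def rank_and_allocate(day_rows, budget=BUDGET, top_n=TOP_N):
    """Count bullish days per symbol, rank, and split the budget evenly."""
    bullish = Counter()
    for row in day_rows:
        # A "bullish" day: adjusted close finished above adjusted open.
        if row["adjClose"] > row["adjOpen"]:
            bullish[row["symbol"]] += 1
    top = [sym for sym, _ in bullish.most_common(top_n)]
    per_stock = budget / len(top) if top else 0.0
    return {sym: per_stock for sym in top}

# Synthetic three-day histories: each ticker is bullish on n of 3 days.
counts = {"AAPL": 3, "MSFT": 3, "NVDA": 2, "TSLA": 2, "AMZN": 1, "INTC": 0}
demo = [
    {"symbol": sym, "adjOpen": 10.0, "adjClose": 11.0 if i < n else 9.0}
    for sym, n in counts.items()
    for i in range(3)
]
allocation = rank_and_allocate(demo)  # splits $1000 across the top five
```

With six candidates and `TOP_N = 5`, the least bullish ticker drops out and each of the remaining five receives an even $200 slice.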

🔄 PLANS: Orchestration Magic
The Plan is the conductor of this data symphony. It pulls all the pieces together:
- Step 1: Run Workflow 1 to generate all API call parameters (created using Cloud Native Mode)
- Step 2: Pass those parameters to an HTTP Task that performs all 10 API GET requests
- Step 3: Store the response as a dataset
- Step 4: Run Workflow 2, which parses, ranks, and builds your investment decision table (created using Standard Mode)
And best of all? This Plan is fully schedulable: run it every two weeks, every day, or whenever you feel like making a move. Or, like I do now, trigger it remotely using the Alteryx API.

🧠 Part 3: ADVANCED — Automate It All with an API Trigger
Now, I wanted to go hands-free. Every two weeks, I needed this system to run without me clicking anything. The solution? Use the Alteryx Cloud API.
I generated a Personal Access Token (PAT), wrote a small script to call the /jobs endpoint, and passed in the Plan ID of my investment-ranking pipeline. With a single POST request, the whole chain sprang into action: pulling stock data, analyzing performance, and telling me where to invest.
I monitored progress using the job status endpoint, and once done, I had a clean dataset ready for review or upload to my broker. This was automation, decision-making, and peace of mind — all in one.

This API POST command triggers execution of the identified Plan using its Plan ID. In other words, a single API call runs both workflows and the HTTP task that the Plan orchestrates.
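As a rough illustration, the trigger script might look like the sketch below. The host, the /jobs paths, and the payload shape are placeholders, not the official Alteryx Cloud API contract; consult the API reference for your region and substitute the real base URL, endpoints, and Plan ID field name.

```python
import json
import urllib.request

# Hypothetical base URL; replace with the host from the Alteryx API docs.
API_BASE = "https://api.alteryx.com"

def build_trigger_request(plan_id: str, pat: str) -> urllib.request.Request:
    """POST request asking the /jobs endpoint to run the given Plan."""
    body = json.dumps({"planId": plan_id}).encode()  # field name is illustrative
    return urllib.request.Request(
        f"{API_BASE}/jobs",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {pat}",  # PAT sent as a bearer token
            "Content-Type": "application/json",
        },
    )

def build_status_request(job_id: str, pat: str) -> urllib.request.Request:
    """GET request for polling the job status endpoint."""
    return urllib.request.Request(
        f"{API_BASE}/jobs/{job_id}",
        headers={"Authorization": f"Bearer {pat}"},
    )

def trigger_plan(plan_id: str, pat: str) -> dict:
    """Fire the POST and return the API's JSON reply (e.g. the new job ID)."""
    with urllib.request.urlopen(build_trigger_request(plan_id, pat),
                                timeout=30) as resp:
        return json.load(resp)
```

A cron job (or any scheduler) calling `trigger_plan` every two weeks, then polling with `build_status_request`, reproduces the hands-free loop described above.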

📈 Wrapping Up
With just $1000, some curiosity, and the Alteryx One platform, I’d built:
- A scalable way to fetch real-time financial data
- A smart ranking system to evaluate stock performance
- A fully automated pipeline that runs at my command
Whether you’re eyeing Apple, Nvidia, or the next breakout company, Designer Cloud + REST APIs give you the edge to analyze and act — all from the browser.