
Alteryx Machine Learning Discussions

Find answers, ask questions, and share expertise about Alteryx Machine Learning.
SOLVED

For Each Loop For A Prediction Model

AytekinUz
5 - Atom

Is there a way to create a for-each loop for a time series model? For instance: filter the transactions for each product, send each product's transactions to a time series model, and finally combine the predictions for all products.

4 REPLIES
c-lopez
Alteryx

Hello,

You absolutely could. The answer, as always, depends on which tools you are using, but I put together a quick example for you that illustrates the point using the R tools. If you are using Alteryx Machine Learning in the cloud, the same concept applies, but rather than the TS Predict tool you would use the ML Predict tool.

To accomplish what you are describing while using the same model for the different categories, there are a couple of ways to tackle it:
1. You could filter out each dataset and essentially create a new path for each category, with its own score tool on each path. This is the easiest way, but it can get convoluted and complex very fast.
2. A better way is to use a batch macro, with each category as the variable in the control parameter.
I have attached a working example - not the most efficient approach by any means, but it proves the concept. In the example there are two categories, and the output of the batch macro is the two charts stacked one on top of the other.


wf.png

mc.png
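For anyone more comfortable reading code than a workflow screenshot, the batch-macro pattern above can be sketched in Python: split the data by product, fit a model on each product's own history, then stack the per-product forecasts back together. This is only an analogue of the visual workflow, and the "model" here is a trivial mean forecast standing in for the TS Predict / ML Predict step.

```python
import pandas as pd

def forecast_per_product(df, horizon=3):
    """For each product, fit a model on its own history and forecast ahead."""
    results = []
    for product, history in df.groupby("product"):
        # Stand-in model: forecast the mean of the product's sales.
        # In Alteryx this is where the per-batch TS/ML Predict step runs.
        level = history["sales"].mean()
        forecast = pd.DataFrame({
            "product": product,
            "step": range(1, horizon + 1),
            "prediction": level,
        })
        results.append(forecast)
    # Union all per-product predictions, like the macro's stacked output.
    return pd.concat(results, ignore_index=True)

# Toy data with two categories, mirroring the attached example.
sales = pd.DataFrame({
    "product": ["A", "A", "A", "B", "B", "B"],
    "sales":   [10, 12, 14, 100, 110, 120],
})
print(forecast_per_product(sales))
```

The `groupby` loop plays the role of the control parameter: each iteration sees only one category's rows, exactly like one pass through the batch macro.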



fesrimo
8 - Asteroid

Also, it would be good to evaluate, for each category of products, whether a different time series model achieves better accuracy.
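One way to act on this suggestion: hold out the last few periods of each product's history, score a few candidate models on that holdout, and keep the most accurate per product. A minimal sketch, where the two candidate "models" (last value vs. mean) are illustrative stand-ins for real time series models:

```python
import numpy as np

# Hypothetical candidate forecasters; swap in real models per category.
CANDIDATES = {
    "naive_last": lambda train: train[-1],
    "mean":       lambda train: np.mean(train),
}

def best_model_per_product(series_by_product, holdout=2):
    """Return, per product, the candidate with the lowest holdout MAE."""
    winners = {}
    for product, series in series_by_product.items():
        train, test = series[:-holdout], series[-holdout:]
        scores = {
            name: np.mean(np.abs(np.array(test) - fit(train)))
            for name, fit in CANDIDATES.items()
        }
        winners[product] = min(scores, key=scores.get)
    return winners

data = {
    "A": [5, 15, 5, 15, 10],  # noisy series: the mean forecast wins
    "B": [1, 2, 3, 4, 5],     # trending series: the last value wins
}
print(best_model_per_product(data))
```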

KGT
11 - Bolide

Take a look at the TS Model Factory. Disclaimer: I haven't used it in several years, but it is what you're after.


It will have a similar effect to batching but run much quicker. These results will help inform your batches as well. One of the most common issues when people start down the simple batching route is individual batches not having enough data to really predict, so it may make sense to "combine batches".
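The "combine batches" idea can be sketched simply: categories whose history falls below some minimum row count get pooled into a shared batch, while the rest keep their own. The threshold and the "OTHER" pool name here are assumptions for illustration.

```python
from collections import Counter

MIN_ROWS = 24  # assumed minimum history (e.g. 24 months) per batch

def assign_batches(category_counts, min_rows=MIN_ROWS):
    """Map each category to its own batch, or to a pooled 'OTHER' batch."""
    return {
        cat: cat if n >= min_rows else "OTHER"
        for cat, n in category_counts.items()
    }

counts = Counter({"widgets": 60, "gadgets": 36, "gizmos": 5, "doodads": 8})
print(assign_batches(counts))
# gizmos and doodads are pooled into one batch; the rest stay separate
```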


https://community.alteryx.com/t5/Community-Gallery/TS-Factory-Sample/ta-p/878594


cagobok983
5 - Atom
from transformers import FalconLLMForConditionalGeneration, FalconLLMTokenizer

# Step 1: Preprocess and clean the text (simplified)
def preprocess_text(text):
    # Remove any unwanted characters or noise
    cleaned_text = text.replace('\n', ' ').strip()
    return cleaned_text

# Step 2: Use Falcon LLM model to generate summary
def generate_summary(text):
    # Initialize Falcon LLM tokenizer and model
    tokenizer = FalconLLMTokenizer.from_pretrained("openai/falcon-llm-large")
    model = FalconLLMForConditionalGeneration.from_pretrained("openai/falcon-llm-large")
    # Tokenize input text
    inputs = tokenizer(text, return_tensors="pt", max_length=1024, truncation=True)
    # Generate summary
    summary_ids = model.generate(inputs.input_ids, max_length=150, min_length=40,
                                 length_penalty=2.0, num_beams=4, early_stopping=True)
    # Decode summary
    summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
    return summary

if __name__ == "__main__":
    # Input text of the public domain book
    book_text = """ YOUR_TEXT_HERE """
    # Preprocess text
    cleaned_text = preprocess_text(book_text)
    # Generate summary
    summary = generate_summary(cleaned_text)
    # Print summary
    print("Summary:")
    print(summary)