
Alteryx Designer Desktop Ideas


XGBoost Regression as a predictive macro

XGBoost regression has become a benchmark in Kaggle competitions and seems to consistently outperform random forest, spline regression, and most of the more basic models. For those of us using predictive modeling regularly in our actual work, this tool would allow a quick improvement in model accuracy. And, from a marketing standpoint, having a core group of users competing on Kaggle with Alteryx would be a great way to show off Alteryx's power.

 

It is readily available as an R package: https://cran.r-project.org/web/packages/xgboost/index.html
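For reference, a minimal sketch of the kind of regression the requested macro would wrap, using the CRAN xgboost package directly (the dataset and parameter values here are just illustrative, not the macro's actual defaults):

```r
# Minimal illustrative sketch: XGBoost regression with the CRAN package.
library(xgboost)

X <- as.matrix(mtcars[, -1])   # predictors (example data)
y <- mtcars$mpg                # numeric target -> regression

model <- xgb.train(
  params = list(
    objective = "reg:squarederror",  # squared-error regression
    eta       = 0.1,                 # learning rate
    max_depth = 6
  ),
  data    = xgb.DMatrix(X, label = y),
  nrounds = 100
)

pred <- predict(model, X)            # in-sample predictions
```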

29 Comments
Joe_Lipski
13 - Pulsar

Thanks Brett!

 

Apologies, I missed that part about Intelligence Suite and I completely agree with all of your points here and above.

 

I'm hoping that with the recent developments around Python we will start seeing lots of new Python predictive tools coming soon!

TimothyL
Alteryx Alumni (Retired)

@bkramer66_dup_418 

 

Hey Brett, 

 

Thanks for your loyal support of our platform! I agree with you that Alteryx should have an official XGBoost tool in R. Your input, and everyone's upvotes here, will be important to our product team's decision.

 

Meanwhile, I did a quick comparison between my R xgboost results and the IS XGBoost classification results. As the screenshot below shows, the predictions match to a very high degree. There may be some data types and problem definitions that still need to be examined; after all, the default parameters of the two tools differ, as summarised in the table below:

 

Parameter                        XGBoost in R    IS XGBoost
Learning Rate                    0.1             0.3
No. of Rounds/Trees/Estimators   20              100
Subsample                        0.5             1
Min Child Weight                 1               1
Max Depth                        20              3

 

I intended to expose interface tools only for the Learning Rate and Number of Rounds, to keep the macro simple. I might add the Max Depth parameter as well to match the IS configuration; we'll see.
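To make the table concrete, here is a hypothetical sketch of pinning those parameters explicitly on the R side so the two runs start from comparable settings (the data and objective are placeholders; the IS tool itself is configured through its UI rather than code):

```r
# Hypothetical sketch: the defaults from the table, set explicitly in R.
library(xgboost)

X <- as.matrix(mtcars[, -1])   # placeholder data
y <- mtcars$mpg
dtrain <- xgb.DMatrix(X, label = y)

# Column "XGBoost in R" from the table above
params_r <- list(objective = "reg:squarederror", eta = 0.1,
                 max_depth = 20, subsample = 0.5, min_child_weight = 1)
model_r <- xgb.train(params = params_r, data = dtrain, nrounds = 20)

# Column "IS XGBoost" from the table above
params_is <- list(objective = "reg:squarederror", eta = 0.3,
                  max_depth = 3, subsample = 1, min_child_weight = 1)
model_is <- xgb.train(params = params_is, data = dtrain, nrounds = 100)
```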

 

Lastly, the IS XGBoost is the official product feature while mine is an experiment. If your situation allows, the IS version is definitely the preferred option at this point in time.

 

Hope you have a great and safe day,

 

TL

 

 

[Screenshot: xgboost AM_TL comparison.PNG]

bkramer66_dup_418
6 - Meteoroid

Good morning Timothy,

 

I will take another look. I know I was fairly careful when I did this originally and posted about it in the community back in June, but I have been doing this long enough to have been humbled on more than one occasion by something I overlooked 🙂

 

Thank you for all you do and for your diligence in checking the run between IS and the R tool. It speaks to your focus on supporting all of us out here in the trenches and is greatly appreciated!

 

Brett

KylieF
Alteryx Community Team

Hi All!

 

Glad to see such good conversation and discussion going on in this thread! We greatly appreciate it when users share workarounds, use cases, and their thoughts on an idea. Since it looks like the ask still hasn't been solved, I'm going to recommend that those interested in this idea check out our Idea Submission Guidelines, as they go over the requirements for an idea to reach our product team in a bit more detail.

 

Thank you all for your input and engaging on the idea boards!

bkramer66_dup_418
6 - Meteoroid

Hi Kylie,

 

Can you elaborate a bit more on what you mean? Do I need to do something more? I can certainly detail the request a bit more but since there is an XGB tool within the Intelligence Suite, I kind of assumed the requirements would be straightforward...but please let me know if I need to do something!

KylieF
Alteryx Community Team

Hi @bkramer66_dup_418!

 

Apologies for the confusion. The first part of my comment was primarily aimed at encouraging those interested in this idea, both the original poster and those engaging with it via comments and likes, to review the submission guidelines, as they include several points on how ideas are pushed to our product team and what sort of engagement we like to see.

 

The original post is a great example of what our product team needs in terms of context and detail. However, we never turn down additional use cases from more users on an idea, as they help us build features that cover as many functionalities and users as possible. They can also help us address tricky decisions that crop up during development and were not initially thought of.

 

Your original idea and the discussion it spawned do include a lot of valuable information for our product team, and I believe they cover all the basics the team would need. We will be sure to reach out on this idea should we need further clarification at any point!

denny
8 - Asteroid
  1. In addition to xgboost, please also add lightgbm. LightGBM is faster and less CPU-intensive than xgboost, so for some use cases it is a better fit.
  2. Can someone post a blog example of how to do hyperparameter tuning with the xgboost tool and, hopefully in the future, a lightgbm tool? I'm thinking along the lines of the Time Series tools, where you can compare the results of ETS and ARIMA. Is this implementable for gradient boosting tools like GBM, XGBoost, and LightGBM? (A rough sketch of one possible comparison is below.)
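One rough sketch of what a TS Compare-style step could look like for gradient boosters, assuming the CRAN xgboost and lightgbm packages (the data is illustrative, and the result accessors may vary by package version):

```r
# Rough sketch: cross-validated RMSE for xgboost vs lightgbm on the same data,
# analogous to how TS Compare contrasts ETS and ARIMA.
library(xgboost)
library(lightgbm)

X <- as.matrix(mtcars[, -1])   # example data
y <- mtcars$mpg

xgb_cv <- xgb.cv(
  params  = list(objective = "reg:squarederror", eta = 0.1),
  data    = xgb.DMatrix(X, label = y),
  nrounds = 200, nfold = 5, verbose = 0
)
xgb_rmse <- min(xgb_cv$evaluation_log$test_rmse_mean)

lgb_cv <- lgb.cv(
  params  = list(objective = "regression", metric = "rmse",
                 learning_rate = 0.1),
  data    = lgb.Dataset(X, label = y),
  nrounds = 200, nfold = 5, verbose = -1
)
lgb_rmse <- min(unlist(lgb_cv$record_evals$valid$rmse$eval))

# Keep whichever learner cross-validates better
c(xgboost = xgb_rmse, lightgbm = lgb_rmse)
```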

 

 

denny
8 - Asteroid

1. I would also like to request that LightGBM be added as a built-in Alteryx gradient boosting tool.

 

2. Do I need to submit that as a separate idea, or can I add it here to complete the gradient boosting tools enhancement request?

 

Both xgboost and lightgbm are on CRAN, so I'm assuming the integration of either package into an Alteryx macro/tool would be much the same.

 

One additional enhancement would be a GBM Compare tool, modeled after the TS Compare tool.

 

Another would be a hyperparameter tuning tool. This may require multiple tools, so it would be more complicated to design and implement. (A rough sketch of such a tuning loop is below.)
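A hypothetical sketch of the tuning idea, using a small grid scored by cross-validated RMSE with the CRAN xgboost package (the grid values and data are arbitrary placeholders):

```r
# Hypothetical sketch: small hyperparameter grid scored by CV RMSE.
library(xgboost)

X <- as.matrix(mtcars[, -1])   # example data
y <- mtcars$mpg
dtrain <- xgb.DMatrix(X, label = y)

grid <- expand.grid(eta = c(0.05, 0.1, 0.3), max_depth = c(3, 6, 10))

grid$rmse <- apply(grid, 1, function(row) {
  cv <- xgb.cv(
    params  = list(objective = "reg:squarederror",
                   eta = row[["eta"]], max_depth = row[["max_depth"]]),
    data    = dtrain, nrounds = 200, nfold = 5,
    early_stopping_rounds = 10, verbose = 0
  )
  min(cv$evaluation_log$test_rmse_mean)
})

grid[which.min(grid$rmse), ]   # best combination found
```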

 

CristonS
Alteryx Alumni (Retired)
Status changed to: Not Planned

While this will not be included in the Designer Predictive toolset (the R-based brown tools), you can use Intelligence Suite / Assisted Modeling to evaluate this functionality.