Target Variable
Datetime Variable
Forecast Horizon

LightGBM

Microsoft released this distributed gradient boosting library in 2016. LightGBM delivers results similar to the legendary XGBoost in much less time, thanks to innovations in its branching, sampling, and learning algorithms. It was the driving force behind the winning entries in the 2020 M5 forecasting competition (a minimal fitting sketch follows the list below).
  • ✔️ Distributed gradient boosting framework built by Microsoft
  • ✔️ World-class predictive power
  • ✔️ Crunches through medium to large datasets
  • ❌ Requires feature engineering for best results
  • ❌ Requires parameter tuning to constrain flexibility
  • ❌ Longer run-time than simpler statistical models
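
To make the workflow concrete, here is a minimal sketch of fitting LightGBM as a regressor. The data is synthetic and the parameter values are illustrative, not this tool's defaults.

```python
import numpy as np
import lightgbm as lgb

# Synthetic stand-in for a feature matrix and target built from a time series.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = 2 * X[:, 0] + rng.normal(size=500)
X_train, X_valid = X[:400], X[400:]
y_train, y_valid = y[:400], y[400:]

model = lgb.LGBMRegressor(
    n_estimators=500,
    learning_rate=0.05,
    num_leaves=31,  # the main lever controlling tree complexity
)
model.fit(
    X_train, y_train,
    eval_set=[(X_valid, y_valid)],
    callbacks=[lgb.early_stopping(stopping_rounds=50)],
)
preds = model.predict(X_valid)
```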

What LightGBM algorithm would you like to use?
What confidence level should we use for our predictions?
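
If the confidence level is implemented with quantile regression (an assumption on our part, not a confirmed detail of this tool), a prediction interval can be built from two quantile-objective models, one per tail:

```python
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = 2 * X[:, 0] + rng.normal(size=500)

confidence = 0.95  # illustrative confidence level
# One model per tail of the interval; alpha is the target quantile.
lower = lgb.LGBMRegressor(objective="quantile", alpha=(1 - confidence) / 2)
upper = lgb.LGBMRegressor(objective="quantile", alpha=1 - (1 - confidence) / 2)
lower.fit(X, y)
upper.fit(X, y)
interval = np.column_stack([lower.predict(X), upper.predict(X)])
```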

How many folds should be used in your cross-validation?
Which parameter dictionary would you like to use?
How should we split our test and training sets?
How many periods should we skip between test and train folds?
What parameters would you like to test?
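
One plausible way these choices fit together, sketched with scikit-learn's TimeSeriesSplit (whose gap argument skips periods between the training and test folds) and a small illustrative parameter dictionary; the actual splitter and grid used here are assumptions:

```python
import numpy as np
import lightgbm as lgb
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = 2 * X[:, 0] + rng.normal(size=300)

# Expanding-window folds that respect time order; `gap` skips periods
# between each training fold and its test fold.
cv = TimeSeriesSplit(n_splits=5, gap=2)
param_grid = {"num_leaves": [15, 31], "learning_rate": [0.05, 0.1]}
search = GridSearchCV(lgb.LGBMRegressor(n_estimators=100), param_grid, cv=cv)
search.fit(X, y)
print(search.best_params_)
```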

What transformation should we use for our target variable?
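
A common option here is a log transformation. The sketch below uses a hypothetical sales column and NumPy's log1p/expm1 pair, which tolerates zeros and maps predictions back to the original scale:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"sales": [10.0, 12.0, 0.0, 35.0]})  # hypothetical target
# log1p compresses large values and tolerates zeros; expm1 inverts it,
# so predictions can be mapped back to the original scale.
df["sales_log"] = np.log1p(df["sales"])
df["sales_back"] = np.expm1(df["sales_log"])  # round-trips to the original
```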

What datetime features should we add?
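
Datetime features are typically derived with pandas' .dt accessor; the features below are common examples and not necessarily the exact set offered here:

```python
import pandas as pd

df = pd.DataFrame({"date": pd.date_range("2020-01-01", periods=4, freq="D")})
# Common calendar features derived from the datetime column.
df["year"] = df["date"].dt.year
df["month"] = df["date"].dt.month
df["dayofweek"] = df["date"].dt.dayofweek
df["weekofyear"] = df["date"].dt.isocalendar().week.astype(int)
```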

Which features would you like to lag?
How many periods would you like to lag them by?
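
Lagging a feature by k periods shifts its column down k rows, so each row sees only past values. A sketch with illustrative lag lengths:

```python
import pandas as pd

df = pd.DataFrame({"sales": [10, 12, 9, 14, 11, 13, 8, 15, 12]})
# One new column per lag; the first k rows are NaN until enough history exists.
for lag in (1, 2, 7):  # illustrative lag lengths
    df[f"sales_lag_{lag}"] = df["sales"].shift(lag)
```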

Which features should we create rolling features for?
What aggregations should we use?
What rolling window(s) should we use?
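
A rolling feature pairs a window size with an aggregation. The sketch below also shifts the series by one period so each row's window is strictly in the past, a leakage guard this tool may or may not apply:

```python
import pandas as pd

df = pd.DataFrame({"sales": [10, 12, 9, 14, 11, 13, 8, 15]})
# shift(1) keeps each row's window strictly in the past so the current
# value never leaks into its own feature.
shifted = df["sales"].shift(1)
for window in (3, 7):                   # illustrative windows
    for agg in ("mean", "std", "max"):  # illustrative aggregations
        df[f"sales_roll_{agg}_{window}"] = shifted.rolling(window).agg(agg)
```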

Which features should we create exponential moving averages for?
What smoothing span(s) should we use?
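
Exponential moving averages weight recent observations more heavily; in pandas the span argument to ewm plays the role of the window size. Again, the column name and spans are illustrative:

```python
import pandas as pd

df = pd.DataFrame({"sales": [10, 12, 9, 14, 11, 13, 8, 15]})
# `span` plays the role of the window size; shift(1) again excludes the
# current value from its own feature.
for span in (3, 7):  # illustrative spans
    df[f"sales_ema_{span}"] = df["sales"].shift(1).ewm(span=span).mean()
```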
Here's a sample of what your modeling dataframe will look like: