Matt-OP / hillclimbers
A Python module that uses hill climbing to iteratively blend machine learning model predictions.
☆54 · Updated last year
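Hill-climbing blending greedily builds a weighted ensemble: starting from the best single model, it repeatedly adds (with repetition) whichever prediction vector most improves a validation metric, and stops when no candidate helps. The sketch below is an illustrative re-implementation of that idea, not the hillclimbers module's actual API; the function name `hill_climb_blend` and its signature are hypothetical.

```python
import numpy as np

def hill_climb_blend(preds, y_true, metric, n_iters=50):
    """Greedy hill-climbing blend (illustrative sketch, hypothetical API).

    preds:  dict mapping model name -> 1-D array of validation predictions
    metric: callable(y_true, y_pred) -> float, lower is better
    Returns the blended prediction array and the selection counts per model.
    """
    # Seed the ensemble with the single best model.
    best_name = min(preds, key=lambda k: metric(y_true, preds[k]))
    blend = preds[best_name].copy()
    counts = {best_name: 1}
    n = 1
    for _ in range(n_iters):
        # Score every candidate as if it were averaged into the blend.
        scores = {k: metric(y_true, (blend * n + p) / (n + 1))
                  for k, p in preds.items()}
        k = min(scores, key=scores.get)
        if scores[k] >= metric(y_true, blend):
            break  # no candidate improves the current blend
        blend = (blend * n + preds[k]) / (n + 1)
        counts[k] = counts.get(k, 0) + 1
        n += 1
    return blend, counts
```

Because models may be picked multiple times, the final counts act as integer blend weights; the blend's validation score can never be worse than the best single model's.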
Alternatives and similar repositories for hillclimbers
Users interested in hillclimbers are comparing it to the libraries listed below.
- All Relevant Feature Selection ☆138 · Updated 3 months ago
- ML models + benchmark for tabular data classification and regression ☆197 · Updated 3 weeks ago
- Scikit-learn compatible implementation of the Gauss Rank scaling method ☆73 · Updated last year
- ☆168 · Updated 4 years ago
- A powerful Shapley feature selection method ☆210 · Updated last year
- A lightweight and fast auto-ml library ☆76 · Updated 3 months ago
- hgboost is a Python package for hyper-parameter optimization for xgboost, catboost or lightboost using cross-validation, and evaluating t… ☆64 · Updated 4 months ago
- (ICLR 2025) TabM: Advancing Tabular Deep Learning With Parameter-Efficient Ensembling ☆578 · Updated 3 weeks ago
- Time Series Forecasting with LightGBM ☆85 · Updated 2 years ago
- ☆52 · Updated 2 years ago
- An extension of LightGBM to probabilistic modelling ☆319 · Updated last year
- Feature selection library in Python ☆147 · Updated 2 years ago
- State-of-the-art Automated Machine Learning Python library for Tabular Data ☆233 · Updated last year
- M6-Forecasting competition ☆43 · Updated last year
- Easy Custom Losses for Tree Boosters using PyTorch ☆34 · Updated 4 years ago
- Random Forest or XGBoost? It is Time to Explore LCE ☆66 · Updated last year
- ☆49 · Updated 2 months ago
- Winning Solution of Kaggle Mechanisms of Action (MoA) Prediction ☆120 · Updated 3 years ago
- SHAP-based validation for linear and tree-based models, applied to binary, multiclass and regression problems ☆150 · Updated 3 months ago
- Forecasting with Gradient Boosted Time Series Decomposition ☆195 · Updated last year
- (NeurIPS 2022) On Embeddings for Numerical Features in Tabular Deep Learning ☆371 · Updated 3 months ago
- Yunbase, first submission of your algorithm competition ☆52 · Updated last month
- ☆232 · Updated 2 years ago
- An implementation of the focal loss to be used with LightGBM for binary and multi-class classification problems ☆254 · Updated 5 years ago
- ☆203 · Updated 3 years ago
- Linear Prediction Model with Automated Feature Engineering and Selection Capabilities ☆521 · Updated 3 months ago
- Benchmark tabular Deep Learning models against each other and other non-DL techniques ☆55 · Updated 4 years ago
- Probabilistic prediction with XGBoost ☆110 · Updated 3 months ago
- A tree-based feature selection tool that combines the Boruta feature selection algorithm with Shapley values ☆621 · Updated last year
- Data, benchmarks, and methods submitted to the M6 forecasting competition ☆114 · Updated 9 months ago