jxzly / Kaggle-American-Express-Default-Prediction-1st-solution
☆239 · Updated 2 years ago
Alternatives and similar repositories for Kaggle-American-Express-Default-Prediction-1st-solution
Users interested in Kaggle-American-Express-Default-Prediction-1st-solution are comparing it to the libraries listed below.
- ☆170 · Updated 5 years ago
- Winning solution for the Kaggle Feedback Prize Challenge ☆66 · Updated 3 years ago
- XGBoost + Optuna ☆725 · Updated last year
- ☆36 · Updated 3 years ago
- ☆107 · Updated 2 years ago
- An implementation of focal loss to be used with LightGBM for binary and multi-class classification problems (a minimal sketch follows this list) ☆256 · Updated 6 years ago
- 🛍 A real-world e-commerce dataset for session-based recommender systems research ☆360 · Updated 6 months ago
- Data, benchmarks, and methods submitted to the M5 forecasting competition ☆646 · Updated 2 years ago
- 1st place solution to the Curriculum Recommendations competition on Kaggle ☆32 · Updated 2 years ago
- Kaggle Home Credit Default Risk competition ☆60 · Updated 7 years ago
- Gradient boosting model for predicting credit default risk in a Kaggle competition ☆18 · Updated 5 years ago
- GBST is an optimized, distributed gradient boosting survival trees library built on top of XGBoost ☆37 · Updated 5 years ago
- ☆161 · Updated 3 years ago
- ☆53 · Updated 2 years ago
- ☆43 · Updated 3 years ago
- Winning solution of the Kaggle Mechanisms of Action (MoA) Prediction competition ☆122 · Updated 3 years ago
- ☆40 · Updated 3 years ago
- ☆74 · Updated 3 years ago
- Code repository for the online course "Feature Engineering for Time Series Forecasting" ☆195 · Updated 2 years ago
- Code repository for the online course "Feature Selection for Machine Learning" ☆336 · Updated last year
- 1st place solution for the Eedi - Mining Misconceptions in Mathematics Kaggle competition ☆55 · Updated 11 months ago
- Hands-On Gradient Boosting with XGBoost and Scikit-learn, published by Packt ☆215 · Updated last month
- XGBoost for label-imbalanced data: XGBoost with weighted and focal loss functions ☆332 · Updated last year
- Solutions to recommender systems competitions ☆200 · Updated 3 years ago
- Scikit-learn compatible implementation of the Gauss rank scaling method (sketched after this list) ☆74 · Updated 2 years ago
- A Python module that uses hill climbing to iteratively blend machine learning model predictions (sketched after this list) ☆58 · Updated last year
- My second-place solution in the M5 Accuracy competition ☆73 · Updated 5 years ago
- [kaggle] M5 Forecasting - Accuracy (4th place solution) ☆37 · Updated 5 years ago
- A collection of companion Jupyter notebooks for Ensemble Methods for Machine Learning (Manning, 2023) ☆92 · Updated 2 years ago
- Code for the American Express Default Prediction competition, hosted on Kaggle ☆14 · Updated 3 years ago
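
For the focal-loss-with-LightGBM entry above, here is a minimal sketch of how such a custom objective can be wired into LightGBM for binary classification. It is an illustrative implementation, not the linked repository's code: the alpha/gamma defaults, the finite-difference derivatives, and the training parameters are assumptions.

```python
import numpy as np
import lightgbm as lgb

def focal_loss(z, y, alpha=0.25, gamma=2.0):
    """Per-sample focal loss as a function of raw score z and label y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-z))
    pt = y * p + (1 - y) * (1 - p)           # probability assigned to the true class
    at = y * alpha + (1 - y) * (1 - alpha)   # class-dependent weighting
    return -at * (1.0 - pt) ** gamma * np.log(np.clip(pt, 1e-12, 1.0))

def focal_objective(z, train_data):
    """LightGBM custom objective: gradient and Hessian via central finite differences."""
    y = train_data.get_label()
    eps = 1e-4
    f0 = focal_loss(z, y)
    fp = focal_loss(z + eps, y)
    fm = focal_loss(z - eps, y)
    grad = (fp - fm) / (2.0 * eps)
    hess = np.maximum((fp - 2.0 * f0 + fm) / eps ** 2, 1e-12)  # keep the Hessian positive
    return grad, hess

# Usage sketch (X: feature matrix, y: binary labels); LightGBM >= 4.0 accepts a
# callable objective directly in the params dict:
# booster = lgb.train({"objective": focal_objective, "verbosity": -1},
#                     lgb.Dataset(X, label=y), num_boost_round=200)
# booster.predict(X) returns raw scores; apply a sigmoid to get probabilities.
```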
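
For the Gauss rank entry, a minimal sketch of the rank-Gaussian transform itself, not the linked scikit-learn-compatible implementation; the clipping epsilon is an arbitrary choice.

```python
import numpy as np
from scipy.special import erfinv

def gauss_rank_transform(x, eps=1e-6):
    """Map a 1-D array onto an approximately standard-normal distribution via its ranks."""
    x = np.asarray(x, dtype=float)
    ranks = np.argsort(np.argsort(x))                # 0 .. n-1 (ties broken arbitrarily)
    scaled = 2.0 * ranks / (len(x) - 1) - 1.0        # spread ranks over [-1, 1]
    scaled = np.clip(scaled, -1.0 + eps, 1.0 - eps)  # keep erfinv finite at the extremes
    return np.sqrt(2.0) * erfinv(scaled)             # inverse standard-normal CDF of the rank

# Example: a heavily skewed feature becomes roughly N(0, 1) after the transform.
# feature = np.random.exponential(size=10_000)
# transformed = gauss_rank_transform(feature)
```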
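
For the hill-climbing blending entry, a minimal sketch of greedy (Caruana-style) ensemble selection on out-of-fold predictions; the function name, arguments, and stopping rule are illustrative, not the linked module's API. It assumes a metric where higher is better.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def hill_climb_blend(oof_preds, y_true, metric=roc_auc_score, n_steps=50):
    """Greedily add models (with replacement) so the blended prediction maximizes `metric`."""
    n_models = oof_preds.shape[1]
    counts = np.zeros(n_models, dtype=int)   # how many times each model has been selected
    blend = np.zeros(len(y_true))
    best_score = -np.inf
    for _ in range(n_steps):
        # Score every candidate blend obtained by adding one more copy of each model.
        scores = [
            metric(y_true, (blend * counts.sum() + oof_preds[:, m]) / (counts.sum() + 1))
            for m in range(n_models)
        ]
        best_m = int(np.argmax(scores))
        if scores[best_m] <= best_score:
            break                              # no single addition improves the blend
        counts[best_m] += 1
        blend = (blend * (counts.sum() - 1) + oof_preds[:, best_m]) / counts.sum()
        best_score = scores[best_m]
    return counts / counts.sum(), best_score

# Usage: oof_preds is an (n_samples, n_models) array of out-of-fold predictions.
# weights, score = hill_climb_blend(oof_preds, y_true)
# test_blend = test_preds @ weights
```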