jxzly / Kaggle-American-Express-Default-Prediction-1st-solution
☆235 · Updated 2 years ago
Alternatives and similar repositories for Kaggle-American-Express-Default-Prediction-1st-solution
Users interested in Kaggle-American-Express-Default-Prediction-1st-solution are comparing it to the repositories listed below.
- Winning solution for the Kaggle Feedback Prize Challenge. ☆65 · Updated 3 years ago
- An implementation of the focal loss to be used with LightGBM for binary and multi-class classification problems ☆256 · Updated 5 years ago
- 🛍 A real-world e-commerce dataset for session-based recommender systems research. ☆347 · Updated 4 months ago
- 1st Place Solution to the Curriculum Recommendations competition on Kaggle ☆32 · Updated 2 years ago
- Kaggle Home Credit Default Risk competition ☆58 · Updated 6 years ago
- XGBoost + Optuna ☆716 · Updated last year
- Winning Solution of Kaggle Mechanisms of Action (MoA) Prediction ☆121 · Updated 3 years ago
- Data, benchmarks, and methods submitted to the M5 forecasting competition ☆634 · Updated 2 years ago
- Solutions to Recommender Systems competitions ☆201 · Updated 3 years ago
- Hands-On Gradient Boosting with XGBoost and scikit-learn, published by Packt ☆212 · Updated 2 weeks ago
- Yunbase: first submission of your algorithm competition ☆55 · Updated 2 months ago
- 3rd Place Solution for the OTTO – Multi-Objective Recommender System Competition ☆142 · Updated 2 years ago
- 🥈 Silver Medal Solution to Kaggle H&M Personalized Fashion Recommendations ☆72 · Updated 3 years ago
- Code repository for the online course "Feature Engineering for Time Series Forecasting". ☆190 · Updated last year
- GBST is an optimized distributed gradient boosting survival trees library implemented on top of XGBoost ☆37 · Updated 5 years ago
- Gradient boosting model for predicting credit default risk in a Kaggle competition ☆17 · Updated 4 years ago
- A Python module that uses hill climbing to iteratively blend machine learning model predictions. ☆54 · Updated last year
- 2nd place solution code for the Kaggle CommonLit Readability Prize ☆34 · Updated 4 years ago
- Solution for the Jane Street 2024 Kaggle competition. ☆187 · Updated last month
- My toolbox for data analysis. :) ☆176 · Updated 8 months ago
- XGBoost for label-imbalanced data: XGBoost with weighted and focal loss functions ☆328 · Updated last year
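Two of the repositories above pair focal loss with LightGBM or XGBoost for imbalanced binary classification. As a rough illustration of the idea (not code from either repo — the function names and the finite-difference shortcut for the gradient and Hessian are my own choices), a boosting library's custom objective only needs per-sample gradients and Hessians of the loss with respect to the raw score:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def focal_loss(raw_score, y_true, gamma=2.0, alpha=0.25):
    """Binary focal loss evaluated at raw (pre-sigmoid) scores."""
    p = sigmoid(raw_score)
    return -(alpha * y_true * (1 - p) ** gamma * np.log(p)
             + (1 - alpha) * (1 - y_true) * p ** gamma * np.log(1 - p))

def focal_grad_hess(raw_score, y_true, gamma=2.0, alpha=0.25, h=1e-5):
    """Central-difference gradient and Hessian of the focal loss,
    in the (grad, hess) shape a custom boosting objective returns.
    Numerical differentiation sidesteps the messy analytic derivatives."""
    f = lambda x: focal_loss(x, y_true, gamma, alpha)
    grad = (f(raw_score + h) - f(raw_score - h)) / (2 * h)
    hess = (f(raw_score + h) - 2 * f(raw_score) + f(raw_score - h)) / h ** 2
    return grad, hess
```

To wire this into LightGBM, wrap `focal_grad_hess` in a callable matching the library's custom-objective signature for your version (recent releases take the callable through the `objective` parameter); with `gamma = 0` and `alpha = 0.5` the loss reduces to half the ordinary log loss, which is a handy sanity check.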
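One entry above describes a module that blends model predictions by hill climbing, a staple of Kaggle ensembling. A minimal sketch of that general technique (not that module's actual API — the function name, greedy weight-step scheme, and parameters are assumptions):

```python
import numpy as np

def hill_climb_blend(preds, y_true, metric, n_iter=100, step=0.05):
    """Greedily search blend weights: start from the single best model,
    then repeatedly try adding `step` weight to each model (renormalized)
    and keep whichever move improves the metric (higher is better) most.

    preds: array of shape (n_models, n_samples).
    Returns (weights, best_score)."""
    n_models = preds.shape[0]
    scores = [metric(y_true, preds[i]) for i in range(n_models)]
    weights = np.zeros(n_models)
    weights[int(np.argmax(scores))] = 1.0
    best = metric(y_true, weights @ preds)
    for _ in range(n_iter):
        improved = False
        for i in range(n_models):
            cand = weights.copy()
            cand[i] += step
            cand /= cand.sum()
            score = metric(y_true, cand @ preds)
            if score > best:
                best, weights, improved = score, cand, True
        if not improved:  # local optimum reached
            break
    return weights, best
```

In practice the metric is evaluated on out-of-fold predictions so the blend weights are not fit on data the base models were trained on.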