DidacticBoost
A Simple Implementation and Demonstration of Gradient Boosting
A basic, clear implementation of tree-based gradient boosting designed to illustrate the core operation of boosting models. Tuning parameters (such as stochastic subsampling, a modified learning rate, or regularization) are not implemented; the only adjustable parameter is the number of training rounds. If you are looking for a high-performance boosting implementation with tuning parameters, consider the 'xgboost' package.
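The core operation the package illustrates can be sketched in a few lines: with squared-error loss, each boosting round fits a weak learner to the residuals of the current ensemble and adds it to the model. The sketch below (in Python, for illustration only; DidacticBoost itself is an R package, and this is not its code) uses a one-split decision stump on 1-D data as the weak learner, with the number of rounds as the only parameter, mirroring the package's design.

```python
def fit_stump(x, y):
    """Fit a one-split regression stump: find the threshold minimizing
    squared error and return a predictor function."""
    best = None
    for t in sorted(set(x)):
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        if not left or not right:
            continue  # skip degenerate splits
        ml = sum(left) / len(left)    # mean prediction on each side
        mr = sum(right) / len(right)
        err = (sum((yi - ml) ** 2 for yi in left)
               + sum((yi - mr) ** 2 for yi in right))
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda xi, t=t, ml=ml, mr=mr: ml if xi <= t else mr

def boost(x, y, rounds):
    """Gradient boosting with squared-error loss: start from the mean,
    then repeatedly fit a stump to the current residuals."""
    base = sum(y) / len(y)
    resid = [yi - base for yi in y]
    stumps = []
    for _ in range(rounds):
        stump = fit_stump(x, resid)
        stumps.append(stump)
        resid = [r - stump(xi) for xi, r in zip(x, resid)]
    return lambda xi: base + sum(s(xi) for s in stumps)

# Toy data: y is roughly linear in x.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.1, 3.9, 5.2]
model = boost(x, y, rounds=50)
err = sum((model(xi) - yi) ** 2 for xi, yi in zip(x, y))
print(f"training SSE after 50 rounds: {err:.4f}")
```

Because there is no learning rate or subsampling, each round takes a full step against the residuals, so training error shrinks quickly with the round count; this is exactly why such parameters exist in production implementations like xgboost.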
- Version: 0.1.1
- R version: ≥ 3.1.1
- License: GPL-3
- Needs compilation: No
- Last release: 04/19/2016
Team
David Shaub
Dependencies
- Depends: 2 packages
- Suggests: 1 package