smoothedLasso
A Framework to Smooth L1 Penalized Regression Operators using Nesterov Smoothing
We provide full functionality to smooth L1 penalized regression operators and to compute regression estimates thereof. For this, the objective function of a user-specified regression operator is first smoothed using Nesterov smoothing (see Y. Nesterov (2005) doi:10.1007/s10107-004-0552-5), resulting in a modified objective function with explicit gradients everywhere. The smoothed objective function and its gradient are minimized via BFGS, and the obtained minimizer is returned. Using Nesterov smoothing, the smoothed objective function can be made arbitrarily close to the original (unsmoothed) one. In particular, the Nesterov approach has the advantage that it comes with explicit accuracy bounds, both on the L1/L2 distance between the unsmoothed and smoothed objective functions and on their respective minimizers (see G. Hahn, S.M. Lutz, N. Laha, C. Lange (2020) doi:10.1101/2020.09.17.301788). A progressive smoothing approach is provided which iteratively smooths the objective function, resulting in more stable regression estimates. A function to perform cross-validation for selecting the regularization parameter is also provided; a conceptual sketch of the overall approach follows below.
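For illustration, the sketch below shows the core idea in plain R rather than the package's own API: the absolute value in the L1 penalty is replaced by a Nesterov-type smooth surrogate with explicit gradient, and the smoothed lasso objective is minimized with BFGS via `optim`, with the smoothing parameter decreased progressively. The function names, the particular log-sum-exp surrogate, and the simulated data are illustrative assumptions only.

```r
# Illustrative sketch only: a Nesterov-type smoothing of the lasso objective,
# minimized with BFGS via optim(). Names and the particular smooth surrogate
# are assumptions for demonstration, not the package's actual interface.

set.seed(1)
n <- 100; p <- 10
X <- matrix(rnorm(n * p), n, p)
beta_true <- c(2, -1.5, rep(0, p - 2))
y <- X %*% beta_true + rnorm(n)
lambda <- 0.5

# Smooth surrogate of |x| (log-sum-exp / entropy-type smoothing);
# it approaches |x| as the smoothing parameter mu tends to 0.
abs_mu <- function(x, mu) mu * log(exp(x / mu) + exp(-x / mu)) - mu * log(2)

# Smoothed lasso objective: least squares plus the smoothed L1 penalty.
obj_mu <- function(beta, mu) {
  sum((y - X %*% beta)^2) / (2 * n) + lambda * sum(abs_mu(beta, mu))
}

# The gradient is explicit everywhere, which is what makes BFGS applicable.
grad_mu <- function(beta, mu) {
  drop(-t(X) %*% (y - X %*% beta)) / n + lambda * tanh(beta / mu)
}

# Progressive smoothing: start with a coarse mu and decrease it,
# warm-starting each BFGS run at the previous minimizer.
beta_hat <- rep(0, p)
for (mu in c(1, 0.1, 0.01)) {
  fit <- optim(beta_hat, fn = obj_mu, gr = grad_mu, mu = mu, method = "BFGS")
  beta_hat <- fit$par
}
round(beta_hat, 3)
```

The regularization parameter `lambda` would in practice be chosen by cross-validation, as offered by the package.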
- Version: 1.6
- R version: unknown
- License: GPL-2 | GPL-3
- Needs compilation: No
- Last release: 03/21/2021
Team
Georg Hahn
Sharon M. Lutz
Christoph Lange (contributor)
Nilanjana Laha (contributor)
Insights
Daily download counts for the package are provided by CRAN.
Dependencies
- Imports: 2 packages