torchopt

Advanced Optimizers for Torch


Optimizers for the 'torch' deep learning library. These functions implement recent results from the optimization literature that are not among the optimizers shipped with 'torch'. Prospective users should test these optimizers with their own data, since performance depends on the specific problem being solved. The package includes the following optimizers:

  • 'adabelief' by Zhuang et al. (2020), doi:10.48550/arXiv.2010.07468
  • 'adabound' by Luo et al. (2019), doi:10.48550/arXiv.1902.09843
  • 'adahessian' by Yao et al. (2021), doi:10.48550/arXiv.2006.00719
  • 'adamw' by Loshchilov & Hutter (2019), doi:10.48550/arXiv.1711.05101
  • 'madgrad' by Defazio and Jelassi (2021), doi:10.48550/arXiv.2101.11075
  • 'nadam' by Dozat (2016), https://openreview.net/pdf/OM0jvwB8jIp57ZJjtNEZ.pdf
  • 'qhadam' by Ma and Yarats (2019), doi:10.48550/arXiv.1810.06801
  • 'radam' by Liu et al. (2019), doi:10.48550/arXiv.1908.03265
  • 'swats' by Keskar and Socher (2018), doi:10.48550/arXiv.1712.07628
  • 'yogi' by Zaheer et al. (2018), https://papers.nips.cc/paper/8186-adaptive-methods-for-nonconvex-optimization
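Because these optimizers follow the same interface as those built into 'torch', trying one of them usually means changing a single constructor call. The sketch below is a minimal illustration, assuming 'torchopt' exports its optimizers under torch's usual optim_* naming (e.g. optim_adabelief) and with an arbitrary learning rate chosen only for this toy problem:

  # Minimal sketch: minimize a one-dimensional quadratic with 'adabelief'.
  # Assumes 'torch' and 'torchopt' are installed; the optim_adabelief name
  # and the learning rate below are illustrative assumptions.
  library(torch)
  library(torchopt)

  # A single parameter, initialized away from the minimum at x = 3.
  x <- torch_tensor(0, requires_grad = TRUE)

  opt <- optim_adabelief(params = list(x), lr = 0.5)

  for (step in 1:100) {
    opt$zero_grad()           # clear gradients from the previous step
    loss <- (x - 3)^2         # simple convex objective
    loss$backward()           # compute d(loss)/dx
    opt$step()                # update x
  }

  print(x)  # should have converged near 3

Testing another optimizer from the list above (say, 'madgrad' or 'yogi') then only requires swapping the constructor, which makes it straightforward to compare them on one's own problem, as the description recommends.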




Dependencies

  • Imports: 1 package
  • Suggests: 1 package