optimg
General-Purpose Gradient-Based Optimization
Provides general-purpose tools to help users implement steepest gradient descent methods for function optimization; for details, see Ruder (2016) <arXiv:1609.04747v2>. Currently, the implemented methods are the Steepest 2-Groups Gradient Descent and the Adaptive Moment Estimation (Adam); other methods will be added in the future.
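As a point of reference, the sketch below implements the plain Adam update from Ruder (2016) in base R. It is an illustrative example only and does not reproduce the package's internal code or exported interface; the function name `adam_descent` and all tuning defaults are assumptions made for the example.

```r
# Illustrative Adam update (Ruder, 2016); a minimal sketch, not optimg's implementation.
adam_descent <- function(par, fn, gr, lr = 0.01, beta1 = 0.9, beta2 = 0.999,
                         eps = 1e-8, maxit = 1000, tol = 1e-8) {
  m <- v <- numeric(length(par))          # first and second moment estimates
  for (t in seq_len(maxit)) {
    g     <- gr(par)                      # gradient at the current parameters
    m     <- beta1 * m + (1 - beta1) * g      # update biased first moment
    v     <- beta2 * v + (1 - beta2) * g^2    # update biased second raw moment
    m_hat <- m / (1 - beta1^t)            # bias-corrected first moment
    v_hat <- v / (1 - beta2^t)            # bias-corrected second moment
    step  <- lr * m_hat / (sqrt(v_hat) + eps)
    par   <- par - step
    if (max(abs(step)) < tol) break       # stop once updates become negligible
  }
  list(par = par, value = fn(par), iterations = t)
}

# Usage: minimize the Rosenbrock function from the starting point c(-1.2, 1).
rosenbrock <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
rosen_grad <- function(x) c(-400 * x[1] * (x[2] - x[1]^2) - 2 * (1 - x[1]),
                            200 * (x[2] - x[1]^2))
adam_descent(c(-1.2, 1), rosenbrock, rosen_grad, lr = 0.05, maxit = 5000)
```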
Versions across snapshots
| Version | Repository | File | Size |
|---|---|---|---|
| 0.1.2 | rolling linux/jammy R-4.5 | optimg_0.1.2.tar.gz | 30.2 KiB |
| 0.1.2 | rolling linux/noble R-4.5 | optimg_0.1.2.tar.gz | 30.2 KiB |
| 0.1.2 | rolling source/ R- | optimg_0.1.2.tar.gz | 4.8 KiB |
| 0.1.2 | latest linux/jammy R-4.5 | optimg_0.1.2.tar.gz | 30.2 KiB |
| 0.1.2 | latest linux/noble R-4.5 | optimg_0.1.2.tar.gz | 30.2 KiB |
| 0.1.2 | latest source/ R- | optimg_0.1.2.tar.gz | 4.8 KiB |
| 0.1.2 | 2026-04-26 source/ R- | optimg_0.1.2.tar.gz | 4.8 KiB |
| 0.1.2 | 2026-04-23 source/ R- | optimg_0.1.2.tar.gz | 4.8 KiB |
| 0.1.2 | 2026-04-09 windows/windows R-4.5 | optimg_0.1.2.zip | 32.7 KiB |
| 0.1.2 | 2025-04-20 source/ R- | optimg_0.1.2.tar.gz | 4.8 KiB |
Dependencies (latest)
Imports
- ucminf (>= 1.1-4)