rollama
Communicate with 'Ollama' to Run Large Language Models Locally
Wraps the 'Ollama' <https://ollama.com> API, which can be used to communicate with generative large language models locally.
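A minimal sketch of typical usage, assuming a local Ollama server is running on its default port and that the model name `llama3.2` has been pulled (the model choice here is only an example):

```r
library(rollama)

# Check that the local Ollama server is reachable
# (default endpoint: http://localhost:11434)
ping_ollama()

# Download a model if it is not available locally yet
pull_model("llama3.2")

# Send a single prompt to the model and print the response
query("Why is the sky blue?", model = "llama3.2")
```

The package also provides `chat()` for multi-turn conversations that keep message history, whereas `query()` sends a single standalone prompt.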
Versions across snapshots
| Version | Repository | File | Size |
|---|---|---|---|
| 0.3.0 | rolling linux/jammy R-4.5 | rollama_0.3.0.tar.gz | 3.3 MiB |
| 0.3.0 | rolling linux/noble R-4.5 | rollama_0.3.0.tar.gz | 3.3 MiB |
| 0.3.0 | rolling source/ R- | rollama_0.3.0.tar.gz | 3.2 MiB |
| 0.3.0 | latest linux/jammy R-4.5 | rollama_0.3.0.tar.gz | 3.3 MiB |
| 0.3.0 | latest linux/noble R-4.5 | rollama_0.3.0.tar.gz | 3.3 MiB |
| 0.3.0 | latest source/ R- | rollama_0.3.0.tar.gz | 3.2 MiB |
| 0.3.0 | 2026-04-26 source/ R- | rollama_0.3.0.tar.gz | 3.2 MiB |
| 0.3.0 | 2026-04-23 source/ R- | rollama_0.3.0.tar.gz | 3.2 MiB |
| 0.3.0 | 2026-04-09 windows/windows R-4.5 | rollama_0.3.0.zip | 3.3 MiB |
| 0.2.0 | 2025-04-20 source/ R- | rollama_0.2.0.tar.gz | 3.2 MiB |