llm.api
Minimal LLM Chat Interface
A minimal-dependency client for Large Language Model chat APIs. Supports 'OpenAI' <https://github.com/openai>, 'Anthropic' 'Claude' <https://claude.com/>, 'Moonshot' 'Kimi' <https://www.moonshot.ai/>, 'Ollama' <https://ollama.com/>, and other 'OpenAI'-compatible endpoints. Includes an agent loop with tool use and a 'Model Context Protocol' client <https://modelcontextprotocol.io/>. API design is derived from the 'ellmer' package, reimplemented with only base R, 'curl', and 'jsonlite'.
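The package itself exposes its own chat interface (modeled on 'ellmer'), which is not reproduced here. As a rough sketch of what an 'OpenAI'-compatible chat call involves using only the stated dependencies 'curl' and 'jsonlite' — the endpoint URL, model name, and `OPENAI_API_KEY` variable below are illustrative assumptions, not llm.api's actual API:

```r
# Sketch: a single-turn chat request against an OpenAI-compatible
# /chat/completions endpoint, built with only curl and jsonlite.
library(curl)
library(jsonlite)

chat_once <- function(prompt,
                      base_url = "https://api.openai.com/v1",
                      model = "gpt-4o-mini") {
  # Build the JSON request body; auto_unbox keeps scalars as scalars.
  body <- toJSON(list(
    model    = model,
    messages = list(list(role = "user", content = prompt))
  ), auto_unbox = TRUE)

  # Attach auth and content-type headers, then POST the body.
  h <- new_handle()
  handle_setheaders(h,
    "Content-Type"  = "application/json",
    "Authorization" = paste("Bearer", Sys.getenv("OPENAI_API_KEY"))
  )
  handle_setopt(h, postfields = body)

  resp   <- curl_fetch_memory(paste0(base_url, "/chat/completions"),
                              handle = h)
  parsed <- fromJSON(rawToChar(resp$content), simplifyVector = FALSE)

  # The assistant's reply text in the OpenAI response schema.
  parsed$choices[[1]]$message$content
}
```

Swapping `base_url` is what makes the same code work against 'Ollama', 'Moonshot' 'Kimi', or any other 'OpenAI'-compatible server.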
Versions across snapshots
| Version | Repository | File | Size |
|---|---|---|---|
| 0.1.1 | rolling linux/jammy R-4.5 | llm.api_0.1.1.tar.gz | 87.6 KiB |
| 0.1.1 | rolling linux/noble R-4.5 | llm.api_0.1.1.tar.gz | 87.4 KiB |
| 0.1.1 | rolling source/ R- | llm.api_0.1.1.tar.gz | 16.9 KiB |
| 0.1.1 | latest linux/jammy R-4.5 | llm.api_0.1.1.tar.gz | 87.6 KiB |
| 0.1.1 | latest linux/noble R-4.5 | llm.api_0.1.1.tar.gz | 87.4 KiB |
| 0.1.1 | latest source/ R- | llm.api_0.1.1.tar.gz | 16.9 KiB |
| 0.1.1 | 2026-04-26 source/ R- | llm.api_0.1.1.tar.gz | 16.9 KiB |
| 0.1.1 | 2026-04-23 source/ R- | llm.api_0.1.1.tar.gz | 16.9 KiB |