🤖 ollama

Local LLM inference with Ollama. Model management and quick chat interface.

Category: AI
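The package presumably builds on the standard Ollama command-line tool; as a rough sketch (the model name below is only an example), local inference comes down to running the Ollama server and pulling a model:

# start the local Ollama server (often already running as a background service)
ollama serve

# download a model so it can be run locally
ollama pull llama3.2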
Install

jfn install ollama

Package Info
- Version: 1.0.0
- Author: Jelly Labs
- License: MIT
Platforms

- macos
- linux
Shell Triggers

- o

jfn install ollama o
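The "o" trigger is presumably a shorthand for the quick chat interface. Assuming it ultimately calls the standard Ollama CLI, the equivalent direct commands for a quick chat and for model management look like this (the model name is an example):

# one-off prompt against a locally installed model
ollama run llama3.2 "What does this package do?"

# inspect and clean up locally installed models
ollama list
ollama rm llama3.2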