Hacker News

I'm using Ollama with a local LLM for completion (TabbyML) and Open WebUI for chat. What will be the go-to local ACP server that works with Ollama?

Ideally one that works with Toad, so I can experiment with it.



You may be confusing the Agent Communication Protocol with the Agent Client Protocol. Yes, there are two ACP protocols. I had no hand in the naming.

If an agent can be configured to use Ollama, then you could use it from Toad. It may already be possible.


fast-agent has ACP support and works well with Ollama. Once installed, you can just run `toad acp "fast-agent-acp --model generic.<ollama-model>"`.
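For example, the end-to-end setup might look roughly like this. This is a sketch, not an exact recipe: the install target and the model name are assumptions, so check the fast-agent and Toad docs for the current package names.

```shell
# Sketch only; the package name and model below are assumptions.
pip install fast-agent-mcp         # the fast-agent CLI (name may differ; see its docs)
ollama pull qwen2.5-coder          # any model served locally by Ollama
# Point Toad at fast-agent as its ACP server, using the pulled Ollama model:
toad acp "fast-agent-acp --model generic.qwen2.5-coder"
```

The `generic.` prefix is how fast-agent is being addressed to a locally served model here rather than a hosted provider; substitute whatever model tag `ollama list` shows on your machine.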



