I’m curious about this too. I know that on the latest version of Ollama it’s possible to install OpenClaw. But I assumed you needed to point it to a paid API (Claude, ChatGPT, Grok, etc.) for it to really work. But yeah, maybe it works with Qwen 3 or similar models?
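I haven't dug into OpenClaw's config format myself, but since Ollama exposes an OpenAI-compatible endpoint at `/v1`, anything that lets you override the base URL should in principle work against a local model instead of a paid API. Here's a minimal sketch using the `openai` Python client; the model name `qwen3` is just an example and assumes you've already pulled it with `ollama pull qwen3`:

```python
# Minimal sketch: pointing an OpenAI-compatible client at a local
# Ollama server instead of a paid API. Ollama serves an
# OpenAI-compatible endpoint at http://localhost:11434/v1.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama, not a paid API
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

resp = client.chat.completions.create(
    model="qwen3",  # assumes: ollama pull qwen3
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```

If OpenClaw accepts a custom base URL the way this client does, swapping in the local endpoint should be the whole trick; how well it then performs is another question.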
I guess a major factor here is what your system resources look like, especially how much RAM you have, and therefore which model you can actually host locally.