Sun Jan 25, 2026 4:34am PST
Ask HN: A good model to choose in Ollama to run with Claude Code
Given that Claude Code can use a locally running model through Ollama, which thinking model with tool-calling support would be the best pick for good output?

Also, if anyone has tried this: does it still require a Claude subscription?

(I currently have an RTX 5060 machine with 8GB of VRAM)
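For reference, here's roughly how I've been sanity-checking whether a candidate model handles tool calls against Ollama directly, before pointing Claude Code at it. This is just a sketch using the ollama Python client; the model name "qwen3:8b" is a placeholder for whatever ~8B model fits in 8GB of VRAM, not a recommendation.

```python
import ollama

# A toy tool definition in the function-calling schema Ollama's chat API accepts.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = ollama.chat(
    model="qwen3:8b",  # placeholder; any tool-capable model you've pulled
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=tools,
)

# A model with working tool support should return tool_calls here
# rather than (or in addition to) a plain-text answer.
print(response.message.tool_calls or response.message.content)
```

If a model just answers in prose and never emits tool calls, it probably won't be usable as a coding agent backend regardless of how good its raw output is.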
