I built [sllm.nvim](https://github.com/mozanunal/sllm.nvim), a minimal (about 500 lines of Lua) Neovim plugin to bring Simon Willison’s excellent `llm` CLI directly into your coding workflow.
- Chat with LLMs (OpenAI, OpenRouter, etc.) in a split buffer.
- Add files, visual selections, shell command outputs, LSP diagnostics, or URLs as fragments to your LLM context, all from Neovim.
- Async streaming jobs, so the editor never blocks (see the sketch below).
- Switch LLM models, see token/cost stats, and use keybindings for everything.
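Under the hood the plugin shells out to `llm`. As a rough illustration of the non-blocking pattern (not sllm.nvim's actual code), here is roughly how streaming `llm` output into a split buffer can look with Neovim's job API; the model name and fragment path are placeholders:

```lua
-- Sketch of the async streaming idea: run `llm` as a background job and
-- stream its stdout into a scratch split, so the editor stays responsive.
local buf = vim.api.nvim_create_buf(false, true) -- scratch buffer for the reply
vim.cmd("vsplit")
vim.api.nvim_win_set_buf(0, buf)

vim.fn.jobstart(
  -- `-m` picks the model, `-f` attaches a file as a fragment (placeholders).
  { "llm", "-m", "gpt-4o-mini", "-f", "src/main.lua", "Explain this file" },
  {
    on_stdout = function(_, data)
      -- Append each streamed chunk; a real implementation would stitch
      -- partial lines together instead of appending them verbatim.
      vim.api.nvim_buf_set_lines(buf, -1, -1, false, data)
    end,
  }
)
```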
Inspired by Simon’s blog posts on long-context LLM workflows and on managing context/fragments from the terminal, I wanted to make that workflow seamless directly inside the editor. It uses [mini.nvim](https://github.com/echasnovski/mini.nvim) for the UI, but the core logic is just ~500 LOC.
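If you want to try it, a minimal lazy.nvim spec might look like the following; the `setup()` call is an assumption on my part (the conventional plugin entry point), so check the README for the real configuration keys:

```lua
-- Hypothetical lazy.nvim spec; the setup() call is assumed, see the README.
{
  "mozanunal/sllm.nvim",
  dependencies = { "echasnovski/mini.nvim" }, -- used for the UI
  config = function()
    require("sllm").setup() -- assumed conventional entry point
  end,
}
```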
Feedback/questions welcome! Thanks to Simon Willison & the llm community for all the inspiration.