We put a lot of effort into making the installation super simple: you can run it on Linux or Windows with an NVIDIA GPU, on a Mac alongside Ollama, or against an external LLM API.
Here's a demo of what you can do with it: https://www.youtube.com/watch?v=6QcOXq3VFpc
In the demo:
* Helix Apps, version-controlled configuration for LLM-based applications (sketched after this list)
* Knowledge, continuously updated RAG from a URL (see the second sketch below)
* API integrations, so your app can call an API for up-to-date information when needed (see the tool-calling sketch below)
* New Helix App Editor UI
* New, easier installer that supports running Helix on macOS (alongside Ollama), on Windows with an NVIDIA GPU, and on Linux with Docker and Kubernetes
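
To make "version-controlled configuration" concrete, here's a minimal sketch of the idea: describe the whole app (model, knowledge sources, API tools) as plain data, serialize it to a file, and commit that file to git. The field names, model name, and URLs below are invented for illustration and are not Helix's actual schema.

```python
"""Hypothetical illustration of version-controlled app configuration.
Field names are invented for the example, not Helix's real schema."""
from dataclasses import dataclass, field, asdict
import json


@dataclass
class AppSpec:
    name: str
    model: str
    system_prompt: str
    knowledge_urls: list[str] = field(default_factory=list)  # RAG sources
    api_tools: list[str] = field(default_factory=list)       # callable APIs


spec = AppSpec(
    name="support-bot",
    model="llama3:instruct",
    system_prompt="Answer questions using the indexed docs.",
    knowledge_urls=["https://example.com/docs"],
    api_tools=["https://api.example.com/openapi.json"],
)

# The serialized spec is the artifact you commit, review, and roll back.
with open("app.json", "w") as f:
    json.dump(asdict(spec), f, indent=2)
```

The point of keeping the spec in git is that every change to prompts, sources, or tools gets code-reviewed and can be rolled back like any other change.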
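For the Knowledge bullet, here's a toy sketch of continuously updated RAG from a URL: fetch the page, chunk it, and retrieve the best-matching chunks for a question, with a scheduled re-fetch keeping the index current. This isn't Helix's pipeline; a real one would use embeddings and a vector store rather than the bag-of-words scoring here, and the URL is a placeholder.

```python
"""Toy sketch of RAG over a URL, not Helix's implementation."""
import urllib.request
from collections import Counter


def fetch_chunks(url: str, size: int = 500) -> list[str]:
    """Download the page and split its text into fixed-size chunks."""
    text = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    return [text[i:i + size] for i in range(0, len(text), size)]


def score(query: str, chunk: str) -> int:
    """Toy relevance score: number of shared lowercase words."""
    q, c = Counter(query.lower().split()), Counter(chunk.lower().split())
    return sum((q & c).values())


def retrieve(chunks: list[str], query: str, k: int = 3) -> list[str]:
    """Return the k chunks that best match the query."""
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:k]


chunks = fetch_chunks("https://example.com/")  # placeholder source URL
print(retrieve(chunks, "example domain")[0])   # context to prepend to the prompt
# "Continuously updated" means a background job re-runs fetch_chunks()
# on a schedule so the index tracks the live page.
```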
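And for API integrations, a sketch of the tool-calling loop the bullet implies: when the model asks for fresh data, the app executes the call and feeds the result back before the model answers. The `llm()` stub and the weather tool are stand-ins, not Helix's or any real provider's API.

```python
"""Sketch of a tool-calling loop; llm() and get_weather() are stubs."""
import json


def get_weather(city: str) -> str:
    """Stand-in for a real API integration; a deployment would make an
    HTTP request here (e.g. with urllib.request) against a live endpoint."""
    return json.dumps({"city": city, "conditions": "rain", "temp_c": 11})


TOOLS = {"get_weather": get_weather}


def llm(messages: list[dict]) -> dict:
    """Placeholder for a real model call, hard-coded so the loop runs:
    first request the tool, then answer once the result is available."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "get_weather", "args": {"city": "London"}}}
    data = json.loads(messages[-1]["content"])
    return {"content": f"It's {data['conditions']} and {data['temp_c']}°C in {data['city']}."}


messages = [{"role": "user", "content": "What's the weather in London right now?"}]
reply = llm(messages)
while "tool_call" in reply:                       # model wants fresh data
    call = reply["tool_call"]
    result = TOOLS[call["name"]](**call["args"])  # execute the API call
    messages.append({"role": "tool", "content": result})
    reply = llm(messages)
print(reply["content"])
```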