A lightweight local LLM chat with a web UI and a C-based server that runs any LLM chat executable as a child process and communicates with it via pipes.