Building high-performance, privacy-first AI tools and scalable web infrastructure.
Languages
TypeScript, JavaScript, Python, Rust, Golang, SQL
Frontend
React, Next.js, SolidJS, Tauri
AI & Automation
LLMs, Embeddings, RAG Systems, n8n, llama.cpp, vLLM
Databases & MQ
PostgreSQL, MySQL, SQLite, Redis, BullMQ, Qdrant
DevOps
Linux, Docker, Git, GitHub Actions
Rust • Tauri • SQLite-Vec • TypeScript • React
A privacy-obsessed desktop search engine that uses Local LLMs to "read" your documents and "see" your images, indexing them into a Vector Database by meaning, not just keywords.
Zero Data Leaves YOUR Device. ~10MB Binary. Blazing Fast.
Built on Rust & Tauri (v2) to avoid Electron bloat, achieving a ~10MB binary. Leverages Rust's zero-cost abstractions and memory safety to keep system resources free for heavy local LLM inference.
Orchestrates local Vision & Text LLMs to index files by meaning. Implements Matryoshka Embeddings and SQLite-Vec for high-density, sub-100ms vector search.
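The core idea behind Matryoshka embeddings can be sketched in a few lines: the model is trained so that a prefix of the full vector is itself a usable lower-dimensional embedding, so you can truncate to a smaller dimension for storage density, re-normalize, and still compare by cosine similarity. The functions and dimensions below are illustrative, not the app's real code.

```rust
// Sketch of Matryoshka-style truncation (illustrative names and toy vectors).

fn l2_normalize(v: &[f32]) -> Vec<f32> {
    let norm = v.iter().map(|x| x * x).sum::<f32>().sqrt();
    v.iter().map(|x| x / norm).collect()
}

/// Keep only the first `dims` components, then re-normalize so cosine
/// similarity remains meaningful at the reduced dimensionality.
fn truncate_embedding(full: &[f32], dims: usize) -> Vec<f32> {
    l2_normalize(&full[..dims.min(full.len())])
}

/// Cosine similarity; assumes both inputs are already L2-normalized,
/// so it reduces to a dot product.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn main() {
    // Toy 8-dim "full" embeddings, truncated to 4 dims for denser storage.
    let doc = truncate_embedding(&[0.9, 0.1, 0.2, 0.0, 0.3, 0.1, 0.0, 0.2], 4);
    let query = truncate_embedding(&[0.8, 0.2, 0.1, 0.1, 0.0, 0.0, 0.1, 0.3], 4);
    println!("similarity = {:.3}", cosine(&doc, &query));
}
```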
Supports isolated "Spaces" with distinct configurations. Use a heavy Vision model for a "Photos" space and a specialized Code LLM for a "Dev" space. Segregates indices logically and physically.
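The "Spaces" idea above amounts to pairing each space's model choice with its own on-disk index, so configurations never bleed between spaces. A minimal sketch, assuming hypothetical names (`Space`, `space_for`) that are not the app's real types:

```rust
// Illustrative sketch of isolated "Spaces": each space carries its own model
// and its own index file, segregating data both logically and physically.

struct Space {
    name: String,
    model: String,      // e.g. a Vision model for photos, a Code LLM for source
    index_path: String, // physically separate index per space
}

fn space_for(name: &str, model: &str) -> Space {
    Space {
        name: name.to_string(),
        model: model.to_string(),
        // Each space gets its own on-disk index file.
        index_path: format!("spaces/{}/index.db", name),
    }
}

fn main() {
    let photos = space_for("photos", "llava-v1.6");
    let dev = space_for("dev", "qwen2.5-coder");
    // Distinct spaces never share storage.
    assert_ne!(photos.index_path, dev.index_path);
    println!("{} -> {}", photos.name, photos.index_path);
}
```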
Fully Decoupled Logic: point to any endpoint (llama.cpp, vLLM, Ollama, OpenAI). Users get granular control to customize System Prompts for specific file types and define custom Base URLs.
Implements a high-concurrency crawler (with ripgrep-style traversal). Tracks file metadata to process only new or changed files, ensuring efficient synchronization without full re-indexing passes.
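The change-detection step can be sketched simply: record each file's modification time at index time, and re-process a file only when the recorded value differs. The names below (`IndexState`, `needs_reindex`) are illustrative; the real crawler is more involved.

```rust
// Minimal sketch of metadata-based incremental indexing.

use std::collections::HashMap;
use std::time::SystemTime;

struct IndexState {
    seen: HashMap<String, SystemTime>, // path -> mtime at last index
}

impl IndexState {
    /// Returns true if the file is new or its mtime changed since last index,
    /// recording the new state so the next check can skip it.
    fn needs_reindex(&mut self, path: &str, mtime: SystemTime) -> bool {
        match self.seen.get(path) {
            Some(prev) if *prev == mtime => false, // unchanged: skip
            _ => {
                self.seen.insert(path.to_string(), mtime);
                true
            }
        }
    }
}

fn main() {
    let mut state = IndexState { seen: HashMap::new() };
    let t0 = SystemTime::UNIX_EPOCH;
    println!("first pass: {}", state.needs_reindex("a.md", t0));  // new file
    println!("second pass: {}", state.needs_reindex("a.md", t0)); // unchanged
}
```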
100% Local: No data leaves your device. Strict Read-Only: The architecture operates on a one-way data flow. It builds a virtual index but is architecturally incapable of modifying, moving, or deleting user files.
