AMAN_UDEWAL

OFFLINE AI ASSISTANT

LOCAL LLM · Llama 3 · ON-DEVICE AI

THE_OBJECTIVE

Cloud-based AI assistants require constant internet access and raise significant privacy concerns over user data. The goal was to engineer a fully localized, privacy-first AI companion that runs language and speech models entirely on-device, with no external API calls.

CORE_SYSTEM_LOGIC

Integrated a quantized build of Llama 3 served locally through Ollama for text generation. For voice capabilities, the LLM's text output is piped into Piper TTS, achieving high-fidelity, low-latency speech synthesis on consumer hardware.
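The generate-then-speak pipeline described above can be sketched in Python. This is a minimal illustration, not the project's actual code: it assumes Ollama's default local endpoint (`http://localhost:11434/api/generate`) and the `piper` command-line tool; the voice model filename is a placeholder.

```python
import json
import subprocess
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
PIPER_MODEL = "en_US-lessac-medium.onnx"            # placeholder voice model path

def build_request(prompt, model="llama3"):
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate_text(prompt):
    """Send the prompt to the local Ollama server and return the reply text."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

def speak(text, wav_path="reply.wav"):
    """Pipe the reply text into the Piper CLI to synthesize speech offline."""
    subprocess.run(
        ["piper", "--model", PIPER_MODEL, "--output_file", wav_path],
        input=text.encode(), check=True,
    )

if __name__ == "__main__":
    reply = generate_text("Summarize this project in one sentence.")
    speak(reply)
```

Nothing here leaves the machine: the LLM call hits localhost and Piper writes a WAV file to disk, which is the whole point of the offline design.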

UX_EXECUTION

Built a responsive, cross-platform interface in Flutter. The UI holds a socket connection to the local Python server that runs the LLM logic, rendering streaming text in real time with synchronized audio playback for a seamless conversational experience.

TECH_ARSENAL

FLUTTER · PYTHON · OLLAMA / LLAMA 3 · PIPER TTS
ACCESS SOURCE CODE