LLM Assistant

Local LLM integration · Document ingestion · Report generation · Runs on Ollama

Note: When served from GitHub Pages, set OLLAMA_ORIGINS=* before starting Ollama, or open this page locally (file://) so the browser can reach localhost directly.
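The note above can be applied like this; a minimal sketch assuming a Unix shell and a default Ollama install listening on port 11434:

```shell
# Allow the GitHub Pages origin to call the local Ollama API.
# "*" permits any origin; scope it to your exact Pages URL if preferred.
export OLLAMA_ORIGINS="*"
ollama serve
```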
Document ingestion
Report generation
Input — client documents
Upload or paste client documents (security policies, risk registers, audit reports, privacy notices, AI governance docs). The LLM will extract profile data for the client profile engine.
Drop .txt, .md, .csv files here or click to browse
Output — extracted profile
Extracted profile data will appear here after processing...
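The ingestion-to-profile flow described above could be sketched as follows. The endpoint and payload shape follow the Ollama REST API (`/api/generate` with `stream: false`); the model name, prompt wording, field list, and helper names (`buildRequest`, `firstJsonObject`) are assumptions for illustration, not the page's actual implementation:

```javascript
// Hypothetical default endpoint for a local Ollama instance.
const OLLAMA_URL = "http://localhost:11434/api/generate";

function buildRequest(documentText) {
  return {
    model: "llama3",   // assumed model; substitute any locally pulled model
    stream: false,     // one JSON response instead of a chunked stream
    prompt:
      "Extract a client profile as a single JSON object with any fields " +
      "you can find (e.g. name, industry, policies):\n\n" + documentText,
  };
}

// Models often wrap JSON in prose; pull out the first balanced {...} block.
// (Brace counting ignores braces inside strings -- acceptable for a sketch.)
function firstJsonObject(text) {
  const start = text.indexOf("{");
  if (start === -1) return null;
  let depth = 0;
  for (let i = start; i < text.length; i++) {
    if (text[i] === "{") depth++;
    else if (text[i] === "}" && --depth === 0) {
      try { return JSON.parse(text.slice(start, i + 1)); }
      catch { return null; }
    }
  }
  return null;
}

// Usage (requires a running Ollama instance):
// const res = await fetch(OLLAMA_URL, {
//   method: "POST",
//   body: JSON.stringify(buildRequest(doc)),
// });
// const profile = firstJsonObject((await res.json()).response);
```

Parsing the reply with a balanced-brace scan rather than `JSON.parse` on the raw text is a pragmatic choice: local models frequently surround the requested JSON with explanatory prose.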