Common questions about Agentyk.
What is Agentyk?
A free desktop application for local AI chat: a ChatGPT-like experience with no API keys, no subscriptions, and no internet connection required after setup.
Is Agentyk free?
Yes. It is free to download and use, with no hidden costs, subscriptions, or per-user fees.
How does Agentyk work?
It runs entirely on your computer: no cloud processing, no internet connection required after the model download, and no usage limits.
Is my data private?
Yes. All processing happens locally and conversations never leave your machine. There is no telemetry and no analytics.
Are uploaded documents kept private as well?
Yes. Documents are processed entirely locally by the RAG system and remain on your computer.
What are the system requirements?
8GB RAM minimum (16GB recommended), 10GB of disk space, and macOS 11+ or Windows 10+.
Do I need to install Ollama?
Yes. Ollama provides the AI model infrastructure; the Windows installer handles this automatically.
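For the curious: Ollama runs a small HTTP server on your machine (by default at localhost:11434) that chat apps talk to. The sketch below is only an illustration of that idea, not Agentyk's code; it assumes Ollama is installed and a model such as llama3 has already been pulled.

```python
# Illustrative sketch (not Agentyk's code): querying a locally running Ollama server.
# Assumes Ollama is installed and `ollama pull llama3` has already been run.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the local Ollama server and return the reply text."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    # The request never leaves localhost, which is why no internet
    # connection is needed once the model has been downloaded.
    return response.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize what a local LLM is in one sentence."))
```

Because the whole round trip stays on your machine, no data is sent over the internet.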
Does Agentyk work offline?
Yes. After downloading the app and models, Agentyk works completely offline.
Which models are supported?
30+ models through Ollama: Llama 2/3, Mistral, Mixtral, CodeLlama, Phi, and more.
Which model should I choose?
Llama 2 or Mistral (7B) for general use, CodeLlama for coding, and Phi for low-end hardware.
How much disk space do models need?
Small models (1-3B parameters): 1-2GB. Medium (7-13B): 4-8GB. Large (34-70B): 20-40GB.
What is RAG?
Retrieval-Augmented Generation: you upload documents (PDF, DOCX, TXT) and ask questions about them, and the most relevant passages are retrieved and given to the model as context for its answer.
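As a rough sketch of the idea (a conceptual illustration, not Agentyk's actual implementation; every name below is hypothetical): a RAG pipeline splits documents into chunks, picks the chunks most relevant to your question, and sends only those to the model as context. Real systems score relevance with vector embeddings; this toy version uses simple word overlap to keep the example self-contained.

```python
# Conceptual RAG sketch, not Agentyk's implementation.
from collections import Counter

def chunk_text(text: str, size: int = 500) -> list[str]:
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def score(question: str, chunk: str) -> int:
    """Toy relevance score: how many question words appear in the chunk.
    Real RAG systems compare vector embeddings instead."""
    q_words = Counter(question.lower().split())
    return sum(q_words[w] for w in set(chunk.lower().split()) if w in q_words)

def retrieve(question: str, chunks: list[str], top_k: int = 3) -> list[str]:
    """Return the top_k chunks most relevant to the question."""
    return sorted(chunks, key=lambda c: score(question, c), reverse=True)[:top_k]

def build_prompt(question: str, context: list[str]) -> str:
    """Combine retrieved chunks and the question into one prompt for a local model."""
    return "Context:\n" + "\n---\n".join(context) + f"\n\nQuestion: {question}"

if __name__ == "__main__":
    doc = "Agentyk is a desktop app for local AI chat. Models run on your machine via Ollama."
    top_chunks = retrieve("Where do the models run?", chunk_text(doc, size=60))
    print(build_prompt("Where do the models run?", top_chunks))
```

Because both retrieval and generation run locally, the contents of your documents never need to leave your machine.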
Can I export my conversations?
Yes. Export them as TXT or JSON for backup or sharing.
What can I do if responses are slow?
Try a smaller model, close other applications, or use a machine with more RAM. GPU acceleration helps significantly.
Does Agentyk use my GPU?
Yes. Ollama uses NVIDIA CUDA or Apple Silicon acceleration automatically.