Download AgentykChat

Choose your platform and get started in minutes

🍎

macOS

macOS 11 (Big Sur) or later

Download for macOS

Universal Binary (Intel & Apple Silicon)

🪟

Windows

Windows 10/11 (64-bit)

Download for Windows

.exe installer

🐧

Linux

Ubuntu 20.04+ / Debian 11+

Download for Linux

.deb, .rpm, or AppImage

System Requirements

💾

RAM

8GB minimum
16GB recommended

💿

Disk Space

10GB free
(for app + models)

🌐

Internet

Required for
initial downloads

🍎 macOS Installation

  1. Install Ollama (Required)

    brew install ollama

    Or download from: https://ollama.com/download

  2. Start Ollama

    ollama serve

  3. Download & Install AgentykChat

    • Open the DMG file
    • Drag AgentykChat to Applications
    • Right-click → Open (first time only)

  4. Download Your First Model

    AgentykChat will guide you through downloading your first AI model
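
If you prefer the command line, you can also pull a starter model directly with Ollama before launching AgentykChat (model names here are examples from the Ollama library):

```shell
# Pull a starter model from the Ollama library
ollama pull llama2

# Confirm the model is now available locally
ollama list
```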

Troubleshooting
  • "AgentykChat can't be opened": Go to System Preferences → Security & Privacy → Open Anyway
  • Ollama connection failed: Ensure Ollama is running with ollama serve
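
To confirm Ollama is actually reachable before digging further, you can query its local API (Ollama listens on port 11434 by default):

```shell
# The root endpoint replies "Ollama is running" when the server is up
curl http://localhost:11434

# List the models Ollama currently has available
curl http://localhost:11434/api/tags
```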

🪟 Windows Installation

  1. Install Ollama (Required)

    Download from: https://ollama.com/download/windows
    Run the installer and Ollama will start automatically

  2. Download & Install AgentykChat

    • Run the .exe installer
    • Follow the setup wizard
    • Launch from Start Menu or Desktop

  3. Start Chatting

    AgentykChat will automatically connect to Ollama and guide you through setup

Troubleshooting
  • Windows Defender warning: Click "More info" → "Run anyway"
  • Connection error: Check if Ollama is running (system tray icon)
  • Firewall blocking: Allow AgentykChat through Windows Firewall

🐧 Linux Installation

Ubuntu/Debian

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
# Download AgentykChat .deb
wget https://github.com/sylvanity/agentyk-chat/releases/download/v1.0.0/agentyk-chat_1.0.0_amd64.deb
# Install
sudo dpkg -i agentyk-chat_1.0.0_amd64.deb
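
If dpkg reports missing dependencies during the install, apt can resolve them afterwards:

```shell
# Pull in any dependencies dpkg complained about, then finish the install
sudo apt-get install -f
```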

Fedora/RHEL

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
# Download & Install AgentykChat
wget https://github.com/sylvanity/agentyk-chat/releases/download/v1.0.0/agentyk-chat-1.0.0.x86_64.rpm
sudo rpm -i agentyk-chat-1.0.0.x86_64.rpm

Arch Linux

yay -S agentyk-chat

AppImage (Universal)

wget https://github.com/sylvanity/agentyk-chat/releases/download/v1.0.0/AgentykChat-v1.0.0.AppImage
chmod +x AgentykChat-v1.0.0.AppImage
./AgentykChat-v1.0.0.AppImage
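To keep the AppImage available as a regular command, one option (the paths below are just a suggestion) is to move it somewhere on your PATH:

```shell
# Move the AppImage to a directory on your PATH and give it a stable name
mkdir -p ~/.local/bin
mv AgentykChat-v1.0.0.AppImage ~/.local/bin/agentyk-chat
chmod +x ~/.local/bin/agentyk-chat

# Then launch it like any other command (assumes ~/.local/bin is on your PATH):
# agentyk-chat
```
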

Troubleshooting
  • Ollama not found: Ensure Ollama is in your PATH and running
  • AppImage won't run: Install FUSE: sudo apt install libfuse2
  • Missing dependencies: sudo apt install libgtk-3-0 libwebkit2gtk-4.0-37
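
On distributions where the install script registered Ollama as a systemd service (the service name is assumed to be ollama here), you can check and restart it with systemctl:

```shell
# Check whether the Ollama service is active
systemctl status ollama

# Restart it if the connection keeps failing
sudo systemctl restart ollama

# Follow its logs while reproducing the error
journalctl -u ollama -f
```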

Recommended Models to Start

For Beginners

  • llama2 (7B) - Good balance of quality and speed
  • mistral (7B) - Fast and capable

For Coding

  • codellama (7B/13B) - Optimized for code
  • deepseek-coder - Great for programming

For Better Quality

  • llama3 (8B) - Improved responses over llama2
  • mixtral (8x7B) - Excellent quality (needs 32GB+ RAM)

For Speed (Low-end Hardware)

  • phi (2.7B) - Tiny but capable
  • tinyllama (1.1B) - Fastest option
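
Any of the models above can be fetched and tried directly from the terminal with Ollama before selecting it in AgentykChat (mistral is used as an example below):

```shell
# Download a model (repeat for any model name listed above)
ollama pull mistral

# Chat with it in the terminal to sanity-check speed on your hardware
ollama run mistral

# Remove a model you no longer need to reclaim disk space
ollama rm mistral
```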

Need help? Check our FAQ or join our Discord