
Agent! for macOS 26
AI for your Mac desktop

Your AI assistant that autonomously controls your Mac desktop. Capture photos, send messages, edit code, manage files, and automate workflows, all through natural language.

Apple Intelligence

Leverages on-device machine learning for faster, more context-aware responses while keeping your data private.

Supported LLM Providers

Agent! works with all major LLM providers. Choose from cloud-based APIs or run locally for complete privacy.

🟒

OpenAI

GPT-4, GPT-4o, GPT-3.5

🟠

Claude

Claude 4, Claude 3.5, Claude 3

πŸ€—

HuggingFace

Open source models via API

πŸ”΅

DeepSeek

DeepSeek-V3, DeepSeek-Coder

πŸ¦™

Ollama Pro

Cloud-hosted Ollama

🧠

Apple Intelligence

C3PO Mediator

🏠

Local

Run models on your Mac

πŸ‘οΈ

vLLM

High-performance serving

πŸ”¬

LMStudio

Local model playground

More providers added regularly. Check the documentation for the latest supported APIs.
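If you plan to use one of the local options above, it helps to confirm the server is actually running before pointing Agent! at it. A minimal sketch, assuming the providers' usual default ports (Ollama 11434, LM Studio 1234, vLLM 8000; adjust if you changed them):

```shell
#!/bin/sh
# Probe a local LLM endpoint and report whether it responds.
probe() {
  name=$1
  url=$2
  if curl -fsS --max-time 2 "$url" >/dev/null 2>&1; then
    echo "$name: reachable at $url"
  else
    echo "$name: not running at $url"
  fi
}

probe "Ollama"    "http://localhost:11434/api/tags"
probe "LM Studio" "http://localhost:1234/v1/models"
probe "vLLM"      "http://localhost:8000/v1/models"
```

Any endpoint that reports "not running" needs its server started (or its port corrected) before Agent! can connect to it.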

Advanced Speech Recognition

Dictate requests hands-free with on-device speech recognition. Tap the microphone to speak naturally, or enable Voice Control for the "Agent!" command. You can even text Agent from your iPhone via Messages. Some setup required.

Agentic AI in Action

Watch Agent! execute tasks on your Mac desktop in real time.

Agent! performing autonomous tasks on macOS
Getting Started

Get Agent! up and running on your Mac in six simple steps.

1. Prerequisites
  • macOS 26 (Tahoe) or later
  • Xcode Command Line Tools (install via Terminal):
    xcode-select --install
  • An LLM API key, or a local LLM set up via LMStudio, vLLM, or Ollama
2. Install & Run
  • Download the DMG
  • Open the DMG and drag Agent! to /Applications
  • Open Agent! from Applications
3. Register Background Services
  • Click Register in the toolbar to install background services
  • The User Agent runs commands as your user account
  • The Privileged Daemon escalates to admin privileges when needed
4. Approve in System Settings
  • Go to System Settings β†’ General β†’ Login Items
  • Allow both Agent and AgentHelper
5. Configure Your Provider
  • Click the gear icon to open Settings
  • Claude: enter your Anthropic API key and select a model
  • Ollama Cloud: enter your Ollama Pro API key
  • Local Ollama: point to your local endpoint
    (32–128 GB RAM recommended)
6. Connect & Run
  • Click Connect to test the XPC services
  • Type a task in natural language
  • Press Run (or ⌘Enter); Agent! takes it from here
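The prerequisites in step 1 can be checked from Terminal before installing. A minimal sketch (the `version_ge` helper is illustrative, not part of Agent!):

```shell
#!/bin/sh
# Pre-flight check for Agent!: macOS version and Xcode Command Line Tools.

# version_ge A B: succeeds if dotted version A >= B.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

required="26.0"
current="$(sw_vers -productVersion 2>/dev/null || echo 0)"
if version_ge "$current" "$required"; then
  echo "macOS $current OK"
else
  echo "macOS $required or later required (found $current)"
fi

# Xcode Command Line Tools: xcode-select -p prints the developer dir if installed.
if xcode-select -p >/dev/null 2>&1; then
  echo "Command Line Tools installed"
else
  echo "Run: xcode-select --install"
fi
```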
Full Setup Guide on GitHub
Features
πŸ–₯️

Desktop Control

Click buttons, type text, and navigate any application through Accessibility APIs.

✏️

Code Editor

Read, write, and edit code files. Agent! understands project structure and makes intelligent edits.

πŸ”„

Automation

Build workflows with Custom Agents, shell commands, and accessibility actions.

🎨

Xcode Integration

Build projects, fix errors, and iterate on apps with AI assistance.

πŸ“

File Management

Create, read, edit, and organize files across your codebase.

🧠

MCP Support

Extend Agent! with Model Context Protocol servers for databases, APIs, and tools.
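Agent!'s exact MCP configuration format isn't documented here, but MCP servers are conventionally declared as a launch command plus arguments. A hypothetical entry following that common convention (the `mcpServers` key, the filesystem server package, and the path are all illustrative):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Projects"]
    }
  }
}
```

Check Agent!'s documentation for the actual configuration location and schema.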

Latest Release
What's Included
  • Native SwiftUI interface with Accessibility API integration
  • Custom Agents (ScriptingBridge) for app automation
  • Xcode project building and Swift dylib scripting
  • MCP server support and iMessage remote control
  • Vision support for screenshots and clipboard images
Requirements
  • macOS 26 (Tahoe) or later
  • Xcode Command Line Tools
  • Apple Silicon recommended

The most-tested LLM for use with Agent! is GLM-5.

Release History