# AI Edition
The AI edition of starkite adds the `ai` module (multi-provider LLM client) and the `mcp` module (Model Context Protocol server + client) to the base edition; every base module remains available. It ships as the standalone `kiteai` binary and is also bundled into the all-in-one `kite` binary.
## Installation

```bash
# Build from source — produces ./bin/kiteai
make build-ai

# Or install via the edition manager (downloads from GitHub Releases)
kitecmd edition use ai
```
If you have the all-in-one `kite` binary installed, you already have the `ai` and `mcp` modules; no separate install is needed.
## Provider Credentials
The AI edition supports Anthropic, OpenAI, Google AI, and Ollama. Set the relevant environment variable for any provider you plan to use:
| Provider | Env var | Notes |
|---|---|---|
| Anthropic | `ANTHROPIC_API_KEY` | `claude-3-5-sonnet`, `claude-opus`, etc. |
| OpenAI | `OPENAI_API_KEY` | `gpt-4o`, `gpt-4o-mini`, etc. |
| Google AI | `GOOGLE_API_KEY` | `gemini-1.5-pro`, `gemini-flash`, etc. |
| Ollama | (none) | Local; default endpoint `http://localhost:11434`. Override with `ai.config(base_urls={"ollama": "..."})` or per-call `base_url=`. |
You can also configure credentials from within a script via `ai.config()`, which is useful when a script manages its own credentials:
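A minimal sketch, assuming `ai.config()` accepts an `api_keys` mapping analogous to the `base_urls` mapping shown in the table above (check the ai module reference for the exact keyword names):

```python
# Sketch only: base_urls= appears in the table above; api_keys= is an
# assumed keyword for supplying provider credentials from the script.
ai.config(
    api_keys={"anthropic": "sk-ant-..."},          # assumed parameter name
    base_urls={"ollama": "http://llm-host:11434"},
)
```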
## Quick Start
The fastest path to verifying the install uses Ollama (no API key required):
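For example, a minimal smoke-test script (assuming Ollama is running locally with a model such as `llama3.2` already pulled, and that `ai.generate()` returns the completion as a string; see the ai module reference for the exact return shape):

```python
# Verify the ai module can reach a local Ollama instance.
# Assumes ai.generate() returns the completion text as a string.
reply = ai.generate("Reply with the single word: pong", model="ollama/llama3.2")
print(reply)
```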
Or with Anthropic once ANTHROPIC_API_KEY is set:
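The same check against a hosted provider only changes the model string (the model name below is one of the examples used later in this page):

```python
# Reads ANTHROPIC_API_KEY from the environment; no ai.config() call needed.
reply = ai.generate(
    "Summarize the Model Context Protocol in two sentences.",
    model="anthropic/claude-sonnet-4-5",
)
print(reply)
```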
## Model Strings

Every call identifies the backend by a `provider/model-name` string:

```python
ai.generate(prompt, model="anthropic/claude-sonnet-4-5")
ai.generate(prompt, model="openai/gpt-4o-mini")
ai.generate(prompt, model="googleai/gemini-1.5-pro")
ai.generate(prompt, model="ollama/llama3.2")
```
Set a default so you don't have to repeat it:
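A sketch, assuming `ai.config()` exposes a default-model option (the keyword name below is an assumption; see the ai module reference for the real one):

```python
# default_model= is an assumed keyword name, not a confirmed option.
ai.config(default_model="anthropic/claude-sonnet-4-5")

# Later calls can then omit model=.
ai.generate("One-line summary of what the mcp module does.")
```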
## What's Included

| Module | Purpose | Reference |
|---|---|---|
| `ai` | Multi-provider LLM client (generate, chat, tools, agents) | ai module reference |
| `mcp` | MCP server (expose Starlark tools over stdio or HTTP) + client (call remote MCP servers) | mcp module reference |
All 27 modules from the base edition (`os`, `fs`, `http`, `ssh`, `json`, `yaml`, `concur`, etc.) remain available. Agents typically compose `ai.chat()` with these for tool implementations — see the agents guide.
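As a rough illustration of that composition (every signature below is an assumption; the confirmed APIs are in the ai module reference and the agents guide), a tool body can be an ordinary Starlark function that calls a base module:

```python
# Hedged sketch: the ai.tool() and ai.chat() keyword names, and the fs.read()
# call, are assumptions used only to show the shape of the composition.
def read_file(path):
    return fs.read(path)  # tool body implemented with the base fs module

reply = ai.chat(
    "What does ./README.md say about installation?",
    model="anthropic/claude-sonnet-4-5",
    tools=[ai.tool(read_file, description="Read a file from local disk")],
)
```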
## Editions Management

```bash
kitecmd edition status     # List installed editions
kitecmd edition use ai     # Install AI edition (downloads kiteai)
kitecmd edition use base   # Switch back to base
kitecmd edition remove ai  # Uninstall AI edition
```
When the `ai` edition is active, `kitecmd` transparently execs into `kiteai` for every command.
## Next Steps

- AI module reference — full `ai.generate()`/`ai.chat()`/`ai.tool()`/`ai.run_until()` signatures
- MCP module reference — `mcp.serve()` and `mcp.connect()` with stdio, HTTP, and TLS
- Building Agents — four composition patterns for multi-turn agents
- Embedding Libkite — drive agents from Go, with Starlark providing tool bodies