Ollama API Pricing

No models are listed here: Ollama runs models locally (free, self-hosted), so per-token pricing does not apply. Prices elsewhere on this site are per 1M tokens in USD.

| Model | Input $/1M | Output $/1M | Cache $/1M |
| --- | --- | --- | --- |

Track Ollama costs with LLMKit

Proxy your Ollama requests through LLMKit. Every call gets logged with token counts, dollar costs, and session attribution. Set budget limits that actually reject requests before they hit the provider.
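As a sketch of the proxy pattern described above: instead of calling Ollama's local endpoint directly, the client targets a proxy base URL and tags each request with a session identifier so calls can be attributed. The proxy URL (`http://localhost:8080/ollama`) and the `X-LLMKit-Session` header are assumptions for illustration, not LLMKit's documented interface; the request body follows Ollama's standard `/api/chat` shape.

```python
import json

# Hypothetical proxy endpoint; LLMKit's actual base URL may differ.
LLMKIT_PROXY = "http://localhost:8080/ollama"
# Ollama's default direct endpoint, shown for contrast.
OLLAMA_DIRECT = "http://localhost:11434"


def build_chat_request(model: str, prompt: str, session_id: str):
    """Build an Ollama /api/chat request routed through the proxy.

    Returns (url, headers, payload) ready for any HTTP client.
    The session header name is an assumption for illustration.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one complete response, simpler to log
    }
    headers = {
        "Content-Type": "application/json",
        "X-LLMKit-Session": session_id,  # hypothetical attribution header
    }
    url = f"{LLMKIT_PROXY}/api/chat"
    return url, headers, json.dumps(body)


url, headers, payload = build_chat_request("llama3.1", "Hello", "sess-42")
```

Swapping `LLMKIT_PROXY` back to `OLLAMA_DIRECT` bypasses logging entirely, which is why routing through the proxy is the whole point: the request shape stays the same, only the base URL changes.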

MIT licensed. Built with Claude Code. Source is available on GitHub.