Rigbox workspaces come with a full Linux environment, SSH access, and a managed AI proxy — making them a natural home for AI coding agents. Tools like Claude Code, Codex CLI, OpenCode, and Gemini CLI work out of the box when you activate managed credits.
AI coding tools use the same provider SDKs (Anthropic, OpenAI, Google) that the managed proxy already supports. When you run `rig proxy on` inside a workspace, the proxy sets environment variables that redirect SDK traffic through Rigbox's infrastructure. Your credits cover the usage; no API keys are needed.
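As an illustration of the mechanism, the redirection is just ordinary environment variables that the provider SDKs read at startup. A minimal sketch follows; `ANTHROPIC_BASE_URL` comes from the text below, while the OpenAI and Google variable names and all URL values are placeholder assumptions, not what `rig proxy on` actually emits:

```shell
# Roughly what `eval $(rig proxy on)` does: export base-URL overrides
# so SDK traffic goes through the proxy. Values here are placeholders.
export ANTHROPIC_BASE_URL="https://proxy.example/anthropic"
export OPENAI_BASE_URL="https://proxy.example/openai"      # assumed name
export GOOGLE_GEMINI_BASE_URL="https://proxy.example/google"  # assumed name

# Any tool launched from this shell inherits the overrides.
echo "$ANTHROPIC_BASE_URL"
```

Because the overrides live in the shell environment, they apply to every tool started from that session and disappear when the session ends.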
Claude Code works immediately after `rig proxy on`: it reads `ANTHROPIC_BASE_URL` from the environment.
```shell
eval $(rig proxy on)
claude
```
Claude Code can edit files, run shell commands, manage git, and run tests inside the workspace. Since each workspace is an isolated VM, there’s no risk of the agent affecting other projects.
Claude Code uses Claude Sonnet by default. To use Opus, pass `--model claude-opus-4-20250514`. Opus consumes credits at a higher rate.
Aider supports multiple providers. The proxy environment variables are picked up automatically.
```shell
eval $(rig proxy on)

# Use with Anthropic (default after proxy on)
aider --model claude-sonnet-4-20250514

# Use with OpenAI
aider --model gpt-4o

# Use with Google
aider --model gemini/gemini-2.5-pro
```
Agentic tools like Claude Code and Codex can make dozens of API calls per task. A single complex refactoring session might use 50-200 credits depending on the model and codebase size. Monitor your balance if you’re on the free tier.
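To make that range concrete, here is a hypothetical back-of-envelope calculation; the per-call credit rates are illustrative assumptions, not published pricing:

```shell
# Hypothetical: ~50 tool calls in a complex session, at roughly
# 1-4 credits per call depending on model and context size.
calls=50
echo "cheaper model:   $((calls * 1)) credits"
echo "expensive model: $((calls * 4)) credits"
```

Under these assumed rates, the same session lands anywhere from 50 to 200 credits depending on the model chosen, which is why model selection matters on the free tier.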