Claude Code CLI is the best agentic coding front-end. GLM Coding Plan is the cheapest Anthropic-compatible back-end. Here is how to put them together.
Set two environment variables, open Claude Code, confirm /model shows GLM-5.1, ship code. Done.
Pick the right endpoint
Z.ai exposes two endpoints with identical protocols: use api.z.ai for USD billing and overseas access, or open.bigmodel.cn for CNY billing and mainland access.
Export ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY in your shell profile. Claude Code CLI reads these on startup and skips the Anthropic OAuth flow.
export ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
export ANTHROPIC_API_KEY=zai_xxxxxxxxxxxx
Open a project and run claude. The CLI prints a banner showing the active base URL on startup. Confirm it matches the Z.ai endpoint.
cd ~/repos/myapp && claude
Inside Claude Code, type /model. You should see GLM-5.1 (or GLM-5-Turbo for fast tasks). Use /model glm-5.1 to force the flagship.
Ask Claude Code to make a real change — something like "refactor src/auth.js to use async/await". GLM-5.1 handles most Claude Code workflows without any visible difference.
Configure MCP servers in ~/.claude/config as normal. GLM supports the MCP protocol. Some niche Anthropic-first MCP servers may behave slightly differently — file bugs back to Z.ai.
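As a sketch, a minimal entry in that file might look like the following. The server name, package, and directory are hypothetical placeholders, and the JSON shape follows the common mcpServers schema; adapt both to your own setup.

```shell
# Hypothetical example: register a filesystem MCP server for Claude Code.
# WARNING: this overwrites any existing config -- merge by hand if you have one.
mkdir -p ~/.claude
cat > ~/.claude/config <<'EOF'
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/me/repos"]
    }
  }
}
EOF
```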
Does any traffic still go to Anthropic?
No. When ANTHROPIC_BASE_URL points to Z.ai, zero traffic reaches Anthropic; billing happens on the GLM side only.
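Since routing follows ANTHROPIC_BASE_URL alone, a quick sanity check in your shell settles any doubt. This is a sketch; the check_backend helper is hypothetical, not part of the CLI:

```shell
# check_backend reports where API traffic will go, based on ANTHROPIC_BASE_URL.
check_backend() {
  case "${ANTHROPIC_BASE_URL:-}" in
    https://api.z.ai/*|https://open.bigmodel.cn/*) echo "routing to GLM" ;;
    *) echo "warning: traffic would go to Anthropic" ;;
  esac
}
check_backend
```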
Do MCP servers still work?
Yes. Z.ai has full MCP protocol support, and most third-party MCP servers tested against Anthropic continue to work unchanged.
Can I switch back to Anthropic mid-session?
Not mid-session. You need separate shell profiles (or to unset the env vars) to switch backends. Many developers alias one setup to claude-glm and the other to claude-real.
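Those aliases are a two-liner in your shell profile. A sketch for bash or zsh, with the key value as a placeholder:

```shell
# In ~/.bashrc or ~/.zshrc. The key below is a placeholder -- use your own.
alias claude-glm='ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic ANTHROPIC_API_KEY=zai_xxxxxxxxxxxx claude'
# claude-real strips the overrides so the CLI falls back to the Anthropic OAuth flow.
alias claude-real='env -u ANTHROPIC_BASE_URL -u ANTHROPIC_API_KEY claude'
```

Because the variables are set per-invocation rather than exported, each alias launches a session against its own backend without polluting the rest of the shell.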