
Claude Code + GLM Coding Plan — The 2026 Integration Guide

Claude Code CLI is the best agentic coding front-end. GLM Coding Plan is the cheapest Anthropic-compatible back-end. Here is how to put them together.

TL;DR

Set two environment variables, open Claude Code, confirm /model shows GLM-5.1, ship code. Done.

Flow Overview

1. Pick the right endpoint
2. Set environment variables
3. Launch Claude Code
4. Verify the model
5. Test an agent task
6. Opt into MCP tools if needed

Detailed Steps

1

Pick the right endpoint

Z.ai exposes two endpoints with identical protocols. Use api.z.ai for USD billing and overseas access, open.bigmodel.cn for CNY billing and mainland access.
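The choice can be folded into a small helper. A minimal sketch: `pick_base_url` is a hypothetical function, and the open.bigmodel.cn path is an assumption that it mirrors the documented api.z.ai path — verify it against Z.ai's own docs before relying on it.

```shell
# Hypothetical helper: pick the Anthropic-compatible base URL by region.
# The open.bigmodel.cn path is assumed to mirror api.z.ai -- verify it.
pick_base_url() {
  if [ "$1" = "cn" ]; then
    echo "https://open.bigmodel.cn/api/anthropic"   # CNY billing, mainland access
  else
    echo "https://api.z.ai/api/anthropic"           # USD billing, overseas access
  fi
}

export ANTHROPIC_BASE_URL="$(pick_base_url global)"
echo "$ANTHROPIC_BASE_URL"
```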

2

Set environment variables

Export ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY in your shell profile. Claude Code CLI reads these on startup and skips the Anthropic OAuth flow.

export ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
export ANTHROPIC_API_KEY=zai_xxxxxxxxxxxx
3

Launch Claude Code

Open a project and run claude. The CLI prints a banner showing the active base URL on startup. Confirm it matches the Z.ai endpoint.

cd ~/repos/myapp && claude
4

Verify the model

Inside Claude Code, type /model. You should see GLM-5.1 (or GLM-5-Turbo for fast tasks). Use /model glm-5.1 to force the flagship.

5

Test an agent task

Ask Claude Code to make a real change — something like "refactor src/auth.js to use async/await". GLM-5.1 handles most Claude Code workflows without any visible difference.

6

Opt into MCP tools if needed

Configure MCP servers in ~/.claude/config as normal. GLM supports the MCP protocol. Some niche Anthropic-first MCP servers may behave slightly differently — file bugs back to Z.ai.
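As an illustration, here is a filesystem MCP server entry using the widely used mcpServers schema. Treat this as a sketch: the exact config file location and schema Claude Code expects can vary between versions, and the directory path is a placeholder.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/me/repos"]
    }
  }
}
```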

⚠️ Watch Out

  • Do not paste your Anthropic API key into ANTHROPIC_API_KEY — Claude Code will try to validate it against Z.ai and fail.
  • If /model shows claude-sonnet-4-6 anyway, the env vars did not load. Restart the terminal or source your rc file.
  • Sub-agents using extended thinking may regress slightly versus Opus — switch back to the Anthropic backend for those specific prompts.
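The first two pitfalls can be caught before launch with a quick pre-flight check. `check_glm_backend` is a hypothetical helper, a minimal sketch that assumes the base URL from step 2:

```shell
# Hypothetical pre-flight check: confirm the Z.ai override is active
# in the current shell before launching claude.
check_glm_backend() {
  if [ "${ANTHROPIC_BASE_URL:-}" = "https://api.z.ai/api/anthropic" ]; then
    echo "OK: Claude Code will route to Z.ai"
  else
    echo "WARN: env vars not loaded; source your rc file and retry"
  fi
}

check_glm_backend
```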

FAQ

Will my Claude Code Pro subscription be charged if I use GLM?

No. When ANTHROPIC_BASE_URL points to Z.ai, zero traffic reaches Anthropic — billing happens on the GLM side only.

Do MCP servers keep working?

Yes, Z.ai has full MCP protocol support. Most third-party MCP servers tested against Anthropic continue to work unchanged.

Can I alternate between real Claude and GLM in the same session?

Not mid-session. You need separate shell profiles (or must unset the env vars) to switch backends. Many developers alias one to claude-glm and the other to claude-real.
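Those aliases can be sketched as wrapper functions for your rc file. The names and the placeholder key are illustrative, not an official Z.ai convention:

```shell
# Hypothetical wrappers for ~/.bashrc or ~/.zshrc. Replace the key with
# your real Z.ai key; "claude" is the Claude Code CLI binary.
claude-glm() {
  ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic" \
  ANTHROPIC_API_KEY="zai_xxxxxxxxxxxx" \
  claude "$@"
}

claude-real() {
  # Drop the overrides in a subshell so Claude Code falls back to its
  # normal Anthropic login.
  (unset ANTHROPIC_BASE_URL ANTHROPIC_API_KEY; claude "$@")
}
```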