Setup · 8 min read

Use Claude Code from China 2026 — Native Setup With GLM Coding Plan

Anthropic blocks mainland China. GLM Coding Plan and Kimi Code both speak the Anthropic protocol — point the Claude Code CLI at either one and the China problem disappears.

TL;DR

Install Claude Code CLI → set ANTHROPIC_BASE_URL + ANTHROPIC_API_KEY to GLM or Kimi → run claude. That is the whole guide. Details below.

Flow Overview

1. Subscribe to GLM Coding Plan
2. Grab your API key
3. Install Claude Code CLI
4. Point Claude Code at GLM
5. Run Claude Code normally
6. Verify the model

Detailed Steps

1. Subscribe to GLM Coding Plan

Open https://z.ai/subscribe (or bigmodel.cn for CNY billing) and pick a tier. Lite at $3 is fine for evaluation. You will need Alipay, WeChat Pay or a USD card.

2. Grab your API key

After subscribing, open the Z.ai console → API Keys → create a key scoped to your coding plan. Copy it to your clipboard.

3. Install Claude Code CLI

Follow the official install — npm install -g @anthropic-ai/claude-code or the native installer. This step does not require a proxy; npm mirrors work fine.

npm install -g @anthropic-ai/claude-code

4. Point Claude Code at GLM

Export two environment variables and skip the Anthropic login flow entirely. Add these to ~/.zshrc or ~/.bashrc so every shell picks them up.

export ANTHROPIC_BASE_URL=https://open.bigmodel.cn/api/anthropic
export ANTHROPIC_API_KEY=your_zai_key_here
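
To make the override permanent, the two exports can be appended to your shell profile in one step. A sketch, assuming zsh; the key value is a placeholder you replace with your own:

```shell
# Persist the GLM overrides in ~/.zshrc (use ~/.bashrc on bash).
# The key below is a placeholder - substitute your real Z.ai key.
cat >> ~/.zshrc <<'EOF'
export ANTHROPIC_BASE_URL=https://open.bigmodel.cn/api/anthropic
export ANTHROPIC_API_KEY=your_zai_key_here
EOF
```

Then open a new terminal (or source the file) so the variables take effect.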

5. Run Claude Code normally

Open a project and type claude. The CLI will talk to GLM-5.1 transparently. Slash commands, MCP servers and sub-agents all continue to work.

cd your-project
claude

6. Verify the model

Inside Claude Code type /model (or check /status) to confirm GLM is actually serving requests. You should see GLM-5.1 or GLM-5-Turbo in the active model line.

⚠️ Watch Out

  • Do not log into Claude Code with your Anthropic account — it will try to route to api.anthropic.com and fail behind the Great Firewall. Rely on environment variables only.
  • If claude still targets Anthropic, your shell did not pick up the env vars. Run echo $ANTHROPIC_BASE_URL and restart your terminal.
  • The Bigmodel mainland endpoint expects CNY billing. For USD billing use https://api.z.ai/api/anthropic instead.
  • npm install sometimes stalls behind the Firewall — use Alibaba's npm mirror (npm config set registry https://registry.npmmirror.com).
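
The env-var caveat can be checked mechanically before launching the CLI. A small helper sketch — the function name is my own, not a built-in Claude Code command:

```shell
# check_glm_routing: report where claude will send traffic.
# Hypothetical helper - not part of the Claude Code CLI.
check_glm_routing() {
  if [ -z "$ANTHROPIC_BASE_URL" ]; then
    echo "ANTHROPIC_BASE_URL is unset - claude will target api.anthropic.com"
    return 1
  fi
  echo "claude will route to: $ANTHROPIC_BASE_URL"
}
```

Run it after editing your profile; a non-zero exit means the override never reached your shell.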

FAQ

Do I need a VPN?

No. Once you set ANTHROPIC_BASE_URL to the Z.ai or Bigmodel endpoint, all Claude Code traffic routes through mainland-friendly domains.

Can I still use the same CLI for real Claude models later?

Yes. Unset the environment variables or maintain a separate shell profile. Claude Code has no lock-in once configured via env vars.
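
One way to keep both backends handy is a pair of shell functions in your profile. A sketch — use_glm and use_claude are illustrative names of my own, and the key is a placeholder:

```shell
# Toggle one shell between GLM and Anthropic's own backend.
# use_glm / use_claude are illustrative names, not CLI features.
use_glm() {
  export ANTHROPIC_BASE_URL=https://open.bigmodel.cn/api/anthropic
  export ANTHROPIC_API_KEY=your_zai_key_here   # placeholder key
}
use_claude() {
  # Dropping the overrides sends claude back to api.anthropic.com
  unset ANTHROPIC_BASE_URL ANTHROPIC_API_KEY
}
```

Call use_glm before claude for GLM, or use_claude to return to the normal Anthropic login flow in that shell.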

Does this work for Claude Code Max features?

The CLI features (slash commands, MCP, sub-agents) work. The specific model quality is whatever GLM provides, not Opus — so Max 20x features like extended thinking depth are not replicated.