losclouds


GLM 4.5

GLM-4.5 is our latest flagship foundation model, purpose-built for agent-based applications. It leverages a Mixture-of-Experts (MoE) architecture and supports a context length of up to 128k tokens.

Creator: Zhipu AI
Lifecycle: Active
Context: 131.1K tokens
Max output: 98.3K tokens
Released: Jul 25, 2025
Status: unknown
Input: $0.60 / 1M tokens
Output: $2.20 / 1M tokens
Cached read: $0.11 / 1M tokens
Cached write: —
Batch discount: —
Source: OpenRouter
Verified: Apr 5, 2026 (High)
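The per-token rates above translate directly into per-request costs. A minimal sketch in Python, assuming cached input tokens are billed at the cached-read rate and the remaining input tokens at the standard input rate:

```python
def request_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0) -> float:
    """Estimate the USD cost of one GLM-4.5 request at the listed OpenRouter rates."""
    INPUT_RATE = 0.60 / 1_000_000   # $ per fresh input token
    OUTPUT_RATE = 2.20 / 1_000_000  # $ per output token
    CACHED_RATE = 0.11 / 1_000_000  # $ per cached-read input token
    fresh = input_tokens - cached_tokens
    return fresh * INPUT_RATE + cached_tokens * CACHED_RATE + output_tokens * OUTPUT_RATE

# Example: 10k input tokens (4k served from cache), 2k output tokens.
print(round(request_cost(10_000, 2_000, cached_tokens=4_000), 6))  # → 0.00844
```

Output costs dominate here (about 3.7× the input rate), so capping `max_tokens` is the main lever for controlling spend.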

Capabilities

Modalities: text → text
Capabilities: reasoning, promptCaching, functionCalling, structuredOutputs
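The functionCalling capability typically means the model accepts OpenAI-style tool schemas. A minimal sketch of one such schema (the `get_weather` tool here is purely illustrative, not part of this listing):

```python
import json

# An OpenAI-style tool definition that a functionCalling-capable model can be
# offered; it rides along with the chat payload under a "tools" key.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

print(json.dumps(weather_tool, indent=2))
```

When the model decides to call the tool, the response carries the function name and a JSON-encoded arguments string that the caller executes and feeds back.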
Official Links

Benchmark Coverage

Benchmark | Version | Score | Date | Source | Notes

Release History

Release | Alias | Lifecycle | Release Date | Deprecation | Shutdown | Summary
GLM 4.5 | z-ai-glm-4-5 | Active | Jul 25, 2025 | — | — | Model available via OpenRouter.

Host Coverage

Host | Type | Context | Pricing Note | Differences
OpenRouter | aggregator | 131.1K | $0.60/1M in · $2.20/1M out via OpenRouter | —
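OpenRouter exposes an OpenAI-compatible chat completions endpoint, so a request against this listing can be sketched as below. The model slug "z-ai/glm-4.5" is an assumption derived from the alias above; check it against OpenRouter's catalogue before use:

```python
import json

def build_chat_request(prompt: str, max_tokens: int = 1024) -> dict:
    """Build an OpenAI-compatible chat payload for OpenRouter.

    The slug "z-ai/glm-4.5" is assumed from this page's alias and may differ
    from the actual OpenRouter identifier.
    """
    return {
        "model": "z-ai/glm-4.5",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,  # keep well under the listed 98.3K max output
    }

payload = build_chat_request("Summarize MoE routing in two sentences.")
print(json.dumps(payload, indent=2))
# To send: POST https://openrouter.ai/api/v1/chat/completions with an
# "Authorization: Bearer <OPENROUTER_API_KEY>" header and this JSON body.
```

Because the payload shape matches the OpenAI API, existing OpenAI client libraries work by pointing their base URL at https://openrouter.ai/api/v1.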
Migration Guidance

Change Events
Date | Type | Title | Description | Source
Jul 25, 2025 | family_added | GLM 4.5 published | Model made available via OpenRouter. | OpenRouter

Other models from Zhipu AI

GLM 4 32B, GLM 4.5 Air, GLM 4.5V, GLM 4.6, GLM 4.6V, GLM 4.7, GLM 4.7 Flash, GLM 5, GLM 5 Turbo, GLM 5V Turbo