GLM-4.5 - Z.AI API DOC
Overview
GLM-4.5 and GLM-4.5-Air are our latest flagship models, purpose-built as foundational models for agent-oriented applications. Both use a Mixture-of-Experts (MoE) architecture. GLM-4.5 has 355B total parameters with 32B active per forward pass, while GLM-4.5-Air adopts a more streamlined design with 106B total parameters and 12B active.
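As a quick orientation, the sketch below shows how either model might be called through an OpenAI-compatible client; the base URL, model identifiers ("glm-4.5", "glm-4.5-air"), and key name are assumptions here rather than guaranteed values, so check docs.z.ai for the authoritative endpoint and model IDs.

```python
# Minimal sketch, not official sample code: calling GLM-4.5 via an
# OpenAI-compatible client. Base URL and model IDs below are assumptions;
# see docs.z.ai for the authoritative values.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ZAI_API_KEY",                 # assumed: your Z.AI API key
    base_url="https://api.z.ai/api/paas/v4/",   # assumed endpoint
)

response = client.chat.completions.create(
    model="glm-4.5",  # or "glm-4.5-air" for the lighter model (assumed IDs)
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what a Mixture-of-Experts model is."},
    ],
)

print(response.choices[0].message.content)
```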
Both models share a similar training pipeline: an initial pretraining phase on 15 trillion tokens of general-domain ...
Read more at docs.z.ai