According to the latest data released by OpenRouter, the world's largest API aggregation platform for AI models, China's AI large models recorded weekly usage of 4.69 trillion tokens as of March 15, 2026, surpassing the United States for the second consecutive week to take the top global position. J.P. Morgan projects that China's AI inference token consumption will surge from approximately 10 quadrillion tokens in 2025 to around 3,900 quadrillion by 2030, a roughly 390-fold increase (3,900 ÷ 10) over five years.

What Is a Token?
In artificial intelligence, a token is the fundamental unit of text or data a model processes. Whether it is a user's query, a paragraph of text, or AI-generated code, all content is segmented into tokens before the model can operate on it. Consequently, token usage has become a pivotal metric for evaluating AI model activity and real-world industrial impact: higher token throughput signals broader adoption and greater tangible value creation across applications.
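As a minimal illustration, the sketch below uses the open-source tiktoken tokenizer (one of many tokenization schemes; each model ships its own, and none is specified in the figures above) to show how a sentence is split into tokens and counted:

```python
# Minimal illustration: splitting text into tokens and counting them.
# Uses the open-source "tiktoken" tokenizer as an example; token counts
# differ between models because each uses its own tokenizer.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # a common byte-pair-encoding scheme

text = "Token usage has become a key metric for measuring AI adoption."
token_ids = enc.encode(text)  # list of integer token IDs

print(f"Characters: {len(text)}")
print(f"Tokens:     {len(token_ids)}")
print("Token pieces:", [enc.decode([t]) for t in token_ids])
```

Because providers typically meter and bill by tokens processed, per-request counts like these are what aggregate into the trillion-token usage figures cited above.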