1. The Evolution of Computing Power: From Human Intelligence to the Digital Revolution
“Computing power” has become a buzzword in today’s tech-driven world—a brilliant star illuminating the path of our digital future. But what exactly does it mean?
At its core, computing power refers to the ability to process information and generate desired outcomes—a capacity to calculate, analyze, and solve. It’s the engine behind everything from solving math problems to training artificial intelligence models.
Interestingly, humans themselves are natural carriers of computing power. Our brains are among the most powerful processing units in existence. Throughout our lives, we perform countless calculations—whether it’s mental math, logical reasoning, or everyday decision-making.
Think about it: when you calculate something in your head or estimate a total at the grocery store, you’re using your innate computing power. However, human brainpower alone struggles with complexity and scale.
So we started seeking external tools to enhance our capabilities.
In ancient times, early humans used ropes and stones as rudimentary counting aids. Though primitive, these tools marked the beginning of our journey toward expanding computational capacity.
As civilization advanced, more sophisticated tools emerged—such as counting rods and the abacus—which dramatically improved our ability to compute and organize information.
Then came a defining moment in the 20th century.
In February 1946, the world witnessed the unveiling of ENIAC, widely regarded as the first general-purpose electronic digital computer. It was a technological milestone that ushered in the modern computing era.
Like a flash of lightning cutting through the night, it announced that human computing power had entered the digital electronic age.
Since then, rapid advances in semiconductor technology have given rise to the chip era, and chips have gradually become the core carrier of computing power.
The 1970s–1980s: Chip Innovation and the Rise of Personal Computing
During the 1970s and 1980s, chip technology advanced at an astonishing pace under the guidance of Moore’s Law. Processing power soared while transistors shrank dramatically, packing ever more capability onto each chip. These innovations led to the miniaturization of computers and, eventually, the birth of the personal computer (PC).
In 1981, IBM launched the IBM 5150, widely recognized as the world’s first true PC. Its significance went far beyond hardware—it symbolized a new era where computing power was no longer reserved for large corporations. Instead, IT capabilities became accessible to individuals, small businesses, and households, unlocking the door to the information age for all.
With the help of PCs, people experienced improvements in both quality of life and workplace productivity. More importantly, the widespread adoption of personal computing laid the foundation for the explosive growth of the internet.
The 21st Century: Cloud Computing Reshapes the Compute Landscape
At the turn of the 21st century, cloud computing emerged and triggered another seismic shift in how computing power is delivered and consumed.
Before cloud computing, humanity was constrained by the limitations of single-point computing. Even earlier distributed models—like grid computing—struggled to meet the growing demand for large-scale computational workloads.
Cloud computing represented a breakthrough in distributed computing. It acted like a super-manager, aggregating fragmented compute resources and bundling them into a virtual, elastically scalable resource pool.
Within this shared pool, CPUs, memory, storage, GPUs, and other compute elements work together seamlessly. When users need computing power, resources are allocated dynamically based on demand. Users only pay for what they consume—eliminating the need to build and maintain costly data centers.
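To make dynamic allocation and pay-per-use concrete, here is a minimal sketch of carving capacity out of a shared pool and billing only for what is consumed. The class, prices, and figures are hypothetical; a real cloud adds virtualization, scheduling, quotas, and much more.

```python
from dataclasses import dataclass

@dataclass
class ResourcePool:
    """A toy model of a shared compute pool (illustrative only)."""
    vcpus: int
    price_per_vcpu_hour: float = 0.04  # hypothetical price

    def allocate(self, vcpus: int, hours: float) -> float:
        """Carve capacity out of the pool and return the pay-per-use charge."""
        if vcpus > self.vcpus:
            raise RuntimeError("insufficient capacity in the pool")
        self.vcpus -= vcpus
        return vcpus * hours * self.price_per_vcpu_hour

    def release(self, vcpus: int) -> None:
        """Return capacity to the pool for other tenants to use."""
        self.vcpus += vcpus

pool = ResourcePool(vcpus=10_000)
charge = pool.allocate(vcpus=64, hours=3)
print(f"Charged ${charge:.2f} for 64 vCPUs over 3 hours")  # Charged $7.68 ...
pool.release(64)
```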
Compared to self-hosted infrastructure, cloud computing delivers significant advantages in flexibility, cost-efficiency, and scalability. It marked a turning point, making advanced compute capabilities available to a much broader audience and driving digital transformation across nearly every sector.

As computing power moved into the cloud, data centers became its main carrier, and the total scale of available compute took another major leap forward.
2. The Diverse Landscape of Computing Power: Classification & Applications
In today’s digital era, cloud computing and data centers are thriving—driven by the deepening processes of informatization and digital transformation. Behind this boom lies a powerful surge in demand for computing power across all sectors of society.
These diverse demands weave a vibrant tapestry of computing applications, spanning consumer technology, industry, and urban governance.
Two Main Types: General-Purpose vs. Specialized Computing Power
To meet these varied needs, computing power has evolved into multiple categories. The most fundamental division is between general-purpose computing and specialized computing—primarily differentiated by the type of chips they rely on.
- General-purpose chips (such as x86-based CPUs) are like versatile all-rounders. They can handle a wide variety of tasks with flexibility, but they typically consume more power.
- Specialized chips, including FPGAs (Field Programmable Gate Arrays) and ASICs (Application-Specific Integrated Circuits), are like elite professionals in specific domains. FPGAs can be reprogrammed at the hardware level for different tasks, offering a balance between flexibility and performance, while ASICs are purpose-built from the ground up, with most algorithms hard-coded into silicon. Though highly efficient, ASICs are limited to very specific tasks.
A Practical Example: Cryptocurrency Mining
In the early days, people mined Bitcoin using PCs with general-purpose CPUs. As mining difficulty increased, they switched to GPUs. However, power consumption became a major concern—mining revenue often failed to offset electricity costs. As a result, miners shifted to more energy-efficient FPGA and ASIC clusters, optimizing for both performance and cost.
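The reason mining rewards specialized hardware is that proof-of-work boils down to brute-force hashing. The toy loop below is a greatly simplified sketch rather than real Bitcoin mining (which double-hashes an 80-byte block header against a 256-bit target), but it shows why the workload is pure, repetitive compute that ASICs excel at.

```python
import hashlib

def mine(block_data: str, difficulty: int = 5) -> tuple[int, str]:
    """Find a nonce whose SHA-256 digest starts with `difficulty` hex zeros."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1  # each extra leading zero multiplies the expected work by 16

nonce, digest = mine("example block", difficulty=5)
print(nonce, digest)
```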
Computing in Data Centers: General & High-Performance
Modern data centers handle two primary computing workloads: general-purpose computing and high-performance computing (HPC). HPC is further divided into three major categories:
(1) Scientific Computing
A field filled with complexity and mystery, scientific computing supports disciplines like physics, chemistry, environmental science, life sciences, oil exploration, and astronomy. These research domains generate massive datasets.
Take oil and gas exploration, for example—it’s like performing a full-body CT scan on the Earth. A single project can generate over 100 terabytes of raw data, sometimes exceeding 1 petabyte. Extracting valuable insights from such massive volumes requires immense computing power.
(2) Engineering Computing
This includes computer-aided engineering (CAE), computer-aided manufacturing (CAM), electronic design automation (EDA), and electromagnetic simulation. These applications are like interlocking gears—precise, synchronized, and highly demanding of compute resources.
(3) Intelligent Computing (AI)
Currently the most prominent and demanding computing category, intelligent computing encompasses machine learning, deep learning, and data analytics.
AI is a true “computational heavyweight” with an immense appetite for processing power. Its workloads lean heavily on operations such as matrix multiplication and vector arithmetic, which benefit from massive parallelism, and general-purpose CPUs often fall short here.
As a result, GPUs and dedicated AI chips have become the backbone of AI computing. Although originally designed for graphics rendering, GPUs contain thousands of small arithmetic cores, enabling them to run the same instructions across large datasets in parallel, which makes them ideal for AI workloads.
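As a rough illustration, a single dense-layer-sized matrix multiplication already involves billions of floating-point operations; the sizes below are arbitrary but show the scale.

```python
import numpy as np

n = 1024
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

c = a @ b            # one matrix multiplication, the core operation of most neural-network layers
flops = 2 * n ** 3   # multiply-add count for an (n x n) by (n x n) product
print(f"~{flops / 1e9:.1f} GFLOPs for a single {n}x{n} matmul")  # ~2.1 GFLOPs
```

A large model performs millions of such multiplications per training step, which is why massively parallel hardware dominates AI compute.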
Smart Compute Centers & Specialized Hardware
To support AI development, governments and enterprises across China are building AI-focused data centers, often called “Smart Compute Centers,” dedicated to intelligent computing. Meanwhile, supercomputing centers, home to systems like Tianhe-1A, handle massive scientific and engineering workloads.
Typical cloud data centers today handle a mix of general-purpose and high-performance tasks, including heterogeneous computing with diverse chip types. With the growing need for performance, specialized chips such as TPUs, NPUs, and DPUs are increasingly being adopted.
Offloading Compute Tasks to Specialized Chips
The concept of computing offload is gaining traction. It involves shifting specific tasks—like virtualization, data compression, encryption, and transmission—from CPUs to specialized chips like NPUs and DPUs, reducing the burden on general-purpose processors.
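A minimal sketch of the offload idea follows, assuming a hypothetical accelerator object with a `compress` method; real offload goes through vendor SDKs or kernel drivers, and the CPU path here simply uses zlib as a stand-in workload.

```python
import zlib

def compress(data: bytes, accelerator=None) -> bytes:
    """Compress data, offloading to an accelerator (e.g. a DPU) when one is available."""
    if accelerator is not None:
        return accelerator.compress(data)  # hypothetical device API: hand the work off the CPU
    return zlib.compress(data)             # general-purpose CPU fallback

print(len(compress(b"hello " * 1000)))     # runs on the CPU path when no accelerator is passed
```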
At the same time, frontier computing technologies—such as quantum computing and photonic computing—are beginning to emerge. Though still in early stages, they offer exciting potential and merit close attention.
3. Measuring Computing Power: Metrics & Trends
As a form of “capability,” computing power—like weight or distance—requires standard units of measurement.
The base unit is FLOPS (floating-point operations per second), commonly quoted in scaled multiples such as GFLOPS, TFLOPS, and PFLOPS. Additional metrics include MIPS, DMIPS, and OPS, depending on the use case and processor type.
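As a rough illustration of how such figures arise, theoretical peak throughput is often estimated as cores × clock rate × floating-point operations per cycle; the numbers below are purely hypothetical.

```python
cores = 4096            # parallel arithmetic units (hypothetical)
clock_hz = 1.5e9        # 1.5 GHz clock (hypothetical)
flops_per_cycle = 2     # e.g. one fused multiply-add counts as 2 floating-point ops

peak = cores * clock_hz * flops_per_cycle
print(f"Theoretical peak: {peak:.2e} FLOPS = {peak / 1e12:.1f} TFLOPS")  # ~12.3 TFLOPS
```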
Floating-point precision also varies:
- FP16 (half precision)
- FP32 (single precision)
- FP64 (double precision)
Each offers different trade-offs between speed, precision, and energy consumption.
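A quick way to see the trade-off is to store the same value at each precision; the snippet below uses NumPy’s float16/float32/float64 types, which keep roughly 3, 7, and 15–16 significant decimal digits and occupy 2, 4, and 8 bytes respectively.

```python
import numpy as np

x = 3.141592653589793
print(np.float16(x))   # 3.14              (half precision)
print(np.float32(x))   # 3.1415927         (single precision)
print(np.float64(x))   # 3.141592653589793 (double precision)

# Smaller types also mean less memory and faster arithmetic per value,
# which is why AI training often mixes FP16 with FP32 accumulation.
print(np.dtype(np.float16).itemsize,
      np.dtype(np.float32).itemsize,
      np.dtype(np.float64).itemsize)  # 2 4 8 bytes
```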
Trend: Explosive Growth in AI & HPC Compute
According to Huawei’s GIV (Global Industry Vision), by 2030:
- General-purpose compute power (FP32) will grow 10× to reach 3.3 ZFLOPS
- AI compute power (FP16) will grow 500× to reach 105 ZFLOPS
This staggering difference signals a major transformation in the global computing landscape, with intelligent computing poised to become the dominant force.
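Working backwards from those forecasts gives a rough sense of scale (assuming the growth multiples apply to a current-decade baseline, and 1 ZFLOPS = 10^21 FLOPS):

```python
ZFLOPS = 1e21

general_2030 = 3.3 * ZFLOPS    # FP32 forecast for 2030
ai_2030 = 105 * ZFLOPS         # FP16 forecast for 2030

print(f"Implied general-purpose baseline: ~{general_2030 / 10 / ZFLOPS:.2f} ZFLOPS")  # ~0.33
print(f"Implied AI baseline:              ~{ai_2030 / 500 / ZFLOPS:.2f} ZFLOPS")      # ~0.21
# AI compute starts from a smaller base but grows 50x faster, hence the shift
# toward intelligent computing described above.
```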
4. The Present & Future of Computing: Opportunities and Challenges
In 1961, John McCarthy, the “father of AI,” envisioned Utility Computing—computing services offered like public utilities. Today, that vision has become reality.
Computing is now as essential as electricity and water, while data centers and communication networks have become foundational infrastructure. Over decades, the IT and telecom industries have made immense contributions to society by realizing this vision.
Today, computing power has transcended technology—it’s an economic and philosophical concept. It underpins digital transformation and serves as the core productive force of the digital economy.
Governments, enterprises, and even individuals rely on computing for daily operations. In national security, defense, and scientific research, computing is indispensable. The strength of a nation’s compute capacity directly affects its economic trajectory and level of technological advancement.
A Global Race for Compute Power
Across the world, a clear correlation exists between compute capacity and economic development. The more powerful a country’s computing infrastructure, the stronger its innovation and competitiveness.
This has sparked a global race: a contest waged without gunpowder, but with very high stakes.
5. Looking Ahead: A New Computing Era Is Dawning
As informatization, digitization, and intelligent technologies accelerate, we are entering an era of ubiquitous intelligent connectivity.
The rise of smart IoT devices and AI-driven applications will generate unimaginable volumes of data, fueling demand for even more computing power.
In this new era of opportunity and challenge, a new compute revolution is on the horizon. The scarcity of advanced computing chips presents both a challenge and a chance for innovation.
As computing infrastructure continues to evolve, we stand at the threshold of a new digital spring. And each of us—developers, entrepreneurs, users—will become both witnesses and participants in this transformative journey.