Data Centre Insights – April 2026
As AI workloads move rapidly from training to real-time inference, the data centre is entering a fundamentally different era. We are seeing GPUs shift from specialist accelerators to the primary engines of modern compute, triggering structural change across power, cooling and facility design.
Inference-optimised GPUs are enabling larger AI models to run within individual accelerators, unlocking significant gains in efficiency and latency. This progress is not merely incremental: it is driving a step change in density, with racks now operating as tightly integrated systems rather than collections of discrete servers. Power demand, thermal intensity and structural loading are converging at levels that legacy designs were never built to accommodate.
This is where the challenge and the opportunity lie. The transition to inference-led architectures is raising the complexity of new development and placing renewed emphasis on flexibility. Future-ready data centres must be designed to absorb rapid technological evolution, not just meet today’s specifications. Structural capacity, electrical architecture and cooling strategies are becoming defining constraints rather than downstream considerations.
Read more about how evolving GPU architectures are reshaping the physical reality of data centres, and what this means for developers, operators and investors planning for long-term relevance in an AI-driven market. Download the full Data Centre Insights to explore the data and perspectives shaping the next generation of AI-ready data centres.
DC Insights is Cushman & Wakefield’s monthly market brief from the Asia Pacific Data Centre Group, offering timely insights on market movements, key developments and outlooks across the region.