AI has emerged as a pivotal general-purpose technology. Generative AI alone is expected to drive a 7% increase in global GDP, equivalent to around USD 7 trillion, while lifting productivity growth by 1.5 percentage points over the next decade.
Consumers and businesses are already well aware of the benefits of AI. At least three-quarters of consumers across the globe use generative AI tools, primarily for translation or text and image generation, while 88% of C-suite executives surveyed globally say that accelerating AI adoption in their businesses is a top priority for 2025.
This increased AI adoption is driving exponential growth in the total compute required. Compute for generative AI applications alone could grow 125-fold between 2024 and 2030, driven mainly by the rising number of use cases and users.
Access Partnership estimates that global data center capacity would need to grow by a further 130% to meet overall demand for AI inference by 2030 – a potential shortfall of approximately 18.7 QFLOPs of AI inference compute capacity.
Given current gaps in investment, power, and resources, closing that shortfall will be difficult.
This is where on-device AI comes in: running typically lightweight AI models on the device itself rather than on centralized servers. On-device AI is expected to be more power- and water-efficient, while also offering additional benefits such as customisation, efficiency, and security – all of which can complement existing data center infrastructure.
The future of AI will be hybrid. Access Partnership expects data centers will continue to play a central role in training and hosting the largest and most complex models. However, inference – the day-to-day workload of billions of users and enterprises – will increasingly shift to devices and edge infrastructure.
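To make the hybrid split concrete, the short Python sketch below shows one way an application might divide inference between a device and a data center: routine, lightweight requests are served by a small on-device model, while larger or less common workloads fall back to a cloud-hosted model. This is an illustrative assumption rather than anything prescribed by the report; the run_local_model and call_cloud_model functions, the task list, and the word-count threshold are all hypothetical placeholders for whatever on-device runtime and hosted API a product actually uses.

from dataclasses import dataclass

# Illustrative thresholds; a real deployment would tune these per device and model.
MAX_LOCAL_WORDS = 512                                      # longer prompts fall back to the cloud
LOCAL_TASKS = {"translate", "summarise", "autocomplete"}   # tasks a small on-device model handles well

@dataclass
class InferenceRequest:
    task: str               # e.g. "translate", "code_generation"
    prompt: str
    offline: bool = False   # True when the device currently has no connectivity

def run_local_model(request: InferenceRequest) -> str:
    # Placeholder for an on-device runtime running a lightweight (e.g. quantised) model.
    return f"[on-device] handled '{request.task}'"

def call_cloud_model(request: InferenceRequest) -> str:
    # Placeholder for a data-center-hosted model reached over an API.
    return f"[cloud] handled '{request.task}'"

def route(request: InferenceRequest) -> str:
    """Route a single inference request in a hybrid architecture: routine, short
    requests stay on the device (lower power draw, works offline, data never
    leaves the handset); everything else uses data-center capacity."""
    prompt_is_short = len(request.prompt.split()) <= MAX_LOCAL_WORDS
    if request.offline or (request.task in LOCAL_TASKS and prompt_is_short):
        return run_local_model(request)
    return call_cloud_model(request)

if __name__ == "__main__":
    print(route(InferenceRequest(task="translate", prompt="Bonjour tout le monde")))
    print(route(InferenceRequest(task="code_generation", prompt="Write a web server in Go")))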
Policymakers can promote a hybrid AI architecture by:
Fully leveraging AI will hinge not on building more data centers, but on distributing AI across billions of connected devices. This shift will allow AI to scale sustainably, equitably, and securely – positioning on-device AI as the cornerstone of an everywhere, always-on AI future.