AI workloads require more compute resources, more energy, and faster data throughput than conventional workloads, and these demands are reshaping data center market dynamics as well as cloud economics. Demand for data center capacity is surging: global demand is forecast to more than triple between 2023 and 2030,¹ raising the prospect of a supply shortage. Notoriously data-intensive, AI workloads are energy-intensive too, consuming substantially more power per server than traditional workloads,² further adding to operational costs.
As AI workloads proliferate, deployment patterns mature, and data center economics evolve, cloud consumers are reconsidering where to run their workloads to optimize for cost and performance. Understanding the “nice-to-haves” and “must-haves” for AI workloads can help save time, money, and effort in setting up AI initiatives for success while managing other operational requirements.
Rising Demands Prompt Cloud Rethink
The synergistic relationship between cloud computing and AI has accelerated the demand for each: Cloud enables access and scalability while providing the compute resources to satisfy the data intensity, volume, and velocity demands of AI workloads. The marketplace for cloud platforms has also matured beyond simple public cloud deployments toward more complex hybrid and multi-cloud deployments. However, the question for cloud consumers embracing AI workloads is the same as ever: Is a public or private cloud the better option for a given workload? The answer depends on the preferred mix of cost, control, and performance.
The benefits of public cloud are well known: financial flexibility; scalability, including of GPU resources; and lighter operational requirements, among others. Private clouds, however, are enjoying a resurgence, driven by a sharper focus on data capabilities and cost efficiency as well as data compliance considerations. In fact, more than half of AI workloads now run on private clouds.³ Although many public clouds offer robust protections, their shared responsibility models place the onus of securing workloads and data on consumers. Private cloud consumers are responsible for security both of and in the cloud, but in return they gain greater control and potential long-term savings through reduced data transfer expenses and more predictable costs.
Ideally, workload placement is determined by weighing these factors against cost pressures and business-specific concerns to optimize expenses, performance, and control. The breadth of business considerations and use cases for AI workloads explains the rising popularity of hybrid and multi-cloud environments, which give organizations the option to run each app and service in the cloud environment that makes the most sense.
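To make that balancing act concrete, here is a minimal sketch of one way an organization might score candidate environments against weighted criteria. The criteria, weights, ratings, and environment names are hypothetical placeholders for illustration only, not a prescribed framework; real placement decisions involve workload-specific requirements such as data residency rules and GPU availability.

```python
# Hypothetical sketch: weighing workload-placement trade-offs as a simple
# weighted score. All criteria, weights, and ratings are illustrative
# placeholders, not a prescribed decision framework.

# How much the organization cares about each criterion (weights sum to 1.0).
WEIGHTS = {
    "cost_efficiency": 0.30,
    "control_and_compliance": 0.30,
    "performance": 0.25,
    "operational_overhead": 0.15,
}

# Example ratings (1 = weak, 5 = strong) for each candidate environment.
CANDIDATES = {
    "public_cloud":  {"cost_efficiency": 3, "control_and_compliance": 2,
                      "performance": 4, "operational_overhead": 5},
    "private_cloud": {"cost_efficiency": 4, "control_and_compliance": 5,
                      "performance": 4, "operational_overhead": 2},
    "hybrid":        {"cost_efficiency": 4, "control_and_compliance": 4,
                      "performance": 4, "operational_overhead": 3},
}

def score(ratings: dict[str, int]) -> float:
    """Weighted sum of an environment's ratings across all criteria."""
    return sum(WEIGHTS[criterion] * rating for criterion, rating in ratings.items())

if __name__ == "__main__":
    # Rank candidate environments from highest to lowest weighted score.
    for name, ratings in sorted(CANDIDATES.items(), key=lambda kv: -score(kv[1])):
        print(f"{name}: {score(ratings):.2f}")
```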
Other Determining Factors for Workload Placement
Let’s look at several other factors in matching workloads with appropriate environments.
- Security concerns are paramount in any organization, and highly regulated industries in particular require highly secure environments. It’s important to separate confidential and public datasets to prevent unauthorized use of private information, as can happen when training AI models.⁴ Sensitive data is often housed in private environments and protected by multi-layer security and zero-trust architectures.
- Latency and mobility concerns also drive workload placement. Moving large datasets between remote cloud environments is time-consuming and expensive, and some real-time workloads, such as those processing data from IoT applications or autonomous vehicles,⁵ require a combination of local clouds and edge compute resources.
- Sustainability ranks alongside energy efficiency as a top priority for many organizations.⁶ To minimize environmental impact while maintaining resource availability, organizations may place AI workloads in energy-efficient on-premises systems or in green data centers powered by renewable energy.
- Skills gaps are a limiting factor in workload placement and management, especially for organizations that manage and operate their own data centers.⁷ The availability of in-house expertise with specific platforms can determine not only where workloads are deployed but also how processes and teams are structured. Vendor tools and training can help flatten learning curves and reduce upfront costs.
Striving for Balance amid Disruption
As AI continues to reshape business and technology landscapes, organizations are giving careful thought to how they leverage the cloud to achieve their objectives. Understanding the comparative advantages of public clouds, private clouds, and on-premises environments for AI workloads is essential for finding the right model to support your business.
1. McKinsey & Co. “AI power: Expanding data center capacity to meet growing demand.” Oct 2024.
2. SemiAnalysis. “AI Datacenter Energy Dilemma—Race for AI Datacenter Space.” Mar 2024.
3. Wanclouds. “2025 Cloud and AI Index.” Jan 2025.
4. Corporate Compliance Insights. “If the AI Industry Doesn’t Establish Methods to Protect Private Data, Someone Else Will.” Jul 2024.
5. XenonStack. “Edge AI for Autonomous Vehicles.” Nov 2024.
6. ITProToday. “IT Sustainability Trends and Predictions 2025 from Industry Insiders.” Jan 2025.
7. ProSource. “Navigating Labor Challenges In The Data Center Industry: Strategies For Success.” Sep 2024.