Cloud-native computing is poised to explode, thanks to AI inference work
zdnet.com
ZDNET's key takeaways
- The CNCF is bullish about cloud-native computing working hand in glove with AI.
- AI inference is the technology expected to generate hundreds of billions of dollars for cloud-native companies.
- New kinds of AI-first clouds, such as neoclouds, are already appearing.
At KubeCon North America 2025 in Atlanta, leaders of the Cloud Native Computing Foundation (CNCF) predicted an enormous surge in cloud-native computing, driven by the explosive growth of AI inference workloads. How much growth? They're predicting hundreds of billions of dollars in spending over the next 18 months.
AI inference is the process by which a trained large language model (LLM) applies what it has learned to new data to make predictions, decisions, or classifications. In practical terms, the process goes like this. After a model is trained, say the new GPT ...
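The train-once, infer-many pattern described above can be sketched with a toy example. This is purely illustrative (real LLM inference runs a trained neural network, not a nearest-neighbor lookup), but the division of labor is the same: training produces a model, and inference applies that fixed model to new data.

```python
# Minimal sketch of training vs. inference, using a toy 1-nearest-neighbor
# classifier. Illustrative only: real LLM inference runs a trained neural
# network, but the train-once / infer-many pattern is the same.

def train(examples):
    # "Training": build the model from labeled (feature, label) examples.
    return list(examples)

def infer(model, x):
    # "Inference": apply the already-trained model to new data
    # to produce a prediction.
    nearest = min(model, key=lambda ex: abs(ex[0] - x))
    return nearest[1]

# Train once...
model = train([(1.0, "low"), (10.0, "high")])

# ...then serve many inference requests on inputs never seen in training.
print(infer(model, 2.0))
print(infer(model, 9.0))
```

The economic point the CNCF is making follows from this split: training happens once, but inference runs every time a user sends a request, so serving capacity scales with usage.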
Copyright of this story belongs solely to zdnet.com.

