Why you'll pay more for AI in 2026, and 3 money-saving tips to try
zdnet.com
ZDNET's key takeaways
- Rising DRAM costs and more verbose chatbots will drive up prices.
- The industry seeks to mitigate costs with more efficient models.
- Users need to prioritize projects and consider polite prompting.
Whether you're a user of an AI chatbot or a developer utilizing large language models to build apps, you'll probably pay more for the technology this year. Thankfully, there are steps you can take to mitigate the cost.
We're living in a token economy. Each piece of content -- words, images, sounds, and so on -- is treated by an AI model as an atomic unit of work called a token. When you type a prompt into ChatGPT and receive a paragraph in response, or when you call an API to do the same thing inside an app you've ...
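To make per-token billing concrete, here is a minimal sketch of estimating the cost of a single API call by counting prompt tokens with the tiktoken library. The model name and the per-million-token prices are illustrative assumptions, not current rates, and the expected output length is a guess you supply.

```python
# Minimal sketch: estimate the cost of one chatbot/API call from token counts.
# Assumes the tiktoken library is installed; prices below are hypothetical
# placeholders for illustration, not actual published rates.
import tiktoken

PRICE_PER_1M_INPUT = 2.50    # hypothetical USD per million input tokens
PRICE_PER_1M_OUTPUT = 10.00  # hypothetical USD per million output tokens

def estimate_cost(prompt: str, expected_output_tokens: int,
                  model: str = "gpt-4o") -> float:
    """Count the prompt's tokens and estimate the cost of a single call."""
    enc = tiktoken.encoding_for_model(model)
    input_tokens = len(enc.encode(prompt))
    return (input_tokens * PRICE_PER_1M_INPUT
            + expected_output_tokens * PRICE_PER_1M_OUTPUT) / 1_000_000

# Example: a short prompt that expects roughly a 300-token answer.
print(f"${estimate_cost('Summarize this report in three bullet points.', 300):.6f}")
```

The same arithmetic explains why verbose chatbots and chatty prompts raise your bill: every extra word in either direction is another token multiplied by a price.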