Google releases pint-size Gemma open AI model
arstechnica.com
Big tech has spent the last few years creating ever-larger AI models, leveraging rack after rack of expensive GPUs to provide generative AI as a cloud service. But tiny AI matters, too. Google has announced a tiny version of its Gemma open model designed to run on local devices. Google says the new Gemma 3 270M can be tuned in a snap and maintains robust performance despite its small footprint.
Google released its first Gemma 3 open models earlier this year, featuring between 1 billion and 27 billion parameters. In generative AI, parameters are the learned values that control how the model processes inputs to predict output tokens. Generally, the more parameters a model has, the better it performs. With just 270 million parameters, the new Gemma 3 270M can run on devices like smartphones or even entirely inside a web browser.
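To see why 270 million parameters is small enough for a phone or browser, a back-of-the-envelope estimate helps: the raw weights take roughly parameters × bytes per parameter. The sketch below is an illustrative approximation (not an official Google figure); real memory use is higher once activations and runtime overhead are included.

```python
# Rough memory-footprint estimate for a model's weights alone:
# parameters * bytes per parameter. Actual memory use at inference time
# is larger (activations, caches, runtime overhead); this is a sketch.
PARAMS = 270_000_000  # Gemma 3 270M's parameter count

def weight_footprint_mb(num_params: int, bytes_per_param: float) -> float:
    """Approximate size of the raw weights in megabytes."""
    return num_params * bytes_per_param / 1e6

for name, bpp in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{name:>9}: ~{weight_footprint_mb(PARAMS, bpp):,.0f} MB")
```

At 16-bit precision the weights come to roughly half a gigabyte, and common quantization schemes shrink that further, which is why a model this size fits comfortably on consumer hardware.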
Running an AI model locally has numerous benefits ...