Decision to use MXFP4 makes models smaller, faster, and more importantly, cheaper for everyone involved


Analysis  Whether or not OpenAI's new open-weight models are any good is still up for debate, but their use of a relatively new data type called MXFP4 is arguably more important, especially if it catches on among OpenAI's rivals.

The format promises massive memory and compute savings compared with the 16-bit data types traditionally used by LLMs, allowing cloud providers or enterprises to run the models on roughly a quarter of the hardware.
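
To put rough numbers on that claim: LLM weights have traditionally been stored in 16-bit formats such as BF16, while MXFP4 spends four bits per value plus a shared 8-bit scale for every block of 32 values, so roughly 4.25 bits per weight. The back-of-the-envelope sketch below (the parameter count is purely illustrative, not a figure from the article) shows where the "quarter of the hardware" figure comes from.

    # Back-of-the-envelope sketch (illustrative, not from the article): compare
    # weight-storage footprints in BF16 versus MXFP4. MXFP4 spends 4 bits per
    # value plus one shared 8-bit scale per 32-value block, ~4.25 bits per weight.

    def weight_memory_gb(num_params: float, bits_per_weight: float) -> float:
        """Approximate weight storage in GB (ignores activations, KV cache, etc.)."""
        return num_params * bits_per_weight / 8 / 1e9

    BF16_BITS = 16.0
    MXFP4_BITS = 4 + 8 / 32   # 4-bit elements + an 8-bit scale shared by 32 values

    params = 120e9            # illustrative parameter count, not an official figure
    print(f"BF16:  {weight_memory_gb(params, BF16_BITS):.0f} GB")   # ~240 GB
    print(f"MXFP4: {weight_memory_gb(params, MXFP4_BITS):.0f} GB")  # ~64 GB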

What the heck is MXFP4?

If you've never heard of MXFP4, that's because, while it's been in development for a while now, OpenAI's gpt-oss models are among the first mainstream LLMs to take advantage of it.

This is going to get really nerdy, really quickly, so we won't judge if you want to jump straight to the "why it matters" section.

MXFP4 is a 4-bit floating point data type defined by the Open Compute Project (OCP) as part of its Microscaling (MX) formats specification.
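
For a concrete, if simplified, picture of how the format works, here is a minimal NumPy sketch of quantizing one block of weights to MXFP4 and back. The block size of 32, the E2M1 element grid (0, 0.5, 1, 1.5, 2, 3, 4, 6) and the shared power-of-two scale come from the OCP Microscaling spec; the rounding and scale-selection details are simplified here and are not OpenAI's actual kernels.

    import numpy as np

    # Magnitudes representable by the 4-bit E2M1 element format (sign is a separate bit):
    E2M1_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])

    def mxfp4_roundtrip(block: np.ndarray) -> np.ndarray:
        """Quantize one 32-value block to a toy MXFP4 representation and back."""
        assert block.size == 32, "MX formats group values into blocks of 32"
        max_abs = np.max(np.abs(block))
        # Shared power-of-two scale: align the block's largest value with E2M1's
        # largest representable value (6 = 1.5 * 2**2). The real spec stores this
        # as an 8-bit E8M0 exponent; anything still above 6 saturates to the grid max.
        scale = 2.0 ** (np.floor(np.log2(max_abs)) - 2) if max_abs > 0 else 1.0
        scaled = block / scale
        # Snap each scaled value to the nearest representable E2M1 magnitude.
        nearest = np.argmin(np.abs(np.abs(scaled)[:, None] - E2M1_GRID[None, :]), axis=1)
        return np.sign(scaled) * E2M1_GRID[nearest] * scale

    weights = np.random.randn(32).astype(np.float32)
    print(np.max(np.abs(weights - mxfp4_roundtrip(weights))))  # worst-case rounding error

The key design choice in microscaling is that the scale is shared per 32-value block rather than per tensor, which preserves far more dynamic range than a single scale would while adding only a quarter of a bit of overhead per weight.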

