How to run OpenAI's new gpt-oss-20b LLM on your computer


Hands On Earlier this week, OpenAI released two open-weight models under the gpt-oss name. Because the weights are freely downloadable, you can run them locally.

The lighter model, gpt-oss-20b, has 21 billion parameters and requires about 16GB of free memory. The heavier model, gpt-oss-120b, has 117 billion parameters and needs 80GB of memory to run. By way of comparison, a frontier model like DeepSeek R1 has 671 billion parameters and needs about 875GB to run, which is why LLM developers and their partners are building massive datacenters as fast as they can.
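To see where figures like these come from, a back-of-the-envelope estimate is simply parameters times bits per weight, divided by eight to get bytes. The bit widths below are assumptions for illustration (gpt-oss ships with roughly 4-bit quantized weights); real deployments also need headroom for activations and the KV cache, which is why the quoted requirements run higher than the raw weight footprint.

```python
def weight_memory_gb(params: float, bits_per_weight: float) -> float:
    """Raw memory needed just to hold the weights, in GB (1 GB = 1e9 bytes)."""
    return params * bits_per_weight / 8 / 1e9

# gpt-oss-20b: 21B parameters at an assumed ~4.25 bits per weight
print(round(weight_memory_gb(21e9, 4.25), 1))    # ~11.2 GB of weights alone

# gpt-oss-120b: 117B parameters at the same assumed width
print(round(weight_memory_gb(117e9, 4.25), 1))   # ~62.2 GB of weights alone
```

The gap between these raw figures and the article's 16GB and 80GB requirements is the runtime overhead mentioned above.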

Unless you're running a high-end AI server, you probably can't deploy gpt-oss-120b on your home system, but a lot of folks have the memory necessary to work with gpt-oss-20b. Your computer needs either a GPU with at least 16GB of dedicated VRAM, or 24GB or more of system memory (leaving at least 8GB free for the OS and other software ...
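The hardware rule of thumb above can be sketched as a simple check. The threshold numbers come from the article; the function itself is illustrative, not part of any official tooling.

```python
MODEL_MEMORY_GB = 16    # approximate memory gpt-oss-20b needs, per the article
OS_HEADROOM_GB = 8      # RAM to leave free for the OS and other software

def can_run_gpt_oss_20b(vram_gb: float, system_ram_gb: float) -> bool:
    """True if either the GPU or system RAM path described above is satisfied."""
    gpu_ok = vram_gb >= MODEL_MEMORY_GB
    ram_ok = system_ram_gb >= MODEL_MEMORY_GB + OS_HEADROOM_GB  # i.e. 24GB+
    return gpu_ok or ram_ok

print(can_run_gpt_oss_20b(vram_gb=8, system_ram_gb=32))   # True  (enough RAM)
print(can_run_gpt_oss_20b(vram_gb=16, system_ram_gb=16))  # True  (enough VRAM)
print(can_run_gpt_oss_20b(vram_gb=8, system_ram_gb=16))   # False (neither)
```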


Copyright of this story belongs to theregister.co.uk.