Want to run AI on your PC? You're gonna need a bigger hard drive

PCWorld 

When people talk about the "size" of an AI model, they're referring to the number of "parameters" it contains. A parameter is one variable in the model that helps determine how it generates output, and a given AI model can have billions of them. Also referred to as model weights, these parameters must all be stored for the model to run -- so when a model has billions of parameters, storage requirements quickly balloon. In short, the storage space consumed by an LLM grows with its parameter count, and the same is true of other types of generative AI models, too.
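To make the relationship concrete, here is a minimal back-of-envelope sketch of how parameter count translates into storage. The specific figures (a 7-billion-parameter model stored at 16-bit precision) are illustrative assumptions, not numbers from the article:

```python
# Rough storage estimate for model weights:
# total bytes = number of parameters x bytes per parameter.

def model_size_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate size of model weights in gigabytes (10^9 bytes)."""
    return num_params * bytes_per_param / 1e9

# Illustrative example: a 7-billion-parameter model at 16-bit (2-byte) precision.
print(model_size_gb(7e9, 2))  # 14.0 (GB)

# The same model quantized to 4 bits (0.5 bytes) per parameter.
print(model_size_gb(7e9, 0.5))  # 3.5 (GB)
```

This is why downloading even a mid-sized local model can consume more disk space than many full games, and why lower-precision (quantized) versions of the same model are popular for PCs with limited storage.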