The internal values a neural network learns during training: essentially the "knowledge" of the model, encoded as numbers. When someone says a model has "7 billion parameters," they mean 7 billion individual numerical values that were adjusted during training to capture patterns in the data. More parameters generally mean more capacity to learn complex patterns, but also more memory to store the model and more compute to run it.
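To make "parameters" concrete, here is a small sketch that counts the learned values in a tiny feed-forward network. The layer sizes are hypothetical, chosen only for illustration; each fully connected layer contributes a weight matrix plus a bias vector, and every entry of both is one parameter.

```python
# Count parameters in a tiny feed-forward network (hypothetical layer sizes).
# Each fully connected layer has a weight matrix of shape (in_features, out_features)
# plus a bias vector of length out_features; every entry is one learned parameter.
layer_sizes = [768, 3072, 768]  # input -> hidden -> output

total = 0
for in_f, out_f in zip(layer_sizes, layer_sizes[1:]):
    weights = in_f * out_f  # entries in the weight matrix
    biases = out_f          # one bias per output unit
    total += weights + biases

print(f"{total:,} parameters")  # prints: 4,722,432 parameters
```

Even this toy two-layer block has millions of parameters; stacking dozens of much wider layers is how models reach billions.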
Why it matters
Parameter count is the most common shorthand for model size, and it directly determines how much GPU memory you need. A 7B model in 16-bit precision needs roughly 14 GB of VRAM just for the weights (7 billion parameters × 2 bytes each). Understanding parameters helps you estimate costs, choose hardware, and see why quantization (reducing the numeric precision of each parameter) is so important for making models accessible.
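The weights-only memory estimate above is simple arithmetic: parameter count times bytes per parameter. A sketch of that calculation across common precisions (note this ignores memory for activations, the KV cache, and framework overhead, so real requirements are higher):

```python
# Rough VRAM needed just to hold the weights of a 7B-parameter model,
# at different numeric precisions (bytes per parameter).
params = 7_000_000_000
bytes_per_param = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

for fmt, nbytes in bytes_per_param.items():
    gb = params * nbytes / 1e9  # decimal gigabytes
    print(f"{fmt}: {gb:.1f} GB")
# prints:
# fp32: 28.0 GB
# fp16: 14.0 GB
# int8: 7.0 GB
# int4: 3.5 GB
```

This is why quantization matters: dropping from fp16 to int4 cuts the weight footprint from 14 GB to 3.5 GB, moving a 7B model from datacenter GPUs into range of consumer hardware.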