Probability fundamentals, on dice
Five short pages, in order, that build up the language and the machinery the rest of /concepts assumes you have. Dice are the cleanest first example of probability — every outcome is countable, every probability is exact, and you can hold the whole distribution in your head. If you've never been formally introduced to probability distributions, expected value, variance, or convolutions, start here.
1. **What's a probability distribution?**
   From `1d6` (flat) to `2d6` (a tent) to `3d6` (smoother still). Sample space, PMF, support, and the convolution that takes you from one to the next.
2. **Expected value, in dice**
   The weighted average. Linearity of expectation — `E[X + Y] = E[X] + E[Y]`, always — and the `(M+1)/2` shortcut for fair `1dM` dice. Why the mean is often the wrong question.
3. **Variance and standard deviation**
   How spread out the distribution is. The `(M² − 1)/12` closed form for fair dice, additivity of variance under independence, and why `SD(2d6) ≈ 2.41`, not `2 · SD(1d6)`.
4. **Independence, sums, and convolutions**
   What independence means, why summing dice is convolving PMFs, when the closed-form mean and variance shortcuts apply, and when they don't (advantage, reroll mechanics).
5. **The normal approximation, and where it lies**
   `8d6` is bell-shaped, but `1d20` is not, `1d6!` is not, and `2d20kh1` is not. The central limit theorem, when to lean on it, and when to trust the engine instead.
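The machinery of pages 1–4 fits in a few lines. A minimal sketch in plain Python — the names `pmf_die`, `convolve`, `mean`, and `variance` are illustrative, not taken from any particular engine:

```python
from fractions import Fraction
import math

def pmf_die(m):
    """Exact PMF of a fair 1dM die: faces 1..M, each with probability 1/M."""
    return {face: Fraction(1, m) for face in range(1, m + 1)}

def convolve(pmf_a, pmf_b):
    """PMF of the sum of two independent variables: summing dice = convolving PMFs."""
    out = {}
    for a, pa in pmf_a.items():
        for b, pb in pmf_b.items():
            out[a + b] = out.get(a + b, Fraction(0)) + pa * pb
    return out

def mean(pmf):
    return sum(v * p for v, p in pmf.items())

def variance(pmf):
    mu = mean(pmf)
    return sum((v - mu) ** 2 * p for v, p in pmf.items())

d6 = pmf_die(6)
two_d6 = convolve(d6, d6)

assert mean(d6) == Fraction(6 + 1, 2)          # the (M+1)/2 shortcut
assert variance(d6) == Fraction(6**2 - 1, 12)  # the (M² − 1)/12 closed form
assert variance(two_d6) == 2 * variance(d6)    # variance adds under independence
print(round(math.sqrt(variance(two_d6)), 3))   # SD does not: prints 2.415, not 2 · 1.708
```

Using `Fraction` keeps every probability exact, which is the point of starting with dice: the assertions above are equalities, not floating-point approximations.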
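Page 5's claim can also be checked directly: convolve out the exact PMF of `8d6`, then set it beside the normal density with the same mean and standard deviation. A sketch under the same assumptions as above (helper names are illustrative):

```python
import math
from fractions import Fraction

def pmf_die(m):
    """Exact PMF of a fair 1dM die."""
    return {face: Fraction(1, m) for face in range(1, m + 1)}

def convolve(pmf_a, pmf_b):
    """PMF of the sum of two independent variables."""
    out = {}
    for a, pa in pmf_a.items():
        for b, pb in pmf_b.items():
            out[a + b] = out.get(a + b, Fraction(0)) + pa * pb
    return out

# Exact PMF of 8d6 by repeated convolution.
pmf = pmf_die(6)
for _ in range(7):
    pmf = convolve(pmf, pmf_die(6))

mu = 8 * (6 + 1) / 2                     # 28, by linearity of expectation
sigma = math.sqrt(8 * (6**2 - 1) / 12)   # ≈ 4.83, by additive variance

def normal_pdf(x):
    """Density of the matching normal distribution."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# Exact vs. normal near the mean: the bell is a close fit for 8d6.
for total in (20, 28, 36):
    print(f"{total}: exact {float(pmf[total]):.5f}, normal {normal_pdf(total):.5f}")
```

The same comparison run against a flat `1d20`, or against a skewed mechanic like `2d20kh1`, is where the approximation falls apart — which is the case for trusting the engine's exact convolution instead.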