This document introduces random variables, focusing on discrete cases before extending to continuous ones. A random variable assigns a numerical value to each outcome of an experiment; its probability distribution gives the likelihood of each value. The probability mass function p(x) maps each possible value to its probability, and these probabilities sum to one. The text distinguishes random variables (numerical values) from events (occurrences).

The expected value E(X) is computed by summing each value multiplied by its probability; it represents the average outcome over many trials. Higher moments, such as E(X²), are also introduced and are needed to compute variance. Variance, which measures the spread of the distribution, is defined as E[(X - E(X))²] = E(X²) - [E(X)]²; the standard deviation is the square root of the variance.

The moment generating function M_X(t) = E(e^(tX)) provides a convenient method for calculating moments through differentiation: the n-th derivative of M_X(t) evaluated at t = 0 equals E(Xⁿ).

The document then extends these concepts to continuous random variables, replacing sums with integrals and introducing the probability density function f(x), whose integral over the full range equals 1. Examples using coin flips and dice rolls illustrate the calculation of expected value, variance, and standard deviation, along with the application of the moment generating function.
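To make the discrete definitions concrete, here is a minimal Python sketch, assuming X counts heads in two fair coin flips (the PMF {0: 1/4, 1: 1/2, 2: 1/4} and the helper names expected_value and variance are illustrative assumptions, not taken from the source):

```python
import math

# Assumed example: X = number of heads in two fair coin flips.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}   # p(x) values sum to 1

def expected_value(pmf):
    # E(X) = sum over x of x * p(x)
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    # Var(X) = E(X^2) - [E(X)]^2
    ex = expected_value(pmf)
    ex2 = sum(x**2 * p for x, p in pmf.items())
    return ex2 - ex**2

ex = expected_value(pmf)   # 1.0
var = variance(pmf)        # 0.5
sd = math.sqrt(var)        # ~0.707
print(ex, var, sd)
```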
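The moment-from-differentiation property of M_X(t) can be checked symbolically. The sketch below, assuming a fair six-sided die, builds M_X(t) as a finite sum and recovers E(X) and E(X²) by differentiating at t = 0 (sympy is used here as a convenience; the source does not prescribe a tool):

```python
import sympy as sp

t = sp.symbols('t')
# Assumed example: fair six-sided die, values 1..6, each with probability 1/6.
p = sp.Rational(1, 6)

# M_X(t) = E(e^(tX)) = sum over x of p(x) * e^(t*x)
M = sum(p * sp.exp(t * x) for x in range(1, 7))

# First moment E(X): first derivative of M at t = 0
EX = sp.diff(M, t).subs(t, 0)        # 7/2
# Second moment E(X^2): second derivative at t = 0
EX2 = sp.diff(M, t, 2).subs(t, 0)    # 91/6
# Variance via E(X^2) - [E(X)]^2
var = sp.simplify(EX2 - EX**2)       # 35/12
print(EX, EX2, var)
```

The results E(X) = 7/2 and Var(X) = 35/12 agree with the direct sum-based computation, which is the point of the MGF: one function encodes all the moments.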
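For the continuous case, the same quantities follow by replacing the sums with integrals over the density. A short sketch, assuming X ~ Uniform(0, 1) with density f(x) = 1 on [0, 1] (an assumed example, chosen only because its moments are easy to verify):

```python
import math
from scipy import integrate

# Assumed example: X ~ Uniform(0, 1), density f(x) = 1 on [0, 1].
f = lambda x: 1.0

# For continuous X: E(X) = integral of x * f(x) dx,
# E(X^2) = integral of x^2 * f(x) dx over the support.
ex, _ = integrate.quad(lambda x: x * f(x), 0, 1)       # 0.5
ex2, _ = integrate.quad(lambda x: x**2 * f(x), 0, 1)   # ~0.3333
var = ex2 - ex**2                                      # ~1/12
print(ex, var, math.sqrt(var))
```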