Variables and Densities#
Random Variables#
Consider an experiment or a random event such as flipping a coin or measuring wind speed and direction at a specific time and location. A random variable is a function that associates a numerical value with the outcome of an experiment or random event. If the range of values of a random variable \(X\) is finite or countable (like the set of integers) then we call \(X\) a discrete random variable. If the range of values of a random variable \(X\) is uncountably infinite (like the set of real numbers \(\mathbb{R}\)) then we call \(X\) a continuous random variable. For example:
Flip a coin and let \(X = 1\) if the outcome is heads and \(X = 0\) if the outcome is tails. Then \(X\) is a discrete random variable.
Let \(X\) be the total number of points scored in a basketball game. Then \(X\) is a positive integer and so \(X\) is a discrete random variable.
Let \(X\) be the temperature measured at a specific time and location. The set of possible values of \(X\) is the set of positive real numbers and so \(X\) is a continuous random variable.
Throw a dart at a dartboard and let \(X\) be the distance from where the dart lands to the center of the board. The set of possible values of \(X\) is again the set of positive real numbers so \(X\) is a continuous random variable.
Let \(X\) be the wind direction measured at a specific time and location. The set of possible values of \(X\) is the interval \([0,2 \pi]\) and so \(X\) is a continuous random variable.
Probability Distributions#
Let \(X\) be a continuous random variable. If we could repeat the same experiment or observe the same random event to generate infinitely many values of \(X\) then the probability \(P(a \leq X \leq b)\) is the long-run relative frequency of the occurrence of the event \(a \le X \le b\). The probability distribution of \(X\) is the collection of probabilities \(P(a \leq X \leq b)\) for all intervals \([a,b] \subset \mathbb{R}\).
Probability Density Functions#
A probability density function is a function \(f : \mathbb{R} \rightarrow \mathbb{R}\) such that \(f(x) \ge 0\) for all \(x \in \mathbb{R}\) and

$$
\int_{-\infty}^{\infty} f(x) \, dx = 1
$$
We say that the probability distribution of a continuous random variable \(X\) corresponds to the density function \(f_X(x)\) if

$$
P(a \leq X \leq b) = \int_a^b f_X(x) \, dx
$$
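As a quick numerical check of these two properties, here is a sketch using a hypothetical example density, the exponential density \(f(x) = e^{-x}\) for \(x \geq 0\), assuming SciPy is available:

```python
import math
from scipy.integrate import quad

# A hypothetical example density: f(x) = e^(-x) for x >= 0, and 0 otherwise
def f(x):
    return math.exp(-x) if x >= 0 else 0.0

# A density must integrate to 1; f is zero below 0, so integrate on [0, inf)
total, _ = quad(f, 0, math.inf)

# P(1 <= X <= 2) is the area under f between 1 and 2, which is e^(-1) - e^(-2)
prob, _ = quad(f, 1, 2)

print(total, prob)
```

Here `quad` returns the integral and an error estimate; the computed `total` should agree with 1 to high accuracy, and `prob` with \(e^{-1} - e^{-2}\).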
Mean and Variance#
Let \(X\) be a continuous random variable with probability density function \(f_X(x)\). The mean of \(X\) is

$$
\mu = \int_{-\infty}^{\infty} x \, f_X(x) \, dx
$$
The mean describes the central value in the distribution of \(X\). The variance of \(X\) is

$$
\sigma^2 = \int_{-\infty}^{\infty} (x - \mu)^2 \, f_X(x) \, dx
$$
The variance is the average value of the squared distance from the mean \((x - \mu)^2\). The variance describes how spread out the distribution of \(X\) is.
Scaling and Shifting#
If \(f(x)\) is a probability density function then the shifted function \(f(x - b)\) is also a density function for any \(b \in \mathbb{R}\). Let’s see why this is true. First, \(f(x - b) \geq 0\) for all \(x \in \mathbb{R}\) since \(f(x) \geq 0\) for all \(x\). Second, compute the integral using the substitution \(y = x - b\), \(dy = dx\)

$$
\int_{-\infty}^{\infty} f(x - b) \, dx = \int_{-\infty}^{\infty} f(y) \, dy = 1
$$
Similarly, if \(f(x)\) is a probability density function then the scaled function \(a f(ax)\) is also a probability density function for any \(a > 0\). Let’s see why this is true. First, \(a f(a x) \geq 0\) for all \(x \in \mathbb{R}\) since \(f(x) \geq 0\) for all \(x\) and \(a > 0\). Second, compute the integral using the substitution \(y = ax\), \(dy = a \, dx\)

$$
\int_{-\infty}^{\infty} a f(ax) \, dx = \int_{-\infty}^{\infty} f(y) \, dy = 1
$$
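We can verify both facts numerically for the hypothetical exponential density \(f(x) = e^{-x}\), \(x \geq 0\), with example values \(b = 2\) and \(a = 3\) (the integrals are split at the kink of each density to help the integrator), assuming SciPy is available:

```python
import math
from scipy.integrate import quad

# Hypothetical example density: f(x) = e^(-x) for x >= 0, and 0 otherwise
def f(x):
    return math.exp(-x) if x >= 0 else 0.0

b, a = 2.0, 3.0  # example shift and scale values

# Shifted density f(x - b): split the integral at the kink x = b
shifted_total = (quad(lambda x: f(x - b), -math.inf, b)[0]
                 + quad(lambda x: f(x - b), b, math.inf)[0])

# Scaled density a * f(a x): split the integral at the kink x = 0
scaled_total = (quad(lambda x: a * f(a * x), -math.inf, 0)[0]
                + quad(lambda x: a * f(a * x), 0, math.inf)[0])

print(shifted_total, scaled_total)
```

Both totals should agree with 1 to high accuracy, matching the substitution arguments above.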
Parameter Estimation#
Suppose we observe a sample of \(N\) values \(x_1,\dots,x_N\) of a random variable \(X\). An estimate of the mean is given by the sample mean

$$
\bar{x} = \frac{1}{N} \sum_{n=1}^N x_n
$$
and an estimate of the variance is the (unbiased) sample variance

$$
s^2 = \frac{1}{N-1} \sum_{n=1}^N (x_n - \bar{x})^2
$$
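A sketch of these estimators on a hypothetical simulated sample, drawn from an Exponential(1) distribution whose true mean and variance are both 1, assuming NumPy is available:

```python
import numpy as np

# Hypothetical sample: N values drawn from an Exponential(1) distribution
rng = np.random.default_rng(42)
x = rng.exponential(scale=1.0, size=10_000)

N = len(x)
sample_mean = x.sum() / N                                # estimate of the mean
sample_var = ((x - sample_mean) ** 2).sum() / (N - 1)    # unbiased: divide by N - 1

print(sample_mean, sample_var)
```

Both estimates should land close to the true values (mean 1, variance 1), and `sample_var` matches NumPy's built-in `np.var(x, ddof=1)`, where `ddof=1` selects the \(N - 1\) divisor.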