We engineers often ignore the distinctions between joint, marginal, and conditional probabilities - to our detriment.
Figure 1 - How the Joint, Marginal, and Conditional distributions are related.
Consider first the idea of a probability density or distribution, $f(x \mid \theta)$, where $f$ is the probability density of $x$, given the distribution parameters, $\theta$. For a normal distribution,

$$f(x \mid \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)$$

where $\mu$ is the mean and $\sigma$ is the standard deviation. This is sometimes called a pdf, probability density function. The integral of a pdf, the area under the curve (corresponding to the probability) between specified values of $x$, is a cdf, cumulative distribution function, $F(x \mid \theta) = \int_{-\infty}^{x} f(t \mid \theta)\, dt$. For discrete $f$, $F$ is the corresponding summation.
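To make the pdf/cdf relationship concrete, here is a minimal sketch in Python using only the standard library; the names normal_pdf and normal_cdf are illustrative, not taken from any particular library.

```python
from math import erf, exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    """f(x | mu, sigma): height of the normal density at x."""
    return exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    """F(x | mu, sigma): area under the pdf from -infinity to x."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# The probability that x lands in an interval is a difference of cdf values:
# P(a < x < b) = F(b) - F(a).
print(normal_cdf(1.0) - normal_cdf(-1.0))  # ~0.6827, the familiar 68% rule
```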
Joint probability is the probability of two or more things happening together, $f(x, y \mid \theta)$, where $f$ is the probability density of $x$ and $y$ together as a pair, given the distribution parameters, $\theta$. Often these events are not independent, and sadly this is often ignored. Furthermore, the correlation coefficient itself does NOT adequately describe these interrelationships.
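One way to see why the correlation coefficient is not enough: in the sketch below (assuming numpy is available), y is completely determined by x, yet their sample correlation is essentially zero.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = x ** 2  # y depends on x exactly...

# ...but for a symmetric x the correlation coefficient is ~0,
# so it says nothing about this (total) dependence.
print(round(np.corrcoef(x, y)[0, 1], 3))
```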
A joint probability density of two or more variables is called a multivariate distribution. It is often summarized by a vector of parameters, which may or may not be sufficient to characterize the distribution completely. For example, the normal is summarized (sufficiently) by a mean vector and covariance matrix.
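As a sketch of that summary (assuming numpy and scipy are available, with made-up numbers), the following builds a bivariate normal from a mean vector and covariance matrix, then recovers both from a sample, illustrating that this pair is a sufficient summary:

```python
import numpy as np
from scipy.stats import multivariate_normal

mean = np.array([1.0, 2.0])    # mean vector
cov = np.array([[1.0, 0.8],    # covariance matrix; the off-diagonal
                [0.8, 2.0]])   # terms carry the x-y dependence

f = multivariate_normal(mean=mean, cov=cov)
print(f.pdf([1.0, 2.0]))       # joint density f(x, y | mean, cov) at a point

# Estimating the same summary from a sample recovers the parameters:
sample = f.rvs(size=50_000, random_state=0)
print(sample.mean(axis=0))     # ~ [1.0, 2.0]
print(np.cov(sample.T))        # ~ cov
```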
Marginal probability is the probability density of $x$ over all possible values of $y$, given the distribution parameters:

$$f(x \mid \theta) = \int f(x, y \mid \theta)\, dy$$

The marginal probability is determined from the joint distribution of $x$ and $y$ by integrating over all values of $y$, called "integrating out" the variable $y$. In applications of Bayes's Theorem, $y$ is often a matrix of possible parameter values.

Conditional probability, $f(x \mid y, \theta)$, is the probability of $x$ by itself, given a specific value of the variable $y$ and the distribution parameters, $\theta$. (See Figure 1.) If $x$ and $y$ represent events $A$ and $B$, then $P(A \mid B) = n_{AB}/n_B$, where $n_{AB}$ is the number of times both $A$ and $B$ occur, and $n_B$ is the number of times $B$ occurs. Dividing numerator and denominator by $N$, the total number of observations, gives $P(AB) = n_{AB}/N$ and $P(B) = n_B/N$, so that

$$P(A \mid B) = \frac{P(AB)}{P(B)}$$

The figure illustrates these joint, marginal, and conditional probability relationships.
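A discrete sketch, assuming numpy and a made-up table of counts, ties the three ideas together: the joint table gives marginals by summing out one variable, and conditionals by the ratio $P(A \mid B) = P(AB)/P(B)$.

```python
import numpy as np

# counts[i, j] = number of times (x = i, y = j) was observed (made-up data)
counts = np.array([[20, 30],
                   [10, 40]])
N = counts.sum()

joint = counts / N              # P(x, y): the joint table
marginal_x = joint.sum(axis=1)  # "integrate out" y to get P(x)
marginal_y = joint.sum(axis=0)  # "integrate out" x to get P(y)

# Conditional: divide the joint by the marginal of the given variable.
cond_x_given_y0 = joint[:, 0] / marginal_y[0]  # P(x | y = 0)

print(marginal_x)        # [0.5 0.5]
print(cond_x_given_y0)   # ~[0.667 0.333]; not equal to P(x), so here
                         # x and y are not independent
```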