To model the joint behavior of more than two random variables, we extend the concept of a joint distribution of two variables.
joint pmf and pdf
If $X_1, X_2, \ldots, X_n$ are all discrete random variables, the joint pmf of the variables is the function

$$p(x_1, x_2, \ldots, x_n) = P(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n)$$

If the variables are continuous, the joint pdf of $X_1, \ldots, X_n$ is the function $f(x_1, x_2, \ldots, x_n)$ such that for any $n$ intervals $[a_1, b_1], \ldots, [a_n, b_n]$,

$$P(a_1 \le X_1 \le b_1, \ldots, a_n \le X_n \le b_n) = \int_{a_1}^{b_1} \cdots \int_{a_n}^{b_n} f(x_1, \ldots, x_n)\, dx_n \cdots dx_1$$
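As a quick illustration of the continuous case (the uniform density below is chosen only for this sketch and is not tied to any particular application), suppose $n = 3$ and

$$f(x_1, x_2, x_3) = \begin{cases} 1 & 0 \le x_1 \le 1,\ 0 \le x_2 \le 1,\ 0 \le x_3 \le 1 \\ 0 & \text{otherwise} \end{cases}$$

Then, for instance,

$$P\bigl(0 \le X_1 \le \tfrac{1}{2},\ 0 \le X_2 \le \tfrac{1}{2},\ 0 \le X_3 \le \tfrac{1}{2}\bigr) = \int_0^{1/2}\!\int_0^{1/2}\!\int_0^{1/2} 1 \, dx_3\, dx_2\, dx_1 = \Bigl(\tfrac{1}{2}\Bigr)^{3} = \tfrac{1}{8}$$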
The notion of independence of more than two random variables is similar to the notion of independence of more than two events.
independence
The random variables $X_1, X_2, \ldots, X_n$ are said to be independent if for every subset of the variables (each pair, each triple, and so on), the joint pmf or pdf of the subset is equal to the product of the marginal pmf's or pdf's.
Thus if the variables are independent with $n = 4$, then the joint pmf or pdf of any two variables is the product of the two marginals, and similarly for any three variables and for all four variables together. Intuitively, independence means that learning the values of some of the variables doesn't change the distribution of the remaining variables. Most importantly, once we are told that the $n$ variables are independent, the joint pmf or pdf is the product of the $n$ marginals.
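To make the product structure concrete, here is a sketch with assumed marginals (the exponential distributions and parameters $\lambda_1, \ldots, \lambda_4$ are purely illustrative): if $X_1, X_2, X_3, X_4$ are independent and each $X_i$ has an exponential pdf with parameter $\lambda_i$, then the joint pdf is

$$f(x_1, x_2, x_3, x_4) = \prod_{i=1}^{4} \lambda_i e^{-\lambda_i x_i}, \qquad x_1 \ge 0, \ldots, x_4 \ge 0,$$

and, for example, $P(X_1 > t_1, \ldots, X_4 > t_4) = e^{-\lambda_1 t_1} \cdots e^{-\lambda_4 t_4}$ for any $t_1, \ldots, t_4 \ge 0$, since each factor of the joint pdf integrates separately.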
In many experimental situations to be considered in this book, independence is a reasonable assumption, so that specifying the joint distribution reduces to deciding on appropriate marginal distributions.