In many situations, information about the observed value of one of the two variables X and Y gives information about the value of the other variable.
EXAMPLE
In Example 5.1, the marginal probability of X is .35 at one of its values and .25 at another. However, if we learn that Y equals the value corresponding to the last column of the joint probability table, then X can't possibly be 100, and the other two possibilities, 500 and 1000, are now equally likely. Thus knowing the value of Y changes the distribution of X; in such situations it is natural to say that there is dependence between the two variables.
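To make the effect of conditioning concrete, here is a small Python sketch. The joint pmf used is hypothetical (invented for illustration; it is not the table from Example 5.1). It shows how observing a value of Y restricts attention to one column of the table and renormalizes that column into a conditional distribution for X.

```python
# Hypothetical joint pmf p(x, y) for two discrete rv's X and Y.
# (Illustrative numbers only -- not the table from Example 5.1.)
joint = {
    (100, 0):  0.20, (100, 500):  0.15, (100, 1000):  0.00,
    (500, 0):  0.10, (500, 500):  0.10, (500, 1000):  0.15,
    (1000, 0): 0.05, (1000, 500): 0.10, (1000, 1000): 0.15,
}

def marginal_x(joint):
    """Marginal pmf of X: sum p(x, y) over all y for each x."""
    pX = {}
    for (x, y), p in joint.items():
        pX[x] = pX.get(x, 0.0) + p
    return pX

def conditional_x_given_y(joint, y_obs):
    """Conditional pmf of X given Y = y_obs: p(x, y_obs) / p_Y(y_obs)."""
    column = {x: p for (x, y), p in joint.items() if y == y_obs}
    pY = sum(column.values())
    return {x: p / pY for x, p in column.items()}

print(marginal_x(joint))                   # distribution of X before observing Y
print(conditional_x_given_y(joint, 1000))  # distribution of X after learning Y = 1000
```

With these hypothetical numbers, observing Y = 1000 rules out X = 100 and leaves 500 and 1000 equally likely, mirroring the behavior described above.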
In Chapter 2, we pointed out that one way of defining independence of two events is via the condition $P(A \cap B) = P(A) \cdot P(B)$. Here is an analogous definition for the independence of two rv's.
INDEPENDENCE
Two random variables $X$ and $Y$ are said to be independent if for every pair of $x$ and $y$ values,
$$p(x, y) = p_X(x) \cdot p_Y(y) \quad \text{when } X \text{ and } Y \text{ are discrete}$$
or
$$f(x, y) = f_X(x) \cdot f_Y(y) \quad \text{when } X \text{ and } Y \text{ are continuous} \tag{5.1}$$
If (5.1) is not satisfied for all $(x, y)$, then $X$ and $Y$ are said to be dependent.
The definition says that two variables are independent if their joint pmf or pdf is the product of the two marginal pmf’s or pdf’s. Intuitively, independence says that knowing the value of one of the variables does not provide additional information about what the value of the other variable might be. That is, the distribution of one variable does not depend on the value of the other variable.
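For discrete variables, condition (5.1) can be verified directly from a joint probability table by computing both marginals and comparing $p(x, y)$ with $p_X(x) \cdot p_Y(y)$ at every pair. The sketch below does this for hypothetical tables; the helper functions are my own, not notation from the text.

```python
from itertools import product

def marginals(joint):
    """Return the marginal pmf's p_X and p_Y from a joint pmf {(x, y): p}."""
    pX, pY = {}, {}
    for (x, y), p in joint.items():
        pX[x] = pX.get(x, 0.0) + p
        pY[y] = pY.get(y, 0.0) + p
    return pX, pY

def are_independent(joint, tol=1e-12):
    """Check condition (5.1): p(x, y) = p_X(x) * p_Y(y) for every pair (x, y)."""
    pX, pY = marginals(joint)
    return all(
        abs(joint.get((x, y), 0.0) - pX[x] * pY[y]) <= tol
        for x, y in product(pX, pY)
    )

# A joint pmf built as a product of marginals satisfies (5.1) by construction ...
indep = {(x, y): px * py
         for x, px in {0: 0.5, 1: 0.5}.items()
         for y, py in {0: 0.3, 1: 0.7}.items()}
print(are_independent(indep))   # True

# ... while shifting mass between cells (keeping the total at 1) breaks (5.1).
dep = dict(indep)
dep[(0, 0)] += 0.05
dep[(0, 1)] -= 0.05
print(are_independent(dep))     # False
```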
EXAMPLE 5.7 (Example 5.5 continued)
Independence of two random variables is most useful when the description of the experiment under study suggests that X and Y have no effect on one another. Then once the marginal pmf's or pdf's have been specified, the joint pmf or pdf is simply the product of the two marginal functions. It follows that
$$P(a \le X \le b,\; c \le Y \le d) = P(a \le X \le b) \cdot P(c \le Y \le d)$$
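As a rough check of this product rule, the following simulation sketch generates X and Y independently (the exponential distributions and the interval endpoints are illustrative choices of mine, not values from the text) and compares the estimated rectangle probability with the product of the two individual probabilities.

```python
import random

random.seed(1)

# Simulate independent X ~ exponential(rate 1) and Y ~ exponential(rate 2);
# these particular distributions are assumptions made for illustration.
n = 200_000
xs = [random.expovariate(1.0) for _ in range(n)]
ys = [random.expovariate(2.0) for _ in range(n)]

a, b, c, d = 0.5, 2.0, 0.25, 1.0

# Estimate P(a <= X <= b, c <= Y <= d) and the two one-dimensional probabilities.
p_joint = sum((a <= x <= b) and (c <= y <= d) for x, y in zip(xs, ys)) / n
p_x = sum(a <= x <= b for x in xs) / n
p_y = sum(c <= y <= d for y in ys) / n

print(p_joint, p_x * p_y)   # approximately equal, reflecting the factorization
```

Because the two samples are generated independently, the estimated joint probability agrees with the product of the marginal probabilities up to simulation error.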