SUPPLEMENTARY EXERCISES (31-38)

  31. An estimator θ̂ is said to be consistent if for any ε > 0, P(|θ̂ − θ| ≥ ε) → 0 as n → ∞. That is, θ̂ is consistent if, as the sample size gets larger, it is less and less likely that θ̂ will be further than ε from the true value of θ. Show that X̄ is a consistent estimator of μ when σ² < ∞ by using Chebyshev's inequality from Exercise 44 of Chapter 3. [Hint: The inequality can be rewritten in the form

P(|Y − μ_Y| ≥ ε) ≤ σ²_Y/ε²

Now identify Y with X̄.]
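With Y = X̄ the bound becomes P(|X̄ − μ| ≥ ε) ≤ σ²/(nε²), which shrinks as n grows. A quick Monte Carlo sketch (using a hypothetical Exponential(1) population, not part of the exercise) illustrates the shrinking exceedance probability:

```python
import random
import statistics

random.seed(1)

def exceed_prob(n, eps=0.5, reps=2000, mu=1.0):
    """Estimate P(|Xbar - mu| >= eps) for samples of size n from an
    Exponential(1) population (mean 1, variance 1)."""
    count = 0
    for _ in range(reps):
        xbar = statistics.fmean(random.expovariate(1.0) for _ in range(n))
        if abs(xbar - mu) >= eps:
            count += 1
    return count / reps

p_small, p_large = exceed_prob(5), exceed_prob(200)
print(p_small, p_large)  # the exceedance probability drops as n grows
```

Consistency says only that the probability tends to 0, which is exactly what the simulated frequencies track.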

  32. a. Let X₁, …, Xₙ be a random sample from a uniform distribution on [0, θ]. Then the mle of θ is θ̂ = Y = max(Xᵢ). Use the fact that Y ≤ y iff each Xᵢ ≤ y to derive the cdf of Y. Then show that the pdf of Y is

f_Y(y) = n·yⁿ⁻¹/θⁿ for 0 ≤ y ≤ θ, and 0 otherwise

b. Use the result of part (a) to show that the mle is biased but that [(n + 1)/n]·max(Xᵢ) is unbiased.
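As a numerical sketch (not a substitute for the requested derivation): simulating from a Uniform(0, θ) population with hypothetical θ and n shows E[max Xᵢ] ≈ nθ/(n + 1) < θ, while the rescaled estimator averages out near θ.

```python
import random

random.seed(2)
theta, n, reps = 10.0, 5, 20000  # hypothetical values for illustration

mles, rescaled = [], []
for _ in range(reps):
    y = max(random.uniform(0, theta) for _ in range(n))  # mle = max(X_i)
    mles.append(y)
    rescaled.append((n + 1) / n * y)  # candidate unbiased estimator

mean_mle = sum(mles) / reps            # near n*theta/(n+1) = 8.33, below theta
mean_rescaled = sum(rescaled) / reps   # near theta = 10
print(mean_mle, mean_rescaled)
```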

  33. At time t = 0, there is one individual alive in a certain population. A pure birth process then unfolds as follows. The time until the first birth is exponentially distributed with parameter λ. After the first birth, there are two individuals alive. The time until the first gives birth again is exponential with parameter λ, and similarly for the second individual. Therefore, the time until the next birth is the minimum of two exponential(λ) variables, which is exponential with parameter 2λ. Similarly, once the second birth has occurred, there are three individuals alive, so the time until the next birth is an exponential rv with parameter 3λ, and so on (the memoryless property of the exponential distribution is being used here). Suppose the process is observed until the sixth birth has occurred and the successive birth times are 25.2, 41.7, 51.2, 55.5, 59.5, 61.8 (from which you should calculate the times between successive births). Derive the mle of λ. [Hint: The likelihood is a product of exponential terms.]
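Since the k-th inter-birth gap x_k is exponential with parameter kλ, the likelihood is ∏ kλ·e^(−kλx_k), maximized at λ̂ = 6/Σ k·x_k. A sketch of the evaluation, assuming the birth times 25.2, 41.7, 51.2, 55.5, 59.5, 61.8 given in the exercise:

```python
births = [25.2, 41.7, 51.2, 55.5, 59.5, 61.8]  # successive birth times
gaps = [b - a for a, b in zip([0.0] + births, births)]  # inter-birth times

# The k-th gap is Exponential(k*lambda); the log-likelihood
# sum(log(k*lam) - k*lam*x_k) is maximized at lam_hat = n / sum(k*x_k).
denom = sum(k * x for k, x in enumerate(gaps, start=1))
lam_hat = len(gaps) / denom
print(round(lam_hat, 4))
```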

  34. The mean squared error of an estimator θ̂ is MSE(θ̂) = E[(θ̂ − θ)²]. If θ̂ is unbiased, then MSE(θ̂) = V(θ̂), but in general MSE(θ̂) = V(θ̂) + (bias)². Consider the estimator σ̂² = KS², where S² = sample variance. What value of K minimizes the mean squared error of this estimator when the population distribution is normal? [Hint: It can be shown that

E[(S²)²] = (n + 1)σ⁴/(n − 1)

In general, it is difficult to find θ̂ to minimize MSE(θ̂), which is why we look only at unbiased estimators and minimize V(θ̂).]
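Assuming the normal-theory identity E[(S²)²] = (n + 1)σ⁴/(n − 1), expanding gives MSE(KS²) = K²(n + 1)σ⁴/(n − 1) − 2Kσ⁴ + σ⁴, a quadratic in K minimized at K = (n − 1)/(n + 1). A grid scan over K (hypothetical n) agrees with the calculus answer:

```python
def mse(K, n, sigma2=1.0):
    # MSE(K*S^2) = K^2 * E[(S^2)^2] - 2K * sigma^2 * E[S^2] + sigma^4,
    # with E[S^2] = sigma^2 and E[(S^2)^2] = (n+1)*sigma^4/(n-1) (normal case)
    s4 = sigma2 ** 2
    return K * K * (n + 1) * s4 / (n - 1) - 2 * K * s4 + s4

n = 10  # hypothetical sample size
Ks = [k / 1000 for k in range(1, 2001)]
best = min(Ks, key=lambda K: mse(K, n))
print(best, (n - 1) / (n + 1))  # grid minimizer sits at (n-1)/(n+1)
```

Note that K = (n − 1)/(n + 1) is smaller than both K = 1 (the unbiased S²) and K = (n − 1)/n (the normal mle), so the MSE-optimal estimator deliberately accepts some bias.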

  35. Let X₁, …, Xₙ be a random sample from a pdf that is symmetric about μ. An estimator for μ that has been found to perform well for a variety of underlying distributions is the Hodges-Lehmann estimator. To define it, first compute for each i and each j ≥ i the pairwise average X̄ᵢⱼ = (Xᵢ + Xⱼ)/2. Then the estimator is μ̂ = the median of the X̄ᵢⱼ's. Compute the value of this estimate using the data of Exercise 44 of Chapter 1. [Hint: Construct a square table with the xᵢ's listed on the left margin and on top. Then compute averages on and above the diagonal.]
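The pairwise-average computation can be sketched in code; the data of Exercise 44 of Chapter 1 are not reproduced here, so a hypothetical sample is used instead:

```python
from itertools import combinations_with_replacement
from statistics import median

def hodges_lehmann(xs):
    """Median of the pairwise averages (x_i + x_j)/2 over all i <= j."""
    return median((a + b) / 2 for a, b in combinations_with_replacement(xs, 2))

sample = [4.2, 5.1, 3.8, 6.0, 5.5]  # hypothetical data, not Exercise 44's
print(hodges_lehmann(sample))
```

`combinations_with_replacement` produces exactly the "on and above the diagonal" pairs of the hint, including each xᵢ averaged with itself.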

  36. Each of n specimens is to be weighed twice on the same scale. Let Xᵢ and Yᵢ denote the two observed weights for the ith specimen. Suppose Xᵢ and Yᵢ are independent of one another, each normally distributed with mean value μᵢ (the true weight of specimen i) and variance σ².

a. Show that the maximum likelihood estimator of σ² is σ̂² = Σ(Xᵢ − Yᵢ)²/(4n). [Hint: If z̄ = (z₁ + z₂)/2, then Σ(zᵢ − z̄)² = (z₁ − z₂)²/2.]

b. Is the mle σ̂² an unbiased estimator of σ²? Find an unbiased estimator of σ². [Hint: For any rv Z, E(Z²) = V(Z) + [E(Z)]². Apply this to Z = Xᵢ − Yᵢ.]
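A Monte Carlo sketch (hypothetical true weights and σ²) suggests what the hint implies: since E[(Xᵢ − Yᵢ)²] = 2σ², the estimator Σ(Xᵢ − Yᵢ)²/(4n) centers on σ²/2, whereas Σ(Xᵢ − Yᵢ)²/(2n) centers on σ².

```python
import random

random.seed(3)
mus = [10.0, 12.5, 9.8, 11.1]  # hypothetical true specimen weights
sigma2, n, reps = 4.0, 4, 5000

mle_vals, unb_vals = [], []
for _ in range(reps):
    ss = 0.0
    for mu in mus:
        x = random.gauss(mu, sigma2 ** 0.5)  # first weighing
        y = random.gauss(mu, sigma2 ** 0.5)  # second weighing
        ss += (x - y) ** 2
    mle_vals.append(ss / (4 * n))  # the estimator from part (a)
    unb_vals.append(ss / (2 * n))  # doubled version

mean_mle = sum(mle_vals) / reps   # near sigma^2/2 = 2
mean_unb = sum(unb_vals) / reps   # near sigma^2 = 4
print(mean_mle, mean_unb)
```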

  37. When the population distribution is normal, the statistic

median{|X₁ − X̃|, …, |Xₙ − X̃|}/.6745

can be used to estimate σ. This estimator is more resistant to the effects of outliers (observations far from the bulk of the data) than is the sample standard deviation. Compute both the corresponding point estimate and s for the data of Example 6.2.
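The statistic divides the median absolute deviation from the sample median by .6745 (for normal data, that deviation is about .6745σ). Example 6.2's data are not reproduced here, so a hypothetical sample with one outlier illustrates the resistance of the estimator relative to s:

```python
from statistics import median, stdev

def robust_sigma(xs):
    """median{|x_i - median(xs)|} / .6745 -- outlier-resistant sigma estimate."""
    m = median(xs)
    return median(abs(x - m) for x in xs) / 0.6745

# hypothetical data with one outlier (41.3), not Example 6.2's data
sample = [28.0, 26.5, 27.2, 29.1, 26.9, 41.3, 27.8]
print(robust_sigma(sample), stdev(sample))  # robust estimate vs ordinary s
```

The single outlier inflates s substantially while barely moving the median-based estimate.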

  38. When the sample standard deviation S is based on a random sample from a normal population distribution, it can be shown that

E(S) = √(2/(n − 1))·Γ(n/2)·σ/Γ((n − 1)/2)

Use this to obtain an unbiased estimator for σ of the form cS. What is c when n = 20?