MATH5905 Statistical Inference Part two
DEPARTMENT OF STATISTICS
Additional Exercises for MATH5905, Statistical Inference
Part two: Data reduction. Sufficient statistics. Classical estimation
Sufficiency
1. Use the factorization criterion to find a sufficient statistic for the parameter when X1 , X2 , . . . , Xn are independent random variables each with distribution
a) N (µ, 1),
b) N (0, σ2 ),
c) Uniform (θ, θ + 1),
d) Poisson (λ).
Check your answer in (d) using the definition of sufficiency.
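The conditional check in part (d) can also be visualised by simulation. Below is a minimal Python sketch (my own illustration, not part of the exercise; the sample size n = 5, the conditioning value t = 10 and the three values of λ are arbitrary choices). Since S = X1 + ... + Xn is sufficient, the conditional distribution of X1 given S = t is Binomial(t, 1/n), so its mean should stay close to t/n whatever λ is.

```python
# Minimal simulation sketch for 1(d): conditionally on S = sum(X_i) = t,
# X1 ~ Binomial(t, 1/n), so E[X1 | S = t] = t/n does not depend on lambda.
import numpy as np

rng = np.random.default_rng(0)
n, t = 5, 10                      # sample size and the total we condition on

for lam in (1.0, 2.0, 4.0):       # three different parameter values
    X = rng.poisson(lam, size=(200_000, n))
    keep = X[X.sum(axis=1) == t]  # retain only the samples with S = t
    print(f"lambda={lam}: E[X1 | S={t}] approx {keep[:, 0].mean():.3f} "
          f"(theory: t/n = {t / n})")
```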
2. Let X1 , X2 , X3 be a sample of size 3 from the Bernoulli (p) distribution. Consider the 2 statistics S = X1 + X2 + X3 and T = X1 X2 + X3 . Show that S is sufficient for p and T is not.
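For Problem 2 the sample space has only eight points, so the claim can be checked by exact enumeration. The short Python sketch below (an illustration of mine, not part of the assignment) computes the conditional distribution of (X1, X2, X3) given S and given T for two values of p; the table given S should not change with p, while the table given T should.

```python
# Exact enumeration sketch for Problem 2: the conditional distribution of the
# sample given S is free of p, while the one given T still depends on p.
from itertools import product
from collections import defaultdict

def cond_probs(stat, p):
    """Conditional probability of each sample point given the value of `stat`."""
    joint, by_value = {}, defaultdict(float)
    for x in product((0, 1), repeat=3):
        pr = 1.0
        for xi in x:
            pr *= p if xi == 1 else 1 - p
        joint[x] = pr
        by_value[stat(x)] += pr
    return {x: joint[x] / by_value[stat(x)] for x in joint}

S = lambda x: x[0] + x[1] + x[2]
T = lambda x: x[0] * x[1] + x[2]

for p in (0.3, 0.7):
    print("p =", p)
    print("  given S:", {x: round(v, 3) for x, v in cond_probs(S, p).items()})
    print("  given T:", {x: round(v, 3) for x, v in cond_probs(T, p).items()})
```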
3. A random variable X = (X1 , X2 ) has the following distribution (with 1 < θ < 3):
(x1, x2)               | (0,0) | (0,1) | (1,0) | (1,1)
P(X1 = x1, X2 = x2)    |       |       |       |
Check whether X1 + X2 or X1 X2 is sufficient for θ .
4. If X1 , X2 , . . . , Xn are independent Bernoulli (p) random variables, prove that X1 is not sufficient for p.
5. Given that θ is an integer and that X1 and X2 are independent random variables which are Uniformly distributed on the integers 1, 2, . . . , θ, prove that X1 + X2 is not sufficient for θ .
6. Suppose X1 , X2 , . . . , Xn are independent discrete random variables each with probability function f (x; θ), θ unknown. Prove that $(X_1, X_2, \ldots, X_{n-1})$ is not sufficient for θ.
7. Find a minimal sufficient statistic for the parameter when X1 , X2 , . . . , Xn are independent random variables each with distribution
a) Poisson (λ),
b) N (0, σ2 ),
c) Gamma (α), with density $f(x; \alpha) = \frac{e^{-x} x^{\alpha - 1}}{\Gamma(\alpha)}$, $x > 0$. (Here the Gamma function is defined as $\Gamma(\alpha) = \int_0^\infty e^{-x} x^{\alpha - 1}\, dx$ and has the property $\Gamma(\alpha + 1) = \alpha\, \Gamma(\alpha)$; in particular, for a natural number $n$, $\Gamma(n + 1) = n!$ holds. A quick numerical check of these identities is given after this list.)
d) Uniform (0, θ).
e) Uniform (θ, θ + 1).
f) Uniform (θ1 , θ2 ).
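As promised in 7(c), here is a quick numerical check (my own addition) of the two Gamma-function identities quoted there, using only Python's standard library.

```python
# Numerical check of Gamma(a + 1) = a * Gamma(a) and Gamma(n + 1) = n!.
import math

for a in (0.5, 2.3, 7.0):
    print(a, math.gamma(a + 1), a * math.gamma(a))   # the two values agree
for n in (1, 4, 6):
    print(n, math.gamma(n + 1), math.factorial(n))   # Gamma(n + 1) equals n!
```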
8. If X1 , X2 , . . . , Xn are i.i.d. random variables with densities f (x; θ) given below, find a sufficient statistic for θ.
a) $f(x; \theta) = \theta x^{\theta - 1} I_{(0,1)}(x)$, $\theta \in (0, \infty)$.
b) $f(x; \theta) = \frac{x^3 e^{-x/\theta}}{6\theta^4}\, I_{(0,\infty)}(x)$, $\theta \in (0, \infty)$.
9. Show that the minimal sufficient statistic $T_n$ for the parameter σ of the scale-Cauchy family $f(x; \sigma) = \frac{1}{\pi\sigma\left(1 + (x/\sigma)^2\right)}$ has dimension n and is equal to $T_n = (X_{(1)}, X_{(2)}, \ldots, X_{(n)})$, where $X_{(1)} < X_{(2)} < \cdots < X_{(n)}$ are the order statistics.
10. Let X1 , X2 , . . . , Xn be i.i.d. observations from a scale parameter family $\{F_\sigma(x)\}$, $\sigma > 0$, with $F_\sigma(x) = F(x/\sigma)$, where $F(\cdot)$ is a given continuous cumulative distribution function. Show that any statistic that depends on the sample only through the $n - 1$ ratios $X_1/X_n, X_2/X_n, \ldots, X_{n-1}/X_n$ is an ancillary statistic.
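The ancillarity in Problem 10 can be illustrated by simulation. In the sketch below (my own, with F taken to be the standard exponential c.d.f., so that $F_\sigma$ is the exponential distribution with scale σ) the summaries of X1/Xn stay essentially the same as σ varies.

```python
# Simulation sketch for Problem 10: under a scale family the ratio X1/Xn has
# a distribution that does not depend on the scale parameter sigma.
import numpy as np

rng = np.random.default_rng(1)
n = 4
for sigma in (0.5, 1.0, 10.0):
    X = rng.exponential(scale=sigma, size=(100_000, n))
    R = X[:, 0] / X[:, -1]                 # the ratio X1 / Xn
    print(f"sigma={sigma}: median of X1/Xn = {np.median(R):.3f}, "
          f"P(X1/Xn <= 1) = {(R <= 1).mean():.3f}")
```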
Cramer-Rao Bound. UMVUE
11. Calculate the Cramer-Rao lower bound for the variance of an unbiased estimator of θ and find a statistic with variance equal to the bound when X1 , X2 , . . . , Xn are independent random variables each with distribution
a) $f(x; \theta) = \frac{1}{\theta}\, e^{-x/\theta}$, $x > 0$,
b) Bernoulli (θ),
c) N (θ, 1),
d) N (0, θ).
e) Prove that no unbiased estimator of θ has variance equal to the bound when the distribution is N (0, θ2 ).
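For part (a) the bound can also be checked empirically. In the Python sketch below (my own illustration, taking the sample mean as the candidate unbiased estimator of θ in the exponential model with mean θ) the Monte Carlo variance of $\bar{X}$ is compared with the Cramer-Rao bound $\theta^2/n$.

```python
# Monte Carlo sketch for 11(a): the variance of the sample mean attains the
# Cramer-Rao lower bound theta^2 / n in the Exponential(mean theta) model.
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 3.0, 25, 200_000
xbar = rng.exponential(scale=theta, size=(reps, n)).mean(axis=1)
print("Var(Xbar)          =", xbar.var())
print("Cramer-Rao bound   =", theta**2 / n)
```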
12. If X1 , X2 , . . . , Xn are independent Poisson (λ) random variables, find the umvue of $e^{-2\lambda}$. Check that the estimator has mean $e^{-2\lambda}$ and compare the variance of the estimator with the Cramer-Rao lower bound for the variance of an unbiased estimator of $e^{-2\lambda}$.
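A quick way to sanity-check the estimator quoted in the answers below, namely $(1 - 2/n)^T$ with $T = \sum_{i=1}^n X_i$, is by simulation. The sketch (mine, with arbitrary λ and n) also compares its variance with the Cramer-Rao bound $4\lambda e^{-4\lambda}/n$.

```python
# Simulation sketch for Problem 12: (1 - 2/n)^T is unbiased for exp(-2*lambda),
# and its variance exceeds the Cramer-Rao bound 4*lambda*exp(-4*lambda)/n.
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 1.5, 20, 500_000
T = rng.poisson(lam, size=(reps, n)).sum(axis=1)
est = (1 - 2 / n) ** T
print("mean of estimator     =", est.mean(), " target:", np.exp(-2 * lam))
print("variance of estimator =", est.var())
print("Cramer-Rao bound      =", 4 * lam * np.exp(-4 * lam) / n)
```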
13. Suppose random variables X and Y have joint density
fX,Y (x, y) = 8xy, 0 < y < x, 0 < x < 1.
For this pair of random variables, verify directly the lemma which states that if a(x) = E(Y |X = x), then Ea(X) = E(Y) and Var{a(X)} ≤ Var(Y).
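Problem 13 can also be double-checked numerically without doing the integrals by hand. The sketch below (my own; it discretises the triangle 0 < y < x < 1 on a fine grid, with m = 1000 an arbitrary choice) approximates a(x) = E(Y | X = x) cell by cell and then compares the two sides of each identity. Note that E a(X) = E(Y) holds exactly under this discretisation (it is just the tower property written as a double sum), so the substantive numerical check is the variance inequality.

```python
# Numerical-integration sketch for Problem 13 on the triangle 0 < y < x < 1,
# where f(x, y) = 8xy: compute a(x) = E(Y | X = x), then E a(X), Var a(X),
# E Y and Var Y from the same grid.
import numpy as np

m = 1000
x = (np.arange(m) + 0.5) / m                 # midpoints of the x-cells
y = (np.arange(m) + 0.5) / m                 # midpoints of the y-cells
X, Y = np.meshgrid(x, y, indexing="ij")
f = np.where(Y < X, 8 * X * Y, 0.0) / m**2   # approximate cell probabilities

fx = f.sum(axis=1)                           # marginal probabilities of X-cells
num = (f * Y).sum(axis=1)
a = np.divide(num, fx, out=np.zeros_like(fx), where=fx > 0)  # E(Y | X = x)

EY, EY2 = (f * Y).sum(), (f * Y**2).sum()
Ea = (fx * a).sum()
Vara = (fx * a**2).sum() - Ea**2
print("E a(X)   =", Ea,   "  E Y   =", EY)
print("Var a(X) =", Vara, "  Var Y =", EY2 - EY**2)
```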
14. Find the umvue of $\theta^2$ when X1 , X2 , . . . , Xn are independent Bernoulli (θ) random variables. Check that your estimator does have mean $\theta^2$.
15. Find the umvue of $\theta^2$ when X1 , X2 , . . . , Xn are independent random variables each with density
$f(x; \theta) = \frac{1}{\theta}\, e^{-x/\theta}$, $x > 0$; $\theta > 0$.
Hint: consider $T^2$, where $T = \sum_{i=1}^n X_i$.
16. Suppose X1 , X2 , . . . , Xn are independent Uniform (0, θ) random variables.
a) Find the umvue of $\theta^2$ and calculate its variance.
b) Find the umvue of .
17. Suppose X1 , X2 , . . . , Xn are independent random variables, each with density
$f(x; \theta) = \theta e^{-\theta x}$, $x > 0$, $\theta > 0$. Let $T = \sum_{i=1}^n X_i$.
a) Prove (*) that the density of T is given by $f(t; \theta) = \frac{\theta^n t^{n-1} e^{-\theta t}}{(n-1)!}$, $t > 0$.
b) Prove that the indicator function of the event $\{X_1 > k\}$ is an unbiased estimator of $e^{-k\theta}$, where k is a known constant.
c) If $T = \sum_{i=1}^n X_i$, take for granted (or try to prove using the result (*) from a)) that the conditional density of X1 given T = t is
$f(x_1 \mid t) = \frac{n-1}{t}\left(1 - \frac{x_1}{t}\right)^{n-2}$, $0 < x_1 < t < \infty$.
Then find the umvue of $e^{-k\theta}$.
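The answer to 17(c) quoted in the answers below, $(1 - k/T)^{n-1} I_{(k,\infty)}(T)$, can be checked by simulation. The sketch (my own illustration, with arbitrary θ, n and k) compares the crude unbiased estimator $I(X_1 > k)$ of part (b) with its Rao-Blackwellised version; both should have mean $e^{-k\theta}$, with the conditioned version having the smaller variance.

```python
# Simulation sketch for Problem 17: Rao-Blackwellising I(X1 > k) on T = sum(Xi)
# gives (1 - k/T)^{n-1} I(T > k); both are unbiased for exp(-k*theta).
import numpy as np

rng = np.random.default_rng(4)
theta, n, k, reps = 2.0, 10, 0.5, 400_000
X = rng.exponential(scale=1 / theta, size=(reps, n))   # density theta*exp(-theta*x)
T = X.sum(axis=1)

crude = (X[:, 0] > k).astype(float)
rb = np.where(T > k, (1 - k / T) ** (n - 1), 0.0)

print("target exp(-k*theta) =", np.exp(-k * theta))
print("crude:         mean", crude.mean(), " var", crude.var())
print("Rao-Blackwell: mean", rb.mean(),    " var", rb.var())
```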
18. The random variable X takes values 0, 1, 2, 3 with probabilities

P(X = 0) | P(X = 1)          | P(X = 2)   | P(X = 3)
$2\theta^2$ | $\theta - 2\theta^3$ | $\theta^2$ | $1 + 2\theta^3 - 3\theta^2 - \theta$

Is X a complete statistic for θ?
19. Is the following statistic complete:
a) $T = \sum_{i=1}^n X_i$ when the random variables X = (X1 , X2 , . . . , Xn ) are i.i.d. N (0, θ).
b) $T = \sum_{i=1}^n X_i$ when the random variables X = (X1 , X2 , . . . , Xn ) are i.i.d. Bernoulli (θ).
c) $T = \sum_{i=1}^n X_i$ when the random variables X = (X1 , X2 , . . . , Xn ) are i.i.d. Poisson (θ).
d) $T = X_{(n)}$ when the random variables X = (X1 , X2 , . . . , Xn ) are i.i.d. Uniform (0, θ).
Answers:
11. a) bound: $\frac{\theta^2}{n}$; UMVUE: $\bar{X}$;
b) bound: $\frac{\theta(1-\theta)}{n}$; UMVUE: $\bar{X}$;
c) bound: $\frac{1}{n}$; UMVUE: $\bar{X}$;
d) bound: $\frac{2\theta^2}{n}$; UMVUE: $\frac{1}{n}\sum_{i=1}^n X_i^2$;
e) bound: $\frac{\theta^2}{2n}$; the score is $-\frac{n}{\theta} + \frac{1}{\theta^3}\sum_{i=1}^n X_i^2$ and cannot be written as $K(\theta, n)(T - \theta)$, so the bound is not attainable.
12. UMVUE: $\left(1 - \frac{2}{n}\right)^{\sum_{i=1}^n X_i}$; CR bound: $\frac{4\lambda e^{-4\lambda}}{n}$, which is smaller than the variance of the UMVUE.
14. $\frac{T(T-1)}{n(n-1)}$, $T = \sum_{i=1}^n X_i$.
15. $\frac{T^2}{n(n+1)}$, $T = \sum_{i=1}^n X_i$.
16. a) $\frac{n+2}{n}\, X_{(n)}^2$, $T = X_{(n)}$.
17. UMVUE: $\left(1 - \frac{k}{T}\right)^{n-1} I_{(k,\infty)}(T)$, $T = \sum_{i=1}^n X_i$.
18. Not complete.
19. a) Not complete; b) Complete; c) Complete; d) Complete.
MLEs and their properties. Asymptotic properties of estimators
20. A sample of size n1 is to be drawn from a normal population with mean µ1 and variance $\sigma_1^2$. A second sample of size n2 is to be drawn from a normal population with mean µ2 and variance $\sigma_2^2$. What is the MLE of θ = µ1 − µ2? If we assume that the total sample size n = n1 + n2 is fixed, how should n1 and n2 be chosen so as to minimize the variance of the MLE?
21. Let X1 , X2 , . . . , Xn be a sample from the density $f(x; \theta) = \theta x^{-2} I_{[\theta, \infty)}(x)$ where θ > 0.
a) Find the MLE of θ .
b) Is Y = X(1) a sufficient statistic?
22. Let X1 , X2 , . . . , Xn be a sample from the density function $f(x; \theta) = \theta x^{\theta - 1} I_{(0,1)}(x)$ where θ > 0.
a) Find the MLE of $\tau(\theta) = \frac{1}{\theta}$.
b) State the asymptotic distribution of the MLE of τ (θ) in a).
c) Find a sufficient statistic, and check completeness. Is $\sum_{i=1}^n X_i$ a sufficient statistic?
d) Is there a function of θ for which there exists an unbiased estimator whose variance coincides with the Cramer-Rao lower Bound? What is the Cramer-Rao lower bound?
23. Let X1 , X2 , . . . , Xn be a sample from normal distribution N (µ, σ2 ) where µ is known and σ 2 is the parameter to be estimated.
a) Find the MLE and state its asymptotic distribution.
b) Assume now that σ is to be estimated. Find the MLE and state its asymptotic distribution.
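The limiting distribution asked for in Problem 23 can be eyeballed by simulation. The sketch below (my own, with µ = 0, σ = 2, n = 200 as arbitrary choices) checks for part a) that the variance of $\sqrt{n}(\hat\sigma^2 - \sigma^2)$ is close to $2\sigma^4$.

```python
# Simulation sketch for 23(a): sqrt(n) * (MLE of sigma^2 - sigma^2) should have
# variance close to 2 * sigma^4 when n is large.
import numpy as np

rng = np.random.default_rng(6)
mu, sigma, n, reps = 0.0, 2.0, 200, 50_000
X = rng.normal(mu, sigma, size=(reps, n))
mle = ((X - mu) ** 2).mean(axis=1)           # (1/n) * sum (X_i - mu)^2
print("Var of sqrt(n)*(mle - sigma^2) =", (np.sqrt(n) * (mle - sigma**2)).var())
print("theoretical limit 2*sigma^4    =", 2 * sigma**4)
```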
24. Consider n i.i.d. observations from a Poisson (λ) distribution.
a) Suppose the parameter of interest is $\tau(\lambda) = \frac{1}{\lambda}$.
i) What is the MLE of τ (λ)?
ii) What is its variance?
iii) What is its asymptotic variance?
b) Assume that the parameter of interest is $\tau(\lambda) = \sqrt{\lambda}$.
i) State the asymptotic distribution of the MLE of $\sqrt{\lambda}$. In particular, show that the asymptotic variance does not depend on λ (we say in that case that $\sqrt{\lambda}$ is a variance stabilizing transformation).
ii) For a given small value of α ∈ (0, 1) and using the result in i), how would you construct a confidence interval for λ that asymptotically has level 1 − α?
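A small simulation (my own sketch; n = 50 and the three λ values are arbitrary) illustrates both parts of 24(b): the variance of $\sqrt{\bar{X}}$ scaled by n stays near 1/4 for every λ, and squaring the endpoints $\sqrt{\bar{X}} \pm z_{\alpha/2}/(2\sqrt{n})$ gives an interval for λ with roughly the nominal coverage.

```python
# Simulation sketch for 24(b): sqrt(Xbar) is variance stabilising for the
# Poisson mean, and the squared interval for lambda has coverage close to 95%.
import numpy as np

rng = np.random.default_rng(5)
n, reps, z = 50, 200_000, 1.96                 # z is the 0.975 normal quantile
for lam in (0.5, 2.0, 10.0):
    xbar = rng.poisson(lam, size=(reps, n)).mean(axis=1)
    root = np.sqrt(xbar)
    lo = np.clip(root - z / (2 * np.sqrt(n)), 0, None) ** 2
    hi = (root + z / (2 * np.sqrt(n))) ** 2
    cover = ((lo <= lam) & (lam <= hi)).mean()
    print(f"lambda={lam}: n*Var(sqrt(Xbar)) = {n * root.var():.3f} "
          f"(theory 0.25), coverage = {cover:.3f}")
```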
20. MLE: $\bar{X}_{n_1} - \bar{X}_{n_2}$; $n_1 = \frac{n\,\sigma_1}{\sigma_1 + \sigma_2}$.
21. θˆmle = X(1) , sufficient.
22. a) $-\frac{1}{n}\sum_{i=1}^n \log X_i$; b) $N\!\left(0, \frac{1}{\theta^2}\right)$;
c) $\sum_{i=1}^n \log X_i$ is sufficient and complete; $\sum_{i=1}^n X_i$ is not sufficient;
d) $\tau(\theta) = \frac{1}{\theta}$ is such a function. The attainable bound in this case is $\frac{1}{n\theta^2}$.
23. a) MLE of $\sigma^2$ is $\frac{1}{n}\sum_{i=1}^n (X_i - \mu)^2$. The asymptotic distribution: $N(0, 2\sigma^4)$.
b) MLE is $\sqrt{\frac{1}{n}\sum_{i=1}^n (X_i - \mu)^2}$. The asymptotic distribution: $N\!\left(0, \frac{\sigma^2}{2}\right)$.
24. a) Variance of MLE is infinite but asymptotic variance of MLE is finite and equals $\frac{1}{n\lambda^3}$.
b) Asymptotic variance is 0.25.