
ST202: Assignment 1

Instructions: This is the first assignment sheet for ST202. Due to changes in how I delivered the material in the first week, it is being released later than it should have been. In recognition of this I have made only the first part (as indicated) testable in the quiz. The rest can only be done with the knowledge from the lectures on Monday and Tuesday of week 2, so I will NOT test your answers to those in the quiz.

How it works:

1. Do the questions in the "For the Quiz" section.

2. Open Quiz 1 (it will be available from Monday afternoon) and fill in the questions there about your answers.

3. Then do the other questions in the "Not for the Quiz" section. They are important for your understanding but are not assessed in the quizzes.

This looks like a big sheet!

For the Quiz:

1. Identify which of the following are stochastic matrices for a Markov chain.

(a)  ( 1/3    1/3   1/6   1/6 )

(b)  ( −1/2    0    1/2    1  )

(c)  ( 1/3    1/3    0    1/3 )

(d)  ( 0   1
       1   0 )

2. Suppose that (Xn)n≥0 is a Markov chain on a state space I = {1, 2, 3, 4}, initial distribution λ = (1/2, 0, 0, 1/2) and stochastic matrix

( 1/3   1/3    0    1/3 )

(a) Draw a diagram for the Markov chain.

(b) Calculate P1(X1 = 4).

(c) Calculate P1(X2 = 4).

(d) Calculate Pλ(X2 = 4) = P(X2 = 4 | X0 ∼ λ).
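Calculations like parts (b)–(d) are just matrix arithmetic, so they can be checked numerically. A minimal sketch with numpy, using a made-up 4×4 stochastic matrix (NOT the P of the question) together with the question's λ:

```python
import numpy as np

# Made-up 4x4 stochastic matrix for illustration (NOT the P in the question).
P = np.array([[0.0, 0.5, 0.5, 0.0],
              [0.5, 0.0, 0.0, 0.5],
              [0.5, 0.0, 0.0, 0.5],
              [0.0, 0.5, 0.5, 0.0]])
lam = np.array([0.5, 0.0, 0.0, 0.5])  # the initial distribution from the question

P2 = np.linalg.matrix_power(P, 2)     # two-step transition probabilities

prob_1_step = P[0, 3]                 # P_1(X1 = 4): row 1, column 4 of P
prob_2_step = P2[0, 3]                # P_1(X2 = 4): row 1, column 4 of P^2
prob_lam = (lam @ P2)[3]              # P_lambda(X2 = 4): entry 4 of lam P^2
```

The pattern is the same for the sheet's own matrix: conditioning on X0 = 1 means reading off row 1, while starting from λ means left-multiplying by λ.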

3. Suppose that (Xn)n≥0 is a Markov chain on a state space I = {1, 2, 3, 4, 5}, initial distribution λ = (1/3, 0, 0, 1/6, 1/2) and stochastic matrix

P = ( 1/4   1/4   1/4    0    1/4
      1/3   1/6    0    1/6   1/3
       0    1/4   1/4   1/4   1/4 )

(a) Calculate Pλ(X1 = 5).

(b) Calculate P1(X1 = 5, X2 = 1).

(c) Calculate P1(X10 = 5 | X9 = 2).

(d) Calculate P1(X10 = 5 | X9 = 2, X4 = 3).

(e) Calculate P1(X10 = 5, X11 = 1 | X9 = 2, X8 = 3).

(f) HARDER: Calculate P1(X8 = 3, X10 = 5, X11 = 1 | X9 = 2, X7 = 1).
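The point of parts (c) and (d) is the Markov property: given the most recent state, anything earlier is irrelevant. This can be checked by simulation. A sketch with numpy, using an assumed small 3-state chain (NOT the 5-state P above):

```python
import numpy as np

# Assumed 3-state chain for illustration (NOT the 5-state P in the question).
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.3, 0.3, 0.4]])
rng = np.random.default_rng(0)

n_paths, n_steps = 200_000, 11
paths = np.zeros((n_paths, n_steps), dtype=int)   # every path starts in state 0
for t in range(1, n_steps):
    u = rng.random(n_paths)
    cdf = P[paths[:, t - 1]].cumsum(axis=1)       # row-wise transition CDFs
    nxt = (u[:, None] > cdf).sum(axis=1)          # inverse-CDF sampling
    paths[:, t] = np.minimum(nxt, P.shape[0] - 1) # guard against rounding

# Markov property: conditioning on X4 as well as X9 should change nothing.
on_9 = paths[:, 9] == 1
on_9_and_4 = on_9 & (paths[:, 4] == 0)
p_cond_9 = (paths[on_9, 10] == 2).mean()          # est. P(X10=2 | X9=1)
p_cond_9_4 = (paths[on_9_and_4, 10] == 2).mean()  # est. P(X10=2 | X9=1, X4=0)
# both estimates should be close to P[1, 2] = 0.5
```

Both conditional frequencies land (up to Monte Carlo noise) on the same one-step transition probability, which is exactly what the Markov property asserts.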

4. Doing some statistics!! Suppose that (Xn)n≥0 is a Markov chain on a state space I = {1, 2}, initial distribution λ = (1, 0) and stochastic matrix P, where α is an unknown parameter. A realisation of the Markov chain for the first 20 steps is observed:

(Xn)n∈{0,1,2,...,20} = 111211122111112121112

(a) Write out the probability of this trajectory as a function of the unknown α (known as the likelihood, for those doing ST219!!).

(b) Take the logarithm of your answer and differentiate with respect to α to find the value of α that maximises the likelihood. This gives an estimate of the unknown parameter after seeing the data (the maximum likelihood estimate).
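As a cross-check on the counting, here is a sketch that tallies the transitions in the observed trajectory and computes the MLE, assuming for illustration the symmetric form P = ((1−α, α), (α, 1−α)) — the question's actual P may well differ, in which case the likelihood and the estimator change accordingly:

```python
# Illustrative cross-check, assuming for the sake of example the symmetric
# form P = ((1-a, a), (a, 1-a)) -- the question's actual P may differ.
traj = "111211122111112121112"

# Count each kind of transition along the observed trajectory.
counts = {"11": 0, "12": 0, "21": 0, "22": 0}
for x, y in zip(traj, traj[1:]):
    counts[x + y] += 1

# Under the symmetric form the likelihood is a^k * (1-a)^(20-k), where k is
# the number of state-changing transitions, so the MLE is a_hat = k / 20.
k = counts["12"] + counts["21"]
a_hat = k / len(traj[1:])
```

Whatever form P takes, the likelihood is always a product of one-step transition probabilities, so transition counts are the only thing the data contributes.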

5. Prove that 1 is always an eigenvalue of any stochastic matrix.
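Before writing the proof, it can help to see the claim hold numerically. A quick sanity check (not a proof!) with numpy, using an arbitrary example matrix:

```python
import numpy as np

# Numerical sanity check (not a proof): rows of a stochastic matrix sum to 1,
# so the all-ones vector is a right eigenvector with eigenvalue 1.
P = np.array([[0.10, 0.60, 0.30],
              [0.40, 0.40, 0.20],
              [0.25, 0.25, 0.50]])
assert np.allclose(P.sum(axis=1), 1.0)          # P is row-stochastic
assert np.allclose(P @ np.ones(3), np.ones(3))  # P applied to the ones vector

eigvals = np.linalg.eigvals(P)
has_eigenvalue_one = bool(np.any(np.isclose(eigvals, 1.0)))
```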

Not for the Quiz:

1. Suppose that (Xn)n≥0 is a Markov chain on a state space I = {1, 2} with stochastic matrix P.

(a) Calculate the two eigenvalues λ1 and λ2.

(b) By looking at the lecture notes, convince yourself that by diagonalising P every entry of P^n can be written as

(P^n)i,j = a(i, j) λ1^n + b(i, j) λ2^n

where a(i, j), b(i, j) are constants to be found.

(c) You know P and P^0 = I2 (where I2 denotes the 2 × 2 identity matrix). Use this to find a(i, j), b(i, j) so that you have an explicit form for P^n.

(d) Take the limit as n → ∞ of your answer in (c). If correct, the rows should be identical in the limit! Call the limiting row π.

(e) Suppose that the initialisation distribution is π .  Calculate πP . What do you notice?

(f) Now take any general initialisation distribution, call it λ = (γ, 1 − γ). Compute

lim_{n→∞} λP^n

What do you notice?   What does this tell you about how the chain’s location depends on the initialisation distribution?  Thoughts for later in the course: Will this always be true for ANY chain or is there something special here?

(g) Show that

max_i |(λP^n)_i − π_i| ≤ C D^n

for constants C, D that you have to find and where |D| < 1. (This shows that the Markov chain converges to π geometrically quickly.)
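The whole of (a)–(g) can be rehearsed numerically before doing the algebra. A sketch with numpy, using an assumed 2×2 stochastic matrix (the sheet's P is not reproduced here, so the numbers below are illustrative only):

```python
import numpy as np

# Assumed 2x2 stochastic matrix for illustration (not the sheet's P);
# for this choice the eigenvalues are 1 and 0.5.
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
eigvals = np.sort(np.linalg.eigvals(P).real)

pi = np.array([0.6, 0.4])            # stationary row for this P: pi P = pi
assert np.allclose(pi @ P, pi)

lam = np.array([0.9, 0.1])           # an arbitrary initial distribution
gaps = {}
for n in (1, 5, 10, 20):
    dist = lam @ np.linalg.matrix_power(P, n)
    gaps[n] = np.abs(dist - pi).max()
# gaps[n] decays like C * 0.5**n: geometric convergence to pi,
# driven by the second eigenvalue
```

Note how the rate in part (g) is governed by the second-largest eigenvalue in absolute value, which is exactly what the diagonalisation in (b) predicts.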

2. A triathlon consists of 3 disciplines:  swimming, cycling and running.  A keen but rubbish triathlete called Dr Tick Nawn does a training session every day. However, he doesn’t want to pay for professional coaching advice so instead his strategy for training is as follows:

If he has swum OR run on the nth day then he chooses the next day's training discipline uniformly from the two disciplines he did not do that day. However, if he has just cycled then he chooses the next activity uniformly from all 3 activities.

Firstly, convince yourself that the diagram representing the Markov chain can be drawn as in Figure 1 (and also write out the associated stochastic matrix).

(a) Now, supposing that on day zero the training session undertaken is swimming, what is the probability that the first day's session is swimming?

(b) Now, supposing that on day zero the training session undertaken is swimming, one can represent the probability that the nth day's session is swimming as

A + B(−1/6)^n + C(D)^n.

Identify the values of A and B to 2 d.p.

(c) Suppose that Dr Nawn has a partner who really wants to know what he will be doing on a given day massively into the future, because they are less impulsive than him and like planning! As such, Dr Nawn decides that he will appease his partner by telling them the probability that he will be doing a certain training discipline on days long into the future. By taking the limit as n → ∞ of the previous answer, what is the probability that he is swimming on a given day well into the future, in terms of A, B, C and D above?

Figure 1: MC Diagram
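The training rules pin down the stochastic matrix, so the long-run behaviour can be cross-checked numerically. A sketch with numpy, assuming the states are ordered (swim, cycle, run):

```python
import numpy as np

# Transition matrix from the stated training rules, assuming states are
# ordered (swim, cycle, run):
P = np.array([[0,   1/2, 1/2],   # after swimming: cycle or run, each 1/2
              [1/3, 1/3, 1/3],   # after cycling: any of the three
              [1/2, 1/2, 0]])    # after running: swim or cycle, each 1/2

# Long-run probability of swimming, starting from swimming on day 0:
p_swim_longrun = np.linalg.matrix_power(P, 60)[0, 0]

# The geometric terms in the formula of part (b) come from the
# non-unit eigenvalues of P:
eigvals = np.sort(np.linalg.eigvals(P).real)
```

Taking a high matrix power makes the geometric terms negligible, so `p_swim_longrun` is (to numerical precision) the limit asked for in part (c).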