Martingales


In this chapter we will introduce a class of processes that can be thought of as the fortune of a gambler betting on a fair game. These results will be important when we consider applications to finance in the next chapter. In addition, they will allow us to give more transparent proofs of some facts from Chap. 1 concerning exit distributions and exit times for Markov chains.

5.1 Conditional Expectation

Our study of martingales will rely heavily on the notion of conditional expectation and involve some formulas that may not be familiar, so we will review them here. We begin with several definitions. Given an event $A$ we define its indicator function

$$1_A = \begin{cases} 1 & x \in A \\ 0 & x \in A^c \end{cases}$$

In words, $1_A$ is "1 on $A$" (and 0 otherwise). Given a random variable $Y$, we define the integral of $Y$ over $A$ to be

$$E(Y; A) = E(Y 1_A)$$

Note that multiplying $Y$ by $1_A$ sets the product $= 0$ on $A^c$ and leaves the values on $A$ unchanged. Finally, we define the conditional expectation of $Y$ given $A$ to be

$$E(Y \mid A) = E(Y; A)/P(A)$$
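The definitions above can be made concrete with a small numerical sketch. The setup here (a fair die, with $A$ the event that the face is even) is a hypothetical example, not one from the text; exact rationals are used so the arithmetic is transparent.

```python
from fractions import Fraction

# Hypothetical example: Y is the face shown by a fair die,
# A is the event "the face is even".
omega = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)                 # probability of each outcome

def ind_A(x):                      # the indicator 1_A: 1 on A, 0 on A^c
    return 1 if x % 2 == 0 else 0

# E(Y; A) = E(Y 1_A): multiplying by 1_A zeroes out Y off A
E_Y_on_A = sum(x * ind_A(x) * p for x in omega)
P_A = sum(ind_A(x) * p for x in omega)
E_Y_given_A = E_Y_on_A / P_A       # E(Y|A) = E(Y; A) / P(A)

print(E_Y_on_A, P_A, E_Y_given_A)  # 2 1/2 4
```

Here $E(Y; A) = (2+4+6)/6 = 2$ and $P(A) = 1/2$, so $E(Y \mid A) = 4$, the average of the even faces, as one would expect.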

© Springer International Publishing Switzerland 2016 R. Durrett, Essentials of Stochastic Processes, Springer Texts in Statistics, DOI 10.1007/978-3-319-45614-0_5


This is the expected value for the conditional probability defined by

$$P(\cdot \mid A) = P(\cdot \cap A)/P(A)$$

Example 5.1. A simple but important special case arises when the random variable $Y$ and the set $A$ are independent, i.e., for any set $B$ we have

$$P(Y \in B, A) = P(Y \in B)P(A)$$

Noticing that this implies $P(Y \in B, A^c) = P(Y \in B)P(A^c)$ and comparing with the definition of independence of random variables in (A.13), we see that this holds if and only if $Y$ and $1_A$ are independent, so Theorem A.1 implies $E(Y; A) = E(Y 1_A) = EY \cdot E1_A$, and we have

$$E(Y \mid A) = EY \qquad (5.1)$$
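Formula (5.1) can be checked numerically in a hypothetical setup where independence holds by construction: two independent fair dice, with $Y$ the first face and $A$ an event depending only on the second.

```python
from fractions import Fraction

# Hypothetical check of (5.1): two independent fair dice; Y is the first
# face and A = {second face <= 2}, so Y and 1_A are independent.
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p = Fraction(1, 36)

ind_A = lambda w: 1 if w[1] <= 2 else 0

EY = sum(w[0] * p for w in omega)                   # unconditional mean
E_Y_on_A = sum(w[0] * ind_A(w) * p for w in omega)  # E(Y; A)
P_A = sum(ind_A(w) * p for w in omega)

print(EY, E_Y_on_A / P_A)                           # 7/2 7/2
```

Conditioning on $A$ does not change the expectation: both values are $7/2$, as (5.1) predicts.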

It is easy to see from the definition that the integral over $A$ is linear:

$$E(Y + Z; A) = E(Y; A) + E(Z; A) \qquad (5.2)$$

so dividing by $P(A)$, conditional expectation also has this property:

$$E(Y + Z \mid A) = E(Y \mid A) + E(Z \mid A) \qquad (5.3)$$
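Both linearity properties can be verified on a small hypothetical example (a fair die, with $Y(w) = w$, $Z(w) = w^2$, and $A = \{1, 2\}$):

```python
from fractions import Fraction

# Hypothetical check of (5.2) and (5.3) on a fair die with A = {1, 2}.
omega = range(1, 7)
p = Fraction(1, 6)
A = {1, 2}
Y = {w: w for w in omega}          # Y(w) = w
Z = {w: w * w for w in omega}      # Z(w) = w^2

def E_on(V, S):                    # E(V; S) = E(V 1_S)
    return sum(V[w] * p for w in S)

P_A = Fraction(len(A), 6)
lhs = E_on({w: Y[w] + Z[w] for w in omega}, A)   # E(Y + Z; A)
rhs = E_on(Y, A) + E_on(Z, A)                    # E(Y; A) + E(Z; A)

print(lhs == rhs)                                        # (5.2): True
print(lhs / P_A == E_on(Y, A) / P_A + E_on(Z, A) / P_A)  # (5.3): True
```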

Here and in later formulas and theorems, we always assume that all of the indicated expected values exist. In addition, as in ordinary integration, one can take "constants" outside of the integral. This property is very important for computations.

Lemma 5.1. If $X$ is a constant $c$ on $A$, then $E(XY \mid A) = cE(Y \mid A)$.

Proof. Since $X = c$ on $A$, $XY1_A = cY1_A$. Taking expected values and pulling the constant out front, $E(XY1_A) = E(cY1_A) = cE(Y1_A)$. Dividing by $P(A)$ now gives the result.

Being an expected value, $E(\cdot \mid A)$ has all of the usual properties; in particular:

Lemma 5.2 (Jensen's Inequality). If $\phi$ is convex, then

$$E(\phi(X) \mid A) \ge \phi(E(X \mid A))$$

Our next two properties concern the behavior of $E(Y; A)$ and $E(Y \mid A)$ as a function of the set $A$.
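Lemmas 5.1 and 5.2 can both be checked numerically. The example below is hypothetical: on a fair die, $X$ equals the constant 5 on $A = \{1, 2, 3\}$ (its values off $A$ are irrelevant), and Jensen is tested with the convex function $\phi(x) = x^2$.

```python
from fractions import Fraction

# Hypothetical checks of Lemma 5.1 and Jensen's inequality on a fair die.
omega = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)
A = {1, 2, 3}
P_A = Fraction(len(A), 6)
X = {1: 5, 2: 5, 3: 5, 4: -1, 5: 0, 6: 2}   # X = 5 on A; arbitrary off A
Y = {w: w * w for w in omega}

def E_given_A(V):                  # E(V|A) = E(V 1_A) / P(A)
    return sum(V[w] * p for w in A) / P_A

# Lemma 5.1: E(XY|A) = c E(Y|A) with c = 5
lemma51 = E_given_A({w: X[w] * Y[w] for w in omega}) == 5 * E_given_A(Y)

# Jensen: E(phi(W)|A) >= phi(E(W|A)) for phi(x) = x^2 and W(w) = w
W = {w: w for w in omega}
jensen = E_given_A({w: W[w] ** 2 for w in omega}) >= E_given_A(W) ** 2

print(lemma51, jensen)             # True True
```

For Jensen, $E(W^2 \mid A) = 14/3$ while $(E(W \mid A))^2 = 4$, so the inequality is strict here.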


Lemma 5.3. If $B$ is the disjoint union of $A_1, \dots, A_k$, then

$$E(Y; B) = \sum_{j=1}^{k} E(Y; A_j)$$

Proof. Our assumption implies $Y1_B = \sum_{j=1}^{k} Y1_{A_j}$, so taking expected values, we have

$$E(Y; B) = E(Y1_B) = E\left( \sum_{j=1}^{k} Y1_{A_j} \right) = \sum_{j=1}^{k} E(Y1_{A_j}) = \sum_{j=1}^{k} E(Y; A_j)$$
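The additivity in Lemma 5.3 can be illustrated with a hypothetical partition: on a fair die, $B = \{2, 4, 6\}$ is the disjoint union of $A_1 = \{2\}$ and $A_2 = \{4, 6\}$.

```python
from fractions import Fraction

# Hypothetical check of Lemma 5.3 on a fair die: B = {2, 4, 6} is the
# disjoint union of A1 = {2} and A2 = {4, 6}; Y is the face shown.
p = Fraction(1, 6)

def E_Y_on(S):                     # E(Y; S) = E(Y 1_S)
    return sum(x * p for x in S)

B, A1, A2 = {2, 4, 6}, {2}, {4, 6}
print(E_Y_on(B), E_Y_on(A1) + E_Y_on(A2))   # 2 2
```

Both sides equal 2: the integral over $B$ splits into $1/3$ from $A_1$ plus $5/3$ from $A_2$.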

Lemma 5.4. If $B$ is the disjoint union of $A_1, \dots, A_k$, th