Stochastic process: Difference between revisions

Article snapshot taken from Wikipedia, licensed under the Creative Commons Attribution-ShareAlike license.

Revision as of 08:59, 7 April 2002

A stochastic process is a random function. This means that, if

f : D -> R

is a random function with domain D and range R, then the image f(x) of each point x of D is a random variable with values in R.

Of course, the mathematical definition of a function includes the case "a function from {1,...,n} to R is a vector in R^n", so multidimensional random variables are a special case of stochastic processes.
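
A random function on a finite domain can be written down directly. The following sketch (Python with NumPy; an illustration, not part of the original article) treats a random function on {1,...,n} as nothing more than a random vector in R^n:

  # A random function on the finite domain {1, ..., n} is just a random
  # vector in R^n: each value f(i) is a real-valued random variable.
  import numpy as np

  rng = np.random.default_rng(0)
  n = 5
  f = rng.standard_normal(n)          # one realisation of the random function
  for i, value in enumerate(f, start=1):
      print(f"f({i}) = {value:.3f}")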

For our first infinite example, take the domain to be N, the natural numbers, and our range to be R, the real numbers. Then, a function f : N -> R is a sequence of real numbers, and the following questions arise:

  1. How is a random sequence specified?
  2. How do we find the answers to typical questions about sequences, such as
    1. what is the probability distribution of the value of f(i)?
    2. what is the probability that f is bounded?
    3. what is the probability that f is monotonic?
    4. what is the probability that f(i) has a limit as i->infty?
    5. if we construct a series from f(i), what is the probability that the series converges? What is the probability distribution of the sum?
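
These questions can at least be explored numerically. As a sketch (Python with NumPy; the choice of independent standard normal terms is an assumption made only for illustration), one can keep the first N terms of the sequence and estimate some of the probabilities by Monte Carlo:

  # Monte Carlo sketch: model f(1), f(2), ... as independent standard normal
  # random variables, keep the first N terms, and estimate
  #   - the probability that those terms are strictly increasing,
  #   - the mean and variance of the partial sum f(1) + ... + f(N).
  import math
  import numpy as np

  rng = np.random.default_rng(0)
  N, trials = 5, 100_000
  samples = rng.standard_normal((trials, N))     # each row is f(1), ..., f(N)

  p_increasing = np.mean(np.all(np.diff(samples, axis=1) > 0, axis=1))
  partial_sums = samples.sum(axis=1)

  print(f"P(f(1) < ... < f({N})) ~= {p_increasing:.4f}  (exact value 1/{N}! = {1/math.factorial(N):.4f})")
  print(f"partial sum: mean ~= {partial_sums.mean():.3f}, variance ~= {partial_sums.var():.3f}")

For this particular model the infinite-sequence questions have degenerate answers (the full sequence is almost surely unbounded, not monotonic, and has no limit); the sketch only illustrates the mechanics of asking such questions about finitely many terms.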

Another important class of examples is when the domain is not a discrete space such as the natural numbers, but a continuous space such as the unit interval [0,1], the positive real numbers [0,infty), or the entire real line R. In this case, we have a different set of questions that we might want to answer:

  1. How is a random function specified?
  2. How do we find the answers to typical questions about functions, such as
    1. what is the probability distribution of the value of f(x)?
    2. what is the probability that f is bounded/integrable/continuous/differentiable...?
    3. what is the probability that f(x) has a limit as x->infty?
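
On a continuous domain one can only ever simulate the function at finitely many points. The sketch below (Python with NumPy; the model, a random walk with Gaussian increments approximating Brownian motion, is an assumption chosen for illustration) estimates the probability that the path stays below a fixed level on [0,1] using a grid approximation:

  # Approximate a random function on [0, 1] by a random walk with Gaussian
  # increments on a fine grid (a discretisation of Brownian motion), and
  # estimate the probability that the path stays below a fixed level.
  import numpy as np

  rng = np.random.default_rng(0)
  steps, trials, level = 1_000, 20_000, 1.0
  dt = 1.0 / steps

  increments = rng.normal(0.0, np.sqrt(dt), size=(trials, steps))
  paths = np.cumsum(increments, axis=1)          # f(t) sampled on the grid

  p_below = np.mean(paths.max(axis=1) < level)
  print(f"P(max of f on [0,1] < {level}) ~= {p_below:.3f}  (grid approximation)")

Note that the event "f stays below the level everywhere on [0,1]" refers to uncountably many values of f, while the simulation only inspects finitely many of them; this gap is exactly the issue taken up in the section on separability below.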

Constructing stochastic processes: the Kolmogorov extension

In the ordinary axiomatization of probability theory by means of measure theory, the problem is to construct a sigma-algebra of measurable subsets of the space of all functions, and then put a finite measure on it. For this purpose one traditionally uses a method called Kolmogorov extension.

The Kolmogorov extension proceeds along the following lines: if a probability measure on the space of functions f : X -> Y exists, it can be used to specify the joint probability distribution of the finite-dimensional random variables (f(x_1),...,f(x_n)). From this n-dimensional probability distribution we can deduce an (n-1)-dimensional marginal probability distribution for (f(x_1),...,f(x_{n-1})). There is an obvious compatibility condition, namely that this marginal distribution must be the same as the (n-1)-dimensional distribution specified directly by the stochastic process. When this condition is expressed in terms of probability densities, the result is called the Chapman-Kolmogorov equation.
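
For a Markov process this condition takes a particularly concrete form. On a finite state space the Chapman-Kolmogorov equation says that the two-step transition probabilities are obtained from the one-step probabilities by summing over the intermediate state, i.e. the two-step transition matrix is the square of the one-step matrix. A small numerical check (Python with NumPy; the transition matrix is an arbitrary made-up example):

  # Chapman-Kolmogorov for a finite-state Markov chain:
  #   P(X_{t+2} = k | X_t = i) = sum_j P(X_{t+2} = k | X_{t+1} = j) * P(X_{t+1} = j | X_t = i),
  # so the two-step transition matrix is the matrix square of the one-step matrix.
  import numpy as np

  P = np.array([[0.9, 0.1],
                [0.4, 0.6]])          # an arbitrary one-step transition matrix

  two_step = P @ P                    # sum over the intermediate state
  print(two_step)
  print("rows sum to 1:", np.allclose(two_step.sum(axis=1), 1.0))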

Given a family of compatible finite-dimensional probability distributions, the Kolmogorov extension theorem guarantees the existence of a stochastic process with the given finite-dimensional probability distributions.
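
A standard example of such a compatible family is a Gaussian process: one prescribes a mean function and a covariance kernel and declares (f(x_1),...,f(x_n)) to be multivariate normal with the corresponding mean vector and covariance matrix. Because marginals of a multivariate normal distribution are again multivariate normal with the restricted mean and covariance, the compatibility condition holds automatically. A sketch (Python with NumPy; the squared-exponential kernel and the particular points are arbitrary choices made for illustration):

  # Finite-dimensional distributions of a zero-mean Gaussian process with a
  # squared-exponential covariance kernel: sample (f(x_1), ..., f(x_n)) for
  # an arbitrary finite set of points x_1, ..., x_n.
  import numpy as np

  def kernel(x1, x2, length=0.5):
      return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / length) ** 2)

  rng = np.random.default_rng(0)
  x = np.array([0.1, 0.3, 0.7, 0.9])                # any finite set of points
  cov = kernel(x, x) + 1e-10 * np.eye(len(x))       # small jitter for numerical stability
  sample = rng.multivariate_normal(np.zeros(len(x)), cov)
  print(dict(zip(x.tolist(), np.round(sample, 3).tolist())))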

Separability, or what the Kolmogorov extension does not provide

Recall that, in the Kolmogorov axiomatization, measurable sets are the sets which have a probability or, in other words, the sets corresponding to yes/no questions that have a (probabilistic) answer.

The Kolmogorov extension starts by declaring to be measurable all sets of functions where finitely many coordinates (f(x_1),...,f(x_n)) are restricted to lie in measurable subsets of Y^n. In other words, if a (yes/no) question can be answered by looking at the values of at most finitely many coordinates, then it has a probabilistic answer.

In measure theory, if we have a countably infinite collection of measurable sets, then the union and the intersection of all of them are again measurable sets. For our purposes, this means that yes/no questions that depend on countably many coordinates have a probabilistic answer.

This means that the Kolmogorov extension makes it possible to construct stochastic processes with fairly arbitrary finite-dimensional distributions. It also means that every question one could ask about a random sequence has a probabilistic answer, since any such question depends on at most countably many coordinates. However, certain questions about functions on a continuous domain do not have a probabilistic answer, because they depend on uncountably many coordinates.

One might hope that the questions that depend on uncountably many values of a function would be of little interest, but virtually all concepts of calculus are of this sort. For example:

  1. the supremum of a function on an interval
  2. limits of functions
  3. continuity
  4. differentiability

all require knowledge of uncountably many values of the function.
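
A classical illustration of the gap: let U be uniform on [0,1], let X(t) = 0 for all t, and let Y(t) = 1 if t = U and Y(t) = 0 otherwise. For any finite set of points, the values of X and Y agree with probability one, so the two processes have the same finite-dimensional distributions; yet the supremum of X on [0,1] is always 0 while the supremum of Y is always 1. No simulation on a fixed countable grid can detect the difference, as the following sketch (Python with NumPy; not taken from the original article) illustrates:

  # Two processes with identical finite-dimensional distributions but different
  # suprema: X(t) = 0 for all t, and Y(t) = 1 only at the random point t = U.
  # On any fixed finite grid, Y is observed to be identically zero with
  # probability one, exactly like X, even though sup Y = 1 on [0, 1].
  import numpy as np

  rng = np.random.default_rng(0)
  grid = np.linspace(0.0, 1.0, 1001)          # finitely many sample points

  for _ in range(5):
      U = rng.uniform()
      Y_on_grid = (grid == U).astype(float)   # almost surely all zeros
      print("max of Y on the grid:", Y_on_grid.max(), "  true sup of Y on [0,1]: 1.0")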