# A Stochastic Primer

# Introduction

I often stumble across basic stochastic properties which I do not remember exactly anymore, so here it is: a post just for me, where I can search for stochastic stuff. This also means that this post will not be very detailed, but if you had basic probability in school you should be able to follow along.

# Measure Space

First, let us define what a measure space is.

We have the set of samples $\Omega$. You can think of it as a set
containing all possible outcomes.

Then there is the set $\mathcal{A}$, which is a subset of the powerset
of $\Omega$. In fact, we require it to be a $\sigma$-algebra, which is
a fancy way of saying that it consists of all sets to which we can
assign a “volume” in a way that makes sense.

Finally, we have a probability measure $P$, which is a function from
$\mathcal{A}$ to the interval $[0, 1]$, where we require that it
“behaves like a volume function”.
This means that $P(\emptyset) = 0$ and for *countably* many pairwise disjoint
$A_1, A_2, \ldots \in \mathcal{A}$ we require that $P\left(\bigcup_i A_i\right) = \sum_i P(A_i)$.
Furthermore $P(\Omega) = 1$.

The triple $(\Omega, \mathcal{A}, P)$ is now called a measure space (and, since $P(\Omega) = 1$, more precisely a probability space).
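As a sanity check, here is a minimal sketch of such a space in Python: a fair six-sided die, with the full powerset as the $\sigma$-algebra (fine for a finite $\Omega$) and the uniform counting measure as $P$. The die example is my own, not part of the definition above.

```python
from fractions import Fraction

# Toy probability space for one roll of a fair die (my example).
# omega is the sample space; the sigma-algebra is the full powerset,
# and P assigns each event the fraction of outcomes it contains.
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    """Probability measure: the 'volume' of an event, a subset of omega."""
    assert event <= omega, "events must be subsets of omega"
    return Fraction(len(event), len(omega))

# The defining properties hold:
assert P(set()) == 0                      # P(empty set) = 0
assert P(omega) == 1                      # P(omega) = 1
evens, odds = {2, 4, 6}, {1, 3, 5}
assert P(evens | odds) == P(evens) + P(odds)  # additivity for disjoint events
print(P(evens))  # 1/2
```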

# Random Variables

To make things easy, I only consider real-valued random variables. The
interested reader can certainly try to define random variables into any
measurable space (sorry, Vitali sets).

A real-valued random variable is a function
$X \colon \Omega \to \mathbb{R}$.
To compute the probability of the event that $X$ is $541$ (which is the
hundredth prime, in case you didn’t know) we take the preimage
$X^{-1}(\{541\})$, which lies in $\mathcal{A}$, where we can apply our
probability measure $P$.

We will call this in the following $P(X = 541)$ or $P_X(\{541\})$.

This thing $P_X$ is called the distribution of $X$ (sometimes, slightly sloppily, its probability density). The
idea behind it is that we forget about our measure space
$(\Omega, \mathcal{A}, P)$ and define a new measure on $\mathbb{R}$.

Calculating with that stuff is straightforward. For example $P_X(\mathbb{R}) = P(X \in \mathbb{R}) = P(\Omega) = 1$, which is fortunately exactly what I would suspect.

If we have two random variables $X$ and $Y$ we say that they are independent if and only if $P(X \in A, Y \in B) = P(X \in A) \cdot P(Y \in B)$ for all measurable sets $A, B \subseteq \mathbb{R}$.
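The preimage construction can be made concrete on a finite space. A sketch with my own toy setup: two fair dice, $\Omega$ the set of ordered pairs, and $X$ the sum of the two dice; $P_X$ is then the pushforward of the uniform measure.

```python
from fractions import Fraction
from itertools import product

# Toy setup (mine, not from the post): two fair dice.
omega = list(product(range(1, 7), repeat=2))  # all 36 ordered pairs

def X(w):
    """A real-valued random variable: a function Omega -> R."""
    return w[0] + w[1]

def P_X(value):
    """P(X = value): the measure of the preimage X^{-1}({value})."""
    preimage = [w for w in omega if X(w) == value]
    return Fraction(len(preimage), len(omega))

assert P_X(7) == Fraction(6, 36)               # six outcomes sum to 7
assert sum(P_X(v) for v in range(2, 13)) == 1  # total mass is 1
```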

# Distribution Functions

For every distribution $P_X$ there is the corresponding distribution function $F_X \colon \mathbb{R} \to [0, 1]$, defined by $F_X(x) = P(X \le x)$.
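For a fair die (my running toy example, not from the definition) this distribution function is a step function jumping by $1/6$ at each of $1, \ldots, 6$; a tiny sketch:

```python
from fractions import Fraction

def F(x):
    """Distribution function of a fair die: F(x) = P(X <= x)."""
    # count how many of the six equally likely outcomes are <= x
    return Fraction(sum(1 for k in range(1, 7) if k <= x), 6)

assert F(0) == 0              # nothing below the smallest outcome
assert F(3.5) == Fraction(1, 2)
assert F(6) == 1              # everything is <= 6
```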

# Expected Value

The expected value is the value which you would expect “on average”,
whatever that means exactly. It is defined as
$E[X] = \int_{\mathbb{R}} x \, \mathrm{d}P_X(x)$, if it exists.

This even makes sense intuitively, as we weight the events with their
corresponding probabilities and sum them up. Note that this thing does
not have to exist, since it could be possible that the integral does
not converge.

Sometimes it is also called the mean and denoted with $\mu$.

Fortunately it is linear, i.e., $E[aX + bY] = a E[X] + b E[Y]$ for random variables $X$ and $Y$ and constants $a, b \in \mathbb{R}$, even when $X$ and $Y$ are not independent.
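Linearity can be checked mechanically on a finite space. A sketch with my own toy choices: two dice, $X$ the first coordinate, and a $Y$ that deliberately depends on $X$, so independence plays no role.

```python
from fractions import Fraction
from itertools import product

# Toy space (mine): two fair dice, uniform measure on 36 outcomes.
omega = list(product(range(1, 7), repeat=2))

def E(Z):
    """Expected value on a finite uniform space: average of Z over omega."""
    return sum(Fraction(Z(w), len(omega)) for w in omega)

X = lambda w: w[0]
Y = lambda w: w[0] * w[0]   # completely dependent on X, on purpose
a, b = 2, 3

lhs = E(lambda w: a * X(w) + b * Y(w))
rhs = a * E(X) + b * E(Y)
assert lhs == rhs                # linearity holds despite dependence
assert E(X) == Fraction(7, 2)    # mean of a fair die
```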

# Variance

Variance is a measure for deviation from the expected value, much like the standard deviation. In fact, it is the squared standard deviation: $\operatorname{Var}(X) = E\left[(X - E[X])^2\right]$, and the positive square root $\sigma = \sqrt{\operatorname{Var}(X)}$ is called the standard deviation. Note that you need the squares inside the computation of the variance, since $E[X - E[X]] = E[X] - E[X] = 0$.

You might find it useful to know that:

- $\operatorname{Var}(aX) = a^2 \operatorname{Var}(X)$ for a random variable $X$ and a parameter $a \in \mathbb{R}$.
- $\operatorname{Var}(X) = E[X^2] - \mu^2$ with $\mu = E[X]$, as above.
- $\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$ for
*uncorrelated* random variables $X$ and $Y$. (See below for an explanation of uncorrelatedness.)
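The first two identities can be verified exactly with rational arithmetic. Here is a sketch on a fair die (my example; the scaling factor $a = 3$ is arbitrary):

```python
from fractions import Fraction

# Fair die: outcomes 1..6, each with probability 1/6 (my toy example).
outcomes = range(1, 7)
p = Fraction(1, 6)

def E(f):
    return sum(p * f(k) for k in outcomes)

def Var(f):
    mu = E(f)
    return E(lambda k: (f(k) - mu) ** 2)

X = lambda k: Fraction(k)
a = 3
assert Var(lambda k: a * X(k)) == a * a * Var(X)     # Var(aX) = a^2 Var(X)
assert Var(X) == E(lambda k: X(k) ** 2) - E(X) ** 2  # Var = E[X^2] - E[X]^2
```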

# Covariance

Covariance measures how two random variables $X$ and $Y$ change
together.
$\operatorname{Cov}(X, Y) = E\left[(X - E[X])(Y - E[Y])\right] = E[XY] - E[X] E[Y]$, where the last part
follows by linearity of the expected value and is a good exercise for the interested reader.

Two random variables $X$ and $Y$ are called uncorrelated if
$\operatorname{Cov}(X, Y) = 0$.

By the way, variance can be explained in terms of covariance as
$\operatorname{Var}(X) = \operatorname{Cov}(X, X)$.
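A quick exact check of these covariance facts on two fair dice (my toy example; the independence of the two coordinates gives uncorrelatedness):

```python
from fractions import Fraction
from itertools import product

# Toy space (mine): two fair dice, uniform measure on 36 outcomes.
omega = list(product(range(1, 7), repeat=2))

def E(Z):
    return sum(Fraction(Z(w), len(omega)) for w in omega)

def Cov(Z1, Z2):
    """Cov(Z1, Z2) = E[Z1*Z2] - E[Z1]*E[Z2]."""
    return E(lambda w: Z1(w) * Z2(w)) - E(Z1) * E(Z2)

X = lambda w: w[0]
Y = lambda w: w[1]
S = lambda w: w[0] + w[1]

assert Cov(X, Y) == 0                      # independent => uncorrelated
assert Cov(X, X) == Fraction(35, 12)       # = Var(X), the fair-die variance
assert Cov(X, S) == Cov(X, X) + Cov(X, Y)  # covariance is bilinear
```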

# Normal Distribution

I should probably make a warning sign here, since there is a big
formula ahead. The normal distribution is just a special kind of
distribution. We say that a random variable has a normal distribution
with mean $0$ and variance $1$ if for its probability density $f$ it
holds that
$f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$ for all $x \in \mathbb{R}$.

One then often writes things like $X \sim \mathcal{N}(0, 1)$ to denote
this special relationship.

Damn, why is this monster called the normal distribution? What on earth is normal about that? It will become clear once I explain the central limit theorem. But before doing this I want to show you an even bigger ugly formula, namely the normal distribution with mean $\mu$ and variance $\sigma^2$. For a random variable $X$ we write $X \sim \mathcal{N}(\mu, \sigma^2)$ and say that $X$ has a normal distribution with mean $\mu$ and variance $\sigma^2$ if $f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x - \mu)^2}{2\sigma^2}}$.

To get more familiar with this stuff, I urge you to do the following exercises:

- Show that a random variable $X$ with $X \sim \mathcal{N}(0, 1)$ has mean $0$ and variance $1$. What terms do you have to compute to show this?
- Suppose again $X \sim \mathcal{N}(0, 1)$. Let $Y := \sigma X + \mu$ be another random variable. Show that $Y \sim \mathcal{N}(\mu, \sigma^2)$.
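If you would rather see the claimed mean and variance numerically than compute the integrals, here is a sketch that integrates the standard normal density with a midpoint Riemann sum. The interval $[-10, 10]$ and the step size are my arbitrary choices; the tails outside it are negligible.

```python
import math

def f(x):
    """Standard normal density: exp(-x^2/2) / sqrt(2*pi)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Midpoint Riemann sum over [-10, 10] (my arbitrary truncation).
N = 20000
dx = 20 / N
xs = [-10 + (i + 0.5) * dx for i in range(N)]

mass = sum(f(x) * dx for x in xs)          # should be ~ 1 (total probability)
mean = sum(x * f(x) * dx for x in xs)      # should be ~ 0
var = sum(x * x * f(x) * dx for x in xs)   # should be ~ 1

assert abs(mass - 1) < 1e-3
assert abs(mean) < 1e-3
assert abs(var - 1) < 1e-3
```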

# Central Limit Theorem

So, what is normal about the normal distribution? Suppose we have random variables $X_1$, $X_2$, …, which are independent and identically distributed with mean $\mu$ and variance $\sigma^2$. Now we can of course add finite subsets of them up. We then scale the sum, so the terms do not grow infinitely: we subtract the mean and divide by $\sigma \sqrt{n}$, and let $n$ grow to infinity. Then we get a normal distribution, i.e. $\frac{1}{\sigma \sqrt{n}} \sum_{i=1}^{n} (X_i - \mu) \xrightarrow{d} \mathcal{N}(0, 1)$. Note that this is independent of the distribution of the $X_i$. We only demanded them to be identical.
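A small simulation sketch of the theorem. All concrete choices here are mine: the $X_i$ are uniform on $[0, 1]$ (so $\mu = 1/2$ and $\sigma^2 = 1/12$), and $n$ and the number of trials are arbitrary.

```python
import random
import statistics

# CLT simulation sketch (my parameter choices, deterministic via seed).
random.seed(0)
mu, sigma = 0.5, (1 / 12) ** 0.5   # mean and stdev of uniform(0, 1)
n, trials = 200, 2000

samples = []
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    # standardize: subtract n*mu, divide by sigma*sqrt(n)
    samples.append((s - n * mu) / (sigma * n ** 0.5))

# The standardized sums should look standard normal:
assert abs(statistics.mean(samples)) < 0.1        # mean ~ 0
assert abs(statistics.stdev(samples) - 1) < 0.1   # stdev ~ 1
frac = sum(1 for z in samples if abs(z) <= 1) / trials
assert 0.60 < frac < 0.76                         # ~68% within one sigma
```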

# The End

Congratulations, you finally made it to the end. I suppose not many people make it this far. I hope you enjoyed the journey and learned something along the way.