Likelihood


The likelihood tells you how probable it is that your model produces your data; this measure tells you how good your model is. First we need to understand the concept of likelihood, and this brings us into probability theory. We start by studying random numbers on a computer.

Random Numbers in MATLAB

Start MATLAB and type the following at the MATLAB prompt:

rand

This produces a random number. Repeat several times. Continue typing the material shown in boxes at the MATLAB prompt.

help rand

This returns the help text for the rand command. We'll discuss this help file. Try

rand(3)
rand(3,2)
rand(3,2,2)

These commands allow you to create arrays of random numbers without loops. Compare how long it takes to run

A = rand(10000,1);

With

for i = 1:10000
B(i,1) = rand;
end

Note, however, that there are really two issues with the second method: first, we could have avoided the loop, and second, we successively built a larger and larger array without first initializing (preallocating) it. Now that B is defined, repeating the second set of commands will go much faster, but still not as fast as avoiding the loop.
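
If you want to time the comparison yourself, here is one possible sketch (tic and toc measure elapsed time; zeros preallocates an array, which avoids the repeated resizing mentioned above):

% Vectorized: build the whole array in one call
tic
A = rand(10000,1);
toc

% Loop that grows B one element at a time (B is resized on every pass)
clear B
tic
for i = 1:10000
B(i,1) = rand;
end
toc

% Loop with preallocation: fixes the resizing issue but keeps the loop
C = zeros(10000,1);
tic
for i = 1:10000
C(i,1) = rand;
end
toc

Typically the vectorized call and the preallocated loop both run much faster than the growing-array loop.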

Note that a semicolon suppresses the display of a command's result on the screen. Try it without the semicolon:

A = rand(10000,1)

Finally, if you are running a model that uses random numbers it is useful to be able to get the same random numbers again to reproduce your results. This can be done because computer random numbers are not really random; they are pseudo-random. The number returned by the generator is completely determined by the "seed" used to initialize the generator and the number of times that the generator has been used since the last initialization. Try:

rand('state',0);
rand
rand
rand
rand('state',0);
rand
rand
rand
rand('state',1);
rand
rand
rand

The command "clock" can be used to set the seed in "random" way. Use "up arrow" to repeat a command.

clock
clock
clock
sum(100*clock)
sum(100*clock)
sum(100*clock)
rand('state',sum(100*clock));
rand
rand
rand('state',sum(100*clock));
rand
rand


If you want a "random" seed but still want to be able to reproduce your simulations, use:

s = sum(100*clock);
rand('state',s);
rand
rand
rand('state',s);
rand
rand
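
The rand('state',...) syntax above is the seeding interface in the MATLAB version this page was written for. Newer MATLAB releases provide the rng command for the same bookkeeping; a minimal sketch, assuming your release includes rng:

s = rng;          % save the current generator settings
rand(1,3)
rng(s);           % restore the saved settings
rand(1,3)         % the same three numbers again
rng(0);           % seed with a fixed integer
rng('shuffle');   % seed from the clock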

Probability

We want to use the command rand to roll a die. The command rand returns a number between 0 and 1. So 6*rand returns a number between 0 and 6. The command ceil rounds numbers up.

help ceil
ceil(6*rand)

Repeat this command several times. But now let's create a new command. Open the Editor and type

function d = die
% DIE Roll a die
% d = die returns a random integer between 1 and 6, inclusive

d = ceil(6*rand);

Save the file as die.m then type

die
die
die

For the next step it is useful to get more than one throw of the die. Save the file as dice.m, then change it to

function d = dice(n)
% DICE Rolls dice
% d = dice(n) returns n random integers between 1 and 6, inclusive

d = ceil(6*rand(n,1));

Now some statistics:

mean(dice(100))
mean(dice(1000))
mean(dice(10000))
mean(dice(100000))
  • Homework Problem 1: Make a guess as to whether the mean converges as the number of dice increases. Assuming you do not know the answer, what numerical experiments would increase your confidence in this regard?

Now try the same thing for

median
mode
std
min
max

The command std computes the standard deviation.
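
One way to organize these experiments is a short script that prints each statistic for increasing numbers of dice (a sketch; it only automates the commands suggested above):

for n = [100 1000 10000 100000]
A = dice(n);
fprintf('n = %6d: mean %.4f median %g mode %g std %.4f min %g max %g\n', ...
n, mean(A), median(A), mode(A), std(A), min(A), max(A));
end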

  • Homework Problem 2: Make guesses as to whether each of these statistics converges as the number of dice increases.

Probability Mass Function

The PMF gives the theoretical fraction of times each value is returned on repeated experiments. For a fair die

f(i) = 1/6, for i = 1,2,3,4,5,6

f(i) = 0 otherwise.

For a loaded die, some of the six non-zero values of f would differ from 1/6, but they must still add to 1.

The statistics considered above can be calculated theoretically from the PMF. There are theorems that show convergence of these sample statistics to their theoretical values ... or counterexamples showing that convergence does not happen. Warning: convergence is a subtle concept in probability theory.

Of note: the theoretical "mode" for a loaded die is the value i which maximizes f. For a fair die, all six values of f are equal, so a unique theoretical "mode" does not exist. The idea of "most probable" is important to us.
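
As a concrete illustration, here is a sketch of computing the theoretical mean, standard deviation, and mode from a PMF stored as a vector f (the fair-die PMF is used here; replace f with any six nonnegative values summing to 1 for a loaded die):

v = (1:6)';                      % the possible outcomes
f = ones(6,1)/6;                 % PMF of a fair die
m = sum(v .* f)                  % theoretical mean: 3.5
s = sqrt(sum((v - m).^2 .* f))   % theoretical standard deviation: about 1.71
[fmax, imode] = max(f);          % an i that maximizes f
v(imode)                         % for a fair die this tie-break is arbitrary: no unique mode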

Plotting the PMF: Roll the dice and create a histogram with 6 bins:

help hist
type hist
A = dice(100);
hist(A,6)

Repeat several times. Then try:

A = dice(1000);
hist(A,6)
A = dice(10000);
hist(A,6)
hist(A,60)
hist(A,600)

Notice that as the bins get narrower, the count in each bin that contains one of the integers 1, 2, 3, 4, 5, 6 does not change (the other bins are empty). Also notice that the height of each such bin, divided by the total number of dice, is approximately

1/6

Probability theory defines probability with axioms (we will not study the axioms). The most common interpretation of the probability of an outcome is that it is the fraction of times the outcome would occur if the experiment (rolling the die) could be repeated many times. The PMF gives the probability of occurrence of each discrete outcome: 1, 2, 3, 4, 5, or 6.
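
To connect the histogram counts to the PMF, here is a sketch that converts the counts into empirical frequencies; each entry should settle near 1/6 as the number of dice grows:

A = dice(100000);
counts = hist(A, 1:6)      % one bin centered on each face
counts / length(A)         % empirical PMF: each entry should be close to 1/6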