
A Gentle Tutorial in Bayesian Statistics

Theo Kypraios
http://www.maths.nott.ac.uk/∼tk

School of Mathematical Sciences − Division of Statistics

Division of Radiological and Imaging Sciences Away Day



Warning

This talk includes about five equations (hopefully not too hard!).

The tutorial should be accessible even if the equations look hard
at first glance.



The likelihood function

The likelihood function plays a fundamental role in statistical
inference.

In non-technical terms, the likelihood function is the function
which, when evaluated at a particular point, say (α0, β0), gives the
probability of observing the (observed) data given that the
parameters (α, β) take the values α0 and β0.

Let’s think of a very simple example:

Suppose we are interested in estimating the probability of
success (denoted by θ) for one particular experiment.

Data: Out of 100 times we repeated the experiment we
observed 80 successes.
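For this example the likelihood can be written down and evaluated directly: it is the binomial probability of 80 successes in 100 trials, viewed as a function of θ. A minimal sketch in Python (the slides contain no code; the function name and the grid of θ values are illustrative):

```python
from math import comb

def likelihood(theta, n=100, k=80):
    """Probability of observing k successes in n trials,
    given success probability theta (binomial model)."""
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# Evaluate the likelihood at a few candidate values of theta
for theta in (0.5, 0.7, 0.8, 0.9):
    print(f"L({theta:.1f}) = {likelihood(theta):.3g}")
```

Evaluating on a grid shows the likelihood peaking near θ = 0.8, which matches the observed proportion of successes.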


Classical (Frequentist) Inference

Frequentist inference tells us that:

we should look for the parameter values that maximise the
likelihood function → the maximum likelihood estimator (MLE)

we quantify a parameter’s uncertainty via the calculation of
standard errors . . .

. . . which in turn enable us to construct confidence intervals
for the parameters.
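For the binomial example these frequentist quantities have simple closed forms. A small sketch (the Wald-type interval and the 1.96 multiplier are the standard textbook choices, not something stated on the slide):

```python
from math import sqrt

n, k = 100, 80               # 80 successes out of 100 trials
theta_hat = k / n            # MLE of the success probability
se = sqrt(theta_hat * (1 - theta_hat) / n)           # standard error of the MLE
ci = (theta_hat - 1.96 * se, theta_hat + 1.96 * se)  # approximate 95% CI

# MLE = 0.8, SE = 0.04, CI ≈ (0.722, 0.878)
print(f"MLE = {theta_hat}, SE = {se:.3f}, "
      f"95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

Note that the interval relies on a normal approximation to the sampling distribution of the MLE, which is exactly the "approximate" aspect criticised on the next slide.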

What’s wrong with that?

Nothing, but . . .

. . . it is approximate, counter-intuitive (the data are assumed to
be random while the parameter is fixed) and often mathematically
intractable.


An Example in DW-MRI Analysis (2)

Suppose that we have some measurements (intensities) for
each voxel.

We could fit the two different models (on the same dataset).

Question: How do we tell which model fits the data best
taking into account the uncertainty associated with the
parameters in each model?
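One standard Bayesian answer is to compare marginal likelihoods: each model's likelihood is integrated over the prior on its parameters, so parameter uncertainty is accounted for automatically. A toy sketch, using the earlier success-probability data rather than actual DW-MRI models, with two illustrative candidate models (both invented here for the sake of the example):

```python
from math import comb

n, k = 100, 80  # the earlier example: 80 successes in 100 trials

def lik(theta):
    """Binomial likelihood of the data at success probability theta."""
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# Model 1: theta known to be 0.5 (no free parameters), so its
# marginal likelihood is just the likelihood at 0.5
m1 = lik(0.5)

# Model 2: theta unknown, uniform prior on (0, 1). The marginal
# likelihood integrates lik(theta) over the prior; a simple
# midpoint rule is enough for this one-dimensional toy problem
N = 10_000
m2 = sum(lik((i + 0.5) / N) for i in range(N)) / N

bayes_factor = m2 / m1  # strongly favours Model 2 for these data
print(f"m1 = {m1:.3g}, m2 = {m2:.3g}, BF = {bayes_factor:.3g}")
```

In realistic models the integral is high-dimensional and has to be approximated (e.g. by Monte Carlo methods), which is part of the computational cost mentioned in the conclusions.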


Conclusions

Quantification of the uncertainty both in parameter estimation
and model choice is essential in any modelling exercise.

A Bayesian approach offers a natural framework to deal with
parameter and model uncertainty.

It offers much more than a single “best fit” or any sort of
“sensitivity analysis”.

There is no free lunch, unfortunately: to do fancy things, one
often has to write one’s own computer programs.

Software available: R, WinBUGS, BayesX . . .
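To make the contrast with the frequentist analysis concrete: for the earlier success-probability example, a Bayesian treatment with a conjugate Beta prior gives the full posterior in closed form. A sketch in plain Python (the uniform Beta(1, 1) prior is an illustrative assumption; the slides themselves contain no code):

```python
n, k = 100, 80             # the earlier data: 80 successes in 100 trials
a0, b0 = 1, 1              # Beta(1, 1) = uniform prior (illustrative choice)
a, b = a0 + k, b0 + n - k  # conjugate update: posterior is Beta(81, 21)

post_mean = a / (a + b)    # posterior mean, ~0.794

# Equal-tailed 95% credible interval via a grid approximation
# of the posterior CDF (avoids any external libraries)
N = 100_000
grid = [(i + 0.5) / N for i in range(N)]
weights = [t ** (a - 1) * (1 - t) ** (b - 1) for t in grid]
total = sum(weights)
cum, lo, hi = 0.0, None, None
for t, w in zip(grid, weights):
    cum += w / total
    if lo is None and cum >= 0.025:
        lo = t
    if hi is None and cum >= 0.975:
        hi = t

print(f"posterior mean = {post_mean:.3f}, "
      f"95% credible interval = ({lo:.3f}, {hi:.3f})")
```

Unlike the frequentist confidence interval, the credible interval is a direct probability statement about θ given the data, and the same posterior can feed straight into model comparison.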
