# Bayesian Inference

Bayesian probability is an interpretation of the axioms of probability as degrees of belief.

Recall Bayes' rule: \[P(H\mid E) = \frac{P(E\mid H)P(H)}{P(E)}\]

In Bayesian probability, the above terms are given the following interpretations:

- \(P(H)\) is the *prior* probability of the hypothesis, before the evidence is considered.
- \(P(H \mid E)\) is the *posterior*, the updated probability of the hypothesis after the evidence is considered.
- \(P(E)\) is the probability of seeing the evidence. It is sometimes called the *marginal likelihood*.
- \(P(E \mid H)\) is the *likelihood* of seeing \(E\) in the case that \(H\) is true. Often, the likelihood is treated as a function of \(H\), with \(E\) held fixed. Note that "likelihood" and "probability" are often used interchangeably in common usage.
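The update above can be sketched numerically for a discrete hypothesis space. The scenario below (a coin that is either fair or biased, with one observed heads) is illustrative and not from the text; the names and numbers are assumptions chosen to make each term of Bayes' rule explicit.

```python
# Hypotheses: the coin's heads probability is 0.5 ("fair") or 0.9 ("biased").
prior = {"fair": 0.5, "biased": 0.5}       # P(H): degrees of belief before evidence
likelihood = {"fair": 0.5, "biased": 0.9}  # P(E | H), where E = "one flip lands heads"

# Marginal likelihood: P(E) = sum over H of P(E | H) * P(H)
marginal = sum(likelihood[h] * prior[h] for h in prior)

# Posterior via Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)
posterior = {h: likelihood[h] * prior[h] / marginal for h in prior}

print(posterior)  # {'fair': 0.357..., 'biased': 0.642...}
```

Observing heads shifts belief toward the biased coin, since that hypothesis assigns the evidence a higher likelihood.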

Bayes' rule gives a way to rationally update beliefs, but it does not seem to be the only rational way (see radical probabilism).

(see Bayesian hypothesis test)