Posterior Probability

#artificialintelligence 

In statistics, the posterior probability expresses how likely a hypothesis is given a particular set of observed data. This contrasts with the likelihood function, written P(D | H), which expresses how likely the data are under a given hypothesis. The distinction is one of interpretation rather than mathematics, since both take the form of a conditional probability. To calculate the posterior probability we use Bayes' theorem, discussed below. Bayes' theorem combines the likelihood P(D | H) with the prior P(H) and the marginal likelihood P(D) to produce the posterior P(H | D).
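
Written out, Bayes' theorem is

P(H | D) = \frac{P(D | H)\, P(H)}{P(D)}

where P(D) can be expanded over the hypothesis and its complement as P(D) = P(D | H) P(H) + P(D | \neg H) P(\neg H). As a rough illustration of how the pieces fit together, here is a minimal Python sketch for a binary hypothesis; the prior, likelihood, and false-positive numbers are hypothetical, chosen only to make the arithmetic concrete.

```python
# Minimal sketch of Bayes' theorem for a binary hypothesis.
# All numeric values below are illustrative assumptions, not from the text.

def posterior(prior, likelihood, likelihood_given_not_h):
    """Return P(H | D) via Bayes' theorem."""
    # Marginal likelihood P(D), summing over H and not-H
    marginal = likelihood * prior + likelihood_given_not_h * (1 - prior)
    return likelihood * prior / marginal

# Hypothetical example: P(H) = 0.01, P(D | H) = 0.95, P(D | not H) = 0.05
print(posterior(0.01, 0.95, 0.05))  # roughly 0.16
```

Even with a fairly strong likelihood, a small prior keeps the posterior modest here, which is the usual intuition Bayes' theorem is meant to capture.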