# Bioinformatics/Likelihood Algorithms

If we conduct the same experiment many times, the measured parameters tend to adopt certain values (provided our experimental conditions lead to a convergence of the measured values). These values then occur with a frequency that is higher than the frequency of random values. We can therefore estimate (if no systematic error is hidden in our experimental setup) that the more frequently a value is measured, the more probable it is. This idea underlies maximum likelihood estimation: the maximum likelihood estimate of a parameter $j$ is the value of $j$ that maximises the probability $P(i|j)$ of the observed data $i$ given $j$. $P(i|j)$ is called a conditional probability, because it is the probability of $i$ under the condition that $j$ is given.

The probability that two events $k$ and $l$ happen at the same time is called their joint probability:

$P(k,l) = P(k|l) \cdot P(l)$

(the probability of $k$ and $l$ equals the probability of $k$ given $l$ times the probability of $l$). The marginal probability of a variable can be recovered from joint or conditional probabilities by summing over all possible values of the other variable:

$P(m) = \sum\limits_{n} P(m,n) = \sum\limits_{n} P(m|n) \cdot P(n)$
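These relationships can be sketched in a few lines of Python. The measurement list and the weather/temperature joint table below are made-up illustrative values, not data from the text; the sketch shows the frequency-based maximum likelihood estimate and verifies the chain rule and marginalisation numerically.

```python
from collections import Counter

# Hypothetical repeated measurements of a discrete quantity.
measurements = ["A", "B", "A", "A", "C", "B", "A", "A", "B", "A"]

# Maximum likelihood estimate of P(value): the observed relative frequency.
counts = Counter(measurements)
total = len(measurements)
p_hat = {value: count / total for value, count in counts.items()}
print(p_hat["A"])  # "A" occurs 6 out of 10 times, so its MLE probability is 0.6

# An assumed joint distribution P(k, l) over two variables, as a small table.
p_joint = {
    ("rain", "cold"): 0.3,
    ("rain", "warm"): 0.1,
    ("dry",  "cold"): 0.2,
    ("dry",  "warm"): 0.4,
}

def p_marginal(l):
    # Marginal probability P(l): sum the joint over all values of k.
    return sum(p for (k2, l2), p in p_joint.items() if l2 == l)

def p_conditional(k, l):
    # Conditional probability P(k | l) = P(k, l) / P(l).
    return p_joint[(k, l)] / p_marginal(l)

# Chain rule: P(k, l) == P(k | l) * P(l)
assert abs(p_joint[("rain", "cold")]
           - p_conditional("rain", "cold") * p_marginal("cold")) < 1e-12

# Marginalisation: P(m) = sum over n of P(m, n)
p_rain = sum(p for (k, l), p in p_joint.items() if k == "rain")
print(p_rain)  # 0.3 + 0.1 = 0.4
```

Estimating probabilities by relative frequency is exactly the maximum likelihood estimate for a categorical distribution, which is why the repeated-experiment intuition above leads directly to MLE.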