Parameter estimation methods
Maximum likelihood estimation technique
In most cases, the maximum likelihood estimation (MLE) technique is used to estimate the parameters. If X is a random variable with probability density function f(x) and α is the parameter (as in the one-parameter exponential distribution, where λ is the only parameter), its value can be estimated from a sample data set by formulating the likelihood function. For a sample of size n, the likelihood function is as follows:
L(α) = f(x1, α) · f(x2, α) · f(x3, α) ⋯ f(xn, α)
The maximum likelihood estimate of α is the value α̂ that maximizes the likelihood function L(α).
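As an illustration of maximizing L(α) numerically, the sketch below fits the rate λ of the one-parameter exponential distribution, f(x, λ) = λe^(−λx), to simulated data. The sample size, random seed, and true value λ = 2 are illustrative assumptions, not from the text; the grid maximizer is compared with the known closed-form MLE λ̂ = n/Σxi = 1/x̄.

```python
import math
import random

random.seed(0)

# Illustrative data: a simulated sample from an exponential distribution
# with an assumed true rate of 2.0 (not from the text).
true_lam = 2.0
sample = [random.expovariate(true_lam) for _ in range(5000)]
n = len(sample)

def log_likelihood(lam, data):
    """Log of L(lam) = product of f(x_i, lam) for f(x, lam) = lam * exp(-lam * x)."""
    return len(data) * math.log(lam) - lam * sum(data)

# Crude grid search over candidate rates; for the exponential distribution
# the closed-form MLE is lam_hat = n / sum(x_i) = 1 / (sample mean).
candidates = [0.01 * k for k in range(1, 1000)]
lam_numeric = max(candidates, key=lambda lam: log_likelihood(lam, sample))
lam_closed = n / sum(sample)

print(round(lam_numeric, 2), round(lam_closed, 2))
```

Maximizing the log of L(α) instead of L(α) itself is the usual design choice: the logarithm turns the product into a sum, is maximized at the same point, and avoids numerical underflow for large n.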
Using the MLE technique, if the population data is assumed to follow a Bernoulli distribution, the parameter p of the distribution can be estimated from a random sample of size n drawn from the population, as shown below.
The probability mass function of a Bernoulli variable is p(x) = p^x (1 − p)^(1−x), for x = 0, 1.
The likelihood function resulting from the sample of size n is:

L(p) = p^(Σxi) (1 − p)^(n − Σxi), where Σxi = x1 + x2 + … + xn
The log likelihood function can be written as follows:

ln L(p) = (Σxi) ln p + (n − Σxi) ln(1 − p)
At the maximum of the log likelihood, its derivative with respect to p is equal to zero:

d(ln L)/dp = (Σxi)/p − (n − Σxi)/(1 − p) = 0

Solving for p gives the maximum likelihood estimator:

p̂ = (Σxi)/n = x̄
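A minimal numerical check of the Bernoulli result: the MLE of p is the sample proportion of successes, (Σxi)/n, which should yield a log likelihood at least as large as nearby candidate values. The true value p = 0.3, sample size, and seed below are illustrative assumptions.

```python
import math
import random

random.seed(1)

# Illustrative Bernoulli sample with an assumed true p = 0.3 (not from the text).
true_p = 0.3
sample = [1 if random.random() < true_p else 0 for _ in range(10_000)]
n = len(sample)

# Closed-form MLE from the derivation: p_hat = (sum of x_i) / n,
# i.e. the observed proportion of successes.
p_hat = sum(sample) / n

def log_likelihood(p, data):
    """ln L(p) = (sum x_i) ln p + (n - sum x_i) ln(1 - p)."""
    s = sum(data)
    return s * math.log(p) + (len(data) - s) * math.log(1 - p)

print(round(p_hat, 3))
```

Evaluating `log_likelihood` at `p_hat` and at slightly perturbed values (e.g. `p_hat ± 0.01`) confirms that the sample proportion is indeed the maximizer.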
The estimators obtained from the MLE technique are not always unbiased. One way to reduce the bias is to increase the sample size used in the parameter estimation. Under fairly general regularity conditions, maximum likelihood estimators are consistent.
Method of Moments
This method of parameter estimation is based on the moments of the random variable about the origin. Moments about the origin are calculated from the sample data and equated to the corresponding theoretical moments of the assumed distribution. For example, if population data is assumed to follow a normal distribution, the first two moments about the origin are µ and σ² + µ². Equating these moments to the moments about the origin calculated from a sample of size n results in the following two equations:

µ = (1/n) Σxi
σ² + µ² = (1/n) Σxi²
Solving the above two equations results in the following estimators for µ and σ²:

µ̂ = x̄ = (1/n) Σxi
σ̂² = (1/n) Σxi² − x̄² = (1/n) Σ(xi − x̄)²
Often the estimators from the method of moments are reasonably good, but their variances are generally larger than those of estimators obtained from other methods such as maximum likelihood.
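The normal-distribution example can be sketched as follows: compute the first two sample moments about the origin, then solve the two moment equations for µ and σ². The true values µ = 5, σ = 2, the sample size, and the seed are illustrative assumptions.

```python
import math
import random

random.seed(2)

# Illustrative normal sample with assumed true mu = 5.0, sigma = 2.0.
mu_true, sigma_true = 5.0, 2.0
sample = [random.gauss(mu_true, sigma_true) for _ in range(20_000)]
n = len(sample)

# Sample moments about the origin.
m1 = sum(sample) / n                 # estimates E[X] = mu
m2 = sum(x * x for x in sample) / n  # estimates E[X^2] = sigma^2 + mu^2

# Solving mu = m1 and sigma^2 + mu^2 = m2 gives the moment estimators.
mu_hat = m1
sigma2_hat = m2 - m1 * m1  # equivalently (1/n) * sum((x_i - x_bar)^2)

print(round(mu_hat, 2), round(math.sqrt(sigma2_hat), 2))
```

Note that for the normal distribution the moment estimators happen to coincide with the maximum likelihood estimators; for other distributions the two methods can give different answers.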
Confidence Interval Estimation
As discussed earlier, it is also possible to estimate confidence intervals for the population parameters in terms of the sample statistics. This requires knowledge of the sampling distributions of those statistics. The following sections deal with the determination of confidence intervals for the population mean and population variance when the population data is approximately normally distributed. A further section deals with the estimation of a confidence interval for the proportion of favorable outcomes (related to an experiment) in the population data.
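As a preview of the large-sample case, the sketch below computes an approximate 95% confidence interval for a population mean from the sampling distribution of the sample mean, using x̄ ± z · s/√n with the standard normal quantile z. The simulated data and its parameters are illustrative assumptions.

```python
import math
import random
import statistics

random.seed(3)

# Illustrative sample; for large n the sampling distribution of the sample
# mean is approximately normal, so a 95% CI for mu is x_bar +/- z * s / sqrt(n).
sample = [random.gauss(10.0, 3.0) for _ in range(2000)]
n = len(sample)

x_bar = statistics.mean(sample)
s = statistics.stdev(sample)                # sample standard deviation
z = statistics.NormalDist().inv_cdf(0.975)  # ~1.96 for 95% confidence

half_width = z * s / math.sqrt(n)
ci = (x_bar - half_width, x_bar + half_width)
print(tuple(round(v, 2) for v in ci))
```

For small samples, the t distribution would replace the standard normal quantile, as discussed in the sections that follow.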