# parametric PDF estimation: histogram vs likelihood

## Question: parametric PDF estimation: histogram vs likelihood

Given a sample from a distribution, and assuming it is Gaussian (normal distribution with unknown mu, sigma), the task is to find the parameters mean and standard deviation that describe it best.

What is the mathematical difference between the two approaches below, and why do they yield different results? And if they differ, when should each method be used? I think both are parametric and can be used in the same cases.

1. do a least-squares curve fit of a normal PDF to the histogram, with parameters mu, sigma
2. maximize the likelihood = sum of PDF(mu, sigma) over the samples
``````
import numpy as np
from matplotlib import pyplot as plt
from scipy.stats import norm

# define true mean and std for the Gaussian normal distribution

mean = 5.0
std = 2.0

# generate random variates (samples) and get a density histogram

samples = np.random.normal(loc=mean, scale=std, size=100)
hist, bin_edges = np.histogram(samples, density=True)
midpoints = (bin_edges[:-1] + bin_edges[1:]) / 2.

# fit the Gaussian to find mean and std

def func(x, mean, std):
    return norm.pdf(x, loc=mean, scale=std)

from scipy.optimize import curve_fit

popt, pcov = curve_fit(func, midpoints, hist)
fit_mean, fit_std = popt

print("fitted mean,std:", fit_mean, fit_std)
print("sample mean,std:", np.mean(samples), np.std(samples))

# negative log likelihood approach

def normaldistribution_negloglikelihood(params):
    mu, sigma = params
    return -np.log(np.sum(norm.pdf(samples, loc=mu, scale=sigma)))
    #return -np.sum(norm.pdf(samples, loc=mu, scale=sigma))

from scipy.optimize import minimize

result = minimize(normaldistribution_negloglikelihood, x0=[0, 1],
                  bounds=((None, None), (1e-5, None)))  #, method='Nelder-Mead')

if result.success:
    fitted_params = result.x
    #print("fitted_params", fitted_params)
else:
    raise ValueError(result.message)

nll_mean, nll_std = fitted_params
print("neg LL mean,std:", nll_mean, nll_std)

# plot

plt.plot(midpoints, hist, label="sample histogram")

x = np.linspace(-5, 15, 500)
plt.plot(x, norm.pdf(x, loc=mean, scale=std), label="true PDF")
plt.plot(x, norm.pdf(x, loc=fit_mean, scale=fit_std), label="fitted PDF")
plt.plot(x, norm.pdf(x, loc=nll_mean, scale=nll_std), label="neg LL estimator")

plt.legend()
plt.show()
``````
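One concrete reason the two approaches can disagree: the least-squares fit works on binned data, so its result depends on how the histogram is binned, while the likelihood approach uses the raw samples directly. Below is a minimal sketch of this (my own illustration, reusing the question's setup and a fixed seed; the starting guess `p0` is taken from the sample moments so the fit converges):

``````python
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
samples = rng.normal(loc=5.0, scale=2.0, size=100)

def func(x, mean, std):
    return norm.pdf(x, loc=mean, scale=std)

# least-squares fit: the estimate changes as the binning changes
for bins in (5, 10, 50):
    hist, edges = np.histogram(samples, bins=bins, density=True)
    mid = (edges[:-1] + edges[1:]) / 2.
    popt, _ = curve_fit(func, mid, hist, p0=[np.mean(samples), np.std(samples)])
    print("bins =", bins, "-> fitted mean,std:", popt)

# the Gaussian maximum-likelihood estimate has a closed form and
# ignores binning entirely: sample mean, biased sample std (ddof=0)
print("MLE mean,std:", np.mean(samples), np.std(samples))
``````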

## Answer 1: parametric PDF estimation: histogram vs likelihood

Your likelihood is wrong: you should sum the log of the pdf over the samples, not take the log of the sum of pdf values, so:

``````
def normaldistribution_negloglikelihood(params):
    mu, sigma = params
    return -np.sum(norm.logpdf(samples, loc=mu, scale=sigma))

result = minimize(normaldistribution_negloglikelihood, x0=[0, 1],
                  bounds=((None, None), (1e-5, None)))
``````
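A quick sanity check of the corrected estimator (a self-contained sketch with synthetic data; variable names are mine): for a Gaussian, the minimizer of the summed negative log-pdf should agree with the closed-form MLE, i.e. the sample mean and the biased sample standard deviation.

``````python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(1)
samples = rng.normal(loc=5.0, scale=2.0, size=1000)

def negloglikelihood(params):
    mu, sigma = params
    # correct form: sum of log-pdf over all samples
    return -np.sum(norm.logpdf(samples, loc=mu, scale=sigma))

res = minimize(negloglikelihood, x0=[0, 1],
               bounds=((None, None), (1e-5, None)))
mu_hat, sigma_hat = res.x

# closed-form Gaussian MLE: sample mean and biased sample std (ddof=0)
print("optimizer:", mu_hat, sigma_hat)
print("closed form:", np.mean(samples), np.std(samples))
``````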