algorithms.clustering.ggmixture¶
Module: algorithms.clustering.ggmixture¶
Inheritance diagram for nipy.algorithms.clustering.ggmixture
One-dimensional Gamma-Gaussian mixture density classes: given a set of points, the algorithm provides approximate maximum likelihood estimates of the mixture distribution using an EM algorithm.
Author: Bertrand Thirion and Merlin Keller 2005-2008
Classes¶
GGGM¶
- class nipy.algorithms.clustering.ggmixture.GGGM(shape_n=1, scale_n=1, mean=0, var=1, shape_p=1, scale_p=1, mixt=array([0.33333333, 0.33333333, 0.33333333]))¶
Bases: object
The basic one-dimensional Gamma-Gaussian-Gamma mixture estimation class, where the first gamma has a negative sign and the second one has a positive sign.
7 parameters are used:
- shape_n: negative gamma shape
- scale_n: negative gamma scale
- mean: gaussian mean
- var: gaussian variance
- shape_p: positive gamma shape
- scale_p: positive gamma scale
- mixt: array of mixture parameters (weights of the negative gamma, the gaussian, and the positive gamma)
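A minimal construction sketch (the parameter values below are arbitrary illustrations, not recommended settings):

import numpy as np
from nipy.algorithms.clustering.ggmixture import GGGM

# Three-component mixture: half the prior mass on the central gaussian,
# a quarter on each gamma tail (illustrative values only).
model = GGGM(shape_n=2., scale_n=1., mean=0., var=1.,
             shape_p=2., scale_p=1.,
             mixt=np.array([0.25, 0.5, 0.25]))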
- __init__(shape_n=1, scale_n=1, mean=0, var=1, shape_p=1, scale_p=1, mixt=array([0.33333333, 0.33333333, 0.33333333]))¶
Constructor
- Parameters:
- shape_n: float, optional
- scale_n: float, optional
parameters of the negative gamma; must be positive
- mean: float, optional
- var: float, optional
parameters of the gaussian; var must be positive
- shape_p: float, optional
- scale_p: float, optional
parameters of the positive gamma; must be positive
- mixt: array of shape (3,), optional
the mixing proportions; they should be positive and sum to 1
- Estep(x)¶
Update probabilistic memberships of the three components
- Parameters:
- x: array of shape (nbitems,)
the input data
- Returns:
- z: ndarray of shape (nbitems, 3)
probabilistic membership
Notes
z[:, 0] is the membership of the negative gamma, z[:, 1] is the membership of the gaussian, and z[:, 2] is the membership of the positive gamma.
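Conceptually, the E-step applies Bayes' rule: each component likelihood is weighted by its mixing proportion, and each row is normalized to sum to 1. The sketch below illustrates the idea; it is not the library's source, and lik and weights are hypothetical names:

def e_step_sketch(lik, weights):
    # lik: array of shape (nbitems, 3), per-component likelihoods
    # weights: array of shape (3,), mixing proportions
    z = lik * weights                    # prior-weighted likelihoods
    z /= z.sum(axis=1, keepdims=True)    # normalize each row to sum to 1
    return z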
- Mstep(x, z)¶
M-step of the estimation: maximum likelihood update of the parameters of the three components (see the sketch after the parameter list)
- Parameters:
- x: array of shape (nbitems,)
input data
- z: array of shape (nbitems, 3)
probabilistic membership
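For intuition, the weighted maximum likelihood updates for the gaussian component and the mixing proportions follow the standard EM pattern sketched below (illustrative only; the two gamma components require their own weighted gamma ML updates, omitted here):

import numpy as np

def m_step_gaussian_sketch(x, z):
    # z[:, 1] holds the gaussian memberships (middle column of 3).
    w = z[:, 1]
    mean = np.average(x, weights=w)               # membership-weighted mean
    var = np.average((x - mean) ** 2, weights=w)  # membership-weighted variance
    mixt = z.mean(axis=0)                         # updated mixing proportions
    return mean, var, mixt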
- component_likelihood(x)¶
Compute the likelihood of the data x under the three components: negative gamma, gaussian, positive gamma
- Parameters:
- x: array of shape (nbitems,)
the data under evaluation
- Returns:
- ng, y, pg: three arrays of shape (nbitems,)
The likelihood of the data under the 3 components
- estimate(x, niter=100, delta=0.0001, bias=0, verbose=0, gaussian_mix=0)¶
Whole EM estimation procedure:
- Parameters:
- x: array of shape (nbitems,)
input data
- niter: integer, optional
max number of iterations
- delta: float, optional
increment in LL at which convergence is declared
- bias: float, optional
lower bound on the gaussian variance (to avoid shrinkage)
- verbose: 0, 1, or 2, optional
verbosity level
- gaussian_mix: float, optional
if nonzero, lower bound on the gaussian mixing weight (to avoid shrinkage)
- Returns:
- z: array of shape (nbitems, 3)
the membership matrix
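A hedged end-to-end sketch on simulated data (a gaussian bulk plus positive and negative gamma tails; all values are illustrative):

import numpy as np
from nipy.algorithms.clustering.ggmixture import GGGM

rng = np.random.RandomState(42)
x = np.concatenate((rng.randn(1000),          # gaussian bulk
                    rng.gamma(3., 2., 50),    # positive tail
                    -rng.gamma(3., 2., 50)))  # negative tail

model = GGGM()
z = model.estimate(x)   # EM fit; z has shape (1100, 3)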
- init(x, mixt=None)¶
Initialization of the different parameters
- Parameters:
- x: array of shape (nbitems,)
the data to be processed
- mixt: None or array of shape (3,), optional
prior mixing proportions. If None, the classes have equal weight
- init_fdr(x, dof=-1, copy=True)¶
Initialization of the class based on an FDR heuristic: the probability of being in the positive component is proportional to the ‘positive fdr’ of the data. The same holds for the negative part. The point is that the gamma parts should model nothing more than the tails of the distribution.
- Parameters:
- x: array of shape (nbitems,)
the data under consideration
- dof: integer, optional
number of degrees of freedom if x is thought to be a Student variate. By default, it is handled as a normal variate.
- copy: boolean, optional
If True, copy the data.
- parameters()¶
Print the parameters
- posterior(x)¶
Compute the posterior probability of the three components given the data
- Parameters:
- x: array of shape (nbitems,)
the data under evaluation
- Returns:
- ng, y, pg: three arrays of shape (nbitems,)
the posterior probabilities of the 3 components given the data
Notes
ng + y + pg = np.ones(nbitems)
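Continuing the fitting sketch above, the normalization noted here can be checked directly:

ng, y, pg = model.posterior(x)
assert np.allclose(ng + y + pg, 1.0)   # memberships sum to 1 per item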
- show(x, mpaxes=None)¶
Visualization of the mixture, superimposed on the empirical histogram of x
- Parameters:
- x: ndarray of shape (nbitems,)
data
- mpaxes: matplotlib axes, optional
axes handle used for the plot; if None, new axes are created
GGM¶
- class nipy.algorithms.clustering.ggmixture.GGM(shape=1, scale=1, mean=0, var=1, mixt=0.5)¶
Bases: object
This is the basic one-dimensional Gaussian-Gamma mixture estimation class. Note that it can work with positive or negative values, as long as there is at least one positive value. NB: the gamma distribution is defined only on positive values.
5 scalar members are used:
- mean: gaussian mean
- var: gaussian variance (non-negative)
- shape: gamma shape (non-negative)
- scale: gamma scale (non-negative)
- mixt: mixture parameter (non-negative, weight of the gamma)
- __init__(shape=1, scale=1, mean=0, var=1, mixt=0.5)¶
- Estep(x)¶
E-step of the estimation: estimation of data membership
- Parameters:
- x: array of shape (nbitems,)
input data
- Returns:
- z: array of shape (nbitems, 2)
the membership matrix
- Mstep(x, z)¶
M-step of the model: maximum likelihood estimation of the parameters of the model
- Parameters:
- x: array of shape (nbitems,)
input data
- z: array of shape (nbitems, 2)
the membership matrix
- estimate(x, niter=10, delta=0.0001, verbose=False)¶
Complete EM estimation procedure
- Parameters:
- x: array of shape (nbitems,)
the data to be processed
- niter: int, optional
max number of iterations
- delta: float, optional
criterion for convergence
- verbose: bool, optional
If True, print values during iterations
- Returns:
- LL: float
average final log-likelihood
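A minimal fitting sketch for GGM on simulated data (names and values are illustrative):

import numpy as np
from nipy.algorithms.clustering.ggmixture import GGM

rng = np.random.RandomState(1)
x = np.concatenate((rng.randn(800),           # gaussian bulk
                    rng.gamma(3., 2., 200)))  # positive gamma tail

model = GGM()
ll = model.estimate(x)      # returns the average final log-likelihood
y, pg = model.posterior(x)  # per-item gaussian vs. gamma posteriors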
- parameters()¶
Print the parameters of self
- posterior(x)¶
Posterior probability of observing the data x for each component
- Parameters:
- x: array of shape (nbitems,)
the data to be processed
- Returns:
- y, pg: arrays of shape (nbitems,)
the posterior probabilities
- show(x)¶
Visualization of the mixture model, based on the empirical histogram of x
- Parameters:
- x: array of shape (nbitems,)
the data to be processed
Gamma¶
- class nipy.algorithms.clustering.ggmixture.Gamma(shape=1, scale=1)¶
Bases: object
Basic one-dimensional Gamma density class. NB: the gamma distribution is defined only on positive values.
2 parameters are used:
- shape: gamma shape
- scale: gamma scale
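A minimal fitting sketch, assuming only the signatures documented below:

import numpy as np
from nipy.algorithms.clustering.ggmixture import Gamma

rng = np.random.RandomState(0)
g = Gamma()
g.estimate(rng.gamma(2., 3., size=500))  # ML estimation of shape and scale
g.parameters()                           # print the fitted parameters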
- __init__(shape=1, scale=1)¶
- check(x)¶
- estimate(x, eps=1e-07)¶
ML estimation of the Gamma parameters
- parameters()¶
Print the parameters