

Module: neurospin.utils.two_binomial_mixture

Inheritance diagram for nipy.neurospin.utils.two_binomial_mixture: TwoBinomialMixture

class nipy.neurospin.utils.two_binomial_mixture.TwoBinomialMixture(r0=0.2, r1=0.8, l=0.9, v=0)

Basic fitting of a mixture of two binomial distributions. It contains the following fields:

- r0 = 0.2: the parameter of the first binomial
- r1 = 0.8: the parameter of the second binomial
- lambda = 0.9: the mixture parameter (proportion of the first component)
- verbose = 0: verbosity level

Note that r0, r1, and lambda all lie in the [0, 1] interval. It is advised to perform the estimation using the EM method.
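As a hedged illustration (not the class's own code), the density this model describes can be sketched with numpy/scipy; xmax plays the role of the number of trials, and the function name mixture_pmf is hypothetical:

```python
import numpy as np
from scipy.stats import binom

# Sketch of the model: lambda * B(xmax, r0) + (1 - lambda) * B(xmax, r1),
# with parameter names mirroring the class defaults (illustrative only).
def mixture_pmf(k, xmax, r0=0.2, r1=0.8, lam=0.9):
    """Probability of observing count k under the two-binomial mixture."""
    return lam * binom.pmf(k, xmax, r0) + (1 - lam) * binom.pmf(k, xmax, r1)

k = np.arange(11)       # full support for xmax = 10
p = mixture_pmf(k, 10)  # mixture probabilities over 0..10
```

Because each component is a proper pmf over 0..xmax, the mixture probabilities sum to one, and with the default lambda = 0.9 the dominant mode sits near xmax * r0.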

Methods

EMalgo
EMalgo_from_histo
Estep
Mstep
estimate_parameters
estimate_parameters_from_histo
kappa
parameters
reset
show
update_lambda_fh
update_parameters_fh
__init__(r0=0.2, r1=0.8, l=0.9, v=0)
EMalgo(X, xmax, eps=1e-07, maxiter=100, maxh=100)

Estimate the parameters of the mixture from the input data using an EM algorithm

Parameters:

X : array of shape (nbitems)

a vector of integers in the [0, xmax] range

xmax : integer

the maximal value of the input variable

eps : float, optional

Parameter to decide convergence: when lambda changes by less than this amount, convergence is declared. Default: 1e-7.

maxiter : integer, optional

Maximal number of iterations. Default: 100.
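A self-contained sketch of such an EM loop, written as an assumption about what the method computes rather than the library's actual implementation (the function name em_two_binomial is hypothetical):

```python
import numpy as np
from scipy.stats import binom

def em_two_binomial(x, xmax, r0=0.2, r1=0.8, lam=0.9, eps=1e-7, maxiter=100):
    """EM for a mixture of two binomials (illustrative sketch).

    x    : 1-D integer array with values in [0, xmax]
    xmax : number of trials of the binomials
    """
    x = np.asarray(x)
    for _ in range(maxiter):
        # E-step: posterior probability that each sample belongs to component 0
        p0 = lam * binom.pmf(x, xmax, r0)
        p1 = (1 - lam) * binom.pmf(x, xmax, r1)
        z = p0 / (p0 + p1)
        # M-step: re-estimate the parameters from the responsibilities
        new_lam = z.mean()
        r0 = (z * x).sum() / (xmax * z.sum())
        r1 = ((1 - z) * x).sum() / (xmax * (1 - z).sum())
        # declare convergence when lambda changes by less than eps
        if abs(new_lam - lam) < eps:
            lam = new_lam
            break
        lam = new_lam
    return r0, r1, lam
```

On well-separated simulated data this recovers the generating parameters closely; the convergence criterion on lambda matches the one described above.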

EMalgo_from_histo(H, eps=1e-07, maxiter=100)

Estimate the parameters given a histogram of some data, using an EM algorithm

Parameters:

H : 1D ndarray

The histogram, i.e. the empirical count of values, whose range is given by the length of H (to be padded with zeros when necessary)

eps : float, optional

Parameter to decide convergence: when lambda changes by less than this amount, convergence is declared. Default: 1e-7.

maxiter : integer, optional

Maximal number of iterations. Default: 100.
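The histogram H expected here is the per-value count over the full [0, xmax] range, padded with zeros where a value never occurs; with numpy this can be built as follows (an illustration, not a helper from the library):

```python
import numpy as np

x = np.array([0, 1, 1, 3, 3, 3])        # integer data in [0, xmax]
xmax = 5
H = np.bincount(x, minlength=xmax + 1)  # counts for every value 0..xmax
print(H)                                # [1 2 0 3 0 0]
```

The minlength argument supplies the zero padding for values between the largest observed datum and xmax.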

Estep(H)
E-step of the EM algorithm
Mstep(H, Z)
M-step of the EM algorithm
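A hedged sketch of what these two steps compute on histogram data, under the same mixture assumptions as above (the class's exact internals are not reproduced here; e_step and m_step are hypothetical names):

```python
import numpy as np
from scipy.stats import binom

def e_step(H, r0, r1, lam):
    """Posterior weight of component 0 at each histogram bin k = 0..len(H)-1."""
    k = np.arange(len(H))
    n = len(H) - 1
    p0 = lam * binom.pmf(k, n, r0)
    p1 = (1 - lam) * binom.pmf(k, n, r1)
    return p0 / (p0 + p1)

def m_step(H, Z):
    """Weighted parameter updates from responsibilities Z and bin counts H."""
    k = np.arange(len(H))
    n = len(H) - 1
    w0 = H * Z            # counts attributed to component 0
    w1 = H * (1 - Z)      # counts attributed to component 1
    lam = w0.sum() / H.sum()
    r0 = (w0 * k).sum() / (n * w0.sum())
    r1 = (w1 * k).sum() / (n * w1.sum())
    return r0, r1, lam
```

Alternating the two steps on a histogram of mixed binomial samples drives the parameters toward the generating values.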
estimate_parameters(X, n_bins=10, eps=1e-07, maxiter=100)

Estimate the parameters of the mixture from the input data using a gradient descent algorithm. This is strongly discouraged: use the EM method instead.

Parameters:

X : 1D ndarray

The data to estimate the binomial mixture from.

n_bins : integer

The number of bins used to build the histogram.

eps : float, optional

Parameter to decide convergence: when lambda changes by less than this amount, convergence is declared.

maxiter : integer, optional

Maximal number of iterations

estimate_parameters_from_histo(H, eps=1e-07, maxiter=100, reset=True)

Estimate the parameters given a histogram of some data, using a gradient descent. This is strongly discouraged: use the EM method instead.

Parameters:

H : 1D ndarray

The histogram, i.e. the empirical count of values, whose range is given by the length of H (to be padded with zeros when necessary)

eps : float, optional

Parameter to decide convergence: when lambda changes by less than this amount, convergence is declared

maxiter : integer, optional

Maximal number of iterations

reset : boolean, optional

If reset is True, the previously estimated parameters are forgotten before performing the new estimation.

kappa()
Compute the coefficient kappa that measures the separation of the two modes
parameters()
reset(r0=0.2, r1=0.8, l=0.9)
show(H)

Display the histogram of the data, together with the mixture model

Parameters:

H : ndarray

The histogram of the data.

update_lambda_fh(H, eps=1e-08, maxiter=100)

Update lambda given the histogram H

Parameters:

H : array of shape (nbins)

histogram, i.e. the empirical count of values, whose range is given by the length of H (to be padded with zeros when necessary)

eps : float, optional

quantum parameter to avoid zeros and numerical degeneracy of the model. Default: 1e-8.

maxiter : integer, optional

maximum number of iterations. Default: 100.

update_parameters_fh(H, eps=1e-08)

Update the binomial parameters given a histogram.

Parameters:

H : array of shape (nbins)

histogram, i.e. the empirical count of values, whose range is given by the length of H (to be padded with zeros when necessary)

eps : float, optional

quantum parameter to avoid zeros and numerical degeneracy of the model. Default: 1e-8.