threeML.bayesian.autoemcee_sampler module
- class threeML.bayesian.autoemcee_sampler.AutoEmceeSampler(likelihood_model=None, data_list=None, **kwargs)[source]
Bases: UnitCubeSampler
- sample(quiet=False)[source]
Sample the posterior using the autoemcee adaptive ensemble MCMC method.
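A minimal usage sketch (not part of the generated API text): the sampler is normally selected and driven through threeML's BayesianAnalysis interface rather than instantiated directly. The `model` and `data_list` objects below are placeholders for a previously built likelihood model and DataList, and the registered sampler name "autoemcee" is an assumption.

```python
# Sketch only: `model` and `data_list` stand in for a previously constructed
# astromodels likelihood model and a threeML DataList.
from threeML import BayesianAnalysis

bayes = BayesianAnalysis(model, data_list)

# Select this sampler (assuming it is registered under the name "autoemcee")
bayes.set_sampler("autoemcee")

# Configure the run; arguments are forwarded to setup(), documented below
bayes.sampler.setup(num_global_samples=10000, num_chains=4, max_ncalls=1000000)

# Run until the convergence criteria described in setup() are satisfied
bayes.sample()
```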
- setup(num_global_samples=10000, num_chains=4, num_walkers=None, max_ncalls=1000000, max_improvement_loops=4, num_initial_steps=100, min_autocorr_times=0)[source]
Sample until the MCMC chains have converged. The steps are:
1. Draw num_global_samples points from the prior; the num_walkers highest-probability points are selected.
2. Set num_steps to num_initial_steps.
3. Run num_chains MCMC ensembles for num_steps steps.
4. For each walker chain, compute the autocorrelation length (convergence requires num_steps / autocorrelation length > min_autocorr_times).
5. For each parameter, compute the Geweke convergence diagnostic (convergence requires |z| < 2).
6. For each ensemble, compute the Gelman-Rubin rank convergence diagnostic (convergence requires rhat < 1.2).
7. If converged, stop and return the results.
8. Otherwise, increase num_steps ten-fold and repeat from step 3, up to max_improvement_loops times.
A schematic illustration of the Geweke and Gelman-Rubin checks is given after the parameter list below.
- num_global_samples: int
Number of samples to draw from the prior for the initial global exploration; the num_walkers best points seed the walkers.
- num_chains: int
Number of independent ensembles to run. If running with MPI, this is set to the number of MPI processes.
- num_walkers: int
Ensemble size. If None, max(100, 4 * dim) is used
- max_ncalls: int
Maximum number of likelihood function evaluations
- num_initial_steps: int
Number of sampler steps to take in first iteration
- max_improvement_loops: int
Number of times MCMC should be re-attempted (see above)
- min_autocorr_times: float
If positive, additionally require for convergence that the number of samples is larger than min_autocorr_times times the autocorrelation length.
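To make the convergence criteria listed in the steps above concrete, here is a schematic NumPy illustration of the Geweke and Gelman-Rubin checks. This is not autoemcee's actual implementation (which, for example, uses a rank-based Gelman-Rubin diagnostic); it is only a simplified sketch of the quantities being thresholded.

```python
import numpy as np


def geweke_z(chain, first=0.1, last=0.5):
    """Geweke z-score comparing the start and end of a single 1-D chain."""
    n = len(chain)
    a = chain[: int(first * n)]
    b = chain[int((1.0 - last) * n):]
    return (a.mean() - b.mean()) / np.sqrt(
        a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    )


def gelman_rubin(chains):
    """Plain (non rank-normalized) Gelman-Rubin rhat for shape (n_chains, n_steps)."""
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    w = chains.var(axis=1, ddof=1).mean()   # mean within-chain variance
    b = n * chain_means.var(ddof=1)         # between-chain variance
    var_hat = (n - 1) / n * w + b / n       # pooled posterior variance estimate
    return np.sqrt(var_hat / w)


# Schematic check mirroring the criteria above for a single parameter:
# |z| < 2 for every chain, rhat < 1.2 across ensembles, and (when
# min_autocorr_times > 0) num_steps / autocorrelation length > min_autocorr_times.
chains = np.random.normal(size=(4, 2000))   # stand-in for real MCMC output
converged = all(abs(geweke_z(c)) < 2 for c in chains) and gelman_rubin(chains) < 1.2
```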