threeML.bayesian package
Submodules
threeML.bayesian.autoemcee_sampler module
- class threeML.bayesian.autoemcee_sampler.AutoEmceeSampler(likelihood_model=None, data_list=None, **kwargs)[source]
-
Bases:
UnitCubeSampler
- sample(quiet=False)[source]
-
Sample using the autoemcee ensemble MCMC method.
- Returns:
- setup(num_global_samples=10000, num_chains=4, num_walkers=None, max_ncalls=1000000, max_improvement_loops=4, num_initial_steps=100, min_autocorr_times=0)[source]
-
Sample until the MCMC chains have converged. The steps are:

1. Draw num_global_samples from the prior; the highest num_walkers points are selected.
2. Set num_steps to num_initial_steps.
3. Run num_chains MCMC ensembles for num_steps steps.
4. For each walker chain, compute the autocorrelation length (convergence requires num_steps / autocorrelation length > min_autocorr_times).
5. For each parameter, compute the Geweke convergence diagnostic (convergence requires |z| < 2).
6. For each ensemble, compute the Gelman-Rubin rank convergence diagnostic (convergence requires rhat < 1.2).
7. If converged, stop and return the results.
8. Otherwise, increase num_steps by 10 and repeat from step 3, up to max_improvement_loops times.
- num_global_samples: int
-
Number of samples to draw from the prior
- num_chains: int
-
Number of independent ensembles to run. If running with MPI, this is set to the number of MPI processes.
- num_walkers: int
-
Ensemble size. If None, max(100, 4 * dim) is used
- max_ncalls: int
-
Maximum number of likelihood function evaluations
- num_initial_steps: int
-
Number of sampler steps to take in first iteration
- max_improvement_loops: int
-
Number of times MCMC should be re-attempted (see above)
- min_autocorr_times: float
-
If positive, additionally require for convergence that the number of samples is larger than min_autocorr_times times the autocorrelation length.
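The convergence loop above relies on standard MCMC diagnostics. As an illustration (a minimal sketch, not the autoemcee implementation itself), the Geweke z-score check on a single chain compares the mean of an early segment with the mean of a late segment:

```python
import numpy as np

def geweke_z(chain, first=0.1, last=0.5):
    """Geweke convergence z-score: compare the mean of the first
    `first` fraction of the chain with the mean of the last `last`
    fraction. |z| < 2 suggests the two segments are consistent."""
    n = len(chain)
    a = chain[: int(first * n)]
    b = chain[int((1 - last) * n):]
    # z-score of the difference of means, using each segment's
    # standard error
    return (a.mean() - b.mean()) / np.sqrt(
        a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    )

rng = np.random.default_rng(0)
stationary = rng.normal(0.0, 1.0, size=5000)          # well-mixed chain
drifting = stationary + np.linspace(0.0, 3.0, 5000)   # chain with a trend

z_good = geweke_z(stationary)
z_bad = geweke_z(drifting)
print(f"stationary |z| = {abs(z_good):.2f}, drifting |z| = {abs(z_bad):.2f}")
```

A drifting chain fails the |z| < 2 criterion because its early and late segments have clearly different means.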
threeML.bayesian.bayesian_analysis module
- class threeML.bayesian.bayesian_analysis.BayesianAnalysis(likelihood_model: Model, data_list: DataList, **kwargs)[source]
-
Bases:
object
- property analysis_type: str
- convergence_plots(n_samples_in_each_subset, n_subsets)[source]
-
Compute the mean and variance for subsets of the samples, and plot them. They should all be around the same values if the MCMC has converged to the posterior distribution.
The subsamples are taken with two different strategies: the first is to slide a fixed-size window, the second is to take random samples from the chain (bootstrap)
- Parameters:
-
n_samples_in_each_subset – number of samples in each subset
n_subsets – number of subsets to take for each strategy
- Returns:
-
a matplotlib.figure instance
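The two subsampling strategies described above can be sketched on a synthetic chain (this is an illustration of the idea, not the actual convergence_plots implementation):

```python
import numpy as np

rng = np.random.default_rng(42)
chain = rng.normal(loc=2.0, scale=0.5, size=10000)  # stand-in for posterior samples

n_samples_in_each_subset, n_subsets = 1000, 8

# Strategy 1: slide a fixed-size window along the chain
starts = np.linspace(0, len(chain) - n_samples_in_each_subset, n_subsets).astype(int)
window_means = [chain[s : s + n_samples_in_each_subset].mean() for s in starts]

# Strategy 2: bootstrap -- draw random subsets with replacement
boot_means = [rng.choice(chain, n_samples_in_each_subset).mean() for _ in range(n_subsets)]

# If the sampler has converged, both sets of means scatter tightly
# around the same value
print(np.ptp(window_means), np.ptp(boot_means))
```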
- property likelihood_model: Model
-
The likelihood model (a Model instance).
- property log_like_values: ndarray | None
-
Returns the value of the log-likelihood found by the Bayesian sampler while sampling from the posterior. If you need to find the values of the parameters which generated a given value of the log-likelihood, remember that the samples accessible through the property .raw_samples are ordered in the same way as the vector returned by this method.
- Returns:
-
a vector of log-likelihood values
- property log_marginal_likelihood: float | None
-
Return the log marginal likelihood (evidence), if computed.
- property log_probability_values: ndarray | None
-
Returns the value of the log-probability (posterior) found by the Bayesian sampler while sampling from the posterior. If you need to find the values of the parameters which generated a given value of the log-probability, remember that the samples accessible through the property .raw_samples are ordered in the same way as the vector returned by this method.
- Returns:
-
a vector of log-probability values
- plot_chains(thin=None)[source]
-
Produce a plot of the series of samples for each parameter.
- Parameters:
-
thin – use only one sample out of every ‘thin’ samples
- Returns:
-
a matplotlib.figure instance
- property raw_samples: ndarray | None
-
Access the samples from the posterior distribution generated by the selected sampler in raw form (i.e., in the format returned by the sampler)
- Returns:
-
the samples as returned by the sampler
- property results: BayesianResults | None
- sample(quiet=False, **kwargs) None[source]
-
Sample the posterior of the model with the selected algorithm.
If no algorithm has been set, the configured default algorithm with default parameters will be run.
- Parameters:
-
quiet – if True, then no output is displayed
- Returns:
- property sampler: SamplerBase | None
-
Access the instance of the sampler used to sample the posterior distribution.
- Returns:
-
an instance of the sampler
- property samples: Dict[str, ndarray] | None
-
Access the samples from the posterior distribution generated by the selected sampler.
- Returns:
-
a dictionary with the samples from the posterior distribution for each parameter
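The relationship between raw_samples and samples can be illustrated as follows; the parameter names and array shapes here are hypothetical stand-ins for what a real sampler would return:

```python
import numpy as np

# Hypothetical raw output of a sampler, shaped (n_samples, n_parameters);
# real parameter names come from the likelihood model
param_names = ["src.spectrum.main.Powerlaw.K", "src.spectrum.main.Powerlaw.index"]
raw_samples = np.random.default_rng(1).normal(size=(5000, len(param_names)))

# `samples` adds names on top of `raw_samples`: one vector per free
# parameter, with rows kept in the same order as the raw array
samples = {name: raw_samples[:, i] for i, name in enumerate(param_names)}

print(samples["src.spectrum.main.Powerlaw.index"].shape)  # prints (5000,)
```

Because the row order is preserved, the i-th entry of log_like_values corresponds to the i-th row of raw_samples and to the i-th entry of each named vector.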
threeML.bayesian.dynesty_sampler module
- class threeML.bayesian.dynesty_sampler.DynestyDynamicSampler(likelihood_model=None, data_list=None, **kwargs)[source]
-
Bases:
UnitCubeSampler
- sample(quiet: bool = False, **kwargs)[source]
-
Sample using the dynesty DynamicNestedSampler class.
- Parameters:
-
quiet (bool) – verbosity. Defaults to False.
kwargs (dict) – Additional keywords that get passed to the run_nested() function.
- Return type:
- Returns:
- setup(nlive: int = 500, history_filename=None, **kwargs)[source]
-
Setup the Dynesty dynamic nested sampler. All available parameters can be found in the respective version of https://dynesty.readthedocs.io/en/v3.0.0/api.html#dynesty.dynesty.DynamicNestedSampler
- Parameters:
-
nlive (int) – Number of live points used during the initial nested sampling run
history_filename (str) – Path to save the history. Defaults to None
kwargs (dict) – Additional keyword arguments; must have the same names and types as the parameters in the constructor of the dynesty.DynamicNestedSampler class. Defaults to the values used by dynesty.
- class threeML.bayesian.dynesty_sampler.DynestyNestedSampler(likelihood_model=None, data_list=None, **kwargs)[source]
-
Bases:
UnitCubeSampler
- sample(quiet: bool = False, **kwargs)[source]
-
Sample using the Dynesty NestedSampler class
- Parameters:
-
quiet (bool) – verbosity. Defaults to False.
kwargs (dict) – Additional keywords that get passed to the run_nested() function.
- Return type:
- Returns:
- setup(nlive: int = 500, bound: Literal['multi', 'single', 'none', 'balls', 'cubes'] | None = 'multi', history_filename: str | None = None, **kwargs)[source]
-
Setup the Dynesty nested sampler. All available parameters can be found in the respective version of https://dynesty.readthedocs.io/en/v3.0.0/api.html#dynesty.dynesty.NestedSampler
- Parameters:
-
nlive (int) – Number of live points. Defaults to 500.
bound – Method to approximately bound the prior using the current set of live points. Options are “multi”, “single”, “none”, “balls” or “cubes”. Defaults to “multi”.
history_filename (str) – Path to save the history. Defaults to None
kwargs (dict) – Additional keyword arguments; must have the same names and types as the parameters in the constructor of the dynesty.NestedSampler class. Defaults to the values used by dynesty.
threeML.bayesian.emcee_sampler module
- class threeML.bayesian.emcee_sampler.EmceeSampler(likelihood_model=None, data_list=None, **kwargs)[source]
-
Bases:
MCMCSampler
threeML.bayesian.multinest_sampler module
- class threeML.bayesian.multinest_sampler.MultiNestSampler(likelihood_model: Model | None = None, data_list: DataList | None = None, **kwargs)[source]
-
Bases:
UnitCubeSampler
- sample(quiet: bool = False)[source]
-
Sample using the MultiNest numerical integration method.
- Returns:
- Return type:
- setup(n_live_points: int = 400, chain_name: str = 'chains/fit-', resume: bool = False, importance_nested_sampling: bool = False, auto_clean: bool = False, **kwargs)[source]
-
Setup the MultiNest Sampler. For details see: https://github.com/farhanferoz/MultiNest
- Parameters:
-
n_live_points – number of live points for the evaluation
chain_name – the chain name
resume – resume from a previous fit
importance_nested_sampling – use importance nested sampling (INS)
auto_clean – automatically remove MultiNest chains after the run
- Returns:
- Return type:
threeML.bayesian.nautilus_sampler module
- class threeML.bayesian.nautilus_sampler.NautilusSampler(likelihood_model=None, data_list=None, **kwargs)[source]
-
Bases:
UnitCubeSampler
- sample(quiet=False, **kwargs) array[source]
-
Sample using the Nautilus sampler
- Parameters:
-
quiet (bool) – if True, suppress output. Defaults to False.
kwargs (dict)
- Return type:
-
np.array
- Returns:
-
samples
- setup(n_live: int = 3000, verbose: bool = False, **kwargs) None[source]
-
Setup the nautilus sampler. See: https://nautilus-sampler.readthedocs.io/en/stable/index.html and https://doi.org/10.1093/mnras/stad2441
- Parameters:
-
n_live (int) – Number of live points, default is 3000
verbose – Verbosity, default is False
kwargs (dict) – Additional keyword arguments for the Nautilus Sampler. Please refer to the official documentation. Defaults to the Nautilus defaults.
threeML.bayesian.sampler_base module
- class threeML.bayesian.sampler_base.MCMCSampler(likelihood_model, data_list, **kwargs)[source]
-
Bases:
SamplerBase
- class threeML.bayesian.sampler_base.SamplerBase(likelihood_model: Model, data_list: DataList, **kwargs)[source]
-
Bases:
object
- get_posterior_proxy()[source]
-
Return a weakref-backed posterior callable.
This prevents external samplers from keeping a strong reference to this SamplerBase instance via a bound method.
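The weakref pattern described above can be sketched in plain Python. Posterior, its log_prob method, and make_posterior_proxy are hypothetical stand-ins, not the actual SamplerBase internals:

```python
import weakref

class Posterior:
    """Hypothetical stand-in for an object exposing a log-probability."""
    def log_prob(self, x):
        return -0.5 * x * x

def make_posterior_proxy(obj):
    # Hold only a weak reference, so an external sampler that stores
    # this callable does not keep `obj` alive via a bound method.
    ref = weakref.ref(obj)

    def proxy(x):
        target = ref()
        if target is None:
            raise ReferenceError("the parent object has been garbage collected")
        return target.log_prob(x)

    return proxy

p = Posterior()
f = make_posterior_proxy(p)
print(f(2.0))  # prints -2.0: delegates to the live object
del p          # once the object is collected, the proxy raises ReferenceError
```

A bound method like `p.log_prob` holds a strong reference to `p`; the closure above holds only the weak reference, so the analysis object can be freed as soon as the user drops it.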
- property log_like_values: ndarray | None
-
Returns the value of the log-likelihood found by the Bayesian sampler while sampling from the posterior. If you need to find the values of the parameters which generated a given value of the log-likelihood, remember that the samples accessible through the property .raw_samples are ordered in the same way as the vector returned by this method.
- Returns:
-
a vector of log-likelihood values
- property log_marginal_likelihood: float | None
-
Return the log marginal likelihood (evidence), if computed.
- property log_probability_values: ndarray | None
-
Returns the value of the log-probability (posterior) found by the Bayesian sampler while sampling from the posterior. If you need to find the values of the parameters which generated a given value of the log-probability, remember that the samples accessible through the property .raw_samples are ordered in the same way as the vector returned by this method.
- Returns:
-
a vector of log-probability values
- property raw_samples: ndarray | None
-
Access the samples from the posterior distribution generated by the selected sampler in raw form (i.e., in the format returned by the sampler)
- Returns:
-
the samples as returned by the sampler
- property results: BayesianResults
- property samples: Dict[str, ndarray] | None
-
Access the samples from the posterior distribution generated by the selected sampler.
- Returns:
-
a dictionary with the samples from the posterior distribution for each parameter
- class threeML.bayesian.sampler_base.UnitCubeSampler(likelihood_model, data_list, **kwargs)[source]
-
Bases:
SamplerBase
threeML.bayesian.tutorial_material module
- class threeML.bayesian.tutorial_material.BayesianAnalysisWrap(likelihood_model: Model, data_list: DataList, **kwargs)[source]
-
Bases:
BayesianAnalysis
- threeML.bayesian.tutorial_material.array_to_cmap(values, cmap, use_log=False)[source]
-
Generates a color map and color list that is normalized to the values in an array. Allows for adding a 3rd dimension onto a plot.
- Parameters:
-
values – a list of values to map into a cmap
cmap – the mpl colormap to use
use_log – if the mapping should be done in log space
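A plausible reading of the normalization array_to_cmap performs, sketched without matplotlib (the real function returns matplotlib colors; this only shows the value-to-[0, 1] mapping that a colormap consumes, and the helper name is illustrative):

```python
import numpy as np

def normalize_for_cmap(values, use_log=False):
    """Normalize values to [0, 1] so each one can be fed to a
    matplotlib colormap (cmap(x) expects x in [0, 1]). With
    use_log=True the normalization is done in log10 space."""
    v = np.log10(values) if use_log else np.asarray(values, dtype=float)
    return (v - v.min()) / (v.max() - v.min())

vals = [1.0, 10.0, 100.0]
print(normalize_for_cmap(vals))                # bunched toward the low end
print(normalize_for_cmap(vals, use_log=True))  # evenly spaced in log10
```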
- threeML.bayesian.tutorial_material.plot_likelihood_function(bayes, fig=None, show_prior=False)[source]
- threeML.bayesian.tutorial_material.plot_sample_path(bayes, burn_in=None, truth=None)[source]
-
- Parameters:
-
bayes (BayesianAnalysis)
- Returns:
threeML.bayesian.ultranest_sampler module
- class threeML.bayesian.ultranest_sampler.UltraNestSampler(likelihood_model=None, data_list=None, **kwargs)[source]
-
Bases:
UnitCubeSampler
- sample(quiet=False)[source]
-
Sample using the UltraNest numerical integration method.
- Returns:
- setup(min_num_live_points: int = 400, dlogz: float = 0.5, chain_name: str | None = None, resume: str = 'overwrite', wrapped_params=None, stepsampler=None, use_mlfriends: bool = True, **kwargs)[source]
-
Setup the UltraNest sampler. Consult the documentation: https://johannesbuchner.github.io/UltraNest/ultranest.html?highlight=reactive#ultranest.integrator.ReactiveNestedSampler
- Parameters:
-
min_num_live_points (int) – minimum number of live points throughout the run
dlogz (float) – target evidence uncertainty. This is the std between bootstrapped logz integrators.
chain_name (str) – where to store output files
resume (str) – one of ‘resume’, ‘resume-similar’, ‘overwrite’ or ‘subfolder’. If ‘overwrite’, overwrite previous data. If ‘subfolder’, create a fresh subdirectory in log_dir. If ‘resume’ or True, continue the previous run if available; this only works when the dimensionality, transform and likelihood are consistent. If ‘resume-similar’, continue the previous run if available; this only works when the dimensionality and transform are consistent. If a likelihood difference is detected, the existing likelihoods are updated until the live point order differs; otherwise, behaves like ‘resume’.
wrapped_params (list of bools) – indicates whether each parameter wraps around (circular parameter)
stepsampler
use_mlfriends (bool) – whether to use the MLFriends+ellipsoidal+tellipsoidal region (better for multi-modal problems) or just ellipsoidal sampling (faster for high-dimensional, gaussian-like problems)
- Returns:
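Samplers derived from UnitCubeSampler (MultiNest, UltraNest, dynesty, nautilus) explore the unit hypercube and map each point to physical parameters through a prior transform built from the inverse CDFs of the priors. A minimal sketch with two assumed priors (the log-uniform normalization and uniform index here are illustrative, not anything 3ML configures for you):

```python
import math

def prior_transform(u):
    """Map a point u in the unit cube [0, 1]^2 to physical parameters.
    Assumed example priors: a log-uniform normalization K in
    [1e-3, 1e1] and a uniform spectral index in [-3, 0]."""
    # log-uniform: interpolate linearly in log10 space, then exponentiate
    log_K = math.log10(1e-3) + u[0] * (math.log10(1e1) - math.log10(1e-3))
    K = 10.0 ** log_K
    # uniform: linear interpolation between the bounds
    index = -3.0 + u[1] * 3.0
    return K, index

print(prior_transform([0.0, 0.0]))  # lower corner of the prior volume
print(prior_transform([1.0, 1.0]))  # upper corner of the prior volume
```

The sampler then only ever proposes points in the unit cube; the transform makes the prior implicit, which is what lets nested samplers compute the evidence.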
threeML.bayesian.zeus_sampler module
- class threeML.bayesian.zeus_sampler.ZeusSampler(likelihood_model=None, data_list=None, **kwargs)[source]
-
Bases:
MCMCSampler