Machine learning–based inference toolkit for particle physics
- `MadMiner.add_systematics()` now also supports normalization uncertainties in addition to the previous PDF and scale variations. When adding samples in the `LHEReader` and `DelphesReader` functions, each sample can be linked to an arbitrary subset of systematics, giving the user a lot of flexibility.
- `SampleAugmenter` functions have the new keywords `partition` and `validation_split`. With `partition="validation"`, validation data without potential overlap with the training samples can be generated. In the `madminer.ml` classes, this data can be provided with the new keywords `x_val` and `theta_val` when calling `train()`.
- The uncertainties computed with `mode="score"` are now the ensemble covariance, without dividing by `sqrt(n_estimators)` as before. This is also the default behavior; the old default behavior can be used with `mode="modified_score"`.
- … `DoubleParameterizedRatioEstimator` and `LikelihoodEstimator`.
- … `Ensemble` instances of multiple score estimators.
- `SampleAugmenter` functions no longer accept the keyword `switch_train_test_events`. Use `partition="train"` or `partition="test"` instead.
- Fixed a bug in `AsymptoticLimits` where only 1 observed event was returned.
- Combined the `Estimator` classes with parameter rescaling into a new `ConditionalEstimator` class.
- `ParameterizedRatioEstimator` now optionally rescales the parameters (`theta`) to zero mean and unit variance during training. Use the keyword `rescale_params` in `ParameterizedRatioEstimator.train()`.
- The `AsymptoticLimits` functions now work also when using weighted events.
- Fixed a bug in `LHEReader` which caused the y component of the MET object to be wrong.
- `plot_histograms()` can now also visualize observed data / Asimov data.
- `mode="adaptive-sally"` now works as described in https://arxiv.org/abs/1805.00020. In higher dimensions, it still just concatenates the scalar product of score and parameter vector with all score components to form a (d+1)-dimensional observable space.
- … `plot_histograms()`.
- In `AsymptoticLimits`, the adaptive histogram binning can now be based on the weights summed over the whole parameter grid instead of just a central point. This is now also the default option.
- Added `plot_histograms()` in `madminer.plotting` to visualize the histograms used by `AsymptoticLimits`.
- … the `AsymptoticLimits` functions.
- `LHEReader.add_observable`: users can use `"p_truth"` to access particles before smearing, and (at least with XML parsing) there are new global observables `"alpha_qcd"`, `"alpha_qed"`, and `"scale"`. `LHEReader.add_observable_from_function()` now accepts functions that take unsmeared particles as the first argument.
- Fixed a bug in `sample_train_ratio()` with `return_individual_n_effective=True`.
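The train/validation/test partitioning enabled by `partition` and `validation_split` can be sketched generically. This is only an illustration of the bookkeeping (disjoint partitions, so validation events cannot overlap with training events), not MadMiner's internal implementation; the fractions and variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

n_events = 100
validation_split = 0.2  # hypothetical fraction reserved for partition="validation"
test_split = 0.2        # hypothetical fraction reserved for partition="test"

# Shuffle event indices once, then carve out disjoint slices so the
# validation sample cannot overlap with the training sample.
indices = rng.permutation(n_events)
n_val = int(validation_split * n_events)
n_test = int(test_split * n_events)

partitions = {
    "validation": indices[:n_val],
    "test": indices[n_val:n_val + n_test],
    "train": indices[n_val + n_test:],
}

print({k: len(v) for k, v in partitions.items()})  # sizes 20 / 20 / 60
```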
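The ensemble-uncertainty change (`mode="score"` vs. `mode="modified_score"`) can be illustrated with a small NumPy sketch. This shows only the statistics described in the changelog, not MadMiner's actual code; the toy predictions are made up.

```python
import numpy as np

# Toy score predictions for one event from an ensemble of 5 estimators.
predictions = np.array([0.9, 1.1, 1.0, 0.8, 1.2])
n_estimators = len(predictions)

mean = predictions.mean()

# New default (mode="score"): the uncertainty is the ensemble spread
# itself, i.e. the (co)variance between estimators.
std_score = predictions.std(ddof=1)

# Old default (mode="modified_score"): the spread was additionally
# divided by sqrt(n_estimators), like a standard error of the mean.
std_modified = std_score / np.sqrt(n_estimators)

print(mean, std_score, std_modified)
```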
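The optional parameter rescaling behind `rescale_params` amounts to standardizing `theta` to zero mean and unit variance, which can be sketched as follows (a minimal illustration; the variable names are mine, not MadMiner's internals).

```python
import numpy as np

# Toy parameter points theta, shape (n_samples, n_parameters).
theta = np.array([[0.0, 10.0], [1.0, 20.0], [2.0, 30.0], [3.0, 40.0]])

# Standardize each parameter component to zero mean and unit variance.
theta_mean = theta.mean(axis=0)
theta_std = theta.std(axis=0)
theta_scaled = (theta - theta_mean) / theta_std

print(theta_scaled.mean(axis=0))  # ~0 per component
print(theta_scaled.std(axis=0))   # ~1 per component
```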
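The (d+1)-dimensional observable space used by `mode="adaptive-sally"` in higher dimensions can be sketched like this: the scalar product of the estimated score and the parameter vector is concatenated with all d score components. The numbers are toy values, not MadMiner output.

```python
import numpy as np

# Estimated score t(x) per event, shape (n_events, d), and a parameter
# point theta of dimension d (toy numbers, d=2 here).
scores = np.array([[0.5, -1.0], [1.5, 0.2], [-0.3, 0.8]])
theta = np.array([1.0, 2.0])

# Scalar product of score and parameter vector: one number per event ...
h = scores @ theta

# ... concatenated with all d score components: a (d+1)-dim observable.
observables = np.concatenate([h[:, None], scores], axis=1)

print(observables.shape)  # (n_events, d + 1)
```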
- Fixed a bug in `DelphesReader` when no events survive cuts.
- `AsymptoticLimits` now supports the SALLINO method, estimating the likelihood with one-dimensional histograms of the scalar product of `theta` and the estimated score.
- Improved `AsymptoticLimits` and added more binning options, including fully manual specification of the binning. (`Histogram.histo_uncertainties` lets the user access the uncertainties.)
- The `AsymptoticLimits` functions `expected_limits()` and `observed_limits()` now return `(theta_grid, p_values, i_ml, llr_kin, log_likelihood_rate, histos)`. `histos` is a list of histogram classes; the tutorial shows how they can be used to plot the histograms. The `returns` keyword to these functions is removed. The keywords `theta_ranges` and `resolutions` were renamed to `grid_ranges` and `grid_resolutions`.
- Renamed functions in the `FisherInformation` class (the old names are still available as aliases for now, but deprecated).
- `AsymptoticLimits` is finally properly documented.
- `AsymptoticLimits` is now much more memory efficient.
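The SALLINO idea, estimating the likelihood from one-dimensional histograms of the scalar product of `theta` and the estimated score, can be sketched with toy data (this is an illustration of the construction, not MadMiner code).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy estimated scores t(x), shape (n_events, d), and a parameter point.
scores = rng.normal(size=(1000, 2))
theta = np.array([0.5, -1.0])

# Project each event onto the single scalar h = theta . t(x) ...
h = scores @ theta

# ... and estimate its density with a one-dimensional histogram,
# which serves as the likelihood estimate in this mode.
counts, edges = np.histogram(h, bins=20, density=True)

# With density=True, the bin probabilities sum to ~1.
print((counts * np.diff(edges)).sum())
```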
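As a usage sketch for the new return format of `expected_limits()` and `observed_limits()`: given `theta_grid` and `p_values` (toy stand-in arrays here, since running MadMiner requires event samples), an excluded region at 95% CL can be read off directly. The 0.05 threshold is the usual convention, not something the changelog specifies.

```python
import numpy as np

# Toy stand-ins for two of the returned objects: a 1D parameter grid
# and the corresponding p-values (made-up numbers).
theta_grid = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
p_values = np.array([0.01, 0.20, 0.90, 0.30, 0.02])

# Parameter points excluded at 95% CL are those with p < 0.05.
excluded = theta_grid[p_values < 0.05]

# The best-fit point maximizes the p-value.
best_fit = theta_grid[np.argmax(p_values)]

print(excluded.ravel(), best_fit)
```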