Opinionated Bayesian optimisation for Stable Diffusion model block merging
- `meh` merging engine
- `scorer_device`; remove `aes` and `cafe_*` scorers; add `score_weight` to payload `.yaml`
- latin-hypercube-sampling for the `bayes` optimiser
- `adaptive-tpe` optimiser
- `tensor_sum` merging method
- `weighted_subtraction` merging method
- `manual` scoring method
- `group` parameters
- `freeze` parameters or set custom optimisation ranges
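As an illustration of what latin-hypercube sampling does for the optimiser's starting points, here is a minimal stdlib-only sketch (not the project's own implementation, which relies on its optimisation library): each axis of the weight space is split into `n` equal bins and every bin is used exactly once per dimension, so the initial points cover the space more evenly than plain uniform sampling.

```python
import random

def latin_hypercube(n, d, seed=0):
    """Sample n points in [0, 1]^d: each axis is split into n equal bins
    and every bin is hit exactly once per dimension."""
    rng = random.Random(seed)
    points = [[0.0] * d for _ in range(n)]
    for dim in range(d):
        bins = list(range(n))
        rng.shuffle(bins)  # random bin order for this dimension
        for i, b in enumerate(bins):
            points[i][dim] = (b + rng.random()) / n  # jitter within the bin
    return points

# 26 dimensions (25 block weights plus base_alpha), 10 initial points
init_points = latin_hypercube(n=10, d=26)
```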
An opinionated take on automating the optimisation of stable-diffusion model merging.
The main idea is to treat the model-merging procedure as a black-box model with 26 parameters: one for each of the 25 merge blocks, plus `base_alpha`.
We can then apply black-box optimisation techniques; in particular, we focus on Bayesian optimisation with a Gaussian Process emulator.
Read more here, here and here.
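To make the black box concrete, a block merge is just a per-block weighted interpolation between the two parent models. The sketch below (hypothetical names and a simplified block-to-key mapping, using plain lists in place of real weight tensors) shows where the 26 parameters enter:

```python
def merge_block(theta_a, theta_b, alpha):
    """Weighted-sum merge of one block: (1 - alpha) * A + alpha * B."""
    return [(1 - alpha) * a + alpha * b for a, b in zip(theta_a, theta_b)]

def merge_models(model_a, model_b, block_alphas, base_alpha):
    """Merge two models block by block. model_* map block names to flat
    weight lists; block_alphas gives one alpha per UNet block, and
    base_alpha is used for every key not listed there."""
    return {
        name: merge_block(theta_a, model_b[name],
                          block_alphas.get(name, base_alpha))
        for name, theta_a in model_a.items()
    }
```

With 25 entries in `block_alphas` plus `base_alpha`, these are exactly the 26 numbers the optimiser searches over.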
The optimisation process is split into two phases:

1. exploration: sets of weights are sampled across the 26-parameter space; the number of initial samples is set with the `--init_points` argument. Each set of weights is used to merge the two models, and the merged model generates `batch_size` * number-of-payloads images, which are then scored.
2. exploitation: the optimiser proposes new sets of weights, repeated `--n_iters` times. This time we don't sample all of them in one go. Instead, we sample once, merge the models, generate and score the images, and update the optimiser's knowledge of the merging space. This way the optimiser can adapt its strategy step by step.

At the end of the exploitation phase, the set of weights with the highest score is deemed to be the optimal one.
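The two phases can be sketched as the driver loop below. This is a hedged illustration, not the tool's code: `propose_next` here is a naive stand-in (perturb the best point so far) for the Gaussian-Process acquisition step the real optimiser uses, and `score_fn` stands in for the whole merge-generate-score pipeline.

```python
import random

def propose_next(history, rng, n_params):
    """Stand-in for the GP acquisition step: perturb the best point so far."""
    best, _ = max(history, key=lambda item: item[1])
    return [min(1.0, max(0.0, x + rng.gauss(0, 0.1))) for x in best]

def optimise(score_fn, n_params=26, init_points=5, n_iters=10, seed=0):
    rng = random.Random(seed)
    history = []  # (weights, score) pairs: the optimiser's knowledge

    # exploration: sample init_points weight sets up front
    for _ in range(init_points):
        w = [rng.random() for _ in range(n_params)]
        history.append((w, score_fn(w)))

    # exploitation: one proposal at a time, updating after each score
    for _ in range(n_iters):
        w = propose_next(history, rng, n_params)
        history.append((w, score_fn(w)))

    # the highest-scoring set of weights is deemed optimal
    return max(history, key=lambda item: item[1])
```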
Head to the wiki for all the instructions to get you started.