Bitmap generation from a single example with convolutions and MCMC
ConvChain is a Markov chain of images that converges to input-like images. That is, the distribution of NxN patterns in the outputs converges to the distribution of NxN patterns in the input as the process goes on.
In the examples a typical value of N is 3.
How do we construct such a process? We want the final probability of a given pattern to be proportional to the pattern's weight, where a pattern's weight is the number of its occurrences in the input. For this it is sufficient that a stronger condition holds: the probability of a given state (image) S should be proportional to the product of pattern weights over all patterns in S:

p(S) ~ product of pattern weights over all patterns in S
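The pattern weights here are just occurrence counts over the input. The actual implementation in this repo is C#; as an illustration only, here is a minimal Python sketch of counting NxN patterns, assuming toroidal (wrap-around) boundaries and tuples-of-tuples as pattern keys — both are choices of this sketch, not guarantees about the repo's code:

```python
from collections import Counter

def pattern_weights(bitmap, n):
    """Count every NxN pattern in a 2D bitmap, with toroidal wrap-around.

    `bitmap` is a list of rows of hashable values (e.g. 0/1). Returns a
    Counter mapping each pattern (a tuple of row tuples) to its count.
    """
    h, w = len(bitmap), len(bitmap[0])
    weights = Counter()
    for y in range(h):
        for x in range(w):
            pattern = tuple(
                tuple(bitmap[(y + dy) % h][(x + dx) % w] for dx in range(n))
                for dy in range(n)
            )
            weights[pattern] += 1
    return weights

sample = [
    [0, 0, 1],
    [0, 1, 1],
    [1, 1, 1],
]
# With wrap-around there is one NxN pattern per pixel, so counts sum to 9.
assert sum(pattern_weights(sample, 2).values()) == 9
```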
Fortunately, there are general methods to build a Markov chain that has the desired probability distribution over states as its stationary distribution.
Additional definitions:

* Define the energy E of a state S as E(S) := - sum over all patterns P in S of log(weight(P)), so the probability distribution over states becomes p(S) ~ exp(-E(S)). Note that this energy function is a generalization of the Ising model energy. In the Ising model patterns are 1x2 instead of NxN.
* Introduce the temperature parameter T, so the probability distribution over states becomes p(S) ~ exp(-E(S)/T). Low temperatures make the distribution more concentrated in energy wells, high temperatures make the distribution more uniform. If one uses ConvChain to generate dungeons, low temperatures correspond to accurate newly built dungeons while high temperatures correspond to ruins.
* Define the weight weight(P) of a pattern P to be the number of occurrences of P in the input if that number is greater than zero, and some small number eps otherwise, 0 < eps < 1.
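The energy definition above can be sketched in a few lines. This is a naive Python illustration (the repo itself is C#), assuming `weights` is a plain dict of input pattern counts and toroidal boundaries:

```python
import math

def energy(state, n, weights, eps=0.1):
    """E(S) = -sum over all NxN patterns P in S of log(weight(P)).

    Patterns that never occur in the input fall back to the small
    weight eps (0 < eps < 1), so they are penalized with positive energy.
    """
    h, w = len(state), len(state[0])
    total = 0.0
    for y in range(h):
        for x in range(w):
            p = tuple(
                tuple(state[(y + dy) % h][(x + dx) % w] for dx in range(n))
                for dy in range(n)
            )
            total -= math.log(weights.get(p, eps))
    return total

# A 2x2 state whose four toroidal 2x2 patterns all have weight 3:
flat = [[0, 0], [0, 0]]
assert abs(energy(flat, 2, {((0, 0), (0, 0)): 3}) + 4 * math.log(3)) < 1e-9
```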
The Metropolis algorithm then goes as follows:

1. Start with a random state S0.
2. Compute the energy E of the current state S.
3. Flip a random pixel of S to get a candidate state S'.
4. Compute the energy E' of the state S'.
5. Compare E' to E. If E' < E, assign the current state to be S'. Otherwise, assign the current state to be S' with probability exp(-(E'-E)/T).
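A single Metropolis step can be sketched in Python as below. This is an illustrative, naive version only: it recomputes the full energy each step (a real implementation would only recompute the patterns touched by the flip), and the two-color flip is an assumption of the sketch:

```python
import math
import random

def energy(state, n, weights, eps=0.1):
    # E(S) = -sum of log-weights of all NxN patterns in S (toroidal).
    h, w = len(state), len(state[0])
    return -sum(
        math.log(weights.get(tuple(
            tuple(state[(y + dy) % h][(x + dx) % w] for dx in range(n))
            for dy in range(n)), eps))
        for y in range(h) for x in range(w))

def metropolis_step(state, n, weights, t, eps=0.1):
    """Flip a random pixel; accept if the energy drops, otherwise
    accept with probability exp(-(E'-E)/T). Returns the current state."""
    h, w = len(state), len(state[0])
    e = energy(state, n, weights, eps)
    y, x = random.randrange(h), random.randrange(w)
    candidate = [row[:] for row in state]
    candidate[y][x] = 1 - candidate[y][x]  # flip, assuming 2 colors
    e2 = energy(candidate, n, weights, eps)
    if e2 < e or random.random() < math.exp(-(e2 - e) / t):
        return candidate
    return state
```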
If there are more than 2 colors, Gibbs sampling may converge faster than Metropolis:

1. Pick a random pixel of the current state S.
2. Recolor that pixel to transition to a state S' according to the probability distribution p(S'|S) ~ exp(-E'/T).
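A Gibbs update can be sketched as follows, again as an illustrative Python version rather than the repo's C# code; the color set and the full energy recomputation are simplifications of this sketch:

```python
import math
import random

def energy(state, n, weights, eps=0.1):
    # E(S) = -sum of log-weights of all NxN patterns in S (toroidal).
    h, w = len(state), len(state[0])
    return -sum(
        math.log(weights.get(tuple(
            tuple(state[(y + dy) % h][(x + dx) % w] for dx in range(n))
            for dy in range(n)), eps))
        for y in range(h) for x in range(w))

def gibbs_step(state, n, weights, t, colors=(0, 1, 2), eps=0.1):
    """Pick a random pixel and resample its color from
    p(S'|S) ~ exp(-E'/T) over all candidates differing at that pixel."""
    h, w = len(state), len(state[0])
    y, x = random.randrange(h), random.randrange(w)
    energies = []
    for c in colors:
        candidate = [row[:] for row in state]
        candidate[y][x] = c
        energies.append(energy(candidate, n, weights, eps))
    m = min(energies)  # shift energies for numerical stability
    probs = [math.exp(-(e - m) / t) for e in energies]
    state[y][x] = random.choices(colors, weights=probs)[0]
    return state
```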
ConvChain supports constraints, so you can easily combine it with other generators or handcrafted content.

In the language of the WFC readme, ConvChain satisfies the strong version of condition 2 (Strong C2), but not condition 1 (C1).

If you freeze out the system as the Metropolis simulation goes on, you'll get a variant of the simulated annealing algorithm.
The detailed balance condition for ConvChain is exp(-E1/T) p(S2|S1) = exp(-E2/T) p(S1|S2), so both the Gibbs chain with p(S2|S1) ~ exp(-E2/T) and the Metropolis chain with p(S2|S1) = min(1, exp(-(E2-E1)/T)) converge to the desired distribution over states.
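The detailed balance identity for the Metropolis rule can be checked numerically in a few lines (the symmetric proposal factor cancels on both sides, so this sketch ignores it):

```python
import math

def metropolis_accept(e_from, e_to, t):
    """Metropolis acceptance probability min(1, exp(-(E_to - E_from)/T))."""
    return min(1.0, math.exp(-(e_to - e_from) / t))

# Verify exp(-E1/T) p(S2|S1) == exp(-E2/T) p(S1|S2) for arbitrary energies:
for e1, e2, t in [(1.3, 2.7, 0.8), (5.0, 0.2, 1.5), (2.0, 2.0, 1.0)]:
    lhs = math.exp(-e1 / t) * metropolis_accept(e1, e2, t)
    rhs = math.exp(-e2 / t) * metropolis_accept(e2, e1, t)
    assert abs(lhs - rhs) < 1e-12
```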
ConvChain is a console application that depends only on the standard library. Get .NET Core for Windows, Linux or macOS and run

`dotnet run --configuration Release ConvChain.csproj`

`ConvChain.cs` contains the basic program, `ConvChainFast.cs` contains an equivalent but faster program (~100 times faster on a 4-core CPU) in a less human-readable form.