You cannot sample the distribution directly when “it is impossible to solve for analytically”. There are a few methods to overcome this issue; in particular, “MCMC methods allow us to estimate the shape of a posterior distribution in case we can’t compute it directly”.
You only need to be able to compute the relative values of the function you want to integrate (in this case a probability density) at the current point and the proposed destination. Where this function comes from is not relevant to understanding MCMC methods; they can be applied to many problems unrelated to Bayesian statistics.
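To illustrate the point about relative values, here is a minimal Metropolis sketch (my own, not from the article): the acceptance test only ever uses the ratio f(x')/f(x), so the normalizing constant of the density cancels and an unnormalized density suffices. The target here is an unnormalized standard normal, chosen purely as an example.

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

def metropolis(log_f, x0, step, n):
    """Metropolis sampler over one dimension.

    log_f: log of the (possibly unnormalized) target density.
    Only differences log_f(x') - log_f(x) are used, i.e. only
    the ratio f(x')/f(x) -- the normalizing constant cancels.
    """
    x, samples = x0, []
    for _ in range(n):
        x_new = x + random.uniform(-step, step)  # symmetric proposal
        # accept with probability min(1, f(x_new)/f(x))
        if math.log(random.random()) < log_f(x_new) - log_f(x):
            x = x_new
        samples.append(x)
    return samples

# unnormalized standard normal: f(x) proportional to exp(-x^2 / 2)
samples = metropolis(lambda x: -x * x / 2, 0.0, 1.0, 50_000)
```

The sample mean should be near 0 and the sample variance near 1, even though the code never computes the 1/sqrt(2*pi) normalization.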
Here is another (non-zero-math) introduction to the topic: https://arxiv.org/pdf/cond-mat/9612186.pdf https://www.coursera.org/learn/statistical-mechanics/lecture...
BTW there is a nice related course on Coursera: "Statistical Mechanics: Algorithms and Computations". Also note there is a Rosetta Code entry for this.
The intrinsic importance of MCMC is clear. What kinds of applications would excite startup-oriented readers here on YCombinator?
Also why is this called Quantum MCMC rather than just normal Markov Chain Monte Carlo?
By coincidence there is a Coursera course that just started on Statistical Mechanics and Algorithms, and the first exercise is to approximate pi.
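For readers who haven't seen that exercise, a short sketch of the standard direct-sampling version (my reconstruction, not the course's exact code): throw uniform random points into the unit square and count the fraction that lands inside the quarter circle, which approximates pi/4.

```python
import random

random.seed(0)  # fixed seed so the estimate is reproducible

def estimate_pi(n):
    """Estimate pi by direct sampling: the fraction of uniform points
    in the unit square with x^2 + y^2 <= 1 converges to pi/4."""
    hits = sum(
        1 for _ in range(n)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4.0 * hits / n

pi_est = estimate_pi(1_000_000)
```

With a million points the statistical error is on the order of 1e-3; the course then contrasts this direct sampling with a Markov-chain version of the same estimate.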