Commit cda60047 authored by Michiel Cottaar

Made docstrings easier to read

parent d74bb536
@@ -110,10 +110,14 @@ of the log-likelihood and log-prior given some set of parameters.
 def metropolis_hastings(f, x0, nsteps=10000, step_size=1.):
     """MCMC using Metropolis-Hastings algorithm

-    :param f: function mapping vector of parameters to sum of log-likelihood and log-prior
-    :param x0: starting set of parameters
-    :param nsteps: how many steps to run the MCMC for
-    :param step_size: size of the Gaussian proposal function
+    Args:
+        f: function mapping vector of parameters to sum of log-likelihood and log-prior
+        x0: starting set of parameters
+        nsteps: how many steps to run the MCMC for
+        step_size: size of the Gaussian proposal function
+
+    Returns:
+        2D array (nsteps x nparams) with the Markov chain through parameter space
     """
     params = []
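
To make the documented interface concrete, here is a minimal, self-contained sketch of a Metropolis-Hastings sampler with the same signature and return shape as the new docstring describes. This is an illustrative assumption, not the repository's implementation, and the toy target `log_gaussian` is invented for the example.

```python
import numpy as np

def metropolis_hastings_sketch(f, x0, nsteps=10000, step_size=1.):
    """Illustrative sketch only; signature and return shape follow the docstring above."""
    x = np.asarray(x0, dtype=float)
    logp = f(x)
    chain = np.zeros((nsteps, x.size))
    for idx in range(nsteps):
        # symmetric Gaussian proposal centred on the current position
        proposal = x + step_size * np.random.randn(x.size)
        logp_prop = f(proposal)
        # accept with probability min(1, exp(logp_prop - logp))
        if np.log(np.random.rand()) < logp_prop - logp:
            x, logp = proposal, logp_prop
        chain[idx] = x
    return chain

# toy target: standard 2D Gaussian (stands in for log-likelihood + log-prior)
def log_gaussian(params):
    return -0.5 * np.sum(np.asarray(params) ** 2)

samples = metropolis_hastings_sketch(log_gaussian, x0=[0., 0.], nsteps=2000, step_size=0.5)
print(samples.shape)  # (2000, 2), i.e. nsteps x nparams as documented
```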
@@ -235,10 +239,14 @@ The proposal function will be drawn from a Gaussian with this variance.
 def metropolis_hastings_cov(f, x0, variance_matrix, nsteps=10000):
     """MCMC using Metropolis-Hastings algorithm

-    :param f: function mapping vector of parameters to sum of log-likelihood and log-prior
-    :param x0: starting set of parameters
-    :param variance_matrix: variance to use for Gaussian proposal function
-    :param nsteps: how many steps to run the MCMC for
+    Args:
+        f: function mapping vector of parameters to sum of log-likelihood and log-prior
+        x0: starting set of parameters
+        variance_matrix: variance to use for Gaussian proposal function
+        nsteps: how many steps to run the MCMC for
+
+    Returns:
+        2D array (nsteps x nparams) with the Markov chain through parameter space
     """
     params = []
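
For reference, a proposal governed by `variance_matrix` can be drawn from a zero-mean multivariate normal with that covariance. The snippet below is only a hypothetical illustration of the documented proposal distribution, not code from this commit, and the example covariance values are made up.

```python
import numpy as np

# hypothetical 2-parameter covariance for the Gaussian proposal
variance_matrix = np.array([[0.5, 0.1],
                            [0.1, 0.2]])
current = np.array([0.0, 0.0])

# a single proposed step: current position plus zero-mean Gaussian noise
# with the supplied covariance
proposal = current + np.random.multivariate_normal(np.zeros(2), variance_matrix)
print(proposal)
```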
@@ -265,14 +273,19 @@ def iterative_metropolis_hastings(f, x0, nsteps=10000, niter=3):
     Proposal variance is determined iteratively

-    :param f: function mapping vector of parameters to sum of log-likelihood and log-prior
-    :param x0: starting set of parameters
-    :param nsteps: how many steps to run the MCMC for
+    Args:
+        f: function mapping vector of parameters to sum of log-likelihood and log-prior
+        x0: starting set of parameters
+        nsteps: how many steps to run the MCMC for
+        niter: number of iterations to estimate the covariance before running the final MCMC
+
+    Returns:
+        2D array (nsteps x nparams) with the Markov chain through parameter space
     """
     # start with small, isotropic step size
     current_variance_mat = np.eye(len(x0)) * 1e-2
-    for _ in range(niter - 1):
+    for _ in range(niter):
         # For each iteration we start by running a short (i.e., with reduced number of steps) MCMC
         samples = metropolis_hastings_cov(f, x0, current_variance_mat, nsteps=nsteps//10)
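
The loop above (now `range(niter)` rather than `range(niter - 1)`) repeatedly re-estimates the proposal covariance from short pilot runs before the final chain. A rough sketch of that scheme, assuming `metropolis_hastings_cov` from the previous hunk is in scope and using `np.cov` as one plausible covariance estimator, might look like:

```python
import numpy as np

def iterative_metropolis_hastings_sketch(f, x0, nsteps=10000, niter=3):
    """Sketch of the iterative tuning scheme; not the repository's implementation."""
    # start with a small, isotropic proposal variance (as in the diff above)
    variance = np.eye(len(x0)) * 1e-2
    for _ in range(niter):
        # short pilot run (a tenth of the requested steps) with the current covariance
        pilot = metropolis_hastings_cov(f, x0, variance, nsteps=nsteps // 10)
        # re-estimate the proposal covariance from the pilot samples
        variance = np.cov(pilot, rowvar=False)
    # final full-length run with the tuned proposal covariance
    return metropolis_hastings_cov(f, x0, variance, nsteps=nsteps)
```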
@@ -286,13 +299,13 @@ def iterative_metropolis_hastings(f, x0, nsteps=10000, niter=3):
 Without any iterations we can see from the trace that the initial step size is clearly too small:
 ```python
-samples_no_iteration = iterative_metropolis_hastings(logp, [0, 0, 1.], niter=1)
+samples_no_iteration = iterative_metropolis_hastings(logp, [0, 0, 1.], niter=0)
 plt.plot(samples_no_iteration[:, 0])
 ```
 However, after several iterations of updating the proposal function covariance matrix, we get a nice trace:
 ```python
-iter_samples = iterative_metropolis_hastings(logp, [0, 0, 1.], niter=5)
+iter_samples = iterative_metropolis_hastings(logp, [0, 0, 1.], niter=3)
 plt.plot(iter_samples[:, 0])
 ```