Commit 44b5a01d authored by Ying-Qiu Zheng

Update 2021JUL21.md

parent 7f9cb92e
@@ -19,8 +19,8 @@ In summary, in addition to finding the hyper-parameters $`\pi, \mu, \Sigma_{
### Pseudo code - Algorithm 1. EM for the Fusion of GMMs
1. Run K-means clustering on the high-quality data to generate the assignment of the voxels $`R^{(0)}`$.
2. Initialise the means $`\mu_{k}`$, covariances $`\Sigma_{k}`$, and mixing coefficients $`\pi_k`$ using the K-means assignment $`R^{(0)}`$, and evaluate the initial likelihood.
3. Initialise the transformation matrix $`\mathbf{U}`$ using Algorithm 3.
2. Initialise the means $`\mu_{k}^{L}`$, $`\mu_{k}^{H}`$, covariances $`\Sigma_{k}^{L}`$, $`\Sigma_{k}^{H}`$, and mixing coefficients $`\pi_k`$ using the K-means assignment $`R^{(0)}`$, and evaluate the initial likelihood.
3. Initialise the transformation matrix $`\mathbf{U} = \mathbf{MN}^{T}`$, where $`\mathbf{MDN}^{T}`$ is the SVD of $`\sum_{k=1}^{K}\mu_{k}^{H}(\mu_{k}^{L})^{T}`$.
4. For iteration = $`1, 2, ...`$, do
- **E-step.** Evaluate the responsibilities using the current parameter values
- $`\gamma(y_{nk}) = \frac{\pi_{k}\mathcal{N}(\mathbf{x}^{L}_{n} | \mu_{k}, \Sigma_{k}^{L})\mathcal{N}(\mathbf{Ux}^{H}_{n} | \mu_{k}, \Sigma_{k}^{H})}{\sum_{j=1}^{K}\pi_{j}\mathcal{N}(\mathbf{x}^{L}_{n} | \mu_{j}, \Sigma_{j}^{L})\mathcal{N}(\mathbf{Ux}^{H}_{n} | \mu_{j}, \Sigma_{j}^{H})}`$
@@ -31,7 +31,7 @@ In summary, in addition to finding the hyper-parameters $`\pi, \mu, \Sigma_{
- $`\pi_k = \frac{N_{k}}{N}`$
- $`\mathbf{U}=`$
- Evaluate the likelihood and check for convergence.
5. Use $`\mu_{k}, \Sigma_{k}^{L}, \pi_{k}`$ to assign unseen data.
5. Use $`\mu_{k}, \Sigma_{k}^{L}, \pi_{k}`$ to assign unseen data points (a code sketch follows below).
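
For concreteness, the steps above can be transcribed into a short NumPy/SciPy/scikit-learn sketch. All names (`fit_fused_gmm`, `X_low`, `X_high`, `mu_L`, `mu_H`, ...) are illustrative rather than taken from this repository, the same feature dimensionality is assumed for both modalities, and because the M-step formula for $`\mathbf{U}`$ is left blank in the note, the sketch simply keeps $`\mathbf{U}`$ fixed at its SVD initialisation.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal
from sklearn.cluster import KMeans


def fit_fused_gmm(X_low, X_high, K, n_iter=100, tol=1e-6, seed=0):
    """EM sketch for the fusion of two GMMs (illustrative names, not from the repo).

    X_low  : (N, D) low-quality features per voxel
    X_high : (N, D) high-quality features per voxel (same voxels, same D assumed)
    """
    N, D = X_low.shape

    # Step 1: K-means on the high-quality data gives the initial assignment R^(0).
    R0 = KMeans(n_clusters=K, n_init=10, random_state=seed).fit_predict(X_high)

    # Step 2: initialise mixing weights, per-modality means and covariances from R^(0).
    pi = np.array([np.mean(R0 == k) for k in range(K)])
    mu_L = np.stack([X_low[R0 == k].mean(axis=0) for k in range(K)])
    mu_H = np.stack([X_high[R0 == k].mean(axis=0) for k in range(K)])
    Sigma_L = np.stack([np.cov(X_low[R0 == k].T) + 1e-6 * np.eye(D) for k in range(K)])
    Sigma_H = np.stack([np.cov(X_high[R0 == k].T) + 1e-6 * np.eye(D) for k in range(K)])

    # Step 3: U = M N^T, where M D N^T is the SVD of sum_k mu_H_k (mu_L_k)^T.
    M, _, Nt = np.linalg.svd(sum(np.outer(mu_H[k], mu_L[k]) for k in range(K)))
    U = M @ Nt

    log_lik_old = -np.inf
    for _ in range(n_iter):
        # E-step: responsibilities gamma(y_nk) from the product of the two Gaussians.
        XU = X_high @ U.T  # U x_n^H for every voxel (row vectors)
        log_num = np.stack(
            [np.log(pi[k])
             + multivariate_normal.logpdf(X_low, mu_L[k], Sigma_L[k])
             + multivariate_normal.logpdf(XU, mu_H[k], Sigma_H[k])
             for k in range(K)], axis=1)
        log_norm = logsumexp(log_num, axis=1, keepdims=True)
        gamma = np.exp(log_num - log_norm)

        # M-step: standard GMM updates for pi, the means and the covariances.
        # (The note also lists an update for U but leaves its formula blank,
        #  so this sketch keeps U fixed at its SVD initialisation.)
        Nk = gamma.sum(axis=0)
        pi = Nk / N
        mu_L = (gamma.T @ X_low) / Nk[:, None]
        mu_H = (gamma.T @ XU) / Nk[:, None]
        for k in range(K):
            dL, dH = X_low - mu_L[k], XU - mu_H[k]
            Sigma_L[k] = (gamma[:, k, None] * dL).T @ dL / Nk[k] + 1e-6 * np.eye(D)
            Sigma_H[k] = (gamma[:, k, None] * dH).T @ dH / Nk[k] + 1e-6 * np.eye(D)

        # Evaluate the log-likelihood and check for convergence.
        log_lik = log_norm.sum()
        if abs(log_lik - log_lik_old) < tol:
            break
        log_lik_old = log_lik

    return pi, mu_L, Sigma_L, mu_H, Sigma_H, U
```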
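
Step 5 only needs the low-quality parameters, since unseen data points come without a high-quality counterpart. A minimal, self-contained usage sketch (again with illustrative names):

```python
import numpy as np
from scipy.stats import multivariate_normal


def assign_unseen(X_low_new, pi, mu_L, Sigma_L):
    """Hard-assign unseen low-quality data points to the K fitted components."""
    log_post = np.stack(
        [np.log(pi[k]) + multivariate_normal.logpdf(X_low_new, mu_L[k], Sigma_L[k])
         for k in range(len(pi))], axis=1)
    return np.argmax(log_post, axis=1)  # index of the most responsible component
```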
### Simulation results
#### We considered three scenarios