## Fusion of high-quality and low-quality classification models
### Graphical models
![diagram1](/figs/2021JUL01/diagram-20210630-3.png)
##### An alternative formulation
![diagram2](/figs/2021JUL01/diagram-20210702-01.png)
### Panel A: the most basic model formulation (with classical ARD priors)
The high-quality classification model takes a regression form with ARD priors on its coefficients $`\mathbf{w}^{H}`$. The low-quality model is then trained by marginalising over the posterior distribution of $`\mathbf{w}^{H}`$, yielding the distribution of a corresponding set of low-quality coefficients, which likewise carry ARD priors.
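As a concrete illustration, here is a minimal Julia sketch of the Panel A ingredients; the names (`log_joint`, `ard_update`, `X`, `y`, `alpha`, `m`, `S`) are ours for illustration and are not taken from the repository, and the update rule is the classical evidence/MacKay fixed point rather than the exact procedure used here.

```julia
using LinearAlgebra

# Log joint of the high-quality model: Bernoulli likelihood through a logistic
# link plus an independent zero-mean Gaussian (ARD) prior N(w_j | 0, 1/alpha_j).
function log_joint(w, X, y, alpha)
    a = X * w                                    # linear predictor
    loglik = sum(y .* a .- log1p.(exp.(a)))      # sum_n log Bernoulli(y_n | sigmoid(a_n)), y_n in {0, 1}
    logprior = 0.5 * sum(log.(alpha) .- alpha .* w .^ 2) - 0.5 * length(w) * log(2π)
    return loglik + logprior
end

# Classical ARD re-estimation (evidence/MacKay fixed point) around a Gaussian
# approximation to the posterior with mean m and covariance S:
#   alpha_j <- (1 - alpha_j * S_jj) / m_j^2
ard_update(alpha, m, S) = (1 .- alpha .* diag(S)) ./ m .^ 2
```

In this kind of scheme, `m` and `S` would come from a Laplace or variational approximation to the posterior over $`\mathbf{w}^{H}`$, and coefficients whose precisions `alpha[j]` grow large are effectively pruned.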
```julia
# ...
XLtest[outlier_row, :] .+= randn(Int(d * 0.01), d)  # perturb the designated outlier rows of XLtest with standard-normal noise
```
And we compared four methods:
- Blue: using the posterior of $`\mathbf{w}^{H}`$ (trained on the high-quality data) as the prior for the low-quality coefficients (a sketch follows the accuracy figure below).
- Red: trained on low-quality data only.
- Orange: Lasso logistic regression.
- Pale blue: trained on low-quality data, marginalising over the posterior of $`\mathbf{w}^{H}`$ from the high-quality model.
Accuracy:
![accuracy](/figs/2021JUL01/acc.svg)
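For the Blue variant, the following is a minimal sketch of the idea of reusing a Gaussian approximation $`\mathcal{N}(\mathbf{m}^{H}, \mathbf{S}^{H})`$ to the high-quality posterior as the prior when fitting the low-quality coefficients. The function and variable names (`fit_low_quality`, `XL`, `yL`, `mH`, `SH`) are assumptions for illustration, and the simple MAP fit by gradient ascent is a stand-in for whatever inference the repository actually uses.

```julia
using LinearAlgebra

# Fit low-quality coefficients wL by MAP, with the Gaussian posterior over the
# high-quality coefficients, N(mH, SH), acting as the prior (the "Blue" curve).
function fit_low_quality(XL, yL, mH, SH; iters = 2_000, lr = 1e-3)
    P  = inv(Symmetric(SH))               # prior precision from the high-quality posterior
    wL = copy(mH)                          # warm start at the high-quality posterior mean
    for _ in 1:iters
        p    = 1 ./ (1 .+ exp.(-(XL * wL)))           # predicted class probabilities
        grad = XL' * (yL .- p) .- P * (wL .- mH)      # log-likelihood + log-prior gradient
        wL .+= lr .* grad
    end
    return wL
end
```

The Pale blue variant differs in that it marginalises over the full posterior of $`\mathbf{w}^{H}`$ rather than conditioning on a single fitted prior mean.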