Code / Contact me @ laurent.perrinet@univ-amu.fr
Generative model of image synthesis:
$$ I[x, y] = \sum_{i=1}^{K} a[i] \cdot \phi[i, x, y] + \varepsilon[x, y] $$
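As a quick sanity check, this generative model can be sketched in a few lines of NumPy; the toy sizes, the random dictionary, and the number of active coefficients below are illustrative assumptions, not part of the model:

```python
import numpy as np

rng = np.random.default_rng(42)
K, H, W = 16, 8, 8          # number of atoms, image height/width (toy sizes)

# Dictionary of K atoms, normalized to unit energy
phi = rng.normal(size=(K, H, W))
phi /= np.sqrt((phi**2).sum(axis=(1, 2), keepdims=True))

# Sparse coefficient vector a: only a few non-zero entries
a = np.zeros(K)
a[rng.choice(K, size=3, replace=False)] = rng.normal(size=3)

# I = sum_i a[i] * phi_i + Gaussian noise
sigma_n = 0.05
I = np.tensordot(a, phi, axes=1) + sigma_n * rng.normal(size=(H, W))
```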
Where $\phi$ is a dictionary of $K$ atoms, $a$ is a sparse vector of coefficients, and $\varepsilon$ is a noise term. Given an observation $I$,
$$ \begin{aligned} \mathcal{L}(a) & = - \log Pr( a | I ) \\ & = - \log Pr( I | a ) - \log Pr(a) \\ & = \frac{1}{2\sigma_n^2} \sum_{x, y} ( I[x, y] - \sum_{i=1}^{K} a[i] \cdot \phi[i, x, y])^2 - \sum_{i=1}^{K} \log Pr( a[i] ) \end{aligned} $$
The problem is formalized as an optimization problem $a^\ast = \arg \min_a \mathcal{L}(a)$ with an $\ell_0$ penalty counting the non-zero coefficients:
$$ \mathcal{L}(a) = \frac{1}{2} \sum_{x, y} ( I[x, y] - \sum_{i=1}^{K} a[i] \cdot \phi[i, x, y])^2 + \lambda \cdot \sum_{i=1}^{K} ( a[i] \neq 0) $$
Relaxing this $\ell_0$ penalty to the convex $\ell_1$ norm (Basis Pursuit) yields $a^\ast = \arg \min_a \mathcal{L}(a)$ with:
$$ \mathcal{L}(a) = \frac{1}{2} \sum_{x, y} ( I[x, y] - \sum_{i=1}^{K} a[i] \cdot \phi[i, x, y])^2 + \lambda \cdot \sum_{i=1}^{K} | a[i] | $$
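One standard way to minimize this $\ell_1$-penalized cost is proximal gradient descent (ISTA); the sketch below is a generic solver under that assumption, not necessarily the one used here:

```python
import numpy as np

def l1_loss(a, I, phi, lam):
    """Cost: 0.5 * ||I - sum_i a[i] phi_i||^2 + lam * ||a||_1."""
    residual = I - np.tensordot(a, phi, axes=1)
    return 0.5 * (residual**2).sum() + lam * np.abs(a).sum()

def ista_step(a, I, phi, lam, eta):
    """One ISTA step: gradient descent on the quadratic term,
    then soft-thresholding (the proximal operator of the l1 term)."""
    residual = I - np.tensordot(a, phi, axes=1)
    # Gradient of the quadratic term w.r.t. a is -<phi_i, residual>
    grad = -np.tensordot(phi, residual, axes=([1, 2], [0, 1]))
    a = a - eta * grad
    return np.sign(a) * np.maximum(np.abs(a) - eta * lam, 0.0)
```

With a step size `eta` below the inverse Lipschitz constant of the quadratic term, each step is guaranteed not to increase $\mathcal{L}(a)$.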
Matching Pursuit:
Init : $R = I$; $\forall i$, $a[i] = 0$; normalize the atoms such that $\sum_{x, y} \phi[i, x, y]^2 = 1$
compute $c[i] = \sum_{x, y} R[x, y] \cdot \phi[i, x, y]$
compute $X[i, j] = \sum_{x, y} \phi[i, x, y] \cdot \phi[j, x, y]$
while $\frac{1}{2} \sum_{x, y} R[x, y]^2 > \vartheta $, do :
select $i^\ast = \arg \max_i | c[i] |$
update $a[i^\ast] \leftarrow a[i^\ast] + c[i^\ast]$ (equivalently, $R \leftarrow R - c[i^\ast] \cdot \phi[i^\ast]$)
update $c[j] \leftarrow c[j] - c[i^\ast] \cdot X[i^\ast, j]$ for all $j$
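The pursuit loop above can be sketched in NumPy; the function name and the stopping parameters are illustrative. The correlations $c$ and the Gram matrix $X$ are precomputed once, so the loop never touches the image again:

```python
import numpy as np

def matching_pursuit(I, phi, theta=1e-6, max_iter=100):
    """Matching Pursuit over unit-norm atoms phi of shape (K, H, W).
    Returns the sparse coefficient vector a."""
    K = phi.shape[0]
    a = np.zeros(K)
    c = np.tensordot(phi, I, axes=([1, 2], [0, 1]))    # c[i] = <I, phi_i>
    X = np.tensordot(phi, phi, axes=([1, 2], [1, 2]))  # X[i, j] = <phi_i, phi_j>
    E = 0.5 * (I**2).sum()            # residual energy 0.5 * ||R||^2
    for _ in range(max_iter):
        if E <= theta:
            break
        i = np.argmax(np.abs(c))      # select the best-matching atom
        ci = c[i]
        a[i] += ci                    # greedy coefficient update
        E -= 0.5 * ci**2              # ||R||^2 shrinks by c[i]^2
        c -= ci * X[i]                # update all correlations via the Gram matrix
    return a
```

Since each atom has unit norm, $X[i^\ast, i^\ast] = 1$ and the selected correlation is zeroed after the update, so the same atom is not re-selected immediately.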
Hebbian learning (once the sparse code is known):
$$ \phi_{i}[x, y] \leftarrow \phi_{i}[x, y] + \eta \cdot a[i] \cdot (I[x, y] - \sum_{j=1}^{K} a[j] \cdot \phi_{j}[x, y] ) $$
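A minimal NumPy sketch of this update; the final re-normalization, which keeps the unit-norm constraint used in the pursuit, is an added assumption rather than part of the rule above:

```python
import numpy as np

def hebbian_update(phi, I, a, eta=0.01):
    """One Hebbian step on the dictionary: each atom phi_i moves along
    the reconstruction residual, weighted by its own activation a[i]."""
    residual = I - np.tensordot(a, phi, axes=1)   # I - sum_j a[j] phi_j
    phi = phi + eta * a[:, None, None] * residual[None, :, :]
    # Re-normalize each atom to unit energy (common extra step)
    phi /= np.sqrt((phi**2).sum(axis=(1, 2), keepdims=True))
    return phi
```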
2D convolution with a kernel $g$ of half-width $S$:
$$ (f \ast g)[x, y] = \sum_{i=-S}^{S} \sum_{j=-S}^{S} f[x-i, y-j] \cdot g[i, j] $$
Cross-correlation, i.e. convolution with the flipped kernel $\tilde{g}[i, j] = g[-i, -j]$:
$$ (f \ast \tilde{g})[x, y] = \sum_{i=-S}^{S} \sum_{j=-S}^{S} f[x+i, y+j] \cdot g[i, j] $$
Summing over $C$ input channels $c$:
$$ (f \ast \tilde{g})[x, y] = \sum_{c=1}^{C} \sum_{i, j} f[c, x+i, y+j] \cdot g[c, i, j] $$
With a bank of kernels, one output map per kernel index $k$ (a convolutional layer):
$$ (f \ast \tilde{g})[k, x, y] = \sum_{c, i, j} f[c, x+i, y+j] \cdot g[k, c, i, j] $$
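These correlation sums can be written directly in NumPy with naive loops (restricted to the 'valid' output range; function names are illustrative):

```python
import numpy as np

def corr2d(f, g):
    """(f * g~)[x, y] = sum_{i, j} f[x+i, y+j] * g[i, j], 'valid' range."""
    kh, kw = g.shape
    H, W = f.shape[0] - kh + 1, f.shape[1] - kw + 1
    out = np.empty((H, W))
    for x in range(H):
        for y in range(W):
            out[x, y] = (f[x:x + kh, y:y + kw] * g).sum()
    return out

def conv_layer(f, g):
    """Multi-channel, multi-kernel correlation: f has shape (C, H, W),
    g has shape (Kout, C, kh, kw); channels are summed as in the last
    equation above, giving one output map per kernel."""
    return np.array([sum(corr2d(f[c], g[k, c]) for c in range(g.shape[1]))
                     for k in range(g.shape[0])])
```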
Code @ SparseEdges