::When and where
IDSIA's meeting room
Monday, 4th February, at 15:45
::Title
Learning an overcomplete and sparsifying transform with approximate
and exact closed-form solutions
::Abstract
In this work, I address the problem of learning a data-adaptive
transform that provides a sparse representation in a space whose
dimension is larger than (or equal to) that of the original space. I
will present an iterative, alternating algorithm with two steps: (i)
transform update and (ii) sparse coding. In the transform update step,
the focus is on a novel problem formulation based on a lower bound of
the objective (a generic form of the objective is sketched after this
list), which addresses a trade-off between:
(a) how well the gradient of the approximate objective aligns with the
gradient of the original objective, and
(b) how closely the lower bound matches the original objective.
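For orientation, a generic objective of the kind used in overcomplete
sparsifying transform learning (an illustrative assumption; the exact
objective and lower bound used in the talk are not spelled out in this
announcement) is

    \min_{W,\,X} \; \| W Y - X \|_F^2 + \lambda \| X \|_0 + \psi(W),

where Y is the training data, W \in \mathbb{R}^{K \times n} with K >= n
is the overcomplete transform, X holds the sparse codes, and \psi is a
regularizer on W. The transform update step fixes X and updates W by
(approximately) minimizing a lower-bound surrogate of this objective;
the sparse coding step fixes W and updates X.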
This not only allows us to propose an approximate closed-form solution,
but also makes it possible to find an update that accelerates local
convergence and to estimate an update that yields a satisfactory
solution from a small amount of data. Since the approximate closed-form
solution in the transform update preserves the gradient, and the sparse
coding step uses an exact closed-form solution, the resulting algorithm
is convergent. On the practical side, we evaluate the method on an
image denoising application and demonstrate promising denoising
performance, together with advantages in training data requirements,
local convergence speed, and the resulting computational complexity.
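As a rough illustration of the alternating scheme described above, here
is a minimal NumPy sketch under the generic objective given earlier.
The hard-thresholding sparse coding step is the standard exact closed
form for the l0-penalized subproblem; the transform update shown is a
plain regularized least-squares step used only as a stand-in for the
talk's lower-bound-based approximate closed form. Function names and
parameter values (learn_transform, lam, mu, etc.) are illustrative.

import numpy as np

def hard_threshold(Z, lam):
    # Exact closed-form sparse coding for the l0-penalized subproblem:
    # keep entries with magnitude above lam, zero out the rest.
    return np.where(np.abs(Z) > lam, Z, 0.0)

def learn_transform(Y, K, lam=0.1, mu=1e-3, iters=50, seed=0):
    # Y: n x N training data; K >= n gives an overcomplete transform W (K x n).
    n, _ = Y.shape
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((K, n))
    G = Y @ Y.T + mu * np.eye(n)          # regularized Gram matrix, fixed
    for _ in range(iters):
        X = hard_threshold(W @ Y, lam)    # (ii) sparse coding: exact closed form
        W = X @ Y.T @ np.linalg.inv(G)    # (i) transform update: illustrative LS step
    return W, X

# Toy usage: 16-dimensional signals, 2x overcomplete transform.
Y = np.random.default_rng(1).standard_normal((16, 500))
W, X = learn_transform(Y, K=32)
print(W.shape, float(np.mean(X != 0)))    # transform size and code density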
::Speaker
Dimche Kostadinov
PhD Student in Computer Science, University of Geneva