ICA 2007

7th International Conference on
Independent Component Analysis
and Signal Separation

London, UK        9 - 12 September 2007


Tutorials

The ICA 2007 Tutorials will take place on the afternoon of Sunday 9 September 2007.

Tutorial attendance is included in the Regular and Student registration fees.


Tutorial 1 - Sparse Representations: From Source Separation to Compressed Sensing

Remi Gribonval
IRISA-INRIA, France

Abstract

This tutorial will give an overview of the state of the art in the theory, algorithms and applications of sparse signal representations. The main emphasis will be on (blind) under-determined source separation, and I will discuss some connections with the emerging field of compressed sensing, which aims at exploiting sparse models to replace classical Shannon sampling.

Starting with the example of under-determined blind audio source separation, I will illustrate intuitively the nature of sparse signal models and how they can help solve ill-posed inverse problems. I will then survey some recent mathematical results which analyze the properties of these models and explain the performance of sparse representation algorithms such as convex optimization methods and matching pursuit. I will conclude with an introduction to compressed sensing and a discussion of its connections to under-determined (blind) source separation, outlining the main theoretical and practical challenges raised by these approaches.
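To give a flavour of the greedy algorithms mentioned above, here is a minimal matching pursuit sketch over a toy random dictionary. The dictionary, signal, and iteration count are illustrative choices, not material from the tutorial itself:

```python
import numpy as np

def matching_pursuit(x, D, n_iters=10):
    """Greedy sparse approximation of x over a dictionary D with unit-norm columns."""
    residual = x.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iters):
        # Pick the atom most correlated with the current residual...
        corr = D.T @ residual
        k = int(np.argmax(np.abs(corr)))
        # ...then subtract its contribution and record the coefficient.
        coeffs[k] += corr[k]
        residual -= corr[k] * D[:, k]
    return coeffs, residual

# Toy example: x is a 2-sparse combination of atoms of a random dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)          # normalize atoms to unit norm
x = 3.0 * D[:, 5] - 2.0 * D[:, 40]
coeffs, residual = matching_pursuit(x, D, n_iters=20)
```

Because the greedy selection always removes the best-matching atom, the residual energy decreases at every iteration; for a signal that truly is sparse in the dictionary, a few iterations already leave a small residual.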


Tutorial 2 - Information Filtering

Jose C. Principe, Ph.D.
Distinguished Professor of Electrical and Computer Engineering
University of Florida, Gainesville, USA

Abstract

This tutorial will introduce a methodology to train linear or nonlinear systems with entropy and divergence, as opposed to the well-known moment methods (e.g. mean square error (MSE)). The advantage is that more information about the error signal is captured in the weights of the mapper, and a wealth of new applications becomes possible, including independent component analysis (ICA) as a special case. One of the cornerstones of information filtering is a methodology called information theoretic learning (ITL) to estimate entropy directly from data, without estimating the probability density function explicitly. Applications to system identification, blind deconvolution, independent component analysis, principal curves and dimensionality reduction will be presented.
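The "entropy directly from data" idea can be sketched with the standard Parzen-window estimator of Renyi's quadratic entropy, the quantity ITL typically works with; the Gaussian kernel and the width sigma below are illustrative choices, not prescribed by the tutorial:

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Estimate H2(X) = -log E[p(X)] from 1-D samples via Parzen windows.

    The mean of the pairwise Gaussian kernels (the 'information potential')
    estimates E[p(X)] without ever forming an explicit density estimate.
    """
    x = np.asarray(x, dtype=float)
    diff = x[:, None] - x[None, :]
    # Pairwise Gaussian kernel of width sigma*sqrt(2) (convolution of two
    # Parzen kernels of width sigma).
    g = np.exp(-diff**2 / (4.0 * sigma**2)) / np.sqrt(4.0 * np.pi * sigma**2)
    information_potential = g.mean()
    return -np.log(information_potential)
```

Tightly clustered samples have a large information potential and hence low entropy; spread-out samples have high entropy, which is what a training criterion based on the error's entropy exploits.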

There is a very tight link between ITL and the kernel methods now being developed in the machine learning community. This tutorial will also present a new similarity function called correntropy. The name was coined to show that it is similar to correlation, but its mean value across delays (or dimensions) is the argument of Renyi's quadratic entropy. This similarity function defines a new reproducing kernel Hilbert space (RKHS) that shows great promise for designing and implementing nonlinear signal processing algorithms.
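A minimal sketch of the sample estimator of cross-correntropy between two signals, assuming a Gaussian kernel (the usual choice in the ITL literature); the kernel width sigma is an illustrative parameter:

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of V(X, Y) = E[G_sigma(X - Y)] with a Gaussian kernel."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.mean(np.exp(-d**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma))
```

Unlike correlation, which only measures second-order similarity, correntropy weights samples through the kernel, so it is maximal when the two signals agree exactly and decays smoothly as they differ, making it robust to outliers.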

Last Updated: 14-Aug-2007