Project leader

LIF - Laboratoire d'Informatique Fondamentale de Marseille

Partners

PICXEL, Aix Marseille Université, LIP6 - Laboratoire d'Informatique de Paris 6, LHC - Laboratoire Hubert Curien (Saint Etienne)

Funders

ANR

LIVES

Learning with interacting views


Imagine that you have to answer the following questions. How do you build a tool to support the medical diagnosis of neurological disorders from brain images acquired with different devices? How do you build a digital application that detects the emotion felt by a person from their face and voice? How do you ensure that such tools are robust to the absence or unreliability of certain types of information (e.g. a missing MRI image, or a silent face)?

These questions come from the Timone Neuroscience Institute (INT), whose expertise is medical diagnosis from brain imaging techniques, and from Picxel, an SME specializing in automatic emotion recognition. They reflect central issues that these two partners of the LIVES project encounter in their activities, and raise computing questions that are both fundamental and practical.

To study these questions and provide answers, the project consortium includes three other partners: the Laboratoire d'Informatique de Paris 6 (LIP6), the Laboratoire Hubert Curien (LaHC), and the Laboratoire d'Informatique Fondamentale de Marseille (LIF, project leader), whose members involved in LIVES are experts in machine learning. This discipline of computer science provides the scientific framework of the work developed in this project, and equal attention will be given to proposing new theoretical results and to putting these results into practice. The articulation between these fundamental and applied results is a strength of the project, and one on which the five partners are convinced they can innovate.

The questions mentioned above raise the following problems:

constructing a classifier capable of predicting the class (i.e. a diagnosis, or an emotion) of an observed object,
taking advantage of the few modalities, or views, available to describe the object to be classified,
and, possibly, building intermediate representations from these different views.

This is precisely the goal of this project: to develop a machine learning framework for learning in the presence of “interacting views”, a central notion that we will take the time to formalize and study. We will structure our work in the following way.
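To make the multi-view classification setting concrete, here is a minimal toy sketch, in Python with NumPy and scikit-learn, of one standard baseline: train one classifier per view and combine them by late fusion of class probabilities. The data and names ("MRI", "voice") are purely illustrative assumptions, not the project's datasets or method.

```python
# Toy late-fusion baseline for multi-view classification (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, size=n)                      # class labels (e.g. a diagnosis)
view_mri = rng.normal(size=(n, 30)) + y[:, None]    # hypothetical "MRI" view
view_voice = rng.normal(size=(n, 10)) + y[:, None]  # hypothetical "voice" view

# One classifier per view.
clf_mri = LogisticRegression(max_iter=1000).fit(view_mri, y)
clf_voice = LogisticRegression(max_iter=1000).fit(view_voice, y)

# Late fusion: average the per-view class probabilities.
proba = (clf_mri.predict_proba(view_mri) + clf_voice.predict_proba(view_voice)) / 2
print("train accuracy:", (proba.argmax(axis=1) == y).mean())
```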

In task T1, we will study how taking interacting views into account modifies learning results that hold in the classical setting; this will require formalizing the multi-view learning problem and defining interaction measures between the views. Building on the established theoretical results, we will then design new multi-view learning algorithms able to work on real datasets.
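As a purely illustrative example of what an interaction measure between views might look like in practice (task T1 aims to define such measures rigorously), the snippet below continues the toy sketch above and computes the empirical disagreement rate between the two per-view classifiers.

```python
# Crude empirical proxy for view interaction: how often the per-view
# classifiers disagree on the same examples (reuses clf_mri, clf_voice,
# view_mri and view_voice from the previous toy sketch; illustrative only).
disagreement = (clf_mri.predict(view_mri) != clf_voice.predict(view_voice)).mean()
print("empirical per-view disagreement:", disagreement)
```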

To this end, we will work on learning (compact) representations of data described by several views (task T2), as well as on transfer and multi-task learning algorithms (task T3).
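As an illustration of the kind of compact representation task T2 targets, one classical two-view technique is Canonical Correlation Analysis (CCA); the sketch below uses scikit-learn on synthetic data and is only an assumed example, not one of the project's algorithms.

```python
# Toy CCA: project two views onto a small shared, maximally correlated space.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 300
latent = rng.normal(size=(n, 5))                      # shared latent factors
view_a = latent @ rng.normal(size=(5, 40)) + 0.1 * rng.normal(size=(n, 40))
view_b = latent @ rng.normal(size=(5, 25)) + 0.1 * rng.normal(size=(n, 25))

cca = CCA(n_components=5)
z_a, z_b = cca.fit_transform(view_a, view_b)          # compact projections
# z_a, or the concatenation [z_a, z_b], can then feed any standard classifier.
print(z_a.shape, z_b.shape)                           # (300, 5) (300, 5)
```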

Task T4 concerns the implementation of algorithms adapted to the unfavorable conditions encountered in real-world applications: large amounts of data (and/or large-scale data), and missing data (views).
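For missing views specifically, one simple fallback, sketched below as an assumption rather than the project's approach, is to fuse only the per-view classifiers whose input is actually available.

```python
# Illustrative handling of missing views at prediction time: average the class
# probabilities of the per-view classifiers whose view is present.
import numpy as np

def predict_proba_robust(classifiers, views):
    """classifiers: dict view_name -> fitted model exposing predict_proba;
    views: dict view_name -> feature array, or None when the view is missing."""
    probas = [classifiers[name].predict_proba(x)
              for name, x in views.items() if x is not None]
    if not probas:
        raise ValueError("at least one view must be available")
    return np.mean(probas, axis=0)  # uniform late fusion over available views

# With the toy models above, for a batch missing its "voice" view:
# predict_proba_robust({"mri": clf_mri, "voice": clf_voice},
#                      {"mri": view_mri[:5], "voice": None})
```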

Finally, the learning algorithms we develop will be confronted with open datasets that we propose to collect (task T5): two real datasets, one in neuroimaging (four types of scans) and one in emotion recognition (RGB and IR images, and voice), as well as more controlled datasets.

These data collections will be essential for

(i) the reproducibility and comparison of research results,

(ii) identifying learning strategies capable of processing multi-view data, and

(iii) bringing the machine learning community to the processing of multi-view data in emotion recognition and neuroimaging.

Themes
Software
Markets
Health - Medical - Pharmaceutical
R&D Investment
2517 K€
Duration
42 months
Funding Year
2015
