Category
The Palaisien Seminar
Sort date

« Le Séminaire Palaisien » | Quentin Bouniot & Anna Korba

Banner image
Séminaire Le Palaisien


Event location
ENSAE, room 1001
Event date (label)
February 6th, 2024 - 12pm
Lede
Every first Tuesday of the month, the Palaisien seminar brings together Saclay's vast research community to discuss statistics and machine learning.
Content
Body text

Each seminar session is divided into two scientific presentations of 40 minutes each: 30 minutes of talk and 10 minutes of questions.

Quentin Bouniot and Anna Korba will host the February 2024 session!


Registration is free but required, subject to availability. A buffet will be served at the end of the seminar.

Accordion title
Quentin Bouniot | "Understanding Deep Neural Networks Through the Lens of their Non-Linearity"
Accordion text

Abstract

The remarkable success of deep neural networks (DNNs) is often attributed to their high expressive power and their ability to approximate functions of arbitrary complexity. Indeed, DNNs are highly non-linear models, and the activation functions introduced into them are largely responsible for this. While many works have studied the expressive power of DNNs through the lens of their approximation capabilities, quantifying the non-linearity of DNNs or of individual activation functions remains an open problem. In this work, we propose the first theoretically sound solution to track non-linearity propagation in deep neural networks, with a specific focus on computer vision applications. Our proposed affinity score allows us to gain insights into the inner workings of a wide range of architectures and learning paradigms. We provide extensive experimental results that highlight the practical utility of the proposed affinity score and its potential for far-reaching applications. (https://arxiv.org/abs/2310.11439)
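To build some intuition for what "quantifying non-linearity" can mean, one crude proxy is to measure how well a best-fit affine map explains an activation function's output. Note that this is only an illustrative sketch, not the affinity score defined in the paper; the function names and the R²-based measure below are our own assumptions for this example.

```python
import numpy as np

def affine_fit_r2(f, x):
    """Illustrative non-linearity proxy: R^2 of the best affine fit to f
    over the sample points x. R^2 = 1 means f is exactly affine on x;
    lower values indicate stronger non-linearity."""
    y = f(x)
    A = np.vstack([x, np.ones_like(x)]).T   # design matrix [x, 1]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    y_hat = A @ coef
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

x = np.linspace(-3.0, 3.0, 1000)
identity = lambda t: t
relu = lambda t: np.maximum(t, 0.0)

print(affine_fit_r2(identity, x))  # ≈ 1.0: identity is affine
print(affine_fit_r2(relu, x))     # < 1.0: ReLU deviates from any affine map
```

On a symmetric interval the ReLU score lands strictly between 0 and 1, reflecting that half of its domain is affine and half is clipped.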

Accordion title
Anna Korba | "Sampling through optimization of discrepancies"
Accordion text

Abstract

Sampling from a target measure when only partial information is available (e.g. an unnormalized density, as in Bayesian inference, or true samples, as in generative modeling) is a fundamental problem in computational statistics and machine learning. The sampling problem can be formulated as the optimization, over the space of probability distributions, of a well-chosen discrepancy (e.g. a divergence or distance). In this talk, we will discuss several properties of sampling algorithms for some choices of discrepancies (well-known ones, or novel proxies), regarding both their optimization and quantization aspects.
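A classical concrete instance of "sampling as optimization" is the unadjusted Langevin algorithm (ULA), which can be read as a time-discretized gradient flow of the KL divergence to the target over the space of probability distributions. The sketch below is a minimal illustration with a standard Gaussian target; the step size, chain length, and target are our own illustrative choices, not details from the talk.

```python
import numpy as np

def ula(grad_V, x0, step=0.05, n_steps=5000, rng=None):
    """Unadjusted Langevin algorithm for a target density pi(x) ∝ exp(-V(x)):
    x_{k+1} = x_k - step * grad_V(x_k) + sqrt(2 * step) * xi_k,  xi_k ~ N(0, I).
    Returns the whole trajectory as an array of shape (n_steps, dim)."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x0, dtype=float)
    traj = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x - step * grad_V(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        traj[k] = x
    return traj

# Target: standard Gaussian, V(x) = x^2 / 2, so grad_V(x) = x.
samples = ula(lambda x: x, x0=np.zeros(1))
burn = samples[1000:]  # discard burn-in
# The empirical mean and variance of `burn` approach 0 and 1, the
# moments of the standard Gaussian target.
```

For small enough step sizes the chain's stationary distribution is close to the target; letting the step size vanish recovers the Langevin diffusion, whose law follows the Wasserstein gradient flow of the KL divergence.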