Seminario CMM- Maths&AI

Thermodynamics-informed Neural Networks (THINNs).

Event Date: Jan 20, 2026 in Seminario CMM- Maths&AI, Seminars

Abstract: Physics-Informed Neural Networks (PINNs) are a class of deep learning models that approximate solutions of PDEs by training a neural network to minimize the residual of the equation. Focusing on non-equilibrium fluctuating systems, we propose a physically informed choice of penalization that is consistent with the underlying fluctuation structure, as characterized by a large deviations principle. This approach yields a novel formulation of PINNs in which the penalty term penalizes improbable deviations, rather than being selected heuristically. The resulting...
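The standard PINN recipe the abstract starts from can be sketched in a few lines. Below is a minimal illustration (not the talk's method) of a residual-plus-boundary loss for the 1-D heat equation, using finite differences in place of automatic differentiation; the function names and the heuristic weight `lam` (which the proposed large-deviations penalization would replace) are this sketch's own assumptions.

```python
import numpy as np

def pinn_loss(u, x, t, lam=1.0, dx=1e-4, dt=1e-4):
    """Heuristic PINN loss for the heat equation u_t = u_xx.

    `u` is any candidate solution u(x, t); the PDE residual is estimated
    by central finite differences (a real PINN would use autodiff on the
    network). `lam` is the hand-picked penalty weight for the boundary
    term, chosen heuristically in standard PINNs.
    """
    u_t = (u(x, t + dt) - u(x, t - dt)) / (2 * dt)
    u_xx = (u(x + dx, t) - 2 * u(x, t) + u(x - dx, t)) / dx**2
    residual = u_t - u_xx                     # PDE residual
    boundary = u(np.zeros_like(x), t)         # toy Dirichlet condition u(0, t) = 0
    return np.mean(residual**2) + lam * np.mean(boundary**2)

# u = exp(-t) sin(x) solves the PDE with u(0, t) = 0, so it scores
# (numerically) zero; a candidate violating the boundary scores higher.
x = np.linspace(0.1, 3.0, 50)
t = np.full_like(x, 0.5)
exact = lambda x, t: np.exp(-t) * np.sin(x)
wrong = lambda x, t: np.exp(-t) * np.cos(x)   # breaks the boundary condition
print(pinn_loss(exact, x, t) < pinn_loss(wrong, x, t))
```

Training drives this loss to zero over sampled collocation points; the talk's contribution concerns how the penalty term is chosen, not this overall structure.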


What to align in multimodal contrastive learning?

Event Date: Jun 04, 2025 in Seminario CMM- Maths&AI, Seminars

Abstract: Humans perceive the world through multisensory integration, blending information from different modalities to adapt their behavior. Contrastive learning offers an appealing approach to multimodal self-supervised learning: by treating each modality as a different view of the same entity, it learns to align the features of different modalities in a shared representation space. However, this approach is intrinsically limited, as it only captures information that is shared or redundant across modalities, while multimodal interactions can arise in other ways. In this work, we introduce...
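The alignment objective the abstract describes is typically a symmetric InfoNCE (CLIP-style) loss. A minimal NumPy sketch, assuming paired, L2-normalized embeddings `za`/`zb` (hypothetical names), illustrates why it captures only shared information: matching pairs are pulled together on the diagonal, everything else is pushed apart.

```python
import numpy as np

def clip_style_loss(za, zb, temperature=0.1):
    """Symmetric InfoNCE loss aligning paired embeddings of two modalities.

    za, zb: (n, d) L2-normalized embeddings; row i of each matrix is the
    same entity seen through a different modality. The match for row i is
    row i, so the target sits on the diagonal of the similarity matrix.
    """
    logits = za @ zb.T / temperature                 # pairwise similarities
    n = len(za)
    def xent(l):                                     # row-wise cross-entropy
        l = l - l.max(axis=1, keepdims=True)         # numerical stability
        logp = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -logp[np.arange(n), np.arange(n)].mean()
    return 0.5 * (xent(logits) + xent(logits.T))     # symmetric in modalities

# Perfectly aligned pairs score lower than mismatched (shuffled) pairs.
rng = np.random.default_rng(0)
z = rng.normal(size=(8, 4))
z /= np.linalg.norm(z, axis=1, keepdims=True)
perm = rng.permutation(8)
print(clip_style_loss(z, z) < clip_style_loss(z, z[perm]))
```

Minimizing this objective rewards only features that agree across modalities, which is exactly the limitation the talk addresses: synergistic or modality-specific interactions fall outside what the diagonal-matching loss can reward.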


Understanding encoder–decoder structures in machine learning using information measures.

Event Date: May 14, 2025 in Seminario CMM- Maths&AI, Seminars

Abstract: We present a theory of representation learning to model and understand the role of encoder–decoder design in machine learning (ML) from an information-theoretic angle. We use two main information concepts, information sufficiency (IS) and mutual information loss, to represent predictive structures in machine learning. Our first main result provides a functional expression that characterizes the class of probabilistic models consistent with an IS encoder–decoder latent predictive structure. This result formally justifies the encoder–decoder forward stages many modern ML architectures...
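The two quantities the abstract names can be made concrete on a discrete toy example (this sketch's own construction, not the talk's formalism): an encoder Z = f(X) is information-sufficient for predicting Y when I(Z; Y) = I(X; Y), i.e. when it incurs no mutual information loss.

```python
import numpy as np

def mutual_information(pxy):
    """Mutual information I(X; Y) in bits from a discrete joint distribution."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

# Toy setup: X uniform on {0,1,2,3}, target Y = X mod 2.
pxy = np.zeros((4, 2))
for x in range(4):
    pxy[x, x % 2] = 0.25

# Encoder Z = X mod 2 keeps all predictive information (sufficient)...
pzy_suff = np.zeros((2, 2))
for x in range(4):
    pzy_suff[x % 2, x % 2] += 0.25
# ...while Z = X // 2 discards it entirely.
pzy_bad = np.zeros((2, 2))
for x in range(4):
    pzy_bad[x // 2, x % 2] += 0.25

print(mutual_information(pxy))       # 1.0 bit
print(mutual_information(pzy_suff))  # 1.0 bit: encoder is sufficient for Y
print(mutual_information(pzy_bad))   # 0.0 bits: all predictive information lost
```

By the data-processing inequality, I(Z; Y) ≤ I(X; Y) for any encoder; the talk's characterization concerns the models for which equality, i.e. sufficiency, is attainable by an encoder–decoder structure.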
