(07/12/2025) Exciting news! GRaM is going to Rio!
We are pleased to announce the second edition of GRaM as an ICLR 2026 workshop. This year, we will focus on scale and simplicity. We are opening three tracks: a paper track, a blog post track, and a new competition track! Have a look at our previous edition. More updates to come soon...
Deadlines
- Paper submission (tentative): January 30, 2026 (AoE)
- Paper acceptance notification: March 1st, 2026 (AoE)
- Workshop dates: April 25-26, 2026 (exact date to be announced)
Motivation
Many real-world datasets have geometric structure, but most ML methods ignore this structure and treat all inputs as plain vectors. GRaM is a workshop about grounding models in geometry, using ideas ranging from group equivariance to non-Euclidean metrics, to build better, more interpretable representations and generative models.
An approach is geometrically grounded if it respects the geometric structure of the problem domain and supports geometric reasoning.
For this second edition, we aim to explore the relevance of geometric methods, particularly in the context of large models, focusing on the theme of scale and simplicity.
Topics
We solicit submissions that present theoretical research, methodologies, applications, insightful analysis, and even open problems, within the following topics (list not exhaustive):
- Preserving data geometry
- Preserving symmetries · e.g., through equivariant operators.
- Geometric representation systems · e.g., encoding data in intrinsically structured forms via Clifford algebras or steerable vectors with Clebsch-Gordan products.
- Isometric latent mappings · e.g., learning latent representations of the data via pullback metrics.
- Inducing geometric structure
- Geometric priors · e.g., introducing curvature, symmetry, or topological constraints through explicit regularization.
- Non-Euclidean generative models · e.g., extending diffusion or flow matching models to non-Euclidean domains with a predefined metric.
- Metric-preserving embeddings · e.g., learning latent spaces where intrinsic geodesic distances are mapped to Euclidean ones.
- Geometry in theoretical analysis
- Data and latent geometry · Gaining insights on the data manifold, statistical manifold or the latent variables using geometric tools.
- Loss landscape geometry · Viewing parameters and their optimization trajectory as lying on a manifold, enabling analysis of curvature, critical points, and generalization.
- Theoretical frameworks · Using differential geometry, algebraic geometry, or group theory to provide a generalizing perspective on generation or representation learning.
- Open problems · Identifying and addressing unresolved questions and challenges that lie at the intersection of geometry and learning.
- Scale and Simplicity
- Geometry at scale · Assessing whether equivariance retains its value in large-scale models.
- Redundancy and minimality · Evaluating when geometric structure is essential versus when simpler architectures suffice.
- Challenging assumptions · Reporting negative results or limitations of geometric methods to guide future development.
Organizers
Alison Pouplin
Sharvaree Vadgama
Erik Bekkers
Sékou-Oumar Kaba
Hannah Lawrence
Manuel Lecha
Elizabeth (Libby) Baker
Robin Walters
Jakub Tomczak
Stefanie Jegelka