2024-06-11-allen-blanchette24a.md

File metadata and controls

46 lines (46 loc) · 1.85 KB
---
title: Hamiltonian GAN
abstract: A growing body of work leverages the Hamiltonian formalism as an inductive
  bias for physically plausible neural network based video generation. The structure
  of the Hamiltonian ensures conservation of a learned quantity (e.g., energy) and
  imposes a phase-space interpretation on the low-dimensional manifold underlying
  the input video. While this interpretation has the potential to facilitate the
  integration of learned representations in downstream tasks, existing methods are
  limited in their applicability as they require a structural prior for the configuration
  space at design time. In this work, we present a GAN-based video generation pipeline
  with a learned configuration space map and Hamiltonian neural network motion model,
  to learn a representation of the configuration space from data. We train our model
  with a physics-inspired cyclic-coordinate loss function which encourages a minimal
  representation of the configuration space and improves interpretability. We demonstrate
  the efficacy and advantages of our approach on the Hamiltonian Dynamics Suite Toy
  Physics dataset.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: allen-blanchette24a
month: 0
tex_title: "{H}amiltonian {GAN}"
firstpage: 1662
lastpage: 1674
page: 1662-1674
order: 1662
cycles: false
bibtex_author: Allen-Blanchette, Christine
author:
- given: Christine
  family: Allen-Blanchette
date: 2024-06-11
address:
container-title: Proceedings of the 6th Annual Learning for Dynamics & Control Conference
volume: '242'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 6
  - 11
pdf:
extras:
---