---
title: Rademacher complexity of neural ODEs via Chen-Fliess series
abstract: We show how continuous-depth neural ODE models can be framed as single-layer, infinite-width nets using the Chen-Fliess series expansion for nonlinear ODEs. In this net, the output “weights” are taken from the signature of the control input — a tool used to represent infinite-dimensional paths as a sequence of tensors — which comprises iterated integrals of the control input over a simplex. The “features” are taken to be iterated Lie derivatives of the output function with respect to the vector fields in the controlled ODE model. The main result of this work applies this framework to derive compact expressions for the Rademacher complexity of ODE models that map an initial condition to a scalar output at some terminal time. The result leverages the straightforward analysis afforded by single-layer architectures. We conclude with some examples instantiating the bound for some specific systems and discuss potential follow-up work.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: hanson24a
month: 0
tex_title: "{R}ademacher complexity of neural {ODE}s via {C}hen-{F}liess series"
firstpage: 758
lastpage: 769
page: 758-769
order: 758
cycles: false
bibtex_author: Hanson, Joshua and Raginsky, Maxim
author:
- given: Joshua
  family: Hanson
- given: Maxim
  family: Raginsky
date: 2024-06-11
address:
container-title: Proceedings of the 6th Annual Learning for Dynamics & Control Conference
volume: '242'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 6
  - 11
pdf:
extras:
---