CROM: Continuous Reduced-Order Modeling of PDEs
Using Implicit Neural Representation
ICLR 2023 (notable-top-25%)



The long runtime of high-fidelity partial differential equation (PDE) solvers makes them unsuitable for time-critical applications. We propose to accelerate PDE solvers using reduced-order modeling (ROM). Whereas prior ROM approaches reduce the dimensionality of discretized vector fields, our continuous reduced-order modeling (CROM) approach builds a low-dimensional embedding of the continuous vector fields themselves, not their discretization. We represent this reduced manifold using continuously differentiable neural fields, which can be trained on any available numerical solutions of the continuous system, even when they are obtained with diverse methods or discretizations. We validate our approach on an extensive range of PDEs with training data from voxel grids, meshes, and point clouds. Compared to prior discretization-dependent ROM methods, such as linear subspace proper orthogonal decomposition (POD) and nonlinear manifold neural-network-based autoencoders, CROM features higher accuracy, lower memory consumption, dynamically adaptive resolutions, and applicability to any discretization. For equal latent space dimension, CROM exhibits 79X and 49X better accuracy, and 39X and 132X smaller memory footprint, than POD and autoencoder methods, respectively. Experiments demonstrate 109X and 89X wall-clock speedups over unreduced models on CPUs and GPUs, respectively. Videos and code are available on the project page.
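To make the idea concrete, here is a minimal NumPy sketch (not the authors' implementation; all layer sizes, the latent dimension, and function names are illustrative assumptions) of a CROM-style decoder: a neural field that maps a spatial coordinate together with a low-dimensional latent code to a field value. Because the decoder is a continuous, smooth function of the coordinate, the same reduced state can be queried at any resolution, independent of the training discretization.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    """Random weights for a small fully connected network (illustrative only)."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def decode(params, x, z):
    """Evaluate the neural field at spatial points x (N, d) with latent code z (r,)."""
    # Concatenate each query point with the shared latent code.
    h = np.concatenate([x, np.broadcast_to(z, (x.shape[0], z.shape[0]))], axis=1)
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)   # smooth activation -> continuously differentiable field
    W, b = params[-1]
    return h @ W + b             # field value u(x; z)

d, r, out = 1, 4, 1              # spatial dim, latent dim, field dim (assumed)
params = init_mlp([d + r, 32, 32, out], rng)
z = rng.standard_normal(r)       # one point on the reduced manifold

# The same latent state, sampled at two different "resolutions":
coarse = decode(params, np.linspace(0.0, 1.0, 8)[:, None], z)
fine = decode(params, np.linspace(0.0, 1.0, 128)[:, None], z)
print(coarse.shape, fine.shape)  # (8, 1) (128, 1)
```

In an actual CROM pipeline the decoder weights and per-snapshot latent codes would be fit jointly to the available PDE solution data; the sketch only shows the discretization-independent evaluation that the abstract describes.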

Technical Video

Example Results:



This work was supported in part by the National Science Foundation (Grant CBET-17-06689), Meta, Natural Sciences and Engineering Research Council of Canada (Discovery Grant), and SideFX. We thank Honglin Chen and Rundi Wu for their insightful discussions. We thank Rundi Wu for sharing his implementation of Stable Fluids. We thank Raymond Yun Fei for rendering advice. We thank Keenan Crane for the "Origins of the Pig" mesh.

The website template was borrowed from NeRV and Michaël Gharbi.