Generalising universal function approximators
What do some of the hardest and most important problems in science have in common? From improving our ability to model turbulence and climate change, to developing nuclear fusion or superconductor technology -- the solutions to all of these problems rely on being able to solve and evaluate complex ordinary or partial differential equations (ODEs or PDEs) and integrals. While this is not an issue for simple systems, where the equations are well studied and analytical solutions exist, the majority of open problems do not share these properties. Instead, the equations are often approximated from noisy experimental data, and evaluating them at new data points requires running complex numerical simulations on large supercomputers -- both expensive options, computationally and financially.

A system that could both estimate the equations from data and evaluate them at new data points quickly and reliably could be revolutionary in supercharging scientific progress. In their recent work, Lu et al. (2020) take a step towards developing such a system. They introduce the Deep Operator Network (DeepONet), a neural network model that learns nonlinear operators -- for example, how to evaluate integrals or solve differential equations -- directly from data, in a way that generalises to new inputs.
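To make "learning an operator" concrete, here is a minimal sketch of the DeepONet forward pass in numpy. A branch network encodes the input function u from its values at a fixed set of sensor locations, a trunk network encodes the query point y, and the operator output G(u)(y) is the dot product of the two feature vectors. The network sizes, sensor count, and untrained random weights below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def mlp(params, x):
    """Simple fully connected network with tanh hidden activations."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

def init_mlp(sizes, rng):
    """Random (untrained) weights for a stack of linear layers."""
    return [(rng.standard_normal((m, n)) * np.sqrt(1.0 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

rng = np.random.default_rng(0)
m, p = 50, 20                        # sensor count, latent feature dimension (assumed)
branch = init_mlp([m, 40, p], rng)   # encodes the input function u at m sensors
trunk = init_mlp([1, 40, p], rng)    # encodes the scalar query location y

def deeponet(u_sensors, y):
    """G(u)(y) approximated as <branch(u), trunk(y)>."""
    b = mlp(branch, u_sensors)        # shape (p,)
    t = mlp(trunk, np.atleast_1d(y))  # shape (p,)
    return float(b @ t)

# Evaluate the (untrained) operator on u(x) = sin(x), sampled at the sensors
xs = np.linspace(0.0, 1.0, m)
out = deeponet(np.sin(xs), 0.5)
print(out)  # a single scalar prediction for G(u)(0.5)
```

In practice both networks are trained jointly on pairs of input functions and operator outputs; the key design choice is that, once trained, the same network evaluates the operator for any new input function and query point with a single cheap forward pass.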