

## Revision as of 03:28, 6 September 2019


# ACMS Abstracts: Fall 2019

### Leonardo Andrés Zepeda Núñez

Title: Deep Learning for Electronic Structure Computations: A Tale of Symmetries, Locality, and Physics

Abstract: Recently, the surge of interest in deep learning has dramatically improved image and signal processing, which has fueled breakthroughs in many domains such as drug discovery, genomics, and automatic translation. These advances have been further applied to scientific computing and, in particular, to electronic structure computations. In this case, the main objective is to directly compute the electron density, which encodes most of the information of the system, thus bypassing the computationally intensive solution of the Kohn-Sham equations. However, as with neural networks for image processing, the performance of these methods depends critically on the physical and analytical intuition incorporated into the network, and on the training stage.

In this talk, I will show how to build a network that respects physical symmetries and locality. I will show how to train the networks and how such properties impact the performance of the resulting network. Finally, I will present several examples for small yet realistic chemical systems.
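As a toy illustration of the kind of symmetry and locality constraints the abstract refers to (a sketch, not the speaker's architecture; all names here are illustrative): a model built as a sum of identical per-atom terms, each depending only on distances to neighbors within a cutoff, is automatically invariant to relabeling the atoms and to rigid translations.

```python
import numpy as np

def local_descriptors(positions, cutoff=3.0):
    """Per-atom features built only from distances to neighbors within
    a cutoff radius (locality)."""
    n = len(positions)
    feats = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = np.linalg.norm(positions[i] - positions[j])
            if r < cutoff:
                feats[i] += np.exp(-r)  # smooth, decaying pair term
    return feats

def symmetric_prediction(positions, w=1.0):
    """Summing identical per-atom terms makes the output invariant to
    permuting (relabeling) the atoms; distance-only features make it
    invariant to rigid translations."""
    return w * local_descriptors(positions).sum()
```

Because the per-atom terms are identical and summed, `symmetric_prediction(pos)` is unchanged under any permutation of the rows of `pos` and under any constant shift of all positions.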

### Daniel Floryan (UW-Madison)

Title: Flexible Inertial Swimmers

Abstract: Inertial swimmers deform their bodies and fins to push against the water and propel themselves forward. The deformation is driven partly by active musculature, and partly by passive elasticity. The interaction between elasticity and hydrodynamics confers features on the swimmers not enjoyed by their rigid friends, for example, boosts in speed when flapping at certain frequencies. We explain the salient features of flexible swimmers by drawing ideas from airfoils, vibrating beams, and flags flapping in the wind. The presence of fluid drag has important consequences. We also explore optimal arrangements of flexibility. (It turns out that nature is quite good.)
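The frequency-dependent speed boost mentioned in the abstract is resonance-like behavior. A minimal caricature (not from the talk; parameters are illustrative) is the steady-state amplitude of a driven, damped oscillator, which peaks near the natural frequency when damping is light.

```python
import numpy as np

def response_amplitude(omega, omega_n=1.0, zeta=0.1):
    """Steady-state amplitude of a harmonically forced, damped oscillator:
    a toy stand-in for a flexible fin, peaking near omega_n."""
    return 1.0 / np.sqrt((omega_n**2 - omega**2)**2
                         + (2.0 * zeta * omega_n * omega)**2)

freqs = np.linspace(0.1, 2.0, 200)
amps = response_amplitude(freqs)
peak_freq = freqs[np.argmax(amps)]  # close to omega_n for light damping
```

Larger damping (`zeta`) flattens and shifts the peak, which is one reason fluid drag matters for flexible swimmers.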

### Prashant G. Mehta

Title: What is the Lagrangian for Nonlinear Filtering?

Abstract: There is a certain magic involved in recasting the equations of physics, and the algorithms of engineering, in variational terms. The most classical of these ‘magics’ is Lagrange’s formulation of Newtonian mechanics. An accessible modern take on all this and more appears in the February 19, 2019 issue of The New Yorker magazine: https://www.newyorker.com/science/elements/a-different-kind-of-theory-of-everything?reload=true

My talk is concerned with a variational (optimal control type) formulation of the problem of nonlinear filtering/estimation. Such formulations are referred to as duality between optimal estimation and optimal control. The first duality principle appears in the seminal (1961) paper of Kalman-Bucy, where the problem of minimum variance estimation is shown to be dual to a linear quadratic optimal control problem.

In my talk, I will describe a generalization of the Kalman-Bucy duality theory to nonlinear filtering. The generalization is an exact extension, in the sense that the dual optimal control problem has the same minimum variance structure for linear and nonlinear filtering problems. Kalman-Bucy’s classical result is shown to be a special case. During the talk, I will also attempt to review other types of duality relationships that have appeared over the years for the problem of linear and nonlinear filtering.
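The Kalman-Bucy duality can be checked numerically in a simple discrete-time setting (a sketch for intuition, not the continuous-time formulation of the talk): the covariance recursion of the Kalman filter for a pair (A, C) coincides, step for step, with the value-function (Riccati) recursion of a linear quadratic control problem for the dual pair (Aᵀ, Cᵀ).

```python
import numpy as np

def filter_riccati_step(P, A, C, Q, R):
    """One step of the Kalman filter covariance recursion (prediction form)."""
    K = A @ P @ C.T @ np.linalg.inv(C @ P @ C.T + R)  # Kalman gain
    return A @ P @ A.T + Q - K @ C @ P @ A.T

def control_riccati_step(S, A, B, Q, R):
    """One step of the LQ regulator value-function (Riccati) recursion."""
    L = A.T @ S @ B @ np.linalg.inv(B.T @ S @ B + R)  # feedback gain factor
    return A.T @ S @ A + Q - L @ B.T @ S @ A

# Duality: filtering for (A, C) matches control for the dual pair (A.T, C.T).
```

Substituting A → Aᵀ and B → Cᵀ into the control recursion reproduces the filter recursion term by term, which is the discrete-time shadow of the Kalman-Bucy duality principle.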

This is joint work with Jin Won Kim and Sean Meyn. The talk is based on the following papers: https://arxiv.org/pdf/1903.11195.pdf and https://arxiv.org/pdf/1904.01710.pdf.