
## Contents

- 1 ACMS Abstracts: Spring 2019
- 1.1 Jerry Zhu (University of Wisconsin-Madison, CS)
- 1.2 Abhishek Deshpande (UW-Madison, Math)
- 1.3 Chung-Nan Tzou (UW-Madison, Math)
- 1.4 Amy Cochran (UW-Madison, Math and Medical Informatics)
- 1.5 Kui Ren (Columbia Applied Math and UT-Austin Mathematics)
- 1.6 Nicolas Garcia Trillos (UW-Madison, Statistics)
- 1.7 Weiran Sun (Simon Fraser University)
- 1.8 Jean-Luc Thiffeault (UW-Madison, Math)
- 1.9 Alexandru Hening (Tufts University)
- 1.10 Lei Li (Shanghai Jiao Tong University)
- 1.11 Jiajun Tong (UCLA)

# ACMS Abstracts: Spring 2019

### Jerry Zhu (University of Wisconsin-Madison, CS)

*Machine Teaching: Optimal Control of Machine Learning*

As machine learning is increasingly adopted in science and engineering, it becomes important to take a higher-level view in which the machine learner is only one of the agents in a multi-agent system. Other agents may have an incentive to control the learner. For example, in adversarial machine learning an attacker can poison the training data to manipulate the model the learner learns; in education a teacher can optimize the curriculum to enhance the performance of a student, modeled as a computational learning algorithm. Machine teaching is optimal control theory applied to machine learning: the plant is the learner, the state is the learned model, and the control is the training data. In this talk I survey the mathematical foundation of machine teaching and the new research frontiers opened up by this confluence of machine learning and control theory.
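A classic toy instance of this control problem (an illustration, not an example from the talk): suppose the learner estimates a 1-D threshold classifier as the midpoint of the closest oppositely labeled pair. A teacher who knows the target threshold can pin the learner to any precision with just two examples, whereas i.i.d. sampling needs on the order of 1/ε examples.

```python
def learner(examples):
    """Midpoint learner: threshold = midpoint of the closest (-, +) pair."""
    neg = [x for x, y in examples if y == 0]
    pos = [x for x, y in examples if y == 1]
    return (max(neg) + min(pos)) / 2.0

def optimal_teacher(theta, eps):
    """A teaching set of size 2 that pins the learner within eps of theta."""
    return [(theta - eps, 0), (theta + eps, 1)]

estimate = learner(optimal_teacher(0.3, 1e-6))
print(abs(estimate - 0.3) < 1e-6)  # True: two examples suffice
```

The "optimal training data" here plays exactly the role of the control signal in the abstract's plant/state/control analogy.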

### Abhishek Deshpande (UW-Madison, Math)

*Switches in chemical and biological networks*

Switches are ubiquitous in both chemical and biological circuits. We explore the behaviour of autocatalytic switches in the context of the persistence conjecture. We show that networks without autocatalytic switches are persistent. The notion of a “critical siphon” forms the connecting link between autocatalysis and persistence. The talk will expand upon this connection.

Switches are also relevant from a biological perspective. We show that catalytic switches help in reducing retroactivity, the back effect on an upstream system when it is connected to a downstream system. In addition, for certain catalytic networks like the push-pull motif, high rates of energy consumption are not required to attenuate retroactivity. One can accomplish this by reducing the coupling to the push-pull motif. However, this reduction in coupling is not robust to cross-talk caused by leak reactions.

References:
1) https://arxiv.org/abs/1309.3957
2) https://arxiv.org/abs/1708.01792

### Chung-Nan Tzou (UW-Madison, Math)

*Fluid Models with Sharp Interfaces - Clouds and Plumes*

In this talk, I will discuss two models describing the interaction of fluids across sharp interfaces. The first model is a discontinuous Poisson equation in which the interfacial discontinuity arises from phase changes, such as at the interior and exterior of a cloud. A simple second-order numerical scheme aimed at solving this type of equation is proposed and tested. The second model is a simplified system of ODEs describing the mixing of jets and plumes with the ambient fluid. When the ambient density profile is sharply stratified, we establish a criterion for whether a plume is trapped underwater or rises to the top surface, and we show that this profile is the optimal mixer. This theory has been applied to the Gulf of Mexico oil spill incident and compared with data we collected through hands-on experiments in the fluids lab.
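Reduced plume models of this kind go back to Morton, Taylor and Turner. A minimal sketch (with an illustrative entrainment coefficient and a uniform stratification, not the sharply stratified profile analyzed in the talk) integrates the volume flux Q, momentum flux M, and buoyancy flux F upward until the momentum flux vanishes, which marks the trapping height:

```python
import numpy as np

# Morton-Taylor-Turner-type plume equations in a stratified ambient:
#   dQ/dz = 2*alpha*sqrt(M),  dM/dz = F*Q/M,  dF/dz = -N2*Q.
# alpha (entrainment) and N2 (stratification) are illustrative values.
alpha, N2 = 0.1, 1.0

def rise_height(Q=0.1, M=0.1, F=1.0, dz=1e-4, z_max=20.0):
    """Integrate the plume equations upward until momentum flux vanishes."""
    z = 0.0
    while M > 0.0 and z < z_max:
        dQ = 2.0 * alpha * np.sqrt(M)
        dM = F * Q / M
        dF = -N2 * Q          # stratification erodes the buoyancy flux
        Q, M, F = Q + dz * dQ, M + dz * dM, F + dz * dF
        z += dz
    return z                  # trapping height: the plume stops rising here

print(rise_height())
```

In this toy version, F eventually turns negative and drags M to zero, so the plume is trapped at a finite height rather than reaching the surface.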

### Amy Cochran (UW-Madison, Math and Medical Informatics)

*A model of online latent state learning*

Researchers are increasingly interested in how humans perform a structured form of learning known as latent-state inference. Latent-state inference refers to the ability to weigh competing hypotheses about one's environment. Critically, this type of learning can help explain behavior and neural activity important to cognitive neuroscience and psychiatry. In this talk, I will first present a model of latent-state learning that uses online, or recursive, updates. I will also discuss open questions related to this topic in hopes of generating discussion. Ultimately, I would like to engage students interested in the emerging area of computational psychiatry, as I will be joining the math department as an assistant professor in the fall.
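In the simplest setting, online latent-state inference amounts to recursively updating a posterior over competing hypotheses as observations arrive. A generic sketch (a plain recursive Bayes filter with made-up Bernoulli reward rates, not the specific model of the talk):

```python
import numpy as np

# Two competing hypotheses ("latent states"), each predicting rewards
# with a different Bernoulli rate. The update is online: only the current
# posterior and the newest observation are needed.

def update(posterior, rates, observation):
    """One recursive update: posterior ∝ prior × likelihood of observation."""
    like = rates if observation == 1 else 1.0 - rates
    posterior = posterior * like
    return posterior / posterior.sum()

rates = np.array([0.2, 0.8])       # hypothesis A: rare reward; B: frequent
posterior = np.array([0.5, 0.5])   # uniform prior over the two hypotheses
for obs in [1, 1, 0, 1, 1]:        # mostly rewarded observations
    posterior = update(posterior, rates, obs)
print(posterior)                   # hypothesis B ends up strongly favored
```

Each update is O(number of hypotheses), which is what makes the recursive form attractive as a model of moment-to-moment learning.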

### Kui Ren (Columbia Applied Math and UT-Austin Mathematics)

*Uncertainty Characterization in Model-Based Inverse and Imaging Problems*

In model-based inverse and imaging problems, it is often the case that only a portion of the relevant physical quantities in the model can be reconstructed or imaged. The rest of the model parameters are assumed to be known. In practice, these parameters are often only known partially, up to a certain accuracy. It is therefore important to characterize the dependence of the inversion/imaging results on the accuracy of these parameters. This is an uncertainty quantification problem that is challenging because both the map from the uncertain parameters (the ones assumed only partially known) to the measured data and the map from the measured data to the quantities to be imaged are difficult to analyze. In this talk, we review some recent computational and mathematical results on such uncertainty characterization problems in nonlinear inverse problems for PDEs.

### Nicolas Garcia Trillos (UW-Madison, Statistics)

*Large sample asymptotics of spectra of Laplacians and semilinear elliptic PDEs on random geometric graphs*

Given a data set $\mathcal{X}=\{x_1, \dots, x_n\}$ and a weighted graph structure $\Gamma= (\mathcal{X},W)$ on $\mathcal{X}$, graph-based methods for learning use analytical notions like graph Laplacians, graph cuts, and Sobolev semi-norms to formulate optimization problems whose solutions serve as sensible approaches to machine learning tasks. When the data set consists of samples from a distribution supported on a manifold (or at least approximately so), and the weights depend inversely on the distance between the points, a natural question to study concerns the behavior of those optimization problems as the number of samples goes to infinity. In this talk I will focus on optimization problems closely connected to clustering and supervised regression that involve the graph Laplacian. For clustering, the spectrum of the graph Laplacian is the fundamental object used in the popular spectral clustering algorithm. For regression, the solution to a semilinear elliptic PDE on the graph provides the minimizer of an energy balancing regularization and data fidelity, a sensible object to use in non-parametric regression. Using tools from optimal transport, calculus of variations, and analysis of PDEs, I will discuss a series of results establishing the asymptotic consistency (with rates of convergence) of many of these analytical objects, as well as provide some perspectives on future research directions.

### Weiran Sun (Simon Fraser University)

*Aggregation equations over bounded domains*

Numerical computations have shown that, due to boundary effects, solutions of aggregation equations can evolve into non-energy-minimizing states. Meanwhile, adding a small noise seems to bypass such non-energy minimizers. This motivates our study of aggregation equations over bounded domains. In this talk we will use basic probabilistic methods to show well-posedness and mean-field limits of aggregation equations with singular potentials (such as the Newtonian potential). We will also show the zero-diffusion limit of aggregation equations over bounded domains and obtain a convergence rate that is consistent with what has been observed in numerical simulations. This is joint work with Razvan Fetecau, Hui Huang, and Daniel Messenger.
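At the particle level, an aggregation equation over a bounded domain with small noise can be sketched as follows (a smooth, illustrative attraction kernel and a reflecting boundary; the talk treats singular potentials such as the Newtonian one):

```python
import numpy as np

# N particles in [0, 1] with attraction toward the empirical mean,
# small Brownian noise, and reflection at both boundary points.
rng = np.random.default_rng(2)

def step(x, dt, sigma):
    force = x.mean() - x                          # smooth attraction kernel
    x = x + dt * force + sigma * np.sqrt(dt) * rng.normal(size=x.size)
    x = np.abs(x)                                 # reflect at 0
    return 1.0 - np.abs(1.0 - x)                  # reflect at 1

x = rng.uniform(0.0, 1.0, 500)
for _ in range(5000):
    x = step(x, 1e-3, sigma=0.05)
print(x.std())   # small: the particles form a tight, noise-smeared aggregate
```

The diffusion parameter sigma plays the role of the small noise in the zero-diffusion limit: with sigma = 0 the particles would collapse to a point, while small sigma keeps a narrow but nonsingular aggregate.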

### Jean-Luc Thiffeault (UW-Madison, Math)

*The mathematics of burger flipping*

Ever since the dawn of time people have (literally) asked the question — what is the most effective way to grill food? Timing is everything, since only one surface is exposed to heat at a given time. Should we flip only once, or many times? I will show a simple model of cooking by flipping, and some interesting mathematics will emerge. The rate of cooking depends on the spectrum of a linear operator, and on the fixed point of a map. If the system is symmetric, the rate of cooking becomes independent of the sequence of flips, as long as the last point to be cooked is the midpoint. This toy problem has some characteristics reminiscent of more realistic scenarios, such as thermal convection and heat exchangers.

### Alexandru Hening (Tufts University)

*Stochastic persistence and extinction*

A key question in population biology is understanding the conditions under which the species in an ecosystem persist or go extinct. Theoretical and empirical studies have shown that coexistence can be facilitated or negated by both biotic interactions and environmental fluctuations. We study the dynamics of n interacting species that live in a stochastic environment. Our models are described by n-dimensional piecewise deterministic Markov processes. These are processes (X(t), r(t)) where the vector X denotes the densities of the n species and r(t) is a finite state space process which keeps track of the environment. In any fixed environment the process follows the flow given by a system of ordinary differential equations. The randomness comes from the changes, or switches, in the environment, which happen at random times. We give sharp conditions under which the populations persist, as well as conditions under which some populations go extinct exponentially fast. As an example, we look at the competitive exclusion principle from ecology and show how random switching can 'rescue' species from extinction. The talk is based on joint work with Dang H. Nguyen (University of Alabama).
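A one-species sketch of such a process (illustrative parameters, not from the talk): logistic dynamics whose growth rate depends on a two-state environment that switches at exponentially distributed times. Although the population shrinks in the unfavorable environment, the time-averaged low-density growth rate is positive here, so the population persists.

```python
import numpy as np

rng = np.random.default_rng(1)

growth = {0: 1.0, 1: -0.5}   # environment 0 favorable, 1 unfavorable
switch_rate = 2.0            # rate of the environmental Markov switches

def simulate(x0=0.1, t_end=50.0, dt=1e-3):
    """PDMP: deterministic logistic flow between random environment switches."""
    x, env, t = x0, 0, 0.0
    next_switch = rng.exponential(1.0 / switch_rate)
    while t < t_end:
        x += dt * growth[env] * x * (1.0 - x)   # flow in current environment
        t += dt
        if t >= next_switch:                     # exponential switching time
            env = 1 - env
            next_switch = t + rng.exponential(1.0 / switch_rate)
    return x

print(simulate())   # density fluctuates but stays strictly inside (0, 1)
```

Between switches the path is an ODE solution; the only randomness is the switching process r(t), exactly the (X(t), r(t)) structure described in the abstract.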

### Lei Li (Shanghai Jiao Tong University)

*The Random Batch Method and its application to sampling*

First-order interacting particle systems are ubiquitous; for example, they arise as overdamped Langevin equations. We first introduce a random algorithm, called the Random Batch Method (RBM), for simulating first-order systems. The algorithm is motivated by the mini-batch idea in machine learning and statistics. Under some special conditions, we show the convergence of RBM for the first marginal distribution under the Wasserstein distance. Compared with traditional tree-code and fast multipole expansion algorithms, RBM works for kernels that do not necessarily decay. We then apply RBM to Stein Variational Gradient Descent, a recent algorithm in statistics and machine learning, to obtain an efficient sampling method. This talk is based on joint work with Shi Jin (Shanghai Jiao Tong University), Jian-Guo Liu (Duke University), Jianfeng Lu (Duke University) and Zibu Liu (Duke University).
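The core of RBM can be sketched in a few lines (with a linear attraction kernel chosen purely for illustration): at each time step, shuffle the particles into batches of size p and let each particle interact only within its batch, reducing the per-step cost from O(N²) to O(Np).

```python
import numpy as np

# Random Batch Method for the first-order system
#   dX_i/dt = (1/(p-1)) * sum_{j in batch, j != i} K(X_j - X_i),
# with the illustrative kernel K(x) = x (attraction toward the batch mean).
rng = np.random.default_rng(0)

def rbm_step(x, dt, p=2):
    idx = rng.permutation(len(x))            # fresh random batches each step
    xb = x[idx].reshape(-1, p)               # shape (N/p, p)
    means = xb.mean(axis=1, keepdims=True)
    xb += dt * (means - xb) * p / (p - 1)    # interactions within batches only
    out = np.empty_like(x)
    out[idx] = xb.ravel()
    return out

N, dt = 1000, 0.01
x = rng.normal(0.0, 5.0, N)
for _ in range(2000):
    x = rbm_step(x, dt)
print(x.std())   # near 0: the particles contract toward their common mean
```

Because the batches are resampled at every step, the time-averaged force seen by a particle approximates the full interaction, which is the intuition behind the Wasserstein convergence result mentioned above.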

### Jiajun Tong (UCLA)

*2-D Stokes Immersed Boundary Problem and its Regularizations: Well-posedness, Singular Limit, and Error Estimates*

Studying coupled motion of immersed elastic structures and surrounding fluid is important in science and engineering. In this talk, we first consider 2-D Stokes immersed boundary problem that models a 1-D closed elastic string immersed and moving in a 2-D Stokes flow, and we discuss its well-posedness. Inspired by the numerical immersed boundary method, we then introduce a regularized version of the problem, in which a regularized delta-function is used to mollify the flow field and singular forcing. We prove global well-posedness of the regularized problems, and show that as the regularization parameter diminishes, the string dynamics in the regularized problems converge to that in the un-regularized problem under certain assumptions. Viewing the latter as a benchmark, we derive error estimates under various norms for the string dynamics. Our rigorous analysis shows that the regularized problems achieve improved accuracy if the regularized delta-function is suitably chosen. This may imply potential improvement in the numerical immersed boundary method, which is worth further investigation. This is joint work with Fanghua Lin.
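For concreteness, a standard regularized delta function of the kind used in the numerical immersed boundary method is Peskin's cosine kernel, shown below as an illustration; the error estimates in the talk concern how such choices of mollifier affect accuracy.

```python
import numpy as np

def peskin_delta(r, h):
    """Peskin's cosine-regularized delta with support width 4h."""
    r = np.abs(r) / h
    return np.where(r < 2.0, (1.0 + np.cos(np.pi * r / 2.0)) / (4.0 * h), 0.0)

h = 0.1
x = np.arange(-40, 41) * (h / 4)       # grid finer than the kernel support
w = peskin_delta(x, h) * (h / 4)       # discrete integral of the kernel
print(w.sum())                          # ≈ 1: the kernel integrates to one
```

The unit-sum (and symmetry) conditions are what make spreading singular forcing onto the grid consistent, and sharper moment conditions on the kernel are the route to the improved accuracy mentioned above.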