
ACMS Abstracts: Fall 2018

Ting Zhou (Northeastern University)

Nonparaxial near-nondiffracting accelerating optical beams

We show that new families of accelerating and almost nondiffracting beams (solutions) for Maxwell’s equations can be constructed. These are complex geometrical optics (CGO) solutions to Maxwell’s equations with nonlinear limiting Carleman weights. They have the form of wave packets that propagate along circular trajectories while almost preserving a transverse intensity profile. We also show similar waves constructed using the approach combining CGO solutions and the Kelvin transform.
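
For readers less familiar with the terminology, CGO solutions associated with a (limiting) Carleman weight $\varphi$ are typically sought in the generic form

$$ u(x;\tau) = e^{\tau(\varphi(x) + i\psi(x))}\,\big(a(x) + r(x;\tau)\big), $$

where $\psi$ is chosen so that $\varphi + i\psi$ satisfies an eikonal-type equation, the amplitude $a$ solves a transport equation, and the remainder $r$ becomes negligible as the large parameter $\tau \to \infty$. This is only the standard ansatz; the nonlinear limiting Carleman weights and the Maxwell-system details used in the talk are not reproduced here.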


Daniel Sanz-Alonso (University of Chicago)

Discrete and Continuous Learning in Information and Geophysical Sciences

The formulation of Bayesian inverse problems in function space has led to new theoretical and computational developments, providing an improved understanding of regularization techniques and suggesting new scalable algorithms. The approach has found numerous applications throughout the geophysical and medical sciences, where interest often lies in recovering an unknown field defined on a physical domain. Learning problems in the information sciences, in contrast, typically seek to recover functions defined on discrete point clouds. My talk will have two parts. In the first, I will prove that in a certain large-data limit, discrete learning problems converge to a continuous one, thus allowing scalable Markov chain Monte Carlo methodology developed in the geophysical sciences to be transferred to novel applications in the information sciences. In the second part, I will introduce a fully Bayesian, data-driven methodology to discretize complex forward models with the specific goal of solving inverse problems. This methodology has the potential of producing cheap surrogates that still allow for satisfactory input reconstruction.

Nan Chen (University of Wisconsin-Madison)

A simple stochastic model for El Nino with westerly wind bursts and the prediction of super El Nino events

Atmospheric wind bursts in the tropics play a key role in the dynamics of the El Nino Southern Oscillation (ENSO). A simple modeling framework is proposed that summarizes this relationship and captures major features of the observational record while remaining physically consistent and amenable to detailed analysis. Within this simple framework, wind burst activity evolves according to a stochastic two-state Markov switching–diffusion process that depends on the strength of the western Pacific warm pool, and is coupled to simple ocean–atmosphere processes that are otherwise deterministic, stable, and linear. A simple model with this parameterization and no additional nonlinearities reproduces a realistic ENSO cycle with intermittent El Nino and La Nina events of varying intensity and strength as well as realistic buildup and shutdown of wind burst activity in the western Pacific. The wind burst activity has a direct causal effect on the ENSO variability: in particular, it intermittently triggers regular El Nino or La Nina events, super El Nino events, or no events at all, which enables the model to capture observed ENSO statistics such as the probability density function and power spectrum of eastern Pacific sea surface temperatures. The present framework is then applied to understand the mechanism of different super El Ninos. In particular, the framework is used to simulate and analyze the two famous super El Nino events in 1997-1998 and 2014-2016, with the conclusion that the delayed super El Nino events in 2014-2016 are not necessarily unusual in the tropical Pacific despite not appearing in the recent observational record and could reoccur in the future.
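
As a purely illustrative toy (not the model of the talk), the following sketch simulates a wind-burst amplitude whose noise level switches between a quiescent and an active state according to a two-state Markov chain, coupled to a linearly damped SST-like variable; all names and parameter values are made up for illustration.

<syntaxhighlight lang="python">
import numpy as np

# Toy two-state Markov switching-diffusion (illustrative only; not the model
# from the talk).  State 0 = quiescent, state 1 = active wind bursts.
rng = np.random.default_rng(0)
dt, n_steps = 0.01, 50_000
switch_rate = np.array([0.5, 1.0])   # rate of leaving state 0, state 1
sigma_wind = np.array([0.2, 2.0])    # wind-burst noise level in each state
d_wind, d_sst, coupling = 2.0, 0.5, 1.0

state, wind, sst = 0, 0.0, 0.0
sst_path = np.empty(n_steps)
for k in range(n_steps):
    # Markov switching: leave the current state with probability rate * dt.
    if rng.random() < switch_rate[state] * dt:
        state = 1 - state
    # Ornstein-Uhlenbeck-type wind bursts with state-dependent noise level.
    wind += -d_wind * wind * dt + sigma_wind[state] * np.sqrt(dt) * rng.standard_normal()
    # Linearly damped SST-like variable forced by the wind bursts.
    sst += (-d_sst * sst + coupling * wind) * dt
    sst_path[k] = sst

print("SST std:", sst_path.std(),
      "skewness:", ((sst_path - sst_path.mean())**3).mean() / sst_path.std()**3)
</syntaxhighlight>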

Sulian Thual (Fudan University)

A Stochastic Skeleton Model for the Madden-Julian Oscillation and El Nino-Southern Oscillation

A broad range of random atmospheric disturbances in the tropics may be considered as possible triggers of the El Niño Southern Oscillation (ENSO), such as westerly wind bursts, easterly wind bursts, and the convective envelope of the Madden-Julian Oscillation (MJO). Here a simple dynamical stochastic model for the tropical ocean-atmosphere is proposed that captures those processes as well as their multiscale interactions. Realistic features captured for the first time within a single framework include the MJO wavenumber-frequency power spectrum, eastward propagation, structure, and confinement to the warm pool region (and similarly for atmospheric Kelvin and Rossby equatorial waves), in addition to ENSO intermittency, the power spectrum, and the non-Gaussian statistics of sea surface temperatures, among others.

Importantly, intraseasonal atmospheric disturbances such as the MJO are solved dynamically here, which makes explicit both their upscale contribution to the interannual flow and their modulation in return. First, the background red-noise spectrum of atmospheric disturbances, rather than their individual characteristics, is shown to be most important for triggering ENSO events. Second, the onset, strength, and demise of El Niño events are linked to the increase and eastward expansion of atmospheric disturbances east of the warm pool region. The present framework serves as a prototype for general circulation models that solve similar dynamical interactions on several spatial and temporal scales.

Matthew Thorpe (Cambridge University)

Continuum Limits of Semi-Supervised Learning on Graphs

Given a data set $\{x_i\}_{i=1}^n$ with labels $\{y_i\}_{i=1}^N$ on the first $N$ data points, the goal of semi-supervised learning is to infer labels on the remaining $\{x_i\}_{i=N+1}^n$ data points. In this talk we use a random geometric graph model with connection radius $r(n)$. The framework is to consider objective functions that reward the regularity of the estimator function and impose or reward agreement with the training data; more specifically, we will consider discrete p-Laplacian and fractional Laplacian regularization.

The talk concerns the asymptotic behaviour in the limit where the number of unlabelled points increases while the number of training points remains fixed. The results uncover a delicate interplay between the regularizing nature of the functionals considered and the nonlocality inherent to the graph constructions. I will give almost optimal ranges on the scaling of $r(n)$ for asymptotic consistency to hold. Furthermore, I will set up the Bayesian interpretation of this problem.

This is joint work with Matt Dunlop (Caltech), Dejan Slepcev (CMU) and Andrew Stuart (Caltech).
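
As a concrete illustration of the setup in the first paragraph (the $p = 2$ case only), here is a minimal sketch of graph-Laplacian semi-supervised learning on a random geometric graph; the construction, thresholds, and parameters are illustrative rather than those of the talk.

<syntaxhighlight lang="python">
import numpy as np

# Semi-supervised learning with graph Laplacian (p = 2) regularization on a
# random geometric graph.  Illustrative sketch only; parameters are arbitrary.
rng = np.random.default_rng(1)
n, N, r = 400, 20, 0.2                  # total points, labelled points, radius
x = rng.random((n, 2))                  # data on the unit square
y = (x[:N, 0] > 0.5).astype(float)      # labels on the first N points

# Weight matrix of the random geometric graph: connect points within radius r.
dist = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
W = ((dist < r) & (dist > 0)).astype(float)
L = np.diag(W.sum(axis=1)) - W          # unnormalised graph Laplacian

# Minimise the Dirichlet energy u^T L u subject to u agreeing with the labels:
# harmonic extension, i.e. solve the linear system on the unlabelled block.
L_uu = L[N:, N:]
L_ul = L[N:, :N]
u_unlabelled = np.linalg.solve(L_uu, -L_ul @ y)

pred = (u_unlabelled > 0.5).astype(float)
truth = (x[N:, 0] > 0.5).astype(float)
print("accuracy on unlabelled points:", (pred == truth).mean())
</syntaxhighlight>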

Fei Lu (Johns Hopkins University)

Data-informed stochastic model reduction for complex dynamical systems

The need to develop reduced nonlinear statistical-dynamical models from time series of partial observations of complex systems arises in many applications such as geophysics, biology and engineering. The challenges come mainly from memory effects due to the nonlinear interactions between resolved and unresolved scales, and from the difficulty in inference from discrete data.

To address these challenges, we introduce a discrete-time stochastic parametrization framework in which we infer nonlinear autoregression moving average (NARMA) type models to take the memory effects into account. We show by examples that NARMA-type stochastic reduced models can capture the key statistical and dynamical properties and can therefore improve the performance of ensemble prediction in data assimilation. The examples include the Lorenz 96 system (a simplified model of the global atmosphere) and the Kuramoto-Sivashinsky equation of spatiotemporally chaotic dynamics. Applications of this inference approach to model reduction for stochastic Burgers equations will be discussed.
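
To give a feel for the inference step, the following is a minimal sketch (not the authors' implementation) of fitting the nonlinear autoregressive part of a NARMA-type model to a scalar time series by least squares; the surrogate data and the dictionary of basis terms are purely illustrative.

<syntaxhighlight lang="python">
import numpy as np

# Minimal sketch of the idea behind NARMA-type reduced models (not the
# authors' implementation): fit the nonlinear autoregressive part of
#   z_{k+1} = z_k + Phi(z_k, z_{k-1}) + noise/moving-average terms
# by least squares, using a small illustrative dictionary of basis terms.
rng = np.random.default_rng(2)

# Surrogate "observed" time series (stands in for a resolved variable).
T = 5000
z = np.empty(T)
z[0] = 0.1
for k in range(T - 1):
    z[k + 1] = z[k] + 0.05 * (z[k] - z[k]**3) + 0.02 * rng.standard_normal()

# Dictionary of candidate terms Phi(z_k, z_{k-1}).
def dictionary(zk, zkm1):
    return np.column_stack([zk, zk**3, zkm1, np.ones_like(zk)])

X = dictionary(z[1:-1], z[:-2])
target = z[2:] - z[1:-1]                 # one-step increments
coef, *_ = np.linalg.lstsq(X, target, rcond=None)
residual = target - X @ coef             # feeds the moving-average/noise part
print("fitted coefficients:", np.round(coef, 3))
print("residual std (noise level still to be modelled):", residual.std())
</syntaxhighlight>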


Matthew Dixon (Illinois Institute of Technology)

"Quantum Equilibrium-Disequilibrium”: Asset Price Dynamics, Symmetry Breaking and Defaults as Dissipative Instantons

We propose a simple non-equilibrium model of a financial market as an open system with a possible exchange of money with the outside world and market frictions (trade impacts) incorporated into asset price dynamics via a feedback mechanism. Using a linear market impact model, this produces a non-linear two-parameter extension of the classical Geometric Brownian Motion (GBM) model, which we call the "Quantum Equilibrium-Disequilibrium" model. Our model gives rise to non-linear mean-reverting dynamics, broken scale invariance, and corporate defaults. In the simplest one-stock (1D) formulation, our parsimonious model has only one degree of freedom, yet calibrates to both equity returns and credit default swap spreads. Defaults and market crashes are associated with dissipative tunneling events, and correspond to instanton (saddle-point) solutions of the model. When market frictions and inflows/outflows of money are neglected altogether, "classical" GBM scale-invariant dynamics with exponential asset growth and without defaults are formally recovered from our model. However, we argue that this is only a formal mathematical limit, and in reality the GBM limit is non-analytic due to non-linear effects that produce both defaults and divergence of the perturbation theory in a small market-friction parameter.
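
As a generic numerical illustration of extending GBM with a nonlinear, friction-induced drift (the drift form and all parameters below are invented for illustration and are not the "Quantum Equilibrium-Disequilibrium" model itself):

<syntaxhighlight lang="python">
import numpy as np

# Euler-Maruyama simulation of a GBM-like price with an extra nonlinear drift
# term standing in for market-friction feedback.  The drift form and
# parameters are illustrative only, not the QED model of the talk.
rng = np.random.default_rng(3)
mu, sigma = 0.05, 0.2          # classical GBM drift and volatility
kappa = 0.5                    # strength of the nonlinear friction feedback
s_ref = 1.0                    # reference price level for the feedback
dt, n_steps, n_paths = 1e-3, 5000, 1000

s = np.full(n_paths, 1.0)
defaulted = np.zeros(n_paths, dtype=bool)
for _ in range(n_steps):
    drift = mu * s - kappa * s * (s - s_ref)      # nonlinear mean reversion
    s = s + drift * dt + sigma * s * np.sqrt(dt) * rng.standard_normal(n_paths)
    defaulted |= (s <= 0.0)                       # crude "default" proxy (rare here)
    s = np.maximum(s, 0.0)

print("mean terminal price of surviving paths:", s[~defaulted].mean())
print("fraction of 'defaulted' paths:", defaulted.mean())
</syntaxhighlight>

Note the effect of the feedback: setting kappa = 0 recovers ordinary GBM with exponential mean growth, while kappa > 0 pulls the price toward the reference level, the kind of broken scale invariance the abstract describes.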


Karl Rohe (UW-Madison, statistics)

Making Spectral Graph Theory work in practice. Making the practice work in theory

After introducing Cheeger's inequality and spectral clustering, this talk has two parts. The first part will (1) show how spectral clustering gives "bad results" in many applied settings and (2) illustrate a "hack" that makes it work very well. Most of the talk will be spent on the second part, which presents a simple theory giving a deeper understanding of where the bad results come from and why the hack works so well.

There are four pieces to this simple theory. First, sparse and stochastic graphs create a lot of small trees that are connected to the core of the graph by only one edge. Second, graph conductance is sensitive to these noisy "dangling sets." Third, by Cheeger's inequality and an inequality of Ky Fan, spectral clustering inherits this sensitivity. These three pieces explain why spectral clustering gives bad results in practice. The fourth piece uses Cheeger's inequality to show how the hack creates a new form of graph conductance that we call CoreCut. Simple inspection of CoreCut reveals why it is less sensitive to small cuts in the graph. In addition to this statistical benefit, these results also demonstrate why the hack improves the computational speed of spectral clustering.
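
The abstract does not spell out the hack; purely for illustration, the sketch below assumes a degree-regularization modification (adding a constant $\tau$ to every node's degree, equivalently adding $\tau/n$ to every entry of the adjacency matrix, before forming the normalized matrix), which is one widely used adjustment of this kind.

<syntaxhighlight lang="python">
import numpy as np

# Sketch of spectral bipartitioning with and without degree regularization on
# a toy two-block stochastic graph.  The regularization here is assumed for
# illustration only; it is not necessarily the talk's "hack".
rng = np.random.default_rng(4)

n = 200
blocks = np.repeat([0, 1], n // 2)
p_in, p_out = 0.08, 0.01
prob = np.where(blocks[:, None] == blocks[None, :], p_in, p_out)
A = (rng.random((n, n)) < prob).astype(float)
A = np.triu(A, 1); A = A + A.T                     # simple undirected graph

def spectral_partition(A, tau=0.0):
    """Sign split on the second eigenvector of the (regularized) normalised adjacency."""
    d = A.sum(axis=1) + tau
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    M = D_inv_sqrt @ (A + tau / len(A)) @ D_inv_sqrt   # A + (tau/n) * all-ones matrix
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, -2] > 0

plain = spectral_partition(A, tau=0.0)
regularized = spectral_partition(A, tau=A.sum() / n)   # tau = average degree (a common choice)

for name, labels in [("plain", plain), ("regularized", regularized)]:
    agree = max((labels == (blocks == 1)).mean(), (labels == (blocks == 0)).mean())
    print(name, "agreement with planted blocks:", round(agree, 3))
</syntaxhighlight>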


Yimin Zhong (University of California, Irvine)

Instability of an inverse problem for the stationary radiative transport near the diffusion limit

In this work, we study the instability of an inverse problem for the radiative transport equation with angularly averaged measurements near the diffusion limit, i.e., when the normalized mean free path (the Knudsen number) satisfies $0 < \varepsilon \ll 1$. It is well known that there is a transition of stability from Hölder type to logarithmic type as $\varepsilon \to 0$; a theory of this transition is still an open problem. In this study, we show the transition of stability by establishing the balance of two different regimes depending on the relative sizes of $\varepsilon$ and the perturbation in the measurements. When $\varepsilon$ is sufficiently small, we obtain exponential instability, which corresponds to the diffusive regime; otherwise we obtain Hölder instability, which corresponds to the transport regime.
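
For context, a generic form of the scaling behind "near the diffusion limit" (the precise equation and measurement operator of the talk may differ) is

$$ v\cdot\nabla_x u_\varepsilon + \frac{\sigma_s}{\varepsilon}\big(u_\varepsilon - \langle u_\varepsilon\rangle\big) + \varepsilon\,\sigma_a\, u_\varepsilon = \varepsilon f, \qquad \langle u\rangle(x) := \frac{1}{|S^{d-1}|}\int_{S^{d-1}} u(x,v')\,dv', $$

and as $\varepsilon \to 0$ the angular average formally converges to the solution of a diffusion equation $-\nabla\cdot(D\nabla u_0) + \sigma_a u_0 = f$ with diffusion coefficient $D$ of order $1/\sigma_s$; this is the regime in which only logarithmic-type stability is expected for the inverse problem.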