Graduate student reading seminar
(... in probability)
Email list: firstname.lastname@example.org
Tuesday 2:30pm, 901 Van Vleck
The topic this semester is large deviation theory. Send me (BV) an email if you want access to the shared Box folder with some reading material.
9/25, 10/2: Dae Han
10/9, 10/16: Kurt
10/23, 10/30: Stephen Davis
11/6, 11/13: Brandon Legried
11/20, 11/27: Shuqi Yu
12/4, 12/11: Yun Li
Tuesday 2:30pm, B135 Van Vleck
2/20, 2/27: Yun
3/6, 3/13: Greg
3/20, 4/3: Yu
4/10, 4/17: Shuqi
4/24, 5/1: Tony
Tuesday 2:30pm, 214 Ingraham Hall
9/26, 10/3: Hans
10/10, 10/17: Guo
10/24, 10/31: Chaoji
11/7, 11/14: Yun
11/21, 11/28: Kurt
12/5, 12/12: Christian
Tuesday 2:25pm, B211
1/31, 2/7: Fan
I will talk about the Hanson-Wright inequality, which is a large deviation estimate for random variables of the form X^* A X, where X is a random vector with independent subgaussian entries and A is an arbitrary deterministic matrix. In the first talk, I will present a beautiful proof given by Mark Rudelson and Roman Vershynin. In the second talk, I will discuss some applications of this inequality.
Reference: M. Rudelson and R. Vershynin, Hanson-Wright inequality and sub-gaussian concentration, Electron. Commun. Probab. 18 (2013).
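As a numerical aside (a sketch, not part of the proof), the concentration that the inequality quantifies is easy to see in simulation; the dimensions and sample sizes below are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo sketch of Hanson-Wright-type concentration: for X with
# independent standard Gaussian (hence subgaussian) entries and a fixed
# matrix A, the quadratic form X^T A X concentrates around its mean
# E[X^T A X] = tr(A), with tails controlled by ||A||_F and ||A||_op.
rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n)) / np.sqrt(n)   # a fixed deterministic matrix

X = rng.standard_normal((5000, n))             # 5000 independent copies of X
samples = np.einsum('ij,jk,ik->i', X, A, X)    # X^T A X for each row
mean_qf = samples.mean()

# Deviations at a few multiples of ||A||_F are already very rare.
frac = np.mean(np.abs(samples - np.trace(A)) > 5 * np.linalg.norm(A, 'fro'))
print(mean_qf, np.trace(A), frac)
```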
3/7, 3/14 : Jinsu
Title: Donsker's theorem and its applications. Donsker's theorem roughly says that a normalized random walk, linearly interpolated on the time interval [0,1], converges weakly to Brownian motion in C([0,1]). It is sometimes called Donsker's invariance principle or the functional central limit theorem. I will present the main ideas of the proof in the first talk and a couple of applications in the second.
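A quick numerical sketch of the theorem in action (my own illustration, with arbitrary sample sizes): a functional of the rescaled walk, its running maximum, already matches the Brownian answer given by the reflection principle.

```python
import numpy as np
from math import erf, sqrt

# Donsker sketch: the running maximum of the rescaled walk S_[nt]/sqrt(n)
# approaches the running maximum of Brownian motion on [0,1], whose law
# is given by the reflection principle:
#   P(max_{t<=1} B_t <= x) = 2*Phi(x) - 1.
rng = np.random.default_rng(1)
n, trials = 1000, 5000
steps = rng.choice([-1.0, 1.0], size=(trials, n))
walks = np.cumsum(steps, axis=1) / np.sqrt(n)
max_vals = np.maximum(walks.max(axis=1), 0.0)   # max over [0,1] includes t=0

x = 1.0
empirical = np.mean(max_vals <= x)
exact = 2 * (0.5 * (1 + erf(x / sqrt(2)))) - 1  # 2*Phi(1) - 1
print(empirical, exact)
```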
Stochastic reaction networks.
Stochastic reaction networks are continuous-time Markov chain models used primarily in biochemistry. I will define them, prove some results that connect them to related deterministic models, and introduce some open questions.
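For concreteness, here is a minimal simulation sketch (my own example, with arbitrary rate values) of the simplest such network, using the standard Gillespie algorithm for continuous-time Markov chains.

```python
import numpy as np

# Gillespie simulation sketch of the simplest reaction network
#   0 --k1--> S,   S --1--> 0   (unit per-capita degradation),
# whose stationary distribution for the count of S is Poisson(k1).
rng = np.random.default_rng(2)
k1 = 10.0
x, t, T = 0, 0.0, 5000.0
area = 0.0                      # integral of x(t) dt, for the time average
while t < T:
    total = k1 + x              # total propensity: birth k1, death x
    dt = min(rng.exponential(1.0 / total), T - t)
    area += x * dt
    t += dt
    if t >= T:
        break
    if rng.random() < k1 / total:
        x += 1                  # birth event
    else:
        x -= 1                  # death event
avg = area / T
print(avg)                      # long-run average count, close to k1
```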
10/11, 10/18: Dae Han
10/25, 11/1: Jinsu
Coupling of Markov processes.
When we have two distributions on the same probability space, we can construct a pair of random variables whose marginals are the two distributions. Such a pairing can be used to estimate the total variation distance between the distributions; this is the coupling method. I am going to introduce basic concepts, ideas, and applications of coupling for Markov processes.
Links to references
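The basic coupling inequality is easy to see numerically. Below is a toy sketch of my own (arbitrary chain and parameters): two copies of a chain driven by shared randomness coalesce, and the probability they have not yet met bounds the total variation distance.

```python
import numpy as np

# Coupling sketch on a toy chain on {0,...,9}: with probability p the
# chain jumps to a uniform random state, otherwise it holds. Running two
# copies from different states with the SAME jump draws makes them
# coalesce, and the coupling inequality gives
#   TV(law(X_t), law(Y_t)) <= P(X_t != Y_t) = (1 - p)^t.
rng = np.random.default_rng(3)
p, t_max, runs = 0.3, 10, 20000
not_coupled = np.zeros(t_max)
for _ in range(runs):
    x, y = 0, 9                            # different initial states
    for t in range(t_max):
        if rng.random() < p:
            x = y = int(rng.integers(10))  # shared jump: copies coalesce
        not_coupled[t] += (x != y)
frac = not_coupled / runs
print(frac[5], (1 - p) ** 6)               # empirical vs exact
```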
11/8, 11/15: Hans
11/22, 11/29: Keith
Surprisingly Determinantal: DPPs and some asymptotics of ASEP
I'll be reading and presenting some recent papers of Alexei Borodin and a few collaborators which have uncovered certain equivalences between determinantal point processes and non-determinantal processes.
Tuesday, 2:25pm, B321 Van Vleck
3/29, 4/5: Fan Yang
I will talk about the ergodic decomposition theorem (EDT). More specifically, given a compact metric space X and a continuous transformation T on it, the theorem shows that any T-invariant measure on X can be decomposed into a convex combination of ergodic measures. In the first talk, I will introduce the EDT and some related facts. In the second talk, I will discuss conditional measures and prove that the ergodic measures in the EDT are indeed the conditional measures.
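A tiny illustration (mine, not part of the proof) of why the decomposition is needed: for a rational rotation of the circle, Lebesgue measure is invariant but not ergodic, and time averages of invariant observables depend on the starting orbit.

```python
import numpy as np

# Ergodic decomposition illustration with the non-ergodic rotation
# T(x) = x + 1/4 mod 1: Lebesgue measure is T-invariant and decomposes
# into the ergodic uniform measures on the four-point orbits
# {x, x+1/4, x+1/2, x+3/4}. For the T-invariant observable
# f(x) = cos(8*pi*x), time averages depend on the starting point.
def time_average(f, x, n=4000):
    total = 0.0
    for _ in range(n):
        total += f(x)
        x = (x + 0.25) % 1.0
    return total / n

f = lambda x: np.cos(8 * np.pi * x)   # invariant: f(Tx) = f(x)
a0 = time_average(f, 0.0)             # orbit of 0:   average cos(0) = 1
a1 = time_average(f, 0.1)             # orbit of 0.1: average cos(0.8*pi)
print(a0, a1)  # different limits, while the space average of f is 0
```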
2/16 : Jinsu
Lyapunov function for Markov Processes.
For ODEs, we can show stability of trajectories using Lyapunov functions.
There is an analogue for Markov processes. I'd like to talk about the existence of a stationary distribution via Lyapunov functions.
In some cases, it is also possible to obtain the rate of convergence to the stationary distribution.
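A small numerical sketch of the drift idea (my own toy example, with arbitrary parameters): a chain with negative Lyapunov drift away from the origin settles into a stationary distribution whose mean we can check.

```python
import numpy as np

# Drift-condition sketch: a queue-like chain on {0,1,2,...} moving up
# with probability a and down with probability b > a (when positive).
# V(x) = x is a Lyapunov function: the drift is a - b < 0 off the
# origin, which yields a stationary distribution; for this birth-death
# chain it is geometric with ratio r = a/b, so the long-run mean is
# r/(1-r).
rng = np.random.default_rng(4)
a, b, steps = 0.3, 0.5, 200000
x, total = 0, 0
for _ in range(steps):
    u = rng.random()
    if u < a:
        x += 1                 # up-step
    elif u < a + b and x > 0:
        x -= 1                 # down-step, only when positive
    total += x
mean_x = total / steps
r = a / b
print(mean_x, r / (1 - r))     # empirical vs exact long-run mean
```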
This semester we will focus on tools and methods.
9/15, 9/22: Elnur
I will talk about large deviation theory and its applications. For the first talk, my plan is to introduce the Gärtner-Ellis theorem and show a few applications of it to finite-state discrete-time Markov chains.
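To see the kind of statement involved, here is a sketch (mine) of the simplest special case, fair coin flips, where the rate function is the Legendre transform of the log moment generating function and can be checked against the exact binomial tail.

```python
from math import comb, log

# Large deviation sketch for fair coin flips (a Cramér-type special case
# of Gartner-Ellis): P(S_n/n >= x) ≈ exp(-n I(x)) with rate function
# I(x) = x log(2x) + (1-x) log(2(1-x)), the Legendre transform of the
# log moment generating function log((1 + e^t)/2).
def rate(x):
    return x * log(2 * x) + (1 - x) * log(2 * (1 - x))

def tail(n, x):
    # exact P(S_n >= x n) for S_n ~ Binomial(n, 1/2)
    return sum(comb(n, k) for k in range(int(x * n), n + 1)) / 2 ** n

n, x = 400, 0.7
empirical_rate = -log(tail(n, x)) / n
print(empirical_rate, rate(x))  # agree to first order in n
```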
9/29, 10/6, 10/13: Dae Han
10/20, 10/27, 11/3: Jessica
I will first present an overview of concentration of measure and concentration inequalities with a focus on the connection with related topics in analysis and geometry. Then, I will present Log-Sobolev inequalities and their connection to concentration of measure.
11/10, 11/17: Hao Kai
11/24, 12/1, 12/8, 12/15: Chris
2/2, 2/9: Louis
2/16, 2/23: Jinsu
3/1, 3/8: Hans
2/3, 2/10: Scott
An Introduction to Entropy for Random Variables
In these lectures I will introduce entropy for random variables and present some simple finite state-space examples to gain some intuition. We will prove the Shannon-McMillan theorem using entropy and the law of large numbers. Then I will introduce relative entropy and prove the Markov chain convergence theorem. Finally I will define entropy for a discrete time process. The lecture notes can be found at http://www.math.wisc.edu/~shottovy/EntropyLecture.pdf.
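A minimal numerical sketch of the convergence mechanism (my own two-state example): relative entropy to the stationary distribution is nonincreasing along the chain and decays to zero.

```python
import numpy as np

# Sketch of the mechanism behind the Markov chain convergence theorem:
# relative entropy D(p || pi) is nonincreasing under a Markov transition
# with stationary distribution pi (a data-processing step), and here it
# decreases to 0. The two-state transition matrix is an arbitrary example.
def relative_entropy(p, q):
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])        # row-stochastic transition matrix
pi = np.array([2/3, 1/3])         # stationary distribution: pi @ P = pi
p = np.array([1.0, 0.0])          # start in state 0
divs = []
for _ in range(20):
    divs.append(relative_entropy(p, pi))
    p = p @ P                     # one step of the chain
print(divs[0], divs[-1])          # monotone decrease toward 0
```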
2/17, 2/24: Dae Han
3/3, 3/10: Hans
3/17, 3/24: In Gun
4/7, 4/14: Jinsu
4/21, 4/28: Chris N.
I will go over Mike Giles’ 2008 paper “Multi-level Monte Carlo path simulation.” This paper introduced a new Monte Carlo method to approximate expectations of SDEs (driven by Brownian motions) that is significantly more efficient than what was the state of the art. This work opened up a whole new field in the numerical analysis of stochastic processes, as the basic idea is quite flexible and has found a variety of applications, including SDEs driven by Brownian motions, Lévy-driven SDEs, SPDEs, and models from biology.
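Here is a sketch of the telescoping idea for a toy target (my own simplified example, not the paper's cost analysis): coupled fine and coarse Euler paths, driven by the same Brownian increments, estimate each level's correction.

```python
import numpy as np

# Multilevel sketch for the toy target E[S_T] under geometric Brownian
# motion dS = r S dt + sigma S dW: write
#   E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]
# and estimate each correction with coupled Euler paths (fine and coarse
# steps driven by the same Brownian increments). Parameters illustrative.
rng = np.random.default_rng(5)
r, sigma, T, S0 = 0.05, 0.2, 1.0, 1.0

def level_estimator(l, N):
    nf = 2 ** l                         # fine steps on level l
    dt = T / nf
    dW = rng.normal(0.0, np.sqrt(dt), size=(N, nf))
    Sf = np.full(N, S0)
    for i in range(nf):                 # fine Euler path
        Sf = Sf * (1 + r * dt + sigma * dW[:, i])
    if l == 0:
        return Sf.mean()
    Sc = np.full(N, S0)
    dWc = dW[:, 0::2] + dW[:, 1::2]     # coarse increments from fine pairs
    for i in range(nf // 2):            # coupled coarse Euler path
        Sc = Sc * (1 + r * 2 * dt + sigma * dWc[:, i])
    return (Sf - Sc).mean()

est = sum(level_estimator(l, 100000) for l in range(6))
print(est, S0 * np.exp(r * T))          # MLMC estimate vs exact e^{rT}
```

The point of the coupling is that Var(P_l - P_{l-1}) shrinks with the step size, so deep levels need very few samples.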
A very quick introduction to Stein's method.
I will give a brief introduction to Stein's method, mostly based on the first couple of sections of the following survey article:
Ross, N. (2011). Fundamentals of Stein’s method. Probability Surveys, 8, 210-293.
The following webpage has a huge collection of resources if you want to go deeper: https://sites.google.com/site/yvikswan/about-stein-s-method
Note that the Midwest Probability Colloquium (http://www.math.northwestern.edu/mwp/) will have a tutorial program on Stein's method this year.
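The identity underlying the method is easy to check numerically; here is a quadrature sketch of my own (the choice f(z) = sin z is arbitrary).

```python
import numpy as np

# Quadrature check of the identity at the heart of Stein's method:
# Z ~ N(0,1) satisfies E[f'(Z)] = E[Z f(Z)] for smooth f, and Stein's
# method bounds distances to the Gaussian by how badly a random variable
# violates this. Here f(z) = sin(z), so both sides equal E[cos Z].
z = np.linspace(-10.0, 10.0, 200001)
dz = z[1] - z[0]
phi = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density
lhs = np.sum(np.cos(z) * phi) * dz            # E[f'(Z)] = E[cos Z]
rhs = np.sum(z * np.sin(z) * phi) * dz        # E[Z f(Z)]
print(lhs, rhs)                               # both equal exp(-1/2)
```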
10/7, 10/14: Chris J. An introduction to the (local) martingale problem.
10/21, 10/28: Dae Han
11/4, 11/11: Elnur
11/18, 11/25: Chris N. Free probability with an emphasis on C*- and von Neumann algebras
12/2, 12/9: Yun Zhai
2/04, 2/11: Scott
2/18: Phil-- Examples of structure results in probability theory.
2/25, 3/4: Beth-- Derivative estimation for discrete time Markov chains
3/11, 3/25: Chris J Some classical results on stationary distributions of Markov processes
4/1, 4/8: Chris N
4/15, 4/22: Yu Sun
4/29, 5/6: Diane
9/24, 10/1: Chris A light introduction to metastability
10/8, Dae Han Majorizing multiplicative cascades for directed polymers in random media
10/15, 10/22: no reading seminar
10/29, 11/5: Elnur Limit fluctuations of last passage times
11/12: Yun Helffer-Sjöstrand representation and Brascamp-Lieb inequality for stochastic interface models
11/19, 11/26: Yu Sun
12/3, 12/10: Jason
Young diagrams, RSK correspondence, corner growth models, distribution of last passage times.
A brief introduction to enlargement of filtration and the Dufresne identity. Notes
3/13: Dae Han
An introduction to random polymers
3/20: Dae Han
Directed polymers in a random environment: path localization and strong disorder
Scale and speed for honest one-dimensional diffusions
Rogers & Williams - Diffusions, Markov Processes and Martingales
Itô & McKean - Diffusion Processes and their Sample Paths
Breiman - Probability
Introduction to stochastic interface models
Dynamics and Gaussian equilibrium systems
5/1: This reading seminar will be shifted because of a probability seminar.
5/8: Greg, Maso
The Bethe ansatz vs. the replica trick. This lecture is an overview of the two approaches.
5/15: Greg, Maso
Rigorous use of the replica trick.