Past Probability Seminars Fall 2007
UW Math Probability Seminar Fall 2007
Thursdays in 901 Van Vleck Hall at 2:25 PM, unless otherwise noted.
Organized by Tom Kurtz
Schedule and Abstracts
|| Thursday, September 6 || || *David Anderson*, University of Wisconsin - Madison || || *Simulation methods for discrete stochastic chemical systems arising from a random time change representation* slides ||
Chemical reaction systems with a low to moderate number of molecules are typically modeled as discrete jump Markov processes. I will demonstrate how representing the initiation times of the reactions as the firing times of independent, unit-rate Poisson processes with internal times given by integrated propensity functions leads naturally to both exact and approximate simulation methods that are more efficient than existing methods.
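The random time change representation described above can be turned directly into an exact simulation scheme: each reaction fires when a unit-rate Poisson process, run on its internal clock of integrated propensity, reaches its next jump. The sketch below illustrates this idea in the spirit of the talk; the argument names (`rates`, `stoich`) and the overall interface are illustrative assumptions, not the speaker's implementation.

```python
import math
import random

def next_reaction(x, rates, stoich, t_end, seed=0):
    """Exact simulation of a jump Markov chemical system via the random
    time change representation: reaction k fires when a unit-rate Poisson
    process, evaluated at the internal time T_k(t) = integral of a_k(X(s)),
    reaches its next jump.  `rates` is a list of propensity functions of the
    state, `stoich` the corresponding state-change vectors (both names are
    illustrative)."""
    rng = random.Random(seed)
    m = len(stoich)
    t = 0.0
    T = [0.0] * m                              # internal times T_k
    P = [rng.expovariate(1.0) for _ in range(m)]  # next unit-Poisson jumps
    while t < t_end:
        a = [r(x) for r in rates]              # current propensities
        # real time until each internal clock hits its next Poisson jump
        dts = [(P[k] - T[k]) / a[k] if a[k] > 0 else math.inf
               for k in range(m)]
        k_min = min(range(m), key=dts.__getitem__)
        dt = dts[k_min]
        if not math.isfinite(dt):
            break                              # no reaction can fire
        t += dt
        if t > t_end:
            break
        T = [T[k] + a[k] * dt for k in range(m)]   # advance internal clocks
        P[k_min] += rng.expovariate(1.0)           # next jump for k_min
        x = [x[i] + stoich[k_min][i] for i in range(len(x))]
    return x
```

For example, a pure death process started at five molecules, `next_reaction([5], [lambda s: 1.0 * s[0]], [[-1]], 1000.0)`, is absorbed at zero.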
Measure-valued branching diffusions can be represented in terms of the Cox measures of particle systems that are conditionally Poisson at each time _t_. The representations are useful for characterizing the processes, establishing limit theorems, and analyzing the behavior of the measure-valued processes. Examples to be discussed include models that arise as limits of branching processes in random environments and multitype branching processes.
|| Thursday, September 20 || || No Seminar || || ||
This talk is a survey of some recent fluctuation results for three models: ballistic random walk in random environment, the random average process, and the asymmetric simple exclusion process. We will see fluctuations of different magnitudes, expressed as powers of the ratio between macroscopic and microscopic space and time scales.
|| Thursday, October 4 || || *Sigurd Angenent*, University of Wisconsin-Madison || || *A PDE and a stochastic model for polarization in yeast cells* ||
Somewhere in the division cycle for a yeast cell certain molecules need to form a localized concentration on the membrane of the cell. In joint work with Steve Altschuler and Lani Wu (UT Dallas) we came up with various models for this process, some deterministic and others stochastic. In the talk I will describe these models and explain heuristically why some of the models are much better than the others.
|| Thursday, October 11 || || *Dan Gillespie*, Dan T. Gillespie Consulting || || *Stochastic chemical kinetics* slides ||
The time evolution of a well-stirred chemically reacting system is traditionally modeled by a set of coupled ordinary differential equations called the reaction rate equation (RRE). The resulting picture of continuous deterministic evolution is, however, valid only for infinitely large systems. That condition is usually well approximated in test tube systems. But in biological systems formed by single living cells, the small population numbers of some reactant species can result in dynamical behavior that is noticeably discrete rather than continuous, and stochastic rather than deterministic. In that case, a more physically accurate mathematical modeling is obtained by using the machinery of Markov process theory, specifically, the chemical master equation (CME) and the stochastic simulation algorithm (SSA). After reviewing the theoretical foundations of stochastic chemical kinetics, we will describe a way to approximate the SSA by a faster simulation procedure, and then show how this way also provides a logical bridge between the CME/SSA description and the RRE description.
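The SSA mentioned in the abstract has a well-known "direct method" form: draw an exponential waiting time with rate equal to the total propensity, then pick which reaction fired with probability proportional to its propensity. The sketch below is a minimal illustration of that classical scheme, not the speaker's code; the `rates`/`stoich` interface is an assumption for the example.

```python
import random

def ssa_direct(x, rates, stoich, t_end, seed=1):
    """Gillespie's direct-method SSA (sketch): the waiting time to the next
    reaction is exponential with rate a0 = sum of propensities, and reaction
    k fires with probability a_k / a0.  `rates` holds propensity functions
    of the state, `stoich` the state-change vectors (illustrative names)."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        a = [r(x) for r in rates]
        a0 = sum(a)
        if a0 == 0:
            break                         # system is absorbed
        t += rng.expovariate(a0)          # exponential waiting time
        if t > t_end:
            break
        u, k, acc = rng.random() * a0, 0, a[0]
        while acc < u:                    # choose reaction k w.p. a_k / a0
            k += 1
            acc += a[k]
        x = [x[i] + stoich[k][i] for i in range(len(x))]
    return x
```

As a sanity check, a pure death process with propensity proportional to the population is eventually absorbed at zero.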
|| Thursday, October 18 || || No Seminar || || *Midwest Probability Colloquium* ||
|| Thursday, October 25 || || No Seminar || || ||
|| Thursday, November 1 || || *Nancy Garcia*, Universidade Estadual de Campinas || || *Consistent estimation of unbounded probabilistic suffix trees* ||
Stochastic chains with variable length memory define an interesting family of stochastic chains of infinite order on a finite alphabet. The idea is that for each past, only a finite suffix of the past, called "context", is enough to predict the next symbol. The set of contexts can be represented by a rooted tree with finite labeled branches. The law of the chain is characterized by its tree of contexts and by an associated family of transition probabilities indexed by the tree.
These models were first introduced in the information theory literature by Rissanen (1983) as a universal tool to perform data compression. Recently, they have been used to model scientific data in areas as different as biology, linguistics and music. Originally called "finite memory source" or "tree machines", these models became quite popular in the statistics literature under the name of "Variable Length Markov Chains" coined by Buhlmann and Wyner (1999).
In this talk I will present some of the basic ideas, problems and examples of applications in the field. I will focus on the rate of convergence of a modified version of the algorithm Context, which estimates the tree of contexts and the associated family of transition probabilities defining the chain.
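To make the notion of a context tree concrete, the sketch below encodes a toy tree as a map from context strings to next-symbol distributions and looks up the relevant context as the longest stored suffix of the past. This representation and the function name are illustrative assumptions for exposition; the talk's estimator (the algorithm Context) is not reproduced here.

```python
def next_symbol_dist(tree, past):
    """For a chain with variable length memory, the next-symbol distribution
    depends on the past only through its context: the suffix of the past
    stored in the tree.  Here we try suffixes from longest to shortest and
    return the first match.  `tree` maps context strings to distributions
    (an illustrative encoding, not the talk's data structure)."""
    for i in range(len(past) + 1):
        suffix = past[i:]
        if suffix in tree:
            return suffix, tree[suffix]
    raise KeyError("no context matches; a complete tree covers every past")

# Toy context tree on the alphabet {0, 1}: after a 1, one symbol of memory
# suffices; after a 0, the chain looks one step further back.
toy = {"1":  {"0": 0.3, "1": 0.7},
       "00": {"0": 0.9, "1": 0.1},
       "10": {"0": 0.5, "1": 0.5}}
```

For the past "0110" the matching context is "10", while for "0011" it is simply "1".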
|| Thursday, November 8 || || *Benedek Valko*, University of Toronto || || *Scaling limits of random matrices* ||
We derive the point process limits of random eigenvalues in the bulk of the spectrum for the general beta-ensembles of random matrix theory. The limit is described by a one-parameter family of stochastic differential equations, the stochastic sine equation. We also show that this limit may be expressed as a simple functional of hyperbolic Brownian motion. Together with recent work of Ramirez, Rider and Virag on the edge of the spectrum, this gives a complete treatment of possible point process limits for the eigenvalues of the beta-ensembles.
|| Thursday, November 15 || || *Ton Dieker*, IBM TJ Watson Research Center || || *Determinantal transition kernels for some interacting particles on the line* ||
Our goal is to study the n-step transition kernels of four Markovian interacting particle systems on the line. A key role in the analysis is played by related kernels arising from the (multidimensional) reflection principle. The latter kernels are determinantal and known to be linked to random-matrix theory. In the talk I will address how and why the particle-system kernels inherit this determinantal structure.
(Based on joint work with Jon Warren, University of Warwick, UK)
|| Thursday, November 22 || || Thanksgiving || || No Seminar ||
|| Thursday, November 29, *4:00 PM (NOTE change of time)*, *B223 Van Vleck (NOTE change of location)* || || *Soumik Pal*, Cornell University || || *Brownian motions interacting through ranks and a phase transition phenomenon* ||
Consider n positive diffusions whose logarithms are Brownian motions, with a drift vector that at every time point is determined by the order in which the coordinates are arranged as a decreasing sequence. These processes appear naturally in a variety of areas ranging from queueing theory and statistical physics to economic modeling.
For finite n, the invariant distribution of the vector of spacings between the Brownian particles can be completely described. The interest is to describe a limiting invariant distribution when n is large. We show, as n grows to infinity, a curious phenomenon occurs for the rescaled positive diffusions divided by the sum of their coordinate values. Under very weak conditions, one of three things can happen to the scaled values: either they all go to zero, or the maximum grows to one while the rest go to zero, or they stabilize and converge in law to a Poisson-Dirichlet point process. The proof borrows ideas from Talagrand's analysis of Derrida's Random Energy Model of spin glasses.
The other alternative is to start with a countable collection of diffusions. We consider one such model and discuss the similarities and differences with the previous limit. This countable model is related to the Harris model of elastic collision and the discrete Ruzmaikina-Aizenmann model for competing particles.
This is based on separate joint works with Sourav Chatterjee and Jim Pitman.
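A simple way to get a feel for diffusions interacting through ranks is a Euler discretization in which, at each step, the particle currently in rank j receives drift `drifts[j]`. The sketch below is only an illustrative toy under that assumption; the talk's scaling limits and invariant-distribution results are not attempted here, and all parameter names are hypothetical.

```python
import math
import random

def simulate_ranked_bm(x0, drifts, t_end, dt=1e-3, seed=2):
    """Euler scheme (sketch) for Brownian motions interacting through ranks:
    at each time step, particles are sorted in decreasing order and the
    rank-j particle gets drift drifts[j]; every particle receives an
    independent Gaussian increment of variance dt."""
    rng = random.Random(seed)
    x = list(x0)
    n, steps = len(x), int(t_end / dt)
    sq = math.sqrt(dt)
    for _ in range(steps):
        # order[j] = index of the particle currently in rank j (decreasing)
        order = sorted(range(n), key=lambda i: -x[i])
        for j, i in enumerate(order):
            x[i] += drifts[j] * dt + sq * rng.gauss(0.0, 1.0)
    return x
```

With drifts pulling the leader down and the laggard up (e.g. `drifts = [-1.0, 1.0]`), the spacings between particles tend to stabilize, in the spirit of the finite-n invariant distributions mentioned above.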
Counting the number of distinct elements in a multi-set of objects (say a sequence of m numbers between 1 and n that can have repetitions) may seem like a trivial task. But if we want to accomplish this task using limited memory and with a single pass over the input (a.k.a. the streaming model), this becomes a hard problem and we have to settle for algorithms that produce approximate estimates. The analysis of the bias and error of the estimates produced by these algorithms often relies extensively on techniques from probability theory.
In this talk I will summarize various solutions to this problem that have been proposed in the last few decades. I will also present lower bounds on the amount of memory required by any possible solution to keep the error in the estimate under a given threshold. Finally I will present a new approach that outperforms existing solutions and more importantly could be the first one to come within a constant factor of the bounds that apply to all solutions to this problem. Along the way I will highlight some open questions where a better analysis of the behavior of various estimators can lead to improved solutions to this important problem.
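One classical family of solutions from the literature surveyed above hashes each item to a uniform value in (0, 1) and keeps only the k smallest distinct hashes; the k-th smallest then yields an estimate of the number of distinct items. The sketch below illustrates this k-minimum-values idea; it is not the speaker's new algorithm, and the function name and parameters are assumptions for the example.

```python
import hashlib
import heapq

def kmv_distinct(stream, k=64):
    """k-minimum-values distinct-count sketch (illustrative): hash each item
    to a uniform value u in (0, 1), keep the k smallest distinct hashes, and
    estimate the number of distinct items as (k - 1) / u_(k), where u_(k) is
    the k-th smallest hash.  Memory is O(k), one pass over the input."""
    heap = []            # max-heap (stored negated) of the k smallest hashes
    members = set()      # the same hashes, for O(1) duplicate checks
    for item in stream:
        h = int.from_bytes(hashlib.sha1(repr(item).encode()).digest()[:8],
                           "big")
        u = (h + 1) / 2.0 ** 64          # deterministic hash mapped to (0, 1)
        if u in members:
            continue                     # duplicate item: no new information
        if len(heap) < k:
            heapq.heappush(heap, -u)
            members.add(u)
        elif u < -heap[0]:               # smaller than current k-th smallest
            members.discard(-heapq.heappushpop(heap, -u))
            members.add(u)
    if len(heap) < k:
        return len(heap)                 # fewer than k distinct items: exact
    return int((k - 1) / -heap[0])
```

The relative error decays like 1/sqrt(k), so the sketch trades memory for accuracy, which is exactly the tension the lower bounds in the talk make precise.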
|| Thursday, December 13 || || *Jesus Rodriguez*, Rutgers University || || *Pricing issues in the energy markets* ||
Energy markets around the world have undergone rapid deregulation in the past decade and the trend appears to be continuing. This deregulation has naturally led to increased levels of volatility in the price of electricity, and hence a need to reduce exposure to risk for participants in the market. To do this, we need to know how to price derivatives in the electricity market. We will discuss, in a probabilistic framework, the issues in this market, and show how existing ideas can be used to deal with some of the more complicated issues. Special attention will be given to the "swing" option, which is particular to energy markets.