Meetings: TuTh 9:30-10:45, Van Vleck B115
Instructor: Timo Seppäläinen
Office: 425 Van Vleck. Office hours: after class or by appointment.
E-mail: seppalai and then at math dot wisc dot edu
This is the course homepage. Part of this information is repeated in the course syllabus that you will find on Canvas. Here you will find our weekly schedule and updates on scheduling matters. The Mathematics Department also has a general information page on this course. Deadlines can be found on the Registrar's page.
Math 531 is a mathematically rigorous introduction to probability theory at an advanced undergraduate level. Probability theory is the part of mathematics that studies random phenomena. From a broad intellectual perspective, probability is one of the core areas of mathematics with its own distinct style of reasoning. Among the other core areas are analysis, algebra, geometry/topology, logic and computation.
Probability theory is ubiquitous in natural science, social science and engineering, so this course can be valuable in conjunction with many different majors. 531 is not a course in statistics. Statistics is a discipline mainly concerned with analyzing and representing data. Probability theory forms the mathematical foundation of statistics, but the two disciplines are separate.
Math 531 gives an introduction to the basics (Kolmogorov axioms, conditional probability and independence, random variables, expectation) and goes over some classical parts of probability theory with proofs, such as the weak and strong laws of large numbers, the de Moivre-Laplace central limit theorem, the study of simple random walk, and applications of generating functions. Math 531 serves both as a stand-alone undergraduate introduction to probability theory and as a sequel to Math/Stat 431 for students who wish to learn the 431 material at a deeper level and tackle some additional topics.
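As a small taste of the limit theorems mentioned above, the following simulation (our own illustration, not part of the course materials; the function name is ours) shows the sample average of fair coin flips settling near 1/2, as the law of large numbers predicts:

```python
import random

def coin_flip_average(n, seed=0):
    """Average of n fair coin flips (1 = heads, 0 = tails)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return sum(rng.random() < 0.5 for _ in range(n)) / n

# The law of large numbers says this average approaches 1/2 as n grows.
for n in [10, 1000, 100000]:
    print(n, coin_flip_average(n))
```

The course proves why this convergence happens, in both the weak (in probability) and strong (almost sure) senses.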
After 531 the path forward in probability theory goes as follows. At the undergraduate level there are two courses on stochastic processes: 632 Introduction to Stochastic Processes and 635 Introduction to Brownian Motion and Stochastic Calculus. Another alternative is to take 629 Measure Theory or 721 Real Analysis I as preparation for graduate probability Math/Stat 733-734.
The great majority of the probability topics covered by 431 and 531 are the same. In 531 we gain a deeper understanding of the limit theorems (law of large numbers and central limit theorem) of probability. Math 431 is an intermediate course. It is more challenging than the recipe-oriented standard calculus and linear algebra courses, but it is not as demanding as rigorous 500 level math courses. Math 431 concentrates on calculations with examples. Examples are important in 531 also, but much class time is spent on developing theory and many examples are left to the students. In 531 homework and exams are a mixture of examples and proofs.
Recommendations. (i) If you enjoy proofs and are eager to work harder for a deeper introduction to probability, then 531 is your course. Otherwise take 431 for your introduction to probability. (ii) If you have already had analysis and 431 and wish to move ahead to new topics in probability, look at 632 and 635 for stochastic processes, and possibly at 629 as preparation for graduate probability. On the other hand, if you are looking to repeat an undergraduate introduction to probability, this time with more mathematical depth, then 531 is right for you.
Students who would benefit from reading a gentle introduction to probability on the side can consider acquiring the textbook for Math 431:
Anderson-Seppäläinen-Valkó: Introduction to Probability, Cambridge University Press, 2017.
The following is an example of a textbook that is pitched more or less at the right level for 531:
Grimmett-Stirzaker: Probability and Random Processes, Oxford University Press, 3rd edition.
Grimmett-Stirzaker is a more comprehensive book. It also covers part of the material of Math 632.
Course grades will be based on homework and quizzes (20%), two midterm exams (20%+20%), and a comprehensive final exam (40%). Midterm exams will be in class on the following dates:
Here are grade lines that can be guaranteed in advance. A percentage score in the indicated range guarantees at least the letter grade next to it.
A (89,100], AB (87,89], B (76,87], BC (74,76], C (62,74], D (50,62], F [0,50].
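The guaranteed grade lines amount to a small lookup by cutoff. A minimal sketch (the function name is ours, not part of the course materials), reading the half-open intervals as written, with a score strictly above a cutoff guaranteeing at least the listed grade:

```python
def letter_grade(score):
    """Map a percentage score (0-100) to the guaranteed minimum letter grade."""
    # Cutoffs taken from the course grade lines; 50 or below is F.
    for cutoff, grade in [(89, "A"), (87, "AB"), (76, "B"),
                          (74, "BC"), (62, "C"), (50, "D")]:
        if score > cutoff:
            return grade
    return "F"
```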
Week 1
Tue: 1.3 Kolmogorov's axioms for a probability space. Examples: finitely many fair coin flips, Poisson distribution on nonnegative integers, infinite sequence of fair coin flips.
Thu: 1.3 Uniformly random point on a disk. Continuity of probability. Repeated rolls of a fair die produce a six eventually. 1.4 Conditional probability. Product rule, law of total probability. Coin example.

Week 2
Tue: 1.5 Independence. 1.6 is skipped.
Thu: 1.7 Gambler's ruin. Other examples for reading. 2.1, 2.3 Random variables, discrete and continuous, PMF, PDF and CDF. 2.2 is skipped for now. HW 1 due.

Week 3
Tue: 2.1, 2.3 Random variables, discrete and continuous, PMF, PDF and CDF. [ASV 1.5, 3.1-3.2] 2.5 Random vectors. Quiz 1.
Thu: Finish Sections 2.1, 2.3-2.5. Prove characterization of CDF. HW 2 due.

Week 4
Tue: Exam 1.
Thu: 3.1, 3.5 The most important discrete distributions, derived from independent repeated trials: Bernoulli, binomial, geometric, Poisson. [ASV 2.4, 4.4]

Week 5
Tue: 3.1, 3.5 Poisson distribution. [ASV 4.4] 4.1, 4.5 Continuous distributions: uniform, exponential, normal. [ASV 3.1, 3.5, 4.5]
Thu: 4.5 Normal distribution. [ASV 3.5] 3.2, 4.2 Independence of random variables. [ASV 2.3, 6.3] Exam 1 bonus quiz.

Week 6
Tue: Expectation of random variables. 3.3, 4.3 Special definitions for discrete and continuous random variables. Poisson and uniform examples. [ASV 3.3] Development of general definition of expectation from separate lecture notes. HW 3 due.
Thu: Continue development of general definition of expectation from separate lecture notes.

Week 7
Tue: Continue development of general definition of expectation from separate lecture notes.
Thu: Finish development of expectation. 3.3, 4.3 Formulas for E[g(X)]. Variance. HW 4 due.

Week 8
Tue: Exam 2.
Thu: 3.4 Computing expectations with indicator random variables. [ASV 8.1] Examples and properties of variance.

Week 9
Tue: Independence and expectation. [ASV 8.2] 3.6, 4.5 Covariance, its properties. Variance of a sum. Examples: indicators, uniform random points in higher dimensions. [ASV 6.1, 6.2, 8.4]
Thu: 3.6, 4.5 Multinomial distribution. Correlation coefficient. Cauchy-Schwarz inequality. [ASV 6.1, 8.4] HW 5 due.

Week 10
Tue: SPRING BREAK
Thu: SPRING BREAK

Week 11
Tue: 3.6, 4.5 Completion of correlation topic. [ASV 8.4] 3.8, 4.8 Sums of independent random variables. [ASV 7.1, 8.3] Gamma function and gamma distribution. [ASV p. 153, GS p. 96]
Thu: Moment generating function: identifying the distribution of a sum, calculating moments. [ASV 5.1, 7.1, 8.3]

Week 12
Tue: Separate lecture notes: Markov and Chebyshev inequalities, weak law of large numbers. Convergence in probability and almost surely. Part of this material is in ASV 9.1-9.2. GS 7.2-7.4 cover more material.
Thu: Separate lecture notes: review of limsup and liminf, convergence of random variables as an event, first Borel-Cantelli lemma. HW 6 due.

Week 13
Tue: Separate lecture notes: strong law of large numbers.
Thu: Exam 3.

Week 14
Tue: Separate lecture notes: central limit theorem for i.i.d. Bernoulli random variables. Confidence intervals.
Thu: 3.7, 4.6 Conditional distributions: conditional probability mass function, conditional density function, conditional expectation. [ASV 10]

Week 15
Tue: 3.9, 3.10 Random walk. (Also separate lecture notes.)
Thu: Further properties of random walk. 4.12 Coupling and Poisson approximation.

Week 16
Tue: General central limit theorem for IID sequences with finite variance. Sketch of proof assuming finite MGF. Comparison of normal and Poisson approximation of the binomial. Review of joint densities.
Thu: Review of convolutions. Review of types of convergence. Application of Borel-Cantelli to prove that convergence in probability implies almost sure convergence along a subsequence.
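Several schedule topics, such as gambler's ruin and simple random walk, lend themselves to quick simulation. A minimal sketch (the function name and parameters are ours, not part of the course materials) that estimates the ruin probability for a fair game and can be checked against the exact value 1 - a/N proved in Section 1.7:

```python
import random

def ruin_probability(a, N, trials=10000, seed=1):
    """Estimate P(a fair gambler starting with a dollars hits 0 before N)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    ruined = 0
    for _ in range(trials):
        x = a
        # Simple random walk: +1 or -1 with probability 1/2 each,
        # stopped when the fortune reaches 0 (ruin) or N (success).
        while 0 < x < N:
            x += 1 if rng.random() < 0.5 else -1
        ruined += (x == 0)
    return ruined / trials

# For a fair game the exact ruin probability is 1 - a/N = 0.7 here.
print(ruin_probability(3, 10))
```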