Meetings: MWF 8:50-9:40, Van Vleck B139
Instructor: Timo Seppäläinen
Office: 425 Van Vleck. Office hours: Mondays and Wednesdays 11-12, other times by appointment.
E-mail: seppalai, then at math dot wisc dot edu (I can answer questions by email.)
This is the course homepage that also serves as the syllabus for the course. Here you will find our weekly schedule and updates on scheduling matters. The Mathematics Department also has a general information page on this course.
Math 431 is an introduction to probability theory, the part of mathematics that studies random phenomena. We model simple random experiments mathematically and learn techniques for studying these models. Topics covered include methods of counting (combinatorics), axioms of probability, random variables, the most important discrete and continuous probability distributions, expectations, moment generating functions, conditional probability and conditional expectations, multivariate distributions, Markov's and Chebyshev's inequalities, laws of large numbers, and the central limit theorem.
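As a small taste of the kind of question the course treats (this sketch is purely illustrative and not part of the course materials), one can compare the empirical frequency of an event with its exact probability, in the spirit of the law of large numbers:

```python
import random

# Exact probability that two fair dice sum to 7: six favorable
# outcomes (1+6, 2+5, ..., 6+1) out of 36 equally likely ones.
exact = 6 / 36

random.seed(0)  # fixed seed so the run is reproducible
trials = 100_000
hits = sum(1 for _ in range(trials)
           if random.randint(1, 6) + random.randint(1, 6) == 7)
estimate = hits / trials

# By the law of large numbers, the empirical frequency approaches 1/6
# as the number of trials grows.
print(f"exact = {exact:.4f}, simulated = {estimate:.4f}")
```

With 100,000 trials the simulated frequency typically lands within a fraction of a percent of 1/6, which is exactly the convergence phenomenon made precise later in the course.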
Probability theory is ubiquitous in natural science, social science and engineering, so this course can be valuable in conjunction with many different majors. 431 is not a course in statistics. Statistics is a discipline mainly concerned with analyzing and representing data. Probability theory forms the mathematical foundation of statistics, but the two disciplines are separate.
From a broad intellectual perspective, probability is one of the core areas of mathematics with its own distinct style of reasoning. Among the other core areas are analysis, algebra, geometry/topology, logic and computation.
To go beyond 431 in probability, you should next take Math 521 (Analysis), and after that one or both of Math 632 (Introduction to Stochastic Processes) and Math 635 (Introduction to Brownian Motion and Stochastic Calculus).
Here are grade lines that can be guaranteed in advance. A percentage score in the indicated range guarantees at least the letter grade next to it:
A: (89,100], AB: (87,89], B: (76,87], BC: (74,76], C: (62,74], D: (50,62], F: [0,50].
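The cutoffs above can be encoded as a small lookup (a hypothetical sketch; the function name is ours, and the treatment of boundary scores follows the half-open intervals as written, so a score of exactly 89 earns AB and exactly 87 earns B):

```python
def letter_grade(score: float) -> str:
    """Map a percentage score to the guaranteed letter grade."""
    # (exclusive lower bound, grade) pairs from highest to lowest,
    # matching the half-open intervals in the grade lines above.
    cutoffs = [(89, "A"), (87, "AB"), (76, "B"),
               (74, "BC"), (62, "C"), (50, "D")]
    for lower, grade in cutoffs:
        if score > lower:
            return grade
    return "F"  # scores in [0, 50]

print(letter_grade(92))   # A
print(letter_grade(89))   # AB
print(letter_grade(50))   # F
```

These are guaranteed minimums: the actual grade lines could only be more generous than this mapping.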
|Week 1||1.2 Axioms of probability.||1.3 Sampling.|
|Week 2||1.4 Infinitely many outcomes. Review of geometric series. 1.5 Inclusion-exclusion rules.||1.5 Using inclusion-exclusion and complements to calculate probabilities. De Morgan's laws.||1.6 Random variables. Probability mass function of a discrete random variable.|
|Week 3||2.1 Conditional probability. 2.2 Bayes' formula.||2.2 Bayes' formula. 2.3 Independence. Quiz on Chapter 1.||2.1 Conditional probability obeys the rules of probability. 2.3 Independence of 3 or more events. Independence of random variables. 2.4 Independent trials, Bernoulli and binomial distributions.|
|Week 4||2.4 Independent trials, geometric distribution. 2.5 Conditional independence.||2.5 Birthday problem. 3.1 Probability mass functions, probability density functions. Quiz on Chapter 2.||3.1 Probability mass functions, probability density functions, cumulative distribution functions.|
|Week 5||Review of sampling, this time in terms of random variables. 3.2 Expectation E[X] for discrete and continuous random variables.||Exam 1||3.2 Expectation E[g(X)] and variance Var(X) for discrete and continuous random variables.|
|Week 6||3.2 Expectation and variance of aX+b. 3.3 Normal distribution.||4.1 Normal approximation of the binomial.||4.1 Normal approximation of the binomial with continuity correction.|
|Week 7||4.1 Law of large numbers for the binomial. Estimating an unknown success probability.||4.1 Estimating an unknown success probability. Confidence intervals and polling.||4.2 Poisson distribution and the Poisson approximation of the binomial.|
|Week 8||4.2 Poisson approximation of the binomial. 4.3 Exponential distribution.||Exam 2||5.1 Moment generating function.|
|Week 9||5.2 Distribution of a function of a random variable.||5.2 One more example of the density of g(X). Gamma distribution. 6.1 Joint distribution of discrete random variables.||6.1 Joint distribution of discrete random variables. Multinomial distribution.|
|Week 10||6.1 Independence in terms of probability mass functions. 6.2 Jointly continuous random variables.||6.2 Jointly continuous random variables, independence.||7.1 Sums of independent random variables.|
|Week 11||7.1 Sums of independent random variables. 7.3 Exchangeable random variables.||7.3 Exchangeable random variables.||7.3 Hypergeometric distribution. 8.1 Linearity of expectation, indicator method.|
|Week 12||8.2 Independence and expectations. 8.3 Moment generating function of a sum.||Exam 3||8.3 Sum of independent normals via moment generating functions. 8.4 Covariance and correlation.|
|Week 13||8.4 Covariance and correlation.||9. Markov's and Chebyshev's inequalities, law of large numbers.||Thanksgiving|
|Week 14||9. Summary of Markov's and Chebyshev's inequalities and the law of large numbers. Discussion of the central limit theorem. 10. Conditional distributions.|
The Math Club provides interesting lectures and other math-related events. Everybody is welcome.