# Algebra in Statistics and Computation Seminar

**When**: 1:30-2:25, Thursdays (1:30-1:45pm Social Chit-Chat, 1:45-2:25 Talk Time)

**Where**: Virtual: https://uwmadison.zoom.us/j/95934501565

**Contact**: Jose Israel Rodriguez, Zinan Wang (Lead)

**Remark**: This informal seminar is held on the second Thursday of the month

## Spring 2021 Schedule

Date | Speaker | Title |
---|---|---|
February 11 | Elina Robeva (UBC) | Hidden Variables in Linear Causal Models |
March 11 | Carlos Amendola Ceron (ULM) | Likelihood Geometry of Correlation Models |
April 8 | Anna Seigal (Oxford) | TBD |

## Abstracts

### March 11: Carlos Amendola Ceron

Title: Likelihood Geometry of Correlation Models.

Abstract: Correlation matrices are standardized covariance matrices. They form an affine space of symmetric matrices defined by setting the diagonal entries to one. In this talk we will consider the fascinating geometry of maximum likelihood estimation for this model and for linear submodels that encode additional symmetries. We also consider the problem of minimizing two closely related functions of the covariance matrix, Stein's loss and the symmetrized Stein's loss, which lead naturally to the algebraic statistical concepts of dual ML degree and SSL degree. I will also present exciting open problems in this direction.

### February 11: Elina Robeva

Title: Hidden Variables in Linear Causal Models.

References: https://arxiv.org/abs/1807.07561, https://arxiv.org/abs/2001.10426, and https://arxiv.org/abs/2010.05306

Abstract: Identifying causal relationships between random variables from observational data is an important and difficult problem in many areas of data science. The presence of hidden variables, though quite realistic, poses a variety of further problems. Linear structural equation models, which express each variable as a linear combination of all of its parent variables, have long been used for learning causal structure from observational data. Surprisingly, when the variables in a linear structural equation model are non-Gaussian, the full causal structure can be learned without interventions, while in the Gaussian case one can only learn the underlying graph up to a Markov equivalence class. In this talk, we first discuss how one can use higher-order cumulant information to learn the structure of a linear non-Gaussian structural equation model with hidden variables. While prior work posits that each hidden variable is the common cause of two observed variables, we allow each hidden variable to be the common cause of multiple observed variables. Next, we discuss hidden-variable Gaussian causal models and the difficulties that arise in learning them. We show that it is hard even to describe the Markov equivalence classes in this case, and we give a semialgebraic description of a large class of these models.

## Other events to note

Date | Event/Title | Location/Speaker | Info |
---|---|---|---|
Fourth Thursdays of the month | Applied Algebra Seminar, UW Madison | Virtual | |
Second Tuesday of the month, 10am | SIAM SAGA | Virtual: Recordings | Registration needed once. |
Biweekly Mondays | Algebraic Statistics Online Seminar (ASOS) | Virtual: Recordings | Mailing list sign-up for Zoom links |
Fall 2020 | ASC Seminar | Virtual | |

More events are listed here.