
# ACMS Abstracts: Spring 2019

### Jerry Zhu (University of Wisconsin-Madison, CS)

*Machine Teaching: Optimal Control of Machine Learning*

As machine learning is increasingly adopted in science and engineering, it becomes important to take a higher-level view in which the machine learner is only one of the agents in a multi-agent system. Other agents may have an incentive to control the learner. For example, in adversarial machine learning an attacker can poison the training data to manipulate the model the learner learns; in education, a teacher can optimize the curriculum to enhance the performance of a student (modeled as a computational learning algorithm). Machine teaching is optimal control theory applied to machine learning: the plant is the learner, the state is the learned model, and the control is the training data. In this talk I survey the mathematical foundation of machine teaching and the new research frontiers opened up by this confluence of machine learning and control theory.
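To make the control-theoretic mapping concrete, here is a minimal sketch (my illustration, not code from the talk): the "plant" is a ridge-regularized least-squares learner, the "state" is its learned weight vector, and the "control" is the training set a teacher supplies. The teacher labels its own inputs with a target model, steering the learner toward that target.

```python
import numpy as np

rng = np.random.default_rng(0)

def learner(X, y, lam=1e-6):
    """The plant: ridge-regularized least squares mapping data to a model."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Target state the teacher wants the learner to reach (hypothetical values).
w_star = np.array([2.0, -1.0, 0.5])

# Control input: a teaching set whose labels are generated by the target
# model, so the learner's optimum coincides (up to regularization) with w_star.
X_teach = rng.normal(size=(10, 3))
y_teach = X_teach @ w_star

w_learned = learner(X_teach, y_teach)
print(np.allclose(w_learned, w_star, atol=1e-3))  # learner steered to the target
```

The same setup, with labels chosen adversarially rather than honestly, is the data-poisoning scenario mentioned in the abstract.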

### Abhishek Deshpande (UW-Madison, math)

*Switches in chemical and biological networks*

Switches are ubiquitous in both chemical and biological circuits. We explore the behaviour of autocatalytic switches in the context of the persistence conjecture. We show that networks without autocatalytic switches are persistent. The notion of a “critical siphon” forms the connecting link between autocatalysis and persistence. The talk will expand upon this connection.
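The siphon notion mentioned above can be made concrete. A nonempty set of species Z is a siphon if every reaction that produces a species in Z also consumes a species in Z (siphons depend only on which species appear, not on stoichiometric coefficients). Below is a brute-force siphon search on a small hypothetical autocatalytic network, a sketch of my own rather than code from the talk:

```python
from itertools import combinations

# Hypothetical example network: A + B -> 2B (autocatalytic in B), B -> A.
# Each reaction is a (reactant-species set, product-species set) pair.
reactions = [
    ({"A", "B"}, {"B"}),
    ({"B"}, {"A"}),
]
species = sorted({s for r, p in reactions for s in r | p})

def is_siphon(Z, reactions):
    """Z is a siphon if every reaction producing a member of Z consumes one."""
    Z = set(Z)
    return all(r & Z for r, p in reactions if p & Z)

siphons = [set(Z) for k in range(1, len(species) + 1)
           for Z in combinations(species, k) if is_siphon(Z, reactions)]
print(siphons)
```

Here {B} is a siphon: once B is depleted, the autocatalytic reaction can never replenish it, which is the kind of structure the persistence analysis has to rule out.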

Switches are also relevant from a biological perspective. We show that catalytic switches help in reducing retroactivity: the back-effect on an upstream system when it is connected to a downstream system. In addition, for certain catalytic networks such as the push-pull motif, high rates of energy consumption are not required to attenuate retroactivity. One can accomplish this by reducing the coupling to the push-pull motif. However, this reduction in coupling is not robust to cross-talk caused by leak reactions.

References:
1) https://arxiv.org/abs/1309.3957
2) https://arxiv.org/abs/1708.01792

### Amy Cochran (UW-Madison, math and medical school)

*A model of online latent state learning*

Researchers are increasingly interested in how humans perform a structured form of learning known as latent-state inference. Latent-state inference refers to a person's ability to weigh competing hypotheses about their environment. Critically, this type of learning can help explain behavior and neural activity important to cognitive neuroscience and psychiatry. In this talk, I will first present a model of latent state learning that uses online, or recursive, updates. I will also discuss open questions related to this topic in hopes of generating discussion. Ultimately, I would like to engage students interested in the emerging area of computational psychiatry, as I will be joining the math department as an assistant professor in the fall.
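The "online, or recursive, updates" in the abstract can be illustrated with a toy latent-state learner (my own minimal sketch, not the speaker's model): an agent tracks a belief over two hidden contexts that predict reward differently, and updates that belief by one Bayes step after each observation.

```python
import numpy as np

# Each latent state predicts a different probability that a trial is rewarded
# (hypothetical values). The agent starts with a uniform prior.
p_reward = np.array([0.8, 0.2])   # state 0: usually rewarded; state 1: rarely
belief = np.array([0.5, 0.5])     # prior over latent states

def update(belief, rewarded):
    """One recursive Bayes step given a binary reward observation."""
    like = p_reward if rewarded else 1.0 - p_reward
    post = like * belief
    return post / post.sum()       # renormalize to a probability vector

# A run of mostly rewarded trials shifts belief toward latent state 0,
# using only the current belief and the newest observation (online update).
for r in [1, 1, 0, 1, 1]:
    belief = update(belief, r)
print(belief[0] > belief[1])  # True: state 0 now favored
```

Because each step consumes only the previous belief and the latest observation, the update is recursive in exactly the sense the abstract describes, with no need to store the full trial history.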