# Teaching

## ECE 2066: Science of Information (Fall 2018)

(Also offered in Spring 2018.) This course is an introduction to the fundamental principles of the science of information. These principles apply broadly to information storage, processing, and transmission on any device. We will study both the mathematical foundations and the engineering solutions that rely on them, often using the iPhone as a running example: How does the iPhone, which has mostly digital components, interact with an analog world? How does it store different types of data (music, video, apps) reliably when the storage device itself (flash memory) is unreliable? What makes it possible to stream music that sounds so good (well, most of the time) over noisy wireless channels? We will learn how fundamental concepts of information theory, computation, and signal processing can give us a better understanding of the answers to these questions. Topics include: the definition of information; entropy; information representation in analog and digital forms; information transmission; spectrum and bandwidth; information transformations including data compression, filtering, encryption, and error correction; information storage and display; large-scale information systems; and technologies for implementing information functions.
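
To give a taste of one core concept, here is a minimal Python sketch (illustrative only, not part of the course materials) of Shannon entropy, which measures the average information content of a source in bits:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per flip.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less information per flip.
print(entropy([0.9, 0.1]))   # about 0.47
```

The same quantity sets the fundamental limit on lossless compression, one of the topics the course develops.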

As the class moves forward, notes and assignments will be posted. Many have contributed to the notes, most notably Scott Acton.

## ECE, CS 4501/6501: Statistical Learning and Graphical Models (Fall 2017)

This course focuses on probabilistic models of complex systems and on using these models to draw conclusions about hidden quantities from observations and data. In particular, we will study probabilistic graphical models, which provide a flexible mechanism for representing statistical relationships between variables and processes. We will then review the fundamentals of inference, regression, and classification. Finally, we will study computational methods for inference and learning, enabling us to analyze, interpret, and explain patterns in complex data. Topics include:

• Representing belief and uncertainty with probability
• Directed and undirected graphical models
• Elimination algorithm and factor graphs
• Sum-product algorithm and belief propagation
• Frequentist and Bayesian inference
• Linear regression and minimum mean square error
• Linear classification
• Expectation maximization (EM)
• Markov and hidden Markov models
• Markov chain sampling methods
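
To illustrate the flavor of these algorithms, here is a minimal Python sketch (with made-up toy potentials) of variable elimination on a three-node chain, the simplest instance of the elimination algorithm listed above:

```python
import itertools

# Toy pairwise potentials on a binary chain x1 - x2 - x3 (values are illustrative).
psi12 = [[2.0, 1.0], [1.0, 3.0]]   # psi12[x1][x2]
psi23 = [[1.0, 2.0], [2.0, 1.0]]   # psi23[x2][x3]

# Eliminate x1: m12[x2] = sum_x1 psi12[x1][x2]
m12 = [sum(psi12[x1][x2] for x1 in (0, 1)) for x2 in (0, 1)]
# Eliminate x2: m23[x3] = sum_x2 m12[x2] * psi23[x2][x3]
m23 = [sum(m12[x2] * psi23[x2][x3] for x2 in (0, 1)) for x3 in (0, 1)]
Z = sum(m23)
marginal = [v / Z for v in m23]    # marginal distribution of x3

# Brute-force sum over all 8 joint assignments, for comparison.
brute = [0.0, 0.0]
for x1, x2, x3 in itertools.product((0, 1), repeat=3):
    brute[x3] += psi12[x1][x2] * psi23[x2][x3]
brute = [v / sum(brute) for v in brute]
print(marginal, brute)             # the two agree
```

Elimination only ever touches small pairwise tables, while the brute-force sum grows exponentially in the number of variables; the sum-product algorithm generalizes this message-passing idea from chains to trees.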


## ECE 6502/BME 6550: Inference Methods (Spring 2017)

In this course, we focus on statistical inference techniques and their applications. Inference allows us to learn about unobserved quantities from observed data based on a probability model. For example, we can infer the evolutionary relationships between organisms based on their genomic sequence data and a probability model of evolutionary changes. We will consider both frequentist and Bayesian methods, but will focus on the latter which aims to combine existing information with new observations in a statistically consistent manner. A main component of the course is computational methods that make possible Bayesian analysis of large datasets, which are common in many engineering and scientific disciplines, including machine learning, artificial intelligence, computational biology, and statistical physics.
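
As a small example of the Bayesian approach, here is a sketch (illustrative, not course code) of a conjugate Beta-Binomial update, one of the few cases where combining a prior with new observations has a closed form:

```python
# Beta(a, b) prior on a coin's heads probability; observe k heads in n flips.
# Conjugacy gives the posterior in closed form: Beta(a + k, b + n - k).
def update(a, b, k, n):
    return a + k, b + (n - k)

a, b = update(1.0, 1.0, 7, 10)    # uniform Beta(1, 1) prior, 7 heads in 10 flips
posterior_mean = a / (a + b)      # (1 + 7) / (2 + 10) = 2/3
print(a, b, posterior_mean)
```

In the typical case where no closed form exists, the computational methods covered in the course (MCMC, variational inference) take over.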

Structure: The first two thirds of the course will consist of lectures. In the last third, most of the time will be devoted to project presentations, with the remainder used for additional lectures.

Activities: The homework will consist of problems and programming exercises. There will also be a final course project, which will involve either analyzing a real dataset to gain new insights or developing new inference approaches.

Prerequisites: Standard linear algebra and calculus; probability theory (briefly reviewed). A basic understanding of molecular biology is helpful but not necessary.

### Syllabus

1. Review of probability
   1. Random variables and processes
   2. Markov chains and Perron–Frobenius theory
   3. Hidden Markov models
2. Frequentist inference methods
   1. Maximum likelihood
   2. Hypothesis testing
   3. Point estimation methods and intervals
   4. Applications to phylogenetics
3. Introduction to Bayesian methods
   1. The Bayesian approach
   2. Single-parameter models
   3. Multiparameter models
   4. Hierarchical models
4. Computational approaches to Bayesian inference
   1. Markov chain Monte Carlo (MCMC)
   2. Expectation-maximization (EM)
   3. Variational inference
5. Hidden Markov models
   1. Three problems: evaluation, decoding, and inference
   2. Gapped sequence alignment, gene finding, and protein classification
6. Information theory and inference in computational biology
   1. Introduction to information theory
   2. Source coding and compression of biological sequences
   3. Stochastic approximation and sequence evolution
   4. Constrained codes and models of DNA as language
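
To illustrate the MCMC methods in the syllabus, here is a minimal random-walk Metropolis sampler in Python (a toy sketch, not course code; the target density, step size, and iteration counts are made up for this example):

```python
import math
import random

# Random-walk Metropolis targeting an unnormalized posterior on (0, 1):
# a Beta(8, 4)-shaped density, theta^7 * (1 - theta)^3 (normalizer not needed).
def log_target(theta):
    if not 0.0 < theta < 1.0:
        return -math.inf
    return 7 * math.log(theta) + 3 * math.log(1 - theta)

random.seed(0)
theta, samples = 0.5, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 0.1)      # symmetric Gaussian proposal
    # Accept with probability min(1, target(prop) / target(theta)).
    if math.log(random.random()) < log_target(prop) - log_target(theta):
        theta = prop
    samples.append(theta)

burned = samples[5000:]                        # discard burn-in
print(sum(burned) / len(burned))               # close to the true mean 8/12
```

The sampler never needs the normalizing constant of the posterior, which is exactly what makes MCMC practical for the large models discussed in the course.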

### References

• A. Gelman et al., *Bayesian Data Analysis*
• D. J. C. MacKay, *Information Theory, Inference, and Learning Algorithms*
• O. Gascuel (ed.), *Mathematics of Evolution and Phylogeny*

## Probability with Engineering Applications

This is an undergraduate probability course geared towards electrical and computer engineering students. I taught this course while I was a Ph.D. candidate in the ECE department at UIUC in the Summer of 2012.

My students rated my teaching effectiveness 5/5 in the course feedback forms, along with very encouraging comments.

In this course, I handed out the homework sets together with their solutions. Students were asked to solve the problems on their own and then check their work against the solutions. They were assessed with quizzes closely modeled on the homework problems.

Materials: problem sets, quizzes, and exams.

• ME2 and its solution
• ME3 and its solution