I. Undergraduate Subjects
6.008 Introduction to Inference
P. Golland, G. W. Wornell, and L. Zheng
Introduces probabilistic modeling for problems of inference and machine learning from data, emphasizing analytical and computational aspects. Distributions, marginalization, conditioning, and structure, including graphical and neural network representations. Belief propagation, decision-making, classification, estimation, and prediction. Sampling methods and analysis. Introduces asymptotic analysis and information measures. Computational laboratory component explores the concepts introduced in class in the context of contemporary applications. Students design inference algorithms, investigate their behavior on real data, and discuss experimental results.
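As a small illustration of the marginalization and conditioning topics listed above, here is a minimal sketch of Bayes' rule for a two-hypothesis problem; the prior and likelihood values are assumed for illustration only.

```python
# Posterior for a two-hypothesis inference problem via Bayes' rule.
# All numerical values below are hypothetical, chosen for illustration.
prior = 0.01           # P(H1): prior probability hypothesis H1 is true
p_obs_given_h1 = 0.95  # P(y | H1): likelihood of the observation under H1
p_obs_given_h0 = 0.05  # P(y | H0): likelihood of the observation under H0

# Marginalize over the two hypotheses to get P(y), then condition on y.
p_obs = p_obs_given_h1 * prior + p_obs_given_h0 * (1 - prior)
posterior = p_obs_given_h1 * prior / p_obs  # P(H1 | y)
```

Even with a strong likelihood ratio, the small prior keeps the posterior modest, which is the kind of behavior the subject's laboratory component explores on real data.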
II. Graduate Subjects
6.437 Inference and Information
P. Golland and G. W. Wornell
Introduction to principles of Bayesian and non-Bayesian statistical inference. Hypothesis testing and parameter estimation, sufficient statistics; exponential families. EM algorithm. Log-loss inference criterion, entropy and model capacity. Kullback-Leibler distance and information geometry. Asymptotic analysis and large deviations theory. Model order estimation; nonparametric statistics. Computational issues and approximation techniques; Monte Carlo methods. Selected topics such as universal inference and learning, and universal features and neural networks.
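One of the topics listed above, the EM algorithm, can be sketched in a few lines. The following is an illustrative sketch only (a two-component, unit-variance 1-D Gaussian mixture with a crude initialization), not the general treatment given in the subject.

```python
import math

def em_gmm_1d(xs, iters=50):
    """EM for a two-component 1-D Gaussian mixture with known unit variance.
    Returns (mixture weights, component means). Illustrative sketch only."""
    mu = [min(xs), max(xs)]  # crude initialization at the data extremes
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in xs:
            w = [pi[k] * math.exp(-0.5 * (x - mu[k]) ** 2) for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate weights and means from the responsibilities
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
    return pi, mu
```

On well-separated clusters, e.g. `em_gmm_1d([-0.1, 0.0, 0.1, 4.9, 5.0, 5.1])`, the means converge near 0 and 5 with weights near one half each.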
6.438 Algorithms for Inference
G. Bresler, D. Shah, and G. W. Wornell
Introduction to statistical inference with probabilistic graphical models. Directed and undirected graphical models, and factor graphs, over discrete and Gaussian distributions; hidden Markov models, linear dynamical systems. Sum-product and junction tree algorithms; forward-backward algorithm, Kalman filtering and smoothing. Min-sum and Viterbi algorithms. Variational methods, mean-field theory, and loopy belief propagation. Particle methods and filtering. Building graphical models from data, including parameter estimation and structure learning; Baum-Welch and Chow-Liu algorithms. Selected special topics.
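The Viterbi algorithm named above is one of the more compact topics to illustrate. Below is a minimal sketch of the max-product recursion for a discrete hidden Markov model; the state, transition, and emission structures are assumed dictionary-based for readability.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for a discrete HMM.
    Max-product recursion with backpointers. Illustrative sketch only."""
    # Initialize with the start distribution and first emission.
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # Maximize over the previous state (max-product step).
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = prob
            back[t][s] = prev
    # Backtrack from the most likely final state.
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))
```

Replacing the `max` with a sum over previous states turns this same recursion into the forward pass of the sum-product (forward-backward) algorithm, which is how the subject relates the two.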
6.432 Stochastic Processes, Detection and Estimation
A. S. Willsky and G. W. Wornell
Fundamentals of detection and estimation for signal processing, communications, and control. Vector spaces of random variables. Bayesian and Neyman-Pearson hypothesis testing. Bayesian and nonrandom parameter estimation. Minimum-variance unbiased estimators and the Cramér-Rao bound. Representations for stochastic processes; shaping and whitening filters; Karhunen-Loève expansions. Detection and estimation from waveform observations. Advanced topics: linear prediction and spectral estimation; Wiener and Kalman filters.
- Preamble: Notational Conventions
- Chapter 1: Probability, Random Vectors, and Vector Spaces
- Chapter 2: Detection Theory, Decision Theory, and Hypothesis Testing
- Chapter 3: Estimation Theory
- Chapter 4: Stochastic Processes and Systems
- Chapter 5: Karhunen-Loève and Sampled Signal Expansions
- Chapter 6: Detection and Estimation from Waveforms
- Chapter 7: Waveform Estimation, Wiener and Kalman Filtering
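As a taste of the Kalman filtering material in Chapter 7, here is a minimal sketch of one predict/update step for a scalar state-space model; the model matrices are scalars and the default parameter values are assumed for illustration.

```python
def kalman_step(x_hat, P, y, A=1.0, C=1.0, Q=0.0, R=1.0):
    """One predict/update step of a scalar Kalman filter for the model
    x_{t+1} = A x_t + w_t (var Q),  y_t = C x_t + v_t (var R).
    Illustrative sketch with assumed scalar parameters."""
    # Predict: propagate the state estimate and its error variance.
    x_pred = A * x_hat
    P_pred = A * P * A + Q
    # Update: fuse the prediction with the measurement y.
    K = P_pred * C / (C * P_pred * C + R)  # Kalman gain
    x_new = x_pred + K * (y - C * x_pred)
    P_new = (1.0 - K * C) * P_pred
    return x_new, P_new
```

For a static state (A = 1, Q = 0) observed repeatedly in unit-variance noise, iterating this step reproduces the recursive least-squares behavior: the estimate moves toward the sample mean and the error variance shrinks with each measurement.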