Data, Inference, and Decisions
This course develops the probabilistic foundations of inference in data science and builds a comprehensive view of the modeling and decision-making life cycle in data science, including its human, social, and ethical implications. Topics include: frequentist and Bayesian decision-making, permutation testing, false discovery rate, probabilistic interpretations of models, Bayesian hierarchical models, basics of experimental design, confidence intervals, causal inference, robustness, Thompson sampling, optimal control, Q-learning, differential privacy, fairness in classification, recommendation systems, and an introduction to machine learning tools including decision trees, neural networks, and ensemble methods.
This class is listed as STAT 102.
- When: Lectures Tuesdays and Thursdays from 9:30AM to 11:00AM
- Where: GPB 100
- What: See the lecture schedule
- News: We will post updates about the class on Piazza
Lab, Section, and Office Hours Schedules
For official holidays see the academic calendar.
While we are working to make this class widely accessible, we currently require the following (or equivalent) prerequisites:
Principles and Techniques of Data Science: DS100 covers important computational and statistical skills that will be necessary for DS102.
Probability: Probability and Random Processes EECS126, or Concepts of Probability STAT134, or Probability for Data Science STAT140, or Probability and Risk Analysis for Engineers IEOR172. EECS126 and STAT140 are preferred. These courses cover the probabilistic tools that will form the underpinning for the concepts covered in DS102.
Math: Linear Algebra & Differential Equations Math54, or Linear Algebra MATH110, or both Designing Information Devices and Systems I EE16A and Designing Information Devices and Systems II EE16B, or Linear Algebra for Data Science Stat89a, or Introduction to Mathematical Physics PHYSICS89. We will need some basic concepts like linear operators, eigenvectors, derivatives, and integrals to enable statistical inference and derive new prediction algorithms.
Office Hours:
- M 12-1 (Evans 426)
- Tu 11-12 (Evans 426)
- W 1-2 (Evans 426)
- W 4-5 (Soda 525)
- Th 11-12 (Evans 325)
- Th 3-4 (Evans 426)
- F 11-12 (Evans 426)

Discussion Sections (all in Evans 344):
- M 9-10
- M 10-11
- M 11-12
- M 12-1
- M 1-2
- M 2-3

Labs (all in Evans 344):
- W 9-10
- W 10-11
- W 11-12
- W 12-1
- W 1-2
- W 2-3