TOPICS IN ECONOMETRICS: CROSS-SECTION ANALYSIS
(Econ/ARE 240F)
 
 
SYLLABUS
Department of Economics, University of California, Davis
SPRING 2024

Instructor:
Professor Colin Cameron      accameron@ucdavis.edu   https://cameron.econ.ucdavis.edu/

Meeting:
Tuesday and Thursday 10.30 - 11.50 am   Hoagland 113

Office Hours:
Tuesday 3.30 - 5.00 pm (in office) and Wednesday 3.30 - 5.00 pm (Zoom and in office)

Discussion Section Meeting:
Friday 9.00 - 9.50 am  Room TBA

Teaching Assistant:

Kathya Tapia     kattapia@ucdavis.edu
Office hours: Friday 1.00 - 3.00 pm in SSH 0118

Pre-requisites:
Econ / ARE 240D is both the listed pre-requisite and the essential pre-requisite.

Course Goals: The course covers several topics in cross-section econometrics not covered in previous classes.
Basic theory will be presented, along with implementation in Stata.

Brief Course Outline
Classes 1-10    Statistical Learning / Machine Learning and Causal Econometrics with Machine Learning
Class 11          Midterm exam
Classes 12-13   Simulation and Monte Carlo experiments
Classes 14-16   Bayesian regression, multiple imputation
Class 17           Simulated maximum likelihood
Classes 18-19   Cluster-robust inference
Class 20           Spatial regression

Software
Course will mostly use Stata.
In addition, we will use Python for machine learning.
A good Python reference is Kevin Sheppard (2021), Introduction to Python for Econometrics, Statistics and Numerical Analysis, Fourth+ Edition.
A pdf is at https://www.kevinsheppard.com/teaching/python/notes/

Texts for Machine Learning

For Stata Implementation and Some Theory
MUS2: Chapter 28 of Colin Cameron and Pravin Trivedi (2023), Microeconometrics using Stata: Volume 2: Nonlinear Models and Causal Inference, Second Edition, Stata Press. https://cameron.econ.ucdavis.edu/mus2/
A pdf of a draft of Chapter 28 will be available at the course Canvas site.
 
For statistical learning the main text is an undergraduate / Masters level book, available in both R and Python versions:
ISLR2: Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani (2021), An Introduction to Statistical Learning: with Applications in R, Second Edition, Springer.
ISLP: Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani and Jonathan Taylor (2023), An Introduction to Statistical Learning: with Applications in Python, Springer.
Free legal pdfs are at https://www.statlearning.com/
A $40 hardcopy can be ordered via Springer MyCopy
 
Supplementary material on statistical learning is in the Ph.D. level book
ESL: Trevor Hastie, Robert Tibshirani and Jerome Friedman (2009), The Elements of Statistical Learning: Data Mining, Inference and Prediction, Springer.
A free legal pdf is at http://statweb.stanford.edu/~tibs/ElemStatLearn/index.html

Texts for Other Topics
MMA:
Colin Cameron and Pravin Trivedi (2005), Microeconometrics: Methods and Applications, Cambridge University Press.
MUS2:
Colin Cameron and Pravin Trivedi (2023), Microeconometrics using Stata, Second Edition, Stata Press.

More detailed Course Outline

Class 1
Overview
Course slides

ML Part 1: Model selection and cross validation
Course slides
ISL Chapters 5.1, 6.1; MUS2 Chapter 28.1-28.2 and 11.3.8.
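To give a flavor of this part, here is a minimal Python sketch of k-fold cross-validation (illustration only, not course code; it assumes scikit-learn, and the simulated data and variable names are invented here). It chooses a polynomial degree by comparing cross-validated mean squared error.

# Choose a polynomial degree by 5-fold cross-validation on simulated data.
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 200
x = rng.uniform(-2, 2, size=(n, 1))
y = 1 + 2 * x[:, 0] - 0.5 * x[:, 0] ** 2 + rng.normal(0, 1, n)

cv = KFold(n_splits=5, shuffle=True, random_state=42)
for degree in range(1, 6):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    # cross_val_score returns negative MSE; flip the sign for reporting
    mse = -cross_val_score(model, x, y, cv=cv,
                           scoring="neg_mean_squared_error").mean()
    print(f"degree {degree}: CV MSE = {mse:.3f}")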

Classes 2-3
ML Part 2:
Shrinkage methods (lasso, ridge, elastic net)
Course slides
ISL Chapter 6.2; ESL pp. 73-79 and 86-89; MUS2 Chapters 28.3-28.4
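A minimal Python sketch of the shrinkage estimators with cross-validated penalty parameters (illustration only, not course code; assumes scikit-learn, with a simulated sparse design invented here):

# Lasso, ridge, and elastic net on data with many irrelevant regressors.
import numpy as np
from sklearn.linear_model import LassoCV, RidgeCV, ElasticNetCV

rng = np.random.default_rng(42)
n, p = 200, 50
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]   # only 5 nonzero coefficients
y = X @ beta + rng.normal(0, 1, n)

lasso = LassoCV(cv=5).fit(X, y)
ridge = RidgeCV(alphas=np.logspace(-3, 3, 50)).fit(X, y)
enet = ElasticNetCV(cv=5, l1_ratio=0.5).fit(X, y)

print("lasso: selected", np.sum(lasso.coef_ != 0), "regressors, alpha =", lasso.alpha_)
print("ridge alpha =", ridge.alpha_)
print("elastic net alpha =", enet.alpha_)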

Classes 4-5
ML Part 3:
ML for causal inference using lasso
Course slides
MUS2 Chapter 28.8
Alexandre Belloni, Victor Chernozhukov and Christian Hansen (2014), "High-dimensional methods and inference on structural and treatment effects," Journal of Economic Perspectives, Spring, 29-50.
Victor Chernozhukov, Denis Chetverikov, Mert Demirer, Esther Duflo, Christian Hansen, Whitney Newey and James Robins (2018), "Double/debiased machine learning for treatment and structural parameters," The Econometrics Journal, 21, C1-C68.
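These readings combine lasso selection of controls with sample splitting (cross-fitting) to obtain valid inference on a low-dimensional treatment effect. As a rough illustration of the partialling-out idea only, and not the exact estimators of these papers, here is a Python sketch assuming scikit-learn and simulated data:

# Residualize y and the treatment d on high-dimensional controls X using
# cross-fitted lasso, then regress the y-residuals on the d-residuals.
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression
from sklearn.model_selection import KFold

rng = np.random.default_rng(42)
n, p = 500, 100
X = rng.normal(size=(n, p))
d = X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n)        # treatment
y = 1.0 * d + X[:, 0] - X[:, 2] + rng.normal(0, 1, n)    # true effect = 1

y_res = np.zeros(n)
d_res = np.zeros(n)
for train, test in KFold(n_splits=5, shuffle=True, random_state=42).split(X):
    # fit the nuisance regressions on the training folds only (cross-fitting)
    y_res[test] = y[test] - LassoCV(cv=5).fit(X[train], y[train]).predict(X[test])
    d_res[test] = d[test] - LassoCV(cv=5).fit(X[train], d[train]).predict(X[test])

ols = LinearRegression().fit(d_res.reshape(-1, 1), y_res)
print("estimated treatment effect:", ols.coef_[0])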

Classes 6-7
ML Part 4:
Other ML methods for prediction
Course slides
MUS2 Chapter 28.5, 28.6.1-28.6.6, 28.7
Principal Components and Partial Least Squares (ISL chapter 6.3)
High-dimensional Data (ISL chapter 6.4)
Polynomials, Step Functions and Basis Functions (ISL chapter 7.1-7.3)
Splines (ISL chapter 7.4-7.5)
Local Regression (ISL chapter 7.6)
Generalized Additive Models (ISL chapter 7.7)
Regression Trees (ISL chapter 8.1)
Bagging, Random Forests, Boosting (ISL chapter 8.2)
Neural Networks (ISL chapter 10.1-10.7)
Sendhil Mullainathan and Jann Spiess (2017), "Machine Learning: An Applied Econometric Approach," Journal of Economic Perspectives, Spring, 87-106.
Hal Varian (2014), "Big Data: New Tricks for Econometrics," Journal of Economic Perspectives, Spring, 3-28.
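As one small illustration of the prediction methods listed above (illustration only, not course code; assumes scikit-learn and simulated data), here is a Python sketch comparing a random forest with OLS on a nonlinear regression:

# Random forest versus OLS for prediction, judged by test-set MSE.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
n = 1000
X = rng.normal(size=(n, 5))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + X[:, 2] * X[:, 3] + rng.normal(0, 0.5, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
rf = RandomForestRegressor(n_estimators=500, random_state=42).fit(X_train, y_train)
ols = LinearRegression().fit(X_train, y_train)
print("random forest test MSE:", mean_squared_error(y_test, rf.predict(X_test)))
print("OLS test MSE:          ", mean_squared_error(y_test, ols.predict(X_test)))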

Classes 8-9
ML Part 5:
More ML for causal inference, especially ATE with heterogeneous effects
Course slides which include references
MUS2 Chapter 28.6.7, 28.6.8
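The slides and MUS2 cover the specific estimators used in this part. As a deliberately simplified illustration only (not the course's estimators), here is a Python "T-learner" sketch for heterogeneous effects under random treatment assignment, assuming scikit-learn and simulated data:

# Fit separate outcome models for treated and control units, then take the
# difference of predictions as an estimate of the conditional ATE (CATE).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 2000
X = rng.normal(size=(n, 3))
d = rng.binomial(1, 0.5, n)                       # randomized treatment
tau = 1.0 + X[:, 0]                               # effect varies with x1
y = X[:, 1] + tau * d + rng.normal(0, 1, n)

m1 = RandomForestRegressor(random_state=42).fit(X[d == 1], y[d == 1])
m0 = RandomForestRegressor(random_state=42).fit(X[d == 0], y[d == 0])
cate = m1.predict(X) - m0.predict(X)
print("estimated ATE:", cate.mean(), "(true ATE = 1.0)")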

Class 10
ML Part 6:
Classification and unsupervised learning
Course slides
Logistic Regression and Discriminant Analysis  (ISL chapter 4.1-4.5)
Support Vector Machines (ISL chapter 9.1-9.3)
Unsupervised Learning (ISL chapter 12.1-12.4)
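A minimal Python sketch combining a classifier (logistic regression) and an unsupervised method (k-means clustering), for illustration only; it assumes scikit-learn and uses simulated data invented here:

# Logistic regression for classification and k-means for clustering.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n = 500
X = rng.normal(size=(n, 2))
y = (X[:, 0] + X[:, 1] + rng.normal(0, 1, n) > 0).astype(int)

clf = LogisticRegression().fit(X, y)
print("in-sample classification accuracy:", clf.score(X, y))

km = KMeans(n_clusters=2, n_init=10, random_state=42).fit(X)
print("cluster sizes:", np.bincount(km.labels_))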

Class 11
Midterm Exam

Classes 12-13
Simulation and Monte Carlo Experiments
Course slides, MMA chapter 12, MUS2 chapter 5
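For illustration only (the course implementation is in Stata), here is a minimal Python sketch of a Monte Carlo experiment that checks the coverage rate of the nominal 95% confidence interval for an OLS slope; it assumes only numpy and uses a simulated design invented here:

# Monte Carlo experiment: coverage of the 95% CI for an OLS slope over R samples.
import numpy as np

rng = np.random.default_rng(42)
R, n, beta = 2000, 50, 1.0
covered = 0
for _ in range(R):
    x = rng.normal(size=n)
    y = 1.0 + beta * x + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    b = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ b
    s2 = resid @ resid / (n - 2)
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    covered += (b[1] - 1.96 * se <= beta <= b[1] + 1.96 * se)
print("coverage of nominal 95% CI:", covered / R)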

Class 14
Bayes Part 1:
MCMC theory and application in Stata
Course slides, MMA chapter 13, MUS2 chapter 29
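In class the MCMC application is in Stata (the Stata counterpart would be commands such as bayesmh). For illustration of the algorithm only, here is a minimal Python sketch of a random-walk Metropolis sampler for the mean of a normal with known variance, assuming only numpy:

# Random-walk Metropolis for the posterior of a normal mean (sigma known).
import numpy as np

rng = np.random.default_rng(42)
y = rng.normal(2.0, 1.0, size=100)        # data: true mean 2, sd 1 (known)
prior_mean, prior_sd = 0.0, 10.0

def log_post(mu):
    log_lik = -0.5 * np.sum((y - mu) ** 2)            # sigma = 1 known
    log_prior = -0.5 * ((mu - prior_mean) / prior_sd) ** 2
    return log_lik + log_prior

draws, mu, accept = [], 0.0, 0
for s in range(10000):
    prop = mu + rng.normal(0, 0.2)                    # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu, accept = prop, accept + 1
    draws.append(mu)
draws = np.array(draws[2000:])                        # discard burn-in
print("posterior mean:", draws.mean(), " acceptance rate:", accept / 10000)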

Classes 15-16
Bayes Part 2:
Data Augmentation, Imputation
Course slides, MMA chapters 13 and 27, MUS2 chapter 30.
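For illustration only (not course code, and a deliberately simplified version that does not draw the imputation-model parameters from their posterior, as proper multiple imputation would), here is a Python sketch of regression-based multiple imputation with Rubin's combining rules, assuming only numpy and simulated data:

# Impute missing x2 from a regression on x1 plus noise, M times, then pool
# the estimates of the coefficient on x2 in the regression of y on x1, x2.
import numpy as np

rng = np.random.default_rng(42)
n, M = 500, 20
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 1 + x1 + 2 * x2 + rng.normal(size=n)
miss = rng.uniform(size=n) < 0.3                      # 30% of x2 missing

def ols(Xmat, yvec):
    b = np.linalg.solve(Xmat.T @ Xmat, Xmat.T @ yvec)
    resid = yvec - Xmat @ b
    V = resid @ resid / (len(yvec) - Xmat.shape[1]) * np.linalg.inv(Xmat.T @ Xmat)
    return b, V

# imputation model fitted on complete cases: x2 on x1
Xc = np.column_stack([np.ones((~miss).sum()), x1[~miss]])
g, _ = ols(Xc, x2[~miss])
sig = np.std(x2[~miss] - Xc @ g)

bs, vs = [], []
for m in range(M):
    x2_imp = x2.copy()
    x2_imp[miss] = g[0] + g[1] * x1[miss] + rng.normal(0, sig, miss.sum())
    X = np.column_stack([np.ones(n), x1, x2_imp])
    b, V = ols(X, y)
    bs.append(b[2]); vs.append(V[2, 2])
bs, vs = np.array(bs), np.array(vs)
within, between = vs.mean(), bs.var(ddof=1)
total_var = within + (1 + 1 / M) * between            # Rubin's combining rules
print("MI estimate of coef on x2:", bs.mean(), " se:", np.sqrt(total_var))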

Class 17
Maximum Simulated Likelihood; Brief Bootstrap
Course slides, MMA Chapter 12.
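For illustration only, here is a Python sketch of maximum simulated likelihood for a Poisson model with lognormal unobserved heterogeneity, where each observation's likelihood is an integral approximated by averaging over simulation draws; it assumes numpy and scipy, and the data and parameter values are simulated here:

# MSL: approximate each observation's likelihood by S simulation draws.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, logsumexp

rng = np.random.default_rng(42)
n, S = 500, 100
x = rng.normal(size=n)
y = rng.poisson(np.exp(0.5 + 1.0 * x + 0.5 * rng.normal(size=n)))
z = rng.normal(size=(n, S))        # simulation draws, held fixed across iterations

def neg_sim_loglik(theta):
    b0, b1, lnsig = theta
    eta = b0 + b1 * x[:, None] + np.exp(lnsig) * z        # n x S linear indexes
    logf = y[:, None] * eta - np.exp(eta) - gammaln(y[:, None] + 1)
    # log simulated likelihood per observation: log( (1/S) sum_s exp(logf_is) )
    return -np.sum(logsumexp(logf, axis=1) - np.log(S))

res = minimize(neg_sim_loglik, x0=np.array([0.0, 0.0, np.log(0.5)]), method="BFGS")
print("MSL estimates b0, b1, sigma:", res.x[0], res.x[1], np.exp(res.x[2]))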

Classes 18-19
Cluster-robust Inference
Course slides
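For illustration only, here is a Python sketch of the cluster-robust (sandwich) variance estimator for OLS on simulated data with a common cluster component; it assumes only numpy, and the finite-sample correction shown is one common convention:

# Cluster-robust variance: sandwich formula with scores summed within clusters.
import numpy as np

rng = np.random.default_rng(42)
G, ng = 50, 20                                   # 50 clusters of 20 observations
n = G * ng
cluster = np.repeat(np.arange(G), ng)
x = rng.normal(size=n) + rng.normal(size=G)[cluster]      # within-cluster correlation
u = rng.normal(size=n) + rng.normal(size=G)[cluster]
y = 1.0 + 2.0 * x + u
X = np.column_stack([np.ones(n), x])
k = X.shape[1]

b = np.linalg.solve(X.T @ X, X.T @ y)
uhat = y - X @ b
XtX_inv = np.linalg.inv(X.T @ X)
meat = np.zeros((k, k))
for g in range(G):
    sg = X[cluster == g].T @ uhat[cluster == g]  # cluster score
    meat += np.outer(sg, sg)
correction = G / (G - 1) * (n - 1) / (n - k)     # one common finite-sample correction
V_cluster = correction * XtX_inv @ meat @ XtX_inv
V_default = XtX_inv * (uhat @ uhat) / (n - k)
print("default se(b1):        ", np.sqrt(V_default[1, 1]))
print("cluster-robust se(b1): ", np.sqrt(V_cluster[1, 1]))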

Class 20
Spatial Regression
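For illustration only, here is a Python sketch of some basic spatial ingredients: a row-standardized inverse-distance spatial weight matrix, the spatial lag of y, and Moran's I statistic. It assumes only numpy and uses simulated locations invented here:

# Spatial weight matrix, spatial lag, and Moran's I on simulated locations.
import numpy as np

rng = np.random.default_rng(42)
n = 100
coords = rng.uniform(0, 10, size=(n, 2))                 # random locations
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
W = np.where(dist > 0, 1.0 / np.maximum(dist, 1e-12), 0.0)   # inverse distance, zero diagonal
W = W / W.sum(axis=1, keepdims=True)                     # row-standardize

y = coords[:, 0] + rng.normal(0, 1, n)                   # y trends with location
Wy = W @ y                                               # spatial lag of y

z = y - y.mean()
moran_I = (n / W.sum()) * (z @ W @ z) / (z @ z)
print("Moran's I:", moran_I, " (expected value under no spatial correlation:", -1 / (n - 1), ")")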

Other Material:
Slides, assignments, programs, data, etc. will be posted at the course Canvas website under Files.

Computer Materials:

Assignments will use Stata and Python.
The Stata code for my machine learning slides is at https://cameron.econ.ucdavis.edu/sfu2022/ 

Course Grading:
Assignments 40%   Tentative; all are due at 10.30 am: Assignment 1 on April 16 (Tuesday); Assignment 2 on May 2 (Thursday); Assignment 3 on May 21 (Tuesday); Assignment 4 on June 5 (Wednesday).
(Submit as pdf under Canvas / assignments)
Midterm 30%     Tuesday May 7, 10.30 am. Covers machine learning. In-class exam.
Final 30%           Wednesday June 12, 8.00 - 10.00 am. Covers the material after machine learning. In-class exam.
Assignments must be handed in on time, so solutions can be discussed in class and distributed in a timely manner.
No credit is given for late assignments. All assignments must be completed.
Academic integrity is required. What is academic integrity? See the UCD Student Judicial Affairs website http://sja.ucdavis.edu/
As an exception to their rules, I permit some collaboration with other students in doing assignments, but the work handed in must be your own. Each person must create their own Stata output and write up their own answers. You must also write on your assignment the name(s) of the person(s) you worked with.
Exams will be closed book.