June 17-28, 2013
Computer Lab I
June 17, 2013 2:00 pm - 3:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
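For orientation, a minimal R session of the kind practiced in the labs might look like the following (the built-in mtcars data set here is just a stand-in, not the actual lab material):

    data(mtcars)                          # load a built-in example data set
    head(mtcars)                          # inspect the first few rows
    summary(mtcars$mpg)                   # numeric summary of one variable
    plot(mtcars$wt, mtcars$mpg,           # simple scatterplot
         xlab = "weight", ylab = "miles per gallon")
    mtcars$kpl <- mtcars$mpg * 0.4251     # create a derived variable (km per litre)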
Computer Lab II
June 17, 2013 3:30 pm - 4:30 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab III
June 17, 2013 8:00 pm - 9:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab I
June 18, 2013 2:00 pm - 3:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab II
June 18, 2013 3:30 pm - 4:30 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab III
June 18, 2013 8:00 pm - 9:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab
June 19, 2013 8:00 pm - 10:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab
June 21, 2013 8:00 pm - 10:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab
June 24, 2013 8:00 pm - 10:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab
June 25, 2013 8:00 pm - 10:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab I
June 26, 2013 2:00 pm - 3:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab II
June 26, 2013 3:30 pm - 4:30 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab III
June 26, 2013 8:00 pm - 9:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab I
June 27, 2013 2:00 pm - 3:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab II
June 27, 2013 3:30 pm - 4:30 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab III
June 27, 2013 8:00 pm - 9:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab I
June 28, 2013 2:00 pm - 3:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab II
June 28, 2013 3:30 pm - 4:30 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Computer Lab III
June 28, 2013 8:00 pm - 9:00 pm
The lab's goal is for students to learn R commands and gain hands-on experience with data in R.
Bayesian Linear Model
June 20, 2013 11:00 am - 12:30 pm
This lecture will review basic principles of Bayesian inference,
Bayesian hierarchical models, and the BUGS tool
for conducting Bayesian analyses.
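As a toy illustration of the kind of computation involved (a sketch under simplifying assumptions: simulated data, known noise variance, independent normal priors; not the lecture's own example), the posterior for the coefficients of a linear model can be computed in closed form in R:

    set.seed(1)
    n <- 50
    x <- rnorm(n)
    y <- 1 + 2 * x + rnorm(n, sd = 0.5)   # simulated data, true coefficients (1, 2)
    X <- cbind(1, x)
    sigma2 <- 0.5^2                       # noise variance, assumed known
    tau2 <- 10                            # prior variance of each coefficient
    # posterior: beta | y ~ N(m, V), with V = (X'X / sigma2 + I / tau2)^{-1}
    V <- solve(crossprod(X) / sigma2 + diag(2) / tau2)
    m <- V %*% crossprod(X, y) / sigma2
    m                                     # posterior mean, close to (1, 2)
    sqrt(diag(V))                         # posterior standard deviations

In BUGS the same model is instead written declaratively and fitted by MCMC, which also handles hierarchical extensions for which no closed form exists.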
Graphical Models (Theory, Latent Models)
June 21, 2013 11:00 am - 12:30 pm
Graphical Markov models use graphs with nodes
corresponding to random variables, and edges that
encode conditional independence relationships between
those variables. Directed graphical models (aka Bayesian
networks) in particular have received considerable attention.
This lecture will review basic concepts in graphical
model theory such as Markov properties, equivalence,
and connections with causal inference.
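A small simulation (a generic illustration, not taken from the lecture) of the Markov property for the directed chain A -> B -> C, in which C is conditionally independent of A given B:

    set.seed(2)
    n <- 1e5
    A <- rbinom(n, 1, 0.5)
    B <- rbinom(n, 1, ifelse(A == 1, 0.8, 0.2))   # B depends on A
    C <- rbinom(n, 1, ifelse(B == 1, 0.7, 0.3))   # C depends only on B
    # marginally, A and C are associated ...
    mean(C[A == 1]) - mean(C[A == 0])
    # ... but conditioning on B removes the association (up to sampling noise)
    mean(C[A == 1 & B == 1]) - mean(C[A == 0 & B == 1])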
Graphical Model Applications: Localization in Wireless Networks and Vaccine Response Modeling
June 24, 2013 11:00 am - 12:30 pm
This lecture will describe two applications of Bayesian graphical models.
Counterfactual Concepts
June 25, 2013 11:00 am - 12:30 pm
The counterfactual approach to causal inference dates back at least to Neyman and has developed considerably in recent decades through pioneering work by Rubin, Pearl, Robins, and others. This lecture will introduce the counterfactual approach and discuss specific examples.
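A minimal potential-outcomes simulation (an illustrative sketch, not the lecture's example): each unit has two potential outcomes, only one of which is observed, and randomized treatment assignment lets the difference in observed group means recover the average causal effect:

    set.seed(3)
    n  <- 1e4
    y0 <- rnorm(n)                        # outcome each unit would have untreated
    y1 <- y0 + 2                          # outcome if treated; true effect = 2
    z  <- rbinom(n, 1, 0.5)               # randomized treatment assignment
    y  <- ifelse(z == 1, y1, y0)          # only one potential outcome is observed
    mean(y[z == 1]) - mean(y[z == 0])     # estimates the average effect, about 2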
Longitudinal Models I
June 26, 2013 9:00 am - 10:30 am
Estimating the effects of medical interventions from massive longitudinal observational medical data is of central importance in healthcare. This talk will describe typical methods used for this purpose and recent attempts to understand the limitations of these approaches.
Monte Carlo and Sequential Monte Carlo
June 26, 2013 11:00 am - 12:30 pm
This lecture will cover core Monte Carlo ideas such as Monte Carlo sampling, importance sampling, Markov chain Monte Carlo, and sequential Monte Carlo.
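As a generic illustration of one of these ideas (not the lecture's own code), importance sampling estimates an expectation under one distribution using draws from another, reweighted by the density ratio:

    set.seed(4)
    # target: E[X^2] for X ~ N(0, 1); proposal: N(0, 2^2)
    m <- 1e5
    x <- rnorm(m, sd = 2)
    w <- dnorm(x) / dnorm(x, sd = 2)      # importance weights
    sum(w * x^2) / sum(w)                 # self-normalized estimate, close to 1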
Model Averaging
June 27, 2013 9:00 am - 10:30 am
This lecture will introduce Bayesian model averaging and describe algorithms for applying model averaging in different contexts.
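A rough sketch of one such algorithm (illustrative only): averaging two nested linear models with approximate posterior model probabilities derived from BIC:

    set.seed(5)
    n <- 100
    x1 <- rnorm(n); x2 <- rnorm(n)
    y  <- 1 + 2 * x1 + rnorm(n)           # x2 is actually irrelevant
    m1 <- lm(y ~ x1)
    m2 <- lm(y ~ x1 + x2)
    bic <- c(BIC(m1), BIC(m2))
    w <- exp(-0.5 * (bic - min(bic)))
    w <- w / sum(w)                       # approximate posterior model probabilities
    w
    # model-averaged estimate of the x1 coefficient
    w[1] * coef(m1)["x1"] + w[2] * coef(m2)["x1"]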
Online Learning
June 27, 2013 11:00 am - 12:30 pm
This lecture will cover algorithms for online (one example at a time) learning and model averaging.
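A minimal example of the online setting (assumptions for illustration: logistic loss, a single pass, a fixed learning rate), updating the model after each example rather than refitting on all data:

    set.seed(6)
    n <- 5000
    x <- cbind(1, rnorm(n))                   # intercept plus one feature
    p <- 1 / (1 + exp(-(x %*% c(-1, 2))))     # true coefficients (-1, 2)
    y <- rbinom(n, 1, p)
    beta <- c(0, 0); eta <- 0.1               # initial weights, learning rate
    for (i in 1:n) {                          # one example at a time
      pred <- 1 / (1 + exp(-sum(x[i, ] * beta)))
      beta <- beta + eta * (y[i] - pred) * x[i, ]   # stochastic gradient step
    }
    beta                                      # moves toward (-1, 2)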
Text Mining
June 28, 2013 9:00 am - 10:30 am
This lecture will introduce various problems in the analysis of textual data such as text categorization, entity recognition, and authorship attribution, and describe related methodology.
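A toy illustration (not from the lecture) of the usual first step, turning raw text into a document-term count matrix:

    docs <- c("the cat sat on the mat",
              "the dog chased the cat",
              "authorship studies count words")
    words <- strsplit(tolower(docs), "\\s+")
    vocab <- sort(unique(unlist(words)))
    dtm <- t(sapply(words, function(w) table(factor(w, levels = vocab))))
    rownames(dtm) <- paste0("doc", seq_along(docs))
    dtm                                   # rows = documents, columns = word counts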
ABC
June 28, 2013 11:00 am - 12:30 pm
Approximate Bayesian computation is an extraordinarily simple method for doing Bayesian calculations. The lecture will describe the basic approach and consider a number of applications.
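A minimal rejection-ABC sketch (illustrative assumptions: normal data with known variance, the sample mean as the summary statistic, a fixed tolerance):

    set.seed(7)
    obs <- rnorm(30, mean = 2)            # "observed" data with unknown mean
    s_obs <- mean(obs)
    theta <- runif(1e5, -5, 5)            # draws from a flat prior on the mean
    s_sim <- sapply(theta, function(t) mean(rnorm(30, mean = t)))
    keep <- abs(s_sim - s_obs) < 0.1      # accept when the summaries are close
    mean(theta[keep])                     # approximate posterior mean, near 2
    hist(theta[keep])                     # approximate posterior distribution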
Data Collection
June 17, 2013 9:00 am - 10:30 am
In this lecture, we will discuss basic experimental design principles in data collection and issues regarding data quality. Specific data examples
such as the Enron data set will be used.
Exploratory Data Analysis (EDA)
June 17, 2013 11:00 am - 12:30 pm
This lecture will cover data summarization and visualization tools
such as kernel density estimation, loess, scatterplots, and dimension
reduction via principal component analysis (PCA). Specific
data examples will be used.
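In R these tools are readily available; a generic sketch using a built-in data set (not the lecture's data):

    data(iris)
    plot(density(iris$Sepal.Length))                  # kernel density estimate
    plot(iris$Petal.Length, iris$Petal.Width)         # scatterplot
    fit <- loess(Petal.Width ~ Petal.Length, data = iris)
    o <- order(iris$Petal.Length)
    lines(iris$Petal.Length[o], fitted(fit)[o])       # loess smooth
    pc <- prcomp(iris[, 1:4], scale. = TRUE)          # PCA on the 4 measurements
    summary(pc)                                       # variance explained
    plot(pc$x[, 1:2], col = iris$Species)             # first two components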
Linear Regression
June 18, 2013 9:00 am - 10:30 am
This lecture reviews the least squares (LS) method for linear fitting
and its statistical properties under various linear regression
model assumptions. Methods will be illustrated with real data examples
from the instructor's research projects.
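As a generic illustration with simulated data (not the instructor's projects), the LS fit can be obtained either with lm() or directly from the normal equations:

    set.seed(8)
    n <- 100
    x <- rnorm(n)
    y <- 3 + 1.5 * x + rnorm(n)
    fit <- lm(y ~ x)
    coef(fit)                                 # intercept and slope estimates
    # the same estimates from the normal equations (X'X) b = X'y
    X <- cbind(1, x)
    solve(crossprod(X), crossprod(X, y))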
Generalized Linear Models
June 18, 2013 11:00 am - 12:30 pm
This lecture will generalize LS to weighted LS (WLS) and use WLS
to connect with generalized linear models including logistic
regression. Remote sensing data for cloud detection will be used.
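A minimal logistic regression in R (simulated data as a stand-in for the remote sensing example):

    set.seed(9)
    n <- 200
    x <- rnorm(n)
    y <- rbinom(n, 1, plogis(-0.5 + 1.5 * x))   # binary response
    fit <- glm(y ~ x, family = binomial)        # logistic regression
    coef(fit)                                   # roughly (-0.5, 1.5)
    head(predict(fit, type = "response"))       # fitted probabilities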
Regularization: Model Selection and Ridge
June 19, 2013 9:00 am - 10:30 am
LS and maximum likelihood estimation (MLE) overfit when the dimension
of the model is not small relative to the sample size, which is almost
always the case in high dimensions. Regularization often works
by adding a penalty to the fitting criterion, as in classical
model selection methods such as AIC or BIC, and in L2-penalized
LS, known as Tikhonov regularization or ridge regression.
We will also introduce cross-validation (CV) for
regularization parameter selection.
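A compact sketch (illustrative only) of ridge regression with the penalty chosen by 5-fold cross-validation:

    set.seed(10)
    n <- 60; p <- 30
    X <- matrix(rnorm(n * p), n, p)
    beta <- c(rep(2, 5), rep(0, p - 5))       # only 5 of 30 coefficients matter
    y <- drop(X %*% beta + rnorm(n))
    ridge <- function(X, y, lambda)
      solve(crossprod(X) + lambda * diag(ncol(X)), crossprod(X, y))
    lambdas <- 10^seq(-2, 2, length.out = 20)
    folds <- sample(rep(1:5, length.out = n))
    cv_err <- sapply(lambdas, function(l)
      mean(sapply(1:5, function(k) {
        b <- ridge(X[folds != k, ], y[folds != k], l)
        mean((y[folds == k] - X[folds == k, ] %*% b)^2)
      })))
    lambdas[which.min(cv_err)]                # selected penalty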
Boosting and SVM
June 19, 2013 11:00 am - 12:30 pm
Boosting and Support Vector Machines (SVMs) are
two successful supervised machine learning methods.
This lecture will introduce them and relate them to LS and generalized
linear models. These methods will be applied to the remote sensing data
problem.
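For example, a linear-kernel SVM can be fit with the add-on e1071 package (assuming that package is installed; this is not the lecture's own code, and boosting has analogous interfaces in packages such as gbm):

    library(e1071)
    data(iris)
    iris2 <- droplevels(subset(iris, Species != "setosa"))   # two-class subset
    fit <- svm(Species ~ ., data = iris2, kernel = "linear")
    table(predicted = predict(fit, iris2), actual = iris2$Species)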
Lasso and Low-rank
June 20, 2013 9:00 am - 10:30 am
This lecture will cover two modern regularization topics: (1) the Least Absolute Shrinkage and Selection Operator (Lasso) as a convex relaxation
of AIC or BIC, and (2) low-rank regularization arising from the Netflix
competition. A subset of the Netflix data will be investigated.
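One way to see the Lasso's effect (a simplified sketch that assumes an orthonormal design, in which case the Lasso solution is just soft-thresholding of the LS coefficients):

    soft <- function(z, lambda) sign(z) * pmax(abs(z) - lambda, 0)
    set.seed(12)
    n <- 100; p <- 10
    X <- qr.Q(qr(matrix(rnorm(n * p), n, p)))   # orthonormal columns
    beta <- c(3, -2, 1.5, rep(0, p - 3))
    y <- drop(X %*% beta + rnorm(n, sd = 0.5))
    b_ls <- drop(crossprod(X, y))               # LS coefficients (since X'X = I)
    # coefficients near zero are zeroed out, large ones are shrunk toward zero
    cbind(ls = round(b_ls, 2), lasso = round(soft(b_ls, 1), 2))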
Structured Sparsity
June 21, 2013 9:00 am - 10:30 am
This lecture will discuss variants and extensions of the Lasso
such as Lasso+LS, adaptive Lasso, and group Lasso.
Sparse Modeling Theory
June 24, 2013 9:00 am - 10:30 am
This lecture will give a heuristic overview of theoretical results on the Lasso that explain when and why the Lasso and its extensions work.
V4 Modeling
June 25, 2013 9:00 am - 10:30 am
This lecture will illustrate the power of the sparse coding principle and low-rank regularization in modeling neuron responses to natural images in the very challenging visual cortex area V4.