Statistical Learning

Statistical learning theory provides the mathematical foundation of machine learning: it asks how well a function learned from a finite sample generalizes to new data. This section covers the key concepts and methods of statistical learning.

Introduction to Statistical Learning

Key Concepts

  • Statistical Learning vs Machine Learning
  • Supervised vs Unsupervised Learning
  • Parametric vs Non-parametric Methods (sketch below)
  • Function Approximation
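
To make the parametric vs non-parametric distinction concrete, here is a minimal sketch (assuming only NumPy; the synthetic data and the choice k = 5 are illustrative) that approximates the same noisy function once with a two-parameter linear model and once with k-nearest-neighbour averaging, which keeps the whole training sample as its "model".

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 100))
y = np.sin(x) + 0.2 * rng.normal(size=x.size)   # noisy target function

# Parametric: assume f(x) = a*x + b and estimate two parameters by least squares.
a, b = np.polyfit(x, y, deg=1)
y_lin = a * x + b

# Non-parametric: k-nearest-neighbour regression has no fixed parameter vector;
# predictions are local averages over the stored training sample.
def knn_predict(x_query, x_train, y_train, k=5):
    idx = np.argsort(np.abs(x_train - x_query))[:k]
    return y_train[idx].mean()

y_knn = np.array([knn_predict(q, x, y) for q in x])

print("linear fit MSE:", np.mean((y - y_lin) ** 2))
print("5-NN fit MSE:  ", np.mean((y - y_knn) ** 2))
```

The parametric fit compresses the data into two numbers but is biased when the true function is not linear; the non-parametric fit adapts to the curve at the cost of keeping all the data around.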

Learning Theory

  • Risk Minimization
  • Empirical Risk Minimization (sketch below)
  • VC Dimension
  • Generalization Bounds
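
Empirical risk minimization is easy to state in code: average a loss over the sample and pick the hypothesis that minimizes it. The sketch below assumes NumPy and a toy class of one-dimensional threshold classifiers (a class of VC dimension 1); the data, the noise rate, and the grid of candidate thresholds are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 200)
y = (X > 0.6).astype(int)                               # true concept: threshold at 0.6
y = np.where(rng.uniform(size=200) < 0.1, 1 - y, y)     # 10% label noise

def zero_one_risk(h, X, y):
    """Empirical risk: average 0-1 loss of hypothesis h on the sample."""
    return np.mean(h(X) != y)

# Hypothesis class: threshold classifiers h_t(x) = 1[x > t].
thresholds = np.linspace(0, 1, 101)
risks = [zero_one_risk(lambda x, t=t: (x > t).astype(int), X, y) for t in thresholds]

t_hat = thresholds[int(np.argmin(risks))]
print(f"ERM threshold: {t_hat:.2f}, empirical risk: {min(risks):.3f}")
```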

Linear Models

Linear Regression

  • Simple Linear Regression
  • Multiple Linear Regression
  • Polynomial Regression
  • Regularization Methods (Ridge, Lasso)
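
Ridge regression has a closed-form solution, which makes it a convenient first example of regularization; the sketch below assumes NumPy, uses synthetic data, and treats the penalty strength lam as an illustrative choice. The lasso, by contrast, has no closed form and is normally fit by coordinate descent or a library solver.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge estimate: beta = (X^T X + lam*I)^{-1} X^T y."""
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))
true_beta = np.array([1.5, 0.0, -2.0, 0.0, 0.5])
y = X @ true_beta + 0.1 * rng.normal(size=100)

print("OLS  :", np.round(ridge_fit(X, y, lam=0.0), 2))
print("ridge:", np.round(ridge_fit(X, y, lam=10.0), 2))  # coefficients shrink toward 0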

Classification

  • Linear Discriminant Analysis
  • Logistic Regression
  • Perceptron Algorithm (sketch below)
  • Multiclass Classification
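
The perceptron is the simplest of these classifiers to write out in full, so here is a minimal sketch assuming NumPy; the two Gaussian blobs and the epoch count are arbitrary illustration choices, and the update rule is only guaranteed to converge when the classes are linearly separable.

```python
import numpy as np

def perceptron(X, y, epochs=20):
    """Perceptron algorithm for labels y in {-1, +1}; returns weights and bias."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:      # misclassified point: update
                w += yi * xi
                b += yi
    return w, b

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

w, b = perceptron(X, y)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```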

Advanced Methods

Support Vector Machines

  • Maximum Margin Classifier
  • Soft Margin Classification (sketch below)
  • Kernel Trick
  • SVMs for Regression
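
A short sketch of soft-margin SVMs, assuming scikit-learn is available: C controls how strongly margin violations are penalized, and switching kernel="linear" to kernel="rbf" is the kernel trick in practice (the feature map is never computed explicitly). The data and the values of C are synthetic and illustrative.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(-1.5, 1, (100, 2)), rng.normal(1.5, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Small C tolerates margin violations (wider margin); large C penalizes them.
for C in (0.1, 10.0):
    linear_svm = SVC(kernel="linear", C=C).fit(X, y)
    rbf_svm = SVC(kernel="rbf", C=C, gamma="scale").fit(X, y)   # implicit feature map
    print(f"C={C}: linear acc={linear_svm.score(X, y):.2f}, "
          f"rbf acc={rbf_svm.score(X, y):.2f}, "
          f"support vectors={len(linear_svm.support_)}")
```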

Kernel Methods

  • Kernel Functions (sketch below)
  • Radial Basis Functions
  • Polynomial Kernels
  • Custom Kernels
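
Kernel functions themselves are only a few lines of NumPy. The sketch below implements the RBF and polynomial kernels and checks the properties a valid kernel matrix should have (symmetry and positive semi-definiteness); gamma, degree, and coef0 are illustrative values.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel: k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def polynomial_kernel(X, Z, degree=3, coef0=1.0):
    """Polynomial kernel: k(x, z) = (x . z + coef0)^degree."""
    return (X @ Z.T + coef0) ** degree

X = np.random.default_rng(5).normal(size=(4, 2))
K = rbf_kernel(X, X, gamma=0.5)
print(np.allclose(K, K.T), np.all(np.linalg.eigvalsh(K) > -1e-10))  # symmetric, PSD
```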

Model Selection and Assessment

Cross-Validation

  • K-Fold Cross-Validation (sketch below)
  • Leave-One-Out Cross-Validation
  • Stratified Cross-Validation
  • Time Series Cross-Validation
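
A minimal K-fold cross-validation loop, assuming scikit-learn and NumPy; the ridge model, the choice of 5 folds, and the synthetic regression data are all illustrative.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(6)
X = rng.normal(size=(120, 8))
y = X[:, 0] - 2 * X[:, 1] + 0.3 * rng.normal(size=120)

kf = KFold(n_splits=5, shuffle=True, random_state=0)
scores = []
for train_idx, test_idx in kf.split(X):
    model = Ridge(alpha=1.0).fit(X[train_idx], y[train_idx])
    scores.append(model.score(X[test_idx], y[test_idx]))   # R^2 on the held-out fold

print("per-fold R^2:", np.round(scores, 3))
print("mean R^2:", np.mean(scores).round(3))
```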

Model Selection

  • Bias-Variance Tradeoff
  • Model Complexity
  • Information Criteria (AIC, BIC; see the sketch below)
  • Regularization Parameters
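
AIC and BIC can be computed directly from the residual sum of squares of a Gaussian model. The sketch below (NumPy only, with synthetic data generated from a quadratic) compares polynomial degrees; the formulas omit additive constants, which do not affect the ranking of models.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.uniform(-2, 2, 80)
y = 1.0 + 0.5 * x - 0.7 * x ** 2 + 0.3 * rng.normal(size=80)   # true degree: 2

def information_criteria(x, y, degree):
    """AIC/BIC for a Gaussian polynomial model (up to an additive constant)."""
    n, k = len(y), degree + 1                     # k = number of fitted coefficients
    coefs = np.polyfit(x, y, degree)
    rss = np.sum((y - np.polyval(coefs, x)) ** 2)
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

for d in range(1, 7):
    aic, bic = information_criteria(x, y, d)
    print(f"degree {d}: AIC={aic:7.1f}  BIC={bic:7.1f}")
```

BIC penalizes extra parameters more heavily than AIC once n is moderately large, so it tends to pick the more parsimonious model.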

Probabilistic Methods

Bayesian Methods

  • Bayesian Linear Regression (sketch below)
  • Bayesian Classification
  • Prior and Posterior Distributions
  • Maximum A Posteriori Estimation
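
With a Gaussian prior on the weights and known noise variance, the posterior in Bayesian linear regression is available in closed form, and its mean is also the MAP estimate. A minimal sketch assuming NumPy; sigma2 and tau2 are illustrative hyperparameters, not estimated from the data.

```python
import numpy as np

def bayesian_linear_regression(X, y, sigma2=0.25, tau2=1.0):
    """Posterior over weights with prior N(0, tau2*I) and known noise variance sigma2.

    The posterior is Gaussian with covariance S and mean m; m is the MAP estimate
    and coincides with ridge regression for this prior.
    """
    d = X.shape[1]
    S = np.linalg.inv(X.T @ X / sigma2 + np.eye(d) / tau2)
    m = S @ X.T @ y / sigma2
    return m, S

rng = np.random.default_rng(8)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -0.5, 2.0]) + 0.5 * rng.normal(size=50)

m, S = bayesian_linear_regression(X, y)
print("posterior mean (MAP):", np.round(m, 2))
print("posterior std devs:  ", np.round(np.sqrt(np.diag(S)), 2))
```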

Probabilistic Graphical Models

  • Bayesian Networks
  • Markov Random Fields
  • Hidden Markov Models (sketch below)
  • Factor Graphs
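
As one concrete graphical-model computation, the forward algorithm for a hidden Markov model sums over all hidden state paths in time linear in the sequence length. The sketch below assumes NumPy; the two-state transition and emission probabilities are made-up numbers.

```python
import numpy as np

def forward(obs, pi, A, B):
    """HMM forward algorithm: returns P(observations) summed over hidden paths.

    pi: initial state distribution, A: transition matrix, B: emission matrix.
    """
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

# Toy 2-state model emitting one of 2 symbols (hypothetical probabilities).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])

print("P(obs = [0, 1, 1]) =", forward([0, 1, 1], pi, A, B))
```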

Advanced Topics

High-Dimensional Statistics

  • Curse of Dimensionality
  • Sparse Models (sketch below)
  • Dimension Reduction
  • Feature Selection
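
Sparsity is one standard answer to high dimensionality: the L1 penalty in the lasso sets most coefficients exactly to zero, performing feature selection as a side effect. A minimal sketch assuming scikit-learn; the 100-by-50 design, the three relevant features, and alpha=0.1 are illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(9)
n, p = 100, 50                          # many features, few of them relevant
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]             # only 3 of 50 features matter
y = X @ beta + 0.1 * rng.normal(size=n)

lasso = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(lasso.coef_)  # L1 penalty zeroes out most coefficients
print("non-zero coefficients at indices:", selected)
print("estimates:", np.round(lasso.coef_[selected], 2))
```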

Modern Methods

  • Ensemble Methods
  • Boosting and Bagging (sketch below)
  • Random Forests
  • Neural Networks from a Statistical Perspective
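
To illustrate why ensembles help, the sketch below (assuming scikit-learn) cross-validates a single decision tree against bagged trees and a random forest on the same synthetic classification task; the dataset parameters and ensemble sizes are arbitrary choices. Averaging many high-variance trees reduces variance, which is where the accuracy gain comes from.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, n_informative=5, random_state=0)

models = {
    "single tree": DecisionTreeClassifier(random_state=0),
    "bagged trees": BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()   # averaging over trees reduces variance
    print(f"{name:13s}: CV accuracy = {acc:.3f}")
```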