MAT 494: Mathematical Methods in Data Science

Instructor: Professor Haiyan Wang

Section: 96893 (Fall 2022)


Contents

1. Linear Algebra

1.2. Elements of Linear Algebra

  • Orthogonality
  • Gram-Schmidt Process
  • Eigenvalues and Eigenvectors

1.3. Linear Regression

  • QR Decomposition
  • Least Squares
  • Linear Regression
  • Gradient Descent

1.4. Principal Component Analysis

  • Singular Value Decomposition
  • Low-Rank Matrix Approximations
  • Principal Component Analysis
  • Image Compression
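
As a quick illustration of the Section 1.4 topics, here is a minimal NumPy sketch (the random matrix is a hypothetical stand-in for an image, not data from the course notebooks) of the best rank-k approximation via the singular value decomposition:

```python
import numpy as np

def rank_k_approximation(A, k):
    """Best rank-k approximation of A in the Frobenius norm (Eckart-Young theorem)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Hypothetical "image": a random 100x80 matrix compressed to rank 10
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 80))
A_10 = rank_k_approximation(A, 10)
print(np.linalg.norm(A - A_10))  # Frobenius error: sqrt of the sum of the squared discarded singular values
```

Keeping only the top k singular triples stores roughly k(m + n) numbers instead of mn, which is the idea behind the image-compression example.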

2. Probability

2.2. Probability Distribution

  • Probability Axioms
  • Conditional Probability
  • Discrete Random Variables
  • Continuous Random Variables

2.3. Independent Variables and Random Samples

  • Joint Probability Distributions
  • Random Samples
  • Correlation and Covariance
  • Central Limit Theorem

2.4. Maximum Likelihood Estimation

  • MLE for Random Samples
  • Linear Regression
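
As a small illustration of Section 2.4 (using a hypothetical simulated sample), the maximum likelihood estimates for an i.i.d. normal model have a closed form: the sample mean and the 1/n sample variance.

```python
import numpy as np

# Hypothetical data: an i.i.d. sample from N(mean=5, sd=2)
rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=1000)

# Closed-form MLEs for the normal model
mu_hat = x.mean()                        # MLE of the mean
sigma2_hat = ((x - mu_hat) ** 2).mean()  # MLE of the variance (1/n, not 1/(n-1))
print(mu_hat, sigma2_hat)                # close to 5.0 and 4.0
```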

3. Calculus and Optimization

3.2. Continuity and Differentiation

  • Limits and Continuity
  • Derivatives
    • Partial Derivatives
    • Jacobian
    • Chain Rule
    • Directional Derivatives
    • Gradient
    • Hessian
  • Mean Value Theorem
  • Taylor's Theorem

3.3. Unconstrained Optimization

  • Local and Global Minimizers
  • Convexity
  • Gradient Descent
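
As an illustration of Section 3.3 (the data and step-size choice here are assumptions, not taken from the notebooks), plain gradient descent on a convex least-squares objective converges to the same solution as the normal equations:

```python
import numpy as np

def gradient_descent(grad, x0, step, iters=5000):
    """Plain gradient descent: x <- x - step * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Minimize the convex quadratic f(x) = ||Ax - b||^2 on hypothetical data
rng = np.random.default_rng(2)
A = rng.normal(size=(50, 3))
b = rng.normal(size=50)

def grad_f(x):
    return 2 * A.T @ (A @ x - b)

lipschitz = 2 * np.linalg.norm(A.T @ A, 2)  # Lipschitz constant of grad f
x_gd = gradient_descent(grad_f, np.zeros(3), step=1.0 / lipschitz)
print(np.allclose(x_gd, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```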

3.4. Logistic Regression

  • Logit Function
  • Sigmoid Function
  • Cross-Entropy Loss
  • Gradient Descent

3.5. K-Means

  • Within-Cluster Sum of Squares (WCSS)
  • K-Means Algorithm
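
A compact sketch of Lloyd's algorithm for Section 3.5 (toy data; k-means++ initialization and restarts are omitted for brevity), alternating assignment and centroid updates so the within-cluster sum of squares never increases:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Lloyd's algorithm for k-means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: nearest centroid for each point
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid becomes the mean of its cluster
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    # Final assignment and objective
    labels = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2).argmin(axis=1)
    wcss = ((X - centers[labels]) ** 2).sum()  # within-cluster sum of squares
    return labels, centers, wcss

# Toy data: two well-separated Gaussian blobs
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(8, 1, (100, 2))])
labels, centers, wcss = kmeans(X, k=2)
print(centers, wcss)
```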

3.6. Support Vector Machines

  • Hyperplane
  • Margin
  • Support Vectors
  • Loss Function

3.7. Neural Networks

  • Mathematical Model
  • Activation Functions
  • Cost Functions
  • Backpropagation

4. Network Analysis

4.1. Introduction

  • Graph Models
  • Laplacian Matrix

4.2. Spectral Graph Bipartitioning

  • Graph Partitioning
  • Rayleigh Quotient
  • Balancing the Cut
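
A minimal sketch of Section 4 on a small hypothetical graph (two triangles joined by one edge): build the Laplacian L = D - A and split the vertices by the sign of the Fiedler vector, the eigenvector for the second-smallest eigenvalue:

```python
import numpy as np

# Adjacency matrix of a hypothetical 6-node graph: two triangles joined by the edge (2, 3)
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))  # degree matrix
L = D - A                   # graph Laplacian

# eigh returns eigenvalues in ascending order; column 1 is the Fiedler vector
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]
partition = fiedler > 0
print(partition)  # the two triangles {0,1,2} and {3,4,5} land on opposite sides (overall sign is arbitrary)
```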