
🤖 Machine Learning

📝 Notes and Code

Here are some notes and code written while learning machine learning. You can read the notes online via the links in the directory below, and the code for each chapter can be downloaded from the GitHub repository.

📚 Some Recommended Books

Author | Book
Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar | Foundations of Machine Learning (2nd Edition)
Shai Shalev-Shwartz, Shai Ben-David | Understanding Machine Learning: From Theory to Algorithms
Aurélien Géron | Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems (2nd Edition)
周志华 (Zhou Zhihua) | 机器学习 (Machine Learning)
李航 (Li Hang) | 统计学习方法 (Statistical Learning Methods)

👨‍🏫 Some Recommended Courses

⭐ Links to Notes

  • 1 The Machine Learning Landscape

    Basic Framework for Machine Learning | Basic Concepts of Machine Learning | Hypothesis Space | Loss Functions and Learning Algorithms | Main Challenges of Machine Learning | Training and Validation | K-Fold Cross-Validation (see the cross-validation sketch after this list)

  • 2 Classification Overview

    Using the MNIST handwritten digit dataset to demonstrate the basic workflow of a machine learning classification task | Cross-Validation | Confusion Matrix | Precision and Recall | ROC Curve | Multiclass Classification | Error Analysis (see the classification-metrics sketch after this list)

  • 3 K-Nearest Neighbor

    K-Nearest Neighbor Algorithm (see the KNN sketch after this list)

  • 4 Linear Model

    Linear Regression | Gradient Descent | L1 Regularization | L2 Regularization | Linear Classification | Logistic Regression | Softmax Regression (see the gradient-descent sketch after this list)

  • 5 SVM Theory

    Basic Concepts of Support Vector Machines | Hard-Margin Support Vector Machines | Soft-Margin Support Vector Machines | Lagrangian Dual Problem | Kernel Trick (see the dual formulation after this list)

  • 6 SVM Applications

    SVM Applications | Linear SVM Classification | Nonlinear SVM Classification | Polynomial Kernel Function | Gaussian RBF Kernel Function (see the RBF-kernel SVM sketch after this list)

  • 7 Decision Tree

    Decision Tree Model | Decision Trees and Conditional Probability Distributions | Feature Selection | Information Gain | ID3 Algorithm | C4.5 Algorithm | Decision Tree Pruning | CART Algorithm | Handling Missing Values (see the information-gain sketch after this list)

  • 8 Random Forest

    Bagging | Random Forest (see the bagging and random-forest sketch after this list)

  • 9 Boosting

    Ensemble Learning | Parallel vs. Sequential Ensembles | Weak Learners | Boosting | Adaptive Boosting (AdaBoost) | Coordinate Descent | Gradient Boosting (see the AdaBoost sketch after this list)

  • K-Means

    K-Means | K-Means++ (see the k-means sketch after this list)
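
For the training/validation and k-fold cross-validation topics in chapter 1, here is a minimal sketch assuming scikit-learn; the Iris dataset and the logistic-regression model are illustrative choices, not necessarily the ones used in the notes.

```python
# K-fold cross-validation sketch for chapter 1.
# Assumptions: scikit-learn is available; Iris + logistic regression are
# stand-ins chosen only for illustration.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# 5-fold CV: train on 4 folds, evaluate on the held-out fold, repeat 5 times.
scores = cross_val_score(model, X, y, cv=5)
print("fold accuracies:", scores)
print("mean accuracy:  ", scores.mean())
```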
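
For chapter 2, a sketch of the confusion-matrix / precision / recall workflow; the small built-in digits dataset stands in for MNIST here, and the SGD classifier is just one reasonable choice.

```python
# Classification-metrics sketch for chapter 2: a binary "is it a 5?" detector.
# Assumption: the small sklearn digits dataset replaces full MNIST for speed.
from sklearn.datasets import load_digits
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import confusion_matrix, precision_score, recall_score
from sklearn.model_selection import cross_val_predict

X, y = load_digits(return_X_y=True)
y_is_5 = (y == 5)                      # binary target: digit 5 vs. everything else

clf = SGDClassifier(random_state=42)
# Out-of-fold predictions: every sample is scored by a model that never saw it.
y_pred = cross_val_predict(clf, X, y_is_5, cv=3)

print(confusion_matrix(y_is_5, y_pred))
print("precision:", precision_score(y_is_5, y_pred))
print("recall:   ", recall_score(y_is_5, y_pred))
```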
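
For chapter 3, a from-scratch sketch of the k-nearest-neighbor idea (Euclidean distance plus majority vote); the toy data is made up for illustration.

```python
# K-nearest-neighbor sketch for chapter 3: majority vote among the k closest
# training points under Euclidean distance. The toy dataset is illustrative.
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)      # distance to every training point
        nearest = np.argsort(dists)[:k]                  # indices of the k nearest neighbors
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])          # majority vote
    return np.array(preds)

X_train = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([[0.5, 0.5], [5.5, 5.5]])))  # -> [0 1]
```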
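
For chapter 4, a sketch of batch gradient descent on L2-regularized (ridge) linear regression; the synthetic data, learning rate, and regularization strength are all illustrative.

```python
# Gradient-descent sketch for chapter 4: ridge (L2-regularized) linear regression.
# Objective: (1/m) * ||X w - y||^2 + alpha * ||w||^2, minimized by batch GD.
# Assumptions: synthetic data, hand-picked learning rate and alpha.
import numpy as np

def ridge_gd(X, y, alpha=0.01, lr=0.05, n_iters=2000):
    m, n = X.shape
    w = np.zeros(n)
    for _ in range(n_iters):
        grad = (2 / m) * X.T @ (X @ w - y) + 2 * alpha * w   # gradient of the objective
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
x = rng.uniform(0, 2, size=100)
y = 4 + 3 * x + rng.normal(scale=0.1, size=100)   # ground truth: slope 3, intercept 4
X = np.c_[x, np.ones_like(x)]                     # append a bias column of ones
print(ridge_gd(X, y))                             # roughly [3, 4]
```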
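
For chapter 5, the hard-margin primal problem and its Lagrangian dual in standard textbook form (this is the generic formulation, not copied from the notes):

```latex
% Hard-margin SVM primal problem
\min_{\mathbf{w},\,b}\ \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2}
\quad\text{s.t.}\quad y_i\left(\mathbf{w}^{\top}\mathbf{x}_i + b\right) \ge 1,\qquad i = 1,\dots,m

% Lagrangian dual problem
\max_{\boldsymbol{\alpha}}\ \sum_{i=1}^{m}\alpha_i
  - \tfrac{1}{2}\sum_{i=1}^{m}\sum_{j=1}^{m}\alpha_i\alpha_j\,y_i y_j\,\mathbf{x}_i^{\top}\mathbf{x}_j
\quad\text{s.t.}\quad \sum_{i=1}^{m}\alpha_i y_i = 0,\qquad \alpha_i \ge 0

% Kernel trick: replace the inner product x_i^T x_j by a kernel K(x_i, x_j).
```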
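
For chapter 6, an RBF-kernel SVM sketch with scikit-learn; the two-moons dataset and the gamma/C values are illustrative, not taken from the notes.

```python
# Nonlinear SVM classification sketch for chapter 6: Gaussian RBF kernel on the
# two-moons toy dataset, with feature scaling (generally recommended for SVMs).
# Assumptions: gamma and C are illustrative hyperparameters.
from sklearn.datasets import make_moons
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.15, random_state=42)

rbf_svm = make_pipeline(
    StandardScaler(),
    SVC(kernel="rbf", gamma=5, C=1.0),
)
rbf_svm.fit(X, y)
print("training accuracy:", rbf_svm.score(X, y))
```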
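
For chapter 7, an information-gain sketch of the entropy-based feature selection used by ID3/C4.5; the tiny categorical dataset is made up for illustration.

```python
# Information-gain sketch for chapter 7:
# IG(D, A) = H(D) - sum_v |D_v|/|D| * H(D_v). The toy data is illustrative only.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels):
    cond = 0.0
    for v in np.unique(feature):
        mask = feature == v
        cond += mask.mean() * entropy(labels[mask])   # weighted conditional entropy
    return entropy(labels) - cond

feature = np.array(["sunny", "sunny", "rain", "rain", "rain", "sunny"])
labels = np.array([0, 0, 1, 1, 1, 0])
print(information_gain(feature, labels))   # 1.0 bit: the feature determines the label
```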
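
For chapter 8, a bagging and random-forest sketch with scikit-learn; the dataset and hyperparameters are illustrative.

```python
# Bagging vs. random-forest sketch for chapter 8.
# Assumptions: two-moons data and 200 trees are illustrative choices.
from sklearn.datasets import make_moons
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Bagging: each tree is trained on a bootstrap sample of the training set.
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=200, random_state=42)
# Random forest: bagging plus a random feature subset considered at each split.
forest = RandomForestClassifier(n_estimators=200, random_state=42)

for model in (bag, forest):
    model.fit(X_train, y_train)
    print(type(model).__name__, model.score(X_test, y_test))
```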
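
For chapter 9, an AdaBoost sketch that boosts decision stumps; the dataset and hyperparameters are again illustrative.

```python
# AdaBoost sketch for chapter 9: each new weak learner puts more weight on the
# samples its predecessors misclassified. Hyperparameters are illustrative.
from sklearn.datasets import make_moons
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

ada = AdaBoostClassifier(
    DecisionTreeClassifier(max_depth=1),   # weak learner: a decision stump
    n_estimators=200,
    learning_rate=0.5,
    random_state=42,
)
ada.fit(X_train, y_train)
print("test accuracy:", ada.score(X_test, y_test))
```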
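
For the k-means notes, a clustering sketch; k-means++ is scikit-learn's default initialization and is requested explicitly here, and the blob data is illustrative.

```python
# K-means sketch: cluster three Gaussian blobs; init="k-means++" selects the
# k-means++ seeding scheme. Dataset and the choice of k are illustrative.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=42)

kmeans = KMeans(n_clusters=3, init="k-means++", n_init=10, random_state=42)
labels = kmeans.fit_predict(X)
print("cluster centers:\n", kmeans.cluster_centers_)
print("first 10 labels:", labels[:10])
```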