Further reading:
- Deep Learning: The Principles of CNNs
- Understanding and Calculating the number of Parameters in Convolution Neural Networks
- CNN Basics - Image Augmentation
- A Summary of and Reflections on Semantic Image Segmentation
- Exploring ZCA and color image whitening
- Activation Functions Explained - GELU, SELU, ELU, ReLU and more
- Activation Functions in Neural Networks
| Topic | Assignment | Key Points & Terminology |
|---|---|---|
| Convolution Layer | Day_011_HW | Filter, Kernel, Receptive Field, Feature Map, Shared Weights and Biases |
| Strides & Padding | Day_012_HW | padding='valid' adds no padding around the input; padding='same' pads the border so the output keeps the input's spatial size (at stride 1) |
| Pooling & Fully Connected Layer | Day_013_HW | Pooling layer: retains salient features while shrinking the feature map, reducing computation and speeding up convergence; Flatten: bridges the convolutional layers and the fully connected layers; Fully connected layer: the CNN's output stage, acting as the classifier that predicts per-class probabilities |
| Batch Normalization | Day_014_HW | Vanishing Gradient Problem, Activation Function |
| Building and Training a CNN Model | Day_015_HW | |
| Image Augmentation | Day_016_HW | Image whitening, Zero-phase Component Analysis (ZCA) Whitening, Semantic Segmentation |
| The Evolution of CNNs - LeNet, AlexNet, VGG | Day_018_HW | LeNet, AlexNet, VGG, Inception, ResNet, Dropout |
| The Evolution of CNNs - Inception | Day_019_HW | 1x1 Kernel |
| The Evolution of CNNs - ResNet | Day_020_HW | Residual Block: reduces the risk of vanishing gradients |
| Transfer Learning | Day_021_HW | |
| Hands-on: CNN CAPTCHA Recognition | Day_022_HW | Convolutional Recurrent Neural Network (CRNN), Connectionist Temporal Classification (CTC) |
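The padding and weight-sharing rows in the table follow directly from standard convolution arithmetic; a plain-Python sketch (not code from the course) of the output-size and parameter-count rules:

```python
def conv_output_size(n, k, stride, padding):
    """Spatial output size of a convolution over an n x n input
    with a k x k kernel."""
    if padding == "same":
        # 'same' pads the border so that output = ceil(n / stride)
        return -(-n // stride)
    if padding == "valid":
        # 'valid' adds no padding: output = floor((n - k) / stride) + 1
        return (n - k) // stride + 1
    raise ValueError(padding)

def conv_param_count(kh, kw, c_in, c_out):
    """Because weights and biases are shared across spatial positions,
    the count depends only on kernel size and channel counts, never on
    the input's height or width."""
    return (kh * kw * c_in + 1) * c_out

print(conv_output_size(28, 3, 1, "valid"))  # 26
print(conv_param_count(3, 3, 3, 32))        # 896 = (3*3*3 + 1) * 32
```

The `+ 1` per filter is the bias; this is the same formula used in the parameter-counting article listed above.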
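The pooling and Flatten steps from the Day_013 row can be sketched in a few lines of NumPy (toy shapes chosen for illustration):

```python
import numpy as np

def max_pool2d(x, pool=2):
    """Non-overlapping max pooling: keep the strongest activation in each
    pool x pool window, shrinking height and width by the pool factor."""
    h, w = x.shape
    return x.reshape(h // pool, pool, w // pool, pool).max(axis=(1, 3))

fmap = np.arange(16, dtype=float).reshape(4, 4)  # a toy 4x4 feature map
pooled = max_pool2d(fmap)   # (4, 4) -> (2, 2): less computation downstream
flat = pooled.reshape(-1)   # Flatten: (2, 2) -> (4,), ready for a Dense layer
```

This is why pooling both cuts the cost of later layers and fixes the vector length the fully connected classifier expects.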
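Batch Normalization itself is a short computation; a minimal sketch over a (batch, features) matrix, with the learnable scale/shift left at their defaults:

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize each feature over the batch to zero mean / unit variance,
    then scale and shift. Keeping pre-activation values in a stable range
    is what mitigates the vanishing-gradient problem when the output feeds
    a saturating activation function."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mu) / np.sqrt(var + eps) + beta

batch = np.random.default_rng(0).normal(5.0, 3.0, size=(64, 10))
out = batch_norm(batch)  # per-feature mean ~0, std ~1
```

In a real network, `gamma` and `beta` are trained parameters, and running averages of `mu`/`var` are used at inference time.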
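ZCA whitening, from the Image Augmentation row, decorrelates features while staying as close as possible to the original data; a sketch on a generic data matrix (assumed shapes, not the course's code):

```python
import numpy as np

def zca_whiten(X, eps=1e-5):
    """Zero-phase Component Analysis whitening: rotate into the PCA basis,
    rescale each component to unit variance, then rotate back (U @ ... @ U.T),
    so the result keeps the original orientation but has ~identity covariance."""
    X = X - X.mean(axis=0)
    cov = X.T @ X / X.shape[0]
    U, S, _ = np.linalg.svd(cov)
    W = U @ np.diag(1.0 / np.sqrt(S + eps)) @ U.T  # the ZCA matrix
    return X @ W

X = np.random.default_rng(0).normal(size=(200, 5)) @ np.diag([1, 2, 3, 4, 5])
Xw = zca_whiten(X)
cov_w = Xw.T @ Xw / Xw.shape[0]  # approximately the identity matrix
```

The trailing `U.T` (the "zero-phase" part) is what distinguishes ZCA from plain PCA whitening: for images, it keeps the whitened result looking like the original picture.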
Papers:
Further reading:
- Clustering: Hierarchical Clustering
- Hierarchical Clustering — Agglomerative Clustering
- Non Maximum Suppression
- Convolutional Neural Networks (CNN): What a 1×1 Convolution Computes
- How are 1x1 convolutions the same as a fully connected layer?
- What does 1x1 convolution mean in a neural network?
- Building Your Own YOLO Detection Model - Citrus Recognition as an Example
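Two of the readings above ask how a 1x1 convolution relates to a fully connected layer; the equivalence is easy to verify numerically (a NumPy sketch with made-up shapes):

```python
import numpy as np

rng = np.random.default_rng(0)
h, w, c_in, c_out = 4, 4, 8, 3
x = rng.standard_normal((h, w, c_in))   # an HWC feature map
W = rng.standard_normal((c_in, c_out))  # one 1x1xC_in kernel per output channel

# 1x1 convolution: the same c_in -> c_out linear map applied at every pixel
conv1x1 = np.einsum('hwc,cd->hwd', x, W)

# The same computation as a fully connected layer applied pixel-wise
fc = (x.reshape(-1, c_in) @ W).reshape(h, w, c_out)

print(np.allclose(conv1x1, fc))  # True: a 1x1 conv is a per-pixel dense layer
```

This is why Inception uses 1x1 kernels to shrink the channel dimension cheaply before the expensive 3x3 and 5x5 branches.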
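Non-maximum suppression, also listed above, is short enough to sketch in full (plain Python; boxes as (x1, y1, x2, y2) tuples, threshold chosen for illustration):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / float(area(a) + area(b) - inter)

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy NMS: repeatedly keep the highest-scoring box and discard
    any remaining box that overlaps it by more than iou_thresh."""
    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thresh]
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # [0, 2]: box 1 overlaps box 0 too much and is dropped
```

Detectors such as YOLO rely on this step to collapse the many overlapping candidate boxes into one detection per object.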
Resources:
- Browse State-of-the-Art
- Training Tensorflow for free: Pet Object Detection API Sample Trained On Google Colab