
K-CAI NEURAL API

K-CAI NEURAL API is a Keras-based neural network API that allows you to create parameter-efficient, memory-efficient, FLOPS-efficient multipath models with new layer types. There are plenty of examples and documentation.

This project is a subproject of a bigger and older project called CAI, and is a sister to the Free Pascal based CAI NEURAL API.

Prerequisites

All you need is Python, pip and Keras. Alternatively, if you prefer running it in your web browser without installing any software on your computer, you can run it on Google Colab.

Quick Start with Image Classification in Your Own Web Browser

For a quick start, you can try the Simple Image Classification with any Dataset example. This example shows how to create a model and train it with a dataset passed as a parameter. Feel free to modify the parameters and to add or remove neural layers directly from your browser.

Installation

Via Shell

Installing via shell is very simple:

git clone https://github.com/joaopauloschuler/k-neural-api.git k
cd k && pip install .
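
If the installation succeeded, the K-CAI modules should be importable from Python. A minimal check, assuming nothing beyond the module names listed in the PyDoc section below:

# Minimal installation check: import a few K-CAI modules.
import cai.layers
import cai.models
import cai.util

print('K-CAI NEURAL API modules imported successfully.')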

Installing on Google Colab

Place this at the top of your Google Colab Jupyter Notebook:

import os

if not os.path.isdir('k'):
  !git clone https://github.com/joaopauloschuler/k-neural-api.git k
else:
  !cd k && git pull

!cd k && pip install .

Documentation

The documentation is composed of examples and PyDoc.

Image Classification Examples

These examples show how to train a neural network for the task of image classification. Most examples train a neural network with the CIFAR-10 or CIFAR-100 datasets.
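
As a point of reference, the sketch below shows the kind of task these examples address, using plain tf.keras and the CIFAR-10 dataset. It is only a hedged illustration and deliberately does not use the K-CAI specific layers and generators that the actual examples demonstrate:

import tensorflow as tf

# Load CIFAR-10 and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A very small convolutional classifier, just to illustrate the task.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation='relu', input_shape=(32, 32, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation='relu'),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation='softmax')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))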

Advanced Image Classification Examples

These papers show how to create parameter-efficient models (source code is available):

First Layer Filters

The Heatmap and Activation Map with CIFAR-10 example shows how to quickly display heatmaps (CAM), activation maps and first layer filters/patterns.

These are filter examples:

The above image has been created with code similar to this:

# Get the weights of the first layer ('layer_name' is a placeholder for its actual name).
weights = model.get_layer('layer_name').get_weights()[0]
# Arrange the per-neuron filters/patterns into an 8x8 grid for display.
neuron_patterns = cai.util.show_neuronal_patterns(weights, NumRows=8, NumCols=8, ForceCellMax=True)
...
plt.imshow(neuron_patterns, interpolation='nearest', aspect='equal')

Activation Maps

These are activation map examples:

The above activation maps have been created with code similar to this:

# Run the model up to 'layer_name' (a placeholder) and keep that layer's output.
conv_output = cai.models.PartialModelPredict(InputImage, model, 'layer_name', False)
...
# Arrange the output channels into an 8x8 grid of 2D activation maps for display.
activation_maps = cai.util.slice_3d_into_2d(aImage=conv_output[0], NumRows=8, NumCols=8, ForceCellMax=True)
...
plt.imshow(activation_maps, interpolation='nearest', aspect='equal')

Heatmaps

The following image shows a car (left - input sample), its heatmap (center) and both added together (right).

Heatmaps can be produced following this example:

heat_map = cai.models.calculate_heat_map_from_dense_and_avgpool(InputImage, image_class, model, pOutputLayerName='last_conv_layer', pDenseLayerName='dense')
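
One possible way to reproduce the "both added together" view is to overlay the returned heat map on the input with matplotlib. This is a hedged sketch: it assumes InputImage is an (H, W, 3) array and heat_map is a 2D array, and it uses the extent argument to stretch the heat map over the image area:

import matplotlib.pyplot as plt

# Draw the heat map semi-transparently on top of the input image.
plt.imshow(InputImage)
plt.imshow(heat_map, cmap='jet', alpha=0.5,
           extent=(0, InputImage.shape[1], InputImage.shape[0], 0))
plt.axis('off')
plt.show()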

Gradient Ascent & Deep Dream

With cai.gradientascent.run_gradient_ascent_octaves, you can easily run gradient ascent to create Deep Dream-like images:

import tensorflow as tf
import matplotlib.pyplot as plt
import cai.models
import cai.gradientascent

# 'original_img' is assumed to be an image previously loaded as an array.
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')
pmodel = cai.models.CreatePartialModel(base_model, 'mixed3')
new_img = cai.gradientascent.run_gradient_ascent_octaves(img=original_img, partial_model=pmodel, low_range=-4, high_range=1)
plt.figure(figsize=(16, 16))
plt.imshow(new_img, interpolation='nearest', aspect='equal')
plt.show()
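
The feature list below also documents cai.models.CreatePartialModelFromChannel(pModel, pOutputLayerName, pChannelIdx) for running gradient ascent from a single channel or neuron. A hedged sketch of the same call targeting one channel (the channel index 0 is an arbitrary choice for illustration):

# Hedged variation: run gradient ascent on a single channel of the 'mixed3' layer.
channel_model = cai.models.CreatePartialModelFromChannel(base_model, 'mixed3', 0)
channel_img = cai.gradientascent.run_gradient_ascent_octaves(img=original_img, partial_model=channel_model, low_range=-4, high_range=1)
plt.imshow(channel_img, interpolation='nearest', aspect='equal')
plt.show()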

There is a ready-to-use example: the Gradient Ascent / Deep Dream Example.

PyDoc

After installing K-CAI, you can find documentation with:

python -m pydoc cai.datasets
python -m pydoc cai.densenet
python -m pydoc cai.layers
python -m pydoc cai.models
python -m pydoc cai.util
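
The same documentation can also be read from an interactive Python session with the built-in help function:

import cai.layers

# Equivalent to running 'python -m pydoc cai.layers' from the shell.
help(cai.layers)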

Scientific Research

These papers were written with the K-CAI API:

Feature List

  • A number of new layer types (see below).
  • cai.util.create_image_generator: this wrapper has extremely well tested default parameters for image classification data augmentation. Getting better image classification accuracy might be just a matter of replacing your current data augmentation generator with this one. Give it a go!
  • cai.util.create_image_generator_no_augmentation: image generator for test datasets.
  • cai.densenet.simple_densenet: simple way to create DenseNet models. See example.
  • cai.datasets.load_hyperspectral_matlab_image: downloads (if required) and loads a hyperspectral image from a MATLAB file. This function has been tested with AVIRIS and ROSIS sensor data stored as MATLAB files.
  • cai.models.calculate_heat_map_from_dense_and_avgpool: calculates a class activation mapping (CAM) inspired by the paper Learning Deep Features for Discriminative Localization (see the Heatmaps example above).
  • cai.util.show_neuronal_patterns: creates an array for visualizing first layer neuronal filters/patterns (see the First Layer Filters example above).
  • cai.models.CreatePartialModel(pModel, pOutputLayerName, hasGlobalAvg=False): creates a partial model up to the layer name defined in pOutputLayerName.
  • cai.models.CreatePartialModelCopyingChannels(pModel, pOutputLayerName, pChannelStart, pChannelCount): creates a partial model up to the layer name defined in pOutputLayerName and then copies channels starting from pChannelStart with pChannelCount channels.
  • cai.models.CreatePartialModelFromChannel(pModel, pOutputLayerName, pChannelIdx): creates a partial model up to the layer name defined in pOutputLayerName and then copies the channel at index pChannelIdx. Use it in combination with cai.gradientascent.run_gradient_ascent_octaves to run gradient ascent from a specific channel or neuron.
  • cai.models.CreatePartialModelWithSoftMax(pModel, pOutputLayerName, numClasses, newLayerName='k_probs'): creates a partial model up to the layer name defined in pOutputLayerName and then adds a dense layer with softmax. This method was built to be used for image classification with transfer learning (see the sketch after this list).
  • cai.gradientascent.run_gradient_ascent_octaves: allows visualizing patterns recognized by inner neuronal layers. See example. Use it in combination with cai.models.CreatePartialModel, cai.models.CreatePartialModelCopyingChannels or cai.models.CreatePartialModelFromChannel.
  • cai.datasets.save_tfds_in_format: saves a TensorFlow dataset as image files. Classes are folders. See example.
  • cai.datasets.load_images_from_folders: practical way to load small datasets into memory. It supports smart resizing, LAB color encoding and bipolar inputs.
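
As referenced above, here is a hedged sketch of cai.models.CreatePartialModelWithSoftMax used for transfer learning. Only the function signature comes from this list; the backbone, the cut layer 'mixed7', the class count of 10 and the compile settings are placeholder choices for illustration:

import tensorflow as tf
import cai.models

# Pretrained backbone from Keras applications.
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')

# Cut the backbone at the chosen layer and append a dense softmax classifier for 10 classes.
transfer_model = cai.models.CreatePartialModelWithSoftMax(base_model, 'mixed7', 10)

transfer_model.compile(optimizer='adam',
                       loss='sparse_categorical_crossentropy',
                       metrics=['accuracy'])
# Train with your own data, e.g. transfer_model.fit(x_train, y_train, epochs=10).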

New Layers

  • cai.layers.ConcatNegation: concatenates the input with its negation (input tensor multiplied by -1); a conceptual sketch of this kind of layer follows this list.
  • cai.layers.CopyChannels: copies a subset of the input channels.
  • cai.layers.EnforceEvenChannelCount: enforces that the number of channels is even (divisible by 2).
  • cai.layers.FitChannelCountTo: forces the output to have a specific number of channels, which must be bigger than the number of input channels. The channel count is fitted by concatenating copies of existing channels.
  • cai.layers.GlobalAverageMaxPooling2D: adds both global average and max pooling. It speeds up training when used as a replacement for standard average pooling and max pooling.
  • cai.layers.InterleaveChannels: interleaves channels with a step size given by the parameter.
  • cai.layers.kPointwiseConv2D: parameter-efficient pointwise convolution as shown in the papers Grouped Pointwise Convolutions Reduce Parameters in Convolutional Neural Networks and An Enhanced Scheme for Reducing the Complexity of Pointwise Convolutions in CNNs for Image Classification Based on Interleaved Grouped Filters without Divisibility Constraints.
  • cai.layers.Negate: negates (multiplies by -1) the input tensor.
  • cai.layers.SumIntoHalfChannels: divides the channels into two halves and then sums them. This results in an output with half of the input channels.
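
As mentioned for ConcatNegation above, the following conceptual sketch shows how this kind of layer can be expressed as a custom Keras layer. It only illustrates the idea and is not the library's actual implementation:

import tensorflow as tf

class NegateSketch(tf.keras.layers.Layer):
    """Conceptual stand-in for cai.layers.Negate: multiplies the input by -1."""
    def call(self, inputs):
        return -inputs

class ConcatNegationSketch(tf.keras.layers.Layer):
    """Conceptual stand-in for cai.layers.ConcatNegation: concatenates the
    input with its negation along the channel axis, doubling the channel count."""
    def call(self, inputs):
        return tf.concat([inputs, -inputs], axis=-1)

Note that both sketches add no trainable parameters.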

Give this Project a Star

This is an open source project. If you like what you see, please give it a star on GitHub.

Citing this API

You can cite this API in BibTeX format with:

@software{k_cai_neural_api_2021_5810092,
  author       = {Joao Paulo Schwarz Schuler},
  title        = {K-CAI NEURAL API},
  month        = dec,
  year         = 2021,
  publisher    = {Zenodo},
  doi          = {10.5281/zenodo.5810092},
  url          = {https://doi.org/10.5281/zenodo.5810092}
}