This report provides an overview of current practice in Electrical Impedance Tomography (EIT), the images it produces, and its use cases. EIT is a non-invasive medical imaging technique in which the conductivity distribution inside the body is inferred from measurements taken at surface electrodes. Such advances are improving our capacity to treat and even prevent cancers, though the full implications of the technique remain to be explored. The research techniques used in this project are detailed below.
- Python 3.x
- NumPy
- SciPy
- Pandas
- Matplotlib
- scikit-learn
- OpenCV-Python
eit
│ README.md
│
└───assets
│ datasets - contains datasets in csv
│ eit_images - generated images
│
└───classification
│ *.ipynb - Classification ML algorithms
│ results.ipynb - Final results and graphs
│
└───docs
│ documentation and reports
│
└───main
eit_analysis.py
eit_classify.py
eit_dataset.py
generate_image.py
- Generates 1000 images
- Uses the NumPy functions `linspace` and `meshgrid` to build the image grid
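As a sketch of the image-generation step, the following shows how `linspace` and `meshgrid` can be combined to produce synthetic intensity images. The Gaussian-anomaly pattern, the grid size, and all parameter values are illustrative assumptions, not the actual contents of `generate_image.py`.

```python
import numpy as np

def generate_synthetic_image(size=64, cx=0.0, cy=0.0, sigma=0.5):
    """Generate one synthetic intensity map on a regular grid.

    `cx`, `cy`, and `sigma` (anomaly centre and spread) are
    illustrative parameters, not taken from generate_image.py.
    """
    x = np.linspace(-1.0, 1.0, size)   # 1-D coordinate axis
    y = np.linspace(-1.0, 1.0, size)
    xx, yy = np.meshgrid(x, y)         # 2-D coordinate grids
    # A Gaussian "conductivity anomaly" as a stand-in intensity pattern
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))

# Generate a small batch of images with randomly placed anomalies;
# the real script generates 1000 of them.
rng = np.random.default_rng(0)
images = [generate_synthetic_image(cx=rng.uniform(-0.5, 0.5),
                                   cy=rng.uniform(-0.5, 0.5))
          for _ in range(10)]
```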
read_img.py
- Reads an image into the code
- `matrix` contains the three-dimensional array of the image
- `img` contains the three-dimensional array of the imported image
- `grayscale` contains the two-dimensional (grayscale) array of the image
- `x` contains the x dimension of the image
- `y` contains the y dimension of the image
eit.py
- Plots contour graphs
- Applies a list of colors; the plot can be saved as an image
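A minimal sketch of such a contour plot, assuming Matplotlib's `contourf` with an explicit color list; the grid, the intensity function, and the colors are stand-ins, not the ones used in `eit.py`.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")           # render without a display
import matplotlib.pyplot as plt

# Grid and intensity values to contour (stand-ins for the EIT image data)
x = np.linspace(-1, 1, 100)
y = np.linspace(-1, 1, 100)
xx, yy = np.meshgrid(x, y)
z = np.exp(-(xx ** 2 + yy ** 2))

# Filled contour plot with an explicit list of colors, saved as an image.
colors = ["navy", "blue", "cyan", "yellow", "orange", "red"]
fig, ax = plt.subplots()
cs = ax.contourf(xx, yy, z, levels=len(colors), colors=colors)
fig.colorbar(cs)
fig.savefig("eit_contour.png")
plt.close(fig)
```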
eit_dataset.py
- Generates the dataset without labels; creates the file `eit.csv`
- `intensity_range_strings` contains the ranges of intensities
- `classify_dict` contains the dataset in the form of a dictionary
- `df` contains the final frame to be converted to csv
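The dictionary-to-CSV flow can be sketched as below; the range labels, row counts, and random pixel counts are illustrative assumptions, while the variable names follow those listed for `eit_dataset.py`.

```python
import numpy as np
import pandas as pd

# Illustrative intensity ranges; the actual bins in eit_dataset.py may differ
intensity_range_strings = ["0.0-0.2", "0.2-0.4", "0.4-0.6", "0.6-0.8", "0.8-1.0"]

rng = np.random.default_rng(0)

# One row per generated image: counts of pixels falling in each range
classify_dict = {label: rng.integers(0, 100, size=10)
                 for label in intensity_range_strings}

df = pd.DataFrame(classify_dict)   # final frame, converted to csv
df.to_csv("eit.csv", index=False)
```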
eit_analysis.py
- Assigns targets 1 or 0 and creates another dataset; creates the file `eit_data.csv`
- `target` contains the target array of 0s and 1s
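One way to assign binary targets is a threshold on a feature column, sketched below. The threshold rule and the stand-in data are assumptions; the project's actual labelling criterion is not stated here.

```python
import numpy as np
import pandas as pd

# Stand-in for the unlabelled eit.csv produced by eit_dataset.py
df = pd.DataFrame({"0.8-1.0": [5, 80, 12, 95]})

# Hypothetical rule: label a sample 1 when its high-intensity
# pixel count exceeds a threshold, else 0
target = np.where(df["0.8-1.0"] > 50, 1, 0)
df["target"] = target
df.to_csv("eit_data.csv", index=False)
```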
eit_classify.py
- Generates classification plots; generates the file `eit_contour_plot.csv`
- `autolabel` is a function that labels bar graphs
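An `autolabel` helper for bar graphs commonly looks like the sketch below; the signature and formatting here are assumptions, and the accuracy values are taken from the results table in this README.

```python
import matplotlib
matplotlib.use("Agg")           # render without a display
import matplotlib.pyplot as plt

def autolabel(ax, rects):
    """Attach a text label above each bar showing its height.

    A common Matplotlib helper pattern; the version in
    eit_classify.py may differ in detail.
    """
    for rect in rects:
        height = rect.get_height()
        ax.annotate(f"{height:.1f}",
                    xy=(rect.get_x() + rect.get_width() / 2, height),
                    xytext=(0, 3), textcoords="offset points",
                    ha="center", va="bottom")

# Accuracies from the results table below; abbreviated algorithm names
accuracies = [93.6, 98.8, 94.0, 88.4, 92.4, 99.2, 88.0]
labels = ["KNN", "DT", "KSVM", "LR", "NB", "RF", "SVM"]
fig, ax = plt.subplots()
rects = ax.bar(labels, accuracies)
autolabel(ax, rects)
fig.savefig("classification_accuracy.png")
plt.close(fig)
```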
<*>.ipynb
- All classification ML algorithms ('<*>' stands for every notebook file)
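The notebooks' general shape can be sketched with one of the listed algorithms, Random Forest, using scikit-learn. The synthetic features and labelling rule are stand-ins for the real `eit_data.csv`; only the train/evaluate pattern is the point here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the eit_data.csv features and targets
rng = np.random.default_rng(0)
X = rng.random((500, 5))            # five intensity-range features
y = (X[:, -1] > 0.5).astype(int)    # hypothetical labelling rule

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
```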
| No | Algorithm | Accuracy (%) |
|---|---|---|
| 1 | K Nearest Neighbours | 93.6 |
| 2 | Decision Tree Classification | 98.8 |
| 3 | Kernel Support Vector Machines | 94.0 |
| 4 | Logistic Regression | 88.4 |
| 5 | Naive Bayes | 92.4 |
| 6 | Random Forest Classification | 99.2 |
| 7 | Support Vector Machines | 88.0 |
Copyright (c) 2018, Faststream Technologies
Authors:
CTO, Faststream Technologies