Pinned Repositories
BinaryClassificationMNIST
This project classifies handwritten digit images from the MNIST dataset using a single-neuron logistic regression neural network. Open and run the Jupyter notebook for a tutorial on how to build this logistic regression model from scratch.
cifar100_cnn
Implements multiple CNN architectures to classify the CIFAR-100 dataset.
differential-privacy-sgd
DogShelterGraph
An example of the minimum spanning tree and minimum vertex coloring problems for a dog shelter. Dogs that do not get along are assigned to the minimum number of kennels (vertex coloring); then, given pairwise distances, a minimum spanning tree is built with kennels as vertices and paths as edges.
doh-dot-gateway
dot-fingerprinting
follower-recommendations
By counting the number of two-paths in the follower graph, this program generates social media follower recommendations for each user.
HuffmanEncoding
Implements Huffman encoding, a variable-length, lossless compression scheme, for grayscale image data: count the color frequencies, build a Huffman tree bottom-up from the least frequent colors, derive each color's bit code by traversing the tree (0 for a left edge, 1 for a right edge), and report the compressed size as a percentage of the original 8-bits-per-pixel size. See camlischke1/HuffmanEncoding below for the full description.
marl-anomaly-detect
This project tests multiple machine learning algorithms for detecting adversarial attacks in multi-agent reinforcement learning settings. Baselines are used to compare the performance of a proposed ensemble model; then, using FGSM, the ensemble detection model is re-attacked with perturbed observations. Read more in the PDF titled FinalPaper.
nslkdd-intrusion
camlischke1's Repositories
camlischke1/marl-anomaly-detect
This project tests multiple machine learning algorithms for detecting adversarial attacks in multi-agent reinforcement learning settings. Baselines are used to compare the performance of a proposed ensemble model; then, using FGSM, the ensemble detection model is re-attacked with perturbed observations. Read more in the PDF titled FinalPaper.
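The FGSM re-attack step roughly amounts to pushing each observation one step in the direction that maximizes the detector's loss. A minimal TensorFlow sketch, assuming a generic Keras classifier `model` with integer labels (the project's actual models and data pipeline live in the repo):

```python
import tensorflow as tf

def fgsm_perturb(model, x, y_true, epsilon=0.1):
    """One-step fast gradient sign attack: move x along sign(d loss / d x)."""
    x = tf.convert_to_tensor(x, dtype=tf.float32)
    with tf.GradientTape() as tape:
        tape.watch(x)
        y_pred = model(x, training=False)
        loss = tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)
    grad = tape.gradient(loss, x)        # gradient of the loss w.r.t. the input
    return x + epsilon * tf.sign(grad)   # perturbed observations
```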
camlischke1/nslkdd-intrusion
camlischke1/SceneClassificationCNN
The task is to identify which kind of scene an image belongs to: Buildings, Forests, Mountains, Glacier, Street, or Sea.
camlischke1/seed-labs-jhu
Includes lab writeups and code for Syracuse SEED Security Labs
camlischke1/BinaryClassificationMNIST
This project classifies handwritten digit images from the MNIST dataset using a single-neuron logistic regression neural network. Open and run the Jupyter notebook for a tutorial on how to build this logistic regression model from scratch.
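For context, the single-neuron model the notebook builds is ordinary logistic regression trained with gradient descent. A minimal NumPy sketch (not the notebook's exact code; `X` is a flattened pixel matrix and `y` holds binary labels):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_neuron(X, y, lr=0.1, epochs=1000):
    """X: (m, n) pixels scaled to [0, 1]; y: (m,) labels in {0, 1}."""
    m, n = X.shape
    w, b = np.zeros(n), 0.0
    for _ in range(epochs):
        y_hat = sigmoid(X @ w + b)      # forward pass
        dz = y_hat - y                  # gradient of cross-entropy w.r.t. the pre-activation
        w -= lr * (X.T @ dz) / m        # weight update
        b -= lr * dz.mean()             # bias update
    return w, b
```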
camlischke1/cifar100_cnn
Implements multiple CNN architectures to classify the CIFAR-100 dataset.
camlischke1/differential-privacy-sgd
camlischke1/DogShelterGraph
An example of the minimum spanning tree and minimum vertex coloring problems for a dog shelter. Dogs that do not get along are assigned to the minimum number of kennels (vertex coloring); then, given pairwise distances, a minimum spanning tree is built with kennels as vertices and paths as edges.
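As a rough illustration of the two underlying graph problems (using networkx; the dog names, distances, and the greedy coloring heuristic are illustrative, not the repo's actual data or method):

```python
import networkx as nx

# Conflict graph: an edge means the two dogs cannot share a kennel.
conflicts = nx.Graph([("Rex", "Fido"), ("Fido", "Spot"), ("Rex", "Spot")])
kennel_of = nx.greedy_color(conflicts, strategy="largest_first")  # dog -> kennel id (heuristic)

# Distance graph between kennels; edge weights are path lengths.
kennels = nx.Graph()
kennels.add_weighted_edges_from([(0, 1, 4.0), (1, 2, 2.5), (0, 2, 3.0)])
paths_to_build = nx.minimum_spanning_tree(kennels)  # Kruskal's algorithm by default

print(kennel_of)
print(sorted(paths_to_build.edges(data="weight")))
```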
camlischke1/doh-dot-gateway
camlischke1/dot-fingerprinting
camlischke1/follower-recommendations
By counting the number of two-paths in the follower graph, this program generates social media follower recommendations for each user.
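The idea: a two-path u → v → w (u follows v, v follows w) is evidence that u might want to follow w. A small sketch with a toy adjacency-set representation (the repo's graph format may differ):

```python
from collections import Counter

# follows[u] = set of users that u currently follows (toy data).
follows = {"a": {"b", "c"}, "b": {"c", "d"}, "c": {"d"}, "d": set()}

def recommend(user, follows, top_k=3):
    """Rank candidates by the number of two-paths user -> v -> candidate."""
    counts = Counter()
    for v in follows[user]:
        for w in follows[v]:
            if w != user and w not in follows[user]:
                counts[w] += 1
    return counts.most_common(top_k)

print(recommend("a", follows))   # [('d', 2)]
```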
camlischke1/HuffmanEncoding
Huffman encoding is a variable-length encoding method that can be used to compress a file without losing any information, so when the file is decompressed you get back exactly the original file. A variable-length encoding allows colors to be represented by different numbers of bits, not always one byte (8 bits); colors that are used more often should be represented with fewer bits, and vice versa. The Huffman encoding algorithm was described in class; refer to your class notes for details. The basic steps are these:
● Read through the image file and count the frequencies of the colors.
● From the frequency table, make leaf nodes representing the colors that exist in the image. Store the frequency of the color and the color value (a number between 0 and 255) in each node.
● Build a Huffman tree from the bottom up, continuing until only one node has no parent:
○ Choosing from among the nodes with no parent, find the two nodes with the smallest frequencies. Make a parent node from them, storing in the parent the sum of the children's frequencies. You don't need to store a color in the new node; only the leaf nodes (the original ones) have a color stored in them.
● Once the Huffman tree is built, use it to encode the colors represented by the leaf nodes. Traverse the tree, gathering a 0 each time you move down a left edge and a 1 each time you move down a right edge. Gather the 0s and 1s as characters in a string and print the string as the bit encoding of the color at that leaf node. This suffices as a simulation of the bit encoding of colors.
● Figure out how many bits would be required for the compressed grayscale image, and how many bits would be required if each pixel were one byte. Convert this to a percentage of the original file's size; for example, 64% would mean the compressed file would be 64% of the original uncompressed size (not counting the size of the table that would have to be included in the compressed file for decoding).
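A compact sketch of the tree-building, code-assignment, and size-comparison steps above, using a min-heap of (frequency, node) pairs; the color frequencies are made up for illustration:

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """freqs: {color_value: count}. Returns {color_value: bit string}."""
    order = count()  # tiebreaker so the heap never compares nodes directly
    heap = [(f, next(order), color) for color, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:                       # merge the two smallest-frequency nodes
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(order), (left, right)))
    codes = {}
    def walk(node, prefix):                    # 0 for a left edge, 1 for a right edge
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"        # degenerate single-color image
    walk(heap[0][2], "")
    return codes

freqs = {0: 40, 128: 35, 255: 25}              # hypothetical color frequencies
codes = huffman_codes(freqs)
compressed_bits = sum(freqs[c] * len(code) for c, code in codes.items())
total_pixels = sum(freqs.values())
print(codes, compressed_bits / (8 * total_pixels))   # fraction of the 1-byte-per-pixel size
```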
camlischke1/IndexedColorCompression
camlischke1/Infix-to-Prefix
Translates infix mathematical expressions to prefix and evaluates them
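One common way to do this (not necessarily the repo's approach) is to convert infix to postfix with a shunting-yard pass and then fold the postfix form into a prefix string; a sketch for space-separated tokens with +, -, *, / and parentheses:

```python
PRECEDENCE = {"+": 1, "-": 1, "*": 2, "/": 2}

def to_postfix(tokens):
    out, stack = [], []
    for t in tokens:
        if t in PRECEDENCE:
            # Pop operators of higher-or-equal precedence (left-associative).
            while stack and stack[-1] in PRECEDENCE and PRECEDENCE[stack[-1]] >= PRECEDENCE[t]:
                out.append(stack.pop())
            stack.append(t)
        elif t == "(":
            stack.append(t)
        elif t == ")":
            while stack[-1] != "(":
                out.append(stack.pop())
            stack.pop()                      # discard the "("
        else:
            out.append(t)                    # operand
    return out + stack[::-1]

def to_prefix(tokens):
    stack = []
    for t in to_postfix(tokens):
        if t in PRECEDENCE:
            right, left = stack.pop(), stack.pop()
            stack.append(f"{t} {left} {right}")
        else:
            stack.append(t)
    return stack[0]

print(to_prefix("3 + 4 * 2".split()))   # "+ 3 * 4 2"
```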
camlischke1/KnapsackProblem
You are given a knapsack of capacity W and n items with weights {w_i} and values {v_i}; each item may be included at most once (the 0/1 knapsack problem). This program finds the optimal solution to a given instance.
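For reference, the standard dynamic-programming solution to the 0/1 version runs in O(nW) time when the weights are integers; a short sketch (the repo's input format and method may differ):

```python
def knapsack(capacity, weights, values):
    """0/1 knapsack: dp[c] = best total value achievable with weight <= c."""
    dp = [0] * (capacity + 1)
    for w, v in zip(weights, values):
        # Iterate capacities downward so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

print(knapsack(10, [5, 4, 6, 3], [10, 40, 30, 50]))   # 90 (take the weight-4 and weight-3 items)
```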
camlischke1/MNISTClassification
This program implements a convolutional neural network that takes in 28x28-pixel images of handwritten digits and classifies each one as a digit from zero through nine. It uses the TensorFlow, Keras, and NumPy libraries to recognize an image and predict its digit with a deep learning technique.
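In spirit, the model looks something like the following small Keras network (the repo's actual architecture and hyperparameters may differ):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Small convolutional network for 28x28 grayscale digits, 10 output classes.
model = tf.keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0     # add a channel dimension and normalize
model.fit(x_train, y_train, epochs=1, batch_size=128)
```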
camlischke1/NwSec
EN.650.624 Network Security
camlischke1/sans-index-template
camlischke1/UnsupervisedLearningMNIST
This Jupyter notebook implements Hierarchical Agglomerative Clustering and KMeans clustering to cluster the MNIST dataset into 10 clusters.
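Roughly what this amounts to in scikit-learn (subset size and parameters are illustrative; the notebook's settings may differ):

```python
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.datasets import fetch_openml

# Use a subset: agglomerative clustering on all 70,000 MNIST images is expensive.
X, _ = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = X[:5000] / 255.0

kmeans_labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(X)
hac_labels = AgglomerativeClustering(n_clusters=10, linkage="ward").fit_predict(X)
print(kmeans_labels[:10], hac_labels[:10])
```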
camlischke1/vcenter_sdk_test
Just playing around with the vCenter automation SDK for Python.
camlischke1/WarGameLisp
This program is written in Common Lisp and simulates the card game "War". Two players are each dealt a hand of 26 cards, half the deck. Every round, both players flip their top card; whoever has the higher card wins and keeps the other player's card. If it is a tie, both players place 3 more cards face down and then flip one last card to break the tie, and the winner keeps all cards on the table. The goal is to win all of the other player's cards.
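The same rules translate directly into a short simulation; a Python sketch for comparison (the repository itself is Common Lisp, and its dealing and tie-breaking details may differ):

```python
import random

def play_war(max_rounds=10000):
    """Simulate one game of War; return the winning player, 1 or 2."""
    deck = [rank for rank in range(2, 15) for _ in range(4)]   # 2..14 (ace high), 4 suits
    random.shuffle(deck)
    p1, p2 = deck[:26], deck[26:]                              # 26 cards each
    for _ in range(max_rounds):
        if not p1 or not p2:
            break
        table = []
        while True:
            c1, c2 = p1.pop(0), p2.pop(0)                      # both players flip a card
            table += [c1, c2]
            if c1 != c2:
                (p1 if c1 > c2 else p2).extend(table)          # winner keeps the table
                break
            if len(p1) < 4 or len(p2) < 4:                     # not enough cards for a war
                return 1 if len(p1) > len(p2) else 2
            table += [p1.pop(0) for _ in range(3)]             # 3 face-down cards each,
            table += [p2.pop(0) for _ in range(3)]             # then flip one more
    return 1 if len(p1) > len(p2) else 2

print(play_war())
```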
camlischke1/WarGameProlog
This program is written in Prolog and simulates the card game "War". Two players are each dealt a hand of 26 cards, half the deck. Every round, both players flip their top card; whoever has the higher card wins and keeps the other player's card. If it is a tie, both players place 3 more cards face down and then flip one last card to break the tie; the winner gets a point. The objective is to beat your opponent. To start, call start(). in the listener window of a Prolog interpreter.