Pinned Repositories
Acoustics_FTB
Algorithms
AnimationDemo
architecture-fullstack
This project scrapes stock data and feeds it through a pipeline that trains an ML model and serves a Streamlit dashboard. The application is containerized with Docker and deployed to a virtual machine on AWS EC2, which is reached through a custom domain hosted on Cloudflare DNS. The main challenge was the complexity introduced by a large number of Compose files; because the deployment was unusual, research turned up little guidance, and I arrived at a working configuration through systematic experimentation.
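To illustrate the multi-file Compose setup described above (a minimal sketch only; the service names, image, and ports are assumptions, not the project's actual configuration), a base file can be combined with an environment-specific override at deploy time:

```yaml
# docker-compose.yml — base service definition (illustrative)
services:
  dashboard:
    build: .
    ports:
      - "8501:8501"   # Streamlit's default port

# A production override file (e.g. docker-compose.prod.yml) can then be
# merged over this base on the EC2 host:
#   docker compose -f docker-compose.yml -f docker-compose.prod.yml up -d
```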
audio-signal-processing
bioinformatics-btree
This project reads the human genome by encoding its sequence. Built from the four organic chemicals known as bases, the human genome is about 2.87 billion bases long. Using the B-tree data structure to work within memory constraints, the sequence can also be searched in O(log n) time.
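The encoding idea can be sketched as follows (a hypothetical illustration, not the repository's actual code): because there are only four bases, each one fits in 2 bits, shrinking the sequence to a quarter of one-byte-per-character storage.

```python
# Hypothetical sketch: pack a DNA sequence into 2 bits per base.
BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BITS_TO_BASE = {v: k for k, v in BASE_TO_BITS.items()}

def encode(seq: str) -> int:
    """Encode a DNA string into a single integer, 2 bits per base."""
    value = 0
    for base in seq:
        value = (value << 2) | BASE_TO_BITS[base]
    return value

def decode(value: int, length: int) -> str:
    """Recover the original string given its base count."""
    bases = []
    for _ in range(length):
        bases.append(BITS_TO_BASE[value & 0b11])
        value >>= 2
    return "".join(reversed(bases))

print(decode(encode("GATTACA"), 7))
```

Fixed-width encoded keys like this are exactly what a B-tree can index to keep searches logarithmic while most of the tree stays on disk.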
Docker_VPS_App
This project scrapes data of stocks and feeds them down a pipeline that trains an ML model and displays it as a dashboard using Streamlit. This Streamlit app is then containerized using docker and deployed to a virtual machine on AWS using EC2. This EC2 instance is then accessed from a custom domain name that is being hosted on a Cloudflare DNS server.
forecast-product-demand
Harnessing machine learning, this project offers large retail stores a more efficient supply chain by forecasting future inventory sales. It gives supply chain managers insight for measuring lost sales from shortages and for identifying the products with the strongest upward trend, ultimately strengthening decision-making when shelf space is limited. The main challenge was designing a single model that understood that one product appears at multiple indices, with each record representing that product at an individual shop. I solved this by slicing the data by product and creating one such slice for every store; this not only solved the problem but produced demand predictions for every good at every location.
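The slicing step described above can be sketched like this (field names and records are hypothetical; the repository's actual schema may differ): flat sales records are grouped by store and product so each slice becomes one forecastable time series.

```python
from collections import defaultdict

# Hypothetical flat records: one row per (store, product, period) observation.
records = [
    {"store": 1, "product": "milk",  "period": 1, "units": 30},
    {"store": 1, "product": "milk",  "period": 2, "units": 34},
    {"store": 2, "product": "milk",  "period": 1, "units": 12},
    {"store": 1, "product": "bread", "period": 1, "units": 20},
]

def slice_by_store_product(rows):
    """Group flat records into one series per (store, product) pair."""
    series = defaultdict(list)
    for row in rows:
        series[(row["store"], row["product"])].append((row["period"], row["units"]))
    # Sort each slice chronologically so it is ready for a forecasting model.
    return {key: sorted(vals) for key, vals in series.items()}

slices = slice_by_store_product(records)
print(slices[(1, "milk")])
```

Each resulting slice can then be fed to the model independently, which is what yields a demand prediction for every good at every location.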
strategic-financial-insight
A project I completed for the Idaho Policy Institute, delivering models to support their decision-making in forming a strategy that best allocates government resources intended to reduce local crime. I encountered two main problems. The first was an imbalance of data across a twenty-year period, which I solved after identifying a pattern of consistent data collection across the variables every five years. The second was that the number of variables in the data far exceeded what I could analyze manually; I solved this by building a pipeline that returned only the variables whose relevance to the chosen targets surpassed a set threshold.
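A minimal sketch of such a relevance-threshold filter, assuming Pearson correlation with the target as the relevance measure (the variable names, data, and the metric itself are illustrative; the project's actual pipeline may differ):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_features(features, target, threshold=0.5):
    """Keep variables whose |correlation| with the target passes the threshold."""
    return [name for name, values in features.items()
            if abs(pearson(values, target)) >= threshold]

# Hypothetical data: two relevant variables and one noise variable.
features = {
    "unemployment": [5.1, 5.8, 6.2, 7.0],
    "patrol_hours": [40, 38, 33, 30],
    "noise":        [1.0, -2.0, 1.5, -0.5],
}
crime_rate = [10.0, 11.2, 12.5, 14.1]
print(select_features(features, crime_rate))
```

Screening hundreds of variables down to the few that clear the threshold is what made manual review of the survivors tractable.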
TensorAudio
cogentdom's Repositories
cogentdom/forecast-product-demand
cogentdom/Docker_VPS_App
cogentdom/TensorAudio
cogentdom/Algorithms
cogentdom/AnimationDemo
cogentdom/architecture-fullstack
cogentdom/audio-signal-processing
cogentdom/bioinformatics-btree
cogentdom/strategic-financial-insight
cogentdom/API-Dashboard
cogentdom/API-Data-Runner
cogentdom/AppModTraining
cogentdom/Bengali-Audio
cogentdom/Cloud_Build
cogentdom/cogentdom
cogentdom/Flask_REST
cogentdom/LocationDemo
cogentdom/mastering-supply-chain
cogentdom/MessageDemo
cogentdom/MusicManipulation
cogentdom/PickerViewDemo
cogentdom/portfolio
cogentdom/Prices_Indices
cogentdom/RR_Diner_Coffee
cogentdom/Sentiment_API
cogentdom/Streamlit
cogentdom/SurveyDataCollector
cogentdom/SwiftDemo2
cogentdom/tf2-for-deep-learning
cogentdom/TFTwalkthrough