dawnduan's Repositories
dawnduan/algorithm-design-analysis-and-complexity
CSC373, Summer 2018
dawnduan/angularjs-google-maps
The Simplest AngularJS Google Maps V3 Directive
dawnduan/angularjs-google-maps-components
Making Google Maps components with AngularJS
dawnduan/computer-algorithms-and-data-structures-2015
dawnduan/intro-to-artificial-intelligence-winter-2017
dawnduan/machine-learning-and-data-mining-winter-2017
dawnduan/Map-ListContactView
dawnduan/mars-rover-palnning
dawnduan/pizza-ordering-website
dawnduan/ROB301-Robot-Pizza-Delivery-
dawnduan/aer1810_llm_paper
Transformers for annotating scientific literature. We propose to use Transformer architectures to annotate scientific papers. Our target annotations are (1) results that can be used in meta-studies and (2) items that match the SIGSOFT Empirical Standards.
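A minimal sketch of what one such annotation pass might look like: running sentences from a paper through a generic Hugging Face text-classification pipeline. The checkpoint and example sentences here are placeholders, not this project's actual model or label set.

```python
# Hypothetical sketch: tagging sentences from a paper with a text classifier.
# The checkpoint below is a generic placeholder, not this project's model.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # placeholder
)

sentences = [
    "The treatment group improved accuracy by 4.2% (p < 0.05).",
    "We thank the anonymous reviewers for their feedback.",
]

# Each prediction is a dict with a label and a confidence score.
for sent, pred in zip(sentences, classifier(sentences)):
    print(f"{pred['label']:>10}  {pred['score']:.2f}  {sent}")
```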
dawnduan/Contact-List-Map-View
dawnduan/cs230-code-examples
Code examples in PyTorch and TensorFlow for CS230
dawnduan/findYourCompetitorsNearby
Using the Google Places API Web Service to find competitors around you
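For context, a minimal sketch of the underlying idea: querying the Places API "Nearby Search" endpoint for businesses of a given type around a location. The API key, coordinates, and "cafe" category are placeholders, not this repo's configuration.

```python
# Sketch: find nearby businesses via the Google Places API Nearby Search.
# PLACES_API_KEY and the example location/category are placeholders.
import requests

PLACES_API_KEY = "YOUR_API_KEY"  # placeholder
NEARBY_SEARCH_URL = "https://maps.googleapis.com/maps/api/place/nearbysearch/json"

params = {
    "location": "43.6532,-79.3832",  # latitude,longitude (downtown Toronto)
    "radius": 1500,                  # search radius in metres
    "type": "cafe",                  # the competitor category to look for
    "key": PLACES_API_KEY,
}

resp = requests.get(NEARBY_SEARCH_URL, params=params, timeout=10)
resp.raise_for_status()
for place in resp.json().get("results", []):
    print(place["name"], "-", place.get("vicinity", "?"))
```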
dawnduan/fun-written-ups
Articles inspired by side learning
dawnduan/GraphPDE
Understanding GNNs for inverse problems
dawnduan/MIE1666_Machine-Learning-for-Mathematical-Optimization
Machine Learning for Mathematical Optimization
dawnduan/RAWSim-O
A simulation framework for Robotic Mobile Fulfillment Systems
dawnduan/rectangle-packer
Rectangle packing program
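For illustration, a minimal first-fit "shelf" packing sketch, one common heuristic for this problem; it is not necessarily the algorithm this repo implements.

```python
# Shelf packing: fill fixed-width horizontal shelves left to right,
# opening a new shelf below whenever the current one is full.
def pack_shelves(rects, bin_width):
    """Place (w, h) rectangles into a fixed-width bin.

    Returns a list of (x, y, w, h) placements; total height grows as needed.
    """
    placements = []
    shelf_y = 0    # top of the current shelf
    shelf_h = 0    # height of the tallest rectangle on it
    cursor_x = 0   # next free x position on the shelf
    for w, h in sorted(rects, key=lambda r: r[1], reverse=True):  # tallest first
        if cursor_x + w > bin_width:   # shelf full: start a new one below
            shelf_y += shelf_h
            cursor_x, shelf_h = 0, 0
        placements.append((cursor_x, shelf_y, w, h))
        cursor_x += w
        shelf_h = max(shelf_h, h)
    return placements

print(pack_shelves([(4, 3), (3, 2), (5, 1), (2, 2)], bin_width=8))
```

Sorting by height first keeps shelves dense; more sophisticated packers (skyline, maximal rectangles) trade simplicity for tighter layouts.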
dawnduan/Robot-factory-
dawnduan/TPGNN-MTS
We aim to explore advanced deep learning strategies, comparing the effectiveness of Long Short-Term Memory (LSTM) networks enhanced with various attention mechanisms against Transformer models in an encoder-decoder architecture.
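A compact sketch of one such baseline: an LSTM encoder-decoder with dot-product (Luong-style) attention for multi-step forecasting, in PyTorch. The dimensions, horizon, and attention variant are illustrative choices, not this repo's exact configuration.

```python
# Illustrative LSTM encoder-decoder with dot-product attention.
# All sizes below are placeholder choices, not the repo's settings.
import torch
import torch.nn as nn

class AttnSeq2Seq(nn.Module):
    def __init__(self, n_features, hidden=64, horizon=12):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTMCell(n_features, hidden)
        self.out = nn.Linear(2 * hidden, n_features)  # [context; state] -> step

    def forward(self, x):                      # x: (batch, seq_len, n_features)
        enc_out, (h, c) = self.encoder(x)      # enc_out: (B, T, H)
        h, c = h.squeeze(0), c.squeeze(0)      # (B, H) decoder state
        step = x[:, -1]                        # seed decoder with last input
        preds = []
        for _ in range(self.horizon):
            h, c = self.decoder(step, (h, c))
            # Dot-product attention over all encoder states.
            scores = torch.bmm(enc_out, h.unsqueeze(-1)).squeeze(-1)  # (B, T)
            weights = torch.softmax(scores, dim=-1)
            context = torch.bmm(weights.unsqueeze(1), enc_out).squeeze(1)
            step = self.out(torch.cat([context, h], dim=-1))          # (B, F)
            preds.append(step)
        return torch.stack(preds, dim=1)       # (B, horizon, n_features)

model = AttnSeq2Seq(n_features=8)
y = model(torch.randn(4, 48, 8))
print(y.shape)  # torch.Size([4, 12, 8])
```

A Transformer counterpart would replace the recurrent encoder and decoder with self-attention stacks while keeping the same encoder-decoder interface, which is what makes the two families directly comparable.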