Negfir's Stars
DmitryRyumin/AAAI-2024-Papers
AAAI 2024 Papers: Explore a comprehensive collection of innovative research papers presented at one of the premier artificial intelligence conferences. Seamlessly integrate code implementations for better understanding. ⭐ Experience the forefront of progress in artificial intelligence with this repository!
automl/HW-GPT-Bench
HW-GPT-Bench: Hardware-Aware Architecture Benchmark for Language Models
microsoft/nni
An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.
D-X-Y/AutoDL-Projects
Automated deep learning algorithms implemented in PyTorch.
SamsungLabs/eagle
Measuring and predicting on-device metrics (latency, power, etc.) of machine learning models
felixchenfy/Speech-Commands-Classification-by-LSTM-PyTorch
Classification of 11 types of audio clips using MFCC features and an LSTM. Pretrained on the Speech Commands Dataset with intensive data augmentation.
NVIDIA/NeMo
A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal AI, and Speech AI (Automatic Speech Recognition and Text-to-Speech).
jha-lab/txf_design-space
[JAIR'23] FlexiBERT tool for Transformer design space exploration.
aaronserianni/training-free-nas
automl/NASLib
NASLib is a Neural Architecture Search (NAS) library for facilitating NAS research for the community by providing interfaces to several state-of-the-art NAS search spaces and optimizers.
automl/awesome-transformer-search
A curated list of awesome resources combining Transformers with Neural Architecture Search
quark0/darts
Differentiable architecture search for convolutional and recurrent networks
fmsnew/nas-bench-nlp-release
zheng-ningxin/brp-nas
greentfrapp/deep-learning-book-notes
Notes on Deep Learning textbook by Ian Goodfellow, Yoshua Bengio and Aaron Courville
eriklindernoren/ML-From-Scratch
Machine Learning From Scratch. Bare bones NumPy implementations of machine learning models and algorithms with a focus on accessibility. Aims to cover everything from linear regression to deep learning.
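To illustrate the "from scratch with NumPy" spirit of this repository, here is a minimal sketch (not code from the repo itself) of least-squares linear regression solved directly via the normal equations:

```python
import numpy as np

def fit_linear_regression(X, y):
    """Fit y = w0 + w1*x1 + ... by least squares, using plain NumPy."""
    X_b = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend a bias column
    # Solve min_w ||X_b w - y||^2 with a numerically stable solver
    w, *_ = np.linalg.lstsq(X_b, y, rcond=None)
    return w  # [intercept, slope_1, ..., slope_d]

# Recover the exact line y = 2x + 1 from four noise-free samples
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = 2.0 * X[:, 0] + 1.0
w = fit_linear_regression(X, y)
```

On noise-free data the solver recovers the intercept 1 and slope 2 exactly (up to floating-point error).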
yaoxingcheng/TLM
[ICML'22] NLP From Scratch Without Large-Scale Pretraining: A Simple and Efficient Framework
huawei-noah/Pretrained-Language-Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
ethereon/netscope
Neural network visualizer
rbonghi/jetson_stats
📊 Simple package for monitoring and controlling your NVIDIA Jetson [Orin, Xavier, Nano, TX] series
anyoptimization/pymoo
NSGA2, NSGA3, R-NSGA3, MOEAD, Genetic Algorithms (GA), Differential Evolution (DE), CMAES, PSO
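To show the kind of evolutionary loop that pymoo's GA and NSGA-II implementations generalize, here is a minimal single-objective genetic algorithm sketched in plain Python. It deliberately does not use pymoo's API; all names below are illustrative:

```python
import random

def genetic_algorithm(fitness, bounds, pop_size=30, generations=60, seed=0):
    """Minimize `fitness` over a 1-D real interval with a toy GA."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]

    def select():
        # Binary tournament selection: keep the fitter of two random picks
        a, b = rng.sample(pop, 2)
        return a if fitness(a) < fitness(b) else b

    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            p1, p2 = select(), select()
            child = 0.5 * (p1 + p2)                    # arithmetic crossover
            child += rng.gauss(0.0, 0.05 * (hi - lo))  # Gaussian mutation
            children.append(min(max(child, lo), hi))   # clip to bounds
        pop = children
    return min(pop, key=fitness)

# Minimize (x - 3)^2: the population should converge near x = 3
best = genetic_algorithm(lambda x: (x - 3.0) ** 2, bounds=(-10.0, 10.0))
```

pymoo wraps the same select/crossover/mutate loop behind problem and algorithm objects, and extends it to multi-objective settings (NSGA-II/III, MOEA/D) with non-dominated sorting in place of scalar fitness comparison.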
jmschrei/pomegranate
Fast, flexible, and easy-to-use probabilistic modeling in Python.
mit-han-lab/hardware-aware-transformers
[ACL'20] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
VicentePerezSoloviev/EDAspy
Python package for Estimation of Distribution Algorithms (EDAs)
HarisIqbal88/PlotNeuralNet
LaTeX code for drawing neural network diagrams
ray-project/ray
Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
fatemehNe/grad-process
The graduation process at the Amirkabir University of Technology Computer Engineering Department
gpoore/minted
minted is a LaTeX package that provides syntax highlighting using the Pygments library. Highlighted source code can be customized using fancyvrb.
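A typical use looks like the following minimal sketch; note that the document must be compiled with the `-shell-escape` flag so LaTeX is allowed to invoke Pygments:

```latex
\documentclass{article}
\usepackage{minted}
\begin{document}
% The argument to the environment names the Pygments lexer to use
\begin{minted}{python}
def greet(name):
    return f"Hello, {name}!"
\end{minted}
\end{document}
```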
academicpages/academicpages.github.io
Github Pages template for academic personal websites, forked from mmistakes/minimal-mistakes