Pinned Repositories
mcunet
[NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning
tiny-training
On-Device Training Under 256KB Memory [NeurIPS'22]
tinychat-tutorial
TinyChatEngine
TinyChatEngine: On-Device LLM Inference Library
tinyengine
[NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning; [NeurIPS 2022] MCUNetV3: On-Device Training Under 256KB Memory
Folder-Structure-Conventions
Folder / directory structure options and naming conventions for software projects
llama.cpp
Port of Facebook's LLaMA model in C/C++
RaymondWang0's Repositories
RaymondWang0/llama.cpp
Port of Facebook's LLaMA model in C/C++
RaymondWang0/Folder-Structure-Conventions
Folder / directory structure options and naming conventions for software projects
RaymondWang0/mcunet
[NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning
RaymondWang0/tiny-training
On-Device Training Under 256KB Memory [NeurIPS'22]
RaymondWang0/tinyengine
[NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning; [NeurIPS 2022] MCUNetV3: On-Device Training Under 256KB Memory