Pinned Repositories
alphafold3-pytorch
Implementation of AlphaFold 3 in PyTorch
DiffSynth-Studio
Enjoy the magic of Diffusion models!
dotFiles
O2 commands, etc.
FIDDLE
functional genomic data integration
funsae
disentangling function and structure in homologous protein sequences with autoencoders
LightRAG
The "PyTorch" library for LLM applications.
LlamaEdge
The easiest & fastest way to run customized and fine-tuned LLMs locally or on the edge
LLM101n
LLM101n: Let's build a Storyteller
LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
moondream
tiny vision language model
dylanmmarshall's Repositories
dylanmmarshall/alphafold3-pytorch
Implementation of AlphaFold 3 in PyTorch
dylanmmarshall/DiffSynth-Studio
Enjoy the magic of Diffusion models!
dylanmmarshall/dotFiles
O2 commands, etc.
dylanmmarshall/FIDDLE
functional genomic data integration
dylanmmarshall/funsae
disentangling function and structure in homologous protein sequences with autoencoders
dylanmmarshall/LightRAG
The "PyTorch" library for LLM applications.
dylanmmarshall/LlamaEdge
The easiest & fastest way to run customized and fine-tuned LLMs locally or on the edge
dylanmmarshall/LLM101n
LLM101n: Let's build a Storyteller
dylanmmarshall/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
dylanmmarshall/notebooks
dylanmmarshall/openfold
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
dylanmmarshall/ProteinMPNN
Code for the ProteinMPNN paper
dylanmmarshall/pyspark-examples
PySpark RDD, DataFrame, and Dataset examples in Python
dylanmmarshall/pyspark-tutorial
PySpark-Tutorial provides basic algorithms using PySpark
dylanmmarshall/rag_cookbooks
dylanmmarshall/RFdiffusion
Code for running RFdiffusion
dylanmmarshall/seqsal
dylanmmarshall/stageify
dylanmmarshall/triton
Development repository for the Triton language and compiler
dylanmmarshall/unet.cu
UNet diffusion model in pure CUDA
dylanmmarshall/x-transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers