Pinned Repositories
2021-RW-MOP
AgentBoard
An Analytical Evaluation Board of Multi-turn LLM Agents
AgentKit
An intuitive LLM prompting framework for multifunctional agents that explicitly constructs a complex "thought process" from simple natural language prompts.
AutoComplete
Copy of autocomplete
CS231n-
Assignments
IndicBERT
Pretraining, fine-tuning and evaluation scripts for IndicBERT-v2 and IndicXTREME
KProj
PsycProj
Udacity-AIND
Udacity Artificial Intelligence Nanodegree
Udacity-Deep-Learning-Foundation
Udacity Course
preritt's Repositories
preritt/AutoComplete
Copy of autocomplete
preritt/IndicBERT
Pretraining, fine-tuning and evaluation scripts for IndicBERT-v2 and IndicXTREME
preritt/AgentBoard
An Analytical Evaluation Board of Multi-turn LLM Agents
preritt/AgentKit
An intuitive LLM prompting framework for multifunctional agents that explicitly constructs a complex "thought process" from simple natural language prompts.
preritt/brownie_fund_me
Smart Contract application
preritt/AutoRAG
RAG AutoML Tool - Find optimal RAG pipeline for your own data.
preritt/BrainLM
preritt/camel
🐫 CAMEL: Communicative Agents for "Mind" Exploration of Large Language Model Society (NeurIPS 2023) https://www.camel-ai.org
preritt/CKA-Centered-Kernel-Alignment
Reproduce CKA: Similarity of Neural Network Representations Revisited
preritt/Diffusion-Models-pytorch
PyTorch implementation of Diffusion Models (https://arxiv.org/pdf/2006.11239.pdf)
preritt/dinov2
PyTorch code and models for the DINOv2 self-supervised learning method.
preritt/dowhy
DoWhy is a Python library for causal inference that supports explicit modeling and testing of causal assumptions. DoWhy is based on a unified language for causal inference, combining causal graphical models and potential outcomes frameworks.
preritt/fastbook
The fastai book, published as Jupyter Notebooks
preritt/genomics-research
Google genomics paper (SR)
preritt/gpt-pilot
Dev tool that writes scalable apps from scratch while the developer oversees the implementation
preritt/GroundingDINO
Official implementation of the paper "Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection"
preritt/jupyter-ai
A generative AI extension for JupyterLab
preritt/leetcode
🔥 LeetCode solutions in any programming language | Solutions to LeetCode, Coding Interviews ("剑指 Offer", 2nd Edition), and Cracking the Coding Interview (6th Edition) in multiple programming languages
preritt/lit-gpt
Hackable implementation of state-of-the-art open-source LLMs based on nanoGPT. Supports flash attention, 4-bit and 8-bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
preritt/mamba
preritt/OpenDevin
🐚 OpenDevin: Code Less, Make More
preritt/plip
Pathology Language and Image Pre-Training (PLIP) is the first vision-and-language foundation model for pathology AI (Nature Medicine). PLIP is a large-scale pre-trained model that can be used to extract visual and language features from pathology images and text descriptions. The model is a fine-tuned version of the original CLIP model.
preritt/python_for_microscopists
https://www.youtube.com/channel/UC34rW-HtPJulxr5wp2Xa04w?sub_confirmation=1
preritt/Quality-and-Safety-for-LLM-Applications
Explore new metrics and best practices to monitor your LLM systems and ensure safety and quality
preritt/rags
Build ChatGPT over your data, all with natural language
preritt/selfcheckgpt
SelfCheckGPT: Zero-Resource Black-Box Hallucination Detection for Generative Large Language Models
preritt/Swin-Transformer
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
preritt/tab-transformer-pytorch
Implementation of TabTransformer (PT), an attention network for tabular data, in PyTorch
preritt/udlbook
Understanding Deep Learning - Simon J.D. Prince
preritt/Vim
Vision Mamba: Efficient Visual Representation Learning with Bidirectional State Space Model