Pinned Repositories
autoprompt
AutoPrompt: Automatic Prompt Construction for Masked Language Models.
bigscience
Central place for the engineering/scaling WG: documentation, SLURM scripts and logs, compute environment and data.
copilot-docs
Documentation for GitHub Copilot
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
deepspeed-gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
Domainspecific-pretrain-for-BERT
gpt-neo-fine-tuning-example
Fine-tune EleutherAI GPT-Neo and GPT-J-6B to generate Netflix movie descriptions using Hugging Face and DeepSpeed.
gptbot
GPT-4 & LangChain chatbot for large PDF docs.
Hungarian-gpt-3
This is the repo of our bilingual English-Hungarian GPT-3 model.
Hungarian-Wikipedia-QA
This repository contains a multilingual transformer-based Q/A solution for Hungarian Wikipedia pages.
dataandai's Repositories
dataandai/autoprompt
AutoPrompt: Automatic Prompt Construction for Masked Language Models.
dataandai/bigscience
Central place for the engineering/scaling WG: documentation, SLURM scripts and logs, compute environment and data.
dataandai/copilot-docs
Documentation for GitHub Copilot
dataandai/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
dataandai/deepspeed-gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
dataandai/Domainspecific-pretrain-for-BERT
dataandai/gpt-neo-fine-tuning-example
Fine-tune EleutherAI GPT-Neo and GPT-J-6B to generate Netflix movie descriptions using Hugging Face and DeepSpeed.
dataandai/gptbot
GPT-4 & LangChain chatbot for large PDF docs.
dataandai/Hungarian-gpt-3
This is the repo of our bilingual English-Hungarian GPT-3 model.
dataandai/Hungarian-Wikipedia-QA
This repository contains a multilingual transformer-based Q/A solution for Hungarian Wikipedia pages.
dataandai/Kismama-chatbot
dataandai/lm-evaluation-harness
A framework for few-shot evaluation of autoregressive language models.
dataandai/memit
Mass-editing thousands of facts into a transformer memory (MEMIT)
dataandai/minGPT
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
dataandai/minimal-gpt-neox-20b
dataandai/mup
Maximal Update Parametrization (µP)
dataandai/neox
Simple Annotated implementation of GPT-NeoX in PyTorch
dataandai/onnxruntime
ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
dataandai/OpenPrompt
An Open-Source Framework for Prompt-Learning.
dataandai/prompt-engine
A library for helping developers craft prompts for Large Language Models
dataandai/PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
dataandai/promptsource
Toolkit for creating, sharing and using natural language prompts.
dataandai/PubmedQA-BERT
This repo contains a BERT-based engine for medical QA tasks using Pubmed abstracts.
dataandai/sambanova_starter
SambaNova starter files, a.k.a. boilerplates.
dataandai/soft-prompts
dataandai/text-generation-testing-ui
Web app for demoing the EAI models
dataandai/training_policies
Issues related to MLPerf™ training policies, including rules and suggested changes