Pinned Repositories
auto-evaluator
Data_KoSuperNI
Korean translation of the StrategyQA dataset
flasma
A high-performance vector search engine that achieves no loss of accuracy through GPU acceleration and dynamic placement
kubespray
Deploy a Production Ready Kubernetes Cluster
llama3.cuda-ko
llama3.cuda is a pure C/CUDA implementation of the Llama 3 model.
web-stable-diffusion
Bringing stable diffusion models to web browsers. Everything runs inside the browser with no server support.
webgpu-llm-loader
A loader that lets you try running LLMs built for WebGPU.
WikiQA
xionic
xionic-ko-llama-3-70b
SIONIC AI's Repositories
sionic-ai/xionic-ko-llama-3-70b
sionic-ai/flasma
A high-performance vector search engine that achieves no loss of accuracy through GPU acceleration and dynamic placement
sionic-ai/webgpu-llm-loader
A loader that lets you try running LLMs built for WebGPU.
sionic-ai/xionic
sionic-ai/Data_KoSuperNI
Korean translation of the StrategyQA dataset
sionic-ai/web-stable-diffusion
Bringing stable diffusion models to web browsers. Everything runs inside the browser with no server support.
sionic-ai/auto-evaluator
sionic-ai/WikiQA
sionic-ai/kubespray
Deploy a Production Ready Kubernetes Cluster
sionic-ai/llama3.cuda-ko
llama3.cuda is a pure C/CUDA implementation of the Llama 3 model.
sionic-ai/CICERO_Ko
The purpose of this repository is to introduce new dialogue-level commonsense inference datasets and tasks. We chose dialogues as the data source because dialogues are known to be complex and rich in commonsense.
sionic-ai/Data_Ko_hh-rlhf
sionic-ai/mlc-llm
Enables everyone to develop, optimize, and deploy AI models natively on their own devices.
sionic-ai/notion-blog
Deploy your own Notion-powered website in minutes with Next.js and Vercel.
sionic-ai/privacy
SIONIC AI Inc. privacy policy
sionic-ai/tensor-grad-test
sionic-ai/vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
sionic-ai/web-llm
Bringing large-language models and chat to web browsers. Everything runs inside the browser with no server support.