Pinned Repositories
auto-release
bionic-gpt - BionicGPT is an on-premise replacement for ChatGPT, offering the advantages of Generative AI while maintaining strict data confidentiality
box2dweb - Automatically exported from code.google.com/p/box2dweb
gitui - Blazing 💥 fast terminal-ui for git written in rust 🦀
helioviewer-visualisation - Backend for the Helioviewer timeline
inbox-app - Google Inbox packaged as an Electron app
keystone-demo - Demo app for Keystone
LiteDB - A .NET NoSQL Document Store in a single data file - www.litedb.org
nvshare - Practical GPU Sharing Without Memory Size Constraints
OrellBuehler's Repositories
OrellBuehler/auto-release
OrellBuehler/bionic-gpt - BionicGPT is an on-premise replacement for ChatGPT, offering the advantages of Generative AI while maintaining strict data confidentiality
OrellBuehler/box2dweb - Automatically exported from code.google.com/p/box2dweb
OrellBuehler/gitui - Blazing 💥 fast terminal-ui for git written in rust 🦀
OrellBuehler/helioviewer-visualisation - Backend for the Helioviewer timeline
OrellBuehler/inbox-app - Google Inbox packaged as an Electron app
OrellBuehler/keystone-demo - Demo app for Keystone
OrellBuehler/LiteDB - A .NET NoSQL Document Store in a single data file - www.litedb.org
OrellBuehler/nvshare - Practical GPU Sharing Without Memory Size Constraints
OrellBuehler/onnxruntime - ONNX Runtime: a cross-platform, high-performance ML inference and training accelerator
OrellBuehler/OrellBuehler - Config files for my GitHub profile
OrellBuehler/plex_dupefinder - Find and delete duplicate files in Plex
OrellBuehler/prcpp-file-storage
OrellBuehler/rich-uncle-pennybags-bot - A Telegram bot for all of your crypto needs; uses the Bitfinex and CoinMarketCap APIs
OrellBuehler/runpod-containers - 🐳 Dockerfiles for the RunPod container images used for our official templates
OrellBuehler/space-radar - Disk space visualization app built with Electron & d3.js
OrellBuehler/stable-diffusion-webui - Stable Diffusion web UI
OrellBuehler/subdir-heroku-buildpack - Allows using a subdirectory, configured via an environment variable, as the project root
OrellBuehler/vllm - A high-throughput and memory-efficient inference and serving engine for LLMs