Pinned Repositories
arcade-mcp
The best way to create, deploy, and share MCP Servers
SlackAgent
An assistant for Slack built with Arcade and LangGraph. Interact with Google Calendar, Mail, GitHub, search engines, Firecrawl, and more, all from within Slack
SmartSim
SmartSim Infrastructure Library.
ArXivChatGuru
Use ArXiv ChatGuru to talk to research papers. This app uses LangChain, OpenAI, Streamlit, and Redis as a vector database/semantic cache.
redis-vl-python
Redis Vector Library (RedisVL) -- the AI-native Python client for Redis (see the usage sketch after this pinned list).
arcade-llm-memory
LLM tools for managing semantic memory of LLM applications through Arcade tool calling.
arcade-obsidian
Arcade Tools for using LLMs to interact with Obsidian Vault contents
LLM-VectorDB-Bootcamp
Course material for the DSD bootcamp on combining large language models with vector databases.
LSTM-RNN
An exploration into Recurrent Neural Networks using LSTMs. The implementation, named Cryptonet, attempts to model Bitcoin price trends over varying time horizons. Results of the experiment can be found in the results folder.
SmartSim-additional-materials
Additional materials for the SmartSim paper
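Since redis-vl-python (pinned above) is a client library rather than an app, a minimal usage sketch may help. Everything below is illustrative rather than code from the repository: the index name, prefix, field names, vector dimensions, and Redis URL are assumptions, and the exact schema format can differ between RedisVL versions.

    # Minimal RedisVL sketch (illustrative; names and schema layout are assumptions,
    # and the schema dict format may differ between RedisVL versions).
    import numpy as np
    from redisvl.index import SearchIndex
    from redisvl.query import VectorQuery

    schema = {
        "index": {"name": "docs", "prefix": "doc"},
        "fields": [
            {"name": "text", "type": "text"},
            {
                "name": "embedding",
                "type": "vector",
                "attrs": {"dims": 4, "algorithm": "flat",
                          "distance_metric": "cosine", "datatype": "float32"},
            },
        ],
    }

    # Connect to a local Redis instance and build the index.
    index = SearchIndex.from_dict(schema, redis_url="redis://localhost:6379")
    index.create(overwrite=True)

    # Load a few toy documents; vectors are stored as float32 byte strings.
    docs = [
        {"text": "hello world",
         "embedding": np.array([0.1, 0.2, 0.3, 0.4], dtype=np.float32).tobytes()},
        {"text": "goodbye moon",
         "embedding": np.array([0.4, 0.3, 0.2, 0.1], dtype=np.float32).tobytes()},
    ]
    index.load(docs)

    # Nearest-neighbor query against the "embedding" field.
    query = VectorQuery(
        vector=[0.1, 0.2, 0.3, 0.4],
        vector_field_name="embedding",
        return_fields=["text"],
        num_results=2,
    )
    print(index.query(query))

The query returns the stored text fields for the nearest vectors by cosine distance.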
Spartee's Repositories
Spartee/arcade-obsidian
Arcade Tools for using LLMs to interact with Obsidian Vault contents
Spartee/LLM-VectorDB-Bootcamp
Course material for the DSD bootcamp on combining large language models with vector databases.
Spartee/arcade-llm-memory
LLM tools for managing semantic memory of LLM applications through Arcade tool calling.
Spartee/arcade-js
Arcade AI NodeJS Client
Spartee/arcade-py
Arcade AI Python Client
Spartee/Book-recommendation-system
EDA and recsys models on the Goodreads book dataset.
Spartee/arcade-ai
Arcade AI Python SDK and CLI
Spartee/ML-Smart-Pricing
Smart pricing tool for cars on Craigslist
Spartee/MOM6
SmartSim-augmented MOM6 source code for collaboration with NCAR
Spartee/openai-cookbook
Examples and guides for using the OpenAI API
Spartee/smartsim-slurm-demo
A Docker recipe for spinning up a Slurm cluster to run SmartSim applications
Spartee/arcade-go
Official Arcade Go Client
Spartee/azure-open-ai-embeddings-qna
Spartee/core
The core library and APIs implementing the Triton Inference Server.
Spartee/data
A PyTorch repo for data loading and utilities to be shared by the PyTorch domain libraries.
Spartee/devcluster
A developer tool for running the Determined cluster.
Spartee/featureform
The Virtual Feature Store. Turn your existing data infrastructure into a feature store.
Spartee/HugeCTR
HugeCTR is a high-efficiency GPU framework designed for training Click-Through-Rate (CTR) estimation models
Spartee/hugectr_backend
Spartee/mrnet
The Multicast/Reduction Network library.
Spartee/openai-agents-arcade
Integration package for using Arcade tools with OpenAI Agents in Python
Spartee/openpbs
An HPC workload manager and job scheduler for desktops, clusters, and clouds.
Spartee/RedisGears
Dynamic execution framework for your Redis data
Spartee/server
The Triton Inference Server provides an optimized cloud and edge inferencing solution.
Spartee/SmartRedis
SmartSim Infrastructure Library Clients.
Spartee/SmartSim
SmartSim Infrastructure Library.
Spartee/SmartSim-Scaling
A repository of SmartSim scaling data and information
Spartee/spartee
Repository for the public profile README
Spartee/systems
Merlin Systems provides tools for combining recommendation models with other elements of production recommender systems (like feature stores, nearest neighbor search, and exploration strategies) into end-to-end recommendation pipelines that can be served with Triton Inference Server.
Spartee/triton_third_party
Third-party source packages that are modified for use in Triton.