This repository packages StableLM as a Truss.
Stability AI recently announced the ongoing development of the StableLM series of language models, and simultaneously released a number of checkpoints for this model.
Using these models for inference can be challenging given their hardware requirements. With Baseten and Truss, it can be dead simple. You can see the full code repository here.
Four models were released:
- `stabilityai/stablelm-base-alpha-7b`
- `stabilityai/stablelm-tuned-alpha-7b`
- `stabilityai/stablelm-base-alpha-3b`
- `stabilityai/stablelm-tuned-alpha-3b`
You can modify the `load` method in `model.py` to select the version you'd like to deploy:
```python
model_name = "stabilityai/stablelm-tuned-alpha-7b"
```
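Since the checkpoint name is just a string, a small guard in `load` can catch typos before a long model download fails. This is a hypothetical sketch, not part of the actual `model.py`; the set name and error message are ours:

```python
# Hypothetical guard: the four alpha checkpoints released by Stability AI.
RELEASED_CHECKPOINTS = {
    "stabilityai/stablelm-base-alpha-7b",
    "stabilityai/stablelm-tuned-alpha-7b",
    "stabilityai/stablelm-base-alpha-3b",
    "stabilityai/stablelm-tuned-alpha-3b",
}

def validate_checkpoint(model_name: str) -> str:
    """Return model_name unchanged if it is a released StableLM alpha checkpoint."""
    if model_name not in RELEASED_CHECKPOINTS:
        raise ValueError(f"Unknown StableLM checkpoint: {model_name!r}")
    return model_name
```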
We found this model runs reasonably fast on A10Gs; you can configure the hardware you'd like in `config.yaml`:
```yaml
...
resources:
  cpu: "3"
  memory: 14Gi
  use_gpu: true
  accelerator: A10G
...
```
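The `14Gi` figure lines up with a back-of-the-envelope estimate: a 7B-parameter model in fp16 takes roughly two bytes per parameter for the weights alone, before activations and framework overhead. A quick sanity check (the numbers are approximate):

```python
# Rough fp16 memory estimate for the 7B checkpoint:
# 2 bytes per parameter, weights only.
params = 7_000_000_000
bytes_per_param = 2  # fp16
approx_gib = params * bytes_per_param / 1024**3
print(f"{approx_gib:.1f} GiB")  # roughly 13 GiB of weights
```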
The usual GPT-style generation parameters will pass right through to the inference endpoint:
- `max_new_tokens` (default: 64)
- `temperature` (default: 0.5)
- `top_p` (default: 0.9)
- `top_k` (default: 0)
- `num_beams` (default: 4)
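One way to picture how these defaults behave: the endpoint fills in anything the request omits. A hypothetical sketch of that merge (the actual `model.py` may implement it differently; the function and dict names are ours):

```python
# Assumed defaults, matching the parameter list above.
GENERATION_DEFAULTS = {
    "max_new_tokens": 64,
    "temperature": 0.5,
    "top_p": 0.9,
    "top_k": 0,
    "num_beams": 4,
}

def build_generation_kwargs(request: dict) -> dict:
    """Overlay user-supplied generation parameters on the defaults,
    ignoring request keys (like the prompt) that aren't generation knobs."""
    overrides = {k: v for k, v in request.items() if k in GENERATION_DEFAULTS}
    return {**GENERATION_DEFAULTS, **overrides}
```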
If you're using the tuned versions in a chatbot, prepend the input message with the system prompt as described in the StableLM README:
```python
system_prompt = """<|SYSTEM|># StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model developed by StabilityAI.
- StableLM is excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- StableLM is more than just an information source, StableLM is also able to write poetry, short stories, and make jokes.
- StableLM will refuse to participate in anything that could harm a human.
"""

prompt = f"{system_prompt}<|USER|>What's your mood today?<|ASSISTANT|>"
```
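Wrapping that formatting in a helper keeps the special tokens in one place. A small sketch (the helper name is ours, and the shortened system prompt below is illustrative; use the full one shown above in practice):

```python
def build_chat_prompt(system_prompt: str, user_message: str) -> str:
    """Wrap a user message in StableLM's chat special tokens."""
    return f"{system_prompt}<|USER|>{user_message}<|ASSISTANT|>"

# Illustrative shortened system prompt for the tuned checkpoints.
demo_system = "<|SYSTEM|># StableLM Tuned (Alpha version)\n- StableLM is a helpful and harmless open-source AI language model.\n"
prompt = build_chat_prompt(demo_system, "What's your mood today?")
```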
Deploying the Truss is easy: simply load it and push.
```python
import baseten
import truss

# Authenticate first with your Baseten API key (baseten.login) if you haven't already.
stablelm_truss = truss.load(".")
baseten.deploy(stablelm_truss)
```