Rompt helps you version, generate, and track prompts for GPT models. It ships with a CLI tool and a client library so prompts integrate directly into your codebase.
- Version control and changelog on prompts
- Generate prompts from template strings
- Pull prompts from Rompt into your codebase
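To illustrate what generating a prompt from a template string looks like conceptually, here is a minimal sketch using plain Python string formatting. The `{NAME}`-style placeholder syntax and the variable names are assumptions for illustration only, not Rompt's actual implementation.

```python
# Minimal sketch of template-string prompt generation, assuming a
# "{NAME}"-style placeholder syntax (illustrative, not Rompt's API).
template = "Hello {NAME}. {DIRECTION}."
variables = {"NAME": "Michael", "DIRECTION": "Generate a Tweet"}

prompt = template.format(**variables)
print(prompt)  # Hello Michael. Generate a Tweet.
```

In practice the template lives in Rompt and you only supply the variable values, but the substitution idea is the same.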
Install the client & CLI library:

```bash
pip install rompt
```
Pull your prompts into your codebase using the CLI:

```bash
rompt pull --token {YOUR_TOKEN}
```
To use the library, first import it:

```python
from rompt import generate

generated_with_metadata = generate(
    prompt_name="your-prompt-name",
    template_object={
        "NAME": "Michael",
        "DIRECTION": "Generate a Tweet",
    },
)

prompt = generated_with_metadata.get('prompt')
# Your result is now in the prompt variable
```
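Because the result is read with `.get()`, a missing key yields `None` rather than raising a `KeyError`, which a defensive caller can check for explicitly. The dict below is a stand-in for the real return value, which may carry additional metadata:

```python
# Stand-in for the object returned by generate(); the real return value
# may have a different shape (assumption for illustration).
generated_with_metadata = {"prompt": "Generate a Tweet for Michael"}

# dict.get() returns None rather than raising KeyError when the key is
# missing, so the absence of a prompt can be handled explicitly.
prompt = generated_with_metadata.get("prompt")
if prompt is None:
    raise ValueError("generate() returned no prompt")
```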
```python
import openai
from rompt import generate, track

# ...continued from above
track(generated_with_metadata)

# Your GPT responses can be included; example with OpenAI:
gpt_response = openai.Completion.create(
    prompt=prompt,
    # ...
)
track(generated_with_metadata, gpt_response)
```
For detailed documentation, including API references and more examples, please visit the Rompt.ai website.
We welcome contributions to the Rompt Python library. If you'd like to contribute, please submit a pull request on GitHub.
This project is licensed under the MIT License. For more information, please see the LICENSE file.