Rompt streamlines your prompt workflow: it adds version control and collaboration to your prompts, helps you improve GPT model performance, and integrates with your codebase through a CLI tool and client library.
- Version control and changelog on prompts
- Generate prompts from template strings
- Pull prompts from Rompt into your codebase
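
Conceptually, a prompt template pairs a string with named variables that are filled in at generation time. The sketch below illustrates the idea with a hypothetical `fillTemplate` helper and assumed `{{NAME}}`-style placeholders; it is not Rompt's implementation, whose placeholder syntax may differ.

```javascript
// Hypothetical sketch of template-string prompt generation.
// The {{NAME}} placeholder style is an assumption for illustration only.
function fillTemplate(template, variables) {
  // Replace each {{KEY}} with its value, leaving unknown keys untouched
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in variables ? String(variables[key]) : match
  );
}

const prompt = fillTemplate(
  "Hi {{NAME}}. {{DIRECTION}} with a friendly tone.",
  { NAME: "Michael", DIRECTION: "Generate a Tweet" }
);
// prompt === "Hi Michael. Generate a Tweet with a friendly tone."
```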
Pull your prompts into your codebase using the CLI:
```sh
npm i -g @romptai/cli
rompt pull --token {YOUR_TOKEN}
```
Install the client library:
```sh
npm install @romptai/client
```
To use the library, you'll first need to import it:
```js
import { generate } from '@romptai/client';

const romptData = generate("your-prompt-name", {
  NAME: "Michael",
  DIRECTION: "Generate a Tweet",
  SENTIMENT: `Make the Tweet about ${myOtherVariable}`
});

const { prompt } = romptData;
// Your result is now in the prompt variable
```
To track how your prompts perform, import `track` as well:

```js
import { generate, track } from '@romptai/client';

// ...continued from above
track(romptData);

// Your GPT responses can be included; example with OpenAI:
const gptResponse = await openai.createCompletion({
  prompt,
  // ...
});

track(romptData, gptResponse.data);
```
For detailed documentation, including API references and more examples, please visit the Rompt.ai website.
We welcome contributions to the Rompt Node.js library. If you'd like to contribute, please submit a pull request on GitHub.
This project is licensed under the MIT License. For more information, please see the LICENSE file.