replicate/replicate-javascript

Value error missing text 'input'

jochemstoel opened this issue · 1 comment

When I run Llama-3-70b using this NPM module with the following arguments, I receive an error:

{
  model: [
    'meta/meta-llama-3-70b:83c5bdea9941e83be68480bd06ad792f3f295612a24e4678baed34083083a87f'
  ],
  inputs: {
    debug: true,
    top_k: 50,
    top_p: 0.9,
    prompt: 'Paper title: A proof that drinking coffee causes supernovas\n' +
      '\n' +
      'In this essay, I will',
    max_tokens: 512,
    min_tokens: 0,
    temperature: 0.6,
    prompt_template: '{prompt}',
    presence_penalty: 1.15,
    frequency_penalty: 0.2
  }
}
{"detail":[{"loc":["body","input","text"],"msg":"field required","type":"value_error.missing"}]}
mattt commented

Hi @jochemstoel. Two things to fix here:

  1. You should run meta/meta-llama-3-70b without a version SHA.
  2. The key is input, not inputs.

You can find correct code for running this model with this client library here: https://replicate.com/meta/meta-llama-3-70b?input=nodejs

const input = {
  top_k: 50,
  top_p: 0.9,
  prompt: "Paper title: A proof that drinking coffee causes supernovas\n\nIn this essay, I will",
  max_tokens: 512,
  min_tokens: 0,
  temperature: 0.6,
  prompt_template: "{prompt}",
  presence_penalty: 1.15,
  frequency_penalty: 0.2
};

const output = await replicate.run("meta/meta-llama-3-70b", { input });
console.log(output);

Or stream the output token by token:

const input = {
  top_k: 50,
  top_p: 0.9,
  prompt: "Paper title: A proof that drinking coffee causes supernovas\n\nIn this essay, I will",
  max_tokens: 512,
  min_tokens: 0,
  temperature: 0.6,
  prompt_template: "{prompt}",
  presence_penalty: 1.15,
  frequency_penalty: 0.2
};

for await (const event of replicate.stream("meta/meta-llama-3-70b", { input })) {
  process.stdout.write(event.toString());
}