ymcui/Chinese-LLaMA-Alpaca-2

How can I output generation scores (logits)?

Sishxo opened this issue · 2 comments

Check before submitting issues

  • Make sure to pull the latest code, as some issues and bugs have been fixed.
  • I have read the Wiki and FAQ section AND searched for similar issues and did not find a similar problem or solution
  • Third-party plugin issues (e.g., llama.cpp, LangChain, text-generation-webui): we recommend checking the corresponding project for solutions

Type of Issue

Model inference

Base Model

Chinese-Alpaca-2 (7B/13B)

Operating System

Linux

Describe your issue in detail

This is my generation config:

from transformers import GenerationConfig

generation_config = GenerationConfig(
    temperature=0.1,
    top_k=20,
    top_p=0.9,
    do_sample=True,
    num_beams=1,
    repetition_penalty=1.0,
    max_new_tokens=400,
    output_scores=True,
)

However, it didn't output the scores (logits) as expected.
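
A likely cause: in Hugging Face transformers, output_scores=True only takes effect when return_dict_in_generate=True is also set; otherwise model.generate() returns a plain tensor of token IDs and the scores are discarded. Below is a minimal sketch of how the scores can be retrieved, assuming a standard transformers setup (the model path and prompt are placeholders, not from this issue):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

model_path = "path/to/chinese-alpaca-2-7b"  # placeholder path
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float16, device_map="auto"
)

generation_config = GenerationConfig(
    temperature=0.1,
    top_k=20,
    top_p=0.9,
    do_sample=True,
    num_beams=1,
    repetition_penalty=1.0,
    max_new_tokens=400,
    output_scores=True,
    return_dict_in_generate=True,  # required, or generate() returns only token IDs
)

inputs = tokenizer("你好", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, generation_config=generation_config)

# outputs.scores is a tuple with one (batch_size, vocab_size) tensor per
# generated token; these are the processed scores (after temperature /
# top-k / top-p warping), not the raw logits.
print(len(outputs.scores), outputs.scores[0].shape)

# Log-probabilities of the sampled tokens can be recovered with:
transition_scores = model.compute_transition_scores(
    outputs.sequences, outputs.scores, normalize_logits=True
)
print(transition_scores)

Newer transformers releases also accept output_logits=True in the generation config, which returns the unprocessed logits in outputs.logits; whether that option is available depends on the installed version.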

Dependencies (must be provided for code-related issues)

None

Execution logs or screenshots

None

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.