Random trashy output
Hello, when I talk to my bot using a llama model (vigogne-2-7b-instruct.Q4_0.gguf), the output is random gibberish, and debug mode shows no errors.

Example of the output:
```
[user]: Hello!
[bot]: Shux a one can em use long Sh you for long use em Sh one you long a message one useem response can you use you a useist help use a one for a write can messageem cas Sh Sh. they help useem can. write a can you youux useist you message response long. response use canemem Sh write one they can you casem casist you cas one one you Sh youux you response Sh one help a. Sh write response you ist Sh messageux one help.ist a one you can a a a responseist one oneist write. a one use cas responseux Sh message write cas a cas one message casem casux a write a message Shux canux. help cas response youist you message youist useem Sh help message write.ux Sh. . you responseistem response write write can can a ux help messageux Sh can use. writeux help casemistemux write you help cas Shuxemem cas write cas Sh a ux use Sh messageist a use response Shist use you a write a messageux. response messageemem casist you Sh help Sh casem. message write message response Shuxemem
```
I can't find the origin of the bug. Can you help me?
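
For reference, here is a minimal sketch of roughly how the model gets loaded and queried. I'm assuming the llama-cpp-python bindings here; the model path, context size, prompt template, and sampling parameters are placeholders and may not match the bot's actual configuration.

```python
# Minimal repro sketch -- assumes llama-cpp-python bindings.
# Model path, n_ctx, prompt template, and sampling settings are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="models/vigogne-2-7b-instruct.Q4_0.gguf",  # placeholder path
    n_ctx=2048,     # context window (assumed)
    verbose=True,   # print llama.cpp load/debug info
)

# Vigogne instruct models are usually prompted Alpaca-style;
# this template is an assumption, not necessarily what the bot sends.
prompt = "### Instruction:\nHello!\n\n### Response:\n"

out = llm(
    prompt,
    max_tokens=256,
    temperature=0.7,
    stop=["### Instruction:"],
)
print(out["choices"][0]["text"])
```

I can share the exact configuration the bot uses (prompt template, context size, sampling settings) if that helps narrow it down.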