Issue #9 by ItzPingCat · opened
I think I found an issue with this model: it is prone to hallucinating on very short inputs. Send it just “what” with no system prompt, and about 3 out of 5 times it makes something up. Quant used: Q4_K_M
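A minimal repro sketch of the test described above, assuming llama-cpp-python and a locally downloaded Q4_K_M GGUF (the model path below is a placeholder):

```python
from llama_cpp import Llama

# Load the Q4_K_M quant; path is hypothetical, substitute your local file.
llm = Llama(model_path="model-Q4_K_M.gguf", n_ctx=2048, verbose=False)

# Send only the bare user message "what" -- no system prompt,
# matching the conditions in the report. Run several times to
# observe the roughly 3/5 hallucination rate.
for run in range(5):
    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "what"}],
        temperature=0.8,  # assumed sampling setting, not stated in the report
    )
    print(f"run {run}: {out['choices'][0]['message']['content']!r}")
```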
SicariusSicariiStuff changed discussion status to closed