"<think>" is missing in the response!
#7 · opened by situtu
Have tested this model on Transformers 4.57.1 and vLLM 0.11.0; inference with both frameworks shows that only "</think>" appears in the response, with no "<think>" tag.
I think this checkpoint might have some issues.
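For reference, a minimal check (a hypothetical helper, not part of the model repo) that reports which reasoning tags are absent from a decoded response, reproducing the symptom described above:

```python
def missing_reasoning_tags(text, tags=("<think>", "</think>")):
    """Return the reasoning tags that do not appear in the decoded output."""
    return [tag for tag in tags if tag not in text]

# A response containing the closing tag but not the opening one,
# as observed in this issue:
response = "</think>The answer is 42."
print(missing_reasoning_tags(response))  # → ['<think>']
```

A check like this makes it easy to confirm the behavior is identical across Transformers and vLLM before attributing it to the checkpoint itself.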