addressing the error: get_max_length
#7 opened 8 months ago by rzgar

That docker example image is awesome. 🔥 1
#6 opened 11 months ago by 1TBGPU4EVR

push to ollama please?
#5 opened about 1 year ago by loshka2

Still censored · 2
#4 opened over 1 year ago by qe2

The checkpoint you are trying to load has model type `bunny-qwen` but Transformers does not recognize this architecture. · 15
#3 opened over 1 year ago by catworld1212

Missing configuration_llava_qwen2.py and configuration_llava_qwen2.py ?? · 1
#1 opened over 1 year ago by nicolollo
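
A note on the loading error in #3 (and the missing configuration_llava_qwen2.py question in #1): Transformers raises "does not recognize this architecture" whenever a checkpoint declares a custom `model_type` such as `bunny-qwen`, because the configuration and modeling files that define it live in the model repo rather than in the library itself. Below is a minimal sketch of the usual workaround; the repo id is a placeholder, since the actual path is not given in these threads.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual model card path.
repo_id = "ORG/bunny-qwen-checkpoint"

# trust_remote_code=True lets Transformers download and execute the custom
# configuration_*.py / modeling_*.py files shipped inside the repo, which
# define the `bunny-qwen` architecture the library does not know about.
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)
```

As for the `get_max_length` error in #7: if it originates in the repo's custom modeling code, it is likely a version mismatch, since newer Transformers releases renamed and later removed that cache method; pinning an older `transformers` version is the common workaround (an assumption about the cause, not confirmed in the thread).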