Update checkpoint for transformers>=4.29
#14 by ArthurZ
Following the merge of a PR in transformers, it appeared that this model was not properly converted. This PR fixes inference and was tested using the following script:
>>> from transformers import AutoTokenizer, MarianMTModel
>>> tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-tc-big-en-ar")
>>> inputs = tokenizer("Hey! Let's learn together", return_tensors="pt", padding=True)
>>> model = MarianMTModel.from_pretrained("Helsinki-NLP/opus-mt-tc-big-en-ar")
>>> print(tokenizer.batch_decode(model.generate(**inputs)))
['<pad> لنتعلّم معاً</s>']
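
Since the updated weights target transformers>=4.29, one extra check (not part of the original test script, and assuming the packaging package is installed) is to confirm that the local environment meets that version requirement before loading the checkpoint:

>>> import transformers
>>> from packaging import version
>>> # the updated checkpoint assumes transformers>=4.29
>>> assert version.parse(transformers.__version__) >= version.parse("4.29.0")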
