Description

Adaptation of the flan-t5-xl weights to make them compatible with the FAT5 framework (Flash Attention T5).
This adaptation enables users to efficiently continue pre-training flan-t5, for example to adapt it to more recent data or to specialize it in a specific domain.
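Continued pre-training of a T5-family model typically uses the span-corruption objective: random contiguous spans of the input are replaced by sentinel tokens, and the target reconstructs the masked spans. The sketch below illustrates that input/target format in plain Python; it is only illustrative (the FAT5 training pipeline implements its own data preparation), and the `<extra_id_N>` naming follows the T5 tokenizer's sentinel convention.

```python
def span_corrupt(tokens, spans):
    """Illustrative T5-style span corruption.

    tokens: list of token strings.
    spans:  sorted, non-overlapping (start, end) index pairs to mask.
    Returns (inputs, targets) in the T5 sentinel format.
    """
    inputs, targets = [], []
    prev = 0
    for i, (start, end) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        # Keep the unmasked stretch, then mark the masked span with a sentinel.
        inputs.extend(tokens[prev:start])
        inputs.append(sentinel)
        # The target lists each sentinel followed by the tokens it replaced.
        targets.append(sentinel)
        targets.extend(tokens[start:end])
        prev = end
    inputs.extend(tokens[prev:])
    # A final sentinel terminates the target sequence.
    targets.append(f"<extra_id_{len(spans)}>")
    return inputs, targets


tokens = "The quick brown fox jumps over the lazy dog".split()
inputs, targets = span_corrupt(tokens, [(1, 3), (5, 6)])
# inputs  -> ['The', '<extra_id_0>', 'fox', 'jumps', '<extra_id_1>', 'the', 'lazy', 'dog']
# targets -> ['<extra_id_0>', 'quick', 'brown', '<extra_id_1>', 'over', '<extra_id_2>']
```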

Usage

For this FAT5-xl-flan-en checkpoint, we forgot to include a file that is necessary to run the model (and which we have since lost) 🙃
Thanks to Thalesian for proposing a solution to this problem. We invite you to read his message here to apply his fix.

```python
from transformers import AutoModel, AutoTokenizer

# trust_remote_code is required: FAT5 ships custom modeling code with the checkpoint
model = AutoModel.from_pretrained("CATIE-AQ/FAT5-xl-flan-en", trust_remote_code=True)
# The tokenizer is unchanged from the original flan-t5-xl
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xl")
```