My model repo has a `README.md` (1.52 kB) and a checkpoint `2.pt`, which the Hub's pickle scanner flags:

Detected Pickle imports (27)
- "transformers.models.longformer.modeling_longformer.LongformerOutput",
- "transformers.models.longformer.modeling_longformer.LongformerModel",
- "transformers.models.longformer.modeling_longformer.LongformerPooler",
- "__main__.CustomLongformer",
- "collections.OrderedDict",
- "transformers.models.longformer.modeling_longformer.LongformerEmbeddings",
- "torch.nn.modules.dropout.Dropout",
- "torch._C._nn.gelu",
- "transformers.models.longformer.modeling_longformer.LongformerLayer",
- "transformers.models.longformer.modeling_longformer.LongformerIntermediate",
- "__builtin__.set",
- "torch._utils._rebuild_parameter",
- "transformers.activations.GELUActivation",
- "torch._utils._rebuild_tensor_v2",
- "transformers.models.longformer.modeling_longformer.LongformerAttention",
- "transformers.models.longformer.configuration_longformer.LongformerConfig",
- "transformers.models.longformer.modeling_longformer.LongformerSelfOutput",
- "transformers.models.longformer.modeling_longformer.LongformerEncoder",
- "__main__.FlashAttention2",
- "torch.FloatStorage",
- "transformers.models.longformer.modeling_longformer.LongformerSelfAttention",
- "torch.nn.modules.sparse.Embedding",
- "torch.nn.modules.linear.Linear",
- "torch.nn.modules.activation.Tanh",
- "__main__.MixAdaptiveAttention",
- "torch.nn.modules.container.ModuleList",
- "torch.nn.modules.normalization.LayerNorm"
The file `2.pt` itself is 874 MB. How can I fix this?
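For context, `__main__.*` entries like `__main__.CustomLongformer` typically mean the checkpoint was created with `torch.save(model)` (pickling the whole module) from a script that defined those classes, so unpickling it later requires the identical class definitions to be importable. A minimal sketch of the difference, using a toy module as a stand-in (the class and file names here are illustrative, not the actual `CustomLongformer`):

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):  # stand-in for a custom class such as CustomLongformer
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

model = TinyNet()

# torch.save(model) pickles a *reference* to __main__.TinyNet; any loader
# must have that exact class importable. This is what produces
# "__main__.CustomLongformer"-style entries in a pickle scan.
torch.save(model, "full_model.pt")

# Saving only the state_dict stores plain tensors instead. Loading it
# needs no custom classes and works with weights_only=True.
torch.save(model.state_dict(), "weights_only.pt")

fresh = TinyNet()
fresh.load_state_dict(torch.load("weights_only.pt", weights_only=True))
```

Re-saving the checkpoint as a plain `state_dict` (or converting it to safetensors) is generally what makes such a file loadable without trusting arbitrary pickle code.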