Similar-Text-Generation

This model is a fine-tuned version of google-t5/t5-base on the google-research-datasets/paws dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the results):

• ROUGE-1: 92.54

• ROUGE-2: 76.70

• ROUGE-L: 84.82

• ROUGE-Lsum: 84.81
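
As a quick illustration of how the model can be used for similar-sentence generation, here is a minimal sketch with the standard transformers seq2seq API. The repository ID is taken from the model tree section below; the card does not state whether a task prefix (e.g. "paraphrase: ") was used during fine-tuning, so the input is passed as-is, and the example sentence and generation settings are illustrative assumptions.

```python
# Minimal usage sketch (assumptions: no task prefix, illustrative
# generation settings; repository ID taken from the model tree below).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Keyurjotaniya007/t5-base-paws-sentence"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

sentence = "The company reported strong earnings in the first quarter."
inputs = tokenizer(sentence, return_tensors="pt", truncation=True)

# Beam search tends to give a more fluent single rewrite than sampling.
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```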

Training hyperparameters

The following hyperparameters were used during training (a matching training-arguments sketch follows the list):

• learning_rate: 3e-5

• train_batch_size: 2

• gradient_accumulation_steps: 2

• seed: 42

• weight_decay: 0.01

• lr_scheduler_type: linear

• warmup_ratio: 0.1

• num_epochs: 2
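
For reference, the hyperparameters above map onto transformers training arguments roughly as follows. This is a sketch rather than the exact training script: output_dir, the use of Seq2SeqTrainingArguments, and any option not listed above are assumptions.

```python
# Sketch of training arguments mirroring the list above; output_dir and
# the choice of Seq2SeqTrainingArguments are assumptions, not from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-paws-sentence",  # assumed output directory
    learning_rate=3e-5,
    per_device_train_batch_size=2,
    gradient_accumulation_steps=2,       # effective batch size of 4
    seed=42,
    weight_decay=0.01,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=2,
)
```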

Training results

Training Loss    Epoch    Step
0.496400         1        5450
0.471000         2        10900
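
The ROUGE scores reported above can in principle be reproduced with the evaluate library; the sketch below only shows the call shape, with placeholder predictions and references (the evaluate package is not listed under framework versions, so treat its use here as an assumption).

```python
# Sketch of ROUGE scoring with the `evaluate` library; predictions and
# references are placeholders for decoded model outputs and PAWS targets.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["The firm posted strong first-quarter earnings."]
references = ["The company reported strong earnings in the first quarter."]

scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)
print({k: round(v * 100, 2) for k, v in scores.items()})  # scale to match the card
```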

Framework versions

• transformers: 4.56.0

• pytorch: 2.0.1+cu118

• datasets: 2.14.4

• tokenizers: 0.13.3

Model size: 0.2B params (F32, safetensors)
Model tree for Keyurjotaniya007/t5-base-paws-sentence

• Base model: google-t5/t5-base

Dataset used to train Keyurjotaniya007/t5-base-paws-sentence

• google-research-datasets/paws