# svenbl80/roberta-base-finetuned-new-mnli-run-0
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset. It achieves the following results at the final epoch:
- Train Loss: 0.0242
- Validation Loss: 0.7506
- Train Accuracy: 0.8638
- Epoch: 9
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: Adam (beta_1=0.9, beta_2=0.999, epsilon=1e-08, amsgrad=False) with a PolynomialDecay learning-rate schedule: initial_learning_rate=2e-05, end_learning_rate=0.0, decay_steps=245430, power=1.0, cycle=False
- training_precision: float32
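With power=1.0 and no cycling, the PolynomialDecay schedule above is a plain linear ramp from 2e-05 down to 0 over 245430 steps. A minimal pure-Python sketch of that formula, assuming Keras's documented PolynomialDecay semantics:

```python
def polynomial_decay(step, initial_lr=2e-05, end_lr=0.0,
                     decay_steps=245430, power=1.0):
    """Keras-style PolynomialDecay with cycle=False: steps past
    decay_steps are clamped, so the rate never drops below end_lr."""
    step = min(step, decay_steps)
    remaining = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * remaining ** power + end_lr

# With power=1.0 this is linear decay:
# step 0      -> 2e-05
# step 122715 -> 1e-05 (halfway point)
# step 245430 -> 0.0
```

This matches the config at the endpoints; the actual Keras implementation applies the same clamping when `cycle=False`.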
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|---|---|---|---|
| 0.4535 | 0.4013 | 0.8557 | 0 |
| 0.3264 | 0.3638 | 0.8641 | 1 |
| 0.2447 | 0.4053 | 0.8649 | 2 |
| 0.1799 | 0.4217 | 0.8683 | 3 |
| 0.1305 | 0.4702 | 0.8621 | 4 |
| 0.0937 | 0.5705 | 0.8624 | 5 |
| 0.0664 | 0.6041 | 0.8616 | 6 |
| 0.0480 | 0.6936 | 0.8627 | 7 |
| 0.0342 | 0.7156 | 0.8624 | 8 |
| 0.0242 | 0.7506 | 0.8638 | 9 |
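Validation loss bottoms out at epoch 1 (0.3638) and rises steadily thereafter while train loss keeps falling, the usual overfitting signature. A small sketch (values copied from the table above) of selecting the checkpoint by validation loss rather than taking the final epoch:

```python
# (epoch, train_loss, val_loss, train_acc) rows from the table above
results = [
    (0, 0.4535, 0.4013, 0.8557),
    (1, 0.3264, 0.3638, 0.8641),
    (2, 0.2447, 0.4053, 0.8649),
    (3, 0.1799, 0.4217, 0.8683),
    (4, 0.1305, 0.4702, 0.8621),
    (5, 0.0937, 0.5705, 0.8624),
    (6, 0.0664, 0.6041, 0.8616),
    (7, 0.0480, 0.6936, 0.8627),
    (8, 0.0342, 0.7156, 0.8624),
    (9, 0.0242, 0.7506, 0.8638),
]

# Best checkpoint = lowest validation loss, not the last epoch.
best_epoch, _, best_val_loss, _ = min(results, key=lambda r: r[2])
# best_epoch == 1, best_val_loss == 0.3638
```

In practice this is what an early-stopping or checkpoint callback monitoring `val_loss` would do; the published weights here correspond to epoch 9.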
### Framework versions
- Transformers 4.28.0
- TensorFlow 2.9.1
- Datasets 2.15.0
- Tokenizers 0.13.3