# deberta_toxic_cls
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unknown dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics):
- Loss: 0.3694
- Accuracy: 0.8054
- Precision: 0.7440
- Recall: 0.9942
- F1: 0.8511
- AUC: 0.8908
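The checkpoint can be loaded with the standard `text-classification` pipeline. A minimal sketch, assuming the model is published as `reichenbach/deberta_toxic_cls` on the Hugging Face Hub; the label names come from the checkpoint's `config.json` and are not documented here, so the printed labels are illustrative only:

```python
from transformers import pipeline

# Load the fine-tuned classifier from the Hub. The repo id matches this card;
# the id2label mapping is whatever was saved with the checkpoint.
classifier = pipeline("text-classification", model="reichenbach/deberta_toxic_cls")

print(classifier("You are a wonderful person."))
# Illustrative output only, e.g. [{'label': 'LABEL_0', 'score': 0.97}]
```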
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows this list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 13
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- optimizer: AdamW (torch fused, `adamw_torch_fused`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 8
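A hedged sketch of how these hyperparameters map onto `transformers.TrainingArguments` (argument names as of Transformers 4.57); the output directory and per-epoch evaluation are assumptions, and the training dataset itself is not documented in this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta_toxic_cls",      # assumption: not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=32,      # train_batch_size
    per_device_eval_batch_size=8,        # eval_batch_size
    gradient_accumulation_steps=8,
    num_train_epochs=8,
    seed=13,
    lr_scheduler_type="linear",
    optim="adamw_torch_fused",           # OptimizerNames.ADAMW_TORCH_FUSED
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",               # assumption: matches the per-epoch results table
)
```

On a single device, `per_device_train_batch_size=32` with `gradient_accumulation_steps=8` reproduces the total train batch size of 256 listed above.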
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | AUC |
|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 141 | 0.4441 | 0.8012 | 0.7428 | 0.9861 | 0.8473 | 0.8880 |
| No log | 2.0 | 282 | 0.3568 | 0.8042 | 0.7453 | 0.9875 | 0.8495 | 0.8905 |
| No log | 3.0 | 423 | 0.3691 | 0.8052 | 0.7444 | 0.9926 | 0.8508 | 0.8922 |
| 0.4062 | 4.0 | 564 | 0.3701 | 0.8054 | 0.7440 | 0.9942 | 0.8511 | 0.8908 |
| 0.4062 | 5.0 | 705 | 0.3925 | 0.8051 | 0.7436 | 0.9944 | 0.8509 | 0.8915 |
| 0.4062 | 6.0 | 846 | 0.3891 | 0.8056 | 0.7498 | 0.9793 | 0.8493 | 0.8921 |
| 0.4062 | 7.0 | 987 | 0.3860 | 0.8070 | 0.7573 | 0.9638 | 0.8482 | 0.8943 |
| 0.3208 | 8.0 | 1128 | 0.3909 | 0.8073 | 0.7603 | 0.9575 | 0.8475 | 0.8939 |
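The per-epoch metrics above are consistent with a `compute_metrics` function along these lines; this is a sketch using scikit-learn, not the actual evaluation code, which is not documented in this card:

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score, precision_score, recall_score, f1_score, roc_auc_score
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    # Numerically stable softmax; the positive-class probability feeds the AUC.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision_score(labels, preds),
        "recall": recall_score(labels, preds),
        "f1": f1_score(labels, preds),
        "auc": roc_auc_score(labels, probs[:, 1]),
    }
```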
### Framework versions
- Transformers 4.57.1
- PyTorch 2.8.0+cu129
- Datasets 4.4.1
- Tokenizers 0.22.1