roberta-sst2-lora-ep20-lr0p0003-bs16-2025-06-18-2050

This model is a LoRA fine-tuned version of roberta-base, trained (judging by the model name) on the GLUE SST-2 sentiment classification dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2882
  • Accuracy: 0.9220

Model description

RoBERTa-base carrying a LoRA adapter trained with PEFT for binary sequence classification; the model name points to the GLUE SST-2 sentiment task. No further description was provided.

Intended uses & limitations

Presumably intended for binary (positive/negative) sentiment classification of English sentences, as in SST-2; limitations are not documented.

Training and evaluation data

Not documented; the model name points to GLUE SST-2, whose training split would be used for fine-tuning and whose 872-example validation split is the standard evaluation set.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.06
  • num_epochs: 20
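
The card records only the Trainer hyperparameters above; the LoRA configuration itself (rank, alpha, dropout, target modules) is not stated. The sketch below shows one plausible way to wire these values into a PEFT + Trainer run; the LoraConfig values are illustrative assumptions, not the author's settings.

```python
# Sketch of a reproduction run using the hyperparameters listed above.
# The LoraConfig values are assumptions -- the card does not record them.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
base_model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2)

lora_config = LoraConfig(
    task_type="SEQ_CLS",
    r=8,                                # assumption: rank not given in the card
    lora_alpha=16,                      # assumption
    lora_dropout=0.1,                   # assumption
    target_modules=["query", "value"],  # assumption: a common choice for RoBERTa
)
model = get_peft_model(base_model, lora_config)

dataset = load_dataset("glue", "sst2")
tokenized = dataset.map(
    lambda batch: tokenizer(batch["sentence"], truncation=True), batched=True)

args = TrainingArguments(
    output_dir="roberta-sst2-lora",
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    seed=42,
    optim="adamw_torch",
    eval_strategy="steps",  # matches the 500-step eval cadence in the log below
    eval_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    processing_class=tokenizer,  # DataCollatorWithPadding is used by default
)
trainer.train()
```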

Training results

Training Loss   Epoch    Step    Validation Loss   Accuracy
0.3878          0.1188     500   0.2728            0.9037
0.2949          0.2375    1000   0.2129            0.9186
0.2878          0.3563    1500   0.2032            0.9300
0.2787          0.4751    2000   0.2175            0.9174
0.2725          0.5938    2500   0.2130            0.9266
0.2705          0.7126    3000   0.2233            0.9220
0.2646          0.8314    3500   0.2274            0.9278
0.2547          0.9501    4000   0.2045            0.9289
0.2226          1.0689    4500   0.2683            0.9232
0.2360          1.1876    5000   0.2474            0.9335
0.2155          1.3064    5500   0.2416            0.9392
0.2129          1.4252    6000   0.2280            0.9312
0.2118          1.5439    6500   0.1991            0.9243
0.2068          1.6627    7000   0.2125            0.9278
0.2195          1.7815    7500   0.2320            0.9323
0.2338          1.9002    8000   0.2197            0.9323
0.1890          2.0190    8500   0.2589            0.9300
0.2028          2.1378    9000   0.1971            0.9358
0.1867          2.2565    9500   0.2429            0.9415
0.1836          2.3753   10000   0.1874            0.9415
0.1838          2.4941   10500   0.2014            0.9346
0.2083          2.6128   11000   0.1746            0.9461
0.1718          2.7316   11500   0.2116            0.9346
0.1789          2.8504   12000   0.2324            0.9323
0.1744          2.9691   12500   0.2097            0.9300
0.1514          3.0879   13000   0.2472            0.9369
0.1880          3.2067   13500   0.2393            0.9381
0.1899          3.3254   14000   0.2148            0.9335
0.1768          3.4442   14500   0.2605            0.9323
0.1860          3.5629   15000   0.2117            0.9346
0.1804          3.6817   15500   0.2242            0.9346
0.1762          3.8005   16000   0.2293            0.9404
0.1928          3.9192   16500   0.2421            0.9358
0.1810          4.0380   17000   0.2244            0.9369
0.1602          4.1568   17500   0.2303            0.9392
0.1600          4.2755   18000   0.2671            0.9369
0.1475          4.3943   18500   0.2882            0.9220
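
Note that although num_epochs was set to 20, the log stops at epoch 4.39 (step 18500), and the headline metrics (loss 0.2882, accuracy 0.9220) correspond to that final row rather than the best one (accuracy 0.9461 at step 11000); the card does not say whether training was stopped early or which checkpoint was kept. A minimal sketch for recomputing validation accuracy, assuming the published repo loads with peft's AutoPeftModelForSequenceClassification:

```python
# Sketch: recompute SST-2 validation accuracy for the published adapter.
import torch
from datasets import load_dataset
from peft import AutoPeftModelForSequenceClassification
from transformers import AutoTokenizer

adapter_id = "ekiprop/roberta-sst2-lora-ep20-lr0p0003-bs16-2025-06-18-2050"
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoPeftModelForSequenceClassification.from_pretrained(
    adapter_id, num_labels=2)  # num_labels is forwarded to the base model
model.eval()

dataset = load_dataset("glue", "sst2", split="validation")
correct = 0
for start in range(0, len(dataset), 32):
    batch = dataset[start:start + 32]  # slicing yields a dict of lists
    inputs = tokenizer(batch["sentence"], padding=True, truncation=True,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    correct += (logits.argmax(dim=-1) == torch.tensor(batch["label"])).sum().item()
print(f"accuracy: {correct / len(dataset):.4f}")
```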

Framework versions

  • PEFT 0.15.2
  • Transformers 4.52.4
  • PyTorch 2.1.0+cu118
  • Datasets 3.6.0
  • Tokenizers 0.21.1
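
For single predictions, the adapter can be loaded the same way; a small usage sketch follows (label order assumes the SST-2 convention of 0 = negative, 1 = positive):

```python
# Sketch: sentiment inference with the LoRA adapter.
import torch
from peft import AutoPeftModelForSequenceClassification
from transformers import AutoTokenizer

adapter_id = "ekiprop/roberta-sst2-lora-ep20-lr0p0003-bs16-2025-06-18-2050"
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoPeftModelForSequenceClassification.from_pretrained(
    adapter_id, num_labels=2)
model.eval()

inputs = tokenizer("A gripping, beautifully acted film.", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print({"negative": probs[0, 0].item(), "positive": probs[0, 1].item()})
```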