# square_run_with_16_batch_size

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.4457
- F1 Macro: 0.4685
- F1 Micro: 0.5455
- F1 Weighted: 0.5242
- Precision Macro: 0.5341
- Precision Micro: 0.5455
- Precision Weighted: 0.5870
- Recall Macro: 0.4829
- Recall Micro: 0.5455
- Recall Weighted: 0.5455
- Accuracy: 0.5455
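
The snippet below is a minimal inference sketch using the `transformers` image-classification pipeline. It assumes the checkpoint is published on the Hub under `corranm/square_run_with_16_batch_size`; `example.jpg` is a placeholder for your own image path.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub (assumed repo id).
classifier = pipeline(
    "image-classification",
    model="corranm/square_run_with_16_batch_size",
)

# "example.jpg" is a placeholder; substitute any local image path or URL.
predictions = classifier("example.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.4f}")
```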
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 35
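
As a sketch of how these hyperparameters map onto `transformers.TrainingArguments` (the `output_dir` is an illustrative placeholder, not taken from the original run):

```python
from transformers import TrainingArguments

# Hyperparameters mirrored from the list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="square_run_with_16_batch_size",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=35,
)
```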
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | F1 Weighted | Precision Macro | Precision Micro | Precision Weighted | Recall Macro | Recall Micro | Recall Weighted | Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.9976 | 1.0 | 29 | 1.9107 | 0.0915 | 0.2045 | 0.1209 | 0.0793 | 0.2045 | 0.1019 | 0.1535 | 0.2045 | 0.2045 | 0.2045 |
| 1.7575 | 2.0 | 58 | 1.8877 | 0.1474 | 0.2348 | 0.1805 | 0.1704 | 0.2348 | 0.2163 | 0.1989 | 0.2348 | 0.2348 | 0.2348 |
| 1.8336 | 3.0 | 87 | 1.7319 | 0.1659 | 0.3182 | 0.2117 | 0.1611 | 0.3182 | 0.2133 | 0.2586 | 0.3182 | 0.3182 | 0.3182 |
| 1.452 | 4.0 | 116 | 1.5316 | 0.3336 | 0.4167 | 0.3752 | 0.3518 | 0.4167 | 0.3903 | 0.3682 | 0.4167 | 0.4167 | 0.4167 |
| 1.2545 | 5.0 | 145 | 1.4192 | 0.3999 | 0.4848 | 0.4447 | 0.4601 | 0.4848 | 0.5021 | 0.4318 | 0.4848 | 0.4848 | 0.4848 |
| 1.6479 | 6.0 | 174 | 1.3642 | 0.4649 | 0.5455 | 0.5265 | 0.5072 | 0.5455 | 0.5559 | 0.4750 | 0.5455 | 0.5455 | 0.5455 |
| 1.301 | 7.0 | 203 | 1.3015 | 0.4178 | 0.5303 | 0.4735 | 0.4090 | 0.5303 | 0.4503 | 0.4535 | 0.5303 | 0.5303 | 0.5303 |
| 0.9006 | 8.0 | 232 | 1.4861 | 0.4234 | 0.4924 | 0.4699 | 0.4586 | 0.4924 | 0.5286 | 0.4621 | 0.4924 | 0.4924 | 0.4924 |
| 0.4134 | 9.0 | 261 | 1.2101 | 0.4852 | 0.5833 | 0.5545 | 0.5427 | 0.5833 | 0.5894 | 0.5010 | 0.5833 | 0.5833 | 0.5833 |
| 0.9532 | 10.0 | 290 | 1.3783 | 0.4577 | 0.5682 | 0.5204 | 0.4557 | 0.5682 | 0.5160 | 0.4972 | 0.5682 | 0.5682 | 0.5682 |
| 0.4521 | 11.0 | 319 | 1.3602 | 0.5266 | 0.6136 | 0.5923 | 0.5296 | 0.6136 | 0.5907 | 0.5403 | 0.6136 | 0.6136 | 0.6136 |
| 0.633 | 12.0 | 348 | 1.4293 | 0.5032 | 0.5833 | 0.5727 | 0.4969 | 0.5833 | 0.5674 | 0.5140 | 0.5833 | 0.5833 | 0.5833 |
| 0.4268 | 13.0 | 377 | 1.4388 | 0.5031 | 0.5833 | 0.5676 | 0.5543 | 0.5833 | 0.6189 | 0.5124 | 0.5833 | 0.5833 | 0.5833 |
| 0.2857 | 14.0 | 406 | 1.6012 | 0.5071 | 0.5833 | 0.5676 | 0.5209 | 0.5833 | 0.5879 | 0.5211 | 0.5833 | 0.5833 | 0.5833 |
| 0.2606 | 15.0 | 435 | 1.5817 | 0.5579 | 0.6136 | 0.6109 | 0.5657 | 0.6136 | 0.6178 | 0.5590 | 0.6136 | 0.6136 | 0.6136 |
| 0.2028 | 16.0 | 464 | 1.8048 | 0.4526 | 0.5227 | 0.5112 | 0.4703 | 0.5227 | 0.5378 | 0.4668 | 0.5227 | 0.5227 | 0.5227 |
| 0.3251 | 17.0 | 493 | 1.6340 | 0.4942 | 0.5833 | 0.5625 | 0.5049 | 0.5833 | 0.5631 | 0.5031 | 0.5833 | 0.5833 | 0.5833 |
| 0.0369 | 18.0 | 522 | 1.5847 | 0.5860 | 0.6439 | 0.6349 | 0.6267 | 0.6439 | 0.6476 | 0.5824 | 0.6439 | 0.6439 | 0.6439 |
| 0.1133 | 19.0 | 551 | 1.5825 | 0.5457 | 0.6288 | 0.6157 | 0.5377 | 0.6288 | 0.6111 | 0.5615 | 0.6288 | 0.6288 | 0.6288 |
| 0.0457 | 20.0 | 580 | 1.7253 | 0.5258 | 0.6136 | 0.5938 | 0.5229 | 0.6136 | 0.5854 | 0.5391 | 0.6136 | 0.6136 | 0.6136 |
| 0.1109 | 21.0 | 609 | 1.7898 | 0.5708 | 0.6212 | 0.6154 | 0.6150 | 0.6212 | 0.6283 | 0.5613 | 0.6212 | 0.6212 | 0.6212 |
| 0.046 | 22.0 | 638 | 1.7368 | 0.5656 | 0.6136 | 0.6029 | 0.6021 | 0.6136 | 0.6103 | 0.5615 | 0.6136 | 0.6136 | 0.6136 |
| 0.0553 | 23.0 | 667 | 2.2478 | 0.4822 | 0.5682 | 0.5430 | 0.4851 | 0.5682 | 0.5380 | 0.4975 | 0.5682 | 0.5682 | 0.5682 |
| 0.0047 | 24.0 | 696 | 2.1705 | 0.5133 | 0.5909 | 0.5750 | 0.5158 | 0.5909 | 0.5716 | 0.5220 | 0.5909 | 0.5909 | 0.5909 |
| 0.0104 | 25.0 | 725 | 2.2669 | 0.4950 | 0.5833 | 0.5622 | 0.5035 | 0.5833 | 0.5609 | 0.5038 | 0.5833 | 0.5833 | 0.5833 |
| 0.0287 | 26.0 | 754 | 2.0390 | 0.5267 | 0.6061 | 0.5935 | 0.5265 | 0.6061 | 0.5898 | 0.5346 | 0.6061 | 0.6061 | 0.6061 |
| 0.0212 | 27.0 | 783 | 2.1345 | 0.5344 | 0.6136 | 0.6005 | 0.5308 | 0.6136 | 0.5946 | 0.5449 | 0.6136 | 0.6136 | 0.6136 |
| 0.0221 | 28.0 | 812 | 2.1555 | 0.5607 | 0.6136 | 0.6035 | 0.5953 | 0.6136 | 0.6107 | 0.5583 | 0.6136 | 0.6136 | 0.6136 |
| 0.001 | 29.0 | 841 | 2.1102 | 0.5833 | 0.6364 | 0.6289 | 0.6172 | 0.6364 | 0.6353 | 0.5789 | 0.6364 | 0.6364 | 0.6364 |
| 0.0045 | 30.0 | 870 | 2.0669 | 0.5862 | 0.6364 | 0.6290 | 0.6164 | 0.6364 | 0.6326 | 0.5831 | 0.6364 | 0.6364 | 0.6364 |
| 0.0021 | 31.0 | 899 | 2.1442 | 0.5833 | 0.6364 | 0.6282 | 0.6165 | 0.6364 | 0.6330 | 0.5789 | 0.6364 | 0.6364 | 0.6364 |
| 0.0014 | 32.0 | 928 | 2.1435 | 0.5616 | 0.6136 | 0.6049 | 0.5957 | 0.6136 | 0.6099 | 0.5569 | 0.6136 | 0.6136 | 0.6136 |
| 0.0016 | 33.0 | 957 | 2.1279 | 0.5621 | 0.6136 | 0.6047 | 0.5966 | 0.6136 | 0.6093 | 0.5569 | 0.6136 | 0.6136 | 0.6136 |
| 0.0008 | 34.0 | 986 | 2.1310 | 0.5691 | 0.6212 | 0.6127 | 0.6030 | 0.6212 | 0.6170 | 0.5641 | 0.6212 | 0.6212 | 0.6212 |
| 0.0006 | 35.0 | 1015 | 2.1338 | 0.5690 | 0.6212 | 0.6130 | 0.6026 | 0.6212 | 0.6176 | 0.5641 | 0.6212 | 0.6212 | 0.6212 |
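
The metric columns above are consistent with scikit-learn's averaged F1, precision, and recall. The following `compute_metrics` function is a hedged reconstruction of how such values could be produced with the `Trainer`, not the exact code used for this run:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred):
    """Compute the averaged metrics reported in the table above."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    metrics = {"accuracy": accuracy_score(labels, preds)}
    for avg in ("macro", "micro", "weighted"):
        metrics[f"f1_{avg}"] = f1_score(labels, preds, average=avg)
        metrics[f"precision_{avg}"] = precision_score(
            labels, preds, average=avg, zero_division=0
        )
        metrics[f"recall_{avg}"] = recall_score(labels, preds, average=avg)
    return metrics
```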
### Framework versions
- Transformers 4.48.2
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0