# square_run_second_vote
This model is a fine-tuned version of google/vit-base-patch16-224 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.1557
- F1 Macro: 0.5777
- F1 Micro: 0.6667
- F1 Weighted: 0.6629
- Precision Macro: 0.5756
- Precision Micro: 0.6667
- Precision Weighted: 0.6734
- Recall Macro: 0.5912
- Recall Micro: 0.6667
- Recall Weighted: 0.6667
- Accuracy: 0.6667
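
The scores above are standard multi-class classification metrics. As a minimal sketch (not the exact evaluation code used for this run), they can be reproduced from predicted and true class ids with scikit-learn:

```python
# Minimal sketch of how the reported metrics can be computed with scikit-learn.
# `y_true` and `y_pred` are placeholder label lists, not the actual evaluation data.
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

y_true = [0, 1, 2, 2, 1]   # ground-truth class ids (illustrative only)
y_pred = [0, 2, 2, 2, 1]   # model predictions (illustrative only)

metrics = {
    "accuracy": accuracy_score(y_true, y_pred),
    "f1_macro": f1_score(y_true, y_pred, average="macro"),
    "f1_micro": f1_score(y_true, y_pred, average="micro"),
    "f1_weighted": f1_score(y_true, y_pred, average="weighted"),
    "precision_macro": precision_score(y_true, y_pred, average="macro", zero_division=0),
    "precision_micro": precision_score(y_true, y_pred, average="micro", zero_division=0),
    "precision_weighted": precision_score(y_true, y_pred, average="weighted", zero_division=0),
    "recall_macro": recall_score(y_true, y_pred, average="macro", zero_division=0),
    "recall_micro": recall_score(y_true, y_pred, average="micro", zero_division=0),
    "recall_weighted": recall_score(y_true, y_pred, average="weighted", zero_division=0),
}
print(metrics)
```

Note that for single-label multi-class classification, micro-averaged F1, precision, and recall all equal accuracy, which is why those four reported values coincide at 0.6667.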
## Model description
More information needed
## Intended uses & limitations
More information needed
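
No usage example is provided. As a minimal, hedged sketch (assuming this checkpoint is available on the Hub as `corranm/square_run_second_vote` and exposes the standard ViT image-classification interface), inference could look like:

```python
# Minimal inference sketch; the repo id and the input image path are taken from
# this card / used as placeholders, not verified details of the checkpoint.
from transformers import pipeline
from PIL import Image

classifier = pipeline("image-classification", model="corranm/square_run_second_vote")

image = Image.open("example.jpg")  # placeholder path to an input image
for prediction in classifier(image):
    print(prediction["label"], round(prediction["score"], 4))
```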
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (8-bit, via bitsandbytes; `OptimizerNames.ADAMW_BNB`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
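
For reference, a hedged sketch of how these hyperparameters map onto `TrainingArguments` (argument names follow Transformers 4.48; the actual training script is not included in this card, and the output directory is a placeholder):

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# Any setting not listed in the card is left at its default or is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="square_run_second_vote",  # placeholder output path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_bnb_8bit",               # 8-bit AdamW via bitsandbytes
    adam_beta1=0.9,                       # defaults, shown explicitly
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
)
```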
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | F1 Weighted | Precision Macro | Precision Micro | Precision Weighted | Recall Macro | Recall Micro | Recall Weighted | Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.8754 | 1.0 | 58 | 1.7961 | 0.1385 | 0.2803 | 0.1722 | 0.1426 | 0.2803 | 0.1603 | 0.2098 | 0.2803 | 0.2803 | 0.2803 |
| 2.0246 | 2.0 | 116 | 2.0138 | 0.2236 | 0.3106 | 0.2484 | 0.2558 | 0.3106 | 0.2692 | 0.2842 | 0.3106 | 0.3106 | 0.3106 |
| 1.6189 | 3.0 | 174 | 1.5039 | 0.2444 | 0.3864 | 0.3195 | 0.2633 | 0.3864 | 0.3301 | 0.2847 | 0.3864 | 0.3864 | 0.3864 |
| 1.3445 | 4.0 | 232 | 1.3982 | 0.3287 | 0.4394 | 0.3866 | 0.3186 | 0.4394 | 0.3730 | 0.3696 | 0.4394 | 0.4394 | 0.4394 |
| 1.3387 | 5.0 | 290 | 1.1920 | 0.4401 | 0.5758 | 0.5265 | 0.4315 | 0.5758 | 0.5031 | 0.4683 | 0.5758 | 0.5758 | 0.5758 |
| 1.1664 | 6.0 | 348 | 1.1778 | 0.4179 | 0.5076 | 0.4988 | 0.5068 | 0.5076 | 0.5862 | 0.4395 | 0.5076 | 0.5076 | 0.5076 |
| 1.1622 | 7.0 | 406 | 1.1723 | 0.4518 | 0.5379 | 0.5251 | 0.4514 | 0.5379 | 0.5526 | 0.4867 | 0.5379 | 0.5379 | 0.5379 |
| 0.9827 | 8.0 | 464 | 1.0619 | 0.5084 | 0.6212 | 0.6074 | 0.5037 | 0.6212 | 0.6140 | 0.5345 | 0.6212 | 0.6212 | 0.6212 |
| 1.3416 | 9.0 | 522 | 1.3995 | 0.3997 | 0.5 | 0.4690 | 0.4218 | 0.5 | 0.5024 | 0.4509 | 0.5 | 0.5 | 0.5 |
| 0.758 | 10.0 | 580 | 1.1693 | 0.5066 | 0.5985 | 0.5836 | 0.5262 | 0.5985 | 0.6031 | 0.5279 | 0.5985 | 0.5985 | 0.5985 |
| 0.7758 | 11.0 | 638 | 1.0800 | 0.5491 | 0.6515 | 0.6320 | 0.5729 | 0.6515 | 0.6501 | 0.5710 | 0.6515 | 0.6515 | 0.6515 |
| 0.2319 | 12.0 | 696 | 1.1553 | 0.5467 | 0.6742 | 0.6410 | 0.5816 | 0.6742 | 0.6699 | 0.5711 | 0.6742 | 0.6742 | 0.6742 |
| 0.3528 | 13.0 | 754 | 1.1685 | 0.5794 | 0.6894 | 0.6711 | 0.5887 | 0.6894 | 0.6752 | 0.5955 | 0.6894 | 0.6894 | 0.6894 |
| 0.6238 | 14.0 | 812 | 1.1781 | 0.5579 | 0.6439 | 0.6285 | 0.5451 | 0.6439 | 0.6278 | 0.5856 | 0.6439 | 0.6439 | 0.6439 |
| 0.1869 | 15.0 | 870 | 1.2305 | 0.5146 | 0.6061 | 0.5983 | 0.5032 | 0.6061 | 0.6013 | 0.5369 | 0.6061 | 0.6061 | 0.6061 |
| 0.1015 | 16.0 | 928 | 1.3576 | 0.5019 | 0.5909 | 0.5932 | 0.5440 | 0.5909 | 0.6312 | 0.4959 | 0.5909 | 0.5909 | 0.5909 |
| 0.3809 | 17.0 | 986 | 1.2998 | 0.5667 | 0.6591 | 0.6527 | 0.5828 | 0.6591 | 0.6885 | 0.5838 | 0.6591 | 0.6591 | 0.6591 |
| 0.0887 | 18.0 | 1044 | 1.4154 | 0.5572 | 0.6667 | 0.6489 | 0.5682 | 0.6667 | 0.6518 | 0.5683 | 0.6667 | 0.6667 | 0.6667 |
| 0.1422 | 19.0 | 1102 | 1.3989 | 0.5609 | 0.6667 | 0.6472 | 0.5672 | 0.6667 | 0.6420 | 0.5695 | 0.6667 | 0.6667 | 0.6667 |
| 0.0037 | 20.0 | 1160 | 1.5134 | 0.5242 | 0.6212 | 0.6078 | 0.5263 | 0.6212 | 0.6093 | 0.5374 | 0.6212 | 0.6212 | 0.6212 |
| 0.0602 | 21.0 | 1218 | 1.5349 | 0.5660 | 0.6667 | 0.6544 | 0.5710 | 0.6667 | 0.6503 | 0.5671 | 0.6667 | 0.6667 | 0.6667 |
| 0.0353 | 22.0 | 1276 | 1.4489 | 0.6137 | 0.7045 | 0.6919 | 0.6146 | 0.7045 | 0.6909 | 0.6242 | 0.7045 | 0.7045 | 0.7045 |
| 0.001 | 23.0 | 1334 | 1.4781 | 0.5715 | 0.6667 | 0.6541 | 0.5657 | 0.6667 | 0.6449 | 0.5805 | 0.6667 | 0.6667 | 0.6667 |
| 0.0007 | 24.0 | 1392 | 1.6326 | 0.5713 | 0.6591 | 0.6511 | 0.5871 | 0.6591 | 0.6648 | 0.5786 | 0.6591 | 0.6591 | 0.6591 |
| 0.0084 | 25.0 | 1450 | 1.5856 | 0.5684 | 0.6591 | 0.6569 | 0.5662 | 0.6591 | 0.6672 | 0.5802 | 0.6591 | 0.6591 | 0.6591 |
| 0.0008 | 26.0 | 1508 | 1.5799 | 0.5826 | 0.6818 | 0.6675 | 0.5849 | 0.6818 | 0.6632 | 0.5884 | 0.6818 | 0.6818 | 0.6818 |
| 0.0053 | 27.0 | 1566 | 1.5308 | 0.5719 | 0.6667 | 0.6556 | 0.5667 | 0.6667 | 0.6524 | 0.5843 | 0.6667 | 0.6667 | 0.6667 |
| 0.0004 | 28.0 | 1624 | 1.5639 | 0.5732 | 0.6667 | 0.6617 | 0.5684 | 0.6667 | 0.6673 | 0.5867 | 0.6667 | 0.6667 | 0.6667 |
| 0.0007 | 29.0 | 1682 | 1.5346 | 0.5835 | 0.6742 | 0.6678 | 0.5786 | 0.6742 | 0.6703 | 0.5965 | 0.6742 | 0.6742 | 0.6742 |
| 0.0004 | 30.0 | 1740 | 1.5232 | 0.5791 | 0.6742 | 0.6661 | 0.5707 | 0.6742 | 0.6628 | 0.5918 | 0.6742 | 0.6742 | 0.6742 |
### Framework versions
- Transformers 4.48.2
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0