helldivers2-jarvis-asrV8

This model is a fine-tuned version of facebook/wav2vec2-base-960h on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 19.1922
  • WER: 0.1770
  • CER: 0.7950
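
WER and CER are edit distances (insertions + deletions + substitutions) computed at the word and character level, normalized by the reference length. A minimal pure-Python sketch of how these metrics are computed; the actual evaluation likely used a library such as `jiwer` or Hugging Face `evaluate`, which is an assumption:

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over token sequences.
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (r != h))  # substitution
    return dp[-1]

def wer(reference, hypothesis):
    # Word error rate: word-level edits divided by reference word count.
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(reference, hypothesis):
    # Character error rate: character-level edits divided by reference length.
    return edit_distance(reference, hypothesis) / len(reference)
```

Note that CER can exceed WER (as it does here) when the character-level alignment of wrong words diverges heavily from the references.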

Model description

More information needed

Intended uses & limitations

More information needed
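
The card does not include a usage snippet. A minimal transcription sketch using the Transformers CTC API with greedy decoding; the repo id `8688chris/helldivers2-jarvis-asrV8` is taken from this card, and 16 kHz mono input is assumed, as for the base wav2vec2 model:

```python
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

def transcribe(waveform, sampling_rate=16_000,
               model_id="8688chris/helldivers2-jarvis-asrV8"):
    """Greedy CTC decoding of a mono 16 kHz waveform (1-D float array)."""
    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)
    inputs = processor(waveform, sampling_rate=sampling_rate,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(ids)[0]
```

For production use, loading the processor and model once outside the function would avoid repeated downloads.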

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 15
  • eval_batch_size: 15
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 70
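
The linear schedule with warmup ramps the learning rate from 0 up to 1e-05 over the first 50 optimizer steps, then decays it linearly back to 0 by the final step (5110 total steps per the results table below). A sketch of that rule as pure Python, for illustration rather than as the exact Transformers implementation:

```python
def linear_lr(step, base_lr=1e-05, warmup_steps=50, total_steps=5110):
    # Linear warmup from 0 to base_lr over warmup_steps,
    # then linear decay from base_lr down to 0 at total_steps.
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```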

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|--------------:|------:|-----:|----------------:|-------:|-------:|
| 585.4988      | 1.0   | 73   | 271.5495        | 0.3633 | 0.8109 |
| 378.3453      | 2.0   | 146  | 168.1931        | 0.3276 | 0.8056 |
| 319.9564      | 3.0   | 219  | 140.2753        | 0.3131 | 0.8034 |
| 281.8407      | 4.0   | 292  | 127.9118        | 0.2933 | 0.8023 |
| 264.2254      | 5.0   | 365  | 103.9656        | 0.2814 | 0.8020 |
| 233.3232      | 6.0   | 438  | 87.8603         | 0.2655 | 0.8005 |
| 213.0622      | 7.0   | 511  | 80.5251         | 0.2497 | 0.7993 |
| 194.7588      | 8.0   | 584  | 77.2796         | 0.2457 | 0.7991 |
| 197.9257      | 9.0   | 657  | 72.9397         | 0.2378 | 0.7985 |
| 183.4031      | 10.0  | 730  | 57.6201         | 0.2312 | 0.7979 |
| 172.3133      | 11.0  | 803  | 56.8209         | 0.2219 | 0.7976 |
| 165.2571      | 12.0  | 876  | 52.5103         | 0.2219 | 0.7972 |
| 161.9286      | 13.0  | 949  | 50.8869         | 0.2180 | 0.7970 |
| 158.4977      | 14.0  | 1022 | 46.2574         | 0.2127 | 0.7969 |
| 146.6963      | 15.0  | 1095 | 39.4797         | 0.2114 | 0.7968 |
| 131.5992      | 16.0  | 1168 | 38.5002         | 0.2127 | 0.7966 |
| 136.5205      | 17.0  | 1241 | 34.6582         | 0.2100 | 0.7965 |
| 130.8528      | 18.0  | 1314 | 31.8192         | 0.2061 | 0.7964 |
| 122.9594      | 19.0  | 1387 | 31.4980         | 0.1968 | 0.7961 |
| 122.6427      | 20.0  | 1460 | 29.6717         | 0.1955 | 0.7961 |
| 116.4709      | 21.0  | 1533 | 30.8555         | 0.1955 | 0.7960 |
| 124.5335      | 22.0  | 1606 | 31.9685         | 0.1942 | 0.7959 |
| 110.134       | 23.0  | 1679 | 32.0465         | 0.1968 | 0.7963 |
| 105.5423      | 24.0  | 1752 | 30.3238         | 0.1955 | 0.7961 |
| 113.3944      | 25.0  | 1825 | 28.7380         | 0.1889 | 0.7959 |
| 112.8572      | 26.0  | 1898 | 25.8070         | 0.1876 | 0.7957 |
| 106.1037      | 27.0  | 1971 | 28.3786         | 0.1876 | 0.7957 |
| 105.8874      | 28.0  | 2044 | 27.5647         | 0.1889 | 0.7957 |
| 96.3391       | 29.0  | 2117 | 29.5317         | 0.1915 | 0.7958 |
| 103.2364      | 30.0  | 2190 | 26.4454         | 0.1836 | 0.7956 |
| 101.3744      | 31.0  | 2263 | 24.9761         | 0.1836 | 0.7955 |
| 94.1529       | 32.0  | 2336 | 25.9349         | 0.1810 | 0.7954 |
| 89.4031       | 33.0  | 2409 | 18.5316         | 0.1810 | 0.7954 |
| 98.6291       | 34.0  | 2482 | 26.2887         | 0.1810 | 0.7954 |
| 88.7591       | 35.0  | 2555 | 26.2593         | 0.1823 | 0.7954 |
| 91.8604       | 36.0  | 2628 | 27.2263         | 0.1823 | 0.7954 |
| 83.4981       | 37.0  | 2701 | 26.0174         | 0.1810 | 0.7954 |
| 88.0891       | 38.0  | 2774 | 26.2153         | 0.1823 | 0.7955 |
| 85.7155       | 39.0  | 2847 | 24.1102         | 0.1810 | 0.7954 |
| 87.0231       | 40.0  | 2920 | 25.7208         | 0.1836 | 0.7955 |
| 85.2278       | 41.0  | 2993 | 25.6799         | 0.1797 | 0.7952 |
| 83.8083       | 42.0  | 3066 | 25.9679         | 0.1810 | 0.7955 |
| 83.1864       | 43.0  | 3139 | 23.0670         | 0.1797 | 0.7953 |
| 92.3624       | 44.0  | 3212 | 23.0629         | 0.1797 | 0.7953 |
| 101.0567      | 45.0  | 3285 | 23.7475         | 0.1823 | 0.7955 |
| 77.1269       | 46.0  | 3358 | 21.2927         | 0.1836 | 0.7954 |
| 79.782        | 47.0  | 3431 | 22.2427         | 0.1797 | 0.7952 |
| 79.0065       | 48.0  | 3504 | 22.0512         | 0.1783 | 0.7952 |
| 78.0512       | 49.0  | 3577 | 22.9018         | 0.1770 | 0.7950 |
| 84.4353       | 50.0  | 3650 | 23.3538         | 0.1783 | 0.7952 |
| 76.3221       | 51.0  | 3723 | 21.3755         | 0.1823 | 0.7952 |
| 74.6622       | 52.0  | 3796 | 22.8240         | 0.1783 | 0.7951 |
| 82.2913       | 53.0  | 3869 | 21.2273         | 0.1770 | 0.7950 |
| 71.0879       | 54.0  | 3942 | 20.8637         | 0.1770 | 0.7950 |
| 78.6892       | 55.0  | 4015 | 22.2313         | 0.1770 | 0.7951 |
| 68.3446       | 56.0  | 4088 | 20.8211         | 0.1797 | 0.7952 |
| 78.6938       | 57.0  | 4161 | 21.4202         | 0.1783 | 0.7952 |
| 76.9933       | 58.0  | 4234 | 20.0210         | 0.1783 | 0.7951 |
| 70.8471       | 59.0  | 4307 | 22.2786         | 0.1783 | 0.7951 |
| 76.5866       | 60.0  | 4380 | 20.0782         | 0.1770 | 0.7951 |
| 69.9045       | 61.0  | 4453 | 21.1576         | 0.1797 | 0.7951 |
| 81.1508       | 62.0  | 4526 | 18.7841         | 0.1797 | 0.7951 |
| 82.9743       | 63.0  | 4599 | 18.4739         | 0.1823 | 0.7952 |
| 80.8333       | 64.0  | 4672 | 20.2582         | 0.1810 | 0.7954 |
| 71.9855       | 65.0  | 4745 | 21.3909         | 0.1770 | 0.7952 |
| 80.0273       | 66.0  | 4818 | 20.2303         | 0.1797 | 0.7952 |
| 74.9079       | 67.0  | 4891 | 19.8304         | 0.1783 | 0.7952 |
| 75.7448       | 68.0  | 4964 | 21.5770         | 0.1823 | 0.7954 |
| 71.0606       | 69.0  | 5037 | 20.9629         | 0.1797 | 0.7952 |
| 75.027        | 70.0  | 5110 | 19.1922         | 0.1770 | 0.7950 |

Framework versions

  • Transformers 4.51.3
  • PyTorch 2.5.1+cu121
  • Datasets 3.6.0
  • Tokenizers 0.21.1

Model size: 94.4M parameters (F32, Safetensors)

Model repository: 8688chris/helldivers2-jarvis-asrV8, fine-tuned from facebook/wav2vec2-base-960h