---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-medium
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: l3-whisper-medium-l3c3_e1_v7
    results: []
---

# l3-whisper-medium-l3c3_e1_v7

This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.3650
- Wer: 13.5596
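
The checkpoint can be loaded with the standard `transformers` automatic-speech-recognition pipeline. A minimal sketch, assuming the model is published under the `mondhs/l3-whisper-medium-l3c3_e1_v7` repo id (inferred from the card title, not confirmed):

```python
from transformers import pipeline

# Repo id inferred from the card title; adjust if the checkpoint
# lives under a different namespace.
asr = pipeline(
    "automatic-speech-recognition",
    model="mondhs/l3-whisper-medium-l3c3_e1_v7",
    chunk_length_s=30,  # Whisper operates on 30-second audio windows
)

print(asr("sample.wav")["text"])
```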

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
- mixed_precision_training: Native AMP
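
For reference, a sketch of `Seq2SeqTrainingArguments` mirroring the hyperparameters above; `output_dir` and the evaluation cadence (every 3000 steps, read off the results table below) are assumptions, not stated in the card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="l3-whisper-medium-l3c3_e1_v7",  # assumed path
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # 8 x 2 = effective batch size 16
    optim="adamw_torch",             # AdamW with default betas/epsilon
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1,
    fp16=True,                       # "Native AMP" mixed precision
    eval_strategy="steps",           # assumed from the 3000-step eval log
    eval_steps=3000,
)
```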

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer     |
|:-------------:|:------:|:-----:|:---------------:|:-------:|
| 0.2112        | 0.0927 | 3000  | 0.4719          | 22.7174 |
| 0.1775        | 0.1854 | 6000  | 0.4478          | 19.2846 |
| 0.16          | 0.2782 | 9000  | 0.4300          | 17.8644 |
| 0.1499        | 0.3709 | 12000 | 0.4118          | 16.7179 |
| 0.1429        | 0.4636 | 15000 | 0.3920          | 15.8724 |
| 0.1354        | 0.5563 | 18000 | 0.3749          | 15.3029 |
| 0.1317        | 0.6490 | 21000 | 0.3774          | 15.0261 |
| 0.1279        | 0.7417 | 24000 | 0.3668          | 14.1700 |
| 0.1247        | 0.8345 | 27000 | 0.3704          | 13.8788 |
| 0.1213        | 0.9272 | 30000 | 0.3650          | 13.5596 |
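
The Wer column is the word error rate in percent. A minimal sketch of the usual computation with the `evaluate` library (the exact metric implementation and text normalization used for this card are not documented, so treat this as an assumption):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Hypothetical decoded transcripts and references, for illustration only.
predictions = ["the quick brown fox jumps over the dog"]
references = ["the quick brown fox jumps over the lazy dog"]

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 11.1111 here: 1 deletion over 9 reference words
```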

### Framework versions

- Transformers 4.51.3
- Pytorch 2.4.1+cu124
- Datasets 3.6.0
- Tokenizers 0.21.4