52Hz Small Fr - IMT Atlantique X 52 Hertz

This model is a fine-tuned version of openai/whisper-small on the Transcriptions IMTx52Hz v3 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4065
  • WER: 17.8105
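For reference, WER (word error rate) is the word-level edit distance between hypothesis and reference transcript, divided by the number of reference words, times 100. A minimal, self-contained sketch (the example strings are illustrative only, not from the dataset):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word Error Rate: word-level Levenshtein distance / reference word count, x 100."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution
            )
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)

# One substituted word out of four reference words -> 25.0
print(wer("le chant des baleines", "le chant des baleine"))
```

By this measure, the reported 17.81 corresponds to roughly one word error per five to six reference words.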

Model description

More information needed

Intended uses & limitations

More information needed
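Pending fuller documentation, a minimal usage sketch with the standard `transformers` ASR pipeline (the model id is this repository; `audio.wav` is a placeholder for any local recording, and downloading the checkpoint requires network access):

```python
from transformers import pipeline

# Load this card's checkpoint (fine-tuned from openai/whisper-small).
asr = pipeline(
    "automatic-speech-recognition",
    model="MathildeB3/52Hz-small-fr-v3",
)

# "audio.wav" is a placeholder path; forcing French decoding is an
# assumption based on the "fr" in the model name.
result = asr("audio.wav", generate_kwargs={"language": "french"})
print(result["text"])
```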

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 15
  • mixed_precision_training: Native AMP
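A sketch of what this schedule implies, assuming 330 total optimizer steps (22 per epoch × 15 epochs, matching the step counts in the results table): the learning rate ramps linearly from 0 to 2e-05 over the first 10% of steps, then follows a cosine decay to zero, as in `transformers`' `get_cosine_schedule_with_warmup`.

```python
import math

PEAK_LR = 2e-5
TOTAL_STEPS = 330                       # 22 optimizer steps/epoch x 15 epochs
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)   # warmup_ratio 0.1 -> 33 steps

def lr_at(step: int) -> float:
    """Learning rate at a given optimizer step: linear warmup, then cosine decay."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    progress = (step - WARMUP_STEPS) / (TOTAL_STEPS - WARMUP_STEPS)
    return PEAK_LR * 0.5 * (1.0 + math.cos(math.pi * progress))

print(lr_at(0), lr_at(WARMUP_STEPS), lr_at(TOTAL_STEPS))
```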

Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|---------------|-------|------|-----------------|---------|
| 1.4611        | 1.0   | 22   | 0.8947          | 39.8693 |
| 0.6451        | 2.0   | 44   | 0.5211          | 25.4902 |
| 0.3082        | 3.0   | 66   | 0.4803          | 20.4248 |
| 0.224         | 4.0   | 88   | 0.4255          | 37.7451 |
| 0.1531        | 5.0   | 110  | 0.4062          | 17.9739 |
| 0.1294        | 6.0   | 132  | 0.4146          | 19.1176 |
| 0.1018        | 7.0   | 154  | 0.4123          | 19.6078 |
| 0.0546        | 8.0   | 176  | 0.4034          | 20.2614 |
| 0.0401        | 9.0   | 198  | 0.4386          | 17.3203 |
| 0.0476        | 10.0  | 220  | 0.4225          | 18.6275 |
| 0.0261        | 11.0  | 242  | 0.4131          | 18.4641 |
| 0.0226        | 12.0  | 264  | 0.4035          | 17.8105 |
| 0.0323        | 13.0  | 286  | 0.4050          | 17.6471 |
| 0.0208        | 14.0  | 308  | 0.4064          | 17.8105 |
| 0.0189        | 15.0  | 330  | 0.4065          | 17.8105 |

Framework versions

  • Transformers 4.57.3
  • Pytorch 2.9.1+cu130
  • Datasets 4.4.2
  • Tokenizers 0.22.2
Model: MathildeB3/52Hz-small-fr-v3 · 0.2B parameters · F32 (Safetensors)