
52Hz Small Fr - IMT Atlantique X 52 Hertz

This model is a fine-tuned version of openai/whisper-small on the "Premier dataset organisé de 52 Hertz" dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3311
  • WER: 19.5896
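WER (word error rate) is the standard ASR metric: the word-level edit distance between the hypothesis and the reference transcript, divided by the number of reference words, times 100. A minimal self-contained sketch of the computation (not the evaluation code used for this model):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length * 100."""
    ref, hyp = reference.split(), hypothesis.split()
    # Wagner-Fischer edit-distance table of size (len(ref)+1) x (len(hyp)+1).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return 100.0 * d[-1][-1] / len(ref)

print(wer("le chant des baleines", "le chant de baleines"))  # one substitution over 4 words -> 25.0
```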

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 8e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: adamw_torch_fused (betas=(0.9, 0.999), epsilon=1e-08); no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 30
  • mixed_precision_training: Native AMP
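Note that the total train batch size is derived: train_batch_size × gradient_accumulation_steps = 4 × 4 = 16. A hedged sketch of the configuration above as a plain dict (the key names mirror transformers' TrainingArguments, but this is not the actual training script):

```python
# Training configuration from the list above, expressed as a dict.
# Key names follow transformers.TrainingArguments conventions (assumption).
config = {
    "learning_rate": 8e-5,
    "per_device_train_batch_size": 4,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "gradient_accumulation_steps": 4,
    "optim": "adamw_torch_fused",
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "cosine",
    "warmup_ratio": 0.1,
    "num_train_epochs": 30,
    "fp16": True,  # Native AMP mixed precision
}

# Effective (total) train batch size: 4 * 4 = 16.
effective_batch = config["per_device_train_batch_size"] * config["gradient_accumulation_steps"]
print(effective_batch)  # 16
```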

Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 2.6698        | 1.0   | 21   | 1.0703          | 65.6716 |
| 1.4505        | 2.0   | 42   | 0.7071          | 40.1119 |
| 0.8063        | 3.0   | 63   | 0.4246          | 33.9552 |
| 0.5185        | 4.0   | 84   | 0.3435          | 30.5970 |
| 0.3976        | 5.0   | 105  | 0.3134          | 28.5448 |
| 0.2784        | 6.0   | 126  | 0.3100          | 28.1716 |
| 0.2058        | 7.0   | 147  | 0.2878          | 23.6940 |
| 0.1687        | 8.0   | 168  | 0.3124          | 25.3731 |
| 0.1543        | 9.0   | 189  | 0.3016          | 26.6791 |
| 0.1509        | 10.0  | 210  | 0.2871          | 22.5746 |
| 0.0932        | 11.0  | 231  | 0.2906          | 24.4403 |
| 0.0881        | 12.0  | 252  | 0.3020          | 22.0149 |
| 0.0774        | 13.0  | 273  | 0.2969          | 20.3358 |
| 0.0807        | 14.0  | 294  | 0.2988          | 21.6418 |
| 0.0558        | 15.0  | 315  | 0.3083          | 19.5896 |
| 0.0487        | 16.0  | 336  | 0.3145          | 18.4701 |
| 0.0489        | 17.0  | 357  | 0.3200          | 19.4030 |
| 0.0376        | 18.0  | 378  | 0.3349          | 21.8284 |
| 0.0417        | 19.0  | 399  | 0.3251          | 22.9478 |
| 0.0288        | 20.0  | 420  | 0.3154          | 18.2836 |
| 0.0311        | 21.0  | 441  | 0.3236          | 20.3358 |
| 0.0306        | 22.0  | 462  | 0.3177          | 19.5896 |
| 0.0229        | 23.0  | 483  | 0.3253          | 19.7761 |
| 0.0231        | 24.0  | 504  | 0.3298          | 19.9627 |
| 0.0198        | 25.0  | 525  | 0.3316          | 19.7761 |
| 0.0212        | 26.0  | 546  | 0.3322          | 19.7761 |
| 0.0208        | 27.0  | 567  | 0.3309          | 19.5896 |
| 0.0183        | 28.0  | 588  | 0.3317          | 19.5896 |
| 0.0254        | 29.0  | 609  | 0.3312          | 19.5896 |
| 0.0229        | 30.0  | 630  | 0.3311          | 19.5896 |
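The step counts are internally consistent with the hyperparameters above: 21 optimizer steps per epoch at an effective batch size of 16 implies a training set of roughly 21 × 16 ≈ 336 examples (the last batch may be partial), and 30 epochs give the final step count of 630. A quick arithmetic check:

```python
# Sanity-check the training log arithmetic.
steps_per_epoch = 21
effective_batch = 4 * 4   # train_batch_size * gradient_accumulation_steps
epochs = 30

approx_examples = steps_per_epoch * effective_batch  # rough lower bound on dataset size
total_steps = steps_per_epoch * epochs

print(approx_examples, total_steps)  # 336 630
```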

Framework versions

  • PEFT 0.18.1
  • Transformers 4.57.3
  • PyTorch 2.9.1+cu130
  • Datasets 4.4.2
  • Tokenizers 0.22.2

Model tree for MathildeB3/52Hz-small-LORA-fr

  • Adapter of openai/whisper-small