Whisper - 22 hours of training data
This model is part of a collection comparing different data selection methods, where the amount of training data is kept the same.
This model is a fine-tuned version of openai/whisper-large-v2 on the JASMIN-CGN dataset. Its results on the evaluation set are reported in the training results table below.
Model description: more information needed.
Intended uses & limitations: more information needed.
Training and evaluation data: more information needed.
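The card itself does not include a usage snippet. Assuming the fine-tuned checkpoint is published on the Hugging Face Hub, transcription would look roughly like the sketch below; the repository id shown is a placeholder, not the actual model name.

```python
# Minimal inference sketch. "your-org/whisper-large-v2-jasmin-22h" is a placeholder
# repository id (assumption); replace it with the actual checkpoint name.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-org/whisper-large-v2-jasmin-22h",
    chunk_length_s=30,  # Whisper operates on 30-second audio windows
)

print(asr("sample.wav")["text"])
```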
The following hyperparameters were used during training:
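The specific values are not reproduced here. Purely as an illustration, a Whisper fine-tune of this kind is commonly configured through `Seq2SeqTrainingArguments`; in the sketch below every value is a placeholder except the 25-step evaluation interval and the roughly three training epochs, both of which can be read off the results table.

```python
# Illustrative configuration sketch only; the actual hyperparameters of this run
# are not listed in the card, so all values below are placeholders unless noted.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-jasmin-22h",  # placeholder path
    per_device_train_batch_size=16,              # placeholder
    learning_rate=1e-5,                          # placeholder
    num_train_epochs=3,                          # matches the ~3 epochs in the results table
    eval_strategy="steps",
    eval_steps=25,                               # the table reports validation metrics every 25 steps
    logging_steps=25,
    predict_with_generate=True,                  # generate transcripts at eval time so WER can be computed
    fp16=True,                                   # placeholder; depends on available hardware
)
```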
Training results:

| Training Loss | Epoch | Step | Validation Loss | WER (%) |
|---|---|---|---|---|
| 1.0263 | 0.1543 | 25 | 1.2168 | 38.0448 |
| 1.0661 | 0.3086 | 50 | 1.1739 | 37.5583 |
| 0.9871 | 0.4630 | 75 | 1.1019 | 36.3337 |
| 0.8873 | 0.6173 | 100 | 1.0211 | 35.1126 |
| 0.8572 | 0.7716 | 125 | 0.9299 | 34.6831 |
| 0.7706 | 0.9259 | 150 | 0.8373 | 32.4219 |
| 0.7349 | 1.0802 | 175 | 0.7395 | 31.9455 |
| 0.6256 | 1.2346 | 200 | 0.6531 | 31.5127 |
| 0.5956 | 1.3889 | 225 | 0.5880 | 28.6141 |
| 0.5474 | 1.5432 | 250 | 0.5423 | 26.7555 |
| 0.5441 | 1.6975 | 275 | 0.5079 | 25.7490 |
| 0.5065 | 1.8519 | 300 | 0.4803 | 23.9541 |
| 0.5186 | 2.0062 | 325 | 0.4605 | 22.4477 |
| 0.4526 | 2.1605 | 350 | 0.4491 | 22.9241 |
| 0.4636 | 2.3148 | 375 | 0.4415 | 22.7229 |
| 0.4768 | 2.4691 | 400 | 0.4364 | 22.6692 |
| 0.4721 | 2.6235 | 425 | 0.4332 | 22.5081 |
| 0.4729 | 2.7778 | 450 | 0.4312 | 22.5115 |
| 0.4786 | 2.9321 | 475 | 0.4303 | 22.4947 |
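The card does not state how WER was computed; a common choice in Hugging Face fine-tuning examples is the `evaluate` package's `wer` metric, sketched below with made-up Dutch example strings (JASMIN-CGN is a Dutch speech corpus).

```python
# Sketch of a WER computation consistent with the percentages in the table above.
# The card does not specify the tooling; the `evaluate` package is an assumption.
import evaluate

wer_metric = evaluate.load("wer")

# Made-up Dutch example transcripts, for illustration only.
predictions = ["de kat zit op de mat"]
references = ["de kat zat op de mat"]

# evaluate's `wer` returns a fraction; multiply by 100 to match the table's scale.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")  # one substitution out of six words -> 16.67
```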