---
license: cc-by-nc-4.0
tags:
- generated_from_trainer
model-index:
- name: wav2vec2-large-uralic-voxpopuli-v2-finnish
  results: []
---

# wav2vec2-large-uralic-voxpopuli-v2-finnish

This model is a fine-tuned version of [facebook/wav2vec2-large-uralic-voxpopuli-v2](https://huggingface.co/facebook/wav2vec2-large-uralic-voxpopuli-v2) for Finnish automatic speech recognition; the fine-tuning dataset was not recorded in the auto-generated card.
It achieves the following results on the evaluation set:
- Loss: 0.0828
- Wer: 0.1075
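
`Wer` above is word error rate, so the final model transcribes roughly one word in ten incorrectly (0.1075 ≈ 10.75 %). A minimal sketch of the metric using the `jiwer` package (an assumption for illustration; the card does not record which WER implementation the Trainer's metric function used):

```python
from jiwer import wer

# Hypothetical Finnish reference/hypothesis pair, for illustration only.
reference = "tervetuloa suomen kielen puheentunnistukseen"
hypothesis = "tervetuloa suomen kielen puheen tunnistukseen"

# WER = (substitutions + deletions + insertions) / number of reference words
print(f"WER: {wer(reference, hypothesis):.4f}")
```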

## Model description

The base checkpoint, [facebook/wav2vec2-large-uralic-voxpopuli-v2](https://huggingface.co/facebook/wav2vec2-large-uralic-voxpopuli-v2), is a wav2vec 2.0 large model pretrained on the Uralic-language portion of the VoxPopuli v2 corpus (unlabelled European Parliament recordings). This checkpoint fine-tunes it with a CTC head for Finnish transcription, as the model name indicates; further details about the training setup were not recorded in the auto-generated card.

## Intended uses & limitations

This model is intended for Finnish automatic speech recognition. Like other wav2vec 2.0 models, it expects 16 kHz mono input audio. Because the fine-tuning data is not documented here, robustness to noisy or out-of-domain speech is unknown. A minimal inference sketch follows.
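
A minimal transcription sketch, assuming the checkpoint is loadable under this repository name (the id below is a placeholder; prepend the actual namespace) and that `sample.wav` contains Finnish speech:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repository id; replace with the published id of this checkpoint.
MODEL_ID = "wav2vec2-large-uralic-voxpopuli-v2-finnish"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load audio and resample to the 16 kHz rate wav2vec 2.0 models expect.
waveform, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000,
                   return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```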

## Training and evaluation data

The fine-tuning and evaluation datasets were not recorded by the Trainer. Only the aggregate evaluation metrics above and the per-step results below are available.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
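
The training script itself is not part of this card; the sketch below shows how the settings above would map onto `transformers.TrainingArguments` in Transformers 4.19 (`output_dir` and the 500-step evaluation interval inferred from the results table are assumptions):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-uralic-voxpopuli-v2-finnish",
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="steps",  # the results table logs every 500 steps
    eval_steps=500,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default.
)
```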

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.9421        | 0.17  | 500   | 0.8633          | 0.8870 |
| 0.572         | 0.33  | 1000  | 0.1650          | 0.1829 |
| 0.5149        | 0.5   | 1500  | 0.1416          | 0.1711 |
| 0.4884        | 0.66  | 2000  | 0.1265          | 0.1605 |
| 0.4729        | 0.83  | 2500  | 0.1205          | 0.1485 |
| 0.4723        | 1.0   | 3000  | 0.1108          | 0.1403 |
| 0.443         | 1.16  | 3500  | 0.1175          | 0.1439 |
| 0.4378        | 1.33  | 4000  | 0.1083          | 0.1482 |
| 0.4313        | 1.49  | 4500  | 0.1110          | 0.1398 |
| 0.4182        | 1.66  | 5000  | 0.1024          | 0.1418 |
| 0.3884        | 1.83  | 5500  | 0.1032          | 0.1395 |
| 0.4034        | 1.99  | 6000  | 0.0985          | 0.1318 |
| 0.3735        | 2.16  | 6500  | 0.1008          | 0.1355 |
| 0.4174        | 2.32  | 7000  | 0.0970          | 0.1361 |
| 0.3581        | 2.49  | 7500  | 0.0968          | 0.1297 |
| 0.3783        | 2.66  | 8000  | 0.0881          | 0.1284 |
| 0.3827        | 2.82  | 8500  | 0.0921          | 0.1352 |
| 0.3651        | 2.99  | 9000  | 0.0861          | 0.1298 |
| 0.3684        | 3.15  | 9500  | 0.0844          | 0.1270 |
| 0.3784        | 3.32  | 10000 | 0.0870          | 0.1248 |
| 0.356         | 3.48  | 10500 | 0.0828          | 0.1214 |
| 0.3524        | 3.65  | 11000 | 0.0878          | 0.1218 |
| 0.3879        | 3.82  | 11500 | 0.0874          | 0.1216 |
| 0.3521        | 3.98  | 12000 | 0.0860          | 0.1210 |
| 0.3527        | 4.15  | 12500 | 0.0818          | 0.1184 |
| 0.3529        | 4.31  | 13000 | 0.0787          | 0.1185 |
| 0.3114        | 4.48  | 13500 | 0.0852          | 0.1202 |
| 0.3495        | 4.65  | 14000 | 0.0807          | 0.1187 |
| 0.34          | 4.81  | 14500 | 0.0796          | 0.1162 |
| 0.3646        | 4.98  | 15000 | 0.0782          | 0.1149 |
| 0.3004        | 5.14  | 15500 | 0.0799          | 0.1142 |
| 0.3167        | 5.31  | 16000 | 0.0847          | 0.1123 |
| 0.3249        | 5.48  | 16500 | 0.0837          | 0.1171 |
| 0.3202        | 5.64  | 17000 | 0.0749          | 0.1109 |
| 0.3104        | 5.81  | 17500 | 0.0798          | 0.1093 |
| 0.3039        | 5.97  | 18000 | 0.0810          | 0.1132 |
| 0.3157        | 6.14  | 18500 | 0.0847          | 0.1156 |
| 0.3133        | 6.31  | 19000 | 0.0833          | 0.1140 |
| 0.3203        | 6.47  | 19500 | 0.0838          | 0.1113 |
| 0.3178        | 6.64  | 20000 | 0.0907          | 0.1141 |
| 0.3182        | 6.8   | 20500 | 0.0938          | 0.1143 |
| 0.3           | 6.97  | 21000 | 0.0854          | 0.1133 |
| 0.3151        | 7.14  | 21500 | 0.0859          | 0.1109 |
| 0.2963        | 7.3   | 22000 | 0.0832          | 0.1122 |
| 0.3099        | 7.47  | 22500 | 0.0865          | 0.1103 |
| 0.322         | 7.63  | 23000 | 0.0833          | 0.1105 |
| 0.3064        | 7.8   | 23500 | 0.0865          | 0.1078 |
| 0.2964        | 7.97  | 24000 | 0.0859          | 0.1096 |
| 0.2869        | 8.13  | 24500 | 0.0872          | 0.1100 |
| 0.315         | 8.3   | 25000 | 0.0869          | 0.1099 |
| 0.3003        | 8.46  | 25500 | 0.0878          | 0.1105 |
| 0.2947        | 8.63  | 26000 | 0.0884          | 0.1084 |
| 0.297         | 8.8   | 26500 | 0.0891          | 0.1102 |
| 0.3049        | 8.96  | 27000 | 0.0863          | 0.1081 |
| 0.2957        | 9.13  | 27500 | 0.0846          | 0.1083 |
| 0.2908        | 9.29  | 28000 | 0.0848          | 0.1059 |
| 0.2955        | 9.46  | 28500 | 0.0846          | 0.1085 |
| 0.2991        | 9.62  | 29000 | 0.0839          | 0.1081 |
| 0.3112        | 9.79  | 29500 | 0.0832          | 0.1071 |
| 0.29          | 9.96  | 30000 | 0.0828          | 0.1075 |


### Framework versions

- Transformers 4.19.2
- Pytorch 1.11.0+cu102
- Datasets 2.2.2
- Tokenizers 0.11.0