javi8979 committed
Commit f3cd101 · verified · 1 Parent(s): 0b01bb4

Update README.md

Files changed (1)
  1. README.md +51 -0
README.md CHANGED
@@ -604,6 +604,57 @@ Below are the evaluation results on Flores-200 dev and devtest compared to NLLB-
604
605   </details>
606
607 + ## Evaluation Aranese, Aragonese, Asturian
608 +
609 + Using [MT Lens](https://github.com/langtech-bsc/mt-evaluation), we evaluate Spanish-Asturian, Spanish-Aragonese and Spanish-Aranese with BLEU and ChrF scores on the [Flores+ dev](https://github.com/openlanguagedata/flores) evaluation dataset. We also report BLEU and ChrF scores for the Catalan directions.
610 +
611 + ### Asturian Flores+ dev
612 +
613 + Below are the evaluation results compared to [Apertium](https://www.apertium.org/), [Eslema](https://eslema.it.uniovi.es/) and NLLB ([Costa-jussà et al., 2022](https://arxiv.org/abs/2207.04672)).
614 +
615 + |                 | source | target | BLEU      | ChrF      |
616 + |:----------------|:-------|:-------|----------:|----------:|
617 + | nllb 3.3B       | es     | ast    | **18.78** | 50.5      |
618 + | Eslema          | es     | ast    | 17.30     | **50.77** |
619 + | nllb 600M       | es     | ast    | 17.23     | 49.72     |
620 + | SalamandraTA-2B | es     | ast    | 17.11     | 49.49     |
621 + | Apertium        | es     | ast    | 16.66     | 50.57     |
622 + |                 |        |        |           |           |
623 + |                 |        |        |           |           |
624 + | nllb 3.3B       | ca     | ast    | **25.87** | 54.9      |
625 + | SalamandraTA-2B | ca     | ast    | 25.17     | **55.17** |
626 +
627 +
628 + ### Aragonese Flores+ dev
629 +
630 + Below are the evaluation results compared to [Apertium](https://www.apertium.org/), [Softcatala](https://www.softcatala.org/traductor/) and [Traduze](https://traduze.aragon.es).
631 +
632 + |                 | source | target | BLEU      | ChrF      |
633 + |:----------------|:-------|:-------|----------:|----------:|
634 + | Apertium        | es     | an     | **65.34** | **82.00** |
635 + | Softcatala      | es     | an     | 50.21     | 73.97     |
636 + | SalamandraTA-2B | es     | an     | 49.13     | 74.22     |
637 + | Traduze         | es     | an     | 37.43     | 69.51     |
638 + |                 |        |        |           |           |
639 + |                 |        |        |           |           |
640 + | SalamandraTA-2B | ca     | an     | 17.06     | 49.12     |
641 +
642 +
643 + ### Aranese Flores+ dev
644 +
645 + Below are the evaluation results compared to [Apertium](https://www.apertium.org/) and [Softcatala](https://www.softcatala.org/traductor/).
646 +
647 +
648 + |                 | source | target | BLEU      | ChrF      |
649 + |:----------------|:-------|:-------|----------:|----------:|
650 + | Apertium        | es     | arn    | **48.96** | **72.63** |
651 + | Softcatala      | es     | arn    | 34.43     | 58.61     |
652 + | SalamandraTA-2B | es     | arn    | 34.35     | 57.78     |
653 + |                 |        |        |           |           |
654 + |                 |        |        |           |           |
655 + | SalamandraTA-2B | ca     | arn    | 21.95     | 48.67     |
656 +
657 +
658
659
660
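
The README section added above (line 609) scores the systems with BLEU and ChrF on Flores+ dev via MT Lens. As a rough illustration of that scoring step only, and not of the MT Lens pipeline itself, the sketch below calls the sacrebleu library directly; the file names and the es→ast direction are assumptions made for the example.

```python
# Minimal sketch of the BLEU/ChrF scoring step, assuming sacrebleu is installed
# (pip install sacrebleu) and that system outputs and Flores+ dev references are
# available as plain-text files with one sentence per line. File names below are
# hypothetical; MT Lens wraps this kind of scoring with additional tooling.
from sacrebleu.metrics import BLEU, CHRF


def read_lines(path: str) -> list[str]:
    with open(path, encoding="utf-8") as f:
        return [line.rstrip("\n") for line in f]


# Hypothetical files for the Spanish->Asturian direction.
hypotheses = read_lines("salamandrata2b.es-ast.hyp")  # system translations
references = read_lines("flores_plus.dev.ast")        # Flores+ dev references

bleu = BLEU()  # corpus-level BLEU with sacrebleu's default tokenization
chrf = CHRF()  # character n-gram F-score (chrF)

print(f"BLEU: {bleu.corpus_score(hypotheses, [references]).score:.2f}")
print(f"ChrF: {chrf.corpus_score(hypotheses, [references]).score:.2f}")
```

Repeating this for each system and direction (es→ast, es→an, es→arn and the ca→* pairs) would yield figures comparable to those collected in the tables above.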