JingweiZuo committed:
Chore: update Nemo12B eval results
Update evaluation results for Mistral-Nemo-Base-2407 (12B)
README.md
CHANGED
@@ -328,6 +328,7 @@ Also, we evaluate our model on the benchmarks of the first leaderboard using `li
 | `Meta-Llama-3-8B` | 60.24 | 82.23 | 66.70 | 78.45 | 42.93 | 45.19 | 62.62 |
 | `Meta-Llama-3.1-8B` | 58.53 | 82.13 | 66.43 | 74.35 | 44.29 | 47.92 | 62.28 |
 | `Mistral-7B-v0.1` | 59.98 | 83.31 | 64.16 | 78.37 | 42.15 | 37.83 | 60.97 |
+| `Mistral-Nemo-Base-2407 (12B)`<sup>*</sup> | 57.94 | 82.82 | 64.43 | 73.72 | 49.14 | 55.27 | 63.89 |
 | `gemma-7B` | 61.09 | 82.20 | 64.56 | 79.01 | 44.79 | 50.87 | 63.75 |
 | ***RWKV models*** | | | | | | | |
 | `RWKV-v6-Finch-7B`<sup>*</sup> | 43.86 | 75.19 | 41.69 | 68.27 | 42.19 | 19.64 | 48.47 |