Update README.md
README.md
@@ -114,7 +114,7 @@ As shown in the table, our SeaLLM model outperforms most 13B baselines and reach
 | M3Exam / 3-shot (Acc) | En | Zh | Vi | Id | Th
 |-----------| ------- | ------- | ------- | ------- | ------- |
 | Random | 25.00 | 25.00 | 25.00 | 23.00 | 23.00
-| ChatGPT | 75.46 | 60.20 | 58.64 |
+| ChatGPT | 75.46 | 60.20 | 58.64 | 49.27 | 37.41
 | Llama-2-13b | 59.88 | 43.40 | 41.70 | 34.80 | 23.18
 | Llama-2-13b-chat | 61.17 | 43.29 | 39.97 | 35.50 | 23.74
 | Polylm-13b-chat | 32.23 | 29.26 | 29.01 | 25.36 | 18.08