Update README.md
README.md CHANGED
@@ -3,7 +3,9 @@ license: apache-2.0
 datasets:
 - datajuicer/alpaca-cot-en-refined-by-data-juicer
 ---
-
+## News
+Our first data-centric LLM competition begins! Please visit the competition's official websites, **FT-Data Ranker** ([1B Track](https://tianchi.aliyun.com/competition/entrance/532157), [7B Track](https://tianchi.aliyun.com/competition/entrance/532158)), for more information.
+
+## Instruction
 This is a reference LLM from [Data-Juicer](https://github.com/alibaba/data-juicer).
 
 The model architecture is LLaMA-7B and we built it upon the pre-trained [checkpoint](https://huggingface.co/huggyllama/llama-7b).