## Llama 3.2 Chibi 3B

This experimental model is the result of continual pre-training of [Meta's Llama 3.2 3B](https://huggingface.co/meta-llama/Llama-3.2-3B) on a small mixture of Japanese datasets.

It is not fine-tuned for chat or dialogue-based tasks. It has been pre-trained for general language modeling and may require additional fine-tuning for specific applications such as conversational agents or other downstream tasks. Users interested in deploying this model in interactive environments should consider further fine-tuning on suitable datasets.

## Architecture

[Llama 3.2 3B](https://huggingface.co/meta-llama/Llama-3.2-3B)