---
base_model: mesolitica/malaysian-mistral-7b-32k-instructions-v4
inference: false
model_creator: mesolitica
model_name: Malaysian Mistral 7B 32k Instructions v4
model_type: mistral
pipeline_tag: text-generation
prompt_template: >-
  [INST] This is a system prompt.
  This is the first user input. [/INST] This is the first assistant response.
  [INST] This is the second user input. [/INST]
quantized_by: prsyahmi
tags:
- finetuned
language:
- ms
- en
---
# Malaysian Mistral 7B 32k Instructions v4 - GGUF
- Model creator: [mesolitica](https://huggingface.co/mesolitica)
- Original model: [Malaysian Mistral 7B 32k Instructions v4](https://huggingface.co/mesolitica/malaysian-mistral-7b-32k-instructions-v4)
## Introduction
This repo contains the model in GGUF format, the file format used by llama.cpp, an application written in C/C++. Because it has few dependencies on other software and libraries, it is lightweight compared with most Python-based applications.
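As an illustration of how a GGUF file from this repo can be used, below is a minimal sketch with the llama-cpp-python bindings (one of several llama.cpp front ends). The quantised file name is a placeholder, not necessarily a file in this repo; substitute the actual file you download from Files and versions.
```python
# Minimal sketch: loading and running a GGUF file with llama-cpp-python.
# The model_path below is a placeholder; use a real file from this repo.
from llama_cpp import Llama

llm = Llama(
    model_path="malaysian-mistral-7b-32k-instructions-v4.Q4_K_M.gguf",  # placeholder name
    n_ctx=32768,  # the model supports a 32k context window
)

prompt = "[INST] Apakah ibu negara Malaysia? [/INST]"
result = llm(prompt, max_tokens=128, stop=["[INST]"])
print(result["choices"][0]["text"])
```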
## Prompt template: Mistral
```
[INST] This is a system prompt.
This is the first user input. [/INST] This is the first assistant response.
[INST] This is the second user input. [/INST]
```
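A small helper like the following (hypothetical, not part of the original repo) shows one way to assemble a multi-turn conversation into this template:
```python
def build_prompt(system_prompt, turns, new_user_input):
    """Format a conversation into the Mistral [INST] template shown above.

    `turns` is a list of (user_input, assistant_response) pairs from earlier
    exchanges; `new_user_input` is the message awaiting a reply.
    """
    parts = []
    for i, (user, assistant) in enumerate(turns):
        # The system prompt is prepended only to the very first user turn.
        prefix = f"{system_prompt}\n" if i == 0 else ""
        parts.append(f"[INST] {prefix}{user} [/INST] {assistant}")
    prefix = f"{system_prompt}\n" if not turns else ""
    parts.append(f"[INST] {prefix}{new_user_input} [/INST]")
    return "\n".join(parts)


prompt = build_prompt(
    "This is a system prompt.",
    [("This is the first user input.", "This is the first assistant response.")],
    "This is the second user input.",
)
print(prompt)  # reproduces the template above
```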
## Provided files
Please refer to [Files and versions](https://huggingface.co/prsyahmi/malaysian-mistral-7b-32k-instructions-v4-GGUF/tree/main)
## Acknowledgements
Thank you to Husein Zolkepli and the entire [mesolitica](https://huggingface.co/mesolitica) team!
Thanks to their work, we can use and experiment with locally developed AI.