## Usage


```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

path = "mssma/ko-solar-10.7b-v0.5"

# Load the model in half precision and shard it across the available devices
model = AutoModelForCausalLM.from_pretrained(
    path,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(path)
```
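
The card stops after loading the model. As a minimal sketch of how inference might look with the standard `transformers` generation API (the Korean prompt and sampling parameters below are illustrative assumptions, not from the original card):

```python
# Hypothetical prompt; the card does not document a prompt template.
prompt = "대한민국의 수도는 어디인가요?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sample up to 128 new tokens; generation settings are assumptions.
with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=0.7,
    )

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```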