---
language:
- ko
datasets:
- kyujinpy/Open-platypus-Commercial
base_model: google/gemma-7b
library_name: transformers
pipeline_tag: text-generation
license: other
license_name: gemma-terms-of-use
license_link: LICENSE
---

# **gemma-7b-open-platypus-commercial**

## Model Details

**Base Model**
- google/gemma-7b (https://huggingface.co/google/gemma-7b)

**Training Dataset**
- kyujinpy/Open-platypus-Commercial (https://huggingface.co/datasets/kyujinpy/Open-platypus-Commercial)

# Implementation Code
```python
### KO-Platypus
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

repo = "grayhacker91/gemma-7b-open-platypus-commercial"
OpenOrca = AutoModelForCausalLM.from_pretrained(
    repo,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map='auto'
)
OpenOrca_tokenizer = AutoTokenizer.from_pretrained(repo)
```
---
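
# Example Usage

A minimal generation sketch reusing the `OpenOrca` model and `OpenOrca_tokenizer` loaded above. The Alpaca-style instruction template and the generation settings are assumptions for illustration, not a format specified by this card.

```python
### Example generation (a minimal sketch)
# NOTE: the Alpaca-style prompt template below is an assumption;
# adjust it to the format the model was actually fine-tuned with.
prompt = "### Instruction:\n간단히 자기소개를 해줘.\n\n### Response:\n"  # Korean: "Briefly introduce yourself."

# Tokenize the prompt and move it to the model's device.
inputs = OpenOrca_tokenizer(prompt, return_tensors="pt").to(OpenOrca.device)

# Generate a short completion and print the decoded text.
outputs = OpenOrca.generate(**inputs, max_new_tokens=128)
print(OpenOrca_tokenizer.decode(outputs[0], skip_special_tokens=True))
```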