---
base_model: mistralai/Mistral-7B-v0.1
datasets:
  - Open-Orca/OpenOrca
  - pubmed
  - medmcqa
  - maximegmd/medqa_alpaca_format
language:
  - en
license: apache-2.0
metrics:
  - accuracy
tags:
  - medical
  - mlx
pipeline_tag: text-generation
---

# cogbuji/MrGrammaticaOntology-internistai-SCT-DRIFT-clinical-problem-0.6.5

The model cogbuji/MrGrammaticaOntology-internistai-SCT-DRIFT-clinical-problem-0.6.5 was converted to MLX format from `internistai/base-7b-v0.2` using mlx-lm version 0.16.0.

## Use with mlx

```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

model, tokenizer = load("cogbuji/MrGrammaticaOntology-internistai-SCT-DRIFT-clinical-problem-0.6.5")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
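Since this checkpoint derives from an instruction-tuned base, you may get better results by wrapping the prompt in the tokenizer's chat template before generating. A minimal sketch (running it requires Apple-silicon hardware with mlx installed; the chat-template check is an assumption about this tokenizer's configuration):

```python
from mlx_lm import load, generate

model, tokenizer = load("cogbuji/MrGrammaticaOntology-internistai-SCT-DRIFT-clinical-problem-0.6.5")

prompt = "hello"
# If the tokenizer ships a chat template, format the prompt with it
# so the model sees the instruction-tuning markup it was trained on.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```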