---
license: apache-2.0
language:
- en
tags:
- medical
---
This repo contains MedLLaMA_13B, a LLaMA-13B model fine-tuned on medical corpora.

The model was trained with the following hyperparameters; a sketch of a matching training configuration follows the list:

* Epochs: 5 
* Batch size: 320 
* Cutoff length: 2048
* Learning rate: 2e-5
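
For reference, these values could map onto a Hugging Face `TrainingArguments` object roughly as sketched below. This is an illustrative assumption, not the actual training script (which is not published here); in particular, the split of the effective batch size of 320 into a per-device batch size and gradient accumulation steps is hypothetical.

```python
from transformers import TrainingArguments

# Hypothetical sketch only: output_dir and the 8 x 40 batch split are placeholders.
training_args = TrainingArguments(
    output_dir="medllama_13b_finetune",  # placeholder path
    num_train_epochs=5,                  # Epochs: 5
    per_device_train_batch_size=8,       # 8 per device x 40 accumulation steps
    gradient_accumulation_steps=40,      # = effective batch size of 320
    learning_rate=2e-5,                  # Learning rate: 2e-5
)

# The cutoff length of 2048 would be applied at tokenization time, e.g.:
# tokenizer(text, truncation=True, max_length=2048)
```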

The model can be loaded as follows:

```python
import torch
import transformers

# Load the tokenizer and model weights from the Hub
tokenizer = transformers.LlamaTokenizer.from_pretrained('chaoyi-wu/MedLLaMA_13B')
model = transformers.LlamaForCausalLM.from_pretrained('chaoyi-wu/MedLLaMA_13B')

# Tokenize a prompt; add_special_tokens=False omits the leading BOS token
sentence = 'Hello, doctor'
batch = tokenizer(
    sentence,
    return_tensors="pt",
    add_special_tokens=False,
)

# Generate up to 200 tokens with top-k sampling
with torch.no_grad():
    generated = model.generate(batch["input_ids"], max_length=200, do_sample=True, top_k=50)
    print('model predict:', tokenizer.decode(generated[0]))
```