---
inference: false
license: openrail
language:
- it
datasets:
- teelinsan/camoscio
---

# ExtremITA Camoscio 7 billion parameters adapters: ExtremITLLaMA
This is ExtremITLLaMA, the adapters for the instruction-tuned Italian LLaMA model that participated in all the tasks of [EVALITA 2023](https://www.evalita.it/campaigns/evalita-2023/), winning 41% of the tasks and placing in the top three in 64% of them.
It requires the base model from [sag-uniroma2/extremITA-Camoscio-7b](https://huggingface.co/sag-uniroma2/extremITA-Camoscio-7b).

# Usage
Check out the GitHub repository for more details and code: https://github.com/crux82/ExtremITA

```python
from peft import PeftModel
from transformers import LlamaTokenizer, LlamaForCausalLM
import torch

# Load the tokenizer and the base Camoscio model in 8-bit
tokenizer = LlamaTokenizer.from_pretrained("yahma/llama-7b-hf")
model = LlamaForCausalLM.from_pretrained(
    "sag-uniroma2/extremITA-Camoscio-7b",
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Apply the ExtremITLLaMA LoRA adapters on top of the base model
model = PeftModel.from_pretrained(
    model,
    "sag-uniroma2/extremITA-Camoscio-7b-adapters",
    torch_dtype=torch.float16,
    device_map="auto",
)
```
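Once the adapters are loaded, the model can be queried like any causal language model. The following is a minimal generation sketch that continues from the snippet above; the Italian instruction template and the generation parameters are assumptions (Camoscio-style models typically use an Alpaca-like prompt), so refer to the GitHub repository for the exact format used at training time.

```python
from transformers import GenerationConfig

# Alpaca-style Italian prompt template: an assumption, not necessarily the exact
# template used for training; see https://github.com/crux82/ExtremITA for details.
prompt = (
    "Di seguito è riportata un'istruzione che descrive un task. "
    "Scrivi una risposta che completi la richiesta.\n\n"
    "### Istruzione:\nRiassumi la trama de 'I Promessi Sposi' in una frase.\n\n"
    "### Risposta:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

model.eval()
with torch.no_grad():
    output = model.generate(
        **inputs,
        generation_config=GenerationConfig(
            max_new_tokens=256,
            temperature=0.2,
            do_sample=True,
        ),
    )

# Decode only the tokens generated after the prompt
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```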