---
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
library_name: peft
license: llama3.1
tags:
- unsloth
- generated_from_trainer
model-index:
- name: l3.1-8b-ins-magiccoder
  results: []
---


# l3.1-8b-ins-magiccoder

This model is a PEFT adapter fine-tuned from [meta-llama/Meta-Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct); the training dataset is not recorded in this card.
It achieves the following results on the evaluation set:
- Loss: 1.2331
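
A hedged usage sketch follows, assuming the `transformers` and `peft` versions listed under framework versions below. The adapter repo id `your-username/l3.1-8b-ins-magiccoder` is a placeholder; substitute the actual location of this adapter.

```python
# Hedged usage sketch, not an official snippet: load the Llama 3.1 base model
# and attach this PEFT adapter. "your-username/l3.1-8b-ins-magiccoder" is a
# placeholder repo id; replace it with where this adapter is actually hosted.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"
adapter_id = "your-username/l3.1-8b-ins-magiccoder"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(model, adapter_id)

# Llama 3.1 Instruct expects its chat template for prompting.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```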

## Model description

This repository contains a PEFT adapter for `meta-llama/Meta-Llama-3.1-8B-Instruct`, trained with Unsloth (per the model tags). No further description has been provided.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reconstructing them appears after the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 0.56
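
The list above maps onto `transformers.TrainingArguments` roughly as follows. This is a reconstruction from the listed values, not the original Unsloth/Trainer script, which is not recorded in this card.

```python
# Hedged reconstruction of the hyperparameters listed above as a
# transformers TrainingArguments object; treat it as an equivalent
# sketch, not the exact configuration that was run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="l3.1-8b-ins-magiccoder",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # 8 * 8 = total train batch size 64
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=0.56,
    adam_beta1=0.9,    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```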

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.4834        | 0.0130 | 2    | 1.3970          |
| 1.2584        | 0.0259 | 4    | 1.3753          |
| 1.2988        | 0.0389 | 6    | 1.3373          |
| 1.3458        | 0.0518 | 8    | 1.3058          |
| 1.2461        | 0.0648 | 10   | 1.2893          |
| 1.2630        | 0.0777 | 12   | 1.2828          |
| 1.2758        | 0.0907 | 14   | 1.2782          |
| 1.2802        | 0.1036 | 16   | 1.2702          |
| 1.1370        | 0.1166 | 18   | 1.2617          |
| 1.336         | 0.1296 | 20   | 1.2531          |
| 1.1811        | 0.1425 | 22   | 1.2466          |
| 1.1447        | 0.1555 | 24   | 1.2441          |
| 1.177         | 0.1684 | 26   | 1.2426          |
| 1.2585        | 0.1814 | 28   | 1.2404          |
| 1.1993        | 0.1943 | 30   | 1.2381          |
| 1.1566        | 0.2073 | 32   | 1.2370          |
| 1.2826        | 0.2202 | 34   | 1.2364          |
| 1.1512        | 0.2332 | 36   | 1.2356          |
| 1.1779        | 0.2462 | 38   | 1.2352          |
| 1.261         | 0.2591 | 40   | 1.2346          |
| 1.1998        | 0.2721 | 42   | 1.2341          |
| 1.1847        | 0.2850 | 44   | 1.2335          |
| 1.1266        | 0.2980 | 46   | 1.2336          |
| 1.1699        | 0.3109 | 48   | 1.2336          |
| 1.283         | 0.3239 | 50   | 1.2332          |
| 1.2469        | 0.3368 | 52   | 1.2331          |
| 1.1653        | 0.3498 | 54   | 1.2330          |
| 1.2752        | 0.3628 | 56   | 1.2332          |
| 1.2077        | 0.3757 | 58   | 1.2331          |
| 1.1729        | 0.3887 | 60   | 1.2330          |
| 1.2643        | 0.4016 | 62   | 1.2331          |
| 1.3324        | 0.4146 | 64   | 1.2331          |
| 1.2215        | 0.4275 | 66   | 1.2332          |
| 1.2623        | 0.4405 | 68   | 1.2332          |
| 1.2845        | 0.4534 | 70   | 1.2331          |
| 1.1966        | 0.4664 | 72   | 1.2331          |
| 1.2389        | 0.4794 | 74   | 1.2331          |
| 1.1957        | 0.4923 | 76   | 1.2331          |
| 1.2684        | 0.5053 | 78   | 1.2331          |
| 1.3217        | 0.5182 | 80   | 1.2331          |
| 1.3126        | 0.5312 | 82   | 1.2331          |
| 1.2146        | 0.5441 | 84   | 1.2330          |
| 1.2160        | 0.5571 | 86   | 1.2331          |


### Framework versions

- PEFT 0.12.0
- Transformers 4.44.2
- Pytorch 2.3.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1