license: apache-2.0
language:
- en
datasets:
- kobprof/skolegpt-instruct
- Mabeck/Danish-SlimOrca
---
# Uploaded model
- **Compute sponsored by:** Nvidia and Arrow ECS Denmark, through the Danish Data Science Community
- **Developed by:** ThatsGroes
- **License:** apache-2.0
- **Finetuned from model:** meta-llama/Llama-3.1-8B-Instruct

This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
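The adapter can be loaded with Unsloth for inference. Below is a minimal sketch assuming the standard `FastLanguageModel` pattern; the repository id is a placeholder, since the adapter's repo name is not stated in this section:

```python
from unsloth import FastLanguageModel

# Placeholder repo id: substitute the id of this adapter repository.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="ThatsGroes/<this-adapter-repo>",
    max_seq_length=2048,
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # enable Unsloth's fast inference path

messages = [{"role": "user", "content": "Hvad er hovedstaden i Danmark?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```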
The LoRA adapter was fine-tuned in fp16 for 1 epoch on kobprof/skolegpt-instruct and Mabeck/Danish-SlimOrca, with rank = alpha = 64.
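A sketch of what that setup typically looks like with Unsloth and TRL. Only the rank, alpha, epoch count, precision, and dataset names come from this card; the target modules, sequence length, batch size, and text-column preprocessing are assumptions, and `SFTTrainer` keyword arguments vary across TRL versions:

```python
import torch
from datasets import load_dataset, concatenate_datasets
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load the base model in fp16 (the card states an fp16 LoRA fine-tune).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="meta-llama/Llama-3.1-8B-Instruct",
    max_seq_length=2048,
    dtype=torch.float16,
)

# rank = alpha = 64, as stated above; the target modules are an assumption.
model = FastLanguageModel.get_peft_model(
    model,
    r=64,
    lora_alpha=64,
    lora_dropout=0.0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# The two datasets named in the front matter. In practice each would first
# be mapped to a shared "text" column; that preprocessing is omitted here.
train_dataset = concatenate_datasets([
    load_dataset("kobprof/skolegpt-instruct", split="train"),
    load_dataset("Mabeck/Danish-SlimOrca", split="train"),
])

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=train_dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        num_train_epochs=1,  # 1 epoch, per the card
        fp16=True,
        per_device_train_batch_size=2,
        output_dir="outputs",
    ),
)
trainer.train()
```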
Energy use during training, as logged by [codecarbon](https://github.com/mlco2/codecarbon):

- RAM: 2.822621 kWh (RAM power: 188.79 W)
- All GPUs: 4.379013 kWh (total GPU power: 260.77 W)
- All CPUs: 0.635721 kWh (total CPU power: 42.5 W)
- Total: 7.837356 kWh of electricity used since the beginning of training
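The total is simply the sum of the three components (2.822621 + 4.379013 + 0.635721 ≈ 7.837 kWh). Such figures are typically collected by wrapping the training run in codecarbon's `EmissionsTracker`; the exact integration used here is an assumption:

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()  # periodically logs RAM, GPU, and CPU energy
tracker.start()
try:
    trainer.train()  # `trainer` as in the fine-tuning sketch above
finally:
    emissions = tracker.stop()  # estimated emissions in kg CO2-eq

print(f"Estimated emissions: {emissions:.4f} kg CO2-eq")
```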
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)