Josephgflowers committed: Update README.md
---
license: mit
---

This is a converted tinyllama model using the following script:

https://huggingface.co/Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama/blob/main/LM-Diff.py
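The core idea of such a conversion script is to walk the original model and replace each stock decoder layer with a modified wrapper that adds the extra modules. The sketch below illustrates only that layer-swap pattern with placeholder classes; the class names, computations, and `convert` helper are illustrative stand-ins, not the actual contents of LM-Diff.py.

```python
class DecoderLayer:
    """Stand-in for a stock transformer decoder layer."""
    def forward(self, x):
        return x + 1  # placeholder for attention + MLP


class ModifiedDecoderLayer:
    """Wraps a stock layer and runs an extra module after it,
    mirroring how a modified decoder layer augments the original."""
    def __init__(self, base):
        self.base = base

    def forward(self, x):
        h = self.base.forward(x)  # original layer's computation
        return h * 2              # placeholder for the added module


class TinyModel:
    """Stand-in for the pretrained model: a stack of decoder layers."""
    def __init__(self, n_layers=2):
        self.layers = [DecoderLayer() for _ in range(n_layers)]

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x


def convert(model):
    """Replace every stock layer with a modified wrapper, in place."""
    model.layers = [ModifiedDecoderLayer(layer) for layer in model.layers]
    return model


model = convert(TinyModel())
print(model.forward(0))  # prints 6: two layers of (x + 1) * 2
```

In a real conversion the wrapper would copy or reuse the pretrained weights of the layer it replaces, so the converted model starts from the original checkpoint rather than from scratch.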
1. Overall Flow in the Model

Each of these modules is integrated into the model's modified decoder layer (`ModifiedLlamaDecoderLayer`). Here's a high-level outline of the sequence in which they operate within the decoder: