oaishi committed on
Commit 230e0ac · unverified · 1 Parent(s): cc11c6b

Fix Lora config error for Llama3 (#1659)


The current yml config throws an error: ValueError: Please set lora_modules_to_save to [`embed_tokens`, `lm_head`] when using an adapter and changing the special tokens.

I added the required changes to resolve it.

Files changed (1)
  1. examples/llama-3/lora-8b.yml +3 -0
examples/llama-3/lora-8b.yml CHANGED
@@ -24,6 +24,9 @@ lora_alpha: 16
 lora_dropout: 0.05
 lora_target_linear: true
 lora_fan_in_fan_out:
+lora_modules_to_save:
+ - embed_tokens
+ - lm_head
 
 wandb_project:
 wandb_entity:
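
For context, a minimal sketch of the affected LoRA block in `examples/llama-3/lora-8b.yml` after this change (the three `lora_modules_to_save` lines are the addition; the surrounding keys are taken from the diff's context):

```yaml
lora_dropout: 0.05
lora_target_linear: true
lora_fan_in_fan_out:
# Train the token embeddings and output head fully rather than via LoRA,
# which the error message requires when the special tokens change:
lora_modules_to_save:
  - embed_tokens
  - lm_head
```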