Pico Small Configuration
A collection of 2 items.
An adapter for mlsquare/pico_seshu_test, trained with LoRA on "model.layers.3.dt_proj". This is a standard application of PEFT to the Mamba-hf model.
Refer to the GitHub repository for more information: https://github.com/mlsquare/fedem
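As a rough illustration of how such an adapter can be attached with PEFT (the rank, alpha, and loading options below are assumptions, not values stated in the card):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load the base Mamba-hf model; trust_remote_code is assumed to be needed
# because the architecture ships as custom code.
base_model = AutoModelForCausalLM.from_pretrained(
    "mlsquare/pico_seshu_test", trust_remote_code=True
)

# Attach a LoRA adapter only to the projection named in the card.
# r and lora_alpha are illustrative; the card does not state them.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["model.layers.3.dt_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the dt_proj LoRA weights remain trainable
```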
Individual source and target sentences are drawn from the AI4Bharat Samanantar dataset. Sentences from all 11 languages and their translations were stacked and used for the next-character generation task.
The model was trained on next-character generation using a cross-entropy loss.
The text was converted to raw UTF-8 characters before training using the ByT5-large tokenizer.
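A minimal sketch of this byte-level preprocessing, assuming the standard google/byt5-large tokenizer from transformers; the sample sentence is illustrative, the real text comes from Samanantar:

```python
from transformers import AutoTokenizer

# ByT5 tokenizes text directly into UTF-8 bytes (plus a small offset for
# special tokens), so every language in Samanantar shares one vocabulary.
tokenizer = AutoTokenizer.from_pretrained("google/byt5-large")

sample = "ఇది ఒక ఉదాహరణ వాక్యం"  # illustrative sentence only
ids = tokenizer(sample)["input_ids"]
print(len(sample.encode("utf-8")), len(ids))  # byte count vs. token count (ids include </s>)
```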
A simple cross-entropy loss was used to verify the training pipeline and the basic working of the model.
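A sketch of one next-character (next-byte) training step with cross-entropy, assuming the `model` and `tokenizer` from the snippets above and that the model returns logits in the usual Hugging Face fashion; the learning rate and batch are illustrative:

```python
import torch
import torch.nn.functional as F

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)  # lr is an assumption

batch = tokenizer(["an illustrative training sentence"], return_tensors="pt")
input_ids = batch["input_ids"]

logits = model(input_ids=input_ids).logits  # (batch, seq_len, vocab)

# Shift so that position t predicts the byte at position t + 1.
loss = F.cross_entropy(
    logits[:, :-1, :].reshape(-1, logits.size(-1)),
    input_ids[:, 1:].reshape(-1),
)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```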
MLsquare