---
license: mit
language:
- en
base_model:
- meta-llama/Llama-3.2-1B
---
## Model Details

### Model Description
A finetune of Meta's Llama 3.2 1B model that uses FlashNorm (https://arxiv.org/abs/2407.09577), an exact but faster formulation of RMSNorm followed by linear layers; the core idea is sketched below.
- Developed by: OpenMachine Labs
- License: MIT
- Finetuned from model: meta-llama/Llama-3.2-1B
### Model Sources
- Repository: https://github.com/meta-llama/llama-models/tree/main/models/llama3_2
- Blog post: https://ai.meta.com/blog/llama-3-2-connect-2024-vision-edge-mobile-devices/
## Uses

### Direct Use
[More Information Needed]
### Downstream Use
[More Information Needed]
### Out-of-Scope Use
[More Information Needed]
## Bias, Risks, and Limitations
[More Information Needed]
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Speeds, Sizes, Times
[More Information Needed]
## Evaluation

### Testing Data, Factors & Metrics

#### Testing Data
[More Information Needed]
#### Factors
[More Information Needed]
#### Metrics
[More Information Needed]
### Results
[More Information Needed]
#### Summary

## Model Examination

## Model Card Authors
Nils Graef ([email protected]), Drew Wasielewski ([email protected])
## Model Card Contact
[More Information Needed]