---
license: apache-2.0
pipeline_tag: text-generation
language:
- fr
- en
- it
- de
- es
tags:
- pretrained
- llama-3
- openllm-france
datasets:
- OpenLLM-France/Lucie-Training-Dataset
widget:
- text: |-
    Quelle est la capitale de l'Espagne ? Madrid.
    Quelle est la capitale de la France ?
  example_title: Capital cities in French
  group: 1-shot Question Answering
---
# Model Card
This repository contains checkpoints (split for 512 GPUs) in DeepSpeed format for the Lucie-7B model,
which was trained with a codebase based on a fork of Megatron-DeepSpeed.
Each checkpoint is stored in a separate branch (revision) whose name specifies the number of training steps.
For instance, step0400000
corresponds to the checkpoint after 400,000 training steps.
These checkpoints are provided so that training can be resumed from a given point.
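
As a minimal sketch of how one such checkpoint could be fetched, the snippet below downloads a single revision with `huggingface_hub`. The `repo_id` is a placeholder (not confirmed by this card) and should be replaced with this repository's actual Hugging Face identifier; the revision name follows the naming scheme described above.

```python
# Sketch: download one training checkpoint (revision) from this repository.
# The repo_id below is an assumption / placeholder; replace it with the actual
# repository name. "step0400000" is the branch for the checkpoint after
# 400,000 training steps.
from huggingface_hub import snapshot_download

checkpoint_dir = snapshot_download(
    repo_id="OpenLLM-France/Lucie-7B-DeepSpeed-checkpoints",  # placeholder repo_id
    revision="step0400000",              # branch/revision encoding the training step
    local_dir="Lucie-7B-step0400000",    # where to place the checkpoint files
)
print(f"Checkpoint files downloaded to: {checkpoint_dir}")
```

The downloaded directory can then be pointed to by the training code's checkpoint-loading option (for example, Megatron-DeepSpeed's `--load` argument) to resume training from that step.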