Layers 14, 15, and 16 were fine-tuned.
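
A minimal sketch of this partial fine-tuning setup, assuming the base checkpoint is meta-llama/Llama-3.2-1B (not stated in the card) and that the card counts layers 1-based (the transformers Llama layout exposes 0-based indices in model.model.layers):

```python
# Sketch: freeze everything except decoder layers 14-16 (1-based numbering
# assumed; they correspond to indices 13-15 in model.model.layers).
# The base checkpoint name is an assumption, not stated in the card.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.2-1B")

TRAINABLE_INDICES = {13, 14, 15}  # layers 14, 15, 16 in 1-based numbering

for param in model.parameters():
    param.requires_grad = False  # freeze the whole model first

for idx, layer in enumerate(model.model.layers):
    if idx in TRAINABLE_INDICES:
        for param in layer.parameters():
            param.requires_grad = True  # unfreeze only the target layers

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters: {trainable:,}")
```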

The dataset contained 13,862,816 tokens.
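
For reference, a token count like this can be computed with the model's tokenizer. A sketch with placeholder file and column names (the actual dataset is not listed in this section):

```python
# Sketch of how a dataset token count could be obtained.
# "train.jsonl" and the "text" column are placeholders, not from the card.
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-1B")
dataset = load_dataset("json", data_files="train.jsonl", split="train")

total_tokens = sum(
    len(tokenizer(example["text"])["input_ids"]) for example in dataset
)
print(f"Total tokens: {total_tokens:,}")
```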

GPU used for fine-tuning: Tesla A100.

Training

{'loss': 0.8521, 'grad_norm': 0.5644629001617432, 'learning_rate': 2.9148375768217733e-05, 'epoch': 1.29}
{'loss': 0.6742, 'grad_norm': 0.5370610952377319, 'learning_rate': 7.199297629499562e-06, 'epoch': 2.58}
{'train_runtime': 5708.2175, 'train_samples_per_second': 22.869, 'train_steps_per_second': 0.204, 'train_loss': 0.7442483934749853, 'epoch': 3.0}
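
A Trainer configuration consistent with these logs might look as follows. Only the epoch count (3) is evident from the log; the learning rate (a linear decay from roughly 5e-5 fits both logged values), batch size, and logging interval are assumptions:

```python
# Sketch of a Trainer setup consistent with the logs above; `model` is the
# partially frozen model from the earlier sketch, `train_dataset` a placeholder.
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="Llama-3.2-1B-ru-v2",
    num_train_epochs=3,             # the final log line reports epoch 3.0
    learning_rate=5e-5,             # assumed; linear decay from ~5e-5 fits the logged values
    lr_scheduler_type="linear",     # assumed
    per_device_train_batch_size=8,  # assumed
    logging_steps=500,              # assumed
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()  # emits dicts like {'loss': ..., 'grad_norm': ..., 'learning_rate': ..., 'epoch': ...}
```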
Model size: 1.24B parameters, tensor type F32 (Safetensors).
Inference Examples
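
The model can be run locally with transformers. A minimal sketch, loading the weights in F32 as reported above (the prompt and generation parameters are placeholders):

```python
# Minimal local generation sketch for radce/Llama-3.2-1B-ru-v2.
# The prompt is a placeholder; generation parameters are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "radce/Llama-3.2-1B-ru-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)

inputs = tokenizer("Привет! Расскажи о себе.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```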
