biobart-finetuned

This model is a fine-tuned version of GanjinZero/biobart-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5903
  • Rouge1: 19.8110
  • Rouge2: 4.6715
  • Rougel: 15.0191
  • Rougelsum: 15.1237
  • Gen Len: 21.0
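For intuition, Rouge1 and Rouge2 above are unigram and bigram overlap F-measures, while Rougel and Rougelsum are based on longest common subsequences. A minimal pure-Python sketch of the n-gram variant follows; it is illustrative only, and the scores reported here come from the standard ROUGE implementation, which adds stemming and tokenization details omitted below.

```python
from collections import Counter


def rouge_n_f1(prediction: str, reference: str, n: int = 1) -> float:
    """Simplified ROUGE-N: F1 over n-gram overlap between prediction and reference."""

    def ngrams(text: str) -> Counter:
        tokens = text.lower().split()
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    pred, ref = ngrams(prediction), ngrams(reference)
    overlap = sum((pred & ref).values())  # clipped n-gram match count
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


# Example: 2 of 3 unigrams match, so precision = recall = F1 = 2/3.
score = rouge_n_f1("the cat sat", "the cat ran")
```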

Model description

More information needed

Intended uses & limitations

More information needed
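In the absence of documented usage, the following is a hypothetical inference sketch. It assumes this repository hosts a PEFT adapter (consistent with the PEFT 0.14.0 framework version below) published as pendar02/biobart-finetuned on top of GanjinZero/biobart-base; the generation length is likewise an assumption based on the Gen Len of ~21 reported above.

```python
def summarize(text: str, max_new_tokens: int = 20) -> str:
    """Generate a summary by applying the fine-tuned adapter to the base model.

    Heavyweight imports are kept inside the function so the sketch can be
    defined without the libraries installed.
    """
    from peft import PeftModel
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    base = AutoModelForSeq2SeqLM.from_pretrained("GanjinZero/biobart-base")
    model = PeftModel.from_pretrained(base, "pendar02/biobart-finetuned")
    tokenizer = AutoTokenizer.from_pretrained("GanjinZero/biobart-base")

    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```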

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 3
  • mixed_precision_training: Native AMP
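With the linear scheduler and no warmup, the learning rate decays from 0.001 at step 0 to 0 at the final step. A small sketch of that schedule follows; the total of roughly 1407 optimizer steps is inferred from the results table (step 1400 falls at epoch 2.9851), so treat it as an assumption.

```python
def linear_lr(step: int, total_steps: int = 1407, base_lr: float = 1e-3) -> float:
    """Linearly decayed learning rate with no warmup: base_lr at step 0, 0 at the end."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps


# At step 0 the rate equals base_lr; by total_steps it has decayed to 0.
start_lr = linear_lr(0)       # 0.001
end_lr = linear_lr(1407)      # 0.0
```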

Training results

Training Loss   Epoch    Step   Validation Loss   Rouge1    Rouge2   Rougel    Rougelsum   Gen Len
1.6560          0.2132    100   1.6722            20.9762   5.1450   15.8207   15.9142     21.0
1.9302          0.4264    200   1.6486            20.0025   4.8184   15.1261   15.2048     21.0
1.6597          0.6397    300   1.6328            19.7071   4.7205   15.0092   15.0983     21.0
1.6270          0.8529    400   1.6208            20.3180   4.9983   15.2925   15.3890     21.0
1.6247          1.0661    500   1.6173            20.2940   4.7440   15.3256   15.4381     20.9957
1.6847          1.2793    600   1.6104            18.7252   4.4523   14.2404   14.3214     21.0
1.8386          1.4925    700   1.6037            20.0123   4.7664   15.1375   15.2353     21.0
1.7413          1.7058    800   1.6012            20.2530   4.9525   15.3133   15.4338     21.0
1.7164          1.9190    900   1.5980            20.2849   4.7909   15.4225   15.5315     21.0
1.7448          2.1322   1000   1.5976            18.9970   4.5040   14.5958   14.6876     20.9957
1.3899          2.3454   1100   1.5951            19.5370   4.7154   14.8449   14.9161     20.9957
1.4707          2.5586   1200   1.5920            19.9151   4.5826   15.0214   15.1526     21.0
1.5940          2.7719   1300   1.5912            19.6837   4.6517   14.8929   15.0113     21.0
1.6435          2.9851   1400   1.5903            19.8110   4.6715   15.0191   15.1237     21.0

Framework versions

  • PEFT 0.14.0
  • Transformers 4.47.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0