---
library_name: transformers
tags:
  - generated_from_trainer
datasets:
  - gokulsrinivasagan/processed_book_corpus-ld-50
metrics:
  - accuracy
model-index:
  - name: bert_base_lda_50_v1_book
    results:
      - task:
          name: Masked Language Modeling
          type: fill-mask
        dataset:
          name: gokulsrinivasagan/processed_book_corpus-ld-50
          type: gokulsrinivasagan/processed_book_corpus-ld-50
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7552313864477905
---

# bert_base_lda_50_v1_book

This model is a fine-tuned version of an unspecified base model on the gokulsrinivasagan/processed_book_corpus-ld-50 dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):

- Loss: 3.9956
- Accuracy: 0.7552
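
As a quick sanity check, the model can be exercised with the `transformers` fill-mask pipeline. The sketch below is an assumption-laden minimal example: the repo id is the card's model name under the author's namespace, and `[MASK]` assumes a BERT-style tokenizer. For reference, a validation loss of 3.9956 corresponds to a perplexity of exp(3.9956) ≈ 54.4.

```python
from transformers import pipeline

# Hypothetical repo id: the card's model name under the author's namespace.
fill_mask = pipeline("fill-mask", model="gokulsrinivasagan/bert_base_lda_50_v1_book")

# "[MASK]" assumes a BERT-style tokenizer; check fill_mask.tokenizer.mask_token if unsure.
for prediction in fill_mask("The book was placed on the [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 4))
```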

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
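
The dataset identifier does appear in the metadata above, so a hedged loading sketch with `datasets.load_dataset` is possible; the available splits and columns are not documented here and should be inspected rather than assumed.

```python
from datasets import load_dataset

# Dataset id taken from the card metadata; splits/columns are undocumented,
# so print the DatasetDict to see what is actually available.
dataset = load_dataset("gokulsrinivasagan/processed_book_corpus-ld-50")
print(dataset)
```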

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 96
- eval_batch_size: 96
- seed: 10
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10000
- num_epochs: 25
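
A minimal `transformers.TrainingArguments` sketch matching the values above; `output_dir` and everything not listed (logging, checkpointing, mixed precision, and so on) are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_base_lda_50_v1_book",  # hypothetical output directory
    learning_rate=1e-4,
    per_device_train_batch_size=96,
    per_device_eval_batch_size=96,
    seed=10,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10000,
    num_train_epochs=25,
)
```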

### Training results

| Training Loss | Epoch   | Step   | Validation Loss | Accuracy |
|:-------------:|:-------:|:------:|:---------------:|:--------:|
| 8.955         | 0.4215  | 10000  | 8.7407          | 0.1644   |
| 8.7137        | 0.8431  | 20000  | 8.5491          | 0.1653   |
| 8.6265        | 1.2646  | 30000  | 8.4783          | 0.1663   |
| 6.7021        | 1.6861  | 40000  | 6.0988          | 0.4554   |
| 5.3349        | 2.1077  | 50000  | 5.0105          | 0.6051   |
| 5.0794        | 2.5292  | 60000  | 4.7754          | 0.6390   |
| 4.9395        | 2.9507  | 70000  | 4.6509          | 0.6570   |
| 4.849         | 3.3723  | 80000  | 4.5686          | 0.6692   |
| 4.7838        | 3.7938  | 90000  | 4.5056          | 0.6787   |
| 4.7252        | 4.2153  | 100000 | 4.4582          | 0.6854   |
| 4.6807        | 4.6369  | 110000 | 4.4130          | 0.6913   |
| 4.6389        | 5.0584  | 120000 | 4.3734          | 0.6966   |
| 4.6076        | 5.4799  | 130000 | 4.3443          | 0.7008   |
| 4.5871        | 5.9014  | 140000 | 4.3176          | 0.7046   |
| 4.5675        | 6.3230  | 150000 | 4.3085          | 0.7070   |
| 4.5509        | 6.7445  | 160000 | 4.2861          | 0.7097   |
| 4.5201        | 7.1660  | 170000 | 4.2644          | 0.7131   |
| 4.5046        | 7.5876  | 180000 | 4.2521          | 0.7151   |
| 4.4859        | 8.0091  | 190000 | 4.2339          | 0.7175   |
| 4.4759        | 8.4306  | 200000 | 4.2231          | 0.7194   |
| 4.4563        | 8.8522  | 210000 | 4.2089          | 0.7215   |
| 4.4461        | 9.2737  | 220000 | 4.1986          | 0.7233   |
| 4.4263        | 9.6952  | 230000 | 4.1845          | 0.7251   |
| 4.4123        | 10.1168 | 240000 | 4.1760          | 0.7270   |
| 4.4131        | 10.5383 | 250000 | 4.1642          | 0.7284   |
| 4.3987        | 10.9598 | 260000 | 4.1552          | 0.7298   |
| 4.3866        | 11.3814 | 270000 | 4.1501          | 0.7306   |
| 4.3844        | 11.8029 | 280000 | 4.1384          | 0.7325   |
| 4.3661        | 12.2244 | 290000 | 4.1314          | 0.7336   |
| 4.3614        | 12.6460 | 300000 | 4.1207          | 0.7351   |
| 4.3483        | 13.0675 | 310000 | 4.1126          | 0.7365   |
| 4.3453        | 13.4890 | 320000 | 4.1087          | 0.7371   |
| 4.3391        | 13.9106 | 330000 | 4.1031          | 0.7379   |
| 4.3372        | 14.3321 | 340000 | 4.0942          | 0.7396   |
| 4.3255        | 14.7536 | 350000 | 4.0891          | 0.7403   |
| 4.3166        | 15.1751 | 360000 | 4.0857          | 0.7410   |
| 4.315         | 15.5967 | 370000 | 4.0752          | 0.7421   |
| 4.3041        | 16.0182 | 380000 | 4.0704          | 0.7431   |
| 4.2986        | 16.4397 | 390000 | 4.0649          | 0.7440   |
| 4.293         | 16.8613 | 400000 | 4.0620          | 0.7446   |
| 4.2881        | 17.2828 | 410000 | 4.0532          | 0.7457   |
| 4.282         | 17.7043 | 420000 | 4.0508          | 0.7465   |
| 4.2738        | 18.1259 | 430000 | 4.0469          | 0.7471   |
| 4.2676        | 18.5474 | 440000 | 4.0429          | 0.7476   |
| 4.2666        | 18.9689 | 450000 | 4.0364          | 0.7485   |
| 4.2598        | 19.3905 | 460000 | 4.0315          | 0.7493   |
| 4.258         | 19.8120 | 470000 | 4.0286          | 0.7499   |
| 4.2503        | 20.2335 | 480000 | 4.0250          | 0.7505   |
| 4.2446        | 20.6551 | 490000 | 4.0202          | 0.7513   |
| 4.2401        | 21.0766 | 500000 | 4.0157          | 0.7517   |
| 4.2359        | 21.4981 | 510000 | 4.0125          | 0.7524   |
| 4.2301        | 21.9197 | 520000 | 4.0108          | 0.7527   |
| 4.2318        | 22.3412 | 530000 | 4.0075          | 0.7532   |
| 4.2247        | 22.7627 | 540000 | 4.0060          | 0.7536   |
| 4.2241        | 23.1843 | 550000 | 4.0021          | 0.7540   |
| 4.2201        | 23.6058 | 560000 | 3.9990          | 0.7546   |
| 4.2151        | 24.0273 | 570000 | 3.9981          | 0.7549   |
| 4.2144        | 24.4488 | 580000 | 3.9943          | 0.7553   |
| 4.2123        | 24.8704 | 590000 | 3.9949          | 0.7552   |

### Framework versions

- Transformers 4.46.3
- Pytorch 2.2.1+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
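
To compare a local environment against these versions, a small `importlib.metadata` sketch can help; note the PyTorch version string embeds a CUDA build suffix (`+cu118`) that a local install may not match exactly.

```python
from importlib.metadata import PackageNotFoundError, version

# Versions listed on this card; local builds (e.g. the torch CUDA suffix) may differ.
trained_with = {
    "transformers": "4.46.3",
    "torch": "2.2.1+cu118",
    "datasets": "2.17.0",
    "tokenizers": "0.20.3",
}

for pkg, expected in trained_with.items():
    try:
        print(f"{pkg}: installed {version(pkg)}, card lists {expected}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed (card lists {expected})")
```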