---
library_name: transformers
license: apache-2.0
base_model: distilbert/distilroberta-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: bank-transaction-classification-distilroberta-base-v1
    results: []
---

# bank-transaction-classification-distilroberta-base-v1

This model is a fine-tuned version of [distilbert/distilroberta-base](https://huggingface.co/distilbert/distilroberta-base) on a bank-transaction classification dataset that is not named in this card. It achieves the following results on the evaluation set:

- Loss: 0.5173
- Accuracy: 0.8850
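Assuming the checkpoint is hosted on the Hub under the repo id below (an assumption based on this card's name; adjust to the actual location), it can be loaded with the standard `transformers` text-classification pipeline:

```python
def classify_transactions(
    texts,
    model_id="chuuhtetnaing/bank-transaction-classification-distilroberta-base-v1",
):
    """Classify raw bank-transaction descriptions with the fine-tuned checkpoint.

    `model_id` is an assumed repo id -- point it at wherever the model
    is actually hosted. Returns a list of {"label": ..., "score": ...} dicts.
    """
    from transformers import pipeline  # lazy import: only needed at call time

    clf = pipeline("text-classification", model=model_id)
    return clf(texts)
```

Calling `classify_transactions(["AMAZON MKTPLACE PMTS ..."])` downloads the model on first use and returns one label/score dict per input string.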

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 45
- eval_batch_size: 45
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
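The linear scheduler decays the learning rate from 2e-05 to zero over the full run. A minimal sketch of that schedule, assuming no warmup (none is listed above) and taking the per-epoch step count from the training log:

```python
BASE_LR = 2e-5
STEPS_PER_EPOCH = 4596        # optimizer steps logged per epoch in the results
NUM_EPOCHS = 25
TOTAL_STEPS = STEPS_PER_EPOCH * NUM_EPOCHS  # 114,900, matching the final logged step

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer updates, decaying linearly to 0."""
    return BASE_LR * max(0.0, 1.0 - step / TOTAL_STEPS)
```

With train_batch_size=45, these step counts also imply a training set of roughly 4596 × 45 ≈ 206,820 examples.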

### Training results

| Training Loss | Epoch | Step   | Accuracy | Validation Loss |
|:-------------:|:-----:|:------:|:--------:|:---------------:|
| 0.6326        | 1.0   | 4596   | 0.8358   | 0.5800          |
| 0.4676        | 2.0   | 9192   | 0.8571   | 0.4992          |
| 0.3977        | 3.0   | 13788  | 0.8599   | 0.4680          |
| 0.3654        | 4.0   | 18384  | 0.8653   | 0.4636          |
| 0.3437        | 5.0   | 22980  | 0.8683   | 0.4650          |
| 0.3183        | 6.0   | 27576  | 0.8724   | 0.4453          |
| 0.296         | 7.0   | 32172  | 0.8706   | 0.4521          |
| 0.2379        | 8.0   | 36768  | 0.8760   | 0.4299          |
| 0.2772        | 9.0   | 41364  | 0.8837   | 0.4520          |
| 0.2488        | 10.0  | 45960  | 0.8776   | 0.4627          |
| 0.2589        | 11.0  | 50556  | 0.8786   | 0.4711          |
| 0.2317        | 12.0  | 55152  | 0.8801   | 0.4677          |
| 0.2179        | 13.0  | 59748  | 0.8828   | 0.4698          |
| 0.2216        | 14.0  | 64344  | 0.8798   | 0.4736          |
| 0.1983        | 15.0  | 68940  | 0.8802   | 0.4788          |
| 0.2068        | 16.0  | 73536  | 0.8817   | 0.4959          |
| 0.2034        | 17.0  | 78132  | 0.8824   | 0.4849          |
| 0.2116        | 18.0  | 82728  | 0.8837   | 0.4957          |
| 0.2152        | 19.0  | 87324  | 0.8833   | 0.4883          |
| 0.2021        | 20.0  | 91920  | 0.8855   | 0.5119          |
| 0.1962        | 21.0  | 96516  | 0.8855   | 0.5030          |
| 0.1836        | 22.0  | 101112 | 0.8855   | 0.5089          |
| 0.1944        | 23.0  | 105708 | 0.8851   | 0.5114          |
| 0.1785        | 24.0  | 110304 | 0.8852   | 0.5157          |
| 0.1802        | 25.0  | 114900 | 0.8850   | 0.5173          |
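The log suggests overfitting past epoch 8: validation loss bottoms out at 0.4299 there and climbs afterward, while accuracy only inches up. A small sketch for picking the best checkpoint from such a log, using (epoch, validation loss, accuracy) triples transcribed from the table:

```python
# (epoch, validation_loss, accuracy) triples from the training results above.
LOG = [
    (1, 0.5800, 0.8358), (2, 0.4992, 0.8571), (3, 0.4680, 0.8599),
    (4, 0.4636, 0.8653), (5, 0.4650, 0.8683), (6, 0.4453, 0.8724),
    (7, 0.4521, 0.8706), (8, 0.4299, 0.8760), (9, 0.4520, 0.8837),
    (10, 0.4627, 0.8776), (11, 0.4711, 0.8786), (12, 0.4677, 0.8801),
    (13, 0.4698, 0.8828), (14, 0.4736, 0.8798), (15, 0.4788, 0.8802),
    (16, 0.4959, 0.8817), (17, 0.4849, 0.8824), (18, 0.4957, 0.8837),
    (19, 0.4883, 0.8833), (20, 0.5119, 0.8855), (21, 0.5030, 0.8855),
    (22, 0.5089, 0.8855), (23, 0.5114, 0.8851), (24, 0.5157, 0.8852),
    (25, 0.5173, 0.8850),
]

def best_epoch(log, by="loss"):
    """Row with the lowest validation loss, or highest accuracy if by='accuracy'."""
    if by == "loss":
        return min(log, key=lambda row: row[1])
    return max(log, key=lambda row: row[2])
```

`best_epoch(LOG)` selects epoch 8, which is the checkpoint `load_best_model_at_end` would keep if the run were configured to track validation loss (the final checkpoint was used here, per the reported 0.5173 loss).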

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0
- Datasets 3.0.0
- Tokenizers 0.19.1