elodiesune/esm_ft_Aerin_Yang_et_al_2023

This model is a fine-tuned version of facebook/esm2_t33_650M_UR50D on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 47.9039
  • RMSE: 21.3955
  • MAE: 15.1264
  • Spearman correlation: 0.8475
  • Spearman correlation p-value: 0.0000
  • Pearson correlation: 0.8954
  • Pearson correlation p-value: 0.0000
  • Spearman correlation of deltas: 0.8705
  • Spearman correlation of deltas p-value: 0.0
  • Pearson correlation of deltas: 0.8949
  • Pearson correlation of deltas p-value: 0.0
  • Ranking F1 score: 0.7953
  • Ranking MCC: 0.6489
  • RMSE (enriched): 6.4308
  • MAE (enriched): 1.4925
  • Spearman correlation (enriched): 0.5377
  • Spearman correlation (enriched) p-value: 0.0000
  • Pearson correlation (enriched): 0.0090
  • Pearson correlation (enriched) p-value: 0.8648
  • Spearman correlation of deltas (enriched): 0.4951
  • Spearman correlation of deltas (enriched) p-value: 0.0
  • Pearson correlation of deltas (enriched): 0.0097
  • Pearson correlation of deltas (enriched) p-value: 0.0135
  • Ranking F1 score (enriched): 0.6614
  • Ranking MCC (enriched): 0.3835
  • Classification threshold: 0.4
  • MCC: 0.8710
  • F1 score: 0.9405
  • Accuracy: 0.9358
  • AUC: 0.9685
  • Precision: 0.9362
  • Recall: 0.9348
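The regression metrics above (RMSE, MAE, Pearson and Spearman correlations) are standard and can be reproduced from predictions and targets. A minimal NumPy sketch for illustration — the card does not include the actual evaluation script, so these function names are not the author's:

```python
import numpy as np

def rmse(y_true, y_pred):
    # Root mean squared error
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    # Mean absolute error
    return float(np.mean(np.abs(y_true - y_pred)))

def pearson(x, y):
    # Pearson correlation coefficient
    return float(np.corrcoef(x, y)[0, 1])

def spearman(x, y):
    # Spearman correlation = Pearson correlation of the ranks
    # (no tie correction in this sketch)
    rank = lambda v: np.argsort(np.argsort(v))
    return pearson(rank(x), rank(y))
```

In practice, `scipy.stats.pearsonr` and `scipy.stats.spearmanr` also return the p-values reported above.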

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: not_parallel
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • num_epochs: 100
  • mixed_precision_training: Native AMP
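In the transformers Trainer API, the hyperparameters listed above correspond roughly to the following `TrainingArguments` sketch. The output path and anything not listed in the card (evaluation or saving strategy, early stopping) are assumptions, not documented settings:

```python
from transformers import TrainingArguments

# Sketch only: output_dir is an assumed placeholder; the optimizer
# (Adam, betas=(0.9, 0.999), epsilon=1e-8) is the Trainer default.
training_args = TrainingArguments(
    output_dir="esm_ft_output",      # assumed, not from the card
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=100,
    fp16=True,                       # Native AMP mixed precision
)
```

Note that `num_epochs` was set to 100 but the training log below ends at epoch 35.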

Training results

| Training Loss | Epoch | Step | Validation Loss | RMSE | MAE | Spearman r | Spearman p | Pearson r | Pearson p | Spearman r (deltas) | Spearman p (deltas) | Pearson r (deltas) | Pearson p (deltas) | Ranking F1 | Ranking MCC | RMSE (enr.) | MAE (enr.) | Spearman r (enr.) | Spearman p (enr.) | Pearson r (enr.) | Pearson p (enr.) | Spearman r (deltas, enr.) | Spearman p (deltas, enr.) | Pearson r (deltas, enr.) | Pearson p (deltas, enr.) | Ranking F1 (enr.) | Ranking MCC (enr.) | Class. Thresh | MCC | F1 | Acc | AUC | Precision | Recall |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 297.5504 | 1.0 | 335 | 171.0658 | 17.9048 | 14.4769 | 0.7631 | 0.0000 | 0.7528 | 0.0000 | 0.7719 | 0.0 | 0.7516 | 0.0 | 0.7330 | 0.5338 | 3.8255 | 2.4551 | 0.2977 | 0.0000 | 0.0078 | 0.8829 | 0.2738 | 0.0 | 0.0083 | 0.0350 | 0.5638 | 0.2086 | 0.2 | 0.7551 | 0.8909 | 0.8761 | 0.9408 | 0.8843 | 0.8709 |
| 143.2181 | 2.0 | 670 | 84.7018 | 19.5531 | 14.8411 | 0.7955 | 0.0000 | 0.8414 | 0.0000 | 0.8182 | 0.0 | 0.8407 | 0.0 | 0.7567 | 0.5786 | 4.7225 | 1.1575 | 0.2917 | 0.0000 | 0.0029 | 0.9561 | 0.2682 | 0.0 | 0.0073 | 0.0659 | 0.5581 | 0.1912 | 0.5 | 0.8157 | 0.9162 | 0.9075 | 0.9601 | 0.9112 | 0.9045 |
| 61.8061 | 3.0 | 1005 | 50.7196 | 20.0968 | 15.0416 | 0.8294 | 0.0000 | 0.8840 | 0.0000 | 0.8531 | 0.0 | 0.8835 | 0.0 | 0.7638 | 0.5908 | 6.3863 | 2.5188 | 0.4465 | 0.0000 | 0.0103 | 0.8461 | 0.3811 | 0.0 | 0.0116 | 0.0033 | 0.6181 | 0.2968 | 0.2 | 0.8711 | 0.9399 | 0.9358 | 0.9711 | 0.9353 | 0.9358 |
| 56.541 | 4.0 | 1340 | 69.3286 | 21.2912 | 15.0811 | 0.8431 | 0.0000 | 0.8500 | 0.0000 | 0.8532 | 0.0 | 0.8493 | 0.0 | 0.7856 | 0.6311 | 5.9442 | 1.3513 | 0.4828 | 0.0000 | 0.0012 | 0.9815 | 0.4081 | 0.0 | 0.0041 | 0.2992 | 0.6304 | 0.3201 | 0.1 | 0.8425 | 0.9287 | 0.9209 | 0.9740 | 0.9247 | 0.9178 |
| 51.2344 | 5.0 | 1675 | 41.1256 | 21.3353 | 15.1917 | 0.8614 | 0.0000 | 0.9109 | 0.0000 | 0.8791 | 0.0 | 0.9107 | 0.0 | 0.8015 | 0.6602 | 7.5643 | 2.2529 | 0.5504 | 0.0000 | 0.0021 | 0.9688 | 0.4402 | 0.0 | 0.0069 | 0.0793 | 0.6597 | 0.3797 | 0.4 | 0.8867 | 0.9462 | 0.9433 | 0.9765 | 0.9426 | 0.9440 |
| 50.5133 | 6.0 | 2010 | 53.9148 | 21.2573 | 15.0966 | 0.8448 | 0.0000 | 0.8810 | 0.0000 | 0.8620 | 0.0 | 0.8804 | 0.0 | 0.7841 | 0.6284 | 6.4216 | 1.6367 | 0.4815 | 0.0000 | 0.0023 | 0.9651 | 0.4189 | 0.0 | 0.0057 | 0.1485 | 0.6288 | 0.3200 | 0.4 | 0.8653 | 0.9383 | 0.9328 | 0.9755 | 0.9342 | 0.9312 |
| 50.4379 | 7.0 | 2345 | 48.7385 | 21.4149 | 15.1856 | 0.8433 | 0.0000 | 0.8944 | 0.0000 | 0.8619 | 0.0 | 0.8940 | 0.0 | 0.8038 | 0.6647 | 7.8592 | 2.2015 | 0.4480 | 0.0000 | 0.0080 | 0.8792 | 0.4118 | 0.0 | 0.0101 | 0.0105 | 0.6238 | 0.3171 | 0.4 | 0.8658 | 0.9362 | 0.9328 | 0.9665 | 0.9322 | 0.9337 |
| 53.6836 | 8.0 | 2680 | 43.4579 | 21.3656 | 15.1769 | 0.8627 | 0.0000 | 0.9055 | 0.0000 | 0.8831 | 0.0 | 0.9054 | 0.0 | 0.8054 | 0.6672 | 7.3550 | 2.0288 | 0.5504 | 0.0000 | 0.0059 | 0.9106 | 0.4805 | 0.0 | 0.0093 | 0.0186 | 0.6612 | 0.3804 | 0.1 | 0.8923 | 0.9494 | 0.9463 | 0.9778 | 0.9456 | 0.9466 |
| 43.9531 | 9.0 | 3015 | 47.2094 | 21.3525 | 15.1131 | 0.8472 | 0.0000 | 0.8965 | 0.0000 | 0.8729 | 0.0 | 0.8961 | 0.0 | 0.7945 | 0.6474 | 6.3843 | 1.4576 | 0.5023 | 0.0000 | 0.0017 | 0.9748 | 0.4709 | 0.0 | 0.0067 | 0.0903 | 0.6464 | 0.3564 | 0.4 | 0.8771 | 0.9434 | 0.9388 | 0.9701 | 0.9395 | 0.9376 |
| 51.7077 | 10.0 | 3350 | 40.9062 | 21.3117 | 15.1704 | 0.8613 | 0.0000 | 0.9106 | 0.0000 | 0.8840 | 0.0 | 0.9104 | 0.0 | 0.8101 | 0.6757 | 7.0179 | 2.0294 | 0.5801 | 0.0000 | 0.0050 | 0.9255 | 0.5017 | 0.0 | 0.0099 | 0.0119 | 0.6758 | 0.4070 | 0.4 | 0.8891 | 0.9483 | 0.9448 | 0.9726 | 0.9444 | 0.9446 |
| 40.5997 | 11.0 | 3685 | 43.0422 | 21.4683 | 15.1922 | 0.8704 | 0.0000 | 0.9066 | 0.0000 | 0.8832 | 0.0 | 0.9063 | 0.0 | 0.8144 | 0.6832 | 6.4319 | 1.5026 | 0.5563 | 0.0000 | 0.0022 | 0.9665 | 0.4767 | 0.0 | 0.0072 | 0.0661 | 0.6610 | 0.3811 | 0.2 | 0.8890 | 0.9488 | 0.9448 | 0.9792 | 0.9450 | 0.9440 |
| 39.3718 | 12.0 | 4020 | 48.3375 | 21.4331 | 15.1437 | 0.8510 | 0.0000 | 0.8948 | 0.0000 | 0.8744 | 0.0 | 0.8943 | 0.0 | 0.8062 | 0.6687 | 6.8182 | 1.6093 | 0.5509 | 0.0000 | 0.0036 | 0.9462 | 0.5044 | 0.0 | 0.0061 | 0.1222 | 0.6615 | 0.3816 | 0.2 | 0.8769 | 0.9433 | 0.9388 | 0.9669 | 0.9390 | 0.9380 |
| 57.4005 | 13.0 | 4355 | 43.3177 | 21.3598 | 15.1147 | 0.8545 | 0.0000 | 0.9051 | 0.0000 | 0.8753 | 0.0 | 0.9048 | 0.0 | 0.7964 | 0.6509 | 6.0124 | 1.3456 | 0.5202 | 0.0000 | 0.0005 | 0.9926 | 0.4520 | 0.0 | 0.0046 | 0.2386 | 0.6503 | 0.3590 | 0.2 | 0.8891 | 0.9491 | 0.9448 | 0.9765 | 0.9456 | 0.9435 |
| 45.8456 | 14.0 | 4690 | 43.5193 | 21.4209 | 15.1653 | 0.8648 | 0.0000 | 0.9053 | 0.0000 | 0.8862 | 0.0 | 0.9050 | 0.0 | 0.8061 | 0.6684 | 6.9514 | 1.6867 | 0.5826 | 0.0000 | 0.0045 | 0.9323 | 0.5243 | 0.0 | 0.0076 | 0.0546 | 0.6775 | 0.4102 | 0.2 | 0.8920 | 0.9497 | 0.9463 | 0.9766 | 0.9458 | 0.9462 |
| 37.802 | 15.0 | 5025 | 41.3185 | 21.3425 | 15.1505 | 0.8619 | 0.0000 | 0.9095 | 0.0000 | 0.8840 | 0.0 | 0.9091 | 0.0 | 0.8067 | 0.6697 | 6.7837 | 1.7558 | 0.5827 | 0.0000 | 0.0027 | 0.9591 | 0.5149 | 0.0 | 0.0039 | 0.3189 | 0.6776 | 0.4103 | 0.2 | 0.8920 | 0.9499 | 0.9463 | 0.9751 | 0.9460 | 0.9460 |
| 43.6159 | 16.0 | 5360 | 45.6864 | 21.4098 | 15.1354 | 0.8681 | 0.0000 | 0.9004 | 0.0000 | 0.8862 | 0.0 | 0.9000 | 0.0 | 0.8060 | 0.6685 | 6.1184 | 1.3574 | 0.5893 | 0.0000 | 0.0012 | 0.9813 | 0.5116 | 0.0 | 0.0049 | 0.2161 | 0.6798 | 0.4152 | 0.2 | 0.8831 | 0.9464 | 0.9418 | 0.9791 | 0.9426 | 0.9405 |
| 43.6205 | 17.0 | 5695 | 42.7035 | 21.3659 | 15.2095 | 0.8652 | 0.0000 | 0.9080 | 0.0000 | 0.8859 | 0.0 | 0.9076 | 0.0 | 0.8029 | 0.6627 | 7.6526 | 2.2653 | 0.5689 | 0.0000 | 0.0037 | 0.9450 | 0.4997 | 0.0 | 0.0060 | 0.1313 | 0.6715 | 0.3983 | 0.2 | 0.8957 | 0.9505 | 0.9478 | 0.9789 | 0.9470 | 0.9487 |
| 43.6386 | 18.0 | 6030 | 58.5737 | 21.1794 | 15.0463 | 0.8599 | 0.0000 | 0.8712 | 0.0000 | 0.8703 | 0.0 | 0.8706 | 0.0 | 0.7877 | 0.6354 | 5.5589 | 1.3057 | 0.5571 | 0.0000 | 0.0043 | 0.9351 | 0.4737 | 0.0 | 0.0071 | 0.0730 | 0.6651 | 0.3868 | 0.2 | 0.8771 | 0.9436 | 0.9388 | 0.9797 | 0.9395 | 0.9375 |
| 38.3477 | 19.0 | 6365 | 47.3619 | 21.4200 | 15.1834 | 0.8608 | 0.0000 | 0.8970 | 0.0000 | 0.8765 | 0.0 | 0.8965 | 0.0 | 0.7888 | 0.6373 | 6.3633 | 1.5156 | 0.5812 | 0.0000 | 0.0021 | 0.9680 | 0.5073 | 0.0 | 0.0047 | 0.2314 | 0.6778 | 0.4127 | 0.2 | 0.8830 | 0.9462 | 0.9418 | 0.9773 | 0.9423 | 0.9407 |
| 33.4601 | 20.0 | 6700 | 41.0449 | 21.3955 | 15.1162 | 0.8454 | 0.0000 | 0.9104 | 0.0000 | 0.8715 | 0.0 | 0.9100 | 0.0 | 0.7960 | 0.6502 | 5.7826 | 1.2067 | 0.5212 | 0.0000 | -0.0008 | 0.9875 | 0.4871 | 0.0 | 0.0021 | 0.5859 | 0.6555 | 0.3714 | 0.2 | 0.8951 | 0.9519 | 0.9478 | 0.9681 | 0.9486 | 0.9465 |
| 34.7164 | 21.0 | 7035 | 52.4939 | 21.3971 | 15.0961 | 0.8545 | 0.0000 | 0.8858 | 0.0000 | 0.8746 | 0.0 | 0.8853 | 0.0 | 0.8046 | 0.6656 | 6.0107 | 1.2742 | 0.5666 | 0.0000 | 0.0002 | 0.9970 | 0.5355 | 0.0 | 0.0042 | 0.2911 | 0.6734 | 0.4032 | 0.2 | 0.8685 | 0.9401 | 0.9343 | 0.9697 | 0.9363 | 0.9323 |
| 29.5888 | 22.0 | 7370 | 44.3570 | 21.4758 | 15.1848 | 0.8531 | 0.0000 | 0.9039 | 0.0000 | 0.8792 | 0.0 | 0.9036 | 0.0 | 0.8081 | 0.6721 | 7.0107 | 1.7154 | 0.5699 | 0.0000 | 0.0045 | 0.9328 | 0.5250 | 0.0 | 0.0081 | 0.0390 | 0.6778 | 0.4121 | 0.2 | 0.8860 | 0.9471 | 0.9433 | 0.9665 | 0.9430 | 0.9430 |
| 32.0493 | 23.0 | 7705 | 43.6106 | 21.3269 | 15.1312 | 0.8537 | 0.0000 | 0.9042 | 0.0000 | 0.8737 | 0.0 | 0.9038 | 0.0 | 0.8027 | 0.6623 | 6.3390 | 1.6166 | 0.5607 | 0.0000 | 0.0020 | 0.9705 | 0.4926 | 0.0 | 0.0064 | 0.1069 | 0.6730 | 0.4042 | 0.2 | 0.8860 | 0.9475 | 0.9433 | 0.9682 | 0.9436 | 0.9423 |
| 29.9884 | 24.0 | 8040 | 41.6078 | 21.4139 | 15.1552 | 0.8650 | 0.0000 | 0.9094 | 0.0000 | 0.8842 | 0.0 | 0.9091 | 0.0 | 0.8079 | 0.6717 | 6.8131 | 1.6515 | 0.5899 | 0.0000 | 0.0039 | 0.9418 | 0.5209 | 0.0 | 0.0071 | 0.0735 | 0.6848 | 0.4291 | 0.2 | 0.8890 | 0.9484 | 0.9448 | 0.9730 | 0.9444 | 0.9446 |
| 30.7211 | 25.0 | 8375 | 45.3367 | 21.4338 | 15.2074 | 0.8530 | 0.0000 | 0.9018 | 0.0000 | 0.8750 | 0.0 | 0.9015 | 0.0 | 0.7945 | 0.6475 | 7.5034 | 2.0118 | 0.5570 | 0.0000 | 0.0089 | 0.8665 | 0.4716 | 0.0 | 0.0112 | 0.0047 | 0.6663 | 0.3899 | 0.2 | 0.8924 | 0.9493 | 0.9463 | 0.9721 | 0.9456 | 0.9469 |
| 32.3846 | 26.0 | 8710 | 58.3115 | 21.3442 | 15.0644 | 0.8320 | 0.0000 | 0.8732 | 0.0000 | 0.8504 | 0.0 | 0.8727 | 0.0 | 0.7879 | 0.6355 | 5.7706 | 1.2045 | 0.4855 | 0.0000 | -0.0006 | 0.9907 | 0.4739 | 0.0 | 0.0017 | 0.6730 | 0.6380 | 0.3431 | 0.2 | 0.8601 | 0.9364 | 0.9299 | 0.9631 | 0.9328 | 0.9272 |
| 34.2114 | 27.0 | 9045 | 39.1039 | 21.4432 | 15.1771 | 0.8565 | 0.0000 | 0.9152 | 0.0000 | 0.8806 | 0.0 | 0.9148 | 0.0 | 0.7981 | 0.6539 | 6.8634 | 1.6811 | 0.5662 | 0.0000 | 0.0005 | 0.9921 | 0.5014 | 0.0 | 0.0027 | 0.5003 | 0.6712 | 0.4017 | 0.2 | 0.8951 | 0.9510 | 0.9478 | 0.9720 | 0.9473 | 0.9478 |
| 24.2877 | 28.0 | 9380 | 45.0585 | 21.3387 | 15.1356 | 0.8506 | 0.0000 | 0.9011 | 0.0000 | 0.8725 | 0.0 | 0.9006 | 0.0 | 0.7869 | 0.6335 | 6.3467 | 1.5511 | 0.5289 | 0.0000 | 0.0082 | 0.8768 | 0.4621 | 0.0 | 0.0097 | 0.0135 | 0.6541 | 0.3706 | 0.2 | 0.8859 | 0.9474 | 0.9433 | 0.9725 | 0.9434 | 0.9426 |
| 31.352 | 29.0 | 9715 | 43.7996 | 21.3792 | 15.1252 | 0.8504 | 0.0000 | 0.9042 | 0.0000 | 0.8757 | 0.0 | 0.9038 | 0.0 | 0.8029 | 0.6626 | 6.1789 | 1.4170 | 0.5617 | 0.0000 | 0.0072 | 0.8916 | 0.5018 | 0.0 | 0.0104 | 0.0085 | 0.6704 | 0.4024 | 0.2 | 0.8830 | 0.9462 | 0.9418 | 0.9671 | 0.9423 | 0.9407 |
| 21.3691 | 30.0 | 10050 | 46.9969 | 21.3861 | 15.0992 | 0.8557 | 0.0000 | 0.8977 | 0.0000 | 0.8777 | 0.0 | 0.8973 | 0.0 | 0.8018 | 0.6607 | 5.5748 | 1.1194 | 0.5642 | 0.0000 | 0.0056 | 0.9156 | 0.5439 | 0.0 | 0.0073 | 0.0647 | 0.6762 | 0.4117 | 0.2 | 0.8835 | 0.9468 | 0.9418 | 0.9712 | 0.9436 | 0.9399 |
| 21.6315 | 31.0 | 10385 | 44.9370 | 21.4021 | 15.1404 | 0.8472 | 0.0000 | 0.9022 | 0.0000 | 0.8710 | 0.0 | 0.9017 | 0.0 | 0.7956 | 0.6494 | 5.7280 | 1.2537 | 0.5647 | 0.0000 | 0.0066 | 0.9011 | 0.5076 | 0.0 | 0.0086 | 0.0299 | 0.6735 | 0.4059 | 0.2 | 0.8741 | 0.9423 | 0.9373 | 0.9657 | 0.9382 | 0.9359 |
| 23.6905 | 32.0 | 10720 | 46.3681 | 21.3910 | 15.1312 | 0.8608 | 0.0000 | 0.8987 | 0.0000 | 0.8801 | 0.0 | 0.8982 | 0.0 | 0.8071 | 0.6702 | 6.5590 | 1.5536 | 0.5924 | 0.0000 | 0.0093 | 0.8605 | 0.5444 | 0.0 | 0.0123 | 0.0018 | 0.6851 | 0.4252 | 0.2 | 0.8769 | 0.9433 | 0.9388 | 0.9706 | 0.9390 | 0.9380 |
| 19.3665 | 33.0 | 11055 | 48.4618 | 21.3631 | 15.1136 | 0.8465 | 0.0000 | 0.8940 | 0.0000 | 0.8672 | 0.0 | 0.8936 | 0.0 | 0.7955 | 0.6493 | 6.1008 | 1.3838 | 0.5539 | 0.0000 | 0.0082 | 0.8763 | 0.5257 | 0.0 | 0.0100 | 0.0115 | 0.6693 | 0.3988 | 0.2 | 0.8741 | 0.9423 | 0.9373 | 0.9673 | 0.9382 | 0.9359 |
| 27.2352 | 34.0 | 11390 | 53.2260 | 21.4200 | 15.1460 | 0.8404 | 0.0000 | 0.8840 | 0.0000 | 0.8657 | 0.0 | 0.8835 | 0.0 | 0.7922 | 0.6433 | 7.0157 | 1.7173 | 0.5327 | 0.0000 | 0.0095 | 0.8582 | 0.5345 | 0.0 | 0.0101 | 0.0102 | 0.6612 | 0.3862 | 0.2 | 0.8649 | 0.9378 | 0.9328 | 0.9647 | 0.9330 | 0.9320 |
| 19.1362 | 35.0 | 11725 | 47.9039 | 21.3955 | 15.1264 | 0.8475 | 0.0000 | 0.8954 | 0.0000 | 0.8705 | 0.0 | 0.8949 | 0.0 | 0.7953 | 0.6489 | 6.4308 | 1.4925 | 0.5377 | 0.0000 | 0.0090 | 0.8648 | 0.4951 | 0.0 | 0.0097 | 0.0135 | 0.6614 | 0.3835 | 0.4 | 0.8710 | 0.9405 | 0.9358 | 0.9685 | 0.9362 | 0.9348 |

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.5.1+cu118
  • Datasets 3.1.0
  • Tokenizers 0.19.1