# mobilenetv3-HandwritingStrip-3class-v2
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0548
- Accuracy: 0.9913
- Precision: 0.9899
- Recall: 0.9887
- F1: 0.9893
- Roc Auc: 0.9986
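The card does not state how precision, recall, and F1 are averaged across the three classes. As a point of reference, here is a minimal pure-Python sketch of macro-averaged precision/recall/F1 on a toy 3-class example (the macro averaging and the toy labels are assumptions, not taken from this model's evaluation):

```python
def macro_prf1(y_true, y_pred, classes):
    """Macro-averaged precision, recall, and F1 over the given classes."""
    precisions, recalls, f1s = [], [], []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    n = len(classes)
    return sum(precisions) / n, sum(recalls) / n, sum(f1s) / n

# Toy 3-class example (hypothetical labels, for illustration only)
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
p, r, f = macro_prf1(y_true, y_pred, classes=[0, 1, 2])
```

If the reported numbers were computed with a different averaging mode (e.g. micro or weighted), the per-class aggregation step would differ accordingly.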
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
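The linear scheduler decays the learning rate from its initial value down to zero over the course of training. A minimal pure-Python sketch of that schedule; the zero-warmup setting and the total step count of 2550 (the last step logged in the table below) are assumptions, since the card does not report either:

```python
def linear_lr(step, base_lr=3e-4, total_steps=2550, warmup_steps=0):
    """Linear decay to zero after an optional linear warmup."""
    if step < warmup_steps:
        # Ramp up linearly from 0 to base_lr during warmup
        return base_lr * step / max(1, warmup_steps)
    # Then decay linearly from base_lr to 0 at total_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the start, midpoint, and end of training
lrs = [linear_lr(s) for s in (0, 1275, 2550)]
```

With these settings the rate starts at 3e-4, is halved at the midpoint, and reaches 0 at the final step.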
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Roc Auc |
|---|---|---|---|---|---|---|---|---|
| 0.1771 | 0.0466 | 30 | 0.6308 | 0.7616 | 0.7977 | 0.7668 | 0.7543 | 0.9642 |
| 0.1758 | 0.0932 | 60 | 0.4830 | 0.8437 | 0.8652 | 0.8342 | 0.8363 | 0.9752 |
| 0.1691 | 0.1398 | 90 | 0.5730 | 0.8297 | 0.8744 | 0.7635 | 0.7676 | 0.9781 |
| 0.1806 | 0.1863 | 120 | 0.1443 | 0.9520 | 0.9435 | 0.9451 | 0.9440 | 0.9934 |
| 0.1599 | 0.2329 | 150 | 0.1325 | 0.9590 | 0.9629 | 0.9404 | 0.9492 | 0.9962 |
| 0.1087 | 0.2795 | 180 | 0.1342 | 0.9546 | 0.9577 | 0.9352 | 0.9436 | 0.9947 |
| 0.1241 | 0.3261 | 210 | 0.0832 | 0.9721 | 0.9679 | 0.9658 | 0.9668 | 0.9971 |
| 0.1201 | 0.3727 | 240 | 0.1240 | 0.9607 | 0.9653 | 0.9460 | 0.9534 | 0.9949 |
| 0.0803 | 0.4193 | 270 | 0.1077 | 0.9677 | 0.9673 | 0.9582 | 0.9622 | 0.9948 |
| 0.0970 | 0.4658 | 300 | 0.0660 | 0.9808 | 0.9819 | 0.9728 | 0.9769 | 0.9983 |
| 0.1167 | 0.5124 | 330 | 0.0834 | 0.9721 | 0.9687 | 0.9648 | 0.9667 | 0.9968 |
| 0.1077 | 0.5590 | 360 | 0.0884 | 0.9729 | 0.9688 | 0.9667 | 0.9677 | 0.9965 |
| 0.0628 | 0.6056 | 390 | 0.0959 | 0.9729 | 0.9693 | 0.9671 | 0.9681 | 0.9969 |
| 0.1763 | 0.6522 | 420 | 0.1187 | 0.9686 | 0.9685 | 0.9582 | 0.9629 | 0.9957 |
| 0.0488 | 0.6988 | 450 | 0.0594 | 0.9799 | 0.9753 | 0.9773 | 0.9762 | 0.9988 |
| 0.0817 | 0.7453 | 480 | 0.0624 | 0.9790 | 0.9752 | 0.9743 | 0.9747 | 0.9984 |
| 0.0553 | 0.7919 | 510 | 0.0653 | 0.9747 | 0.9744 | 0.9645 | 0.9689 | 0.9984 |
| 0.1238 | 0.8385 | 540 | 0.0684 | 0.9773 | 0.9733 | 0.9723 | 0.9728 | 0.9976 |
| 0.1450 | 0.8851 | 570 | 0.0820 | 0.9755 | 0.9746 | 0.9657 | 0.9697 | 0.9972 |
| 0.0451 | 0.9317 | 600 | 0.0665 | 0.9825 | 0.9814 | 0.9759 | 0.9785 | 0.9981 |
| 0.0918 | 0.9783 | 630 | 0.0680 | 0.9773 | 0.9730 | 0.9732 | 0.9730 | 0.9985 |
| 0.0784 | 1.0248 | 660 | 0.0576 | 0.9825 | 0.9784 | 0.9798 | 0.9791 | 0.9981 |
| 0.0508 | 1.0714 | 690 | 0.0588 | 0.9790 | 0.9789 | 0.9705 | 0.9743 | 0.9980 |
| 0.0694 | 1.1180 | 720 | 0.0837 | 0.9721 | 0.9634 | 0.9712 | 0.9667 | 0.9983 |
| 0.0468 | 1.1646 | 750 | 0.0617 | 0.9852 | 0.9845 | 0.9807 | 0.9825 | 0.9982 |
| 0.0261 | 1.2112 | 780 | 0.0675 | 0.9790 | 0.9724 | 0.9783 | 0.9751 | 0.9983 |
| 0.0874 | 1.2578 | 810 | 0.0643 | 0.9808 | 0.9761 | 0.9785 | 0.9772 | 0.9981 |
| 0.0458 | 1.3043 | 840 | 0.0620 | 0.9843 | 0.9820 | 0.9809 | 0.9815 | 0.9979 |
| 0.0992 | 1.3509 | 870 | 0.0637 | 0.9782 | 0.9716 | 0.9774 | 0.9742 | 0.9984 |
| 0.0770 | 1.3975 | 900 | 0.0704 | 0.9790 | 0.9760 | 0.9737 | 0.9748 | 0.9974 |
| 0.0797 | 1.4441 | 930 | 0.0579 | 0.9834 | 0.9827 | 0.9773 | 0.9798 | 0.9981 |
| 0.0296 | 1.4907 | 960 | 0.0652 | 0.9790 | 0.9733 | 0.9771 | 0.9751 | 0.9983 |
| 0.0863 | 1.5373 | 990 | 0.0547 | 0.9817 | 0.9775 | 0.9785 | 0.9780 | 0.9985 |
| 0.1118 | 1.5839 | 1020 | 0.0526 | 0.9808 | 0.9758 | 0.9780 | 0.9768 | 0.9981 |
| 0.1040 | 1.6304 | 1050 | 0.0650 | 0.9755 | 0.9683 | 0.9730 | 0.9705 | 0.9985 |
| 0.0525 | 1.6770 | 1080 | 0.0730 | 0.9755 | 0.9674 | 0.9744 | 0.9705 | 0.9983 |
| 0.0662 | 1.7236 | 1110 | 0.0552 | 0.9834 | 0.9798 | 0.9804 | 0.9801 | 0.9984 |
| 0.0214 | 1.7702 | 1140 | 0.0568 | 0.9878 | 0.9867 | 0.9834 | 0.9850 | 0.9976 |
| 0.0407 | 1.8168 | 1170 | 0.0561 | 0.9869 | 0.9854 | 0.9826 | 0.9839 | 0.9976 |
| 0.0427 | 1.8634 | 1200 | 0.0581 | 0.9852 | 0.9809 | 0.9831 | 0.9819 | 0.9977 |
| 0.0605 | 1.9099 | 1230 | 0.0708 | 0.9808 | 0.9759 | 0.9775 | 0.9766 | 0.9968 |
| 0.0506 | 1.9565 | 1260 | 0.0590 | 0.9852 | 0.9821 | 0.9815 | 0.9818 | 0.9974 |
| 0.0522 | 2.0031 | 1290 | 0.0569 | 0.9834 | 0.9786 | 0.9809 | 0.9797 | 0.9982 |
| 0.0356 | 2.0497 | 1320 | 0.0600 | 0.9834 | 0.9796 | 0.9799 | 0.9797 | 0.9977 |
| 0.0279 | 2.0963 | 1350 | 0.0521 | 0.9895 | 0.9888 | 0.9855 | 0.9871 | 0.9981 |
| 0.0175 | 2.1429 | 1380 | 0.0500 | 0.9886 | 0.9869 | 0.9853 | 0.9861 | 0.9984 |
| 0.0241 | 2.1894 | 1410 | 0.0813 | 0.9764 | 0.9684 | 0.9756 | 0.9715 | 0.9979 |
| 0.0317 | 2.2360 | 1440 | 0.0608 | 0.9869 | 0.9833 | 0.9847 | 0.9840 | 0.9980 |
| 0.0521 | 2.2826 | 1470 | 0.0608 | 0.9869 | 0.9829 | 0.9852 | 0.9840 | 0.9979 |
| 0.0199 | 2.3292 | 1500 | 0.0542 | 0.9878 | 0.9856 | 0.9844 | 0.9850 | 0.9984 |
| 0.0199 | 2.3758 | 1530 | 0.0538 | 0.9869 | 0.9848 | 0.9831 | 0.9839 | 0.9986 |
| 0.0120 | 2.4224 | 1560 | 0.0635 | 0.9843 | 0.9829 | 0.9786 | 0.9806 | 0.9982 |
| 0.0359 | 2.4689 | 1590 | 0.0660 | 0.9860 | 0.9850 | 0.9807 | 0.9828 | 0.9983 |
| 0.0417 | 2.5155 | 1620 | 0.0654 | 0.9869 | 0.9859 | 0.9820 | 0.9839 | 0.9980 |
| 0.0244 | 2.5621 | 1650 | 0.0643 | 0.9860 | 0.9840 | 0.9818 | 0.9829 | 0.9975 |
| 0.0547 | 2.6087 | 1680 | 0.0519 | 0.9878 | 0.9861 | 0.9846 | 0.9853 | 0.9982 |
| 0.0632 | 2.6553 | 1710 | 0.0474 | 0.9904 | 0.9891 | 0.9874 | 0.9882 | 0.9988 |
| 0.0065 | 2.7019 | 1740 | 0.0538 | 0.9886 | 0.9870 | 0.9853 | 0.9861 | 0.9986 |
| 0.0255 | 2.7484 | 1770 | 0.0571 | 0.9878 | 0.9863 | 0.9839 | 0.9851 | 0.9985 |
| 0.0080 | 2.7950 | 1800 | 0.0606 | 0.9878 | 0.9851 | 0.9850 | 0.9850 | 0.9985 |
| 0.0172 | 2.8416 | 1830 | 0.0580 | 0.9869 | 0.9848 | 0.9831 | 0.9839 | 0.9988 |
| 0.0250 | 2.8882 | 1860 | 0.0627 | 0.9878 | 0.9861 | 0.9840 | 0.9850 | 0.9986 |
| 0.0603 | 2.9348 | 1890 | 0.0624 | 0.9904 | 0.9891 | 0.9874 | 0.9882 | 0.9982 |
| 0.0229 | 2.9814 | 1920 | 0.0510 | 0.9904 | 0.9886 | 0.9879 | 0.9882 | 0.9987 |
| 0.0142 | 3.0280 | 1950 | 0.0515 | 0.9895 | 0.9883 | 0.9861 | 0.9871 | 0.9984 |
| 0.0070 | 3.0745 | 1980 | 0.0548 | 0.9895 | 0.9883 | 0.9861 | 0.9871 | 0.9983 |
| 0.0132 | 3.1211 | 2010 | 0.0536 | 0.9895 | 0.9872 | 0.9871 | 0.9872 | 0.9987 |
| 0.0075 | 3.1677 | 2040 | 0.0553 | 0.9878 | 0.9858 | 0.9844 | 0.9851 | 0.9984 |
| 0.0215 | 3.2143 | 2070 | 0.0575 | 0.9878 | 0.9858 | 0.9844 | 0.9851 | 0.9984 |
| 0.0020 | 3.2609 | 2100 | 0.0536 | 0.9878 | 0.9858 | 0.9844 | 0.9851 | 0.9986 |
| 0.0342 | 3.3075 | 2130 | 0.0532 | 0.9878 | 0.9863 | 0.9839 | 0.9851 | 0.9986 |
| 0.0170 | 3.3540 | 2160 | 0.0555 | 0.9878 | 0.9868 | 0.9834 | 0.9851 | 0.9985 |
| 0.0152 | 3.4006 | 2190 | 0.0568 | 0.9860 | 0.9826 | 0.9834 | 0.9830 | 0.9984 |
| 0.0011 | 3.4472 | 2220 | 0.0534 | 0.9886 | 0.9865 | 0.9858 | 0.9861 | 0.9987 |
| 0.0050 | 3.4938 | 2250 | 0.0567 | 0.9860 | 0.9830 | 0.9828 | 0.9829 | 0.9987 |
| 0.0045 | 3.5404 | 2280 | 0.0581 | 0.9878 | 0.9852 | 0.9850 | 0.9851 | 0.9986 |
| 0.0031 | 3.5870 | 2310 | 0.0554 | 0.9869 | 0.9845 | 0.9836 | 0.9841 | 0.9986 |
| 0.0075 | 3.6335 | 2340 | 0.0589 | 0.9869 | 0.9844 | 0.9836 | 0.9840 | 0.9984 |
| 0.0035 | 3.6801 | 2370 | 0.0596 | 0.9869 | 0.9844 | 0.9836 | 0.9840 | 0.9984 |
| 0.0125 | 3.7267 | 2400 | 0.0563 | 0.9895 | 0.9872 | 0.9871 | 0.9872 | 0.9986 |
| 0.0017 | 3.7733 | 2430 | 0.0551 | 0.9904 | 0.9892 | 0.9874 | 0.9883 | 0.9986 |
| 0.0074 | 3.8199 | 2460 | 0.0546 | 0.9913 | 0.9899 | 0.9887 | 0.9893 | 0.9986 |
| 0.0014 | 3.8665 | 2490 | 0.0547 | 0.9913 | 0.9899 | 0.9887 | 0.9893 | 0.9986 |
| 0.0136 | 3.9130 | 2520 | 0.0546 | 0.9913 | 0.9899 | 0.9887 | 0.9893 | 0.9986 |
| 0.0038 | 3.9596 | 2550 | 0.0548 | 0.9913 | 0.9899 | 0.9887 | 0.9893 | 0.9986 |
### Framework versions
- Transformers 5.3.0
- Pytorch 2.10.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2