segformer_finetuned_rwy_obb_100epochs

This model is a fine-tuned version of nvidia/mit-b0 on the Spatiallysaying/rwy_obb-300-65-65 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0390
  • Mean Iou: 0.3599
  • Mean Accuracy: 0.7198
  • Overall Accuracy: 0.7198
  • Accuracy Background: nan
  • Accuracy Rwy Obb: 0.7198
  • Iou Background: 0.0
  • Iou Rwy Obb: 0.7198
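
A minimal inference sketch follows, assuming the checkpoint is published as Spatiallysaying/segformer_finetuned_rwy_obb_100epochs and loads with the standard SegFormer classes; the image path is a placeholder and the class-id ordering (0 = background, 1 = rwy_obb) is an assumption.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id, inferred from the model name above.
checkpoint = "Spatiallysaying/segformer_finetuned_rwy_obb_100epochs"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("runway_scene.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # class ids; assumed 0 = background, 1 = rwy_obb
```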

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 1337
  • optimizer: adamw_torch_fused (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 0.1
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
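
A rough reconstruction of these settings as Hugging Face TrainingArguments is sketched below; the output directory is a placeholder, the warmup value is read as a warmup fraction, and anything not listed above is left at its default.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="segformer_finetuned_rwy_obb_100epochs",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=1337,
    optim="adamw_torch_fused",   # AdamW with betas=(0.9, 0.999), epsilon=1e-08
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,            # the card reports lr_scheduler_warmup_steps: 0.1; read as a fraction (assumption)
    num_train_epochs=100.0,
    fp16=True,                   # "Native AMP" mixed precision, assuming fp16
)
```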

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Rwy Obb | Iou Background | Iou Rwy Obb |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.5957 | 1.0 | 152 | 0.5023 | 0.2461 | 0.4923 | 0.4923 | nan | 0.4923 | 0.0 | 0.4923 |
| 0.4697 | 2.0 | 304 | 0.3524 | 0.2104 | 0.4208 | 0.4208 | nan | 0.4208 | 0.0 | 0.4208 |
| 0.3752 | 3.0 | 456 | 0.2361 | 0.2458 | 0.4915 | 0.4915 | nan | 0.4915 | 0.0 | 0.4915 |
| 0.1900 | 4.0 | 608 | 0.1412 | 0.1648 | 0.3296 | 0.3296 | nan | 0.3296 | 0.0 | 0.3296 |
| 0.1406 | 5.0 | 760 | 0.0964 | 0.1899 | 0.3797 | 0.3797 | nan | 0.3797 | 0.0 | 0.3797 |
| 0.0870 | 6.0 | 912 | 0.0790 | 0.2317 | 0.4634 | 0.4634 | nan | 0.4634 | 0.0 | 0.4634 |
| 0.0731 | 7.0 | 1064 | 0.0636 | 0.2109 | 0.4218 | 0.4218 | nan | 0.4218 | 0.0 | 0.4218 |
| 0.0588 | 8.0 | 1216 | 0.0586 | 0.2207 | 0.4413 | 0.4413 | nan | 0.4413 | 0.0 | 0.4413 |
| 0.0580 | 9.0 | 1368 | 0.0544 | 0.2900 | 0.5800 | 0.5800 | nan | 0.5800 | 0.0 | 0.5800 |
| 0.0514 | 10.0 | 1520 | 0.0497 | 0.2634 | 0.5267 | 0.5267 | nan | 0.5267 | 0.0 | 0.5267 |
| 0.0469 | 11.0 | 1672 | 0.0483 | 0.2826 | 0.5652 | 0.5652 | nan | 0.5652 | 0.0 | 0.5652 |
| 0.0444 | 12.0 | 1824 | 0.0467 | 0.3008 | 0.6015 | 0.6015 | nan | 0.6015 | 0.0 | 0.6015 |
| 0.0397 | 13.0 | 1976 | 0.0427 | 0.2877 | 0.5753 | 0.5753 | nan | 0.5753 | 0.0 | 0.5753 |
| 0.0388 | 14.0 | 2128 | 0.0454 | 0.2836 | 0.5673 | 0.5673 | nan | 0.5673 | 0.0 | 0.5673 |
| 0.0425 | 15.0 | 2280 | 0.0469 | 0.2433 | 0.4865 | 0.4865 | nan | 0.4865 | 0.0 | 0.4865 |
| 0.0345 | 16.0 | 2432 | 0.0406 | 0.2930 | 0.5860 | 0.5860 | nan | 0.5860 | 0.0 | 0.5860 |
| 0.0335 | 17.0 | 2584 | 0.0387 | 0.3416 | 0.6833 | 0.6833 | nan | 0.6833 | 0.0 | 0.6833 |
| 0.0352 | 18.0 | 2736 | 0.0406 | 0.2728 | 0.5456 | 0.5456 | nan | 0.5456 | 0.0 | 0.5456 |
| 0.0319 | 19.0 | 2888 | 0.0407 | 0.3050 | 0.6099 | 0.6099 | nan | 0.6099 | 0.0 | 0.6099 |
| 0.0316 | 20.0 | 3040 | 0.0441 | 0.2999 | 0.5998 | 0.5998 | nan | 0.5998 | 0.0 | 0.5998 |
| 0.0329 | 21.0 | 3192 | 0.0392 | 0.3560 | 0.7120 | 0.7120 | nan | 0.7120 | 0.0 | 0.7120 |
| 0.0291 | 22.0 | 3344 | 0.0382 | 0.3530 | 0.7059 | 0.7059 | nan | 0.7059 | 0.0 | 0.7059 |
| 0.0287 | 23.0 | 3496 | 0.0402 | 0.3648 | 0.7296 | 0.7296 | nan | 0.7296 | 0.0 | 0.7296 |
| 0.0257 | 24.0 | 3648 | 0.0415 | 0.3316 | 0.6632 | 0.6632 | nan | 0.6632 | 0.0 | 0.6632 |
| 0.0303 | 25.0 | 3800 | 0.0366 | 0.3315 | 0.6630 | 0.6630 | nan | 0.6630 | 0.0 | 0.6630 |
| 0.0240 | 26.0 | 3952 | 0.0374 | 0.3441 | 0.6882 | 0.6882 | nan | 0.6882 | 0.0 | 0.6882 |
| 0.0253 | 27.0 | 4104 | 0.0383 | 0.3391 | 0.6783 | 0.6783 | nan | 0.6783 | 0.0 | 0.6783 |
| 0.0253 | 28.0 | 4256 | 0.0358 | 0.3507 | 0.7014 | 0.7014 | nan | 0.7014 | 0.0 | 0.7014 |
| 0.0256 | 29.0 | 4408 | 0.0372 | 0.3402 | 0.6804 | 0.6804 | nan | 0.6804 | 0.0 | 0.6804 |
| 0.0245 | 30.0 | 4560 | 0.0379 | 0.3541 | 0.7082 | 0.7082 | nan | 0.7082 | 0.0 | 0.7082 |
| 0.0228 | 31.0 | 4712 | 0.0392 | 0.3344 | 0.6689 | 0.6689 | nan | 0.6689 | 0.0 | 0.6689 |
| 0.0253 | 32.0 | 4864 | 0.0392 | 0.3057 | 0.6115 | 0.6115 | nan | 0.6115 | 0.0 | 0.6115 |
| 0.0267 | 33.0 | 5016 | 0.0363 | 0.3466 | 0.6932 | 0.6932 | nan | 0.6932 | 0.0 | 0.6932 |
| 0.0240 | 34.0 | 5168 | 0.0393 | 0.3259 | 0.6518 | 0.6518 | nan | 0.6518 | 0.0 | 0.6518 |
| 0.0223 | 35.0 | 5320 | 0.0419 | 0.3382 | 0.6763 | 0.6763 | nan | 0.6763 | 0.0 | 0.6763 |
| 0.0243 | 36.0 | 5472 | 0.0390 | 0.3217 | 0.6434 | 0.6434 | nan | 0.6434 | 0.0 | 0.6434 |
| 0.0210 | 37.0 | 5624 | 0.0364 | 0.3555 | 0.7109 | 0.7109 | nan | 0.7109 | 0.0 | 0.7109 |
| 0.0245 | 38.0 | 5776 | 0.0380 | 0.3554 | 0.7109 | 0.7109 | nan | 0.7109 | 0.0 | 0.7109 |
| 0.0228 | 39.0 | 5928 | 0.0386 | 0.3381 | 0.6762 | 0.6762 | nan | 0.6762 | 0.0 | 0.6762 |
| 0.0209 | 40.0 | 6080 | 0.0352 | 0.3594 | 0.7187 | 0.7187 | nan | 0.7187 | 0.0 | 0.7187 |
| 0.0187 | 41.0 | 6232 | 0.0372 | 0.3651 | 0.7301 | 0.7301 | nan | 0.7301 | 0.0 | 0.7301 |
| 0.0211 | 42.0 | 6384 | 0.0408 | 0.3293 | 0.6585 | 0.6585 | nan | 0.6585 | 0.0 | 0.6585 |
| 0.0213 | 43.0 | 6536 | 0.0359 | 0.3653 | 0.7306 | 0.7306 | nan | 0.7306 | 0.0 | 0.7306 |
| 0.0208 | 44.0 | 6688 | 0.0362 | 0.3747 | 0.7495 | 0.7495 | nan | 0.7495 | 0.0 | 0.7495 |
| 0.0197 | 45.0 | 6840 | 0.0375 | 0.3580 | 0.7161 | 0.7161 | nan | 0.7161 | 0.0 | 0.7161 |
| 0.0188 | 46.0 | 6992 | 0.0378 | 0.3651 | 0.7302 | 0.7302 | nan | 0.7302 | 0.0 | 0.7302 |
| 0.0204 | 47.0 | 7144 | 0.0365 | 0.3732 | 0.7465 | 0.7465 | nan | 0.7465 | 0.0 | 0.7465 |
| 0.0191 | 48.0 | 7296 | 0.0373 | 0.3509 | 0.7017 | 0.7017 | nan | 0.7017 | 0.0 | 0.7017 |
| 0.0181 | 49.0 | 7448 | 0.0363 | 0.3697 | 0.7395 | 0.7395 | nan | 0.7395 | 0.0 | 0.7395 |
| 0.0197 | 50.0 | 7600 | 0.0366 | 0.3601 | 0.7203 | 0.7203 | nan | 0.7203 | 0.0 | 0.7203 |
| 0.0194 | 51.0 | 7752 | 0.0406 | 0.3355 | 0.6710 | 0.6710 | nan | 0.6710 | 0.0 | 0.6710 |
| 0.0193 | 52.0 | 7904 | 0.0365 | 0.3655 | 0.7309 | 0.7309 | nan | 0.7309 | 0.0 | 0.7309 |
| 0.0186 | 53.0 | 8056 | 0.0385 | 0.3545 | 0.7090 | 0.7090 | nan | 0.7090 | 0.0 | 0.7090 |
| 0.0186 | 54.0 | 8208 | 0.0387 | 0.3808 | 0.7616 | 0.7616 | nan | 0.7616 | 0.0 | 0.7616 |
| 0.0195 | 55.0 | 8360 | 0.0412 | 0.3384 | 0.6768 | 0.6768 | nan | 0.6768 | 0.0 | 0.6768 |
| 0.0175 | 56.0 | 8512 | 0.0370 | 0.3625 | 0.7249 | 0.7249 | nan | 0.7249 | 0.0 | 0.7249 |
| 0.0172 | 57.0 | 8664 | 0.0370 | 0.3685 | 0.7369 | 0.7369 | nan | 0.7369 | 0.0 | 0.7369 |
| 0.0173 | 58.0 | 8816 | 0.0374 | 0.3581 | 0.7162 | 0.7162 | nan | 0.7162 | 0.0 | 0.7162 |
| 0.0177 | 59.0 | 8968 | 0.0374 | 0.3728 | 0.7456 | 0.7456 | nan | 0.7456 | 0.0 | 0.7456 |
| 0.0165 | 60.0 | 9120 | 0.0373 | 0.3588 | 0.7176 | 0.7176 | nan | 0.7176 | 0.0 | 0.7176 |
| 0.0177 | 61.0 | 9272 | 0.0375 | 0.3762 | 0.7523 | 0.7523 | nan | 0.7523 | 0.0 | 0.7523 |
| 0.0163 | 62.0 | 9424 | 0.0395 | 0.3651 | 0.7303 | 0.7303 | nan | 0.7303 | 0.0 | 0.7303 |
| 0.0159 | 63.0 | 9576 | 0.0357 | 0.3612 | 0.7224 | 0.7224 | nan | 0.7224 | 0.0 | 0.7224 |
| 0.0163 | 64.0 | 9728 | 0.0371 | 0.3586 | 0.7173 | 0.7173 | nan | 0.7173 | 0.0 | 0.7173 |
| 0.0172 | 65.0 | 9880 | 0.0383 | 0.3500 | 0.6999 | 0.6999 | nan | 0.6999 | 0.0 | 0.6999 |
| 0.0145 | 66.0 | 10032 | 0.0383 | 0.3650 | 0.7299 | 0.7299 | nan | 0.7299 | 0.0 | 0.7299 |
| 0.0142 | 67.0 | 10184 | 0.0366 | 0.3698 | 0.7396 | 0.7396 | nan | 0.7396 | 0.0 | 0.7396 |
| 0.0153 | 68.0 | 10336 | 0.0381 | 0.3648 | 0.7295 | 0.7295 | nan | 0.7295 | 0.0 | 0.7295 |
| 0.0162 | 69.0 | 10488 | 0.0356 | 0.3726 | 0.7453 | 0.7453 | nan | 0.7453 | 0.0 | 0.7453 |
| 0.0148 | 70.0 | 10640 | 0.0386 | 0.3572 | 0.7144 | 0.7144 | nan | 0.7144 | 0.0 | 0.7144 |
| 0.0153 | 71.0 | 10792 | 0.0370 | 0.3671 | 0.7342 | 0.7342 | nan | 0.7342 | 0.0 | 0.7342 |
| 0.0144 | 72.0 | 10944 | 0.0370 | 0.3613 | 0.7225 | 0.7225 | nan | 0.7225 | 0.0 | 0.7225 |
| 0.0152 | 73.0 | 11096 | 0.0392 | 0.3503 | 0.7005 | 0.7005 | nan | 0.7005 | 0.0 | 0.7005 |
| 0.0144 | 74.0 | 11248 | 0.0379 | 0.3623 | 0.7246 | 0.7246 | nan | 0.7246 | 0.0 | 0.7246 |
| 0.0153 | 75.0 | 11400 | 0.0385 | 0.3681 | 0.7362 | 0.7362 | nan | 0.7362 | 0.0 | 0.7362 |
| 0.0139 | 76.0 | 11552 | 0.0381 | 0.3602 | 0.7205 | 0.7205 | nan | 0.7205 | 0.0 | 0.7205 |
| 0.0145 | 77.0 | 11704 | 0.0378 | 0.3626 | 0.7252 | 0.7252 | nan | 0.7252 | 0.0 | 0.7252 |
| 0.0166 | 78.0 | 11856 | 0.0387 | 0.3596 | 0.7193 | 0.7193 | nan | 0.7193 | 0.0 | 0.7193 |
| 0.0151 | 79.0 | 12008 | 0.0395 | 0.3634 | 0.7269 | 0.7269 | nan | 0.7269 | 0.0 | 0.7269 |
| 0.0165 | 80.0 | 12160 | 0.0393 | 0.3582 | 0.7163 | 0.7163 | nan | 0.7163 | 0.0 | 0.7163 |
| 0.0144 | 81.0 | 12312 | 0.0393 | 0.3535 | 0.7071 | 0.7071 | nan | 0.7071 | 0.0 | 0.7071 |
| 0.0156 | 82.0 | 12464 | 0.0391 | 0.3587 | 0.7173 | 0.7173 | nan | 0.7173 | 0.0 | 0.7173 |
| 0.0144 | 83.0 | 12616 | 0.0390 | 0.3707 | 0.7415 | 0.7415 | nan | 0.7415 | 0.0 | 0.7415 |
| 0.0137 | 84.0 | 12768 | 0.0385 | 0.3641 | 0.7282 | 0.7282 | nan | 0.7282 | 0.0 | 0.7282 |
| 0.0147 | 85.0 | 12920 | 0.0376 | 0.3622 | 0.7244 | 0.7244 | nan | 0.7244 | 0.0 | 0.7244 |
| 0.0159 | 86.0 | 13072 | 0.0382 | 0.3581 | 0.7163 | 0.7163 | nan | 0.7163 | 0.0 | 0.7163 |
| 0.0147 | 87.0 | 13224 | 0.0374 | 0.3645 | 0.7289 | 0.7289 | nan | 0.7289 | 0.0 | 0.7289 |
| 0.0142 | 88.0 | 13376 | 0.0388 | 0.3629 | 0.7257 | 0.7257 | nan | 0.7257 | 0.0 | 0.7257 |
| 0.0141 | 89.0 | 13528 | 0.0372 | 0.3652 | 0.7305 | 0.7305 | nan | 0.7305 | 0.0 | 0.7305 |
| 0.0142 | 90.0 | 13680 | 0.0378 | 0.3597 | 0.7194 | 0.7194 | nan | 0.7194 | 0.0 | 0.7194 |
| 0.0137 | 91.0 | 13832 | 0.0386 | 0.3587 | 0.7174 | 0.7174 | nan | 0.7174 | 0.0 | 0.7174 |
| 0.0140 | 92.0 | 13984 | 0.0387 | 0.3624 | 0.7249 | 0.7249 | nan | 0.7249 | 0.0 | 0.7249 |
| 0.0143 | 93.0 | 14136 | 0.0388 | 0.3608 | 0.7215 | 0.7215 | nan | 0.7215 | 0.0 | 0.7215 |
| 0.0144 | 94.0 | 14288 | 0.0384 | 0.3634 | 0.7269 | 0.7269 | nan | 0.7269 | 0.0 | 0.7269 |
| 0.0137 | 95.0 | 14440 | 0.0382 | 0.3595 | 0.7190 | 0.7190 | nan | 0.7190 | 0.0 | 0.7190 |
| 0.0142 | 96.0 | 14592 | 0.0394 | 0.3565 | 0.7131 | 0.7131 | nan | 0.7131 | 0.0 | 0.7131 |
| 0.0150 | 97.0 | 14744 | 0.0388 | 0.3577 | 0.7154 | 0.7154 | nan | 0.7154 | 0.0 | 0.7154 |
| 0.0147 | 98.0 | 14896 | 0.0383 | 0.3598 | 0.7197 | 0.7197 | nan | 0.7197 | 0.0 | 0.7197 |
| 0.0140 | 99.0 | 15048 | 0.0391 | 0.3620 | 0.7240 | 0.7240 | nan | 0.7240 | 0.0 | 0.7240 |
| 0.0132 | 100.0 | 15200 | 0.0390 | 0.3599 | 0.7198 | 0.7198 | nan | 0.7198 | 0.0 | 0.7198 |
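
The per-epoch columns (mean IoU, mean/overall accuracy, per-category accuracy and IoU) match the output format of the mean_iou metric from the evaluate library, which is presumably how they were computed; the nan background accuracy and 0.0 background IoU suggest the background label is excluded from scoring. A minimal sketch, assuming that metric with two labels and toy masks in place of real predictions:

```python
import numpy as np
import evaluate

# Assumed metric: mean_iou from the evaluate library, with 2 labels (background, rwy_obb).
metric = evaluate.load("mean_iou")

# Toy 4x4 masks standing in for real predictions and ground truth.
pred = np.array([[0, 1, 1, 0],
                 [0, 1, 1, 0],
                 [0, 0, 1, 0],
                 [0, 0, 0, 0]])
ref = np.array([[0, 1, 1, 0],
                [0, 1, 1, 0],
                [0, 1, 1, 0],
                [0, 0, 0, 0]])

results = metric.compute(
    predictions=[pred],
    references=[ref],
    num_labels=2,
    ignore_index=255,     # pixels labelled 255 are dropped from the scores
    reduce_labels=False,  # assumption; with reduce_labels=True, label 0 would be remapped to 255 and ignored
)
print(results["mean_iou"], results["per_category_iou"])
```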

Framework versions

  • Transformers 5.0.0.dev0
  • Pytorch 2.9.0+cu126
  • Datasets 4.0.0
  • Tokenizers 0.22.1