
# bert-base-stage2-sbert

A SentenceTransformer checkpoint fine-tuned for Vietnamese legal retrieval.

## Evaluation

- Truncation dimensions evaluated (Matryoshka-style): 768, 512, 256, 128
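Evaluating at several truncation dimensions typically means cutting each embedding to its first `d` components and L2-renormalizing before cosine scoring. A minimal sketch of that scheme with numpy, using random placeholder vectors rather than real model outputs (this illustrates the truncation step only, not this checkpoint's exact pipeline):

```python
import numpy as np

def truncate_and_normalize(emb: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` components and L2-renormalize them."""
    cut = emb[..., :dim]
    return cut / np.linalg.norm(cut, axis=-1, keepdims=True)

rng = np.random.default_rng(0)
query = rng.normal(size=(768,))    # placeholder for a real query embedding
docs = rng.normal(size=(5, 768))   # placeholders for document embeddings

for d in (768, 512, 256, 128):
    q = truncate_and_normalize(query, d)
    D = truncate_and_normalize(docs, d)
    scores = D @ q                 # cosine similarity after renormalization
    ranking = np.argsort(-scores)  # best-matching documents first
    print(d, ranking.tolist())
```

Smaller `d` trades retrieval quality for index size and speed, which is what the per-dimension rows in the tables below measure.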

### UIT-ViQuAD2.0

| Method | Accuracy@1 | Accuracy@3 | Accuracy@5 | Accuracy@10 | NDCG@3 | NDCG@5 | NDCG@10 | MRR@3 | MRR@5 | MRR@10 | Recall@5 | Recall@10 | Recall@100 | MAP@100 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| BM25 | 0.606492 | 0.755650 | 0.805643 | 0.861663 | 0.694646 | 0.715313 | 0.733532 | 0.673492 | 0.685004 | 0.692583 | 0.805643 | 0.861663 | 0.966717 | 0.697410 |
| 768 | 0.498288 | 0.676209 | 0.748665 | 0.824819 | 0.603137 | 0.633040 | 0.657852 | 0.577820 | 0.594448 | 0.604798 | 0.748665 | 0.824819 | 0.964662 | 0.611125 |
| 512 | 0.490070 | 0.669086 | 0.742501 | 0.818929 | 0.595305 | 0.625579 | 0.650532 | 0.569762 | 0.586582 | 0.597020 | 0.742501 | 0.818929 | 0.963430 | 0.603549 |
| 256 | 0.472127 | 0.648952 | 0.724969 | 0.804547 | 0.576285 | 0.607607 | 0.633454 | 0.551112 | 0.568500 | 0.579232 | 0.724969 | 0.804547 | 0.957951 | 0.586068 |
| 128 | 0.440077 | 0.626216 | 0.695795 | 0.778387 | 0.549035 | 0.577717 | 0.604581 | 0.522349 | 0.538278 | 0.549453 | 0.695795 | 0.778387 | 0.947815 | 0.556887 |

### Zalo-Legal

| Method | Accuracy@1 | Accuracy@3 | Accuracy@5 | Accuracy@10 | NDCG@3 | NDCG@5 | NDCG@10 | MRR@3 | MRR@5 | MRR@10 | Recall@5 | Recall@10 | Recall@100 | MAP@100 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| BM25 | 0.379442 | 0.616751 | 0.697970 | 0.776650 | 0.515637 | 0.549114 | 0.574970 | 0.481810 | 0.500402 | 0.511188 | 0.696701 | 0.775381 | 0.923858 | 0.517283 |
| 768 | 0.369289 | 0.557107 | 0.651015 | 0.782995 | 0.477510 | 0.515787 | 0.559141 | 0.451988 | 0.472610 | 0.490666 | 0.649112 | 0.781726 | 0.961929 | 0.498361 |
| 512 | 0.371827 | 0.538071 | 0.630711 | 0.769036 | 0.467101 | 0.505610 | 0.550864 | 0.444585 | 0.465778 | 0.484756 | 0.628807 | 0.767132 | 0.956853 | 0.493436 |
| 256 | 0.351523 | 0.525381 | 0.611675 | 0.765228 | 0.451601 | 0.487877 | 0.537285 | 0.428088 | 0.448266 | 0.468514 | 0.609772 | 0.763325 | 0.951777 | 0.476734 |
| 128 | 0.326142 | 0.491117 | 0.565990 | 0.729695 | 0.421049 | 0.452240 | 0.505280 | 0.398900 | 0.416032 | 0.437983 | 0.564086 | 0.727792 | 0.937817 | 0.447903 |
## Model details

- Model size: ~0.1B parameters
- Tensor type: F32 (Safetensors)