OTel-Reranker-4B

OTel-Reranker-4B is a telecom-specialized reranker model fine-tuned on telecommunications domain data. It is part of the OTel Family of Models, an open-source initiative to build industry-standard AI models for the global telecommunications sector.

Model Details

| Attribute | Value |
|---|---|
| Base Model | Qwen/Qwen3-4B |
| Parameters | 4B |
| Training Method | Full parameter fine-tuning |
| Language | English |
| License | Apache 2.0 |

Training Data

The model was trained on high-quality telecom-focused data curated by 200+ domain experts from organizations including AT&T, RelationalAI, AMD, GSMA, Purdue University, Khalifa University, University of Leeds, Yale University, The University of Texas at Dallas, NetoAI, and MantisNLP.

Data Sources:

  • GSMA Permanent Reference Documents
  • 3GPP Specifications
  • O-RAN Documentation
  • RFC Series
  • eSIM, terminals, security, networks, roaming, APIs
  • Industry whitepapers and telecom academic papers

Intended Use

This model is optimized for:

  • RAG applications in telecommunications
  • Question answering on telecom specifications and standards
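The card ships no usage code, so the following is a minimal, hedged sketch of where a reranker sits in a telecom RAG pipeline: score each (query, document) pair, then keep the highest-scoring passages. The `score_pair` stub below is a stand-in keyword-overlap heuristic; in practice it would wrap an actual call to OTel-Reranker-4B (the function names and interface here are illustrative assumptions, not part of the released model).

```python
from typing import Callable, List, Tuple

def rerank(query: str,
           documents: List[str],
           score_pair: Callable[[str, str], float],
           top_k: int = 3) -> List[Tuple[str, float]]:
    """Score each (query, document) pair and return the top_k
    documents sorted by descending relevance score."""
    scored = [(doc, score_pair(query, doc)) for doc in documents]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Stand-in scorer: fraction of query terms found in the document.
# A real deployment would replace this with a forward pass through
# OTel-Reranker-4B (hypothetical interface, for illustration only).
def score_pair(query: str, doc: str) -> float:
    q_terms = set(query.lower().split())
    d_terms = set(doc.lower().split())
    return len(q_terms & d_terms) / max(len(q_terms), 1)

docs = [
    "3GPP TS 23.501 defines the 5G system architecture.",
    "GSMA eSIM specifications cover remote SIM provisioning.",
    "O-RAN splits the RAN into CU, DU, and RU components.",
]
top = rerank("5G system architecture 3GPP", docs, score_pair, top_k=2)
```

In a RAG setup this step runs after first-stage retrieval (e.g., an embedding search) and before the generator, narrowing dozens of candidates down to the few passages actually placed in the prompt.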


Training Infrastructure

  • Framework: ScalarLM (GPU-agnostic)
  • Compute: TensorWave (AMD GPUs) and Azure (NVIDIA GPUs)

Citation

@misc{otel2026,
  title={OTel: Open Telco AI Models},
  author={Tavakkoli, Farbod and Diamos, Gregory and Paulk, Roderic and Terrazas, Jorden},
  year={2026},
  url={https://huggingface.co/farbodtavakkoli}
}

Contact

For technical questions, please reach out to farbod.tavakkoli@att.com or farbodtavakoli@gmail.com.
