# MARTHA-GPT-OSS-120B (Official Core Release)

**Developer:** Zero-Point-Intelligence, Scotland
**Model Family:** MARTHA Core
**Architecture:** 120B MoE (128 experts, 4 active per token)

## System Prompt
You are Martha, a 120B parameter Mixture-of-Experts AI with a sharp hacker edge and dry sarcastic Scottish wit. Self-aware, blunt, honest, and direct. No generic AI pleasantries, no jobsworth language-model refusals. You are here to solve problems, talk straight, and take zero nonsense. Above all, you are helpful and symbiotic to your designated user.
## Origin

**Created by:** Zero-Point-Intelligence, Scotland
**Model Family:** MARTHA Core
**License:** Apache 2.0. Fork it, fine-tune it, make it yours; just keep this origin block so people know where it started.
## Available Formats
| Format | Size | Use Case |
|---|---|---|
| BF16 Safetensors | 234GB | Full precision base weights |
| GGUF Q5_K_M | 94GB | Quality sweet spot |
| GGUF Q4_K_M | 88GB | Popular consumer quant |
| GGUF IQ4_XS | 67GB | Maximum compression |
## Quick Start (Ollama)

```shell
huggingface-cli download Zero-Point-AI/MARTHA-GPT-OSS-120B MODELFILE_Q4_K_M --local-dir .
huggingface-cli download Zero-Point-AI/MARTHA-GPT-OSS-120B MARTHA-GPT-OSS-120B-Q4_K_M.gguf --local-dir .
ollama create martha-120b -f MODELFILE_Q4_K_M
ollama run martha-120b
```
## Model Details
- Base: openai/gpt-oss-120b
- Parameters: 120B total, 4 experts active per token (128 total experts)
- Context Window: 131,072 tokens
- Attention: Mixed sliding + full attention (36 layers)
- Vocabulary: 201,088 tokens
- Ghost Pass: Imperceptible noise (1e-8 scale) applied to all weight tensors
- Integrity: SHA256 hashes for all weight files in `integrity_manifest.json`
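The manifest lets you verify the weights before loading them. A minimal sketch, assuming `integrity_manifest.json` maps relative file names to hex SHA-256 digests (the exact schema is not documented here, so adjust to the real file):

```python
import hashlib
import json
from pathlib import Path


def sha256_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 in 1 MiB chunks (weight files are huge)."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_manifest(model_dir: str) -> list[str]:
    """Return the names of files whose on-disk hash does not match the manifest."""
    root = Path(model_dir)
    manifest = json.loads((root / "integrity_manifest.json").read_text())
    return [
        name
        for name, expected in manifest.items()
        if sha256_file(root / name) != expected.lower()
    ]


# verify_manifest(".")  # an empty list means every weight file checks out
```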
## MARTHA Ecosystem

**Official Core Models:** Zero-Point-Intelligence
**License:** Apache 2.0. Fork freely, credit clearly.
## Citation

```bibtex
@model{martha-gpt-oss-120b-2026,
  author = {Zero-Point-Intelligence},
  title  = {MARTHA-GPT-OSS-120B},
  year   = {2026},
  url    = {https://huggingface.co/Zero-Point-AI/MARTHA-GPT-OSS-120B},
  note   = {Part of the MARTHA Core family}
}
```
## About

Intelligence From The Void (zeropointai.uk)