Experimental prune of michaelwaves' Amoral GPT OSS 120B, reducing the model from 128 experts to 112 experts. GGUF quants are not provided because I can't run this thing on my laptop lmao — maybe someone else can make them (I hope!).
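The card doesn't describe how the pruning was performed, but the core idea of dropping experts from a mixture-of-experts layer can be sketched as follows. This is a toy illustration with made-up shapes and a hypothetical `prune_experts` helper, not the actual procedure used for this model: for each MoE layer you keep a subset of expert weights and delete the matching rows of the router projection so the router only scores the surviving experts.

```python
def prune_experts(router_rows, experts, keep_idx):
    """Keep only the experts in keep_idx for one MoE layer.

    router_rows: one router-projection row per expert (list of vectors)
    experts:     per-expert weight blobs, one per expert
    keep_idx:    indices of experts to retain (e.g. 112 of 128)
    """
    keep_idx = sorted(keep_idx)
    # Drop the router rows and expert weights of removed experts in lockstep,
    # so surviving expert i still lines up with router output i.
    new_router = [router_rows[i] for i in keep_idx]
    new_experts = [experts[i] for i in keep_idx]
    return new_router, new_experts

# Toy layer: 8 experts, keep the first 7 (same spirit as 128 -> 112)
router_rows = [[float(i)] * 4 for i in range(8)]
experts = [f"expert_{i}" for i in range(8)]
new_router, new_experts = prune_experts(router_rows, experts, list(range(7)))
print(len(new_router), len(new_experts))  # 7 7
```

In a real checkpoint the choice of `keep_idx` matters far more than the slicing itself (experts are usually ranked by routing frequency or activation statistics before removal), and the model config's expert count must be updated to match.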
Model: blascotobasco/michaelwaves-Amoral-GPT-OSS-112E
Base model: openai/gpt-oss-120b