https://huggingface.co/Marcjoni/QuasiStarSynth-12B
- quantized with AutoAWQ
- 4-bit
- group_size: 64
- zero_point: True
- version: GEMM
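The settings above map directly onto an AutoAWQ quantization config. A minimal sketch of how such weights are typically produced (the exact script is an assumption; only the settings and the base model name come from this card, and the output path is hypothetical):

```python
# Quantization settings as listed on the card:
# 4-bit weights, group_size 64, zero_point enabled, GEMM kernel.
quant_config = {
    "w_bit": 4,
    "q_group_size": 64,
    "zero_point": True,
    "version": "GEMM",
}

def quantize(model_path="Marcjoni/QuasiStarSynth-12B",
             quant_path="QuasiStarSynth-12B-AWQ-4bit-g64"):
    # Imports kept local: running this requires the `autoawq` package
    # and enough memory for a 12B model.
    from awq import AutoAWQForCausalLM
    from transformers import AutoTokenizer

    model = AutoAWQForCausalLM.from_pretrained(model_path)
    tokenizer = AutoTokenizer.from_pretrained(model_path)

    # Calibrate and quantize, then write the AWQ checkpoint to disk.
    model.quantize(tokenizer, quant_config=quant_config)
    model.save_quantized(quant_path)
    tokenizer.save_pretrained(quant_path)
```

The quantized checkpoint can then be loaded for inference with `AutoAWQForCausalLM.from_quantized(quant_path)`, or directly through `transformers.AutoModelForCausalLM`, since recent `transformers` versions read AWQ configs from the checkpoint.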
Downloads last month: 5
Model tree for tooolz/QuasiStarSynth-12B-AWQ-4bit-g64
Base model: Marcjoni/QuasiStarSynth-12B