BitDistiller: Unleashing the Potential of Sub-4-Bit LLMs via Self-Distillation
Paper: arXiv:2402.10631
Evaluation results (accuracy in %, ± standard error; mmlu was not evaluated):

| PPL | arc_easy | arc_challenge | piqa | winogrande | hellaswag | mmlu | QA Avg |
|---|---|---|---|---|---|---|---|
| 16.94 | 36.91 ± 1.00 | 20.14 ± 1.00 | 60.28 ± 1.00 | 53.99 ± 1.00 | 33.00 ± 1.00 | - | 40.86 |
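As a quick sanity check on the table, the QA Avg appears to be the unweighted mean of the five reported task accuracies (mmlu excluded, since it is shown as "-"). A minimal sketch of that computation:

```python
# Task accuracies (%) from the table above; mmlu is excluded (not evaluated).
scores = {
    "arc_easy": 36.91,
    "arc_challenge": 20.14,
    "piqa": 60.28,
    "winogrande": 53.99,
    "hellaswag": 33.00,
}

# Unweighted mean over the five QA tasks.
qa_avg = sum(scores.values()) / len(scores)
print(round(qa_avg, 2))  # 40.86, matching the QA Avg column
```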
Training method: based on the BitDistiller paper (arXiv:2402.10631).

Base model: TinyLlama/TinyLlama_v1.1