Your LLM's accuracy quietly decays the moment you quantize it