CLAQ: Pushing the Limits of Low-Bit Post-Training Quantization for LLMs