Compressing Chain-of-Thought in LLMs via Step Entropy