Grounded Mathematical Proof Generation with Language Models

Neural Information Processing Systems 

Theorem proving in natural mathematical language - the mixture of symbolic and natural language used by humans - plays a central role in mathematical advances and education, and tests aspects of reasoning that are core to intelligence. Yet it has remained underexplored with modern generative models. We study large-scale language models on two new generation tasks: suggesting the next step in a mathematical proof, and full proof generation.
