SelfzCoT: a Self-Prompt Zero-shot CoT from Semantic-level to Code-level for a Better Utilization of LLMs
arXiv.org Artificial Intelligence
As the interface between users and LLMs such as GPT or PaLM 2, prompting has become an increasingly important research topic for better utilization of LLMs. Although simple prompting performs well on single-step questions, it cannot reliably activate the correct knowledge path for multi-step reasoning tasks. Chain-of-thought (CoT) prompting, which comprises Zero-shot CoT and few-shot CoT variants, is a recently developed method that exposes the reasoning process to the LLM and outperforms simple prompting on three challenging families of reasoning tasks: arithmetic, symbolic, and commonsense reasoning. This paper proposes a code-level self-prompt Zero-shot CoT (SelfzCoT) that uses entity nodes and reasoning paths as knowledge representations to activate deeper knowledge, i.e., longer paths, within the LLM in a graph-like manner. It does so with three iterative steps of step-by-step reasoning that can be easily adjusted or extended to different kinds of tasks.
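The three-step iterative self-prompting described above can be sketched as a small pipeline. Note that the prompt templates and the `call_llm` stub below are illustrative assumptions based on the abstract's description, not the paper's exact prompts or implementation.

```python
def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM call (e.g. to GPT or PaLM 2).

    A real implementation would query a model API; this stub just
    echoes a truncated view of the prompt so the sketch is runnable.
    """
    return f"[model response to: {prompt[:40]}...]"


def selfz_cot(question: str) -> str:
    """Hypothetical three-step self-prompt Zero-shot CoT loop."""
    # Step 1: elicit a step-by-step reasoning path with the
    # standard Zero-shot CoT trigger phrase.
    reasoning = call_llm(f"{question}\nLet's think step by step.")

    # Step 2: feed the model's own reasoning back as a self-prompt,
    # asking it to refine or extend the reasoning path.
    refined = call_llm(
        f"{question}\n{reasoning}\nRefine this reasoning step by step."
    )

    # Step 3: extract the final answer from the refined reasoning.
    answer = call_llm(f"{question}\n{refined}\nTherefore, the answer is")
    return answer
```

In use, each step conditions on the output of the previous one, so swapping the middle prompt template is enough to adapt the loop to a different task family (arithmetic, symbolic, or commonsense).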
Nov-27-2023