Revisiting Parallel Context Windows: A Frustratingly Simple Alternative and Chain-of-Thought Deterioration
Yang, Kejuan, Liu, Xiao, Men, Kaiwen, Zeng, Aohan, Dong, Yuxiao, Tang, Jie
–arXiv.org Artificial Intelligence
We identify two crucial limitations in the evaluation of Parallel Context Windows (PCW) (Ratner et al., 2023), a recent parallel-integration method that extends the maximum context length of language models (e.g., 2048 for LLaMA) by harnessing window-wise attention and positional embedding techniques. We first show that a simple yet strong baseline, the weighted sum ensemble, is missing from the in-context few-shot classification evaluation. Moreover, on the more challenging Chain-of-Thought (CoT) reasoning tasks (e.g., HotpotQA), PCW exhibits unexpected deterioration in the form of question miscomprehension and false inference. Based on our findings, we suggest that the existing PCW design may not guarantee sufficient improvement and practicality for handling lengthy documents in real-world applications. More community effort should be devoted to enabling language models' long-context understanding ability.
[Figure 1: (a) PCW is comparable with Parallel Ensemble (PE) and outperforms on fine-grained classification.]
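The weighted sum ensemble baseline mentioned above can be sketched as follows: each context window produces class logits independently, and the per-window class probabilities are combined by a weighted sum before taking the argmax. This is a minimal illustration, not the authors' implementation; the function name, the uniform default weights, and the toy logits are assumptions for demonstration.

```python
import numpy as np

def weighted_sum_ensemble(window_logits, weights=None):
    """Combine per-window class logits by a weighted sum of probabilities.

    window_logits: array of shape (n_windows, n_classes), one row per
        context window's class scores.
    weights: optional per-window weights; defaults to a uniform average.
    Returns the predicted class index and the combined probabilities.
    """
    logits = np.asarray(window_logits, dtype=float)
    if weights is None:
        weights = np.ones(len(logits))
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize to sum to 1

    # Softmax over classes within each window (numerically stabilized).
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=1, keepdims=True)

    # Weighted sum of probabilities across windows, then argmax.
    combined = (weights[:, None] * probs).sum(axis=0)
    return int(combined.argmax()), combined
```

With uniform weights this reduces to averaging the per-window class distributions, which makes it a natural baseline to compare against PCW's window-wise attention on few-shot classification.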
May-24-2023