KoBigBird-large: Transformation of Transformer for Korean Language Understanding
Kisu Yang, Yoonna Jang, Taewoo Lee, Jinwoo Seong, Hyungjin Lee, Hwanseok Jang, Heuiseok Lim
This work presents KoBigBird-large, a large-size Korean BigBird that achieves state-of-the-art performance and allows long sequence processing for Korean language understanding. Without further pretraining, we only transform the architecture and extend the positional encoding with our proposed Tapered Absolute Positional Encoding Representations (TAPER). In experiments, KoBigBird-large shows state-of-the-art overall performance on Korean language understanding benchmarks and the best …

Figure 1: An illustration of the KoBigBird-large building process. Based on the architecture of KoBigBird-base and the parameters of RoBERTa-large, our proposed TAPER method is applied to build KoBigBird-large.
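The abstract does not spell out how TAPER extends the positional encoding, so the following is only a minimal, hypothetical sketch of the general idea it gestures at: stretching a learned absolute positional embedding table to longer inputs without further pretraining. Here the pretrained table is tiled to the target length and the repeated copies are damped by an assumed linear taper. The function name, the tiling, and the taper schedule are illustrative assumptions, not the paper's exact method.

```python
# Hypothetical sketch: extending learned absolute positional embeddings
# to longer sequences without further pretraining. The tiling-plus-taper
# scheme below is an assumption for illustration, not the paper's exact
# TAPER formulation.
import torch


def extend_positional_embeddings(
    pretrained: torch.Tensor,  # (max_len, hidden), e.g. (512, 1024)
    target_len: int,           # desired longer context, e.g. 4096
) -> torch.Tensor:
    max_len, _ = pretrained.shape
    n_tiles = -(-target_len // max_len)                 # ceil division
    tiled = pretrained.repeat(n_tiles, 1)[:target_len]  # periodic repeat
    # Assumed taper: linearly decay the scale of later repetitions so
    # distant positions are damped rather than copied verbatim.
    tile_idx = torch.arange(target_len) // max_len      # 0,...,0,1,...,1,...
    scale = 1.0 - tile_idx.float() / n_tiles            # 1.0 down to 1/n_tiles
    return tiled * scale.unsqueeze(-1)


if __name__ == "__main__":
    pe = torch.randn(512, 1024)  # stand-in for a pretrained embedding table
    extended = extend_positional_embeddings(pe, 4096)
    print(extended.shape)        # torch.Size([4096, 1024])
```

Under these assumptions, a 512-position embedding table is stretched to 4,096 positions at model-building time, which is consistent with the abstract's claim of enabling long sequence processing without additional pretraining.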
arXiv.org Artificial Intelligence
Sep-19-2023