Naver trained a 'GPT-3-like' Korean language model
Naver, the Seongnam, South Korea-based company that operates the eponymous search engine, this week announced that it has trained one of the largest AI language models of its kind, called HyperCLOVA. Naver claims that the system was trained on 6,500 times more Korean data than OpenAI's GPT-3 and contains 204 billion parameters, the values a machine learning model learns from its training data.

For the better part of a year, OpenAI's GPT-3 has remained among the largest AI language models ever created. Via an API, people have used it to automatically write emails and articles, summarize text, compose poetry and recipes, create website layouts, and generate code for deep learning in Python. But GPT-3 has key limitations, chief among them that it was trained primarily on English-language data.
Jun-1-2021, 14:25:31 GMT