Huawei Cloud wins NLPCC lightweight pre-trained Chinese language model evaluation



Recently, the Huawei Cloud AI team won first place in the NLPCC 2020 lightweight pre-trained Chinese language model evaluation at the 9th International Conference on Natural Language Processing and Chinese Computing.

NLPCC, sponsored by the China Computer Federation (CCF), is a leading international conference in the fields of natural language processing (NLP) and Chinese computing (CC). Each year the conference holds open evaluations of natural language processing tasks to rigorous international standards, promoting research and development on those tasks. NLPCC 2020 attracted nearly 600 NLP experts and scholars from home and abroad, including participants from Cornell University, the University of London, and Princeton University; more than 400 of them witnessed the announcement of the open-evaluation winners on site.

At present, pre-trained language models have become the mainstream approach in NLP and have brought significant improvements on many NLP tasks. However, these models are often very large, which limits their application scenarios. How to build a lightweight pre-trained language model has therefore become a key problem.

Pre-trained language models have developed very rapidly since they first appeared, and have by now evolved into a large family of models.

The lightweight Chinese pre-trained language model evaluation asked participating teams to shrink the language model while preserving its accuracy as much as possible. The competition included four tasks: anaphora resolution and keyword recognition (two sentence-level classification tasks), an entity-recognition sequence labeling task, and an MRC (machine reading comprehension) task, which together evaluate the semantic representation ability of a model from different perspectives. In addition, the model was required to have fewer than 1/9 the parameters of the BERT-base model and inference speed at least 8 times that of BERT-base; in short, the model had to be small, fast, and accurate.
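The parameter budget implied by the 1/9 constraint can be checked with quick arithmetic. The sketch below tallies the weight matrices of a BERT-base-sized encoder using the published BERT-base dimensions; biases and LayerNorm parameters, well under 1% of the total, are ignored:

```python
# Rough parameter count for a BERT-base-like encoder, to see what the
# "under 1/9 of BERT-base" constraint implies in absolute terms.

def transformer_params(vocab=30522, max_pos=512, hidden=768,
                       layers=12, ffn=3072, type_vocab=2):
    embeddings = (vocab + max_pos + type_vocab) * hidden
    per_layer = (
        4 * hidden * hidden      # Q, K, V and attention-output projections
        + 2 * hidden * ffn       # feed-forward up- and down-projection
    )
    return embeddings + layers * per_layer

bert_base = transformer_params()
budget = bert_base // 9
print(f"BERT-base ~ {bert_base/1e6:.0f}M params, 1/9 budget ~ {budget/1e6:.1f}M")
```

So the winning model had to fit in roughly 12M parameters, about a tenth of BERT-base's ~110M.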

Generally speaking, a lightweight model can be obtained by compressing a large pre-trained language model through quantization, pruning, knowledge distillation, and similar techniques. Starting from the self-developed NEZHA Chinese pre-trained model, the joint team from Huawei Cloud and Noah's Ark Lab obtained a tiny NEZHA lightweight model through knowledge distillation and took the title.
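As a concrete illustration of the distillation idea (not Huawei's actual training code), here is a minimal soft-target distillation loss in the style of Hinton et al., where the student is trained to match the teacher's temperature-softened output distribution:

```python
import math

# Minimal sketch of soft-target knowledge distillation: the student
# matches the teacher's temperature-softened output distribution.
# Pure Python for clarity; real training would use a framework and
# combine this with the hard-label cross-entropy loss.

def softmax(logits, T=1.0):
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as is conventional."""
    p = softmax(teacher_logits, T)   # soft targets from the teacher
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

teacher = [3.0, 1.0, 0.2]
student = [2.5, 1.2, 0.1]
print(distillation_loss(student, teacher))
```

A higher temperature T softens both distributions, exposing the teacher's relative preferences among wrong classes, which is where much of the transferred "dark knowledge" lives.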

Compared with other entries, Huawei's model strikes a better structural balance. The team used TinyBERT's two-step distillation method so that the small model learns task-related knowledge more effectively. During distillation, a language model is used to predict and replace some tokens for data augmentation, which makes the small model generalize better.
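The token-replacement augmentation can be sketched as follows. This is a hypothetical illustration: a toy lookup table stands in for the language model's top predictions, and the function name and replacement probability are assumptions, not details from the competition entry:

```python
import random

# Sketch of LM-based data augmentation: a fraction of tokens is
# replaced with a plausible substitute predicted by a language model,
# producing extra training sentences for the small student model.

TOY_LM_TOP1 = {           # stand-in for masked-LM predictions
    "quick": "swift", "brown": "red", "lazy": "sleepy",
}

def augment(tokens, replace_prob=0.3, rng=None):
    rng = rng or random.Random(0)
    out = []
    for tok in tokens:
        if tok in TOY_LM_TOP1 and rng.random() < replace_prob:
            out.append(TOY_LM_TOP1[tok])   # LM-predicted replacement
        else:
            out.append(tok)
    return out

sent = "the quick brown fox jumps over the lazy dog".split()
print(augment(sent))
```

Because the replacements come from a language model rather than random noise, the augmented sentences stay fluent, so the student sees more varied yet still natural inputs.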


In TinyBERT knowledge distillation, an important part of the loss function pushes the student's intermediate layers to fit the teacher's hidden states and attention matrices.
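A minimal sketch of such an intermediate-layer loss, assuming a uniform student-to-teacher layer mapping and equal hidden sizes (TinyBERT additionally learns a linear projection when the student's hidden size is smaller):

```python
# Sketch of TinyBERT-style intermediate-layer distillation: the
# student's hidden states and attention maps are pushed toward those
# of mapped teacher layers with an MSE penalty. Plain nested lists
# stand in for tensors.

def mse(a, b):
    flat = lambda x: [v for row in x for v in row]
    fa, fb = flat(a), flat(b)
    return sum((x - y) ** 2 for x, y in zip(fa, fb)) / len(fa)

def layer_distill_loss(stu_hidden, tea_hidden, stu_attn, tea_attn):
    # Uniform mapping: student layer i imitates every stride-th
    # teacher layer, so a 4-layer student can distill a 12-layer teacher.
    stride = len(tea_hidden) // len(stu_hidden)
    loss = 0.0
    for i in range(len(stu_hidden)):
        j = (i + 1) * stride - 1
        loss += mse(stu_hidden[i], tea_hidden[j])   # hidden-state term
        loss += mse(stu_attn[i], tea_attn[j])       # attention-map term
    return loss
```

Matching attention maps, not just final logits, transfers the teacher's learned token-to-token interaction patterns, which is a large part of why TinyBERT-style students retain accuracy at a fraction of the size.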

At the same time, the NEZHA pre-trained language model developed by Huawei uses functional relative position encoding in place of BERT's parametric absolute position encoding. Relative encoding models the positional relationships between tokens more directly, improving the expressive ability of the language model.
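The idea behind functional relative position encoding can be sketched as follows: every query-key pair receives a fixed sinusoidal embedding of their clipped relative distance, so the same pattern applies at any absolute position. The dimensions and clipping distance here are illustrative, not NEZHA's actual configuration:

```python
import math

# Sketch of functional (sinusoidal, non-learned) relative position
# encoding: the embedding for a query-key pair (i, j) depends only on
# the clipped distance j - i, not on absolute positions.

def rel_pos_embedding(seq_len, dim, max_dist=4):
    table = {}
    for d in range(-max_dist, max_dist + 1):
        vec = []
        for k in range(0, dim, 2):
            angle = d / (10000 ** (k / dim))
            vec.extend([math.sin(angle), math.cos(angle)])
        table[d] = vec[:dim]
    clip = lambda d: max(-max_dist, min(max_dist, d))
    # matrix of embeddings indexed by (query position, key position)
    return [[table[clip(j - i)] for j in range(seq_len)]
            for i in range(seq_len)]

emb = rel_pos_embedding(seq_len=6, dim=4)
# pairs at the same distance share one embedding, regardless of position
print(emb[0][2] == emb[3][5])
```

Because the table is a fixed function of distance rather than a learned lookup over absolute positions, it adds no parameters and generalizes to positions beyond those seen in pre-training.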

In 2020, Huawei Cloud AI made remarkable research and development achievements in artificial intelligence, winning twelve international and domestic championships and awards, including WSDM, WebVision, first place in the CCKS document-level event extraction evaluation, an artificial intelligence "gold refining" award, and a German Red Dot award. Huawei Cloud AI will continue to consolidate its technological advantages, serve as the fertile "black soil" of the intelligent world, keep practicing inclusive AI, bring AI services to every developer and enterprise, and help all industries enter a new era of artificial intelligence.
