Welcome to join us!
RoBERTa for Chinese: Chinese pre-trained RoBERTa models
Multi-label Classification with BERT; Fine-Grained Sentiment Analysis from AI Challenger
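The key difference between multi-label classification and ordinary (single-label) classification is that each label gets an independent sigmoid decision instead of one softmax over all labels. A minimal sketch of that decision rule, with hypothetical logits standing in for the output of a BERT classification head:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical logits for one example, one score per label.
# In multi-label classification each label is decided independently,
# so several labels (or none) can be active at once.
logits = [2.1, -0.7, 0.3, -3.2]
probs = [sigmoid(z) for z in logits]
preds = [int(p > 0.5) for p in probs]  # threshold each label separately

# Training uses binary cross-entropy per label (not categorical
# cross-entropy), averaged over labels; gold labels are assumed here.
gold = [1, 0, 1, 0]
bce = -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
           for y, p in zip(gold, probs)) / len(gold)
```

With a real model, the same pattern corresponds to a sigmoid output layer trained with `BCEWithLogitsLoss` rather than a softmax with cross-entropy.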
ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-training TextCNN