Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
A PyTorch-based knowledge distillation toolkit for natural language processing
Knowledge distillation for text classification in PyTorch. Chinese text classification with a BERT teacher model and a biLSTM student model.
Transformers for Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
Jiagu: a deep-learning NLP toolkit for Chinese — knowledge-graph relation extraction, word segmentation, part-of-speech tagging, named-entity recognition, sentiment analysis, new-word discovery, keyword extraction, and text summarization.
Named-entity recognition using neural networks. Easy-to-use and state-of-the-art results.
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.
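Several of the repositories above center on teacher–student knowledge distillation (e.g. a BERT teacher and a biLSTM student). As a rough illustration of the core idea, here is a minimal sketch of the classic soft-target distillation loss in plain Python; the function names (`softmax`, `distillation_loss`) and the default temperature are illustrative choices, not APIs from any of the listed projects.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T
    # (following the common Hinton-style formulation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# Identical logits give zero loss; mismatched logits give a positive loss.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([3.0, 0.0, 0.0], [0.0, 0.0, 3.0]))
```

In practice this soft-target term is usually combined with the ordinary cross-entropy against the gold labels, weighted by a mixing coefficient.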