The MindSpore Transformers suite aims to provide a full-pipeline toolkit for large-model training, inference, and deployment: it offers mainstream Transformer-based pretrained models and a rich set of parallelism features, helping users train large models with ease. Docs: https://mindformers.readthedocs.io/zh-cn/latest/
🤗 Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch.
PAKDD 2023: "Dynamic Multi-View Fusion Mechanism for Chinese Relation Extraction"
Information extraction baselines (named entity recognition, relation extraction, joint entity and relation extraction).