The goal of the MindSpore Transformers suite is to provide a full-pipeline toolkit for large model training, inference, and deployment: it offers industry-mainstream Transformer-based pre-trained models and a rich set of parallelism features, aiming to help users train large models with ease.
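A minimal sketch of loading one of the suite's pre-trained Transformer models, assuming the `mindformers` package with its Hugging Face style `AutoModel`/`AutoTokenizer` interface; the model name `gpt2` is purely illustrative.

```python
# Minimal sketch (assumptions: mindformers installed, "gpt2" weights available).
from mindformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # tokenizer matching the model
model = AutoModel.from_pretrained("gpt2")           # download/load pre-trained weights

input_ids = tokenizer("An increasing sequence: one,")["input_ids"]
output_ids = model.generate(input_ids, max_length=32)  # autoregressive generation
print(tokenizer.decode(output_ids[0]))
```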
The vLLM plugin for MindSpore supports deploying inference services for MindSpore models on top of the vLLM framework.
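A minimal sketch of offline inference through the plugin, assuming it is importable as `vllm_mindspore` and must be imported before vLLM so the MindSpore backend is registered; the model name is illustrative.

```python
# Minimal sketch (assumptions: vllm and the vllm_mindspore plugin are installed).
import vllm_mindspore  # noqa: F401  -- assumed plugin import, registers the MindSpore backend
from vllm import LLM, SamplingParams

llm = LLM(model="Qwen/Qwen2.5-7B-Instruct")                  # served via the MindSpore backend
params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)
outputs = llm.generate(["What is MindSpore?"], params)       # standard vLLM generate API
print(outputs[0].outputs[0].text)
```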
MindSpore is a new open-source deep learning training/inference framework that can be used in mobile, edge, and cloud scenarios.
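A minimal sketch of defining and running a network with MindSpore; it assumes MindSpore is installed, and a CPU build is enough to run it.

```python
# Minimal sketch: a tiny MLP forward pass in MindSpore.
import numpy as np
import mindspore as ms
from mindspore import nn, Tensor

class Net(nn.Cell):
    """A tiny two-layer MLP."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Dense(10, 16)
        self.relu = nn.ReLU()
        self.fc2 = nn.Dense(16, 2)

    def construct(self, x):          # MindSpore uses `construct` for the forward pass
        return self.fc2(self.relu(self.fc1(x)))

net = Net()
x = Tensor(np.random.randn(4, 10).astype(np.float32))
print(net(x).shape)                  # (4, 2)
```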
MindSpore Golden Stick is an open-source deep learning model compression algorithm framework.
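A minimal sketch of Golden Stick's compress-then-train pattern, in which an algorithm object rewrites an existing network; the import path and the SimQAT (simulated quantization-aware training) class name are assumptions and may differ across Golden Stick versions.

```python
# Minimal sketch (assumptions: mindspore_gs installed; import path/class name may vary).
from mindspore import nn
from mindspore_gs.quantization.simulated_quantization import (
    SimulatedQuantizationAwareTraining as SimQAT,   # assumed import path
)

network = nn.SequentialCell(nn.Dense(10, 16), nn.ReLU(), nn.Dense(16, 2))
algo = SimQAT()                       # compression algorithm object
quant_network = algo.apply(network)   # rewrite the network with fake-quant nodes
# quant_network is then trained as usual and exported for inference deployment.
```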
Expert Kit is an efficient foundation for Expert Parallelism (EP) MoE model inference on heterogeneous hardware.
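A conceptual sketch of expert-parallel routing, not Expert Kit's API: each token is scored against the experts, assigned to its top-k, and the per-expert token groups can then be processed on different devices (for example, hot experts on an accelerator and cold experts on CPU).

```python
# Conceptual illustration of EP-style top-k routing (NumPy only, no Expert Kit APIs).
import numpy as np

num_experts, top_k, hidden = 8, 2, 16
tokens = np.random.randn(32, hidden).astype(np.float32)            # batch of token states
router_w = np.random.randn(hidden, num_experts).astype(np.float32) # router weights

logits = tokens @ router_w                                # router score per (token, expert)
topk_idx = np.argsort(logits, axis=-1)[:, -top_k:]        # top-k expert ids per token

# Group token indices by expert so each expert (possibly on a different
# device in a heterogeneous deployment) processes only its own tokens.
dispatch = {e: np.where((topk_idx == e).any(axis=-1))[0] for e in range(num_experts)}
for expert_id, token_ids in dispatch.items():
    print(f"expert {expert_id}: {len(token_ids)} tokens")
```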