diff --git a/docs/mindformers/docs/source_en/index.rst b/docs/mindformers/docs/source_en/index.rst index 08d4897e46e5e5f70d1f82b5c79d152489057f84..57fa56c4eff4ed2d67321a2bfbb838258a5debca 100644 --- a/docs/mindformers/docs/source_en/index.rst +++ b/docs/mindformers/docs/source_en/index.rst @@ -1,7 +1,7 @@ MindSpore Transformers Documentation ===================================== -MindSpore Transformers (also known as MindFormers) is a MindSpore-native foundation model suite designed to provide full-flow development capabilities for foundation model training, fine-tuning, evaluating, inference and deploying, providing the industry mainstream Transformer class of pre-trained models and SOTA downstream task applications, and covering a rich range of parallel features, with the expectation of helping users to easily realize large model training and innovative research and development. +The goal of MindSpore Transformers is to provide a full-process development suite for large model pre-training, fine-tuning, evaluation, inference, and deployment. It provides mainstream Transformer-based Large Language Models (LLMs) and Multimodal Models (MMs), and aims to help users easily realize the full process of large model development. Users can refer to `Overall Architecture `_ and `Model Library `_ to get a quick overview of the MindSpore Transformers system architecture, and the list of supported functional features and foundation models. Further, refer to the `Installation `_ and `Quick Start `_ to get started with MindSpore Transformers. 
@@ -15,7 +15,7 @@ MindSpore Transformers supports one-click start of single/multi-card training, f - `Evaluation `_ - `Inference `_ - `Quantization `_ -- `Service Deployment `_ +- `Service Deployment `_ - `Multimodal Model Development `_ Code repository address: diff --git a/docs/mindformers/docs/source_en/quick_start/install.md b/docs/mindformers/docs/source_en/quick_start/install.md index 9c5920be83161a20b6220bd7f28cce1f32952019..84fa308bd9e59b17b878029256eb9f6ec91590c7 100644 --- a/docs/mindformers/docs/source_en/quick_start/install.md +++ b/docs/mindformers/docs/source_en/quick_start/install.md @@ -8,18 +8,19 @@ The currently supported hardware is the [Atlas 800T A2](https://www.hiascend.com The current recommended Python version for the suite is 3.11.4. -| MindSpore Transformers | MindSpore | CANN | Firmware & Drivers | Mirror Links | -|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:--------------:| -| In-Development Version | In-Development Version | In-Development Version | In-Development Version | Not applicable | +| MindSpore Transformers | MindSpore | CANN | Firmware & Drivers | Mirror Links | +|:----------------------:|:---------------------------------------------:|:--------------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------------:|:------------:| +| 1.5.0 | [2.6.0-rc1](https://www.mindspore.cn/install) | [8.1.RC1](https://www.hiascend.com/document/detail/en/canncommercial/81RC1/softwareinst/instg/instg_0000.html) | [25.0.RC1](https://www.hiascend.com/document/detail/en/canncommercial/81RC1/softwareinst/instg/instg_0000.html) | Coming Soon | **Currently MindSpore Transformers recommends using a software package relationship as above.** Historical version matching relationship: -| MindSpore Transformers | MindSpore | CANN | Firmware & Drivers | 
Mirror Links | -|:----------------------------------------------------:|:-------------------------------------------:|:----------------------------------------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------:| -| [1.3.2](https://pypi.org/project/mindformers/1.3.2/) | [2.4.10](https://www.mindspore.cn/install/) | [8.0.0](https://www.hiascend.com/document/detail/zh/canncommercial/800/softwareinst/instg/instg_0000.html?Mode=PmIns&OS=Ubuntu&Software=cannToolKit) | [24.1.0](https://www.hiascend.com/document/detail/zh/canncommercial/800/softwareinst/instg/instg_0000.html?Mode=PmIns&OS=Ubuntu&Software=cannToolKit) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/168.html) | -| [1.2.0](https://pypi.org/project/mindformers/1.2.0/) | [2.3.0](https://www.mindspore.cn/install/) | [8.0.RC2.beta1](https://www.hiascend.com/developer/download/community/result?module=cann&cann=8.0.RC2.beta1) | [24.1.RC2](https://www.hiascend.com/hardware/firmware-drivers/community) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/138.html) | +| MindSpore Transformers | MindSpore | CANN | Firmware & Drivers | Mirror Links | +|:----------------------:|:---------------------------------------------:|:------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------:| +| 1.3.2 | [2.4.10](https://www.mindspore.cn/install/en) | [8.0.0](https://www.hiascend.com/document/detail/en/canncommercial/800/softwareinst/instg/instg_0000.html) | 
[24.1.0](https://www.hiascend.com/document/detail/en/canncommercial/800/softwareinst/instg/instg_0000.html) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/168.html) | +| 1.3.0 | [2.4.0](https://www.mindspore.cn/versions/en) | [8.0.RC3.beta1](https://www.hiascend.com/developer/download/community/result?module=cann&cann=8.0.RC3.beta1) | [24.1.RC3](https://www.hiascend.com/hardware/firmware-drivers/community) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/154.html) | +| 1.2.0 | [2.3.0](https://www.mindspore.cn/versions/en) | [8.0.RC2.beta1](https://www.hiascend.com/developer/download/community/result?module=cann&cann=8.0.RC2.beta1) | [24.1.RC2](https://www.hiascend.com/hardware/firmware-drivers/community) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/138.html) | ## Installing Dependent Software @@ -29,10 +30,10 @@ Historical version matching relationship: ## Installing MindSpore Transformers -Currently only source code compilation installation is supported for in-development version, users can execute the following command to install MindSpore Transformers: +Currently, only installation by compiling from source code is supported. Users can execute the following commands to install MindSpore Transformers: ```bash -git clone -b r1.5.0 https://gitee.com/mindspore/mindformers.git +git clone -b v1.5.0 https://gitee.com/mindspore/mindformers.git cd mindformers bash build.sh ``` diff --git a/docs/mindformers/docs/source_en/start/models.md b/docs/mindformers/docs/source_en/start/models.md index 214632ff40ee3af7479270a86c093dd99675c074..1c609b4e194104c9ed105ca73fc262f38ec93482 100644 --- a/docs/mindformers/docs/source_en/start/models.md +++ b/docs/mindformers/docs/source_en/start/models.md @@ -4,52 +4,58 @@ The following table lists models supported by MindFormers. 
-| Model | Specifications | Model Type | Latest Version | -|:--------------------------------------------------------------------------------------------------------|:------------------------------|:----------------:|:----------------------:| -| [CodeLlama](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/codellama.md) | 34B | Dense LLM | In-development version | -| [CogVLM2-Image](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_image.md) | 19B | MM | In-development version | -| [CogVLM2-Video](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_video.md) | 13B | MM | In-development version | -| [DeepSeek-V3](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek3) | 671B | Sparse LLM | In-development version | -| [DeepSeek-V2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek2) | 236B | Sparse LLM | In-development version | -| [DeepSeek-Coder-V1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek1_5) | 7B | Dense LLM | In-development version | -| [DeepSeek-Coder](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek) | 33B | Dense LLM | In-development version | -| [GLM4](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm4.md) | 9B | Dense LLM | In-development version | -| [GLM3-32K](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/glm32k) | 6B | Dense LLM | In-development version | -| [GLM3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm3.md) | 6B | Dense LLM | In-development version | -| [InternLM2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/internlm2) | 7B/20B | Dense LLM | In-development version | -| [Llama3.1](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/llama3_1) | 8B/70B | Dense LLM | In-development version | -| [Llama3](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/llama3) | 8B/70B | Dense LLM | 
In-development version | -| [Llama2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama2.md) | 7B/13B/70B | Dense LLM | In-development version | -| [Mixtral](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/mixtral) | 8x7B | Sparse LLM | In-development version | -| [Qwen2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen2) | 0.5B/1.5B/7B/57B/57B-A14B/72B | Dense/Sparse LLM | In-development version | -| [Qwen1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen1_5) | 7B/14B/72B | Dense LLM | In-development version | -| [Qwen-VL](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwenvl) | 9.6B | MM | In-development version | -| [Whisper](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/whisper.md) | 1.5B | MM | In-development version | -| [Yi](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/yi) | 6B/34B | Dense LLM | In-development version | -| [Baichuan2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/baichuan2/baichuan2.md) | 7B/13B | Dense LLM | 1.3.2 | -| [GLM2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/glm2.md) | 6B | Dense LLM | 1.3.2 | -| [GPT2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/gpt2.md) | 124M/13B | Dense LLM | 1.3.2 | -| [InternLM](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/internlm/internlm.md) | 7B/20B | Dense LLM | 1.3.2 | -| [Qwen](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/qwen/qwen.md) | 7B/14B | Dense LLM | 1.3.2 | -| [CodeGeex2](https://gitee.com/mindspore/mindformers/blob/r1.1.0/docs/model_cards/codegeex2.md) | 6B | Dense LLM | 1.1.0 | -| [WizardCoder](https://gitee.com/mindspore/mindformers/blob/r1.1.0/research/wizardcoder/wizardcoder.md) | 15B | Dense LLM | 1.1.0 | -| [Baichuan](https://gitee.com/mindspore/mindformers/blob/r1.0/research/baichuan/baichuan.md) | 7B/13B | Dense LLM | 1.0 | -| 
[Blip2](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/blip2.md) | 8.1B | MM | 1.0 | -| [Bloom](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/bloom.md) | 560M/7.1B/65B/176B | Dense LLM | 1.0 | -| [Clip](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/clip.md) | 149M/428M | MM | 1.0 | -| [CodeGeex](https://gitee.com/mindspore/mindformers/blob/r1.0/research/codegeex/codegeex.md) | 13B | Dense LLM | 1.0 | -| [GLM](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/glm.md) | 6B | Dense LLM | 1.0 | -| [iFlytekSpark](https://gitee.com/mindspore/mindformers/blob/r1.0/research/iflytekspark/iflytekspark.md) | 13B | Dense LLM | 1.0 | -| [Llama](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/llama.md) | 7B/13B | Dense LLM | 1.0 | -| [MAE](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/mae.md) | 86M | MM | 1.0 | -| [Mengzi3](https://gitee.com/mindspore/mindformers/blob/r1.0/research/mengzi3/mengzi3.md) | 13B | Dense LLM | 1.0 | -| [PanguAlpha](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/pangualpha.md) | 2.6B/13B | Dense LLM | 1.0 | -| [SAM](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/sam.md) | 91M/308M/636M | MM | 1.0 | -| [Skywork](https://gitee.com/mindspore/mindformers/blob/r1.0/research/skywork/skywork.md) | 13B | Dense LLM | 1.0 | -| [Swin](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/swin.md) | 88M | MM | 1.0 | -| [T5](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/t5.md) | 14M/60M | Dense LLM | 1.0 | -| [VisualGLM](https://gitee.com/mindspore/mindformers/blob/r1.0/research/visualglm/visualglm.md) | 6B | MM | 1.0 | -| [Ziya](https://gitee.com/mindspore/mindformers/blob/r1.0/research/ziya/ziya.md) | 13B | Dense LLM | 1.0 | -| [Bert](https://gitee.com/mindspore/mindformers/blob/r0.8/docs/model_cards/bert.md) | 4M/110M | Dense LLM | 0.8 | +| Model | Specifications | Model 
Type | Latest Version | +|:--------------------------------------------------------------------------------------------------------|:------------------------------|:----------------:|:--------------:| +| [CodeLlama](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/codellama.md) | 34B | Dense LLM | 1.5.0 | +| [CogVLM2-Image](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_image.md) | 19B | MM | 1.5.0 | +| [CogVLM2-Video](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_video.md) | 13B | MM | 1.5.0 | +| [DeepSeek-V3](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek3) | 671B | Sparse LLM | 1.5.0 | +| [DeepSeek-V2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek2) | 236B | Sparse LLM | 1.5.0 | +| [DeepSeek-Coder-V1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek1_5) | 7B | Dense LLM | 1.5.0 | +| [DeepSeek-Coder](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek) | 33B | Dense LLM | 1.5.0 | +| [GLM4](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm4.md) | 9B | Dense LLM | 1.5.0 | +| [GLM3-32K](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/glm32k) | 6B | Dense LLM | 1.5.0 | +| [GLM3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm3.md) | 6B | Dense LLM | 1.5.0 | +| [InternLM2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/internlm2) | 7B/20B | Dense LLM | 1.5.0 | +| [Llama3.2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama3_2.md) | 3B | Dense LLM | 1.5.0 | +| [Llama3.2-Vision](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/mllama.md) | 11B | MM | 1.5.0 | +| [Llama3.1](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/llama3_1) | 8B/70B | Dense LLM | 1.5.0 | +| [Llama3](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/llama3) | 8B/70B | Dense LLM | 1.5.0 | +| 
[Llama2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama2.md) | 7B/13B/70B | Dense LLM | 1.5.0 | +| [Mixtral](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/mixtral) | 8x7B | Sparse LLM | 1.5.0 | +| [Qwen2.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen2_5) | 0.5B/1.5B/7B/14B/32B/72B | Dense LLM | 1.5.0 | +| [Qwen2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen2) | 0.5B/1.5B/7B/57B/57B-A14B/72B | Dense/Sparse LLM | 1.5.0 | +| [Qwen1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen1_5) | 0.5B/1.8B/4B/7B/14B/72B | Dense LLM | 1.5.0 | +| [Qwen-VL](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwenvl) | 9.6B | MM | 1.5.0 | +| [TeleChat2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/telechat2) | 7B/35B/115B | Dense LLM | 1.5.0 | +| [TeleChat](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/telechat) | 7B/12B/52B | Dense LLM | 1.5.0 | +| [Whisper](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/whisper.md) | 1.5B | MM | 1.5.0 | +| [Yi](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/yi) | 6B/34B | Dense LLM | 1.5.0 | +| [YiZhao](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/yizhao) | 12B | Dense LLM | 1.5.0 | +| [Baichuan2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/baichuan2/baichuan2.md) | 7B/13B | Dense LLM | 1.3.2 | +| [GLM2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/glm2.md) | 6B | Dense LLM | 1.3.2 | +| [GPT2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/gpt2.md) | 124M/13B | Dense LLM | 1.3.2 | +| [InternLM](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/internlm/internlm.md) | 7B/20B | Dense LLM | 1.3.2 | +| [Qwen](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/qwen/qwen.md) | 7B/14B | Dense LLM | 1.3.2 | +| 
[CodeGeex2](https://gitee.com/mindspore/mindformers/blob/r1.1.0/docs/model_cards/codegeex2.md) | 6B | Dense LLM | 1.1.0 | +| [WizardCoder](https://gitee.com/mindspore/mindformers/blob/r1.1.0/research/wizardcoder/wizardcoder.md) | 15B | Dense LLM | 1.1.0 | +| [Baichuan](https://gitee.com/mindspore/mindformers/blob/r1.0/research/baichuan/baichuan.md) | 7B/13B | Dense LLM | 1.0 | +| [Blip2](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/blip2.md) | 8.1B | MM | 1.0 | +| [Bloom](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/bloom.md) | 560M/7.1B/65B/176B | Dense LLM | 1.0 | +| [Clip](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/clip.md) | 149M/428M | MM | 1.0 | +| [CodeGeex](https://gitee.com/mindspore/mindformers/blob/r1.0/research/codegeex/codegeex.md) | 13B | Dense LLM | 1.0 | +| [GLM](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/glm.md) | 6B | Dense LLM | 1.0 | +| [iFlytekSpark](https://gitee.com/mindspore/mindformers/blob/r1.0/research/iflytekspark/iflytekspark.md) | 13B | Dense LLM | 1.0 | +| [Llama](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/llama.md) | 7B/13B | Dense LLM | 1.0 | +| [MAE](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/mae.md) | 86M | MM | 1.0 | +| [Mengzi3](https://gitee.com/mindspore/mindformers/blob/r1.0/research/mengzi3/mengzi3.md) | 13B | Dense LLM | 1.0 | +| [PanguAlpha](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/pangualpha.md) | 2.6B/13B | Dense LLM | 1.0 | +| [SAM](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/sam.md) | 91M/308M/636M | MM | 1.0 | +| [Skywork](https://gitee.com/mindspore/mindformers/blob/r1.0/research/skywork/skywork.md) | 13B | Dense LLM | 1.0 | +| [Swin](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/swin.md) | 88M | MM | 1.0 | +| [T5](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/t5.md) | 14M/60M | Dense LLM | 
1.0 | +| [VisualGLM](https://gitee.com/mindspore/mindformers/blob/r1.0/research/visualglm/visualglm.md) | 6B | MM | 1.0 | +| [Ziya](https://gitee.com/mindspore/mindformers/blob/r1.0/research/ziya/ziya.md) | 13B | Dense LLM | 1.0 | +| [Bert](https://gitee.com/mindspore/mindformers/blob/r0.8/docs/model_cards/bert.md) | 4M/110M | Dense LLM | 0.8 | * ***LLM:*** *Large Language Model;* ***MM:*** *Multi-Modal* \ No newline at end of file diff --git a/docs/mindformers/docs/source_en/usage/mindie_deployment.md b/docs/mindformers/docs/source_en/usage/deployment.md similarity index 95% rename from docs/mindformers/docs/source_en/usage/mindie_deployment.md rename to docs/mindformers/docs/source_en/usage/deployment.md index ed7819cf927a61343a3f75321b8ae03c27a218e5..a9e6a3996a5384c4734e12affeb6f191696230af 100644 --- a/docs/mindformers/docs/source_en/usage/mindie_deployment.md +++ b/docs/mindformers/docs/source_en/usage/deployment.md @@ -24,9 +24,9 @@ The model support for MindIE inference can be found in [model repository](https: MindIE and CANN versions must be matched, version matching relationship is as follows. 
- | MindIE | CANN-toolkit | CANN-kernels | - |:-------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------:| - | [1.0.0](https://www.hiascend.com/developer/download/community/result?module=ie%2Bpt%2Bcann) | [8.0.0](https://www.hiascend.com/developer/download/community/result?module=ie%2Bpt%2Bcann) | [8.0.0](https://www.hiascend.com/developer/download/community/result?module=ie%2Bpt%2Bcann) | + | MindIE | CANN | + |:---------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------:| + | [2.0.RC1](https://www.hiascend.com/developer/download/community/result?module=ie%2Bpt%2Bcann) | [8.1.RC1](https://www.hiascend.com/document/detail/en/canncommercial/81RC1/softwareinst/instg/instg_0000.html) | ### Environment Variables diff --git a/docs/mindformers/docs/source_zh_cn/index.rst b/docs/mindformers/docs/source_zh_cn/index.rst index e947483d0ad9ff1daa2b4a06ee7ba66ca554612b..991371131a357ba2e142daae07a021d9cf6ff33d 100644 --- a/docs/mindformers/docs/source_zh_cn/index.rst +++ b/docs/mindformers/docs/source_zh_cn/index.rst @@ -1,7 +1,7 @@ MindSpore Transformers 文档 ========================================= -MindSpore Transformers(也称MindFormers)是一个MindSpore原生的大模型套件,旨在提供大模型训练、微调、评估、推理、部署等全流程开发能力,提供业内主流的Transformer类预训练模型和SOTA下游任务应用,涵盖丰富的并行特性,期望帮助用户轻松地实现大模型训练和创新研发。 +MindSpore Transformers套件的目标是构建一个大模型预训练、微调、评测、推理、部署的全流程开发套件,提供业内主流的Transformer类大语言模型(Large Language Models, LLMs)和多模态理解模型(Multimodal Models, MMs)。期望帮助用户轻松地实现大模型全流程开发。 用户可以参阅 `整体架构 `_ 和 `模型库 `_ ,快速了解MindSpore Transformers的系统架构,及所支持的功能特性和大模型清单。进一步地,可参考 `安装 `_ 和 `快速启动 `_ 章节,上手探索MindSpore Transformers。 @@ -55,7 +55,7 @@ MindSpore Transformers支持一键启动任意任务的单卡/多卡训练、微 diff 
--git a/docs/mindformers/docs/source_zh_cn/quick_start/install.md b/docs/mindformers/docs/source_zh_cn/quick_start/install.md index f977211c620ab81d41bdd0c2ad4f0932517cddb9..7500c25d9c85c5c4fae8d371bf4a5c207cff0487 100644 --- a/docs/mindformers/docs/source_zh_cn/quick_start/install.md +++ b/docs/mindformers/docs/source_zh_cn/quick_start/install.md @@ -8,18 +8,19 @@ 当前套件建议使用的Python版本为3.11.4。 -| MindSpore Transformers | MindSpore | CANN | 固件与驱动 | 镜像链接 | -|:-----------:|:---------:|:----:|:-----:|:----:| -| 在研版本 | 在研版本 | 在研版本 | 在研版本 | 不涉及 | +| MindSpore Transformers | MindSpore | CANN | 固件与驱动 | 镜像链接 | +|:----------------------:|:---------------------------------------------:|:--------------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------------:|:----:| +| 1.5.0 | [2.6.0-rc1](https://www.mindspore.cn/install) | [8.1.RC1](https://www.hiascend.com/document/detail/zh/canncommercial/81RC1/softwareinst/instg/instg_0000.html) | [25.0.RC1](https://www.hiascend.com/document/detail/zh/canncommercial/81RC1/softwareinst/instg/instg_0000.html) | 即将发布 | **当前MindSpore Transformers建议使用如上的软件配套关系。** 历史版本配套关系: -| MindSpore Transformers | MindSpore | CANN | 固件与驱动 | 镜像链接 | -|:----------------------------------------------------:|:-------------------------------------------:|:----------------------------------------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------:| -| [1.3.2](https://pypi.org/project/mindformers/1.3.2/) | [2.4.10](https://www.mindspore.cn/install/) | 
[8.0.0](https://www.hiascend.com/document/detail/zh/canncommercial/800/softwareinst/instg/instg_0000.html?Mode=PmIns&OS=Ubuntu&Software=cannToolKit) | [24.1.0](https://www.hiascend.com/document/detail/zh/canncommercial/800/softwareinst/instg/instg_0000.html?Mode=PmIns&OS=Ubuntu&Software=cannToolKit) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/168.html) | -| [1.2.0](https://pypi.org/project/mindformers/1.2.0/) | [2.3.0](https://www.mindspore.cn/install/) | [8.0.RC2.beta1](https://www.hiascend.com/developer/download/community/result?module=cann&cann=8.0.RC2.beta1) | [24.1.RC2](https://www.hiascend.com/hardware/firmware-drivers/community) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/138.html) | +| MindSpore Transformers | MindSpore | CANN | 固件与驱动 | 镜像链接 | +|:----------------------:|:------------------------------------------:|:------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------:| +| 1.3.2 | [2.4.10](https://www.mindspore.cn/install) | [8.0.0](https://www.hiascend.com/document/detail/zh/canncommercial/800/softwareinst/instg/instg_0000.html) | [24.1.0](https://www.hiascend.com/document/detail/zh/canncommercial/800/softwareinst/instg/instg_0000.html) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/168.html) | +| 1.3.0 | [2.4.0](https://www.mindspore.cn/versions) | [8.0.RC3.beta1](https://www.hiascend.com/developer/download/community/result?module=cann&cann=8.0.RC3.beta1) | [24.1.RC3](https://www.hiascend.com/hardware/firmware-drivers/community) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/154.html) | +| 1.2.0 | [2.3.0](https://www.mindspore.cn/versions) | [8.0.RC2.beta1](https://www.hiascend.com/developer/download/community/result?module=cann&cann=8.0.RC2.beta1) | 
[24.1.RC2](https://www.hiascend.com/hardware/firmware-drivers/community) | [Link](http://mirrors.cn-central-221.ovaijisuan.com/detail/138.html) | ## 安装依赖软件 @@ -29,10 +30,10 @@ ## 安装MindSpore Transformers -目前在研版本仅支持源码编译安装,用户可以执行如下命令安装MindSpore Transformers: +目前仅支持源码编译安装,用户可以执行如下命令安装MindSpore Transformers: ```bash -git clone -b r1.5.0 https://gitee.com/mindspore/mindformers.git +git clone -b v1.5.0 https://gitee.com/mindspore/mindformers.git cd mindformers bash build.sh ``` diff --git a/docs/mindformers/docs/source_zh_cn/start/models.md b/docs/mindformers/docs/source_zh_cn/start/models.md index 48a42520f981b9f14eeab72fab65e945e649ca9c..d22666f17ca1a619cd8944f61ad6a959ed12eeee 100644 --- a/docs/mindformers/docs/source_zh_cn/start/models.md +++ b/docs/mindformers/docs/source_zh_cn/start/models.md @@ -4,52 +4,58 @@ 当前MindSpore Transformers全量的模型列表如下: -| 模型名 | 支持规格 | 模型类型 | 最新支持版本 | -|:--------------------------------------------------------------------------------------------------------|:------------------------------|:------------:|:------:| -| [CodeLlama](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/codellama.md) | 34B | 稠密LLM | 在研版本 | -| [CogVLM2-Image](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_image.md) | 19B | MM | 在研版本 | -| [CogVLM2-Video](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_video.md) | 13B | MM | 在研版本 | -| [DeepSeek-V3](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek3) | 671B | 稀疏LLM | 在研版本 | -| [DeepSeek-V2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek2) | 236B | 稀疏LLM | 在研版本 | -| [DeepSeek-Coder-V1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek1_5) | 7B | 稠密LLM | 在研版本 | -| [DeepSeek-Coder](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek) | 33B | 稠密LLM | 在研版本 | -| [GLM4](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm4.md) | 9B | 稠密LLM | 在研版本 | -| 
-| [GLM3-32K](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/glm32k) | 6B | Dense LLM | In-Development Version |
-| [GLM3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm3.md) | 6B | Dense LLM | In-Development Version |
-| [InternLM2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/internlm2) | 7B/20B | Dense LLM | In-Development Version |
-| [Llama3.1](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/llama3_1) | 8B/70B | Dense LLM | In-Development Version |
-| [Llama3](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/llama3) | 8B/70B | Dense LLM | In-Development Version |
-| [Llama2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama2.md) | 7B/13B/70B | Dense LLM | In-Development Version |
-| [Mixtral](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/mixtral) | 8x7B | Sparse LLM | In-Development Version |
-| [Qwen2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen2) | 0.5B/1.5B/7B/57B/57B-A14B/72B | Dense/Sparse LLM | In-Development Version |
-| [Qwen1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen1_5) | 7B/14B/72B | Dense LLM | In-Development Version |
-| [Qwen-VL](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwenvl) | 9.6B | MM | In-Development Version |
-| [Whisper](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/whisper.md) | 1.5B | MM | In-Development Version |
-| [Yi](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/yi) | 6B/34B | Dense LLM | In-Development Version |
-| [Baichuan2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/baichuan2/baichuan2.md) | 7B/13B | Dense LLM | 1.3.2 |
-| [GLM2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/glm2.md) | 6B | Dense LLM | 1.3.2 |
-| [GPT2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/gpt2.md) | 124M/13B | Dense LLM | 1.3.2 |
-| [InternLM](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/internlm/internlm.md) | 7B/20B | Dense LLM | 1.3.2 |
-| [Qwen](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/qwen/qwen.md) | 7B/14B | Dense LLM | 1.3.2 |
-| [CodeGeex2](https://gitee.com/mindspore/mindformers/blob/r1.1.0/docs/model_cards/codegeex2.md) | 6B | Dense LLM | 1.1.0 |
-| [WizardCoder](https://gitee.com/mindspore/mindformers/blob/r1.1.0/research/wizardcoder/wizardcoder.md) | 15B | Dense LLM | 1.1.0 |
-| [Baichuan](https://gitee.com/mindspore/mindformers/blob/r1.0/research/baichuan/baichuan.md) | 7B/13B | Dense LLM | 1.0 |
-| [Blip2](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/blip2.md) | 8.1B | MM | 1.0 |
-| [Bloom](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/bloom.md) | 560M/7.1B/65B/176B | Dense LLM | 1.0 |
-| [Clip](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/clip.md) | 149M/428M | MM | 1.0 |
-| [CodeGeex](https://gitee.com/mindspore/mindformers/blob/r1.0/research/codegeex/codegeex.md) | 13B | Dense LLM | 1.0 |
-| [GLM](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/glm.md) | 6B | Dense LLM | 1.0 |
-| [iFlytekSpark](https://gitee.com/mindspore/mindformers/blob/r1.0/research/iflytekspark/iflytekspark.md) | 13B | Dense LLM | 1.0 |
-| [Llama](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/llama.md) | 7B/13B | Dense LLM | 1.0 |
-| [MAE](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/mae.md) | 86M | MM | 1.0 |
-| [Mengzi3](https://gitee.com/mindspore/mindformers/blob/r1.0/research/mengzi3/mengzi3.md) | 13B | Dense LLM | 1.0 |
-| [PanguAlpha](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/pangualpha.md) | 2.6B/13B | Dense LLM | 1.0 |
-| [SAM](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/sam.md) | 91M/308M/636M | MM | 1.0 |
-| [Skywork](https://gitee.com/mindspore/mindformers/blob/r1.0/research/skywork/skywork.md) | 13B | Dense LLM | 1.0 |
-| [Swin](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/swin.md) | 88M | MM | 1.0 |
-| [T5](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/t5.md) | 14M/60M | Dense LLM | 1.0 |
-| [VisualGLM](https://gitee.com/mindspore/mindformers/blob/r1.0/research/visualglm/visualglm.md) | 6B | MM | 1.0 |
-| [Ziya](https://gitee.com/mindspore/mindformers/blob/r1.0/research/ziya/ziya.md) | 13B | Dense LLM | 1.0 |
-| [Bert](https://gitee.com/mindspore/mindformers/blob/r0.8/docs/model_cards/bert.md) | 4M/110M | Dense LLM | 0.8 |
+| Model Name | Supported Specifications | Model Type | Latest Supported Version |
+|:-----------|:-------------------------|:----------:|:------------------------:|
+| [CodeLlama](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/codellama.md) | 34B | Dense LLM | 1.5.0 |
+| [CogVLM2-Image](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_image.md) | 19B | MM | 1.5.0 |
+| [CogVLM2-Video](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_video.md) | 13B | MM | 1.5.0 |
+| [DeepSeek-V3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek3) | 671B | Sparse LLM | 1.5.0 |
+| [DeepSeek-V2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek2) | 236B | Sparse LLM | 1.5.0 |
+| [DeepSeek-Coder-V1.5](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek1_5) | 7B | Dense LLM | 1.5.0 |
+| [DeepSeek-Coder](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek) | 33B | Dense LLM | 1.5.0 |
+| [GLM4](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm4.md) | 9B | Dense LLM | 1.5.0 |
+| [GLM3-32K](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/glm32k) | 6B | Dense LLM | 1.5.0 |
+| [GLM3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm3.md) | 6B | Dense LLM | 1.5.0 |
+| [InternLM2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/internlm2) | 7B/20B | Dense LLM | 1.5.0 |
+| [Llama3.2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama3_2.md) | 3B | Dense LLM | 1.5.0 |
+| [Llama3.2-Vision](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/mllama.md) | 11B | MM | 1.5.0 |
+| [Llama3.1](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/llama3_1) | 8B/70B | Dense LLM | 1.5.0 |
+| [Llama3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/llama3) | 8B/70B | Dense LLM | 1.5.0 |
+| [Llama2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama2.md) | 7B/13B/70B | Dense LLM | 1.5.0 |
+| [Mixtral](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/mixtral) | 8x7B | Sparse LLM | 1.5.0 |
+| [Qwen2.5](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwen2_5) | 0.5B/1.5B/7B/14B/32B/72B | Dense LLM | 1.5.0 |
+| [Qwen2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwen2) | 0.5B/1.5B/7B/57B/57B-A14B/72B | Dense/Sparse LLM | 1.5.0 |
+| [Qwen1.5](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwen1_5) | 0.5B/1.8B/4B/7B/14B/72B | Dense LLM | 1.5.0 |
+| [Qwen-VL](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwenvl) | 9.6B | MM | 1.5.0 |
+| [TeleChat2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/telechat2) | 7B/35B/115B | Dense LLM | 1.5.0 |
+| [TeleChat](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/telechat) | 7B/12B/52B | Dense LLM | 1.5.0 |
+| [Whisper](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/whisper.md) | 1.5B | MM | 1.5.0 |
+| [Yi](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/yi) | 6B/34B | Dense LLM | 1.5.0 |
+| [YiZhao](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/yizhao) | 12B | Dense LLM | 1.5.0 |
+| [Baichuan2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/baichuan2/baichuan2.md) | 7B/13B | Dense LLM | 1.3.2 |
+| [GLM2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/glm2.md) | 6B | Dense LLM | 1.3.2 |
+| [GPT2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/gpt2.md) | 124M/13B | Dense LLM | 1.3.2 |
+| [InternLM](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/internlm/internlm.md) | 7B/20B | Dense LLM | 1.3.2 |
+| [Qwen](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/qwen/qwen.md) | 7B/14B | Dense LLM | 1.3.2 |
+| [CodeGeex2](https://gitee.com/mindspore/mindformers/blob/r1.1.0/docs/model_cards/codegeex2.md) | 6B | Dense LLM | 1.1.0 |
+| [WizardCoder](https://gitee.com/mindspore/mindformers/blob/r1.1.0/research/wizardcoder/wizardcoder.md) | 15B | Dense LLM | 1.1.0 |
+| [Baichuan](https://gitee.com/mindspore/mindformers/blob/r1.0/research/baichuan/baichuan.md) | 7B/13B | Dense LLM | 1.0 |
+| [Blip2](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/blip2.md) | 8.1B | MM | 1.0 |
+| [Bloom](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/bloom.md) | 560M/7.1B/65B/176B | Dense LLM | 1.0 |
+| [Clip](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/clip.md) | 149M/428M | MM | 1.0 |
+| [CodeGeex](https://gitee.com/mindspore/mindformers/blob/r1.0/research/codegeex/codegeex.md) | 13B | Dense LLM | 1.0 |
+| [GLM](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/glm.md) | 6B | Dense LLM | 1.0 |
+| [iFlytekSpark](https://gitee.com/mindspore/mindformers/blob/r1.0/research/iflytekspark/iflytekspark.md) | 13B | Dense LLM | 1.0 |
+| [Llama](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/llama.md) | 7B/13B | Dense LLM | 1.0 |
+| [MAE](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/mae.md) | 86M | MM | 1.0 |
+| [Mengzi3](https://gitee.com/mindspore/mindformers/blob/r1.0/research/mengzi3/mengzi3.md) | 13B | Dense LLM | 1.0 |
+| [PanguAlpha](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/pangualpha.md) | 2.6B/13B | Dense LLM | 1.0 |
+| [SAM](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/sam.md) | 91M/308M/636M | MM | 1.0 |
+| [Skywork](https://gitee.com/mindspore/mindformers/blob/r1.0/research/skywork/skywork.md) | 13B | Dense LLM | 1.0 |
+| [Swin](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/swin.md) | 88M | MM | 1.0 |
+| [T5](https://gitee.com/mindspore/mindformers/blob/r1.0/docs/model_cards/t5.md) | 14M/60M | Dense LLM | 1.0 |
+| [VisualGLM](https://gitee.com/mindspore/mindformers/blob/r1.0/research/visualglm/visualglm.md) | 6B | MM | 1.0 |
+| [Ziya](https://gitee.com/mindspore/mindformers/blob/r1.0/research/ziya/ziya.md) | 13B | Dense LLM | 1.0 |
+| [Bert](https://gitee.com/mindspore/mindformers/blob/r0.8/docs/model_cards/bert.md) | 4M/110M | Dense LLM | 0.8 |
 
 * ***LLM:*** *Large Language Model;* ***MM:*** *Multi-Modal*
\ No newline at end of file
diff --git a/docs/mindformers/docs/source_zh_cn/usage/mindie_deployment.md b/docs/mindformers/docs/source_zh_cn/usage/deployment.md
similarity index 95%
rename from docs/mindformers/docs/source_zh_cn/usage/mindie_deployment.md
rename to docs/mindformers/docs/source_zh_cn/usage/deployment.md
index 88222751d7bb7bb93bd362400865b23122991716..3b52b0775b0e8249073ef77d7fed89d454bbd120 100644
--- a/docs/mindformers/docs/source_zh_cn/usage/mindie_deployment.md
+++ b/docs/mindformers/docs/source_zh_cn/usage/deployment.md
@@ -24,9 +24,9 @@ Model support for MindIE inference is listed in the [Model Library](https://www.mindspore.cn/mind
 MindIE and CANN versions must be used together; their version mapping is as follows.
 
-  | MindIE | CANN-toolkit | CANN-kernels |
-  |:------:|:------------:|:------------:|
-  | [1.0.0](https://www.hiascend.com/developer/download/community/result?module=ie%2Bpt%2Bcann) | [8.0.0](https://www.hiascend.com/developer/download/community/result?module=ie%2Bpt%2Bcann) | [8.0.0](https://www.hiascend.com/developer/download/community/result?module=ie%2Bpt%2Bcann) |
+  | MindIE | CANN |
+  |:------:|:----:|
+  | [2.0.RC1](https://www.hiascend.com/developer/download/community/result?module=ie%2Bpt%2Bcann) | [8.1.RC1](https://www.hiascend.com/document/detail/en/canncommercial/81RC1/softwareinst/instg/instg_0000.html) |
 
 ### Environment Variables
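The compatibility tables in this change pair each MindSpore Transformers release with specific MindSpore, CANN, driver, and MindIE versions. As a minimal sketch of how such a mapping can be checked mechanically, the snippet below encodes the recommended combination for release 1.5.0 taken from the tables above; the names `RECOMMENDED` and `check_stack` are hypothetical and not part of MindSpore Transformers or MindIE.

```python
# Hypothetical helper (not part of MindSpore Transformers): encodes the
# version-mapping tables from the docs so an environment can be validated
# in one place. Versions below are copied from the recommended-package and
# MindIE/CANN tables for release 1.5.0.

RECOMMENDED = {
    "1.5.0": {
        "mindspore": "2.6.0-rc1",
        "cann": "8.1.RC1",
        "driver": "25.0.RC1",
        "mindie": "2.0.RC1",
    },
}


def check_stack(mindformers: str, installed: dict) -> list:
    """Return human-readable mismatches between installed component versions
    and those recommended for the given MindSpore Transformers release."""
    expected = RECOMMENDED.get(mindformers)
    if expected is None:
        return [f"no recommendation recorded for MindSpore Transformers {mindformers}"]
    return [
        f"{name}: installed {installed.get(name, 'missing')}, recommended {version}"
        for name, version in expected.items()
        if installed.get(name) != version
    ]


if __name__ == "__main__":
    # Example environment with a stale CANN install; only that entry is flagged.
    env = {"mindspore": "2.6.0-rc1", "cann": "8.0.0",
           "driver": "25.0.RC1", "mindie": "2.0.RC1"}
    for issue in check_stack("1.5.0", env):
        print(issue)
```

Extending `RECOMMENDED` with the historical rows of the table would let the same check cover older releases such as 1.3.2.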