From 771456898c1d87515201c5e17ac48cf552110ed5 Mon Sep 17 00:00:00 2001
From: huan <3174348550@qq.com>
Date: Tue, 26 Nov 2024 10:05:52 +0800
Subject: [PATCH] modify the contents 2.0

---
 docs/serving/docs/source_en/serving_install.md    | 15 ++++++++-------
 docs/serving/docs/source_zh_cn/serving_install.md | 15 ++++++++-------
 2 files changed, 16 insertions(+), 14 deletions(-)

diff --git a/docs/serving/docs/source_en/serving_install.md b/docs/serving/docs/source_en/serving_install.md
index 16e8af065a..5f8cf285b5 100644
--- a/docs/serving/docs/source_en/serving_install.md
+++ b/docs/serving/docs/source_en/serving_install.md
@@ -6,19 +6,20 @@
 
 Currently, MindSpore Serving can be deployed only in the Linux environment.
 
-MindSpore Serving wheel packages are common to various hardware platforms(Nvidia GPU, Atlas training series, Atlas 200/300/500 inference product, Atlas inference series, CPU). The inference task depends on the MindSpore or MindSpore Lite inference framework. We need to select one of them as the Serving Inference backend. When these two inference backend both exist, Mindspore Lite inference framework will be used.
+MindSpore Serving wheel packages are common to various hardware platforms (Nvidia GPU, Atlas training series products, Atlas inference series products, Atlas 200/300/500 inference products, CPU). The inference task depends on the MindSpore or MindSpore Lite inference framework, and one of them must be selected as the Serving inference backend. When both inference backends are installed, the MindSpore Lite inference framework is used.
 
 MindSpore and MindSpore Lite have different build packages for different hardware platforms. The following table lists the target devices and model formats supported by each build package.
 
 |Inference backend|Build platform|Target device|Supported model formats|
 |---------| --- | --- | -------- |
 |MindSpore| Nvidia GPU | Nvidia GPU | `MindIR` |
-| | Ascend | Atlas training series | `MindIR` |
+| | Ascend | Atlas training series products | `MindIR` |
+| | | Atlas inference series products, Atlas 200/300/500 inference products | `MindIR`, `OM` |
 |MindSpore Lite| Nvidia GPU | Nvidia GPU, CPU | `MindIR_Lite` |
-| | Ascend | Atlas 200/300/500 inference product, Atlas inference series, CPU | `MindIR_Lite` |
+| | Ascend | Atlas inference series products, Atlas 200/300/500 inference products, CPU | `MindIR_Lite` |
 | | CPU | CPU | `MindIR_Lite` |
 
-When [MindSpore](https://www.mindspore.cn/) is used as the inference backend, MindSpore Serving supports the Atlas training series and Nvidia GPU environments. Atlas training series and GPU environment only supports the `MindIR` model format.
+When [MindSpore](https://www.mindspore.cn/) is used as the inference backend, MindSpore Serving supports the Atlas training series products, Atlas inference series products, Atlas 200/300/500 inference products and Nvidia GPU environments. The Atlas inference series products and Atlas 200/300/500 inference products environments support both the `OM` and `MindIR` model formats, while the Atlas training series products and GPU environments support only the `MindIR` model format.
 
 Due to the dependency between MindSpore Serving and MindSpore, please follow the table below, download and install the corresponding MindSpore verision from [MindSpore download page](https://www.mindspore.cn/versions/en).
 
@@ -31,13 +32,13 @@ Due to the dependency between MindSpore Serving and MindSpore, please follow the
 
 For details about how to install and configure MindSpore, see [Installing MindSpore](https://gitee.com/mindspore/mindspore/blob/master/README.md#installation) and [Configuring MindSpore](https://gitee.com/mindspore/docs/blob/master/install/mindspore_ascend_install_source_en.md#configuring-environment-variables).
 
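For the MindSpore backend, the matched install described above typically reduces to a pair of pip commands. A minimal sketch, assuming pip-based wheels and leaving the concrete versions to the download-page table (the package names below are placeholders; verify them against the wheels you actually download):

```shell
# Sketch only: install a MindSpore wheel matching your hardware platform,
# then the MindSpore Serving wheel of the corresponding version.
# Take both wheels (and exact version pairs) from the MindSpore download page.
pip install mindspore            # placeholder for the platform-specific wheel
pip install mindspore_serving    # placeholder for the matching Serving wheel

# Quick sanity check that both packages are importable.
python -c "import mindspore, mindspore_serving; print('ok')"
```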
-When [MindSpore Lite](https://www.mindspore.cn/lite) is used as the inference backend, MindSpore Serving supports Atlas 200/300/500 inference product, Atlas inference series, Nvidia GPU and CPU environments. Only the `MindIR_Lite` model formats is supported. Models of format `MindIR` exported from MindSpore or models of other frameworks need be be converted to `MindIR_Lite` format by the MindSpore Lite conversion tool. The `MindIR_Lite` models converted from `Ascend310` and `Ascend310P` environments are different, and the `MindIR_Lite` models must be running on the corresponding `Ascend310` or `Ascend310P` environments. `MindIR_Lite` models converted from Nvidia GPU and CPU environments can be running only in the Nvidia GPU and CPU environments.
+When [MindSpore Lite](https://www.mindspore.cn/lite) is used as the inference backend, MindSpore Serving supports Atlas inference series products, Atlas 200/300/500 inference products, Nvidia GPU and CPU environments. Only the `MindIR_Lite` model format is supported. Models in the `MindIR` format exported from MindSpore, as well as models from other frameworks, need to be converted to the `MindIR_Lite` format by the MindSpore Lite conversion tool. The `MindIR_Lite` models converted on Atlas 200/300/500 inference products and on Atlas inference series products differ, and each must run on the corresponding Atlas 200/300/500 inference products or Atlas inference series products environment. `MindIR_Lite` models converted in Nvidia GPU and CPU environments can run only in Nvidia GPU and CPU environments.
 
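The conversion requirement in the paragraph above is handled by the `converter_lite` tool shipped in the MindSpore Lite release package. A hedged sketch with placeholder file names; run it on the same device family that will later serve the model, since the resulting `MindIR_Lite` file is device-specific:

```shell
# Sketch only: convert a MindIR model exported from MindSpore into the
# MindIR_Lite format. File names are placeholders; check the flag spellings
# against the converter shipped with your installed Lite version.
./converter_lite \
    --fmk=MINDIR \
    --modelFile=resnet50.mindir \
    --outputFile=resnet50_lite
```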
 | Inference backend | Running environment of Lite conversion tool | Target device of `MindIR_Lite` models |
 | -------------- | ---------------- | --------------- |
 | MindSpore Lite | Nvidia GPU, CPU | Nvidia GPU, CPU |
-| | Atlas 200/300/500 inference product | Atlas 200/300/500 inference product |
-| | Atlas inference series | Atlas inference series |
+| | Atlas 200/300/500 inference products | Atlas 200/300/500 inference products |
+| | Atlas inference series products | Atlas inference series products |
 
 For details about how to compile and install MindSpore Lite, see the [MindSpore Lite Documentation](https://www.mindspore.cn/lite/docs/en/master/index.html). We should configure the environment variable `LD_LIBRARY_PATH` to indicates the installation path of `libmindspore-lite.so`.
 
diff --git a/docs/serving/docs/source_zh_cn/serving_install.md b/docs/serving/docs/source_zh_cn/serving_install.md
index 3d3f185af7..6e699a524e 100644
--- a/docs/serving/docs/source_zh_cn/serving_install.md
+++ b/docs/serving/docs/source_zh_cn/serving_install.md
@@ -6,19 +6,20 @@
 
 MindSpore Serving当前仅支持Linux环境部署。
 
-MindSpore Serving包在各类硬件平台(Nvidia GPU、Atlas训练系列产品、Atlas 200/300/500推理产品、Atlas推理系列产品、CPU)上通用,推理任务依赖MindSpore或MindSpore Lite推理框架,我们需要选择一个作为Serving推理后端。当这两个推理后端同时存在的时候,优先使用MindSpore Lite推理框架。
+MindSpore Serving包在各类硬件平台(Nvidia GPU、Atlas训练系列产品、Atlas推理系列产品、Atlas 200/300/500推理产品、CPU)上通用,推理任务依赖MindSpore或MindSpore Lite推理框架,我们需要选择一个作为Serving推理后端。当这两个推理后端同时存在的时候,优先使用MindSpore Lite推理框架。
 
 MindSpore和MindSpore Lite针对不同的硬件平台有不同的构建包,每个不同的构建包支持的运行目标设备和模型格式如下表所示:
 
 |推理后端|构建平台|运行目标设备|支持的模型格式|
 |---------| --- | --- | -------- |
 |MindSpore| Nvidia GPU | Nvidia GPU | `MindIR` |
-| | Ascend |Atlas训练系列产品 | `MindIR` |
-|MindSpore Lite| Nvidia GPU | Nvidia GPU、CPU | `MindIR_Lite` |
-| | Ascend | Atlas 200/300/500推理产品、Atlas推理系列产品、CPU | `MindIR_Lite` |
+| | Ascend | Atlas训练系列产品 | `MindIR` |
+| | | Atlas推理系列产品、Atlas 200/300/500推理产品 | `MindIR`、`OM` |
+|MindSpore Lite| Nvidia GPU | Nvidia GPU、CPU | `MindIR_Lite` |
+| | Ascend | Atlas推理系列产品、Atlas 200/300/500推理产品、CPU | `MindIR_Lite` |
 | | CPU | CPU | `MindIR_Lite` |
 
-当以[MindSpore](https://www.mindspore.cn/)作为推理后端时,MindSpore Serving当前支持Atlas训练系列产品和Nvidia GPU环境。Atlas训练系列产品和GPU环境仅支持`MindIR`模型格式。
+当以[MindSpore](https://www.mindspore.cn/)作为推理后端时,MindSpore Serving当前支持Atlas训练系列产品、Atlas推理系列产品、Atlas 200/300/500推理产品和Nvidia GPU环境。其中Atlas推理系列产品、Atlas 200/300/500推理产品环境支持`OM`和`MindIR`两种模型格式,Atlas训练系列产品和GPU环境仅支持`MindIR`模型格式。
 
 由于MindSpore Serving与MindSpore有依赖关系,请按照根据下表中所指示的对应关系,在[MindSpore下载页面](https://www.mindspore.cn/versions)下载并安装对应的whl包。
 
@@ -31,13 +32,13 @@ MindSpore和MindSpore Lite针对不同的硬件平台有不同的构建包,每
 
 MindSpore的安装和配置可以参考[安装MindSpore](https://gitee.com/mindspore/mindspore#安装),并根据需要完成[环境变量配置](https://gitee.com/mindspore/docs/blob/master/install/mindspore_ascend_install_pip.md#配置环境变量)。
 
-当以[MindSpore Lite](https://www.mindspore.cn/lite)作为推理后端时,MindSpore Serving当前支持Atlas 200/300/500推理产品、Atlas推理系列产品、Nvidia GPU和CPU。当前仅支持`MindIR_Lite`模型格式,MindSpore的`MindIR`或其他框架的模型文件需要通过Lite转换工具转换成`MindIR_Lite`模型格式。模型转换时,`Ascend310`设备和`Ascend310P`转换出的模型不一致,需要在对应的`Ascend310`或者`Ascend310P`设备上运行;Nvidia GPU和CPU环境转换成的`MindIR_Lite`模型仅能在Nvidia GPU和CPU使用。
+当以[MindSpore Lite](https://www.mindspore.cn/lite)作为推理后端时,MindSpore Serving当前支持Atlas推理系列产品、Atlas 200/300/500推理产品、Nvidia GPU和CPU。当前仅支持`MindIR_Lite`模型格式,MindSpore的`MindIR`或其他框架的模型文件需要通过Lite转换工具转换成`MindIR_Lite`模型格式。模型转换时,Atlas 200/300/500推理产品和Atlas推理系列产品转换出的模型不一致,需要在对应的Atlas 200/300/500推理产品或者Atlas推理系列产品设备上运行;Nvidia GPU和CPU环境转换成的`MindIR_Lite`模型仅能在Nvidia GPU和CPU使用。
 
 | 推理后端 | 转换工具运行平台 | `MindIR_Lite`模型运行设备 |
 | -------------- | ---------------- | --------------- |
 | MindSpore Lite | Nvidia GPU, CPU | Nvidia GPU, CPU |
 | | Atlas 200/300/500推理产品 | Atlas 200/300/500推理产品 |
-| | Atlas推理系列产品 | Atlas推理系列产品 |
+| | Atlas推理系列产品 | Atlas推理系列产品 |
 
 MindSpore Lite安装和配置可以参考[MindSpore Lite文档](https://www.mindspore.cn/lite/docs/zh-CN/master/index.html),通过环境变量`LD_LIBRARY_PATH`指示`libmindspore-lite.so`的安装路径。
-- 
Gitee
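Both language versions of the page close by pointing `LD_LIBRARY_PATH` at `libmindspore-lite.so`. A minimal sketch of that step, assuming an install prefix of `/usr/local/mindspore-lite` (the prefix and the `runtime/lib` layout are assumptions; adjust to wherever the Lite package was actually unpacked):

```shell
# Sketch only: make libmindspore-lite.so resolvable before starting Serving.
# The install prefix below is an assumed path; point LD_LIBRARY_PATH at the
# directory that actually contains libmindspore-lite.so.
export LITE_HOME=/usr/local/mindspore-lite
export LD_LIBRARY_PATH=${LITE_HOME}/runtime/lib:${LD_LIBRARY_PATH}

# Confirm the library is visible at that path.
ls "${LITE_HOME}/runtime/lib/libmindspore-lite.so"
```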