diff --git a/docs/serving/docs/source_en/serving_install.md b/docs/serving/docs/source_en/serving_install.md
index 16e8af065afeb56369c1e5937bec4ed205200e33..385e56cf885ce536b8ffdddb3262637d11506e9e 100644
--- a/docs/serving/docs/source_en/serving_install.md
+++ b/docs/serving/docs/source_en/serving_install.md
@@ -31,7 +31,7 @@ Due to the dependency between MindSpore Serving and MindSpore, please follow the
 
 For details about how to install and configure MindSpore, see [Installing MindSpore](https://gitee.com/mindspore/mindspore/blob/master/README.md#installation) and [Configuring MindSpore](https://gitee.com/mindspore/docs/blob/master/install/mindspore_ascend_install_source_en.md#configuring-environment-variables).
 
-When [MindSpore Lite](https://www.mindspore.cn/lite) is used as the inference backend, MindSpore Serving supports Atlas 200/300/500 inference product, Atlas inference series, Nvidia GPU and CPU environments. Only the `MindIR_Lite` model formats is supported. Models of format `MindIR` exported from MindSpore or models of other frameworks need be be converted to `MindIR_Lite` format by the MindSpore Lite conversion tool. The `MindIR_Lite` models converted from `Ascend310` and `Ascend310P` environments are different, and the `MindIR_Lite` models must be running on the corresponding `Ascend310` or `Ascend310P` environments. `MindIR_Lite` models converted from Nvidia GPU and CPU environments can be running only in the Nvidia GPU and CPU environments.
+When [MindSpore Lite](https://www.mindspore.cn/lite) is used as the inference backend, MindSpore Serving supports Atlas 200/300/500 inference product, Atlas inference series, Nvidia GPU and CPU environments. Only the `MindIR_Lite` model format is supported. Models in the `MindIR` format exported from MindSpore, or models from other frameworks, need to be converted to the `MindIR_Lite` format by the MindSpore Lite conversion tool.
 
 | Inference backend | Running environment of Lite conversion tool | Target device of `MindIR_Lite` models |
 | -------------- | ---------------- | --------------- |
diff --git a/docs/serving/docs/source_zh_cn/serving_install.md b/docs/serving/docs/source_zh_cn/serving_install.md
index 3d3f185af75c069f6ec6123ee3bc223e2884962f..9b1d9680a46969db12eb5771bffe586d23ac03b5 100644
--- a/docs/serving/docs/source_zh_cn/serving_install.md
+++ b/docs/serving/docs/source_zh_cn/serving_install.md
@@ -31,7 +31,7 @@ MindSpore和MindSpore Lite针对不同的硬件平台有不同的构建包，每
 
 MindSpore的安装和配置可以参考[安装MindSpore](https://gitee.com/mindspore/mindspore#安装)，并根据需要完成[环境变量配置](https://gitee.com/mindspore/docs/blob/master/install/mindspore_ascend_install_pip.md#配置环境变量)。
 
-当以[MindSpore Lite](https://www.mindspore.cn/lite)作为推理后端时，MindSpore Serving当前支持Atlas 200/300/500推理产品、Atlas推理系列产品、Nvidia GPU和CPU。当前仅支持`MindIR_Lite`模型格式，MindSpore的`MindIR`或其他框架的模型文件需要通过Lite转换工具转换成`MindIR_Lite`模型格式。模型转换时，`Ascend310`设备和`Ascend310P`转换出的模型不一致，需要在对应的`Ascend310`或者`Ascend310P`设备上运行；Nvidia GPU和CPU环境转换成的`MindIR_Lite`模型仅能在Nvidia GPU和CPU使用。
+当以[MindSpore Lite](https://www.mindspore.cn/lite)作为推理后端时，MindSpore Serving当前支持Atlas 200/300/500推理产品、Atlas推理系列产品、Nvidia GPU和CPU。当前仅支持`MindIR_Lite`模型格式，MindSpore的`MindIR`或其他框架的模型文件需要通过Lite转换工具转换成`MindIR_Lite`模型格式。
 
 | 推理后端 | 转换工具运行平台 | `MindIR_Lite`模型运行设备 |
 | -------------- | ---------------- | --------------- |
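
Not part of the patch itself, just a hedged sketch of the conversion step that both hunks describe: turning a `MindIR` model into the `MindIR_Lite` format with the MindSpore Lite conversion tool. The binary name `converter_lite` and the file names below are illustrative assumptions; the exact flags and package layout depend on the MindSpore Lite version and on the platform row chosen from the table the docs add.

```shell
# Hypothetical example: run the conversion tool on the platform that matches the
# intended target device, since a MindIR_Lite model only runs on the device type
# it was converted for (see the backend/platform/device table in the docs).
./converter_lite \
    --fmk=MINDIR \
    --modelFile=model.mindir \
    --outputFile=model_lite
# The converter writes the MindIR_Lite model (e.g. model_lite.ms), which is the
# file MindSpore Serving loads when Lite is used as the inference backend.
```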