diff --git a/docs/mindarmour/docs/source_en/concept_drift_images.md b/docs/mindarmour/docs/source_en/concept_drift_images.md
index c297c97d412c6142e3e1c15d0bd21c7f3e3e68ea..1871a370c7db9b5805573724dbf4d2ed51ae8020 100644
--- a/docs/mindarmour/docs/source_en/concept_drift_images.md
+++ b/docs/mindarmour/docs/source_en/concept_drift_images.md
@@ -16,7 +16,7 @@ This example provides a method for detecting a distribution change of image data
 5. Execute the concept drift detection function.
 6. View the execution result.

-> This example is for CPUs, GPUs, and Ascend 910 AI processors. Currently only supports GRAPH_MODE. You can download the complete sample code at .
+> This example is for CPUs and GPUs. Currently, only GRAPH_MODE is supported. You can download the complete sample code at .

 ## Preparations

diff --git a/docs/mindarmour/docs/source_en/improve_model_security_nad.md b/docs/mindarmour/docs/source_en/improve_model_security_nad.md
index c6d5a1321df0196356d0f2dd96d05a7a0ea08614..1c945fc9f3ee2ee34a1daa3fa25c092496fd0d1d 100644
--- a/docs/mindarmour/docs/source_en/improve_model_security_nad.md
+++ b/docs/mindarmour/docs/source_en/improve_model_security_nad.md
@@ -14,7 +14,7 @@ At the beginning of AI algorithm design, related security threats are sometimes
 This section describes how to use MindSpore Armour in adversarial attack and defense by taking the Fast Gradient Sign Method (FGSM) attack algorithm and Natural Adversarial Defense (NAD) algorithm as examples.

-> The current sample is for CPU, GPU and Ascend 910 AI processor. You can find the complete executable sample code at
+> The current sample is for CPU and GPU. You can find the complete executable sample code at
 >

 ## Creating an Target Model

diff --git a/docs/mindarmour/docs/source_en/protect_user_privacy_with_differential_privacy.md b/docs/mindarmour/docs/source_en/protect_user_privacy_with_differential_privacy.md
index 80d93112e01d501e75570d6507f4b3ff9febe02b..3da0e3e44e69da2c723a92a07db3ed03a09f2ec4 100644
--- a/docs/mindarmour/docs/source_en/protect_user_privacy_with_differential_privacy.md
+++ b/docs/mindarmour/docs/source_en/protect_user_privacy_with_differential_privacy.md
@@ -29,7 +29,7 @@ MindSpore Armour differential privacy module Differential-Privacy implements the

 The LeNet model and MNIST dataset are used as an example to describe how to use the differential privacy optimizer to train a neural network model on MindSpore.

-> Because of the limit of CPU APIs, differential privacy training can only run on GPU or Ascend, except for CPU. This example is for the Ascend 910 AI processor. You can download the complete sample code from .
+> Because of CPU API limitations, differential privacy training can run only on GPU or Ascend, not on CPU. You can download the complete sample code from .

 ## Implementation

diff --git a/docs/mindarmour/docs/source_en/protect_user_privacy_with_suppress_privacy.md b/docs/mindarmour/docs/source_en/protect_user_privacy_with_suppress_privacy.md
index 4589e7d0f0a789a346822e05c2d3fee4472a0be0..545c75237f7f234b8c2c6f89b3b5357cdf6c7734 100644
--- a/docs/mindarmour/docs/source_en/protect_user_privacy_with_suppress_privacy.md
+++ b/docs/mindarmour/docs/source_en/protect_user_privacy_with_suppress_privacy.md
@@ -14,8 +14,6 @@ Suppress-Privacy, a Suppress-Privacy module in MindSpore Armour, implements a su
 Here is an example showing that how to train a neural network model in MindSpore using the LeNet model, MNIST dataset, and the SuppressourPrivacy optimizer.

-> This example is for the Ascend 910 AI processor and you can download the full sample code at
-

 ## Implementation

 ### Importing Library Files
diff --git a/docs/mindarmour/docs/source_en/test_model_security_fuzzing.md b/docs/mindarmour/docs/source_en/test_model_security_fuzzing.md
index ad45dcc0d29eaef9cec9f882032b0dd1c7a4fbc0..4fc3160b73f26d60f182d797f85e4ef9dd839d8f 100644
--- a/docs/mindarmour/docs/source_en/test_model_security_fuzzing.md
+++ b/docs/mindarmour/docs/source_en/test_model_security_fuzzing.md
@@ -10,7 +10,7 @@ The fuzz testing module of MindSpore Armour uses the neuron coverage rate as the

 The LeNet model and MNIST dataset are used as an example to describe how to use Fuzz testing.

-> This example is for CPUs, GPUs, and Ascend 910 AI processors. Currently only supports GRAPH_MODE. You can download the complete sample code at .
+> This example is for CPUs and GPUs. Currently, only GRAPH_MODE is supported. You can download the complete sample code at .

 ## Implementation
diff --git a/docs/mindarmour/docs/source_en/test_model_security_membership_inference.md b/docs/mindarmour/docs/source_en/test_model_security_membership_inference.md
index 2bfa0d5e152df8dd0c4261bc8ad04e01d8586172..a71d602622a2bc987565c0a520e82b53d5fcca3e 100644
--- a/docs/mindarmour/docs/source_en/test_model_security_membership_inference.md
+++ b/docs/mindarmour/docs/source_en/test_model_security_membership_inference.md
@@ -10,10 +10,6 @@ In machine learning and deep learning, if an attacker has some access permission
 The following uses a VGG16 model and CIFAR-100 dataset as an example to describe how to use membership inference to perform model privacy security evaluation. This tutorial uses pre-trained model parameters for demonstration. This following describes only the model structure, parameter settings, and dataset preprocessing method.

-> This example is for the Ascend 910 AI Processor. You can download the complete sample code in the following link:
->
->
-

 ## Implementation

 ### Importing Library Files
diff --git a/docs/mindarmour/docs/source_zh_cn/concept_drift_images.md b/docs/mindarmour/docs/source_zh_cn/concept_drift_images.md
index c6ba0ed5bc42e3776005b39edc067c4d62243f1b..b2e04a50520ab4feff8da766e06c550660d0be86 100644
--- a/docs/mindarmour/docs/source_zh_cn/concept_drift_images.md
+++ b/docs/mindarmour/docs/source_zh_cn/concept_drift_images.md
@@ -17,7 +17,7 @@
 5. 执行概念漂移检测函数。
 6. 查看结果。

-> 本例面向CPU、GPU、Ascend 910 AI处理器,目前仅支持GRAPH_MODE。你可以在这里找到完整可运行的样例代码:。
+> 本例面向CPU、GPU,目前仅支持GRAPH_MODE。你可以在这里找到完整可运行的样例代码:。

 ## 准备环节

diff --git a/docs/mindarmour/docs/source_zh_cn/improve_model_security_nad.md b/docs/mindarmour/docs/source_zh_cn/improve_model_security_nad.md
index ef048144062d3c1c3de9c9cab59e462abcf17079..0371f6e89614e93a5c38bac77ee5b7ed0842b98b 100644
--- a/docs/mindarmour/docs/source_zh_cn/improve_model_security_nad.md
+++ b/docs/mindarmour/docs/source_zh_cn/improve_model_security_nad.md
@@ -14,7 +14,7 @@ AI算法设计之初普遍未考虑相关的安全威胁,使得AI算法的判
 这里通过图像分类任务上的对抗性攻防,以攻击算法FGSM和防御算法NAD为例,介绍MindSpore Armour在对抗攻防上的使用方法。

-> 本例面向CPU、GPU、Ascend 910 AI处理器,你可以在这里下载完整的样例代码:
+> 本例面向CPU、GPU,你可以在这里下载完整的样例代码:
 >

 ## 建立被攻击模型

diff --git a/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_differential_privacy.md b/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_differential_privacy.md
index 8131c66d817cb435502e0e44ae37533154751c4d..49e1c8beb2b692e42ad4c788a6b04b6f4b69e1d6 100644
--- a/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_differential_privacy.md
+++ b/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_differential_privacy.md
@@ -30,7 +30,6 @@ MindSpore Armour的差分隐私模块Differential-Privacy,实现了差分隐
 这里以LeNet模型,MNIST 数据集为例,说明如何在MindSpore上使用差分隐私优化器训练神经网络模型。

 > 由于API支持的限制,差分隐私训练目前只支持在GPU或者Ascend服务器上面进行,不支持CPU。
-本例面向Ascend 910 AI处理器,你可以在这里下载完整的样例代码:

 ## 实现阶段

diff --git a/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_suppress_privacy.md b/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_suppress_privacy.md
index 41722a8bb6138433e39616d35df68126a83c7763..c2cc7b506fb68fceb244dff091c61b25906fb959 100644
--- a/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_suppress_privacy.md
+++ b/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_suppress_privacy.md
@@ -12,8 +12,6 @@ MindSpore Armour的抑制隐私模块Suppress-Privacy,实现了抑制隐私优
 这里以LeNet模型,MNIST 数据集为例,说明如何在MindSpore上使用抑制隐私优化器训练神经网络模型。

-> 本例面向Ascend 910 AI处理器,你可以在这里下载完整的样例代码:
-

 ## 实现阶段

 ### 导入需要的库文件
diff --git a/docs/mindarmour/docs/source_zh_cn/test_model_security_fuzzing.md b/docs/mindarmour/docs/source_zh_cn/test_model_security_fuzzing.md
index 2e0c7ad17e9be7662f622b90eac9017a36331aac..e0f6a7670ad02699b2b1f536668bc0e175809a91 100644
--- a/docs/mindarmour/docs/source_zh_cn/test_model_security_fuzzing.md
+++ b/docs/mindarmour/docs/source_zh_cn/test_model_security_fuzzing.md
@@ -10,8 +10,6 @@ MindSpore Armour的fuzz_testing模块以神经元覆盖率作为测试评价准
 这里以LeNet模型,MNIST数据集为例,说明如何使用Fuzzer。

-> 本例面向CPU、GPU、Ascend 910 AI处理器,目前仅支持GRAPH_MODE。你可以在这里下载完整的样例代码:
-

 ## 实现阶段

 ### 导入需要的库文件
diff --git a/docs/mindarmour/docs/source_zh_cn/test_model_security_membership_inference.md b/docs/mindarmour/docs/source_zh_cn/test_model_security_membership_inference.md
index 985104da54b454a76afb8512d1af82fcbfe16f1c..9d2e938a282f95c3d33f0ef7b27ebb08f459017c 100644
--- a/docs/mindarmour/docs/source_zh_cn/test_model_security_membership_inference.md
+++ b/docs/mindarmour/docs/source_zh_cn/test_model_security_membership_inference.md
@@ -10,10 +10,6 @@
 这里以VGG16模型,CIFAR-100数据集为例,说明如何使用MembershipInference进行模型隐私安全评估。本教程使用预训练的模型参数进行演示,这里仅给出模型结构、参数设置和数据集预处理方式。

->本例面向Ascend 910处理器,您可以在这里下载完整的样例代码:
->
->
-

 ## 实现阶段

 ### 导入需要的库文件
diff --git a/docs/serving/docs/source_en/serving_distributed_example.md b/docs/serving/docs/source_en/serving_distributed_example.md
index 667d82903b74865c30e1560a0aedea53e4ced450..fdbb67b6c5c03d386a5f177560b992e7aa5578f7 100644
--- a/docs/serving/docs/source_en/serving_distributed_example.md
+++ b/docs/serving/docs/source_en/serving_distributed_example.md
@@ -21,7 +21,6 @@ Currently, the distributed model has the following restrictions:
 - The model of the first stage receives the same input data.
 - The models of other stages do not receive data.
 - All models of the latter stage return the same data.
-- Only Ascend 910 inference is supported.

 The following uses a simple distributed network MatMul as an example to demonstrate the deployment process.

diff --git a/docs/serving/docs/source_en/serving_install.md b/docs/serving/docs/source_en/serving_install.md
index 76360ca6e9c445cf0d9e7bd0a4c9fbe48c3267be..9c9f24b6ff57fe0cd767c7189cbc8d96c076ee45 100644
--- a/docs/serving/docs/source_en/serving_install.md
+++ b/docs/serving/docs/source_en/serving_install.md
@@ -6,20 +6,17 @@

 Currently, MindSpore Serving can be deployed only in the Linux environment.

-MindSpore Serving wheel packages are common to various hardware platforms(Nvidia GPU, Ascend 910/310P/310, CPU). The inference task depends on the MindSpore or MindSpore Lite inference framework. We need to select one of them as the Serving Inference backend. When these two inference backend both exist, Mindspore Lite inference framework will be used.
+MindSpore Serving wheel packages are common to various hardware platforms (Nvidia GPU, CPU). The inference task depends on the MindSpore or MindSpore Lite inference framework, so one of them must be selected as the Serving inference backend. When both inference backends are present, the MindSpore Lite inference framework is used.

 MindSpore and MindSpore Lite have different build packages for different hardware platforms. The following table lists the target devices and model formats supported by each build package.

 |Inference backend|Build platform|Target device|Supported model formats|
 |---------| --- | --- | -------- |
 |MindSpore| Nvidia GPU | Nvidia GPU | `MindIR` |
-| | Ascend | Ascend 910 | `MindIR` |
-| | | Ascend 310P/310 | `MindIR`, `OM` |
 |MindSpore Lite| Nvidia GPU | Nvidia GPU, CPU | `MindIR_Lite` |
-| | Ascend | Ascend 310P/310, CPU | `MindIR_Lite` |
 | | CPU | CPU | `MindIR_Lite` |

-When [MindSpore](https://www.mindspore.cn/) is used as the inference backend, MindSpore Serving supports the Ascend 910/310P/310 and Nvidia GPU environments. The Ascend 310P/310 environment supports both `OM` and `MindIR` model formats, and the Ascend 910 and GPU environment only supports the `MindIR` model format.
+When [MindSpore](https://www.mindspore.cn/) is used as the inference backend, MindSpore Serving supports the Nvidia GPU environment, which supports only the `MindIR` model format.

 Due to the dependency between MindSpore Serving and MindSpore, please follow the table below, download and install the corresponding MindSpore verision from [MindSpore download page](https://www.mindspore.cn/versions/en).
@@ -32,13 +29,11 @@

 For details about how to install and configure MindSpore, see [Installing MindSpore](https://gitee.com/mindspore/mindspore/blob/r2.0/README.md#installation) and [Configuring MindSpore](https://gitee.com/mindspore/docs/blob/r2.0/install/mindspore_ascend_install_source_en.md#configuring-environment-variables).

-When [MindSpore Lite](https://www.mindspore.cn/lite) is used as the inference backend, MindSpore Serving supports Ascend 310P/310, Nvidia GPU and CPU environments. Only the `MindIR_Lite` model formats is supported. Models of format `MindIR` exported from MindSpore or models of other frameworks need be be converted to `MindIR_Lite` format by the MindSpore Lite conversion tool. The `MindIR_Lite` models converted from `Ascend310` and `Ascend310P` environments are different, and the `MindIR_Lite` models must be running on the corresponding `Ascend310` or `Ascend310P` environments. `MindIR_Lite` models converted from Nvidia GPU and CPU environments can be running only in the Nvidia GPU and CPU environments.
+When [MindSpore Lite](https://www.mindspore.cn/lite) is used as the inference backend, MindSpore Serving supports the Nvidia GPU and CPU environments. Only the `MindIR_Lite` model format is supported. `MindIR` models exported from MindSpore, as well as models from other frameworks, need to be converted to the `MindIR_Lite` format by the MindSpore Lite conversion tool. `MindIR_Lite` models converted in the Nvidia GPU and CPU environments can run only in the Nvidia GPU and CPU environments.

 | Inference backend | Running environment of Lite conversion tool | Target device of `MindIR_Lite` models |
 | -------------- | ---------------- | --------------- |
 | MindSpore Lite | Nvidia GPU, CPU | Nvidia GPU, CPU |
-| | Ascend 310 | Ascend 310 |
-| | Ascend 310P | Ascend 310P |

 For details about how to compile and install MindSpore Lite, see the [MindSpore Lite Documentation](https://www.mindspore.cn/lite/docs/en/r2.0/index.html). We should configure the environment variable `LD_LIBRARY_PATH` to indicates the installation path of `libmindspore-lite.so`.
diff --git a/docs/serving/docs/source_zh_cn/serving_distributed_example.md b/docs/serving/docs/source_zh_cn/serving_distributed_example.md
index ff71189c2abd88a9c90e9762dff004c98088e6ff..5a385b01b8ba55185641816bfe6e7d6f819acd9d 100644
--- a/docs/serving/docs/source_zh_cn/serving_distributed_example.md
+++ b/docs/serving/docs/source_zh_cn/serving_distributed_example.md
@@ -19,7 +19,6 @@
 - 第一个stage的模型接收相同的输入数据。
 - 其他的stage的模型不接收数据。
 - 最后一个stage的所有模型都返回相同的数据。
-- 仅支持Ascend 910推理。

 下面以一个简单的分布式网络MatMul为例,演示部署流程。

diff --git a/docs/serving/docs/source_zh_cn/serving_install.md b/docs/serving/docs/source_zh_cn/serving_install.md
index c35701be4fad615116fbe76208df4109c2a98aaa..356c5a3a1bd0e1838de6f69cfe072aff2e74be2b 100644
--- a/docs/serving/docs/source_zh_cn/serving_install.md
+++ b/docs/serving/docs/source_zh_cn/serving_install.md
@@ -6,20 +6,17 @@

 MindSpore Serving当前仅支持Linux环境部署。

-MindSpore Serving包在各类硬件平台(Nvidia GPU, Ascend 910/310P/310, CPU)上通用,推理任务依赖MindSpore或MindSpore Lite推理框架,我们需要选择一个作为Serving推理后端。当这两个推理后端同时存在的时候,优先使用MindSpore Lite推理框架。
+MindSpore Serving包在各类硬件平台(Nvidia GPU, CPU)上通用,推理任务依赖MindSpore或MindSpore Lite推理框架,我们需要选择一个作为Serving推理后端。当这两个推理后端同时存在的时候,优先使用MindSpore Lite推理框架。

 MindSpore和MindSpore Lite针对不同的硬件平台有不同的构建包,每个不同的构建包支持的运行目标设备和模型格式如下表所示:

 |推理后端|构建平台|运行目标设备|支持的模型格式|
 |---------| --- | --- | -------- |
 |MindSpore| Nvidia GPU | Nvidia GPU | `MindIR` |
-| | Ascend | Ascend 910 | `MindIR` |
-| | | Ascend 310P/310 | `MindIR`, `OM` |
 |MindSpore Lite| Nvidia GPU | Nvidia GPU, CPU | `MindIR_Lite` |
-| | Ascend | Ascend 310P/310, CPU | `MindIR_Lite` |
 | | CPU | CPU | `MindIR_Lite` |

-当以[MindSpore](https://www.mindspore.cn/)作为推理后端时,MindSpore Serving当前支持Ascend 910/310P/310和Nvidia GPU环境。其中Ascend 310P/310环境支持`OM`和`MindIR`两种模型格式,Ascend 910和GPU环境仅支持`MindIR`模型格式。
+当以[MindSpore](https://www.mindspore.cn/)作为推理后端时,MindSpore Serving当前支持Nvidia GPU环境。GPU环境仅支持`MindIR`模型格式。

 由于MindSpore Serving与MindSpore有依赖关系,请按照根据下表中所指示的对应关系,在[MindSpore下载页面](https://www.mindspore.cn/versions)下载并安装对应的whl包。
@@ -32,13 +29,11 @@ MindSpore和MindSpore Lite针对不同的硬件平台有不同的构建包,每

 MindSpore的安装和配置可以参考[安装MindSpore](https://gitee.com/mindspore/mindspore#安装),并根据需要完成[环境变量配置](https://gitee.com/mindspore/docs/blob/r2.0/install/mindspore_ascend_install_pip.md#配置环境变量)。

-当以[MindSpore Lite](https://www.mindspore.cn/lite)作为推理后端时,MindSpore Serving当前支持Ascend 310P/310、Nvidia GPU和CPU。当前仅支持`MindIR_Lite`模型格式,MindSpore的`MindIR`或其他框架的模型文件需要通过Lite转换工具转换成`MindIR_Lite`模型格式。模型转换时,`Ascend310`设备和`Ascend310P`转换出的模型不一致,需要在对应的`Ascend310`或者`Ascend310P`设备上运行;Nvidia GPU和CPU环境转换成的`MindIR_Lite`模型仅能在Nvidia GPU和CPU使用。
+当以[MindSpore Lite](https://www.mindspore.cn/lite)作为推理后端时,MindSpore Serving当前支持Nvidia GPU和CPU。当前仅支持`MindIR_Lite`模型格式,MindSpore的`MindIR`或其他框架的模型文件需要通过Lite转换工具转换成`MindIR_Lite`模型格式。Nvidia GPU和CPU环境转换成的`MindIR_Lite`模型仅能在Nvidia GPU和CPU使用。

 | 推理后端 | 转换工具运行平台 | `MindIR_Lite`模型运行设备 |
 | -------------- | ---------------- | --------------- |
 | MindSpore Lite | Nvidia GPU, CPU | Nvidia GPU, CPU |
-| | Ascend 310 | Ascend 310 |
-| | Ascend 310P | Ascend 310P |

 MindSpore Lite安装和配置可以参考[MindSpore Lite文档](https://www.mindspore.cn/lite/docs/zh-CN/r2.0/index.html),通过环境变量`LD_LIBRARY_PATH`指示`libmindspore-lite.so`的安装路径。