From 6567585bd099d06c436f22c1dd02d00b1897e607 Mon Sep 17 00:00:00 2001 From: huan <3174348550@qq.com> Date: Tue, 26 Nov 2024 09:14:19 +0800 Subject: [PATCH] delete contents 2.0 --- .../mindarmour/docs/source_en/concept_drift_images.md | 2 +- .../docs/source_en/improve_model_security_nad.md | 2 +- .../protect_user_privacy_with_differential_privacy.md | 2 +- .../protect_user_privacy_with_suppress_privacy.md | 2 -- .../docs/source_en/test_model_security_fuzzing.md | 2 +- .../test_model_security_membership_inference.md | 4 ---- .../docs/source_zh_cn/concept_drift_images.md | 2 +- .../docs/source_zh_cn/improve_model_security_nad.md | 2 +- .../protect_user_privacy_with_differential_privacy.md | 1 - .../protect_user_privacy_with_suppress_privacy.md | 2 -- .../docs/source_zh_cn/test_model_security_fuzzing.md | 2 -- .../test_model_security_membership_inference.md | 4 ---- .../docs/source_en/serving_distributed_example.md | 1 - docs/serving/docs/source_en/serving_install.md | 11 +++-------- .../docs/source_zh_cn/serving_distributed_example.md | 1 - docs/serving/docs/source_zh_cn/serving_install.md | 11 +++-------- 16 files changed, 12 insertions(+), 39 deletions(-) diff --git a/docs/mindarmour/docs/source_en/concept_drift_images.md b/docs/mindarmour/docs/source_en/concept_drift_images.md index c297c97d41..1871a370c7 100644 --- a/docs/mindarmour/docs/source_en/concept_drift_images.md +++ b/docs/mindarmour/docs/source_en/concept_drift_images.md @@ -16,7 +16,7 @@ This example provides a method for detecting a distribution change of image data 5. Execute the concept drift detection function. 6. View the execution result. -> This example is for CPUs, GPUs, and Ascend 910 AI processors. Currently only supports GRAPH_MODE. You can download the complete sample code at . +> This example is for CPUs and GPUs. Currently, only GRAPH_MODE is supported. You can download the complete sample code at .
## Preparations diff --git a/docs/mindarmour/docs/source_en/improve_model_security_nad.md b/docs/mindarmour/docs/source_en/improve_model_security_nad.md index c6d5a1321d..1c945fc9f3 100644 --- a/docs/mindarmour/docs/source_en/improve_model_security_nad.md +++ b/docs/mindarmour/docs/source_en/improve_model_security_nad.md @@ -14,7 +14,7 @@ At the beginning of AI algorithm design, related security threats are sometimes This section describes how to use MindSpore Armour in adversarial attack and defense by taking the Fast Gradient Sign Method (FGSM) attack algorithm and Natural Adversarial Defense (NAD) algorithm as examples. -> The current sample is for CPU, GPU and Ascend 910 AI processor. You can find the complete executable sample code at +> The current sample is for CPU and GPU. You can find the complete executable sample code at > ## Creating an Target Model diff --git a/docs/mindarmour/docs/source_en/protect_user_privacy_with_differential_privacy.md b/docs/mindarmour/docs/source_en/protect_user_privacy_with_differential_privacy.md index 80d93112e0..3da0e3e44e 100644 --- a/docs/mindarmour/docs/source_en/protect_user_privacy_with_differential_privacy.md +++ b/docs/mindarmour/docs/source_en/protect_user_privacy_with_differential_privacy.md @@ -29,7 +29,7 @@ MindSpore Armour differential privacy module Differential-Privacy implements the The LeNet model and MNIST dataset are used as an example to describe how to use the differential privacy optimizer to train a neural network model on MindSpore. -> Because of the limit of CPU APIs, differential privacy training can only run on GPU or Ascend, except for CPU. This example is for the Ascend 910 AI processor. You can download the complete sample code from . +> Due to API limitations, differential privacy training can run only on GPU or Ascend; it is not supported on CPU. You can download the complete sample code from .
## Implementation diff --git a/docs/mindarmour/docs/source_en/protect_user_privacy_with_suppress_privacy.md b/docs/mindarmour/docs/source_en/protect_user_privacy_with_suppress_privacy.md index 4589e7d0f0..545c75237f 100644 --- a/docs/mindarmour/docs/source_en/protect_user_privacy_with_suppress_privacy.md +++ b/docs/mindarmour/docs/source_en/protect_user_privacy_with_suppress_privacy.md @@ -14,8 +14,6 @@ Suppress-Privacy, a Suppress-Privacy module in MindSpore Armour, implements a su Here is an example showing that how to train a neural network model in MindSpore using the LeNet model, MNIST dataset, and the SuppressourPrivacy optimizer. -> This example is for the Ascend 910 AI processor and you can download the full sample code at - ## Implementation ### Importing Library Files diff --git a/docs/mindarmour/docs/source_en/test_model_security_fuzzing.md b/docs/mindarmour/docs/source_en/test_model_security_fuzzing.md index ad45dcc0d2..4fc3160b73 100644 --- a/docs/mindarmour/docs/source_en/test_model_security_fuzzing.md +++ b/docs/mindarmour/docs/source_en/test_model_security_fuzzing.md @@ -10,7 +10,7 @@ The fuzz testing module of MindSpore Armour uses the neuron coverage rate as the The LeNet model and MNIST dataset are used as an example to describe how to use Fuzz testing. -> This example is for CPUs, GPUs, and Ascend 910 AI processors. Currently only supports GRAPH_MODE. You can download the complete sample code at . +> This example is for CPUs and GPUs. Currently, only GRAPH_MODE is supported. You can download the complete sample code at .
## Implementation diff --git a/docs/mindarmour/docs/source_en/test_model_security_membership_inference.md b/docs/mindarmour/docs/source_en/test_model_security_membership_inference.md index 2bfa0d5e15..a71d602622 100644 --- a/docs/mindarmour/docs/source_en/test_model_security_membership_inference.md +++ b/docs/mindarmour/docs/source_en/test_model_security_membership_inference.md @@ -10,10 +10,6 @@ In machine learning and deep learning, if an attacker has some access permission The following uses a VGG16 model and CIFAR-100 dataset as an example to describe how to use membership inference to perform model privacy security evaluation. This tutorial uses pre-trained model parameters for demonstration. This following describes only the model structure, parameter settings, and dataset preprocessing method. -> This example is for the Ascend 910 AI Processor. You can download the complete sample code in the following link: -> -> - ## Implementation ### Importing Library Files diff --git a/docs/mindarmour/docs/source_zh_cn/concept_drift_images.md b/docs/mindarmour/docs/source_zh_cn/concept_drift_images.md index c6ba0ed5bc..b2e04a5052 100644 --- a/docs/mindarmour/docs/source_zh_cn/concept_drift_images.md +++ b/docs/mindarmour/docs/source_zh_cn/concept_drift_images.md @@ -17,7 +17,7 @@ 5. 执行概念漂移检测函数。 6. 
查看结果。 -> 本例面向CPU、GPU、Ascend 910 AI处理器,目前仅支持GRAPH_MODE。你可以在这里找到完整可运行的样例代码:。 +> 本例面向CPU、GPU,目前仅支持GRAPH_MODE。你可以在这里找到完整可运行的样例代码:。 ## 准备环节 diff --git a/docs/mindarmour/docs/source_zh_cn/improve_model_security_nad.md b/docs/mindarmour/docs/source_zh_cn/improve_model_security_nad.md index ef04814406..0371f6e896 100644 --- a/docs/mindarmour/docs/source_zh_cn/improve_model_security_nad.md +++ b/docs/mindarmour/docs/source_zh_cn/improve_model_security_nad.md @@ -14,7 +14,7 @@ AI算法设计之初普遍未考虑相关的安全威胁,使得AI算法的判 这里通过图像分类任务上的对抗性攻防,以攻击算法FGSM和防御算法NAD为例,介绍MindSpore Armour在对抗攻防上的使用方法。 -> 本例面向CPU、GPU、Ascend 910 AI处理器,你可以在这里下载完整的样例代码: +> 本例面向CPU、GPU,你可以在这里下载完整的样例代码: > ## 建立被攻击模型 diff --git a/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_differential_privacy.md b/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_differential_privacy.md index 8131c66d81..49e1c8beb2 100644 --- a/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_differential_privacy.md +++ b/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_differential_privacy.md @@ -30,7 +30,6 @@ MindSpore Armour的差分隐私模块Differential-Privacy,实现了差分隐 这里以LeNet模型,MNIST 数据集为例,说明如何在MindSpore上使用差分隐私优化器训练神经网络模型。 > 由于API支持的限制,差分隐私训练目前只支持在GPU或者Ascend服务器上面进行,不支持CPU。 -本例面向Ascend 910 AI处理器,你可以在这里下载完整的样例代码: ## 实现阶段 diff --git a/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_suppress_privacy.md b/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_suppress_privacy.md index 41722a8bb6..c2cc7b506f 100644 --- a/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_suppress_privacy.md +++ b/docs/mindarmour/docs/source_zh_cn/protect_user_privacy_with_suppress_privacy.md @@ -12,8 +12,6 @@ MindSpore Armour的抑制隐私模块Suppress-Privacy,实现了抑制隐私优 这里以LeNet模型,MNIST 数据集为例,说明如何在MindSpore上使用抑制隐私优化器训练神经网络模型。 -> 本例面向Ascend 910 AI处理器,你可以在这里下载完整的样例代码: - ## 实现阶段 ### 导入需要的库文件 diff --git a/docs/mindarmour/docs/source_zh_cn/test_model_security_fuzzing.md 
b/docs/mindarmour/docs/source_zh_cn/test_model_security_fuzzing.md index 2e0c7ad17e..e0f6a7670a 100644 --- a/docs/mindarmour/docs/source_zh_cn/test_model_security_fuzzing.md +++ b/docs/mindarmour/docs/source_zh_cn/test_model_security_fuzzing.md @@ -10,8 +10,6 @@ MindSpore Armour的fuzz_testing模块以神经元覆盖率作为测试评价准 这里以LeNet模型,MNIST数据集为例,说明如何使用Fuzzer。 -> 本例面向CPU、GPU、Ascend 910 AI处理器,目前仅支持GRAPH_MODE。你可以在这里下载完整的样例代码: - ## 实现阶段 ### 导入需要的库文件 diff --git a/docs/mindarmour/docs/source_zh_cn/test_model_security_membership_inference.md b/docs/mindarmour/docs/source_zh_cn/test_model_security_membership_inference.md index 985104da54..9d2e938a28 100644 --- a/docs/mindarmour/docs/source_zh_cn/test_model_security_membership_inference.md +++ b/docs/mindarmour/docs/source_zh_cn/test_model_security_membership_inference.md @@ -10,10 +10,6 @@ 这里以VGG16模型,CIFAR-100数据集为例,说明如何使用MembershipInference进行模型隐私安全评估。本教程使用预训练的模型参数进行演示,这里仅给出模型结构、参数设置和数据集预处理方式。 ->本例面向Ascend 910处理器,您可以在这里下载完整的样例代码: -> -> - ## 实现阶段 ### 导入需要的库文件 diff --git a/docs/serving/docs/source_en/serving_distributed_example.md b/docs/serving/docs/source_en/serving_distributed_example.md index 667d82903b..fdbb67b6c5 100644 --- a/docs/serving/docs/source_en/serving_distributed_example.md +++ b/docs/serving/docs/source_en/serving_distributed_example.md @@ -21,7 +21,6 @@ Currently, the distributed model has the following restrictions: - The model of the first stage receives the same input data. - The models of other stages do not receive data. - All models of the latter stage return the same data. -- Only Ascend 910 inference is supported. The following uses a simple distributed network MatMul as an example to demonstrate the deployment process. 
diff --git a/docs/serving/docs/source_en/serving_install.md b/docs/serving/docs/source_en/serving_install.md index 76360ca6e9..9c9f24b6ff 100644 --- a/docs/serving/docs/source_en/serving_install.md +++ b/docs/serving/docs/source_en/serving_install.md @@ -6,20 +6,17 @@ Currently, MindSpore Serving can be deployed only in the Linux environment. -MindSpore Serving wheel packages are common to various hardware platforms(Nvidia GPU, Ascend 910/310P/310, CPU). The inference task depends on the MindSpore or MindSpore Lite inference framework. We need to select one of them as the Serving Inference backend. When these two inference backend both exist, Mindspore Lite inference framework will be used. +MindSpore Serving wheel packages are common to various hardware platforms (Nvidia GPU, CPU). The inference task depends on the MindSpore or MindSpore Lite inference framework. We need to select one of them as the Serving inference backend. When both inference backends exist, the MindSpore Lite inference framework is used. MindSpore and MindSpore Lite have different build packages for different hardware platforms. The following table lists the target devices and model formats supported by each build package. |Inference backend|Build platform|Target device|Supported model formats| |---------| --- | --- | -------- | |MindSpore| Nvidia GPU | Nvidia GPU | `MindIR` | -| | Ascend | Ascend 910 | `MindIR` | -| | | Ascend 310P/310 | `MindIR`, `OM` | |MindSpore Lite| Nvidia GPU | Nvidia GPU, CPU | `MindIR_Lite` | -| | Ascend | Ascend 310P/310, CPU | `MindIR_Lite` | | | CPU | CPU | `MindIR_Lite` | -When [MindSpore](https://www.mindspore.cn/) is used as the inference backend, MindSpore Serving supports the Ascend 910/310P/310 and Nvidia GPU environments. The Ascend 310P/310 environment supports both `OM` and `MindIR` model formats, and the Ascend 910 and GPU environment only supports the `MindIR` model format. 
+When [MindSpore](https://www.mindspore.cn/) is used as the inference backend, MindSpore Serving supports the Nvidia GPU environment. The GPU environment supports only the `MindIR` model format. Due to the dependency between MindSpore Serving and MindSpore, please follow the table below, download and install the corresponding MindSpore verision from [MindSpore download page](https://www.mindspore.cn/versions/en). @@ -32,13 +29,11 @@ Due to the dependency between MindSpore Serving and MindSpore, please follow the For details about how to install and configure MindSpore, see [Installing MindSpore](https://gitee.com/mindspore/mindspore/blob/r2.0/README.md#installation) and [Configuring MindSpore](https://gitee.com/mindspore/docs/blob/r2.0/install/mindspore_ascend_install_source_en.md#configuring-environment-variables). -When [MindSpore Lite](https://www.mindspore.cn/lite) is used as the inference backend, MindSpore Serving supports Ascend 310P/310, Nvidia GPU and CPU environments. Only the `MindIR_Lite` model formats is supported. Models of format `MindIR` exported from MindSpore or models of other frameworks need be be converted to `MindIR_Lite` format by the MindSpore Lite conversion tool. The `MindIR_Lite` models converted from `Ascend310` and `Ascend310P` environments are different, and the `MindIR_Lite` models must be running on the corresponding `Ascend310` or `Ascend310P` environments. `MindIR_Lite` models converted from Nvidia GPU and CPU environments can be running only in the Nvidia GPU and CPU environments. +When [MindSpore Lite](https://www.mindspore.cn/lite) is used as the inference backend, MindSpore Serving supports Nvidia GPU and CPU environments. Only the `MindIR_Lite` model format is supported. Models in the `MindIR` format exported from MindSpore, or models of other frameworks, need to be converted to the `MindIR_Lite` format by the MindSpore Lite conversion tool. 
`MindIR_Lite` models converted from Nvidia GPU and CPU environments can run only in the Nvidia GPU and CPU environments. | Inference backend | Running environment of Lite conversion tool | Target device of `MindIR_Lite` models | | -------------- | ---------------- | --------------- | | MindSpore Lite | Nvidia GPU, CPU | Nvidia GPU, CPU | -| | Ascend 310 | Ascend 310 | -| | Ascend 310P | Ascend 310P | For details about how to compile and install MindSpore Lite, see the [MindSpore Lite Documentation](https://www.mindspore.cn/lite/docs/en/r2.0/index.html). We should configure the environment variable `LD_LIBRARY_PATH` to indicates the installation path of `libmindspore-lite.so`. diff --git a/docs/serving/docs/source_zh_cn/serving_distributed_example.md b/docs/serving/docs/source_zh_cn/serving_distributed_example.md index ff71189c2a..5a385b01b8 100644 --- a/docs/serving/docs/source_zh_cn/serving_distributed_example.md +++ b/docs/serving/docs/source_zh_cn/serving_distributed_example.md @@ -19,7 +19,6 @@ - 第一个stage的模型接收相同的输入数据。 - 其他的stage的模型不接收数据。 - 最后一个stage的所有模型都返回相同的数据。 -- 仅支持Ascend 910推理。 下面以一个简单的分布式网络MatMul为例,演示部署流程。 diff --git a/docs/serving/docs/source_zh_cn/serving_install.md b/docs/serving/docs/source_zh_cn/serving_install.md index c35701be4f..356c5a3a1b 100644 --- a/docs/serving/docs/source_zh_cn/serving_install.md +++ b/docs/serving/docs/source_zh_cn/serving_install.md @@ -6,20 +6,17 @@ MindSpore Serving当前仅支持Linux环境部署。 -MindSpore Serving包在各类硬件平台(Nvidia GPU, Ascend 910/310P/310, CPU)上通用,推理任务依赖MindSpore或MindSpore Lite推理框架,我们需要选择一个作为Serving推理后端。当这两个推理后端同时存在的时候,优先使用MindSpore Lite推理框架。 +MindSpore Serving包在各类硬件平台(Nvidia GPU, CPU)上通用,推理任务依赖MindSpore或MindSpore Lite推理框架,我们需要选择一个作为Serving推理后端。当这两个推理后端同时存在的时候,优先使用MindSpore Lite推理框架。 MindSpore和MindSpore Lite针对不同的硬件平台有不同的构建包,每个不同的构建包支持的运行目标设备和模型格式如下表所示: |推理后端|构建平台|运行目标设备|支持的模型格式| |---------| --- | --- | -------- | |MindSpore| Nvidia GPU | Nvidia GPU | `MindIR` | -| | Ascend | Ascend 910 | `MindIR` | -| | | Ascend 
310P/310 | `MindIR`, `OM` | |MindSpore Lite| Nvidia GPU | Nvidia GPU, CPU | `MindIR_Lite` | -| | Ascend | Ascend 310P/310, CPU | `MindIR_Lite` | | | CPU | CPU | `MindIR_Lite` | -当以[MindSpore](https://www.mindspore.cn/)作为推理后端时,MindSpore Serving当前支持Ascend 910/310P/310和Nvidia GPU环境。其中Ascend 310P/310环境支持`OM`和`MindIR`两种模型格式,Ascend 910和GPU环境仅支持`MindIR`模型格式。 +当以[MindSpore](https://www.mindspore.cn/)作为推理后端时,MindSpore Serving当前支持Nvidia GPU环境。GPU环境仅支持`MindIR`模型格式。 由于MindSpore Serving与MindSpore有依赖关系,请按照根据下表中所指示的对应关系,在[MindSpore下载页面](https://www.mindspore.cn/versions)下载并安装对应的whl包。 @@ -32,13 +29,11 @@ MindSpore和MindSpore Lite针对不同的硬件平台有不同的构建包,每 MindSpore的安装和配置可以参考[安装MindSpore](https://gitee.com/mindspore/mindspore#安装),并根据需要完成[环境变量配置](https://gitee.com/mindspore/docs/blob/r2.0/install/mindspore_ascend_install_pip.md#配置环境变量)。 -当以[MindSpore Lite](https://www.mindspore.cn/lite)作为推理后端时,MindSpore Serving当前支持Ascend 310P/310、Nvidia GPU和CPU。当前仅支持`MindIR_Lite`模型格式,MindSpore的`MindIR`或其他框架的模型文件需要通过Lite转换工具转换成`MindIR_Lite`模型格式。模型转换时,`Ascend310`设备和`Ascend310P`转换出的模型不一致,需要在对应的`Ascend310`或者`Ascend310P`设备上运行;Nvidia GPU和CPU环境转换成的`MindIR_Lite`模型仅能在Nvidia GPU和CPU使用。 +当以[MindSpore Lite](https://www.mindspore.cn/lite)作为推理后端时,MindSpore Serving当前支持Nvidia GPU和CPU。当前仅支持`MindIR_Lite`模型格式,MindSpore的`MindIR`或其他框架的模型文件需要通过Lite转换工具转换成`MindIR_Lite`模型格式。Nvidia GPU和CPU环境转换成的`MindIR_Lite`模型仅能在Nvidia GPU和CPU使用。 | 推理后端 | 转换工具运行平台 | `MindIR_Lite`模型运行设备 | | -------------- | ---------------- | --------------- | | MindSpore Lite | Nvidia GPU, CPU | Nvidia GPU, CPU | -| | Ascend 310 | Ascend 310 | -| | Ascend 310P | Ascend 310P | MindSpore Lite安装和配置可以参考[MindSpore Lite文档](https://www.mindspore.cn/lite/docs/zh-CN/r2.0/index.html),通过环境变量`LD_LIBRARY_PATH`指示`libmindspore-lite.so`的安装路径。 -- Gitee
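Since every hunk in this patch removes mentions of the dropped Ascend hardware, a quick way to review the result after applying it is to scan the two doc trees it touches for stray references. A minimal sketch, assuming the patch has been applied at the repository root; the directory paths come from the diff headers above, and the `grep` pattern "Ascend 910" is only one example of a leftover to look for:

```shell
#!/bin/sh
# Hypothetical post-apply check: scan the doc trees touched by this patch
# for "Ascend 910" mentions that should have been removed.
set -u
for dir in docs/mindarmour/docs docs/serving/docs; do
  if grep -rn "Ascend 910" "$dir" 2>/dev/null; then
    echo "leftover references found in $dir"
  else
    echo "clean: $dir"
  fi
done
```

The check is deliberately loose; intentional mentions (e.g. inside `-` lines of other patches) would need to be whitelisted by hand.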