From ec67271f785eabb04edce9315fc9d3c23e95f3f5 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?=E5=AE=A6=E6=99=93=E7=8E=B2?= <3174348550@qq.com> Date: Wed, 17 Sep 2025 17:15:44 +0800 Subject: [PATCH] modify contents --- .../source_en/advanced/third_party/ascend_info.md | 8 ++++---- .../advanced/third_party/converter_register.md | 14 +++++++------- docs/lite/docs/source_en/mindir/build.md | 2 +- .../advanced/third_party/ascend_info.md | 2 +- .../advanced/third_party/converter_register.md | 2 +- docs/lite/docs/source_zh_cn/mindir/build.md | 2 +- docs/mindformers/docs/source_en/feature/ckpt.md | 2 +- 7 files changed, 16 insertions(+), 16 deletions(-) diff --git a/docs/lite/docs/source_en/advanced/third_party/ascend_info.md b/docs/lite/docs/source_en/advanced/third_party/ascend_info.md index 46d1bb228b..96f347ed29 100644 --- a/docs/lite/docs/source_en/advanced/third_party/ascend_info.md +++ b/docs/lite/docs/source_en/advanced/third_party/ascend_info.md @@ -2,7 +2,7 @@ [![View Source On Gitee](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/resource/_static/logo_source_en.svg)](https://gitee.com/mindspore/docs/blob/master/docs/lite/docs/source_en/advanced/third_party/ascend_info.md) -> - The Ascend backend support on device-side version will be deprecated later. For related usage of the Ascend backend, please refer to the cloud-side inference version documentation. +> - The Ascend backend support in the device-side version will be deprecated. For related usage of the Ascend backend, please refer to the cloud-side inference version documentation. > - [Build Cloud-side MindSpore Lite](https://mindspore.cn/lite/docs/en/master/mindir/build.html) > - [Cloud-side Model Converter](https://mindspore.cn/lite/docs/en/master/mindir/converter.html) > - [Cloud-side Benchmark Tool](https://mindspore.cn/lite/docs/en/master/mindir/benchmark.html) @@ -54,7 +54,7 @@ This document describes how to use MindSpore Lite to perform inference and use t After the Ascend software package is installed, export runtime environment variables. In the following command, `/usr/local/Ascend` in `LOCAL_ASCEND=/usr/local/Ascend` indicates the installation path of the software package. Change it to the actual installation path. ```bash -# control log level. 0-EBUG, 1-INFO, 2-WARNING, 3-ERROR, 4-CRITICAL, default level is WARNING. +# control log level. 0-DEBUG, 1-INFO, 2-WARNING, 3-ERROR, 4-CRITICAL, default level is WARNING. export GLOG_v=2 # Conda environmental options @@ -101,7 +101,7 @@ First, use the converter to convert a model into an `ms` model. Then, use the ru 5. (Optional) Configuring configFile - You can use this option to configure the Ascend option for model conversion. The configuration file is in the INI format. For the Ascend scenario, the configurable parameter is [acl_option_cfg_param]. For details about the parameter, see the following table, Ascend initialization can be configured through the acl_init_options parameter, and Ascend composition can be configured through the acl_build_options parameter. + You can use this option to configure the Ascend option for model conversion. The configuration file is in the INI format. For the Ascend scenario, the configurable parameter is [acl_option_cfg_param]. For details about the parameter, see the following table: Ascend initialization can be configured through the acl_init_options parameter, and Ascend composition can be configured through the acl_build_options parameter. 6. Execute the converter to generate an Ascend `ms` model. 
@@ -147,7 +147,7 @@ Table 1 [acl_option_cfg_param] parameter configuration

 After obtaining the converted model, use the matching runtime inference framework to perform inference. For details about how to use runtime to perform inference, see [Using C++ Interface to Perform Inference](https://www.mindspore.cn/lite/docs/en/master/infer/runtime_cpp.html).

-## Executinge the Benchmark
+## Executing the Benchmark

 MindSpore Lite provides a benchmark test tool, which can be used to perform quantitative (performance) analysis on the execution time consumed by forward inference of the MindSpore Lite model. In addition, you can perform comparative error (accuracy) analysis based on the output of a specified model. For details about the inference tool, see [benchmark](https://www.mindspore.cn/lite/docs/en/master/tools/benchmark_tool.html).

diff --git a/docs/lite/docs/source_en/advanced/third_party/converter_register.md b/docs/lite/docs/source_en/advanced/third_party/converter_register.md
index 4e36a7e844..0ab28f0804 100644
--- a/docs/lite/docs/source_en/advanced/third_party/converter_register.md
+++ b/docs/lite/docs/source_en/advanced/third_party/converter_register.md
@@ -6,15 +6,15 @@

 MindSpore Lite [Conversion Tool](https://www.mindspore.cn/lite/docs/en/master/converter/converter_tool.html), in addition to the basic model conversion function, also supports user-defined model optimization and construction to generate models with user-defined operators.

-We have designed a set of registration mechanism, which allows users to expand, including node-parse extension, model-parse extension and graph-optimization extension. The users can combined them as needed to achieve their own intention.
+We have designed a set of registration mechanisms, which allow users to extend the converter, including node-parse extension, model-parse extension and graph-optimization extension. The users can combine them as needed to achieve their own goals.

 node-parse extension: The users can define the process to parse a certain node of a model by themselves, which only support ONNX, CAFFE, TF and TFLITE. The related interface is [NodeParser](https://www.mindspore.cn/lite/api/en/master/generate/classmindspore_converter_NodeParser.html), [NodeParserRegistry](https://www.mindspore.cn/lite/api/en/master/generate/classmindspore_registry_NodeParserRegistry.html).

 model-parse extension: The users can define the process to parse a model by themselves, which only support ONNX, CAFFE, TF and TFLITE. The related interface is [ModelParser](https://www.mindspore.cn/lite/api/en/master/generate/classmindspore_converter_ModelParser.html), [ModelParserRegistry](https://www.mindspore.cn/lite/api/en/master/generate/classmindspore_registry_ModelParserRegistry.html).

 graph-optimization extension: After parsing a model, a graph structure defined by MindSpore Lite will show up and then, the users can define the process to optimize the parsed graph. The related interfaces are [PassBase](https://www.mindspore.cn/lite/api/en/master/generate/classmindspore_registry_PassBase.html), [PassPosition](https://mindspore.cn/lite/api/en/master/generate/enum_mindspore_registry_PassPosition-1.html), [PassRegistry](https://www.mindspore.cn/lite/api/en/master/generate/classmindspore_registry_PassRegistry.html).

-> The node-parse extension needs to rely on the flatbuffers, protobuf and the serialization files of third-party frameworks, at the same time, the version of flatbuffers and the protobuf needs to be consistent with that of the released package, the serialized files must be compatible with that used by the released package. Note that the flatbuffers, protobuf and the serialization files are not provided in the released package, users need to compile and generate the serialized files by themselves. The users can obtain the basic information about [flabuffers](https://gitee.com/mindspore/mindspore-lite/blob/master/cmake/external_libs/flatbuffers.cmake), [probobuf](https://gitee.com/mindspore/mindspore-lite/blob/master/cmake/external_libs/protobuf.cmake), [ONNX prototype file](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/onnx), [CAFFE prototype file](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/caffe), [TF prototype file](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/tensorflow) and [TFLITE prototype file](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/tools/converter/parser/tflite/schema.fbs) from the [MindSpore WareHouse](https://gitee.com/mindspore/mindspore-lite/tree/master).
+> The node-parse extension needs to rely on the flatbuffers, protobuf and the serialization files of third-party frameworks. At the same time, the versions of flatbuffers and protobuf need to be consistent with those of the released package, and the serialized files must be compatible with those used by the released package. Note that the flatbuffers, protobuf and the serialization files are not provided in the released package; users need to compile and generate the serialized files by themselves. The users can obtain the basic information about [flatbuffers](https://gitee.com/mindspore/mindspore-lite/blob/master/cmake/external_libs/flatbuffers.cmake), [protobuf](https://gitee.com/mindspore/mindspore-lite/blob/master/cmake/external_libs/protobuf.cmake), [ONNX prototype file](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/onnx), [CAFFE prototype file](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/caffe), [TF prototype file](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/tensorflow) and [TFLITE prototype file](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/tools/converter/parser/tflite/schema.fbs) from the [MindSpore WareHouse](https://gitee.com/mindspore/mindspore-lite/tree/master).
 >
-> MindSpore Lite alse providers a series of registration macros to facilitate user access. These macros include node-parse registration [REG_NODE_PARSER](https://www.mindspore.cn/lite/api/en/master/generate/define_node_parser_registry.h_REG_NODE_PARSER-1.html), model-parse registration [REG_MODEL_PARSER](https://www.mindspore.cn/lite/api/en/master/generate/define_model_parser_registry.h_REG_MODEL_PARSER-1.html), graph-optimization registration [REG_PASS](https://www.mindspore.cn/lite/api/en/master/generate/define_pass_registry.h_REG_PASS-1.html) and graph-optimization scheduled registration [REG_SCHEDULED_PASS](https://www.mindspore.cn/lite/api/en/master/generate/define_pass_registry.h_REG_SCHEDULED_PASS-1.html)
+> MindSpore Lite also provides a series of registration macros to facilitate user access. 
These macros include node-parse registration [REG_NODE_PARSER](https://www.mindspore.cn/lite/api/en/master/generate/define_node_parser_registry.h_REG_NODE_PARSER-1.html), model-parse registration [REG_MODEL_PARSER](https://www.mindspore.cn/lite/api/en/master/generate/define_model_parser_registry.h_REG_MODEL_PARSER-1.html), graph-optimization registration [REG_PASS](https://www.mindspore.cn/lite/api/en/master/generate/define_pass_registry.h_REG_PASS-1.html) and graph-optimization scheduled registration [REG_SCHEDULED_PASS](https://www.mindspore.cn/lite/api/en/master/generate/define_pass_registry.h_REG_SCHEDULED_PASS-1.html) The expansion capability of MindSpore Lite conversion tool only supports on Linux system currently. @@ -22,7 +22,7 @@ In this chapter, we will show the users a sample of extending MindSpore Lite con > Due to that model-parse extension is a modular extension ability, the chapter will not introduce in details. However, we still provide the users with a simplified unit case for inference. -The chapter takes a [add.tflite](https://download.mindspore.cn/model_zoo/official/lite/quick_start/add.tflite), which only includes an opreator of adding, as an example. We will show the users how to convert the single operator of adding to that of [Custom](https://www.mindspore.cn/lite/docs/en/master/advanced/third_party/register_kernel.html#custom-operators) and finally obtain a model which only includs a single operator of custom. +The chapter takes a [add.tflite](https://download.mindspore.cn/model_zoo/official/lite/quick_start/add.tflite), which only includes an operator of adding, as an example. We will show the users how to convert the single operator of adding to that of [Custom](https://www.mindspore.cn/lite/docs/en/master/advanced/third_party/register_kernel.html#custom-operators) and finally obtain a model which only includes a single operator of custom. The code related to the example can be obtained from the path [mindspore-lite/examples/converter_extend](https://gitee.com/mindspore/mindspore-lite/tree/master/mindspore-lite/examples/converter_extend). @@ -126,7 +126,7 @@ For the sample code, please refer to [pass](https://gitee.com/mindspore/mindspor cd ${PACKAGE_ROOT_PATH}/tools/converter/converter ``` -3. Create extension configuration file(converter.cfg, please refer to [Extension Configuration](#extension-configuration)), the content is as follows: +3. Create extension configuration file (converter.cfg, please refer to [Extension Configuration](#extension-configuration)), the content is as follows: ```text [registry] @@ -155,7 +155,7 @@ To load the extension module when converting, users need to configure the path o | ----------------- | --------- | -------------------------------------------- | -------------- | ------------- | ------------------------------------------------------- | | plugin_path | Optional | Third-party library path | String | - | If there are more than one, please use `;` to separate. | | disable_fusion | Optional | Indicate whether to close fusion | String | off | off or on. | -| fusion_blacklists | Optional | Specified fusion operator names to be closed | String | - | If there are more than one, please use `,` to separate | +| fusion_blacklists | Optional | Specified fusion operator names to be closed | String | - | If there are more than one, please use `,` to separate. | We have generated the default configuration file (converter.cfg). The content is as follows: @@ -164,7 +164,7 @@ We have generated the default configuration file (converter.cfg). 
The content is plugin_path=libconverter_extend_tutorial.so # users need to configure the correct path of the dynamic library ``` -If the user needs to turn off the specified operator fusions, the fusion configuration of the the specified operator names to be closed are as follows: +If the user needs to turn off the specified operator fusions, the fusion configuration of the specified operator names to be closed are as follows: ```ini [registry] diff --git a/docs/lite/docs/source_en/mindir/build.md b/docs/lite/docs/source_en/mindir/build.md index e5482cc766..29edcaa551 100644 --- a/docs/lite/docs/source_en/mindir/build.md +++ b/docs/lite/docs/source_en/mindir/build.md @@ -110,7 +110,7 @@ git clone https://gitee.com/mindspore/mindspore-lite.git - After installing Ascend package, you need to export Runtime-related environment variables. `/usr/local/Ascend` of the `/LOCAL_ASCEND=/usr/local/Ascend` in the following command indicates the installation path of the package, so you need to change it to the actual installation path of the package. ```bash - # control log level. 0-EBUG, 1-INFO, 2-WARNING, 3-ERROR, 4-CRITICAL, default level is WARNING. + # control log level. 0-DEBUG, 1-INFO, 2-WARNING, 3-ERROR, 4-CRITICAL, default level is WARNING. export GLOG_v=2 # Conda environmental options diff --git a/docs/lite/docs/source_zh_cn/advanced/third_party/ascend_info.md b/docs/lite/docs/source_zh_cn/advanced/third_party/ascend_info.md index 376b77022a..7a976d2136 100644 --- a/docs/lite/docs/source_zh_cn/advanced/third_party/ascend_info.md +++ b/docs/lite/docs/source_zh_cn/advanced/third_party/ascend_info.md @@ -53,7 +53,7 @@ 安装好Ascend软件包之后,需要导出Runtime相关环境变量,下述命令中`LOCAL_ASCEND=/usr/local/Ascend`的`/usr/local/Ascend`表示配套软件包的安装路径,需注意将其改为配套软件包的实际安装路径。 ```bash -# control log level. 0-EBUG, 1-INFO, 2-WARNING, 3-ERROR, 4-CRITICAL, default level is WARNING. +# control log level. 0-DEBUG, 1-INFO, 2-WARNING, 3-ERROR, 4-CRITICAL, default level is WARNING. 
export GLOG_v=2 # Conda environmental options diff --git a/docs/lite/docs/source_zh_cn/advanced/third_party/converter_register.md b/docs/lite/docs/source_zh_cn/advanced/third_party/converter_register.md index 34481ace3f..0e79a2144e 100644 --- a/docs/lite/docs/source_zh_cn/advanced/third_party/converter_register.md +++ b/docs/lite/docs/source_zh_cn/advanced/third_party/converter_register.md @@ -12,7 +12,7 @@ MindSpore Lite的[转换工具](https://www.mindspore.cn/lite/docs/zh-CN/master/ 模型解析扩展:用户自定义模型的整个解析过程,支持ONNX、CAFFE、TF、TFLITE。接口可参考[ModelParser](https://www.mindspore.cn/lite/api/zh-CN/master/api_cpp/mindspore_converter.html#modelparser)、[ModelParserRegistry](https://www.mindspore.cn/lite/api/zh-CN/master/api_cpp/mindspore_registry.html#modelparserregistry)。 图优化扩展:模型解析之后,将获得MindSpore Lite定义的图结构,用户可基于此结构自定义图的优化过程。接口可参考[PassBase](https://www.mindspore.cn/lite/api/zh-CN/master/api_cpp/mindspore_registry.html#passbase)、[PassPosition](https://www.mindspore.cn/lite/api/zh-CN/master/api_cpp/mindspore_registry.html#passposition)、[PassRegistry](https://www.mindspore.cn/lite/api/zh-CN/master/api_cpp/mindspore_registry.html#passregistry)。 -> 节点解析扩展需要依赖flatbuffers和protobuf及三方框架的序列化文件,并且flatbuffers和protobuf需要与发布件采用的版本一致,序列化文件需保证兼容发布件采用的序列化文件。发布件中不提供flatbuffers、protobuf及序列化文件,用户需自行编译,并生成序列化文件。用户可以从[MindSpore Lite仓](https://gitee.com/mindspore/mindspore-lite/tree/master)中获取[flatbuffers](https://gitee.com/mindspore/mindspore-lite/blob/master/cmake/external_libs/flatbuffers.cmake)、[probobuf](https://gitee.com/mindspore/mindspore-lite/blob/master/cmake/external_libs/protobuf.cmake)、[ONNX原型文件](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/onnx)、[CAFFE原型文件](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/caffe)、[TF原型文件](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/tensorflow)和[TFLITE原型文件](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/tools/converter/parser/tflite/schema.fbs)。 +> 节点解析扩展需要依赖flatbuffers和protobuf及三方框架的序列化文件,并且flatbuffers和protobuf需要与发布件采用的版本一致,序列化文件需保证兼容发布件采用的序列化文件。发布件中不提供flatbuffers、protobuf及序列化文件,用户需自行编译,并生成序列化文件。用户可以从[MindSpore Lite仓](https://gitee.com/mindspore/mindspore-lite/tree/master)中获取[flatbuffers](https://gitee.com/mindspore/mindspore-lite/blob/master/cmake/external_libs/flatbuffers.cmake)、[protobuf](https://gitee.com/mindspore/mindspore-lite/blob/master/cmake/external_libs/protobuf.cmake)、[ONNX原型文件](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/onnx)、[CAFFE原型文件](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/caffe)、[TF原型文件](https://gitee.com/mindspore/mindspore-lite/tree/master/third_party/proto/tensorflow)和[TFLITE原型文件](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/tools/converter/parser/tflite/schema.fbs)。 > > MindSpore Lite还提供了一系列的注册宏,以便于用户侧的扩展接入转换工具。注册宏包括节点解析注册[REG_NODE_PARSER](https://www.mindspore.cn/lite/api/zh-CN/master/api_cpp/mindspore_registry.html#reg-node-parser)、模型解析注册[REG_MODEL_PARSER](https://www.mindspore.cn/lite/api/zh-CN/master/api_cpp/mindspore_registry.html#reg-model-parser)、图优化注册[REG_PASS](https://www.mindspore.cn/lite/api/zh-CN/master/api_cpp/mindspore_registry.html#reg-pass)、图优化调度注册[REG_SCHEDULED_PASS](https://www.mindspore.cn/lite/api/zh-CN/master/api_cpp/mindspore_registry.html#reg-scheduled-pass)。 diff --git a/docs/lite/docs/source_zh_cn/mindir/build.md b/docs/lite/docs/source_zh_cn/mindir/build.md index 40bbd91be8..9851fbaa9f 100644 --- a/docs/lite/docs/source_zh_cn/mindir/build.md +++ 
b/docs/lite/docs/source_zh_cn/mindir/build.md @@ -110,7 +110,7 @@ git clone https://gitee.com/mindspore/mindspore-lite.git - 安装好Ascend软件包之后,需要导出Runtime相关环境变量,下述命令中`LOCAL_ASCEND=/usr/local/Ascend`的`/usr/local/Ascend`表示配套软件包的安装路径,需注意将其改为配套软件包的实际安装路径。 ```bash - # control log level. 0-EBUG, 1-INFO, 2-WARNING, 3-ERROR, 4-CRITICAL, default level is WARNING. + # control log level. 0-DEBUG, 1-INFO, 2-WARNING, 3-ERROR, 4-CRITICAL, default level is WARNING. export GLOG_v=2 # Conda environmental options diff --git a/docs/mindformers/docs/source_en/feature/ckpt.md b/docs/mindformers/docs/source_en/feature/ckpt.md index 187bd0f63e..a9b8db2b3e 100644 --- a/docs/mindformers/docs/source_en/feature/ckpt.md +++ b/docs/mindformers/docs/source_en/feature/ckpt.md @@ -1,6 +1,6 @@ # Ckpt Weights -[![View Source On Gitee](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/resource/_static/logo_source_en.svg)](https://gitee.com/mindspore/docs/blob/master/docs/mindformers/docs/source_zh_cn/feature/ckpt.md) +[![View Source On Gitee](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/resource/_static/logo_source_en.svg)](https://gitee.com/mindspore/docs/blob/master/docs/mindformers/docs/source_en/feature/ckpt.md) ## Overview -- Gitee
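As a companion to the `converter_register.md` changes above, here is a minimal sketch of a `converter.cfg` extension configuration that combines the `[registry]` parameters described in that document (plugin_path, disable_fusion, fusion_blacklists). The library file name and the fusion pass names below are illustrative assumptions, not values confirmed by this patch.

```ini
[registry]
# Path of the user-built extension library; use `;` to separate multiple libraries.
plugin_path=libconverter_extend_tutorial.so
# Leave general fusion on; only the passes named below are skipped.
disable_fusion=off
# Example fusion pass names to disable; use `,` to separate multiple names.
fusion_blacklists=ConvActivationFusion,MatMulActivationFusion
```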