From 16aa1688237aff84252440c9410479efa4bc3292 Mon Sep 17 00:00:00 2001
From: qiuleilei
Date: Tue, 12 Aug 2025 14:12:17 +0800
Subject: [PATCH] bugfix

---
 docs/lite/docs/source_en/infer/runtime_cpp.md        | 4 ++--
 docs/lite/docs/source_en/mindir/benchmark_tool.md    | 2 +-
 docs/lite/docs/source_zh_cn/infer/runtime_cpp.md     | 4 ++--
 docs/lite/docs/source_zh_cn/mindir/benchmark_tool.md | 2 +-
 4 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/docs/lite/docs/source_en/infer/runtime_cpp.md b/docs/lite/docs/source_en/infer/runtime_cpp.md
index 74cdf0f16c..105ada407e 100644
--- a/docs/lite/docs/source_en/infer/runtime_cpp.md
+++ b/docs/lite/docs/source_en/infer/runtime_cpp.md
@@ -726,7 +726,7 @@ int RunEncryptedInfer(const char *model_path, const std::string dec_key_str,
 If the command for using the converter_lite is:
 
 ```bash
-./converter_lite --fmk=MINDIR --modelFile=./lenet.mindir --outputFile=lenet_enc --encryptKey=30313233343536373839414243444546 --encryption=true
+./converter_lite --fmk=MINDIR --modelFile=./lenet.mindir --outputFile=lenet_enc --encryptKey="your encrypt key" --encryption=true
 ```
 
 Compile the source code in the mindspore-lite/examples/runtime_cpp directory, and generate build/runtime_cpp:
@@ -740,7 +740,7 @@ cd build
 Run MindSpore Lite inference on the encrypted model file:
 
 ```bash
-./runtime_cpp --modelFile=./lenet_enc.ms 6 30313233343536373839414243444546 ${your_openssl_path}
+./runtime_cpp --modelFile=./lenet_enc.ms 6 "your decrypt key" ${your_openssl_path}
 ```
 
 ### Viewing Logs
diff --git a/docs/lite/docs/source_en/mindir/benchmark_tool.md b/docs/lite/docs/source_en/mindir/benchmark_tool.md
index 18d2e0745f..1d43ba664b 100644
--- a/docs/lite/docs/source_en/mindir/benchmark_tool.md
+++ b/docs/lite/docs/source_en/mindir/benchmark_tool.md
@@ -115,5 +115,5 @@ If you need to specify the dimension of the input data (e.g. input dimension is
 If the model is encryption model, inference is performed after both `decryptKey` and `cryptoLibPath` are configured to decrypt the model. For example:
 
 ```bash
-./benchmark --modelFile=/path/to/encry_model.mindir --decryptKey=30313233343536373839414243444546 --cryptoLibPath=/root/anaconda3/bin/openssl
+./benchmark --modelFile=/path/to/encry_model.mindir --decryptKey="your decrypt key" --cryptoLibPath=/root/anaconda3/bin/openssl
 ```
\ No newline at end of file
diff --git a/docs/lite/docs/source_zh_cn/infer/runtime_cpp.md b/docs/lite/docs/source_zh_cn/infer/runtime_cpp.md
index 6ba3c58c0f..6659d0b9fc 100644
--- a/docs/lite/docs/source_zh_cn/infer/runtime_cpp.md
+++ b/docs/lite/docs/source_zh_cn/infer/runtime_cpp.md
@@ -725,7 +725,7 @@ int RunEncryptedInfer(const char *model_path, const std::string dec_key_str,
 使用converter_lite工具的命令为:
 
 ```bash
-./converter_lite --fmk=MINDIR --modelFile=./lenet.mindir --outputFile=lenet_enc --encryptKey=30313233343536373839414243444546 --encryption=true
+./converter_lite --fmk=MINDIR --modelFile=./lenet.mindir --outputFile=lenet_enc --encryptKey="your encrypt key" --encryption=true
 ```
 
 在mindspore-lite/examples/runtime_cpp目录下编译源码生成build/runtime_cpp文件:
@@ -739,7 +739,7 @@ cd build
 运行Lite端侧使用加密后的模型进行推理:
 
 ```bash
-./runtime_cpp --modelFile=./lenet_enc.ms 6 30313233343536373839414243444546 ${your_openssl_path}
+./runtime_cpp --modelFile=./lenet_enc.ms 6 "your decrypt key" ${your_openssl_path}
 ```
 
 ### 查看日志
diff --git a/docs/lite/docs/source_zh_cn/mindir/benchmark_tool.md b/docs/lite/docs/source_zh_cn/mindir/benchmark_tool.md
index fa43285684..ace4bbfd0c 100644
--- a/docs/lite/docs/source_zh_cn/mindir/benchmark_tool.md
+++ b/docs/lite/docs/source_zh_cn/mindir/benchmark_tool.md
@@ -115,5 +115,5 @@ Mean bias of all nodes: 0%
 如果输入的模型是加密模型，需要同时配置`decryptKey`和`cryptoLibPath`对模型解密后进行推理，使用如下命令：
 
 ```bash
-./benchmark --modelFile=/path/to/encry_model.mindir --decryptKey=30313233343536373839414243444546 --cryptoLibPath=/root/anaconda3/bin/openssl
+./benchmark --modelFile=/path/to/encry_model.mindir --decryptKey="your decrypt key" --cryptoLibPath=/root/anaconda3/bin/openssl
 ```
\ No newline at end of file
-- 
Gitee
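
Note on the placeholders introduced by this patch: the sample key it removes, `30313233343536373839414243444546`, is simply the hex encoding of the ASCII text "0123456789ABCDEF", i.e. a 16-byte (AES-128) key written as 32 hex characters. A minimal sketch of how a reader might produce a real value for the `"your encrypt key"` / `"your decrypt key"` placeholders, assuming `openssl` is installed (the exact key-format requirements of `converter_lite` and `benchmark` should be checked against the MindSpore Lite docs):

```shell
# Generate 16 random bytes (AES-128) rendered as 32 lowercase hex
# characters, the same shape as the sample key this patch removes.
KEY=$(openssl rand -hex 16)

# The same value must be passed to --encryptKey at conversion time and
# to --decryptKey (or the runtime_cpp positional argument) at inference.
echo "length=${#KEY}"
```

Keeping the key out of the documentation (and out of shell history, e.g. by reading it from a file or environment variable) is the point of the change: a fixed key printed in public docs offers no protection if copied verbatim.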