diff --git a/docs/lite/docs/source_en/infer/runtime_cpp.md b/docs/lite/docs/source_en/infer/runtime_cpp.md
index 74cdf0f16cf3182b54baf610550df2b1727e50b6..105ada407e1375d074141824269936be73d464c6 100644
--- a/docs/lite/docs/source_en/infer/runtime_cpp.md
+++ b/docs/lite/docs/source_en/infer/runtime_cpp.md
@@ -726,7 +726,7 @@ int RunEncryptedInfer(const char *model_path, const std::string dec_key_str,
 If the command for using the converter_lite is:
 
 ```bash
-./converter_lite --fmk=MINDIR --modelFile=./lenet.mindir --outputFile=lenet_enc --encryptKey=30313233343536373839414243444546 --encryption=true
+./converter_lite --fmk=MINDIR --modelFile=./lenet.mindir --outputFile=lenet_enc --encryptKey="your encrypt key" --encryption=true
 ```
 
 Compile the source code in the mindspore-lite/examples/runtime_cpp directory, and generate build/runtime_cpp:
@@ -740,7 +740,7 @@ cd build
 Run MindSpore Lite inference on the encrypted model file:
 
 ```bash
-./runtime_cpp --modelFile=./lenet_enc.ms 6 30313233343536373839414243444546 ${your_openssl_path}
+./runtime_cpp --modelFile=./lenet_enc.ms 6 "your decrypt key" ${your_openssl_path}
 ```
 
 ### Viewing Logs
diff --git a/docs/lite/docs/source_en/mindir/benchmark_tool.md b/docs/lite/docs/source_en/mindir/benchmark_tool.md
index 18d2e0745fdfdefdb7da745c80218e7f446b9655..1d43ba664b95b33d722e0096530691461a83449a 100644
--- a/docs/lite/docs/source_en/mindir/benchmark_tool.md
+++ b/docs/lite/docs/source_en/mindir/benchmark_tool.md
@@ -115,5 +115,5 @@ If you need to specify the dimension of the input data (e.g. input dimension is
 If the model is an encrypted model, inference is performed after both `decryptKey` and `cryptoLibPath` are configured to decrypt the model. For example:
 
 ```bash
-./benchmark --modelFile=/path/to/encry_model.mindir --decryptKey=30313233343536373839414243444546 --cryptoLibPath=/root/anaconda3/bin/openssl
+./benchmark --modelFile=/path/to/encry_model.mindir --decryptKey="your decrypt key" --cryptoLibPath=/root/anaconda3/bin/openssl
 ```
\ No newline at end of file
diff --git a/docs/lite/docs/source_zh_cn/infer/runtime_cpp.md b/docs/lite/docs/source_zh_cn/infer/runtime_cpp.md
index 6ba3c58c0ffd5f068d4743f5fac0bcf21d98e451..6659d0b9fc2d91cb98d643229f51aefa16f6773f 100644
--- a/docs/lite/docs/source_zh_cn/infer/runtime_cpp.md
+++ b/docs/lite/docs/source_zh_cn/infer/runtime_cpp.md
@@ -725,7 +725,7 @@ int RunEncryptedInfer(const char *model_path, const std::string dec_key_str,
 The command for using the converter_lite tool is:
 
 ```bash
-./converter_lite --fmk=MINDIR --modelFile=./lenet.mindir --outputFile=lenet_enc --encryptKey=30313233343536373839414243444546 --encryption=true
+./converter_lite --fmk=MINDIR --modelFile=./lenet.mindir --outputFile=lenet_enc --encryptKey="your encrypt key" --encryption=true
 ```
 
 Compile the source code in the mindspore-lite/examples/runtime_cpp directory to generate the build/runtime_cpp file:
@@ -739,7 +739,7 @@ cd build
 Run on-device Lite inference with the encrypted model:
 
 ```bash
-./runtime_cpp --modelFile=./lenet_enc.ms 6 30313233343536373839414243444546 ${your_openssl_path}
+./runtime_cpp --modelFile=./lenet_enc.ms 6 "your decrypt key" ${your_openssl_path}
 ```
 
 ### Viewing Logs
diff --git a/docs/lite/docs/source_zh_cn/mindir/benchmark_tool.md b/docs/lite/docs/source_zh_cn/mindir/benchmark_tool.md
index fa43285684a62b0bba6e20ba47a2f80cc21ea894..ace4bbfd0c4134c617babace3177cea8effa4c31 100644
--- a/docs/lite/docs/source_zh_cn/mindir/benchmark_tool.md
+++ b/docs/lite/docs/source_zh_cn/mindir/benchmark_tool.md
@@ -115,5 +115,5 @@ Mean bias of all nodes: 0%
 If the input model is an encrypted model, both `decryptKey` and `cryptoLibPath` need to be configured to decrypt the model before inference. Use the following command:
 
 ```bash
-./benchmark --modelFile=/path/to/encry_model.mindir --decryptKey=30313233343536373839414243444546 --cryptoLibPath=/root/anaconda3/bin/openssl
+./benchmark --modelFile=/path/to/encry_model.mindir --decryptKey="your decrypt key" --cryptoLibPath=/root/anaconda3/bin/openssl
 ```
\ No newline at end of file
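
Note on the placeholders: the example key being removed, `30313233343536373839414243444546`, is 32 hexadecimal characters (16 bytes), and the quoted placeholders stand in for a key of the same form. Assuming `--encryptKey`/`--decryptKey` still expect such a hex string, the following is a minimal sketch of generating a key and threading it through the commands from the patch (the `KEY` variable is illustrative, not part of the documented interface):

```bash
# Sketch: generate a random 16-byte key rendered as 32 hex characters -- the
# same format as the example key the placeholders replace (an assumption based
# on that example, not a documented contract of converter_lite).
KEY=$(openssl rand -hex 16)

# Encrypt the model during conversion (command taken from the patch above).
./converter_lite --fmk=MINDIR --modelFile=./lenet.mindir --outputFile=lenet_enc --encryptKey="${KEY}" --encryption=true

# Pass the same key back when running inference on the encrypted model.
./runtime_cpp --modelFile=./lenet_enc.ms 6 "${KEY}" ${your_openssl_path}
```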