From 4f1f97ac70bca46c431b9c708cd67d11f56e4ed7 Mon Sep 17 00:00:00 2001
From: yangjie159
Date: Tue, 30 Jun 2020 18:18:06 +0800
Subject: [PATCH] update export lite model for predict

---
 tutorials/source_en/advanced_use/on_device_inference.md    | 2 ++
 tutorials/source_zh_cn/advanced_use/on_device_inference.md | 2 ++
 2 files changed, 4 insertions(+)

diff --git a/tutorials/source_en/advanced_use/on_device_inference.md b/tutorials/source_en/advanced_use/on_device_inference.md
index f3ddc120b3..53c567b8cc 100644
--- a/tutorials/source_en/advanced_use/on_device_inference.md
+++ b/tutorials/source_en/advanced_use/on_device_inference.md
@@ -78,6 +78,8 @@ The compilation procedure is as follows:
 
 ## Use of On-Device Inference
 
+> This feature is being optimized and upgraded, and is temporarily unavailable.
+
 When MindSpore is used to perform model inference in the APK project of an app, preprocessing input is required before model inference. For example, before an image is converted into the tensor format required by MindSpore inference, the image needs to be resized. After MindSpore completes model inference, postprocess the model inference result and sends the processed output to the app.
 
 This section describes how to use MindSpore to perform model inference. The setup of an APK project and pre- and post-processing of model inference are not described here.
diff --git a/tutorials/source_zh_cn/advanced_use/on_device_inference.md b/tutorials/source_zh_cn/advanced_use/on_device_inference.md
index eb224de000..fcd8ca7a35 100644
--- a/tutorials/source_zh_cn/advanced_use/on_device_inference.md
+++ b/tutorials/source_zh_cn/advanced_use/on_device_inference.md
@@ -77,6 +77,8 @@ MindSpore Predict是一个轻量级的深度神经网络推理引擎，提供了
 
 ## 端侧推理使用
 
+> 优化升级中，暂不可用。
+
 在APP的APK工程中使用MindSpore进行模型推理前，需要对输入进行必要的前处理，比如将图片转换成MindSpore推理要求的`tensor`格式、对图片进行`resize`等处理。在MindSpore完成模型推理后，对模型推理的结果进行后处理，并将处理的输出发送给APP应用。
 
 本章主要描述用户如何使用MindSpore进行模型推理，APK工程的搭建和模型推理的前后处理，不在此列举。
-- 
Gitee