diff --git a/docs/migration_guide/source_en/inference.md b/docs/migration_guide/source_en/inference.md
index 11953d8037d2f8069973455f7232a9255b7dae04..dffabf31b799f78ef25e6ebb579ef4dbb0fb6ba5 100644
--- a/docs/migration_guide/source_en/inference.md
+++ b/docs/migration_guide/source_en/inference.md
@@ -1,17 +1,5 @@
 # Inference Execution
 
-Translator: [Dongdong92](https://gitee.com/zy179280)
-
-
-
-- [Inference Execution](#inference-execution)
-    - [Inference Service Based on Models](#inference-service-based-on-models)
-        - [Overview](#overview)
-        - [Executing Inference on Different Platforms](#executing-inference-on-different-platforms)
-    - [On-line Inference Service Deployment Based on MindSpore Serving](#On-line-inference-service-deployment-based-on-mindspore-serving)
-
-
-
 For trained models, MindSpore can execute inference tasks on different hardware platforms. MindSpore also provides online inference services based on MindSpore Serving.
@@ -33,6 +21,8 @@ For dominating the difference between backend models, model files in the [MindIR
 - For the CPU hardware platform, please refer to [Inference on a CPU](https://www.mindspore.cn/tutorial/inference/en/master/multi_platform_inference_cpu.html).
 - For inference on the Lite platform on device, please refer to [on-device inference](https://www.mindspore.cn/tutorial/lite/en/master/index.html).
 
+> Explanation
+> Please refer to [MindSpore C++ Library Use](https://www.mindspore.cn/doc/faq/en/master/inference.html#c) to solve the interface issues on the Ascend hardware platform.
 
 ## On-line Inference Service Deployment Based on MindSpore Serving
@@ -45,4 +35,7 @@ MindSpore Serving is a lite and high-performance service module, aiming at assis
 - [Servable Provided Through Model Configuration](https://www.mindspore.cn/tutorial/inference/en/master/serving_model.html).
 - [MindSpore Serving-based Distributed Inference Service Deployment](https://www.mindspore.cn/tutorial/inference/en/master/serving_distributed_example.html).
+> Explanation
+> For deployment issues regarding the on-line inference service, please refer to [MindSpore Serving](https://www.mindspore.cn/doc/faq/en/master/inference.html#mindspore-serving).
+