From 5a0d698aa227efb3ba493d5d6ce8d321709eb7a6 Mon Sep 17 00:00:00 2001
From: huan <3174348550@qq.com>
Date: Mon, 28 Jul 2025 09:49:37 +0800
Subject: [PATCH] fix broken anchors
---
docs/lite/api/source_zh_cn/api_c/model_c.md | 4 ++--
docs/mindspore/source_en/faq/feature_advice.md | 2 +-
2 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/docs/lite/api/source_zh_cn/api_c/model_c.md b/docs/lite/api/source_zh_cn/api_c/model_c.md
index 4de2c0bbc7..369363763a 100644
--- a/docs/lite/api/source_zh_cn/api_c/model_c.md
+++ b/docs/lite/api/source_zh_cn/api_c/model_c.md
@@ -416,7 +416,7 @@ MSStatus MSModelSetLearningRate(MSModelHandle model, float learning_rate)
枚举类型的状态码`MSStatus`,若返回`MSStatus::kMSStatusSuccess`则证明成功。
-#### MSModelSetTrainMode
+#### MSModelSetTrainMode
```C
MSStatus MSModelSetTrainMode(const MSModelHandle model, bool train)
@@ -433,7 +433,7 @@ MSStatus MSModelSetTrainMode(const MSModelHandle model, bool train)
枚举类型的状态码`MSStatus`,若返回`MSStatus::kMSStatusSuccess`则证明成功。
-#### MSModelSetTrainMode
+#### MSModelSetTrainMode
```C
MSStatus MSModelSetTrainMode(MSModelHandle model, bool train)
diff --git a/docs/mindspore/source_en/faq/feature_advice.md b/docs/mindspore/source_en/faq/feature_advice.md
index 299fc1ba5d..0850a84ac7 100644
--- a/docs/mindspore/source_en/faq/feature_advice.md
+++ b/docs/mindspore/source_en/faq/feature_advice.md
@@ -50,7 +50,7 @@ A: The formats of `ckpt` of MindSpore and `ckpt`of TensorFlow are not generic.
## Q: How do I use models trained by MindSpore on Atlas 200/300/500 inference product? Can they be converted to models used by HiLens Kit?
-A: Yes. HiLens Kit uses Atlas 200/300/500 inference product as the inference core. Therefore, the two questions are essentially the same, which both need to convert as OM model. Atlas 200/300/500 inference product requires a dedicated OM model. Use MindSpore to export the ONNX and convert it into an OM model supported by Atlas 200/300/500 inference product. For details, see [Multi-platform Inference](https://www.mindspore.cn/tutorials/en/master/model_infer/ms_infer/llm_inference_overview.html).
+A: Yes. HiLens Kit uses the Atlas 200/300/500 inference product as its inference core, so the two questions are essentially the same: both require conversion to an OM model. Since the Atlas 200/300/500 inference product requires a dedicated OM model, use MindSpore to export the model to ONNX format and then convert the ONNX file into an OM model supported by the Atlas 200/300/500 inference product.
--
Gitee