From 25a35f07bcd24f3f5052d8455b0e89de5b0f6c67 Mon Sep 17 00:00:00 2001
From: zhangyi
Date: Wed, 13 Apr 2022 18:07:44 +0800
Subject: [PATCH] modify links

---
 docs/mindspore/source_en/design/gradient.md                     | 2 +-
 docs/mindspore/source_en/design/parameter_server_training.md    | 2 +-
 .../mindspore/source_en/migration_guide/neural_network_debug.md | 2 +-
 docs/mindspore/source_zh_cn/design/jit_fallback.md              | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/mindspore/source_en/design/gradient.md b/docs/mindspore/source_en/design/gradient.md
index 07bddc9560..23bab0463d 100644
--- a/docs/mindspore/source_en/design/gradient.md
+++ b/docs/mindspore/source_en/design/gradient.md
@@ -2,7 +2,7 @@

 `Ascend` `GPU` `CPU` `Design` `Model Development`

-
+

 ## Automatic Differentiation Overview

diff --git a/docs/mindspore/source_en/design/parameter_server_training.md b/docs/mindspore/source_en/design/parameter_server_training.md
index 10fa00c9d5..f3d96ba147 100644
--- a/docs/mindspore/source_en/design/parameter_server_training.md
+++ b/docs/mindspore/source_en/design/parameter_server_training.md
@@ -31,7 +31,7 @@ Learn how to train a LeNet using the [MNIST dataset](http://yann.lecun.com/exdb/
 1. First of all, use `mindspore.context.set_ps_context(enable_ps=True)` to enable Parameter Server training mode.

     - This method should be called before `mindspore.communication.init()`.
-    - If you don't call this method, the [Environment Variable Setting](https://www.mindspore.cn/tutorials/experts/en/master/parallel/apply_parameter_server_training.html#environment-variable-setting) below will not take effect.
+    - If you don't call this method, the [Environment Variable Setting](https://www.mindspore.cn/docs/en/master/design/parameter_server_training.html#environment-variable-setting) below will not take effect.
     - Use `mindspore.context.reset_ps_context()` to disable Parameter Server training mode.

 2. In this training mode, you can use either of the following methods to control whether the training parameters are updated by the Parameter Server and whether the training parameters are initialized on Worker or Server:
diff --git a/docs/mindspore/source_en/migration_guide/neural_network_debug.md b/docs/mindspore/source_en/migration_guide/neural_network_debug.md
index 9f64d851c0..5434992da9 100644
--- a/docs/mindspore/source_en/migration_guide/neural_network_debug.md
+++ b/docs/mindspore/source_en/migration_guide/neural_network_debug.md
@@ -37,7 +37,7 @@ This section introduces the problems and solutions during Network Debugging proc

 For script development and network process debugging, we recommend using the PyNative mode for debugging. The PyNative mode supports executing single operators, normal functions and networks, as well as separate operations for computing gradients. In PyNative mode, you can easily set breakpoints and get intermediate results of network execution, and you can also debug the network by means of pdb.

-By default, MindSpore is in Graph mode, which can be set as PyNative mode via `context.set_context(mode=context.PYNATIVE_MODE)`. Related examples can be found in [Debugging With PyNative Mode](hhttps://www.mindspore.cn/tutorials/zh-CN/master/advanced/pynative_graph/pynative.html).
+By default, MindSpore is in Graph mode, which can be set as PyNative mode via `context.set_context(mode=context.PYNATIVE_MODE)`. Related examples can be found in [Debugging With PyNative Mode](https://www.mindspore.cn/tutorials/en/master/advanced/pynative_graph/pynative.html).

 #### Getting More Error Messages

diff --git a/docs/mindspore/source_zh_cn/design/jit_fallback.md b/docs/mindspore/source_zh_cn/design/jit_fallback.md
index b98bf3895f..db270fbc3c 100644
--- a/docs/mindspore/source_zh_cn/design/jit_fallback.md
+++ b/docs/mindspore/source_zh_cn/design/jit_fallback.md
@@ -4,7 +4,7 @@

 ## 概述

-MindSpore框架支持静态图模式和动态图模式两种方式。在静态图模式下，先将Python代码编译成静态计算图，然后执行静态计算图。由于语法解析的限制，用户编写程序时需要遵循MindSpore[静态图语法支持](https://www.mindspore.cn/docs/zh-CN/master/note/static_graph_syntax_support.html)，语法使用存在约束限制。在动态图模式下，Python代码会通过Python解释器执行，用户可以使用任意Python语法。可以看到，静态图和动态图的编译流程不一致，语法约束限制也不同。关于静态图和动态图的更多介绍，请参考[静态图和动态图](https://www.mindspore.cn/docs/zh-CN/master/design/dynamic_graph_and_static_graph.html)。
+MindSpore框架支持静态图模式和动态图模式两种方式。在静态图模式下，先将Python代码编译成静态计算图，然后执行静态计算图。由于语法解析的限制，用户编写程序时需要遵循MindSpore[静态图语法支持](https://www.mindspore.cn/docs/zh-CN/master/note/static_graph_syntax_support.html)，语法使用存在约束限制。在动态图模式下，Python代码会通过Python解释器执行，用户可以使用任意Python语法。可以看到，静态图和动态图的编译流程不一致，语法约束限制也不同。关于静态图和动态图的更多介绍，请参考[静态图和动态图](https://www.mindspore.cn/tutorials/zh-CN/master/advanced/pynative_graph.html)。

 JIT Fallback是从静态图的角度出发考虑静态图和动态图的统一。通过JIT Fallback特性，静态图可以支持尽量多的动态图语法，使得静态图提供接近动态图的语法使用体验，从而实现动静统一。为了便于用户选择是否使用JIT Fallback特性的能力，提供了开关`MS_DEV_ENABLE_FALLBACK`，当前默认已经打开。如果需要关闭，可以使用命令：`export MS_DEV_ENABLE_FALLBACK=0`。

-- 
Gitee
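One of the patched docs (jit_fallback.md) describes the `MS_DEV_ENABLE_FALLBACK` switch, which is on by default and disabled via `export MS_DEV_ENABLE_FALLBACK=0`. A minimal sketch of setting the same switch from Python instead of the shell — the variable name and its default-on behavior come from the doc; the helper function is purely illustrative, not MindSpore API:

```python
import os

def fallback_enabled() -> bool:
    # Illustrative only: mirrors the documented convention that the
    # feature is on by default and "0" turns it off.
    return os.environ.get("MS_DEV_ENABLE_FALLBACK", "1") != "0"

# Equivalent of `export MS_DEV_ENABLE_FALLBACK=0` for the current process
# and any child process launched after this point.
os.environ["MS_DEV_ENABLE_FALLBACK"] = "0"
```

Note that this only affects the current process and its children; it must run before the framework reads the variable to have any effect.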