diff --git a/docs/mindspore/source_en/features/parallel/auto_parallel.rst b/docs/mindspore/source_en/features/parallel/auto_parallel.rst
index cd1fed1dcc3331e5c4b20de3eccac3f778c0f1dd..8aa80bb8e7734ce7604007e612d7da8dc477b1ab 100644
--- a/docs/mindspore/source_en/features/parallel/auto_parallel.rst
+++ b/docs/mindspore/source_en/features/parallel/auto_parallel.rst
@@ -69,7 +69,7 @@ The sharding strategy propagation algorithm means that the user only needs to ma
 
 Related interfaces:
 
-1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="recursive_programming")``: Set the parallel mode and select the Strategy Propagation Algorithm or Recursive Algorithm via ``parallel_mode``.
+1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="sharding_propagation")``: Set the parallel mode and select the Strategy Propagation Algorithm via ``parallel_mode``.
 
 2. ``mindspore.nn.Cell.shard()`` and ``mindspore.ops.Primitive.shard()``: Specifies the operator sharding strategy, and the strategy for the rest of the operators is derived by the propagation algorithm. Currently the ``mindspore.nn.Cell.shard()`` interface can be used in PyNative mode and Graph mode; the ``mindspore.ops.Primitive.shard()`` interface can only be used in Graph mode.
diff --git a/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst b/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst
index 6c7b025ce8547a7fb8742989df3f87228c7d5c58..2299cd627958c701c57bcea38ffe0004a447b8ff 100644
--- a/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst
+++ b/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst
@@ -71,7 +71,7 @@ MindSpore converts the stand-alone program into a parallel program. This conversion is
 
 Related interfaces:
 
-1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="recursive_programming")``: Set the parallel mode; the strategy propagation algorithm or the double recursive algorithm can be selected via ``parallel_mode``.
+1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="sharding_propagation")``: Set the parallel mode; the strategy propagation algorithm can be selected via ``parallel_mode``.
 
 2. ``mindspore.nn.Cell.shard()`` and ``mindspore.ops.Primitive.shard()``: Specify the operator sharding strategy; the strategies of the remaining operators are derived by the propagation algorithm. Currently the ``mindspore.nn.Cell.shard()`` interface supports both PyNative mode and Graph mode, while the ``mindspore.ops.Primitive.shard()`` interface can only be used in Graph mode.
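
The two interfaces touched by this diff compose roughly as follows. This is a minimal sketch, not a definitive implementation: it assumes a MindSpore installation with a multi-device environment (e.g. launched via ``msrun``), and the ``Net`` class, layer sizes, and the ``(2, 1)`` strategy are illustrative assumptions, not taken from the docs being changed:

```python
# Hedged sketch: assumes MindSpore with a multi-device launch environment.
# `Net`, the layer sizes, and the (2, 1) strategy are illustrative only.
from mindspore import nn, ops
from mindspore.parallel.auto_parallel import AutoParallel


class Net(nn.Cell):
    def __init__(self):
        super().__init__()
        self.dense = nn.Dense(64, 64)
        self.relu = ops.ReLU()

    def construct(self, x):
        return self.relu(self.dense(x))


net = Net()

# Give one sub-cell an explicit sharding strategy; the propagation
# algorithm derives strategies for the remaining operators.
net.dense.shard(in_strategy=((2, 1),))

# Wrap the network with the strategy-propagation parallel mode
# (the value this diff documents in place of "recursive_programming").
parallel_net = AutoParallel(net, parallel_mode="sharding_propagation")
```

Note that ``mindspore.nn.Cell.shard()`` works in both PyNative and Graph mode, while ``mindspore.ops.Primitive.shard()`` is Graph mode only, as the changed docs state.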