diff --git a/docs/mindspore/source_en/features/parallel/auto_parallel.rst b/docs/mindspore/source_en/features/parallel/auto_parallel.rst
index 7b19781bdb6266597dc14941ce581dc068557262..a4b82b6d21024eaa24ed91190061b2e760b7797a 100644
--- a/docs/mindspore/source_en/features/parallel/auto_parallel.rst
+++ b/docs/mindspore/source_en/features/parallel/auto_parallel.rst
@@ -69,7 +69,7 @@ The sharding strategy propagation algorithm means that the user only needs to ma
 
 Related interfaces:
 
-1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="recursive_programming")``: Set the parallel mode and select the Strategy Propagation Algorithm or Recursive Algorithm via ``parallel_mode``.
+1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="sharding_propagation")``: Set the parallel mode and select the Strategy Propagation Algorithm via ``parallel_mode``.
 
 2. ``mindspore.nn.Cell.shard()`` and ``mindspore.ops.Primitive.shard()``: Specifies the operator sharding strategy, and the strategy for the rest of the operators is derived by the propagation algorithm. Currently the ``mindspore.nn.Cell.shard()`` interface can be used in PyNative mode and Graph mode; The ``mindspore.ops.Primitive.shard()`` interface can only be used in Graph mode.
 
diff --git a/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst b/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst
index ab821228fdbfe31ee83b744c80a73e46262cad73..86a2b88674c72d670bbe9da0337f312c6517b469 100644
--- a/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst
+++ b/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst
@@ -71,7 +71,7 @@ MindSpore将单机版本的程序转换成并行版本的程序。该转换是
 
 相关接口:
 
-1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="recursive_programming")``:设置并行模式,可以通过parallel_mode选择策略传播算法或双递归算法。
+1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="sharding_propagation")``:设置并行模式,可以通过parallel_mode选择策略传播算法。
 
 2. ``mindspore.nn.Cell.shard()`` 以及 ``mindspore.ops.Primitive.shard()``:指定算子切分策略,其余算子的策略通过传播算法推导得到。目前 ``mindspore.nn.Cell.shard()`` 接口同时支持 PyNative 模式与 Graph 模式;``mindspore.ops.Primitive.shard()`` 接口仅可在 Graph 模式下使用。
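For reviewers less familiar with the ``shard()`` strategy tuples mentioned above, the following is a toy, MindSpore-independent sketch of what a 2-D sharding strategy such as ``(2, 2)`` means for one tensor input: each number says how many slices that dimension is cut into, and each resulting shard is placed on one device. ``shard_2d`` is a hypothetical helper written for illustration only; it is not part of the MindSpore API.

```python
def shard_2d(tensor, strategy):
    """Split a 2-D list-of-lists `tensor` into shards per `strategy`.

    `strategy` is a (row_cuts, col_cuts) tuple: each dimension of the
    tensor is cut into that many equal slices, yielding
    row_cuts * col_cuts shards (conceptually, one per device).
    """
    rows, cols = len(tensor), len(tensor[0])
    r_cuts, c_cuts = strategy
    assert rows % r_cuts == 0 and cols % c_cuts == 0, "dims must divide evenly"
    rh, ch = rows // r_cuts, cols // c_cuts  # shard height and width
    shards = []
    for i in range(r_cuts):
        for j in range(c_cuts):
            shards.append([row[j * ch:(j + 1) * ch]
                           for row in tensor[i * rh:(i + 1) * rh]])
    return shards

# A 4x4 tensor filled with 0..15, sharded by strategy (2, 2):
tensor = [[r * 4 + c for c in range(4)] for r in range(4)]
shards = shard_2d(tensor, (2, 2))
print(len(shards))   # 4 shards of shape (2, 2), one per device
print(shards[0])     # [[0, 1], [4, 5]]
```

With a strategy of ``(1, 4)`` the same tensor would instead be cut only along columns into four 4x1 shards; the propagation algorithm's job, per the docs above, is to derive such strategies for every operator the user did not annotate.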