From 15586767962b312eacf7db6416dd146ad1701d59 Mon Sep 17 00:00:00 2001
From: Wu Jinchen
Date: Fri, 6 Jun 2025 15:06:21 +0800
Subject: [PATCH] fix sp interface

---
 docs/mindspore/source_en/features/parallel/auto_parallel.rst    | 2 +-
 docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/mindspore/source_en/features/parallel/auto_parallel.rst b/docs/mindspore/source_en/features/parallel/auto_parallel.rst
index 7b19781bdb..a4b82b6d21 100644
--- a/docs/mindspore/source_en/features/parallel/auto_parallel.rst
+++ b/docs/mindspore/source_en/features/parallel/auto_parallel.rst
@@ -69,7 +69,7 @@ The sharding strategy propagation algorithm means that the user only needs to ma
 
 Related interfaces:
 
-1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="recursive_programming")``: Set the parallel mode and select the Strategy Propagation Algorithm or Recursive Algorithm via ``parallel_mode``.
+1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="sharding_propagation")``: Set the parallel mode and select the Strategy Propagation Algorithm via ``parallel_mode``.
 
 2. ``mindspore.nn.Cell.shard()`` and ``mindspore.ops.Primitive.shard()``: Specifies the operator sharding strategy, and the strategy for the rest of the operators is derived by the propagation algorithm. Currently the ``mindspore.nn.Cell.shard()`` interface can be used in PyNative mode and Graph mode; The ``mindspore.ops.Primitive.shard()`` interface can only be used in Graph mode.
 
diff --git a/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst b/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst
index ab821228fd..86a2b88674 100644
--- a/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst
+++ b/docs/mindspore/source_zh_cn/features/parallel/auto_parallel.rst
@@ -71,7 +71,7 @@ MindSpore将单机版本的程序转换成并行版本的程序。该转换是
 
 相关接口:
 
-1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="recursive_programming")``:设置并行模式,可以通过parallel_mode选择策略传播算法或双递归算法。
+1. ``mindspore.parallel.auto_parallel.AutoParallel(net, parallel_mode="sharding_propagation")``:设置并行模式,可以通过parallel_mode选择策略传播算法。
 
 2. ``mindspore.nn.Cell.shard()`` 以及 ``mindspore.ops.Primitive.shard()``:指定算子切分策略,其余算子的策略通过传播算法推导得到。目前 ``mindspore.nn.Cell.shard()`` 接口同时支持 PyNative 模式与 Graph 模式;``mindspore.ops.Primitive.shard()`` 接口仅可在 Graph 模式下使用。
 
--
Gitee
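Note for reviewers (placed after the signature trailer, so `git am` ignores it): a minimal, hedged sketch of how the interface corrected by this patch is typically used. The `AutoParallel` import path and the `"sharding_propagation"` value come from the patch itself; the `Net` class, the `MatMul` strategy `((2, 1), (1, 4))`, and the 8-device mesh are illustrative assumptions, not part of the patch, and the MindSpore-specific calls are guarded so the sketch degrades gracefully where MindSpore is not installed.

```python
# Hedged sketch, assuming MindSpore's sharding-propagation interface as
# documented by this patch. The strategy tuple and device count are examples.
import importlib.util

# Strategy for a MatMul x @ w: shard x's rows across 2 devices and w's
# columns across 4, i.e. a 2x4 device mesh requiring 8 devices in total.
in_strategy = ((2, 1), (1, 4))
devices_needed = 1
for operand_strategy in in_strategy:
    for split in operand_strategy:
        devices_needed *= split
print(devices_needed)  # -> 8

if importlib.util.find_spec("mindspore"):  # runs only where MindSpore exists
    import mindspore.nn as nn
    import mindspore.ops as ops
    from mindspore.parallel.auto_parallel import AutoParallel

    class Net(nn.Cell):
        def __init__(self):
            super().__init__()
            self.matmul = ops.MatMul()

        def construct(self, x, w):
            return self.matmul(x, w)

    net = Net()
    # Annotate only the MatMul operator; the sharding propagation
    # algorithm derives strategies for all remaining operators.
    net.matmul.shard(in_strategy=in_strategy)
    parallel_net = AutoParallel(net, parallel_mode="sharding_propagation")
```

The deterministic part (the device-count arithmetic) shows why this particular strategy needs an 8-card setup; the guarded part mirrors the one-operator-annotated workflow that the patched docs describe.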