diff --git a/docs/mindspore/source_en/api_python/operator_list_parallel.md b/docs/mindspore/source_en/api_python/operator_list_parallel.md
index 061106b99cfd5e5e5494cfc66bff877aa7101a17..294d45eb4a5fe86281cb4cc4469d9b5ab818d41d 100644
--- a/docs/mindspore/source_en/api_python/operator_list_parallel.md
+++ b/docs/mindspore/source_en/api_python/operator_list_parallel.md
@@ -160,7 +160,7 @@ None | Not support confi
 | [mindspore.ops.Sub](https://www.mindspore.cn/docs/en/master/api_python/ops/mindspore.ops.Sub.html) | None | Layout configuration is supported. The input layout should be the same or broadcastable. The output layout cannot be configured. |
 | [mindspore.ops.Tan](https://www.mindspore.cn/docs/en/master/api_python/ops/mindspore.ops.Tan.html) | None | Not support config layout |
 | [mindspore.ops.Tanh](https://www.mindspore.cn/docs/en/master/api_python/ops/mindspore.ops.Tanh.html) | None | Not support config layout |
-| [mindspore.ops.Tile](https://www.mindspore.cn/docs/en/master/api_python/ops/mindspore.ops.Tile.html) | Only support configuring shard strategy for multiples. | Not support config layout |
+| [mindspore.ops.Tile](https://www.mindspore.cn/docs/en/master/api_python/ops/mindspore.ops.Tile.html) | Only supports configuring a shard strategy for dims. | Supports configuring the input and output layout. For a dimension whose dim (replication count) is 1, the input and output layout of that dimension must be the same; for a dimension whose dim is greater than 1, the input cannot be split along that dimension (to avoid reordering the replicated data), and the dim must be divisible by the output split strategy of that dimension. |
 | [mindspore.ops.TopK](https://www.mindspore.cn/docs/en/master/api_python/ops/mindspore.ops.TopK.html) | The input_x can't be split into the last dimension, otherwise it's inconsistent with the single machine in the mathematical logic. | Not support config layout |
 | [mindspore.ops.Transpose](https://www.mindspore.cn/docs/en/master/api_python/ops/mindspore.ops.Transpose.html) | None | Support config layout, and the output layout cannot be configured. |
 | [mindspore.ops.TruncateDiv](https://mindspore.cn/docs/en/master/api_python/ops/mindspore.ops.TruncateDiv.html) | None | Not support config layout |
diff --git a/docs/mindspore/source_zh_cn/api_python/operator_list_parallel.md b/docs/mindspore/source_zh_cn/api_python/operator_list_parallel.md
index eb358c366d192a7eb35336be03dc0bd4568c32a5..5feeec361d3edeeabde79d3cf958fadab2a813e8 100644
--- a/docs/mindspore/source_zh_cn/api_python/operator_list_parallel.md
+++ b/docs/mindspore/source_zh_cn/api_python/operator_list_parallel.md
@@ -159,7 +159,7 @@
 | [mindspore.ops.Sub](https://www.mindspore.cn/docs/zh-CN/master/api_python/ops/mindspore.ops.Sub.html) | 无 | 支持配置Layout,输入的Layout 需要相同或能广播,不支持配置输出的Layout |
 | [mindspore.ops.Tan](https://www.mindspore.cn/docs/zh-CN/master/api_python/ops/mindspore.ops.Tan.html) | 无 | 不支持配置Layout |
 | [mindspore.ops.Tanh](https://www.mindspore.cn/docs/zh-CN/master/api_python/ops/mindspore.ops.Tanh.html) | 无 | 不支持配置Layout |
-| [mindspore.ops.Tile](https://www.mindspore.cn/docs/zh-CN/master/api_python/ops/mindspore.ops.Tile.html) | 仅支持对multiples配置切分策略 | 不支持配置Layout |
+| [mindspore.ops.Tile](https://www.mindspore.cn/docs/zh-CN/master/api_python/ops/mindspore.ops.Tile.html) | 仅支持对dims配置切分策略 | 支持配置输入与输出的Layout,dim (复制次数) 为1的维度,输入与输出中此维度切分策略需相同;dim>1的维度,输入中此维度不允许切分以防止复制后数据乱序,输出中对应dim需要能被切分数整除 |
 | [mindspore.ops.TopK](https://www.mindspore.cn/docs/zh-CN/master/api_python/ops/mindspore.ops.TopK.html) | 最后一维不支持切分,切分后,在数学逻辑上和单机不等价 | 不支持配置Layout |
 | [mindspore.ops.Transpose](https://www.mindspore.cn/docs/zh-CN/master/api_python/ops/mindspore.ops.Transpose.html) | 无 | 支持配置Layout,不支持配置输出的Layout |
 | [mindspore.ops.TruncateDiv](https://mindspore.cn/docs/zh-CN/master/api_python/ops/mindspore.ops.TruncateDiv.html) | 无 | 不支持配置Layout |
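The per-dimension layout rule this patch documents for `mindspore.ops.Tile` can be sketched as a small validity check. This is plain Python for illustration only; the function name and argument convention are hypothetical, not part of the MindSpore API:

```python
def tile_layout_is_valid(dims, in_split, out_split):
    """Illustrative check of the Tile layout rule described in the docs.

    dims      -- replication count per dimension (Tile's dims argument)
    in_split  -- number of shards of the input along each dimension
    out_split -- number of shards of the output along each dimension
    """
    for dim, s_in, s_out in zip(dims, in_split, out_split):
        if dim == 1:
            # Replication count 1: input and output must use the same split.
            if s_in != s_out:
                return False
        else:
            # Replication count > 1: the input must stay unsplit along this
            # dimension, and dim must be divisible by the output split.
            if s_in != 1 or dim % s_out != 0:
                return False
    return True
```

For example, `tile_layout_is_valid((1, 4), (2, 1), (2, 2))` returns `True`: the first dimension is replicated once and keeps the same split on both sides, while the second dimension leaves the input unsplit and its replication count 4 is divisible by the output split 2.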