From 737bb9e6799e3dd30fb855ef61ea2836269a4818 Mon Sep 17 00:00:00 2001
From: huan <3174348550@qq.com>
Date: Tue, 15 Jul 2025 11:21:49 +0800
Subject: [PATCH] delete files

---
 tutorials/source_en/parallel/distributed_case.rst    | 3 +--
 tutorials/source_en/parallel/overview.md             | 1 -
 tutorials/source_zh_cn/parallel/distributed_case.rst | 1 -
 tutorials/source_zh_cn/parallel/overview.md          | 1 -
 4 files changed, 1 insertion(+), 5 deletions(-)

diff --git a/tutorials/source_en/parallel/distributed_case.rst b/tutorials/source_en/parallel/distributed_case.rst
index aca997e988..a7887c21b1 100644
--- a/tutorials/source_en/parallel/distributed_case.rst
+++ b/tutorials/source_en/parallel/distributed_case.rst
@@ -8,5 +8,4 @@ Distributed High-Level Configuration Case
 .. toctree::
    :maxdepth: 1
 
-   multiple_mixed
-   ms_operator
\ No newline at end of file
+   multiple_mixed
\ No newline at end of file
diff --git a/tutorials/source_en/parallel/overview.md b/tutorials/source_en/parallel/overview.md
index 164ef39f41..3c2d791848 100644
--- a/tutorials/source_en/parallel/overview.md
+++ b/tutorials/source_en/parallel/overview.md
@@ -66,4 +66,3 @@ If there is a requirement for performance, throughput, or scale, or if you don't
 ## Distributed High-Level Configuration Examples
 
 - **Multi-dimensional Hybrid Parallel Case Based on Double Recursive Search**: Multi-dimensional hybrid parallel based on double recursive search means that the user can configure optimization methods such as recomputation, optimizer parallel, pipeline parallel. Based on the user configurations, the operator-level strategy is automatically searched by the double recursive strategy search algorithm, which generates the optimal parallel strategy. For details, please refer to the [Multi-dimensional Hybrid Parallel Case Based on Double Recursive Search](https://www.mindspore.cn/tutorials/en/r2.7.0rc1/parallel/multiple_mixed.html).
-- **Performing Distributed Training on K8S Clusters**: MindSpore Operator is a plugin that follows Kubernetes' Operator pattern (based on the CRD-Custom Resource Definition feature) and implements distributed training on Kubernetes. MindSpore Operator defines Scheduler, PS, Worker three roles in CRD, and users can easily use MindSpore on K8S for distributed training through simple YAML file configuration. The code repository of mindSpore Operator is described in: [ms-operator](https://gitee.com/mindspore/ms-operator/). For details, please refer to the [Performing Distributed Training on K8S Clusters](https://www.mindspore.cn/tutorials/en/r2.7.0rc1/parallel/ms_operator.html).
diff --git a/tutorials/source_zh_cn/parallel/distributed_case.rst b/tutorials/source_zh_cn/parallel/distributed_case.rst
index fc09a68100..da4891d22d 100644
--- a/tutorials/source_zh_cn/parallel/distributed_case.rst
+++ b/tutorials/source_zh_cn/parallel/distributed_case.rst
@@ -9,4 +9,3 @@
    :maxdepth: 1
 
    multiple_mixed
-   ms_operator
diff --git a/tutorials/source_zh_cn/parallel/overview.md b/tutorials/source_zh_cn/parallel/overview.md
index 7e5ef73311..437d215ea8 100644
--- a/tutorials/source_zh_cn/parallel/overview.md
+++ b/tutorials/source_zh_cn/parallel/overview.md
@@ -66,4 +66,3 @@ MindSpore提供两种粒度的算子级并行能力:算子级并行和高阶
 ## 分布式高阶配置案例
 
 - **基于双递归搜索的多维混合并行案例**:基于双递归搜索的多维混合并行是指用户可以配置重计算、优化器并行、流水线并行等优化方法,在用户配置的基础上,通过双递归策略搜索算法进行算子级策略自动搜索,进而生成最优的并行策略。详细可参考[基于双递归搜索的多维混合并行案例](https://www.mindspore.cn/tutorials/zh-CN/r2.7.0rc1/parallel/multiple_mixed.html)。
-- **在K8S集群上进行分布式训练**:MindSpore Operator是遵循Kubernetes的Operator模式(基于CRD-Custom Resource Definition功能),实现的在Kubernetes上进行分布式训练的插件。其中,MindSpore Operator在CRD中定义了Scheduler、PS、Worker三种角色,用户只需通过简单的YAML文件配置,就可以轻松地在K8S上进行分布式训练。MindSpore Operator的代码仓详见:[ms-operator](https://gitee.com/mindspore/ms-operator/)。详细可参考[在K8S集群上进行分布式训练](https://www.mindspore.cn/tutorials/zh-CN/r2.7.0rc1/parallel/ms_operator.html)。
--
Gitee
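
The multi-dimensional hybrid parallel case that remains in the toctree combines user-configured optimizations (recomputation, optimizer parallel, pipeline parallel) with automatic operator-level strategy search. Below is a minimal sketch of that configuration, assuming the context-style mindspore.set_auto_parallel_context API and an illustrative two-stage pipeline; the script must be launched with a distributed launcher such as msrun, and the cell named net is a hypothetical user-defined network:

    # Hybrid parallel sketch: user-configured optimizations plus automatic
    # operator-level strategy search via the double recursive algorithm
    # (search_mode="recursive_programming"). Assumes a multi-device launch.
    import mindspore as ms
    from mindspore.communication import init

    init()  # set up the communication backend; fails in a single-process run

    ms.set_auto_parallel_context(
        parallel_mode="auto_parallel",        # operator-level strategies searched automatically
        search_mode="recursive_programming",  # double recursive strategy search
        enable_parallel_optimizer=True,       # user-configured optimizer parallel
        pipeline_stages=2,                    # user-configured pipeline parallel (illustrative)
    )

    # Recomputation is configured per cell on a user-defined mindspore.nn.Cell:
    # net.recompute()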
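
The K8S training flow removed by this patch relies on Kubernetes' Operator pattern: ms-operator registers a Custom Resource Definition, and a controller creates the Scheduler, PS, and Worker pods that a submitted resource describes. The sketch below submits such a resource with the Kubernetes Python client; the group/version/kind and spec fields are hypothetical placeholders, not the actual ms-operator schema (see https://gitee.com/mindspore/ms-operator/ for the real CRD):

    # Sketch: create a hypothetical MindSpore training custom resource.
    # Group/version/plural and the spec layout are placeholders; the real
    # CRD schema lives in the ms-operator repository.
    from kubernetes import client, config

    config.load_kube_config()  # or load_incluster_config() inside a pod
    api = client.CustomObjectsApi()

    msjob = {
        "apiVersion": "example.mindspore.io/v1",  # hypothetical group/version
        "kind": "MSJob",                          # hypothetical kind
        "metadata": {"name": "demo-training"},
        "spec": {
            # The tutorial text names three CRD roles: Scheduler, PS, Worker.
            "scheduler": {"replicas": 1},
            "ps": {"replicas": 2},
            "worker": {"replicas": 8},
        },
    }

    api.create_namespaced_custom_object(
        group="example.mindspore.io", version="v1",
        namespace="default", plural="msjobs", body=msjob,
    )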