diff --git a/docs/mindformers/docs/source_en/introduction/models.md b/docs/mindformers/docs/source_en/introduction/models.md
index dec643604616a45c9d37fa5b3935b1ba274004b6..60fd8bef432252e6979335242c1a983c7a057c17 100644
--- a/docs/mindformers/docs/source_en/introduction/models.md
+++ b/docs/mindformers/docs/source_en/introduction/models.md
@@ -6,32 +6,32 @@ The following table lists models supported by MindSpore TransFormers.
 
 | Model | Specifications | Model Type | Latest Version |
 |:--------------------------------------------------------------------------------------------------------|:------------------------------|:----------------:|:--------------:|
-| [DeepSeek-V3](https://gitee.com/mindspore/mindformers/blob/master/research/deepseek3) | 671B | Sparse LLM | 1.6.0 |
+| [DeepSeek-V3](https://gitee.com/mindspore/mindformers/tree/master/research/deepseek3) | 671B | Sparse LLM | 1.6.0 |
 | [GLM4](https://gitee.com/mindspore/mindformers/blob/master/docs/model_cards/glm4.md) | 9B | Dense LLM | 1.6.0 |
-| [Llama3.1](https://gitee.com/mindspore/mindformers/blob/master/research/llama3_1) | 8B/70B | Dense LLM | 1.6.0 |
-| [Mixtral](https://gitee.com/mindspore/mindformers/blob/master/research/mixtral) | 8x7B | Sparse LLM | 1.6.0 |
-| [Qwen2.5](https://gitee.com/mindspore/mindformers/blob/master/research/qwen2_5) | 0.5B/1.5B/7B/14B/32B/72B | Dense LLM | 1.6.0 |
-| [TeleChat2](https://gitee.com/mindspore/mindformers/blob/master/research/telechat2) | 7B/35B/115B | Dense LLM | 1.6.0 |
+| [Llama3.1](https://gitee.com/mindspore/mindformers/tree/master/research/llama3_1) | 8B/70B | Dense LLM | 1.6.0 |
+| [Mixtral](https://gitee.com/mindspore/mindformers/tree/master/research/mixtral) | 8x7B | Sparse LLM | 1.6.0 |
+| [Qwen2.5](https://gitee.com/mindspore/mindformers/tree/master/research/qwen2_5) | 0.5B/1.5B/7B/14B/32B/72B | Dense LLM | 1.6.0 |
+| [TeleChat2](https://gitee.com/mindspore/mindformers/tree/master/research/telechat2) | 7B/35B/115B | Dense LLM | 1.6.0 |
 | [CodeLlama](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/codellama.md) | 34B | Dense LLM | 1.5.0 |
 | [CogVLM2-Image](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_image.md) | 19B | MM | 1.5.0 |
 | [CogVLM2-Video](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_video.md) | 13B | MM | 1.5.0 |
-| [DeepSeek-V2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek2) | 236B | Sparse LLM | 1.5.0 |
-| [DeepSeek-Coder-V1.5](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek1_5) | 7B | Dense LLM | 1.5.0 |
-| [DeepSeek-Coder](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek) | 33B | Dense LLM | 1.5.0 |
-| [GLM3-32K](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/glm32k) | 6B | Dense LLM | 1.5.0 |
+| [DeepSeek-V2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek2) | 236B | Sparse LLM | 1.5.0 |
+| [DeepSeek-Coder-V1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek1_5) | 7B | Dense LLM | 1.5.0 |
+| [DeepSeek-Coder](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek) | 33B | Dense LLM | 1.5.0 |
+| [GLM3-32K](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/glm32k) | 6B | Dense LLM | 1.5.0 |
 | [GLM3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm3.md) | 6B | Dense LLM | 1.5.0 |
-| [InternLM2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/internlm2) | 7B/20B | Dense LLM | 1.5.0 |
+| [InternLM2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/internlm2) | 7B/20B | Dense LLM | 1.5.0 |
 | [Llama3.2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama3_2.md) | 3B | Dense LLM | 1.5.0 |
 | [Llama3.2-Vision](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/mllama.md) | 11B | MM | 1.5.0 |
-| [Llama3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/llama3) | 8B/70B | Dense LLM | 1.5.0 |
+| [Llama3](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/llama3) | 8B/70B | Dense LLM | 1.5.0 |
 | [Llama2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama2.md) | 7B/13B/70B | Dense LLM | 1.5.0 |
-| [Qwen2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwen2) | 0.5B/1.5B/7B/57B/57B-A14B/72B | Dense/Sparse LLM | 1.5.0 |
-| [Qwen1.5](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwen1_5) | 7B/14B/72B | Dense LLM | 1.5.0 |
-| [Qwen-VL](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwenvl) | 9.6B | MM | 1.5.0 |
-| [TeleChat](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/telechat) | 7B/12B/52B | Dense LLM | 1.5.0 |
+| [Qwen2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen2) | 0.5B/1.5B/7B/57B/57B-A14B/72B | Dense/Sparse LLM | 1.5.0 |
+| [Qwen1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen1_5) | 7B/14B/72B | Dense LLM | 1.5.0 |
+| [Qwen-VL](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwenvl) | 9.6B | MM | 1.5.0 |
+| [TeleChat](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/telechat) | 7B/12B/52B | Dense LLM | 1.5.0 |
 | [Whisper](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/whisper.md) | 1.5B | MM | 1.5.0 |
-| [Yi](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/yi) | 6B/34B | Dense LLM | 1.5.0 |
-| [YiZhao](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/yizhao) | 12B | Dense LLM | 1.5.0 |
+| [Yi](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/yi) | 6B/34B | Dense LLM | 1.5.0 |
+| [YiZhao](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/yizhao) | 12B | Dense LLM | 1.5.0 |
 | [Baichuan2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/baichuan2/baichuan2.md) | 7B/13B | Dense LLM | 1.3.2 |
 | [GLM2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/glm2.md) | 6B | Dense LLM | 1.3.2 |
 | [GPT2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/gpt2.md) | 124M/13B | Dense LLM | 1.3.2 |
diff --git a/docs/mindformers/docs/source_zh_cn/introduction/models.md b/docs/mindformers/docs/source_zh_cn/introduction/models.md
index a71e537a311577a44caf7b2c71ddf73bc74c0103..4cdfeaadce869149443b9084be5af9bfd69363ef 100644
--- a/docs/mindformers/docs/source_zh_cn/introduction/models.md
+++ b/docs/mindformers/docs/source_zh_cn/introduction/models.md
@@ -6,32 +6,32 @@
 
 | 模型名 | 支持规格 | 模型类型 | 最新支持版本 |
 |:--------------------------------------------------------------------------------------------------------|:------------------------------------------|:-----------:|:------:|
-| [DeepSeek-V3](https://gitee.com/mindspore/mindformers/blob/master/research/deepseek3) | 671B | 稀疏LLM | 1.6.0 |
+| [DeepSeek-V3](https://gitee.com/mindspore/mindformers/tree/master/research/deepseek3) | 671B | 稀疏LLM | 1.6.0 |
 | [GLM4](https://gitee.com/mindspore/mindformers/blob/master/docs/model_cards/glm4.md) | 9B | 稠密LLM | 1.6.0 |
-| [Llama3.1](https://gitee.com/mindspore/mindformers/blob/master/research/llama3_1) | 8B/70B | 稠密LLM | 1.6.0 |
-| [Mixtral](https://gitee.com/mindspore/mindformers/blob/master/research/mixtral) | 8x7B | 稀疏LLM | 1.6.0 |
-| [Qwen2.5](https://gitee.com/mindspore/mindformers/blob/master/research/qwen2_5) | 0.5B/1.5B/7B/14B/32B/72B | 稠密LLM | 1.6.0 |
-| [TeleChat2](https://gitee.com/mindspore/mindformers/blob/master/research/telechat2) | 7B/35B/115B | 稠密LLM | 1.6.0 |
+| [Llama3.1](https://gitee.com/mindspore/mindformers/tree/master/research/llama3_1) | 8B/70B | 稠密LLM | 1.6.0 |
+| [Mixtral](https://gitee.com/mindspore/mindformers/tree/master/research/mixtral) | 8x7B | 稀疏LLM | 1.6.0 |
+| [Qwen2.5](https://gitee.com/mindspore/mindformers/tree/master/research/qwen2_5) | 0.5B/1.5B/7B/14B/32B/72B | 稠密LLM | 1.6.0 |
+| [TeleChat2](https://gitee.com/mindspore/mindformers/tree/master/research/telechat2) | 7B/35B/115B | 稠密LLM | 1.6.0 |
 | [CodeLlama](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/codellama.md) | 34B | 稠密LLM | 1.5.0 |
 | [CogVLM2-Image](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_image.md) | 19B | MM | 1.5.0 |
 | [CogVLM2-Video](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/cogvlm2_video.md) | 13B | MM | 1.5.0 |
-| [DeepSeek-V2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek2) | 236B | 稀疏LLM | 1.5.0 |
-| [DeepSeek-Coder-V1.5](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek1_5) | 7B | 稠密LLM | 1.5.0 |
-| [DeepSeek-Coder](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/deepseek) | 33B | 稠密LLM | 1.5.0 |
-| [GLM3-32K](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/glm32k) | 6B | 稠密LLM | 1.5.0 |
+| [DeepSeek-V2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek2) | 236B | 稀疏LLM | 1.5.0 |
+| [DeepSeek-Coder-V1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek1_5) | 7B | 稠密LLM | 1.5.0 |
+| [DeepSeek-Coder](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/deepseek) | 33B | 稠密LLM | 1.5.0 |
+| [GLM3-32K](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/glm32k) | 6B | 稠密LLM | 1.5.0 |
 | [GLM3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/glm3.md) | 6B | 稠密LLM | 1.5.0 |
-| [InternLM2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/internlm2) | 7B/20B | 稠密LLM | 1.5.0 |
+| [InternLM2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/internlm2) | 7B/20B | 稠密LLM | 1.5.0 |
 | [Llama3.2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama3_2.md) | 3B | 稠密LLM | 1.5.0 |
 | [Llama3.2-Vision](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/mllama.md) | 11B | MM | 1.5.0 |
-| [Llama3](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/llama3) | 8B/70B | 稠密LLM | 1.5.0 |
+| [Llama3](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/llama3) | 8B/70B | 稠密LLM | 1.5.0 |
 | [Llama2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/llama2.md) | 7B/13B/70B | 稠密LLM | 1.5.0 |
-| [Qwen2](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwen2) | 0.5B/1.5B/7B/57B/57B-A14B/72B | 稠密/稀疏LLM | 1.5.0 |
-| [Qwen1.5](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwen1_5) | 7B/14B/72B | 稠密LLM | 1.5.0 |
-| [Qwen-VL](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/qwenvl) | 9.6B | MM | 1.5.0 |
-| [TeleChat](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/telechat) | 7B/12B/52B | 稠密LLM | 1.5.0 |
+| [Qwen2](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen2) | 0.5B/1.5B/7B/57B/57B-A14B/72B | 稠密/稀疏LLM | 1.5.0 |
+| [Qwen1.5](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwen1_5) | 7B/14B/72B | 稠密LLM | 1.5.0 |
+| [Qwen-VL](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/qwenvl) | 9.6B | MM | 1.5.0 |
+| [TeleChat](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/telechat) | 7B/12B/52B | 稠密LLM | 1.5.0 |
 | [Whisper](https://gitee.com/mindspore/mindformers/blob/r1.5.0/docs/model_cards/whisper.md) | 1.5B | MM | 1.5.0 |
-| [Yi](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/yi) | 6B/34B | 稠密LLM | 1.5.0 |
-| [YiZhao](https://gitee.com/mindspore/mindformers/blob/r1.5.0/research/yizhao) | 12B | 稠密LLM | 1.5.0 |
+| [Yi](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/yi) | 6B/34B | 稠密LLM | 1.5.0 |
+| [YiZhao](https://gitee.com/mindspore/mindformers/tree/r1.5.0/research/yizhao) | 12B | 稠密LLM | 1.5.0 |
 | [Baichuan2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/research/baichuan2/baichuan2.md) | 7B/13B | 稠密LLM | 1.3.2 |
 | [GLM2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/glm2.md) | 6B | 稠密LLM | 1.3.2 |
 | [GPT2](https://gitee.com/mindspore/mindformers/blob/r1.3.0/docs/model_cards/gpt2.md) | 124M/13B | 稠密LLM | 1.3.2 |
diff --git a/docs/mindspore/source_en/faq/network_compilation.md b/docs/mindspore/source_en/faq/network_compilation.md
index ea6b8e474722936f52ca12a42a390bf26f479a36..2c27b842d5b661748a3330868a9f396d234cd7eb 100644
--- a/docs/mindspore/source_en/faq/network_compilation.md
+++ b/docs/mindspore/source_en/faq/network_compilation.md
@@ -413,8 +413,7 @@ A: The "External" type indicates that an object that cannot be natively supporte
 
 ## Q: What can I do if an error `Nested execution during JIT execution for 'xxx' is not supported when 'xxx' compile and execute.` is reported?
 
-A: When the compilation process is triggered, that is, when the code is compiled into a static computational diagram
-, see [Graph Mode Execution Principle](https://www.mindspore.cn/docs/en/master/features/program_form/overview.html), using the JIT Fallback feature by default, the above exception will be thrown when entering the compilation process again.
+A: When the compilation process is triggered, that is, when the code is compiled into a static computational diagram, using the JIT Fallback feature by default, the above exception will be thrown when entering the compilation process again.
 
 Taking JIT Fallback support for calling objects and methods from third-party libraries as an example:
 
diff --git a/docs/mindspore/source_en/features/data_engine.md b/docs/mindspore/source_en/features/data_engine.md
index fada48b563d61b9066e17ffa3fd874abeaacfa9a..256403c7b0b70fac8e1d1d0fab10ebb251c357f4 100644
--- a/docs/mindspore/source_en/features/data_engine.md
+++ b/docs/mindspore/source_en/features/data_engine.md
@@ -14,7 +14,7 @@ The core of MindSpore training data processing engine is to efficiently and flex
 - Provide an automatic data augmentation mode, and perform automatic data augmentation on images based on specific strategies.
 - Provide single-node data caching capability to solve the problem of repeated loading and processing of data, reduce data processing overhead, and improve device-to-device training efficiency.
 
-Please refer to the instructions for usage: [Data Loading And Processing](https://www.mindspore.cn/docs/en/master/features/dataset/overview.html)
+Please refer to the instructions for usage: [Data Loading And Processing](https://www.mindspore.cn/tutorials/en/master/dataset/overview.html)
 
 ![image](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/docs/mindspore/source_en/features/images/data/data_engine_en.png)
 
diff --git a/docs/mindspore/source_en/features/overview.md b/docs/mindspore/source_en/features/overview.md
index 9d49376aa4bf3db34710b0d1a161396573228c12..2899bbfb4b41336bd2019ecc97f77a60b856db4a 100644
--- a/docs/mindspore/source_en/features/overview.md
+++ b/docs/mindspore/source_en/features/overview.md
@@ -43,7 +43,7 @@ MindSpore implements functional differential programming, which performs differe
 
 At the same time, based on the functional programming paradigm, MindSpore provides rich higher-order functions such as vmap, shard, and other built-in higher-order functions. Like the differential function grad, these allow developers to conveniently construct a function or object as a parameter for higher-order functions. Higher-order functions, after internal compilation optimization, generate optimized versions of developers' functions, implementing features such as vectorization transformation and distributed parallel partitioning.
 
-### [Unified Programming Experience for Dynamic and Static Graphs](https://www.mindspore.cn/docs/en/master/features/program_form/overview.html)
+### Unified Programming Experience for Dynamic and Static Graphs
 
 Traditional AI frameworks mainly have two programming execution forms: static graph mode and dynamic eager mode.
 
diff --git a/docs/mindspore/source_en/features/runtime/memory_manager.md b/docs/mindspore/source_en/features/runtime/memory_manager.md
index 1162827adce30c41f68c497e4da76979301607b9..c51fcf785cbd9ba3898c0f727feab7014c3a162e 100644
--- a/docs/mindspore/source_en/features/runtime/memory_manager.md
+++ b/docs/mindspore/source_en/features/runtime/memory_manager.md
@@ -9,7 +9,7 @@ Device memory (hereinafter referred to as memory) is the most important resource
 1. Memory pool serves as a base for memory management and can effectively avoid the overhead of frequent dynamic allocation of memory.
 2. Memory reuse algorithm, as a core competency in memory management, needs to have efficient memory reuse results as well as minimal memory fragmentation.
 
-![memory_manager](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/docs/mindspore/source_en/features/images/multi_level_compilation/jit_level_memory_manage.png)
+![memory_manager](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/docs/mindspore/source_en/features/compile/images/multi_level_compilation/jit_level_memory_manage.png)
 
 ## Interfaces
 
@@ -22,7 +22,7 @@ The memory management-related interfaces are detailed in [runtime interfaces](ht
 The core idea of memory pool as a base for memory management is to pre-allocate a large block of contiguous memory, allocate it directly from the pool when applying for memory, and return it to the pool for reuse when releasing it, instead of frequently calling the memory application and release interfaces in the system, which reduces the overhead of frequent dynamic allocations, and improves system performance.
 
 MindSpore mainly uses the BestFit memory allocation algorithm, supports dynamic expansion of memory blocks and defragmentation, and sets the initialization parameters of the memory pool through the interface [mindspore.runtime.set_memory(init_size,increase_size,max_size)](https://www.mindspore.cn/docs/en/master/api_python/runtime/mindspore.runtime.set_memory.html) to control the dynamic expansion size and maximum memory usage.
-![memory_pool](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/docs/mindspore/source_en/features/images/multi_level_compilation/jit_level_memory_pool.png)
+![memory_pool](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/docs/mindspore/source_en/features/compile/images/multi_level_compilation/jit_level_memory_pool.png)
 
 1. Slicing operation: When memory is allocated, free areas are sorted according to their sizes, the first free area that meets the requirements is found, allocated on demand, the excess is cut, and a new block of free memory is inserted.
 2. Merge operation: When memory is reclaimed, neighboring free memory blocks are reclaimed and merged into one large free memory block.
diff --git a/docs/mindspore/source_en/features/runtime/multilevel_pipeline.md b/docs/mindspore/source_en/features/runtime/multilevel_pipeline.md
index 235120d55c435a9349a24584098aea1a7f62f1e6..252fd765a089b605229f9efa137792fbee0e2d29 100644
--- a/docs/mindspore/source_en/features/runtime/multilevel_pipeline.md
+++ b/docs/mindspore/source_en/features/runtime/multilevel_pipeline.md
@@ -12,7 +12,7 @@ Runtime scheduling for an operator mainly includes the operations InferShape (in
 
 Multi-stage flow is a key performance optimization point for runtime, which improves runtime scheduling efficiency by task decomposition and parallel flow issued to give full play to CPU multi-core performance. The main flow is as follows:
 
-![rt_pipeline](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/docs/mindspore/source_en/features/images/multi_level_compilation/jit_level_rt_pipeline.png)
+![rt_pipeline](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/docs/mindspore/source_en/features/compile/images/multi_level_compilation/jit_level_rt_pipeline.png)
 
 1. Task decomposition: the operator scheduling is decomposed into three tasks InferShape, Resize and Launch.
 2. Queue creation: Create three queues, Infer Queue, Resize Queue and Launch Queue, for taking over the three tasks in step 1.
diff --git a/docs/mindspore/source_en/features/runtime/multistream_concurrency.md b/docs/mindspore/source_en/features/runtime/multistream_concurrency.md
index b99d645b3ffac040d2061977cdcfea8d372adda0..db9291024e4305e603a7020cbf1859042f2f15b0 100644
--- a/docs/mindspore/source_en/features/runtime/multistream_concurrency.md
+++ b/docs/mindspore/source_en/features/runtime/multistream_concurrency.md
@@ -10,7 +10,7 @@ During the training of large-scale deep learning models, the importance of commu
 
 Traditional multi-stream concurrency methods usually rely on manual configuration, which is not only cumbersome and error-prone, but also often difficult to achieve optimal concurrency when faced with complex computational graphs. MindSpore's automatic stream assignment feature automatically identifies and assigns concurrency opportunities in the computational graph by means of an intelligent algorithm, and assigns different operators to different streams for execution. This automated allocation process not only simplifies user operations, but also enables dynamic adjustment of stream allocation policies at runtime to accommodate different computing environments and resource conditions.
 
-![multi_stream](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/docs/mindspore/source_en/features/images/multi_level_compilation/jit_level_multi_stream.png)
+![multi_stream](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/master/docs/mindspore/source_en/features/compile/images/multi_level_compilation/jit_level_multi_stream.png)
 
 The principles are as follows:
 
diff --git a/docs/mindspore/source_zh_cn/faq/network_compilation.md b/docs/mindspore/source_zh_cn/faq/network_compilation.md
index 233a8a87c409967932be66d200fe8e00cd28a58c..739c48f38125e03df8a1cc693fede4ec0ee4aa7f 100644
--- a/docs/mindspore/source_zh_cn/faq/network_compilation.md
+++ b/docs/mindspore/source_zh_cn/faq/network_compilation.md
@@ -411,7 +411,7 @@ A: “External” 类型表示在图模式中使用了无法原生支持的对
 
 ## Q: 编译时报错`Nested execution during JIT execution for 'xxx' is not supported when 'xxx' compile and execute.`怎么办?
 
-A: 当触发编译流程,即代码编译成静态计算图时,见[Graph模式执行原理](https://www.mindspore.cn/docs/zh-CN/master/features/program_form/overview.html),同时在默认使用JIT Fallback特性时,再次进入编译流程时,则会抛出以上异常。
+A: 当触发编译流程,即代码编译成静态计算图时,同时在默认使用JIT Fallback特性时,再次进入编译流程时,则会抛出以上异常。
 
 下面以JIT Fallback支持调用第三方库的对象和方法为例:
 
diff --git a/docs/mindspore/source_zh_cn/features/data_engine.md b/docs/mindspore/source_zh_cn/features/data_engine.md
index 19c621bd760a108f2ff0f0de349a5befe46bad78..1940e5cbe4584619ca3d722495b5e6a79b055d9a 100644
--- a/docs/mindspore/source_zh_cn/features/data_engine.md
+++ b/docs/mindspore/source_zh_cn/features/data_engine.md
@@ -14,7 +14,7 @@ MindSpore训练数据处理引擎核心是将训练样本(数据集)高效
 - 提供了自动数据增强模式,能够基于特定策略自动对图像进行数据增强处理;
 - 提供单节点数据缓存能力,解决重复加载、处理数据的问题,降低数据处理开销,提升端到端训练效率。
 
-具体用法参考:[数据处理与加载](https://www.mindspore.cn/docs/zh-CN/master/features/dataset/overview.html)
+具体用法参考:[数据处理与加载](https://www.mindspore.cn/tutorials/zh-CN/master/dataset/overview.html)
 
 ![image](./images/data/data_engine.png)
 
diff --git a/docs/mindspore/source_zh_cn/features/overview.md b/docs/mindspore/source_zh_cn/features/overview.md
index 414bee3a89fbfac5d54005f46decc2e72876306b..4062caf1872c1571a6bdd234dbbb6e81cee44930 100644
--- a/docs/mindspore/source_zh_cn/features/overview.md
+++ b/docs/mindspore/source_zh_cn/features/overview.md
@@ -43,7 +43,7 @@ MindSpore实现了函数式微分编程,对可被微分求导的函数对象
 
 同时,基于函数式编程范式,MindSpore提供了丰富高阶函数如vmap、shard等内置高阶函数功能。与微分求导函数grad一样,可以让开发者方便的构造一个函数或对象,作为高阶函数的参数。高阶函数经过内部编译优化,生成针对开发者函数的优化版本,实现如向量化变换、分布式并行切分等特点功能。
 
-### [动静统一的编程体验](https://www.mindspore.cn/docs/zh-CN/master/features/program_form/overview.html)
+### 动静统一的编程体验
 
 传统AI框架主要有两种编程执行形态,静态图模式和动态图模式。