From b4b7ca0435bee93187097b9b90338f0367aabf0a Mon Sep 17 00:00:00 2001 From: fenglingcong Date: Tue, 1 Jun 2021 02:10:16 +0800 Subject: [PATCH 1/7] translation --- .../source_en/script_analysis.md | 117 +++++++----------- 1 file changed, 44 insertions(+), 73 deletions(-) diff --git a/docs/migration_guide/source_en/script_analysis.md b/docs/migration_guide/source_en/script_analysis.md index 18a19e45f9..3b28c11b72 100644 --- a/docs/migration_guide/source_en/script_analysis.md +++ b/docs/migration_guide/source_en/script_analysis.md @@ -1,76 +1,47 @@ -# Network Script Analysis - - - -- [Network Script Analysis](#network-script_analysis) - - [Operator Evaluation](#operator-evaluation) - - [MindSpore Operator Design](#mindspore-operator-design) - - [Query Operator Mapping Table](#query-operator-mapping-table) - - [Missing Operator Processing Strategy](#missing-operator-processing-strategy) - - [Grammar Assessment](#grammar-assessment) - - [Common Restriction Principles](#common-restriction-principles) - - [Common Processing Strategies](#common-processing-strategies) - - - - - -## Operator Evaluation - -### MindSpore Operator Design - -The process of using the MindSpore framework to build a neural network is similar to other frameworks (e.g., TensorFlow/PyTorch), but the supported operators are different. It is necessary to find out the missing operators in the MindSpore framework when performing network migration (e.g., migrating from TensorFlow to the MindSpore-ascend platform). +## Network Script Analysis +### Operator Evaluation +##### MindSpore operator design +Using the MindSpore framework to build a neural network is similar to other frameworks (TensorFlow/PyTorch), but the supported operators are different. Therefore, to find out the missing operators is necessary in the MindSpore framework during network migration (such as migrating from TensorFlow to the MindSpore Ascend platform). 
MindSpore API is composed of various Python/C++ API operators, which can be roughly divided into: -- Data framework operator - - Including tensors, basic data types, training gradients, optimizer operators, such as `mindspore.int32`, `mindspore.nn.Cell`, etc. - -- Data preprocessing operator - - Including image reading, data type conversion operators, such as `mindspore.dataset.MnistDataset`, etc. - -- Network structure operator - - Including convolution and normalization operators used in network construction, such as `mindspore.nn.Conv2d`, `mindspore.nn.Dense`, etc. - -The surface layer of the network structure operator is the MindSpore operator (hereinafter referred as ME operator), which is the operator API called by the user (e.g., `mindspore.nn.Softmax`), and the ME operator is implemented by calling the TBE operator (C/C++) at the bottom layer. - -When counting missing ME operators, you need to find out the corresponding operators of all operators in the source script (including data framework classes, data preprocessing, and network structure operators) in the MindSpore framework (e.g.,`tf.nn.relu` corresponds to MindSpore operator `mindspore.nn.ReLU`). If there is no corresponding operator in MindSpore, it will be counted as missing. - -### Query Operator Mapping Table - -Find the network structure and the Python file that implements the training function in the code library (the name is generally train.py model.py, etc.), and find all relevant operators in the script file (including data framework classes, data preprocessing, network structure operators, etc.), and compare with [MindSpore Operator API](https://www.mindspore.cn/doc/note/en/master/operator_list_ms.html) , to find the platform support status of the operator under `mindspore.nn` or `mindspore.ops`. 
- -If the corresponding ME operator cannot be found on this webpage, you can continue to search for the operator name in [MindSpore API List](https://www.mindspore.cn/doc/api_python/en/master/index.html). - -If the source code is a PyTorch script, you can directly query [MindSpore and PyTorch operator mapping](https://www.mindspore.cn/doc/note/en/master/index.html#operator_api) to find the corresponding MindSpore operator. Note that for operators with the same function, MindSpore may define a name for this operator differing from other frameworks, and the parameters and functions of operators with the same name may also be different from other frameworks. Please refer to the official description for checking the names. - -### Missing Operator Processing Strategy - -1. Consider replacing it with other operators: It is necessary to analyze the implementation formula of the operator and examine whether the existing MindSpore operator can be superimposed to achieve the expected goal. -2. Consider temporary circumvention solutions: For example, if a certain loss is not supported, it can be replaced with a loss operator of the same kind that has been supported. -3. Submit suggestions in [MindSpore Community](https://gitee.com/mindspore/mindspore/issues) to develop missing operators. - -## Grammar Assessment - -MindSpore provides two modes: `GRAPH_MODE` and `PYNATIVE_MODE`. - -In PyNative mode, the behavior of the model for **Evaluation** is same as that of in the general Python code. - -When using `GRAPH_MODE`, or when using `PYNATIVE_MODE` for **Training**, there are usually grammatical restrictions. In these two cases, it is necessary to perform graph compilation operations on the Python code. In this step, MindSpore has not yet been able to support the complete set of Python syntax, so there will be some restrictions on the implementation of the `construct` function. 
For specific restrictions, please refer to [MindSpore static graph syntax support](https://www.mindspore.cn/doc/note/en/master/static_graph_syntax_support.html).
-
-### Common Restriction Principles
-
-Compared with the specific syntax description, the common restrictions can be summarized as follows:
-
-- Do not call other Python module , such as numpy and scipy, when building the graph. The related processing should be moved to the `__init__` stage.
-- Do not use custom types when building the graph. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types.
-- Do not processing multi-threaded, multi-process data when building the graph.
-
-### Common Processing Strategies
-
-1. Use the operators provided by MindSpore to replace the functions of other Python libraries. The processing of constants can be moved to the `__init__` stage.
-2. Use basic types for combination, you can consider increasing the amount of function parameters. There are no restrictions on the input parameters of the function, and variable length input can be used.
-3. Avoid multithreading in the network.
+ - Data framework operator
+ Including tensors, basic data types, training gradients, and optimizer operators, such as mindspore.int32, mindspore.nn.Cell, etc.
+ - Data preprocessing operator
+ Including image reading and data type conversion operators, such as mindspore.dataset.MnistDataset, etc.
+ - Network structure operator
+ Including convolution and normalization operators used in network construction, such as mindspore.nn.Conv2d, mindspore.nn.Dense, etc.
+
+The surface layer of the network structure operator is the MindSpore operator (hereinafter referred to as the ME operator), that is, the operator API called by the user (such as mindspore.nn.Softmax), and the ME operator is implemented at the bottom layer by calling the TBE operator (C/C++).
+
+When counting missing ME operators, you need to find the MindSpore counterpart of every operator used in the source script (including data framework classes, data preprocessing, and network structure operators); for example, tf.nn.relu corresponds to the MindSpore operator mindspore.nn.ReLU. If there is no corresponding operator in MindSpore, it is counted as missing.
+##### Query operator mapping table
+Find the network structure and the Python files that implement the training function in the code repository (usually named train.py, model.py, etc.), locate all relevant operators in the script files (including data framework classes, data preprocessing, and network structure operators), and compare them with [the MindSpore operator API](https://www.mindspore.cn/doc/note/zh-CN/master/operator_list_ms.html) to find the platform support status of each operator under mindspore.nn or mindspore.ops.
+
+If the corresponding ME operator cannot be found on this page, you can continue to search for the operator name in [the MindSpore API list](https://www.mindspore.cn/doc/api_python/zh-CN/master/index.html).
+
+If the source code is a PyTorch script, you can directly query [the operator mapping between MindSpore and PyTorch](https://www.mindspore.cn/doc/note/zh-CN/master/index.html#operator_api) to find the corresponding MindSpore operator. Note that MindSpore may name an operator with the same function differently from other frameworks, and operators with the same name may also differ in parameters and behavior; refer to the official description.
+##### Missing operator processing strategy
+1. Consider replacing it with other operators: analyze the implementation formula of the operator and examine whether existing MindSpore operators can be composed to achieve the expected goal.
+
+2. Consider temporary workarounds: for example, if a certain loss is not supported, replace it with a supported loss operator of the same kind.
+
+3. Submit suggestions in [the MindSpore community](https://gitee.com/mindspore/mindspore/issues) to develop the missing operators.
+### Grammar assessment
+MindSpore provides two modes: GRAPH_MODE and PYNATIVE_MODE.
+
+The **inference** behavior of the model in PyNative mode is no different from that of general Python code.
+
+When using GRAPH_MODE, or when using PYNATIVE_MODE for **training**, there are usually grammatical restrictions. In both cases, MindSpore performs graph compilation on the Python code. Because MindSpore does not yet support the complete set of Python syntax in this step, there are some restrictions on writing the construct function. For specific restrictions, please refer to [the MindSpore static graph syntax support](https://www.mindspore.cn/doc/note/zh-CN/master/static_graph_syntax_support.html).
+##### Common restriction principles
+Compared with the detailed syntax description, the common limitations can be summarized as follows:
+
+ - Do not call other Python libraries, such as numpy and scipy, when building the graph. The related processing should be moved to the `__init__` stage.
+ - Do not use custom types when building the graph. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types.
+ - Do not process multi-threaded or multi-process data when building the graph.
+##### Common processing strategies
+1. Use the operators provided by MindSpore to replace the functions of other Python libraries. The processing of constants can be moved to the `__init__` stage.
+
+2. Combine basic types, and consider increasing the number of function parameters.
There are no restrictions on the input parameters of the function, and variable length input can be used. +  +3.Avoid multithreading in the network. \ No newline at end of file -- Gitee From 51b2e6a27cb8909cf76ed7217e75b5e17b52b467 Mon Sep 17 00:00:00 2001 From: fenglingcong Date: Tue, 1 Jun 2021 02:29:31 +0800 Subject: [PATCH 2/7] translation --- docs/migration_guide/source_en/script_analysis.md | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/docs/migration_guide/source_en/script_analysis.md b/docs/migration_guide/source_en/script_analysis.md index 3b28c11b72..2d6a184db8 100644 --- a/docs/migration_guide/source_en/script_analysis.md +++ b/docs/migration_guide/source_en/script_analysis.md @@ -22,11 +22,11 @@ If the corresponding ME operator cannot be found on this webpage, you can contin If the source code is a PyTorch script, you can directly query [the operator mapping between MindSpore and PyTorch](https://www.mindspore.cn/doc/note/zh-CN/master/index.html#operator_api) to find the corresponding MindSpore operator. Note that for operators with the same function, MindSpore may be named differently from other frameworks, and the parameters and functions of operators with the same name may also be different from other frameworks. The official description shall prevail. ##### Missing operator processing strategy -1.Consider replacing it with other operators: it is necessary to analyze the implementation formula of the operator, and examine whether the existing MindSpore operator can be superimposed to achieve the expected goal. +1. Consider replacing it with other operators: it is necessary to analyze the implementation formula of the operator, and examine whether the existing MindSpore operator can be superimposed to achieve the expected goal.   -2.Consider temporary circumvention solutions: For example, if a certain loss is not supported, it can be replaced with a loss operator of the same kind that has been supported. +2. 
Consider temporary circumvention solutions: For example, if a certain loss is not supported, it can be replaced with a loss operator of the same kind that has been supported.   -3.Submit suggestions in [the MindSpore community](https://gitee.com/mindspore/mindspore/issues) to develop missing operators. +3. Submit suggestions in [the MindSpore community](https://gitee.com/mindspore/mindspore/issues) to develop missing operators. ### Grammar assessment MindSpore provides two modes: GRAPH_MODE and PYNATIVE_MODE. @@ -40,8 +40,8 @@ Compared with the detailed syntax description, the common limitations can be sum - Do not use custom types when composing images. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types. - Do not deal with multi-threaded, multi-process data when composing a picture. ##### Common processing strategies -1.Use the operators provided inside MindSpore to replace the functions of other Python libraries. The processing of constants can be moved forward to the __ init __ stage. +1. Use the operators provided inside MindSpore to replace the functions of other Python libraries. The processing of constants can be moved forward to the __ init __ stage.   -2.Using basic types for combination, you can consider increasing the amount of function parameters. There are no restrictions on the input parameters of the function, and variable length input can be used. +2. Using basic types for combination, you can consider increasing the amount of function parameters. There are no restrictions on the input parameters of the function, and variable length input can be used.   -3.Avoid multithreading in the network. \ No newline at end of file +3. Avoid multithreading in the network. 
\ No newline at end of file -- Gitee From b1dc6693ce1ebdb3cee0202c7d01f32ab73f135f Mon Sep 17 00:00:00 2001 From: fenglingcong Date: Tue, 1 Jun 2021 02:42:33 +0800 Subject: [PATCH 3/7] translation --- .../source_en/script_analysis.md | 17 +++++++++-------- 1 file changed, 9 insertions(+), 8 deletions(-) diff --git a/docs/migration_guide/source_en/script_analysis.md b/docs/migration_guide/source_en/script_analysis.md index 2d6a184db8..f47dbabebe 100644 --- a/docs/migration_guide/source_en/script_analysis.md +++ b/docs/migration_guide/source_en/script_analysis.md @@ -1,6 +1,7 @@ -## Network Script Analysis -### Operator Evaluation -##### MindSpore operator design + +# Network Script Analysis +## Operator Evaluation +### MindSpore operator design Using the MindSpore framework to build a neural network is similar to other frameworks (TensorFlow/PyTorch), but the supported operators are different. Therefore, to find out the missing operators is necessary in the MindSpore framework during network migration (such as migrating from TensorFlow to the MindSpore Ascend platform). MindSpore API is composed of various Python/C++ API operators, which can be roughly divided into: @@ -15,31 +16,31 @@ Including image reading, data type conversion operators, such as mindspore.datas The surface layer of the network structure operator is the ME operator, that is, the operator API (such as mindspore.nn.Softmax) called by the user, and the ME operator is implemented by calling the TBE operator (C/C++) at the bottom layer.   When counting missing ME operators, you need to find out the corresponding operators of all operators in the source script (including data frame classes, data preprocessing, and network structure operators) in the MindSpore framework (for example, tf.nn.relu corresponds to MindSpore operators as mindspore.nn.ReLU). If there is no corresponding operator in MindSpore, it will be counted as missing. 
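The counting step described above can be sketched as a plain-Python lookup. The mapping entries below are illustrative examples, not an official table:

```python
# Illustrative (not official) mapping from source-framework operator names
# to their MindSpore counterparts.
OPERATOR_MAP = {
    "tf.nn.relu": "mindspore.nn.ReLU",
    "tf.nn.conv2d": "mindspore.nn.Conv2d",
    "tf.nn.softmax": "mindspore.nn.Softmax",
}

def count_missing(source_ops):
    """Split the operators used by a source script into mapped and missing."""
    mapped = {op: OPERATOR_MAP[op] for op in source_ops if op in OPERATOR_MAP}
    missing = [op for op in source_ops if op not in OPERATOR_MAP]
    return mapped, missing

mapped, missing = count_missing(["tf.nn.relu", "tf.some_unsupported_op"])
print(mapped)   # {'tf.nn.relu': 'mindspore.nn.ReLU'}
print(missing)  # ['tf.some_unsupported_op']
```

Every operator that ends up in `missing` is then handled with one of the processing strategies listed in this document.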
-##### Query operator mapping table +### Query operator mapping table Find the network structure and the Python file that implements the training function in the code library (the name is generally train.py model.py, etc.), and find all relevant operators in the script file (including data frame classes, data preprocessing, network structure operators, etc.) ), and compare with [the MindSpore operator API](https://www.mindspore.cn/doc/note/zh-CN/master/operator_list_ms.html) to find the platform support status of the operator under mindspore.nn or mindspore.ops. If the corresponding ME operator cannot be found on this webpage, you can continue to search for the operator name in [the MindSpore API list](https://www.mindspore.cn/doc/api_python/zh-CN/master/index.html). If the source code is a PyTorch script, you can directly query [the operator mapping between MindSpore and PyTorch](https://www.mindspore.cn/doc/note/zh-CN/master/index.html#operator_api) to find the corresponding MindSpore operator. Note that for operators with the same function, MindSpore may be named differently from other frameworks, and the parameters and functions of operators with the same name may also be different from other frameworks. The official description shall prevail. -##### Missing operator processing strategy +### Missing operator processing strategy 1. Consider replacing it with other operators: it is necessary to analyze the implementation formula of the operator, and examine whether the existing MindSpore operator can be superimposed to achieve the expected goal.   2. Consider temporary circumvention solutions: For example, if a certain loss is not supported, it can be replaced with a loss operator of the same kind that has been supported.   3. Submit suggestions in [the MindSpore community](https://gitee.com/mindspore/mindspore/issues) to develop missing operators. -### Grammar assessment +## Grammar assessment MindSpore provides two modes: GRAPH_MODE and PYNATIVE_MODE. 
The **inference** behavior of the model in PyNative mode is no different from general Python code. When using GRAPH_MODE or PYNATIVE_MODE for **training**, there are usually grammatical restrictions. In these two cases, it is necessary to perform graph compilation operations on the Python code. In this step, MindSpore has not yet been able to support the complete set of Python syntax, so there will be some restrictions on the writing of construct functions. For specific restrictions, please refer to [the MindSpore static graph grammar](https://www.mindspore.cn/doc/note/zh-CN/master/static_graph_syntax_support.html). -##### Common restriction principles +### Common restriction principles Compared with the detailed syntax description, the common limitations can be summarized as follows: - Don't call other Python libraries, such as numpy and scipy, when composing the picture. The related processing should be moved to the __ init__ stage. - Do not use custom types when composing images. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types. - Do not deal with multi-threaded, multi-process data when composing a picture. -##### Common processing strategies +#### Common processing strategies 1. Use the operators provided inside MindSpore to replace the functions of other Python libraries. The processing of constants can be moved forward to the __ init __ stage.   2. Using basic types for combination, you can consider increasing the amount of function parameters. There are no restrictions on the input parameters of the function, and variable length input can be used. 
-- Gitee From 06e11afe4269fd7330f1d0b414e275db4dcb02ee Mon Sep 17 00:00:00 2001 From: fenglingcong Date: Tue, 1 Jun 2021 02:54:49 +0800 Subject: [PATCH 4/7] translation --- .../source_en/script_analysis.md | 18 +++--------------- 1 file changed, 3 insertions(+), 15 deletions(-) diff --git a/docs/migration_guide/source_en/script_analysis.md b/docs/migration_guide/source_en/script_analysis.md index f47dbabebe..6528e80534 100644 --- a/docs/migration_guide/source_en/script_analysis.md +++ b/docs/migration_guide/source_en/script_analysis.md @@ -1,48 +1,36 @@ - # Network Script Analysis ## Operator Evaluation ### MindSpore operator design Using the MindSpore framework to build a neural network is similar to other frameworks (TensorFlow/PyTorch), but the supported operators are different. Therefore, to find out the missing operators is necessary in the MindSpore framework during network migration (such as migrating from TensorFlow to the MindSpore Ascend platform). - MindSpore API is composed of various Python/C++ API operators, which can be roughly divided into: - - Data framework operator Including tensors, basic data types, training gradients, optimizer operators, such as mindspore.int32, mindspore.nn.Cell, etc. - Data preprocessing operator Including image reading, data type conversion operators, such as mindspore.dataset.MnistDataset, etc. - Network structure operator Including convolution and normalization operators used in network construction, such as mindspore.nn.Conv2d, mindspore.nn.Dense, etc. -   The surface layer of the network structure operator is the ME operator, that is, the operator API (such as mindspore.nn.Softmax) called by the user, and the ME operator is implemented by calling the TBE operator (C/C++) at the bottom layer. 
-   When counting missing ME operators, you need to find out the corresponding operators of all operators in the source script (including data frame classes, data preprocessing, and network structure operators) in the MindSpore framework (for example, tf.nn.relu corresponds to MindSpore operators as mindspore.nn.ReLU). If there is no corresponding operator in MindSpore, it will be counted as missing. ### Query operator mapping table Find the network structure and the Python file that implements the training function in the code library (the name is generally train.py model.py, etc.), and find all relevant operators in the script file (including data frame classes, data preprocessing, network structure operators, etc.) ), and compare with [the MindSpore operator API](https://www.mindspore.cn/doc/note/zh-CN/master/operator_list_ms.html) to find the platform support status of the operator under mindspore.nn or mindspore.ops. - If the corresponding ME operator cannot be found on this webpage, you can continue to search for the operator name in [the MindSpore API list](https://www.mindspore.cn/doc/api_python/zh-CN/master/index.html). - If the source code is a PyTorch script, you can directly query [the operator mapping between MindSpore and PyTorch](https://www.mindspore.cn/doc/note/zh-CN/master/index.html#operator_api) to find the corresponding MindSpore operator. Note that for operators with the same function, MindSpore may be named differently from other frameworks, and the parameters and functions of operators with the same name may also be different from other frameworks. The official description shall prevail. ### Missing operator processing strategy 1. Consider replacing it with other operators: it is necessary to analyze the implementation formula of the operator, and examine whether the existing MindSpore operator can be superimposed to achieve the expected goal. -  + 2. 
Consider temporary circumvention solutions: For example, if a certain loss is not supported, it can be replaced with a loss operator of the same kind that has been supported. -  + 3. Submit suggestions in [the MindSpore community](https://gitee.com/mindspore/mindspore/issues) to develop missing operators. ## Grammar assessment MindSpore provides two modes: GRAPH_MODE and PYNATIVE_MODE. - The **inference** behavior of the model in PyNative mode is no different from general Python code. - When using GRAPH_MODE or PYNATIVE_MODE for **training**, there are usually grammatical restrictions. In these two cases, it is necessary to perform graph compilation operations on the Python code. In this step, MindSpore has not yet been able to support the complete set of Python syntax, so there will be some restrictions on the writing of construct functions. For specific restrictions, please refer to [the MindSpore static graph grammar](https://www.mindspore.cn/doc/note/zh-CN/master/static_graph_syntax_support.html). ### Common restriction principles Compared with the detailed syntax description, the common limitations can be summarized as follows: - - Don't call other Python libraries, such as numpy and scipy, when composing the picture. The related processing should be moved to the __ init__ stage. - Do not use custom types when composing images. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types. - Do not deal with multi-threaded, multi-process data when composing a picture. -#### Common processing strategies +### Common processing strategies 1. Use the operators provided inside MindSpore to replace the functions of other Python libraries. The processing of constants can be moved forward to the __ init __ stage. -  2. Using basic types for combination, you can consider increasing the amount of function parameters. 
There are no restrictions on the input parameters of the function, and variable length input can be used. -  3. Avoid multithreading in the network. \ No newline at end of file -- Gitee From a6f8111a5ddb46ca8c6b3a023a13f32dc077bd56 Mon Sep 17 00:00:00 2001 From: fenglingcong Date: Tue, 1 Jun 2021 03:06:29 +0800 Subject: [PATCH 5/7] translation --- docs/migration_guide/source_en/script_analysis.md | 8 ++++++-- 1 file changed, 6 insertions(+), 2 deletions(-) diff --git a/docs/migration_guide/source_en/script_analysis.md b/docs/migration_guide/source_en/script_analysis.md index 6528e80534..d06b108fae 100644 --- a/docs/migration_guide/source_en/script_analysis.md +++ b/docs/migration_guide/source_en/script_analysis.md @@ -1,3 +1,4 @@ + # Network Script Analysis ## Operator Evaluation ### MindSpore operator design @@ -13,20 +14,23 @@ Including image reading, data type conversion operators, such as mindspore.datas When counting missing ME operators, you need to find out the corresponding operators of all operators in the source script (including data frame classes, data preprocessing, and network structure operators) in the MindSpore framework (for example, tf.nn.relu corresponds to MindSpore operators as mindspore.nn.ReLU). If there is no corresponding operator in MindSpore, it will be counted as missing. ### Query operator mapping table Find the network structure and the Python file that implements the training function in the code library (the name is generally train.py model.py, etc.), and find all relevant operators in the script file (including data frame classes, data preprocessing, network structure operators, etc.) ), and compare with [the MindSpore operator API](https://www.mindspore.cn/doc/note/zh-CN/master/operator_list_ms.html) to find the platform support status of the operator under mindspore.nn or mindspore.ops. 
+ If the corresponding ME operator cannot be found on this webpage, you can continue to search for the operator name in [the MindSpore API list](https://www.mindspore.cn/doc/api_python/zh-CN/master/index.html). + If the source code is a PyTorch script, you can directly query [the operator mapping between MindSpore and PyTorch](https://www.mindspore.cn/doc/note/zh-CN/master/index.html#operator_api) to find the corresponding MindSpore operator. Note that for operators with the same function, MindSpore may be named differently from other frameworks, and the parameters and functions of operators with the same name may also be different from other frameworks. The official description shall prevail. ### Missing operator processing strategy 1. Consider replacing it with other operators: it is necessary to analyze the implementation formula of the operator, and examine whether the existing MindSpore operator can be superimposed to achieve the expected goal. - 2. Consider temporary circumvention solutions: For example, if a certain loss is not supported, it can be replaced with a loss operator of the same kind that has been supported. - 3. Submit suggestions in [the MindSpore community](https://gitee.com/mindspore/mindspore/issues) to develop missing operators. ## Grammar assessment MindSpore provides two modes: GRAPH_MODE and PYNATIVE_MODE. + The **inference** behavior of the model in PyNative mode is no different from general Python code. + When using GRAPH_MODE or PYNATIVE_MODE for **training**, there are usually grammatical restrictions. In these two cases, it is necessary to perform graph compilation operations on the Python code. In this step, MindSpore has not yet been able to support the complete set of Python syntax, so there will be some restrictions on the writing of construct functions. For specific restrictions, please refer to [the MindSpore static graph grammar](https://www.mindspore.cn/doc/note/zh-CN/master/static_graph_syntax_support.html). 
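The most common consequence of these graph-compilation restrictions is moving Python-library work out of the compiled method. The sketch below uses a plain Python class standing in for a mindspore.nn.Cell (so it runs without MindSpore installed); the class name and parameters are hypothetical:

```python
import numpy as np

# Sketch of the recommended pattern: heavy numpy/scipy work runs once in
# __init__, while the graph-compiled construct() contains only tensor
# arithmetic that maps directly onto framework operators.
class Normalize:
    def __init__(self, mean, std):
        # Python-library processing happens here, outside graph compilation.
        self.mean = np.asarray(mean, dtype=np.float32)
        self.std = np.asarray(std, dtype=np.float32)

    def construct(self, x):
        # Only operator-style arithmetic here: no numpy/scipy calls,
        # no custom types, no threads.
        return (x - self.mean) / self.std

net = Normalize(mean=[0.5], std=[0.25])
print(net.construct(np.array([1.0], dtype=np.float32)))  # [2.]
```

In a real MindSpore network the class would subclass mindspore.nn.Cell and `construct` would be the graph-compiled entry point, but the division of labor is the same.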
### Common restriction principles Compared with the detailed syntax description, the common limitations can be summarized as follows: + - Don't call other Python libraries, such as numpy and scipy, when composing the picture. The related processing should be moved to the __ init__ stage. - Do not use custom types when composing images. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types. - Do not deal with multi-threaded, multi-process data when composing a picture. -- Gitee From 0a1bb5b0b1bc88a7b237d01fbf45a99e4dba827f Mon Sep 17 00:00:00 2001 From: fenglingcong Date: Tue, 1 Jun 2021 03:11:33 +0800 Subject: [PATCH 6/7] translation --- docs/migration_guide/source_en/script_analysis.md | 6 ------ 1 file changed, 6 deletions(-) diff --git a/docs/migration_guide/source_en/script_analysis.md b/docs/migration_guide/source_en/script_analysis.md index d06b108fae..148facf4bf 100644 --- a/docs/migration_guide/source_en/script_analysis.md +++ b/docs/migration_guide/source_en/script_analysis.md @@ -1,4 +1,3 @@ - # Network Script Analysis ## Operator Evaluation ### MindSpore operator design @@ -14,9 +13,7 @@ Including image reading, data type conversion operators, such as mindspore.datas When counting missing ME operators, you need to find out the corresponding operators of all operators in the source script (including data frame classes, data preprocessing, and network structure operators) in the MindSpore framework (for example, tf.nn.relu corresponds to MindSpore operators as mindspore.nn.ReLU). If there is no corresponding operator in MindSpore, it will be counted as missing. ### Query operator mapping table Find the network structure and the Python file that implements the training function in the code library (the name is generally train.py model.py, etc.), and find all relevant operators in the script file (including data frame classes, data preprocessing, network structure operators, etc.) 
, and compare with [the MindSpore operator API](https://www.mindspore.cn/doc/note/zh-CN/master/operator_list_ms.html) to find the platform support status of the operator under mindspore.nn or mindspore.ops. - If the corresponding ME operator cannot be found on this webpage, you can continue to search for the operator name in [the MindSpore API list](https://www.mindspore.cn/doc/api_python/zh-CN/master/index.html). - If the source code is a PyTorch script, you can directly query [the operator mapping between MindSpore and PyTorch](https://www.mindspore.cn/doc/note/zh-CN/master/index.html#operator_api) to find the corresponding MindSpore operator. Note that MindSpore may name an operator differently from other frameworks even when its function is the same, and an operator with the same name may also differ from its counterparts in parameters and behavior; the official documentation shall prevail. ### Missing operator processing strategy 1. Consider replacing it with other operators: analyze the formula that the operator implements, and examine whether existing MindSpore operators can be combined to achieve the expected result. @@ -24,13 +21,10 @@ If the source code is a PyTorch script, you can directly query [the operator map 3. Submit suggestions in [the MindSpore community](https://gitee.com/mindspore/mindspore/issues) to develop the missing operators. ## Grammar assessment MindSpore provides two modes: GRAPH_MODE and PYNATIVE_MODE. - The **inference** behavior of a model in PyNative mode is no different from that of general Python code. - When using GRAPH_MODE, or PYNATIVE_MODE for **training**, there are usually syntax restrictions. In these two cases, the Python code needs to be compiled into a graph. In this step, MindSpore does not yet support the complete set of Python syntax, so there are some restrictions on writing construct functions.
For specific restrictions, please refer to [the MindSpore static graph grammar](https://www.mindspore.cn/doc/note/zh-CN/master/static_graph_syntax_support.html). ### Common restriction principles Compared with the detailed syntax description, the common limitations can be summarized as follows: - - Don't call other Python libraries, such as numpy and scipy, when constructing the graph. The related processing should be moved to the __ init__ stage. - Do not use custom types when constructing the graph. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types. - Do not deal with multi-threaded or multi-process data when constructing the graph. -- Gitee From f25447cfe85a50da8042fc500b02b513180aaac2 Mon Sep 17 00:00:00 2001 From: fenglingcong Date: Tue, 1 Jun 2021 03:24:48 +0800 Subject: [PATCH 7/7] translation --- .../source_en/script_analysis.md | 31 ++++++++++++++----- 1 file changed, 23 insertions(+), 8 deletions(-) diff --git a/docs/migration_guide/source_en/script_analysis.md b/docs/migration_guide/source_en/script_analysis.md index 148facf4bf..3fac24d4c9 100644 --- a/docs/migration_guide/source_en/script_analysis.md +++ b/docs/migration_guide/source_en/script_analysis.md @@ -1,34 +1,49 @@ # Network Script Analysis + ## Operator Evaluation + ### MindSpore operator design + Using the MindSpore framework to build a neural network is similar to other frameworks (TensorFlow/PyTorch), but the supported operators are different. Therefore, it is necessary to find out which operators are missing from the MindSpore framework during network migration (such as migrating from TensorFlow to the MindSpore Ascend platform). The MindSpore API is composed of various Python/C++ API operators, which can be roughly divided into: - - Data framework operator + +- Data framework operator Including tensors, basic data types, training gradients, and optimizer operators, such as mindspore.int32, mindspore.nn.Cell, etc.
- - Data preprocessing operator +- Data preprocessing operator Including image reading and data type conversion operators, such as mindspore.dataset.MnistDataset, etc. - - Network structure operator +- Network structure operator Including convolution and normalization operators used in network construction, such as mindspore.nn.Conv2d, mindspore.nn.Dense, etc. The surface layer of the network structure operators is the ME operator, that is, the operator API called by the user (such as mindspore.nn.Softmax); the ME operator is implemented by calling TBE operators (C/C++) at the bottom layer. When counting missing ME operators, you need to find the counterparts of all operators in the source script (including data framework operators, data preprocessing operators, and network structure operators) in the MindSpore framework (for example, tf.nn.relu corresponds to the MindSpore operator mindspore.nn.ReLU). If there is no corresponding operator in MindSpore, it is counted as missing. + ### Query operator mapping table + Find the network structure and the Python file that implements the training function in the code library (generally named train.py, model.py, etc.), and find all relevant operators in the script file (including data framework operators, data preprocessing operators, network structure operators, etc.), and compare with [the MindSpore operator API](https://www.mindspore.cn/doc/note/zh-CN/master/operator_list_ms.html) to find the platform support status of the operator under mindspore.nn or mindspore.ops. If the corresponding ME operator cannot be found on this webpage, you can continue to search for the operator name in [the MindSpore API list](https://www.mindspore.cn/doc/api_python/zh-CN/master/index.html). If the source code is a PyTorch script, you can directly query [the operator mapping between MindSpore and PyTorch](https://www.mindspore.cn/doc/note/zh-CN/master/index.html#operator_api) to find the corresponding MindSpore operator.
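To make the lookup above concrete, here is a rough sketch of counting missing operators with a hand-made mapping table. The table entries and the script operator names are illustrative assumptions only; the authoritative source is the operator mapping pages linked above.

```python
# Illustrative (not official) mapping from source-framework operators
# to their MindSpore counterparts, as collected from the mapping pages.
OPERATOR_MAPPING = {
    "tf.nn.relu": "mindspore.nn.ReLU",
    "tf.nn.conv2d": "mindspore.nn.Conv2d",
    "tf.nn.softmax": "mindspore.nn.Softmax",
}

def find_missing(source_ops):
    """Return the source operators with no recorded MindSpore counterpart."""
    return [op for op in source_ops if op not in OPERATOR_MAPPING]

# Operators gathered from a hypothetical train.py / model.py:
script_ops = ["tf.nn.relu", "tf.nn.conv2d", "tf.custom_fancy_op"]
print(find_missing(script_ops))  # ['tf.custom_fancy_op']
```

Each operator reported missing then goes through the processing strategies described below.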
Note that MindSpore may name an operator differently from other frameworks even when its function is the same, and an operator with the same name may also differ from its counterparts in parameters and behavior; the official documentation shall prevail. + ### Missing operator processing strategy + 1. Consider replacing it with other operators: analyze the formula that the operator implements, and examine whether existing MindSpore operators can be combined to achieve the expected result. 2. Consider a temporary workaround: for example, if a certain loss is not supported, it can be replaced with a supported loss operator of the same kind. 3. Submit suggestions in [the MindSpore community](https://gitee.com/mindspore/mindspore/issues) to develop the missing operators. + ## Grammar assessment + MindSpore provides two modes: GRAPH_MODE and PYNATIVE_MODE. The **inference** behavior of a model in PyNative mode is no different from that of general Python code. When using GRAPH_MODE, or PYNATIVE_MODE for **training**, there are usually syntax restrictions. In these two cases, the Python code needs to be compiled into a graph. In this step, MindSpore does not yet support the complete set of Python syntax, so there are some restrictions on writing construct functions. For specific restrictions, please refer to [the MindSpore static graph grammar](https://www.mindspore.cn/doc/note/zh-CN/master/static_graph_syntax_support.html). + ### Common restriction principles + Compared with the detailed syntax description, the common limitations can be summarized as follows: - - Don't call other Python libraries, such as numpy and scipy, when constructing the graph. The related processing should be moved to the __ init__ stage. - - Do not use custom types when constructing the graph. Instead, use the data types and Python basic types provided by MindSpore.
You can use tuple/list combinations based on these types. - - Do not deal with multi-threaded or multi-process data when constructing the graph. + +- Do not call other Python libraries, such as numpy and scipy, when constructing the graph. The related processing should be moved to the __init__ stage. +- Do not use custom types when constructing the graph. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types. +- Do not deal with multi-threaded or multi-process data when constructing the graph. + ### Common processing strategies -1. Use the operators provided inside MindSpore to replace the functions of other Python libraries. The processing of constants can be moved forward to the __ init __ stage. + +1. Use the operators provided inside MindSpore to replace the functions of other Python libraries. The processing of constants can be moved forward to the __init__ stage. 2. When combining basic types, you can consider increasing the number of function parameters. There is no restriction on the number of function input parameters, and variable-length inputs can be used. -3. Avoid multithreading in the network. \ No newline at end of file +3. Avoid multithreading in the network. -- Gitee