diff --git a/docs/migration_guide/source_en/script_analysis.md b/docs/migration_guide/source_en/script_analysis.md
index 18a19e45f960946373a154d2745006307dc2817a..050bfe525b4e17bf39d3653047a70bef132e02af 100644
--- a/docs/migration_guide/source_en/script_analysis.md
+++ b/docs/migration_guide/source_en/script_analysis.md
@@ -1,76 +1,51 @@
-# Network Script Analysis
-
-
-
-- [Network Script Analysis](#network-script_analysis)
-    - [Operator Evaluation](#operator-evaluation)
-        - [MindSpore Operator Design](#mindspore-operator-design)
-        - [Query Operator Mapping Table](#query-operator-mapping-table)
-        - [Missing Operator Processing Strategy](#missing-operator-processing-strategy)
-    - [Grammar Assessment](#grammar-assessment)
-        - [Common Restriction Principles](#common-restriction-principles)
-        - [Common Processing Strategies](#common-processing-strategies)
-
-
-
+# Network Script Analysis

 ## Operator Evaluation

-### MindSpore Operator Design
-
-The process of using the MindSpore framework to build a neural network is similar to other frameworks (e.g., TensorFlow/PyTorch), but the supported operators are different. It is necessary to find out the missing operators in the MindSpore framework when performing network migration (e.g., migrating from TensorFlow to the MindSpore-ascend platform).
+### MindSpore Operator Design
+
+Building a neural network with the MindSpore framework is similar to doing so with other frameworks (e.g., TensorFlow/PyTorch), but the sets of supported operators differ. During network migration (e.g., migrating from TensorFlow to the MindSpore Ascend platform), you therefore need to find out which operators are missing in the MindSpore framework.

 MindSpore API is composed of various Python/C++ API operators, which can be roughly divided into:

 - Data framework operator

-  Including tensors, basic data types, training gradients, optimizer operators, such as `mindspore.int32`, `mindspore.nn.Cell`, etc.
-
+  Including tensors, basic data types, training gradients, and optimizer operators, such as `mindspore.int32`, `mindspore.nn.Cell`, etc.

 - Data preprocessing operator

-  Including image reading, data type conversion operators, such as `mindspore.dataset.MnistDataset`, etc.
-
+  Including image reading and data type conversion operators, such as `mindspore.dataset.MnistDataset`, etc.

 - Network structure operator

-  Including convolution and normalization operators used in network construction, such as `mindspore.nn.Conv2d`, `mindspore.nn.Dense`, etc.
-
-The surface layer of the network structure operator is the MindSpore operator (hereinafter referred as ME operator), which is the operator API called by the user (e.g., `mindspore.nn.Softmax`), and the ME operator is implemented by calling the TBE operator (C/C++) at the bottom layer.
-
-When counting missing ME operators, you need to find out the corresponding operators of all operators in the source script (including data framework classes, data preprocessing, and network structure operators) in the MindSpore framework (e.g.,`tf.nn.relu` corresponds to MindSpore operator `mindspore.nn.ReLU`). If there is no corresponding operator in MindSpore, it will be counted as missing.
+  Including convolution and normalization operators used in network construction, such as `mindspore.nn.Conv2d`, `mindspore.nn.Dense`, etc.
+
+  The surface layer of the network structure operator is the ME operator, that is, the operator API called by the user (such as `mindspore.nn.Softmax`); the ME operator is implemented by calling the TBE operator (C/C++) at the bottom layer (see the short sketch below).
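+
+As a rough illustration of the network structure category (a minimal sketch made up for this guide, not a prescribed implementation), ME-level operators such as `mindspore.nn.Conv2d` and `mindspore.nn.Dense` are composed inside a `Cell`:
+
+```python
+import numpy as np
+import mindspore
+import mindspore.nn as nn
+from mindspore import Tensor
+
+class SimpleNet(nn.Cell):
+    """A toy network built from ME-level network structure operators
+    (the user-facing API layer described above)."""
+    def __init__(self):
+        super().__init__()
+        self.conv = nn.Conv2d(1, 6, 5)        # convolution operator
+        self.relu = nn.ReLU()
+        self.flatten = nn.Flatten()
+        self.fc = nn.Dense(6 * 32 * 32, 10)   # fully connected operator
+        self.softmax = nn.Softmax()
+
+    def construct(self, x):
+        x = self.relu(self.conv(x))
+        return self.softmax(self.fc(self.flatten(x)))
+
+net = SimpleNet()
+x = Tensor(np.ones((1, 1, 32, 32)), mindspore.float32)  # mindspore.float32: data framework operator
+print(net(x).shape)  # (1, 10)
+```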
+
+When counting missing ME operators, you need to find, for every operator used in the source script (including data framework classes, data preprocessing, and network structure operators), the corresponding operator in the MindSpore framework (for example, `tf.nn.relu` corresponds to the MindSpore operator `mindspore.nn.ReLU`). If there is no corresponding operator in MindSpore, it is counted as missing.

-### Query Operator Mapping Table
+### Query Operator Mapping Table

-Find the network structure and the Python file that implements the training function in the code library (the name is generally train.py model.py, etc.), and find all relevant operators in the script file (including data framework classes, data preprocessing, network structure operators, etc.), and compare with [MindSpore Operator API](https://www.mindspore.cn/doc/note/en/master/operator_list_ms.html) , to find the platform support status of the operator under `mindspore.nn` or `mindspore.ops`.
-
-If the corresponding ME operator cannot be found on this webpage, you can continue to search for the operator name in [MindSpore API List](https://www.mindspore.cn/doc/api_python/en/master/index.html).
-
-If the source code is a PyTorch script, you can directly query [MindSpore and PyTorch operator mapping](https://www.mindspore.cn/doc/note/en/master/index.html#operator_api) to find the corresponding MindSpore operator. Note that for operators with the same function, MindSpore may define a name for this operator differing from other frameworks, and the parameters and functions of operators with the same name may also be different from other frameworks. Please refer to the official description for checking the names.
+Find the network structure and the Python file that implements the training function in the code library (the names are generally train.py, model.py, etc.), locate all relevant operators in the script file (including data framework classes, data preprocessing, and network structure operators), and compare them with the [MindSpore Operator API](https://www.mindspore.cn/doc/note/en/master/operator_list_ms.html) to find the platform support status of each operator under `mindspore.nn` or `mindspore.ops`.
+
+If the corresponding ME operator cannot be found on this webpage, you can continue to search for the operator name in the [MindSpore API List](https://www.mindspore.cn/doc/api_python/en/master/index.html).
+
+If the source code is a PyTorch script, you can directly query the [MindSpore and PyTorch operator mapping](https://www.mindspore.cn/doc/note/en/master/index.html#operator_api) to find the corresponding MindSpore operator. Note that MindSpore may name an operator with the same function differently from other frameworks, and the parameters and behavior of operators with the same name may also differ; the official description shall prevail.
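+
+As a quick sanity check of one such mapping (a minimal sketch, assuming MindSpore is installed), `tf.nn.relu` in a TensorFlow script corresponds to `mindspore.nn.ReLU`:
+
+```python
+import numpy as np
+import mindspore
+from mindspore import Tensor, nn
+
+# Layer-style ME operator that plays the role of tf.nn.relu in the mapping table.
+relu = nn.ReLU()
+x = Tensor(np.array([-1.0, 0.0, 2.0]), mindspore.float32)
+print(relu(x))  # [0. 0. 2.]
+```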

-### Missing Operator Processing Strategy
+### Missing Operator Processing Strategy

-1. Consider replacing it with other operators: It is necessary to analyze the implementation formula of the operator and examine whether the existing MindSpore operator can be superimposed to achieve the expected goal.
+1. Consider replacing the missing operator with other operators: analyze the formula it implements and check whether existing MindSpore operators can be combined to achieve the same result.
 2. Consider temporary circumvention solutions: For example, if a certain loss is not supported, it can be replaced with a loss operator of the same kind that has been supported.
-3. Submit suggestions in [MindSpore Community](https://gitee.com/mindspore/mindspore/issues) to develop missing operators.
+3. Submit suggestions in the [MindSpore Community](https://gitee.com/mindspore/mindspore/issues) to develop missing operators.

-## Grammar Assessment
-
-MindSpore provides two modes: `GRAPH_MODE` and `PYNATIVE_MODE`.
-
-In PyNative mode, the behavior of the model for **Evaluation** is same as that of in the general Python code.
-
-When using `GRAPH_MODE`, or when using `PYNATIVE_MODE` for **Training**, there are usually grammatical restrictions. In these two cases, it is necessary to perform graph compilation operations on the Python code. In this step, MindSpore has not yet been able to support the complete set of Python syntax, so there will be some restrictions on the implementation of the `construct` function. For specific restrictions, please refer to [MindSpore static graph syntax support](https://www.mindspore.cn/doc/note/en/master/static_graph_syntax_support.html).
+## Grammar Assessment
+
+MindSpore provides two modes: `GRAPH_MODE` and `PYNATIVE_MODE`.
+
+In PyNative mode, the **inference** behavior of the model is no different from that of general Python code.
+
+When using `GRAPH_MODE`, or when using `PYNATIVE_MODE` for **training**, there are usually grammatical restrictions. In these two cases, graph compilation is performed on the Python code. At this step MindSpore does not yet support the complete set of Python syntax, so there are some restrictions on how the `construct` function can be written. For specific restrictions, please refer to [MindSpore static graph syntax support](https://www.mindspore.cn/doc/note/en/master/static_graph_syntax_support.html).

-### Common Restriction Principles
+### Common Restriction Principles

-Compared with the specific syntax description, the common restrictions can be summarized as follows:
+Compared with the detailed syntax description, the common restrictions can be summarized as follows:

-- Do not call other Python module , such as numpy and scipy, when building the graph. The related processing should be moved to the `__init__` stage.
-- Do not use custom types when building the graph. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types.
-- Do not processing multi-threaded, multi-process data when building the graph.
+- Do not call other Python libraries, such as NumPy and SciPy, when building the graph. The related processing should be moved to the `__init__` stage (see the sketch after this list).
+- Do not use custom types when building the graph. Instead, use the data types and Python basic types provided by MindSpore. You can use tuple/list combinations based on these types.
+- Do not process multi-threaded or multi-process data when building the graph.
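+
+A minimal sketch of the first principle (the class name and shapes are made up for illustration): the NumPy computation happens once in `__init__`, and `construct` only uses MindSpore operators and basic Python types.
+
+```python
+import numpy as np
+import mindspore
+from mindspore import Tensor, nn
+
+class NormalizedDense(nn.Cell):
+    def __init__(self):
+        super().__init__()
+        # NumPy is used only here, outside graph compilation:
+        # the constant is precomputed and stored as a Tensor attribute.
+        self.scale = Tensor(1.0 / np.sqrt(16), mindspore.float32)
+        self.fc = nn.Dense(16, 4)
+
+    def construct(self, x):
+        # Only MindSpore operators appear in the compiled graph.
+        return self.fc(x * self.scale)
+
+net = NormalizedDense()
+out = net(Tensor(np.ones((2, 16)), mindspore.float32))
+print(out.shape)  # (2, 4)
+```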

-### Common Processing Strategies
+### Common Processing Strategies

-1. Use the operators provided by MindSpore to replace the functions of other Python libraries. The processing of constants can be moved to the `__init__` stage.
-2. Use basic types for combination, you can consider increasing the amount of function parameters. There are no restrictions on the input parameters of the function, and variable length input can be used.
+1. Use the operators provided by MindSpore to replace the functions of other Python libraries. The processing of constants can be moved forward to the `__init__` stage.
+2. Combine basic types, and consider increasing the number of function parameters. There are no restrictions on a function's input parameters, and variable-length input can be used (as shown in the sketch below).
 3. Avoid multithreading in the network.
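+
+As an illustration of the second strategy (a minimal sketch with made-up names, not taken from the original guide), `construct` can accept a variable number of Tensor inputs, and the same cell runs under `GRAPH_MODE`:
+
+```python
+import numpy as np
+import mindspore
+from mindspore import Tensor, context, nn
+
+context.set_context(mode=context.GRAPH_MODE)
+
+class SumAll(nn.Cell):
+    """construct takes a variable-length list of Tensor inputs."""
+    def construct(self, *inputs):
+        total = inputs[0]
+        for t in inputs[1:]:
+            total = total + t
+        return total
+
+net = SumAll()
+a = Tensor(np.ones((2, 2)), mindspore.float32)
+b = Tensor(np.full((2, 2), 2.0), mindspore.float32)
+c = Tensor(np.full((2, 2), 3.0), mindspore.float32)
+print(net(a, b, c))  # every element equals 6.0
+```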