diff --git a/docs/api_cpp/source_en/api.md b/docs/api_cpp/source_en/api.md
new file mode 100644
index 0000000000000000000000000000000000000000..e0c9aead9a5e5a1a083f5a37611c3436fd7ee6f0
--- /dev/null
+++ b/docs/api_cpp/source_en/api.md
@@ -0,0 +1,390 @@
+# mindspore::api
+
+## Context
+
+\#include <[context.h](https://gitee.com/mindspore/mindspore/blob/master/include/api/context.h)>
+
+The Context class is used to store environment variables during execution.
+
+### Static Public Member Function
+
+#### Instance
+
+```cpp
+static Context &Instance();
+```
+
+Obtains the MindSpore Context instance object.
+
+### Public Member Functions
+
+#### GetDeviceTarget
+
+```cpp
+const std::string &GetDeviceTarget() const;
+```
+
+Obtains the target device type.
+
+- Returns
+
+  Current DeviceTarget type.
+
+#### GetDeviceID
+
+```cpp
+uint32_t GetDeviceID() const;
+```
+
+Obtains the device ID.
+
+- Returns
+
+  Current device ID.
+
+#### SetDeviceTarget
+
+```cpp
+Context &SetDeviceTarget(const std::string &device_target);
+```
+
+Configures the target device.
+
+- Parameters
+
+    - `device_target`: target device to be configured. The options are `kDeviceTypeAscend310` and `kDeviceTypeAscend910`.
+
+- Returns
+
+  MindSpore Context instance object.
+
+#### SetDeviceID
+
+```cpp
+Context &SetDeviceID(uint32_t device_id);
+```
+
+Sets the device ID.
+
+- Parameters
+
+    - `device_id`: device ID to be configured.
+
+- Returns
+
+  MindSpore Context instance object.
+
+## Serialization
+
+\#include <[serialization.h](https://gitee.com/mindspore/mindspore/blob/master/include/api/serialization.h)>
+
+The Serialization class is used to summarize methods for reading and writing model files.
+
+### Static Public Member Function
+
+#### LoadModel
+
+- Parameters
+
+    - `file`: model file path.
+    - `model_type`: model file type. The options are `ModelType::kMindIR` and `ModelType::kOM`.
+
+- Returns
+
+  Object for storing graph data.
+
+## Model
+
+\#include <[model.h](https://gitee.com/mindspore/mindspore/blob/master/include/api/model.h)>
+
+The Model class is used to define a MindSpore model, facilitating computational graph management.
+
+### Constructor and Destructor
+
+```cpp
+Model(const GraphCell &graph);
+~Model();
+```
+
+`GraphCell` is a derivative of `Cell`. `Cell` is not open for use currently. `GraphCell` can be constructed from `Graph`, for example, `Model model(GraphCell(graph))`.
+
+### Public Member Functions
+
+#### Build
+
+```cpp
+Status Build(const std::map<std::string, std::string> &options);
+```
+
+Builds a model so that it can run on a device.
+
+- Parameters
+
+    - `options`: model build options. In the following table, Key indicates the option name, and Value indicates the corresponding option.
+
+| Key | Value |
+| --- | --- |
+| kModelOptionInsertOpCfgPath | [AIPP](https://support.huaweicloud.com/intl/en-us/adevg-ms-atlas200dkappc32/atlasadm_01_0023.html) configuration file path. |
+| kModelOptionInputFormat | Manually specifies the model input format. The options are `"NCHW"` and `"NHWC"`. |
+| kModelOptionInputShape | Manually specifies the model input shape, for example, `"input_op_name1: n1,c2,h3,w4;input_op_name2: n4,c3,h2,w1"` |
+| kModelOptionOutputType | Manually specifies the model output type, for example, `"FP16"` or `"UINT8"`. The default value is `"FP32"`. |
+| kModelOptionPrecisionMode | Model precision mode. The options are `"force_fp16"`, `"allow_fp32_to_fp16"`, `"must_keep_origin_dtype"`, and `"allow_mix_precision"`. The default value is `"force_fp16"`. |
+| kModelOptionOpSelectImplMode | Operator selection mode. The options are `"high_performance"` and `"high_precision"`. The default value is `"high_performance"`. |
+
+- Returns
+
+  Status code.
+
+#### Predict
+
+```cpp
+Status Predict(const std::vector<Buffer> &inputs, std::vector<Buffer> *outputs);
+```
+
+Runs model inference.
+
+- Parameters
+
+    - `inputs`: a `vector` where model inputs are arranged in sequence.
+    - `outputs`: output parameter, which is the pointer to a `vector`. The model outputs are filled in the container in sequence.
+
+- Returns
+
+  Status code.
+
+#### GetInputsInfo
+
+```cpp
+Status GetInputsInfo(std::vector<std::string> *names, std::vector<std::vector<int64_t>> *shapes, std::vector<DataType> *data_types, std::vector<size_t> *mem_sizes) const;
+```
+
+Obtains the model input information.
+
+- Parameters
+
+    - `names`: optional output parameter, which is the pointer to a `vector` where model inputs are arranged in sequence. The input names are filled in the container in sequence. If `nullptr` is input, the attribute is not obtained.
+    - `shapes`: optional output parameter, which is the pointer to a `vector` where model inputs are arranged in sequence. The input shapes are filled in the container in sequence. If `nullptr` is input, the attribute is not obtained.
+    - `data_types`: optional output parameter, which is the pointer to a `vector` where model inputs are arranged in sequence. The input data types are filled in the container in sequence. If `nullptr` is input, the attribute is not obtained.
+    - `mem_sizes`: optional output parameter, which is the pointer to a `vector` where model inputs are arranged in sequence. The input memory lengths (in bytes) are filled in the container in sequence. If `nullptr` is input, the attribute is not obtained.
+
+- Returns
+
+  Status code.
+
+#### GetOutputsInfo
+
+```cpp
+Status GetOutputsInfo(std::vector<std::string> *names, std::vector<std::vector<int64_t>> *shapes, std::vector<DataType> *data_types, std::vector<size_t> *mem_sizes) const;
+```
+
+Obtains the model output information.
+
+- Parameters
+
+    - `names`: optional output parameter, which is the pointer to a `vector` where model outputs are arranged in sequence. The output names are filled in the container in sequence. If `nullptr` is input, the attribute is not obtained.
+    - `shapes`: optional output parameter, which is the pointer to a `vector` where model outputs are arranged in sequence. The output shapes are filled in the container in sequence. If `nullptr` is input, the attribute is not obtained.
+    - `data_types`: optional output parameter, which is the pointer to a `vector` where model outputs are arranged in sequence. The output data types are filled in the container in sequence. If `nullptr` is input, the attribute is not obtained.
+    - `mem_sizes`: optional output parameter, which is the pointer to a `vector` where model outputs are arranged in sequence. The output memory lengths (in bytes) are filled in the container in sequence. If `nullptr` is input, the attribute is not obtained.
+
+- Returns
+
+  Status code.
+
+## Tensor
+
+\#include <[types.h](https://gitee.com/mindspore/mindspore/blob/master/include/api/types.h)>
+
+### Constructor and Destructor
+
+```cpp
+Tensor();
+Tensor(const std::string &name, DataType type, const std::vector<int64_t> &shape, const void *data, size_t data_len);
+~Tensor();
+```
+
+### Static Public Member Function
+
+#### GetTypeSize
+
+```cpp
+static int GetTypeSize(api::DataType type);
+```
+
+Obtains the memory length of a data type, in bytes.
+
+- Parameters
+
+    - `type`: data type.
+
+- Returns
+
+  Memory length, in bytes.
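+The sketch below is a hand-written illustration rather than generated reference content: it uses `GetTypeSize` to compute the byte length of a host buffer before wrapping it in a `Tensor`. The float32 enumerator name (`DataType::kMsFloat32`) and the header path are assumptions and may need to be adapted to the installed headers.
+
+```cpp
+#include <vector>
+#include "include/api/types.h"  // assumed header path for Tensor and DataType
+
+using namespace mindspore::api;
+
+// Wraps a float vector in a 1 x N input tensor.
+Tensor MakeInputTensor(const std::vector<float> &values) {
+  std::vector<int64_t> shape = {1, static_cast<int64_t>(values.size())};
+  // Memory length = element count * per-element size reported by GetTypeSize.
+  size_t data_len = values.size() * Tensor::GetTypeSize(DataType::kMsFloat32);  // enum name is an assumption
+  // Constructor arguments: tensor name, data type, shape, data pointer, data length.
+  return Tensor("input0", DataType::kMsFloat32, shape, values.data(), data_len);
+}
+```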
+
+### Public Member Functions
+
+#### Name
+
+```cpp
+const std::string &Name() const;
+```
+
+Obtains the name of a tensor.
+
+- Returns
+
+  Tensor name.
+
+#### DataType
+
+```cpp
+api::DataType DataType() const;
+```
+
+Obtains the data type of a tensor.
+
+- Returns
+
+  Tensor data type.
+
+#### Shape
+
+```cpp
+const std::vector<int64_t> &Shape() const;
+```
+
+Obtains the shape of a tensor.
+
+- Returns
+
+  Tensor shape.
+
+#### SetName
+
+```cpp
+void SetName(const std::string &name);
+```
+
+Sets the name of a tensor.
+
+- Parameters
+
+    - `name`: name to be set.
+
+#### SetDataType
+
+```cpp
+void SetDataType(api::DataType type);
+```
+
+Sets the data type of a tensor.
+
+- Parameters
+
+    - `type`: type to be set.
+
+#### SetShape
+
+```cpp
+void SetShape(const std::vector<int64_t> &shape);
+```
+
+Sets the shape of a tensor.
+
+- Parameters
+
+    - `shape`: shape to be set.
+
+#### Data
+
+```cpp
+const void *Data() const;
+```
+
+Obtains the constant pointer to the tensor data.
+
+- Returns
+
+  Constant pointer to the tensor data.
+
+#### MutableData
+
+```cpp
+void *MutableData();
+```
+
+Obtains the pointer to the tensor data.
+
+- Returns
+
+  Pointer to the tensor data.
+
+#### DataSize
+
+```cpp
+size_t DataSize() const;
+```
+
+Obtains the memory length (in bytes) of the tensor data.
+
+- Returns
+
+  Memory length of the tensor data, in bytes.
+
+#### ResizeData
+
+```cpp
+bool ResizeData(size_t data_len);
+```
+
+Adjusts the memory size of the tensor.
+
+- Parameters
+
+    - `data_len`: number of bytes in the memory after adjustment.
+
+- Returns
+
+  A value of bool indicates whether the operation is successful.
+
+#### SetData
+
+```cpp
+bool SetData(const void *data, size_t data_len);
+```
+
+Adjusts the memory data of the tensor.
+
+- Parameters
+
+    - `data`: memory address of the source data.
+    - `data_len`: length of the source data memory.
+
+- Returns
+
+  A value of bool indicates whether the operation is successful.
+
+#### ElementNum
+
+```cpp
+int64_t ElementNum() const;
+```
+
+Obtains the number of elements in a tensor.
+
+- Returns
+
+  Number of elements in a tensor.
+
+#### Clone
+
+```cpp
+Tensor Clone() const;
+```
+
+Performs a self copy.
+
+- Returns
+
+  A deep copy.
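+## Example
+
+The following end-to-end sketch is provided for illustration only and is not generated from the headers. It chains the classes documented above (Context, Serialization, Model). The header paths, the type returned by `LoadModel`, the `Buffer` element type used by `Predict`, and the `SUCCESS` status constant are assumptions based on the surrounding descriptions and may differ from the installed release.
+
+```cpp
+#include <map>
+#include <string>
+#include <vector>
+#include "include/api/context.h"        // assumed header paths
+#include "include/api/serialization.h"
+#include "include/api/model.h"
+#include "include/api/types.h"
+
+namespace ms = mindspore::api;
+
+int main() {
+  // Configure the execution environment: Ascend 310, device 0.
+  ms::Context::Instance().SetDeviceTarget(ms::kDeviceTypeAscend310).SetDeviceID(0);
+
+  // Load a MindIR model file into a graph object.
+  auto graph = ms::Serialization::LoadModel("resnet50.mindir", ms::ModelType::kMindIR);
+
+  // Wrap the graph in a Model and build it for the target device.
+  ms::Model model((ms::GraphCell(graph)));
+  if (model.Build({}) != ms::SUCCESS) {  // SUCCESS constant is an assumption
+    return 1;
+  }
+
+  // Run inference; inputs must be filled with real data in a real application.
+  std::vector<ms::Buffer> inputs;
+  std::vector<ms::Buffer> outputs;
+  if (model.Predict(inputs, &outputs) != ms::SUCCESS) {
+    return 1;
+  }
+  return 0;
+}
+```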
\ No newline at end of file diff --git a/docs/api_cpp/source_en/index.rst b/docs/api_cpp/source_en/index.rst index 779317bee1f0397ac1c5a78905b31b236f33d4f8..b5f76d3c78fd947026a99ea4a9ba8afd91355ed8 100644 --- a/docs/api_cpp/source_en/index.rst +++ b/docs/api_cpp/source_en/index.rst @@ -12,6 +12,7 @@ MindSpore C++ API class_list mindspore + api dataset vision lite diff --git a/docs/api_cpp/source_zh_cn/api.md b/docs/api_cpp/source_zh_cn/api.md index 7c0af98ec52b620064f95b13b38d80d618a8fe5d..ed98393806abbb1699deb49af6f96b9ca2229973 100644 --- a/docs/api_cpp/source_zh_cn/api.md +++ b/docs/api_cpp/source_zh_cn/api.md @@ -179,7 +179,7 @@ Status GetInputsInfo(std::vector *names, std::vector *names, std::vector> *shapes, std::vector *data_types, std::vector *mem_sizes) const; ``` -获取模型输入信息。 +获取模型输出信息。 - 参数 diff --git a/docs/api_java/source_en/index.rst b/docs/api_java/source_en/index.rst index 935aa0a5d22565b2d51fc919a8f81c00d9702b02..1a531e3f3a89cec32b3882e59a74980552ef0864 100644 --- a/docs/api_java/source_en/index.rst +++ b/docs/api_java/source_en/index.rst @@ -14,4 +14,5 @@ MindSpore Java API lite_session model msconfig - mstensor \ No newline at end of file + mstensor + lite_java_example \ No newline at end of file diff --git a/docs/api_java/source_en/lite_java_example.rst b/docs/api_java/source_en/lite_java_example.rst new file mode 100644 index 0000000000000000000000000000000000000000..9cb08fa346e469ff755473891e7676904897e7ab --- /dev/null +++ b/docs/api_java/source_en/lite_java_example.rst @@ -0,0 +1,7 @@ +Example +======== + +.. toctree:: + :maxdepth: 1 + + Quick Start \ No newline at end of file diff --git a/docs/api_python/source_en/mindspore/mindspore.ops.rst b/docs/api_python/source_en/mindspore/mindspore.ops.rst index 779103f524e5fba8fede87ba25f6bd58a6756850..7f0d5f5e61ef48d91bbaeddc3327c8a1288b5d7d 100644 --- a/docs/api_python/source_en/mindspore/mindspore.ops.rst +++ b/docs/api_python/source_en/mindspore/mindspore.ops.rst @@ -29,6 +29,7 @@ The composite operators are the pre-defined combination of operators. mindspore.ops.normal mindspore.ops.poisson mindspore.ops.repeat_elements + mindspore.ops.sequence_mask mindspore.ops.tensor_dot mindspore.ops.uniform diff --git a/docs/api_python/source_zh_cn/mindspore/mindspore.ops.rst b/docs/api_python/source_zh_cn/mindspore/mindspore.ops.rst index 779103f524e5fba8fede87ba25f6bd58a6756850..7f0d5f5e61ef48d91bbaeddc3327c8a1288b5d7d 100644 --- a/docs/api_python/source_zh_cn/mindspore/mindspore.ops.rst +++ b/docs/api_python/source_zh_cn/mindspore/mindspore.ops.rst @@ -29,6 +29,7 @@ The composite operators are the pre-defined combination of operators. mindspore.ops.normal mindspore.ops.poisson mindspore.ops.repeat_elements + mindspore.ops.sequence_mask mindspore.ops.tensor_dot mindspore.ops.uniform diff --git a/docs/note/source_en/design/overall.rst b/docs/note/source_en/design/overall.rst index bec96d2c15254cf9a888536a6cab4aff59ef9c00..5aeb51194e95a4155161c9c0475c7f23654863c2 100644 --- a/docs/note/source_en/design/overall.rst +++ b/docs/note/source_en/design/overall.rst @@ -4,5 +4,6 @@ Overall Design .. 
toctree:: :maxdepth: 1 + technical_white_paper mindspore/architecture mindspore/architecture_lite diff --git a/docs/note/source_en/design/technical_white_paper.md b/docs/note/source_en/design/technical_white_paper.md new file mode 100644 index 0000000000000000000000000000000000000000..41dec93c4019650d1a144c077a67862c33b7694d --- /dev/null +++ b/docs/note/source_en/design/technical_white_paper.md @@ -0,0 +1,5 @@ +# Technical White Paper + +Please stay tuned... + + diff --git a/docs/note/source_en/env_var_list.md b/docs/note/source_en/env_var_list.md new file mode 100644 index 0000000000000000000000000000000000000000..e6cefcfc8475228d83b2a80ba342af90e2e921e1 --- /dev/null +++ b/docs/note/source_en/env_var_list.md @@ -0,0 +1,5 @@ +# Environment Variables List + +No English version available right now, welcome to contribute. + + diff --git a/docs/note/source_en/index.rst b/docs/note/source_en/index.rst index e3aa74528572fe3a544fd4f85dd3b04f5502852e..f49f2d63f9a44196cf6027532fbfced153506b38 100644 --- a/docs/note/source_en/index.rst +++ b/docs/note/source_en/index.rst @@ -25,7 +25,9 @@ MindSpore Design And Specification benchmark network_list operator_list + syntax_list model_lite + env_var_list .. toctree:: :glob: diff --git a/docs/note/source_en/network_list_ms.md b/docs/note/source_en/network_list_ms.md index 95416313e766dae7ddaafd22da645f65861c3683..3a0e0e32dc4567ab97a9e0484aea0daaa5a02da2 100644 --- a/docs/note/source_en/network_list_ms.md +++ b/docs/note/source_en/network_list_ms.md @@ -26,21 +26,22 @@ |Computer Vision (CV) | Image Classification | [ResNext50](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/resnext50/src/image_classification.py) | Supported | Supported | Supported | Supported | Doing | Doing | Computer Vision (CV) | Image Classification | [VGG16](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/vgg16/src/vgg.py) | Supported | Supported | Supported | Supported | Doing | Doing | Computer Vision (CV) | Image Classification | [InceptionV3](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/inceptionv3/src/inception_v3.py) | Supported | Supported | Doing | Doing | Doing | Doing +| Computer Vision (CV) | Image Classification | [InceptionV4](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/inceptionv4/src/inceptionv4.py) | Supported | Doing | Doing | Doing | Doing | Doing | Computer Vision (CV) | Image Classification | [DenseNet121](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/densenet121/src/network/densenet.py) | Supported | Supported | Doing | Doing | Doing | Doing +| Computer Vision (CV) | Image Classification | [MobileNetV1](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/mobilenetv1/src/mobilenet_v1.py) | Supported | Doing | Doing | Doing | Doing | Doing | Computer Vision (CV) | Image Classification | [MobileNetV2](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/mobilenetv2/src/mobilenetV2.py) | Supported | Supported | Supported | Supported | Doing | Doing | Computer Vision (CV) | Image Classification | [MobileNetV2(Quantization)](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/mobilenetv2_quant/src/mobilenetV2.py) | Supported | Doing | Supported | Doing | Doing | Doing | Computer Vision (CV) | Image Classification | [MobileNetV3](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/mobilenetv3/src/mobilenetV3.py) | Doing | Doing | Supported | Supported | Doing | 
Doing +| Computer Vision (CV) | Image Classification | [Shufflenetv1](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/shufflenetv1/src/shufflenetv1.py) | Supported | Doing | Doing | Doing | Doing | Doing | Computer Vision (CV) | Image Classification | [NASNET](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/nasnet/src/nasnet_a_mobile.py) | Doing | Doing | Supported | Supported | Doing | Doing | Computer Vision (CV) | Image Classification | [ShuffleNetV2](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/shufflenetv2/src/shufflenetv2.py) | Doing | Doing | Supported | Supported | Doing | Doing | Computer Vision (CV) | Image Classification | [EfficientNet-B0](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/efficientnet/src/efficientnet.py) | Doing | Doing | Supported | Supported | Doing | Doing -| Computer Vision (CV) | Image Classification | [GhostNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/ghostnet/src/ghostnet.py) | Doing | Doing | Supported | Supported | Doing | Doing | Computer Vision (CV) | Image Classification | [ResNet50-0.65x](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/resnet50_adv_pruning/src/resnet_imgnet.py) | Supported | Supported | Doing | Doing | Doing | Doing | Computer Vision (CV) | Image Classification | [SSD-GhostNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/ssd_ghostnet/src/ssd_ghostnet.py) | Supported | Doing | Doing | Doing | Doing | Doing -| Computer Vision (CV) | Image Classification | [TinyNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/tinynet/src/tinynet.py) | Supported | Doing | Doing | Doing | Doing | Doing - Computer Vision(CV) | Image Classification | [FaceAttributes](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceAttribute/src/FaceAttribute/resnet18.py) | Supported | Doing | Doing | Doing | Doing | Doing -| Computer Vision(CV) | Image Classification | [FaceQualityAssessment](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceQualityAssessment/src/face_qa.py) | Supported | Doing | Doing | Doing | Doing | Doing -| Computer Vision(CV) | Image Classificationn | [FaceRecognitionForTracking](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceRecognitionForTracking/src/reid.py) | Supported | Doing | Doing | Doing | Doing | Doing -| Computer Vision (CV) | Image Classification | [SqueezeNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/squeezenet/src/squeezenet.py) | Supported | Doing | Doing | Doing | Doing | Doing + Computer Vision(CV) | Image Classification | [FaceAttributes](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceAttribute/src/FaceAttribute/resnet18.py) | Supported | Supported | Doing | Doing | Doing | Doing +| Computer Vision(CV) | Image Classification | [FaceQualityAssessment](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceQualityAssessment/src/face_qa.py) | Supported | Supported | Doing | Doing | Doing | Doing +| Computer Vision(CV) | Image Classificationn | [FaceRecognitionForTracking](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceRecognitionForTracking/src/reid.py) | Supported | Supported | Doing | Doing | Doing | Doing +| Computer Vision (CV) | Image Classification | 
[SqueezeNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/squeezenet/src/squeezenet.py) | Supported | Supported | Doing | Doing | Doing | Doing |Computer Vision (CV) | Object Detection | [SSD](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/ssd/src/ssd.py) | Supported | Supported | Supported | Supported | Supported | Supported | Computer Vision (CV) | Object Detection | [YoloV3-ResNet18](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/yolov3_resnet18/src/yolov3.py) | Supported | Supported | Doing | Doing | Doing | Doing | Computer Vision (CV) | Object Detection | [YoloV3-DarkNet53](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/yolov3_darknet53/src/yolo.py) | Supported | Supported | Supported | Supported | Doing | Doing @@ -51,28 +52,31 @@ | Computer Vision(CV) | Object Detection | [Retinaface-ResNet50](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/retinaface_resnet50/src/network.py) | Doing | Doing | Supported | Supported | Doing | Doing | Computer Vision(CV) | Object Detection | [CenterFace](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/centerface/src/centerface.py) | Supported | Doing | Doing | Doing | Doing | Doing | Computer Vision(CV) | Object Detection | [FaceDetection](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceDetection/src/FaceDetection/yolov3.py) | Supported | Doing | Doing | Doing | Doing | Doing -| Computer Vision (CV) | Object Detection | [MaskRCNN-MobileNetV1](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/maskrcnn_mobilenetv1/src/maskrcnn_mobilenetv1/mobilenetv1.py) | Supported | Doing | Doing | Doing | Doing | Doing +| Computer Vision (CV) | Object Detection | [MaskRCNN-MobileNetV1](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/maskrcnn_mobilenetv1/src/maskrcnn_mobilenetv1/mobilenetv1.py) | Supported | Supported | Doing | Doing | Doing | Doing | Computer Vision (CV) | Object Detection | [SSD-MobileNetV1-FPN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/ssd/src/mobilenet_v1_fpn.py) | Supported | Doing | Doing | Doing | Doing | Doing | Computer Vision (CV) | Object Detection | [YoloV4](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/yolov4/src/yolo.py) | Supported | Doing | Doing | Doing | Doing | Doing | Computer Vision (CV) | Text Detection | [PSENet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/psenet/src/ETSNET/etsnet.py) | Supported | Supported | Doing | Doing | Doing | Doing | Computer Vision (CV) | Text Recognition | [CNNCTC](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/cnnctc/src/cnn_ctc.py) | Supported | Supported | Doing | Doing | Doing | Doing | Computer Vision (CV) | Semantic Segmentation | [DeeplabV3](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/deeplabv3/src/nets/deeplab_v3/deeplab_v3.py) | Supported | Supported | Doing | Doing | Doing | Doing | Computer Vision (CV) | Semantic Segmentation | [UNet2D-Medical](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/unet/src/unet/unet_model.py) | Supported | Supported | Doing | Doing | Doing | Doing - Computer Vision (CV) | Keypoint Detection | [Openpose](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/openpose/src/openposenet.py) | Supported | Doing | Doing | Doing | Doing | Doing +| Computer Vision (CV) | 
Keypoint Detection | [Openpose](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/openpose/src/openposenet.py) | Supported | Doing | Doing | Doing | Doing | Doing +| Computer Vision (CV) | Optical Character Recognition | [CRNN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/crnn/src/crnn.py) | Supported | Doing | Doing | Doing | Doing | Doing | Natural Language Processing (NLP) | Natural Language Understanding | [BERT](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/bert/src/bert_model.py) | Supported | Supported | Supported | Supported | Doing | Doing | Natural Language Processing (NLP) | Natural Language Understanding | [Transformer](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/transformer/src/transformer_model.py) | Supported | Supported | Supported | Supported | Doing | Doing | Natural Language Processing (NLP) | Natural Language Understanding | [SentimentNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/lstm/src/lstm.py) | Doing | Doing | Supported | Supported | Supported | Supported | Natural Language Processing (NLP) | Natural Language Understanding | [MASS](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/mass/src/transformer/transformer_for_train.py) | Supported | Supported | Supported | Supported | Doing | Doing | Natural Language Processing (NLP) | Natural Language Understanding | [TinyBert](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/tinybert/src/tinybert_model.py) | Supported | Supported | Supported | Doing | Doing | Doing | Natural Language Processing (NLP) | Natural Language Understanding | [GNMT v2](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/gnmt_v2/src/gnmt_model/gnmt.py) | Supported | Doing | Doing | Doing | Doing | Doing -| Natural Language Processing (NLP) | Natural Language Understanding | [DS-CNN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/nlp/dscnn/src/ds_cnn.py) | Supported | Doing | Doing | Doing | Doing | Doing -| Recommender | Recommender System, CTR prediction | [DeepFM](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/recommend/deepfm/src/deepfm.py) | Supported | Supported | Supported | Supported| Doing | Doing +| Natural Language Processing (NLP) | Natural Language Understanding | [DS-CNN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/nlp/dscnn/src/ds_cnn.py) | Supported | Supported | Doing | Doing | Doing | Doing +| Natural Language Processing (NLP) | Natural Language Understanding | [TextCNN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/textcnn/src/textcnn.py) | Supported | Doing | Doing | Doing | Doing | Doing +| Recommender | Recommender System, CTR prediction | [DeepFM](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/recommend/deepfm/src/deepfm.py) | Supported | Supported | Supported | Supported | Supported | Doing | Recommender | Recommender System, Search, Ranking | [Wide&Deep](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/recommend/wide_and_deep/src/wide_and_deep.py) | Supported | Supported | Supported | Supported | Doing | Doing +| Recommender | Recommender System | [NCF](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/recommend/ncf/src/ncf.py) | Supported | Doing | Supported | Doing | Doing | Doing | Graph Neural Networks (GNN) | Text Classification | 
[GCN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/gnn/gcn/src/gcn.py) | Supported | Supported | Doing | Doing | Doing | Doing | Graph Neural Networks (GNN) | Text Classification | [GAT](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/gnn/gat/src/gat.py) | Supported | Supported | Doing | Doing | Doing | Doing | Graph Neural Networks (GNN) | Recommender System | [BGCF](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/gnn/bgcf/src/bgcf.py) | Supported | Doing | Doing | Doing | Doing | Doing | Audio | Auto Tagging | [FCN-4](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/audio/fcn-4/src/musictagger.py) | Supported | Supported | Doing | Doing | Doing | Doing -| High Performance Computing | Molecular Dynamics | [DeepPotentialH2O](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/hpc/molecular_dynamics/src/network.py) | Supported | Doing | Doing | Doing | Doing | Doing -| High Performance Computing | Ocean Model | [GOMO](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/hpc/ocean_model/src/GOMO.py) | Doing | Doing | Supported | Doing | Doing | Doing +| High Performance Computing | Molecular Dynamics | [DeepPotentialH2O](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/hpc/molecular_dynamics/src/network.py) | Supported | Supported | Doing | Doing | Doing | Doing +| High Performance Computing | Ocean Model | [GOMO](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/hpc/ocean_model/src/GOMO.py) | Doing | Doing | Supported | Supported | Doing | Doing > You can also use [MindWizard Tool](https://gitee.com/mindspore/mindinsight/tree/master/mindinsight/wizard/) to quickly generate classic network scripts. diff --git a/docs/note/source_en/static_graph_syntax_support.md b/docs/note/source_en/static_graph_syntax_support.md new file mode 100644 index 0000000000000000000000000000000000000000..47c6aa29c96a1352fe48688a9f66a72134e2800a --- /dev/null +++ b/docs/note/source_en/static_graph_syntax_support.md @@ -0,0 +1,5 @@ +# Static Graph Syntax Support + +No English version available right now, welcome to contribute. + + diff --git a/docs/note/source_en/syntax_list.rst b/docs/note/source_en/syntax_list.rst new file mode 100644 index 0000000000000000000000000000000000000000..597c59c2b324118dffe760c9e087fd773f644493 --- /dev/null +++ b/docs/note/source_en/syntax_list.rst @@ -0,0 +1,7 @@ +Syntax Support +================ + +.. 
toctree:: + :maxdepth: 1 + + static_graph_syntax_support \ No newline at end of file diff --git a/docs/note/source_zh_cn/network_list_ms.md b/docs/note/source_zh_cn/network_list_ms.md index 67d9e26b7a4b1c0987703d087c4b5c03d7cf213a..eb640e625ad16ec6ad122bb2fdee56301d619f8e 100644 --- a/docs/note/source_zh_cn/network_list_ms.md +++ b/docs/note/source_zh_cn/network_list_ms.md @@ -26,21 +26,22 @@ |计算机视觉(CV) | 图像分类(Image Classification) | [ResNext50](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/resnext50/src/image_classification.py) | Supported | Supported | Supported | Supported | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [VGG16](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/vgg16/src/vgg.py) | Supported | Supported | Supported | Supported | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [InceptionV3](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/inceptionv3/src/inception_v3.py) | Supported | Supported | Doing | Doing | Doing | Doing +| 计算机视觉(CV) | 图像分类(Image Classification) | [InceptionV4](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/inceptionv4/src/inceptionv4.py) | Supported | Doing | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [DenseNet121](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/densenet121/src/network/densenet.py) | Supported | Supported | Doing | Doing | Doing | Doing +| 计算机视觉(CV) | 图像分类(Image Classification) | [MobileNetV1](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/mobilenetv1/src/mobilenet_v1.py) | Supported | Doing | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [MobileNetV2](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/mobilenetv2/src/mobilenetV2.py) | Supported | Supported | Supported | Supported | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [MobileNetV2(量化)](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/mobilenetv2_quant/src/mobilenetV2.py) | Supported | Doing | Supported | Doing | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [MobileNetV3](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/mobilenetv3/src/mobilenetV3.py) | Doing | Doing | Supported | Supported | Doing | Doing +| 计算机视觉(CV) | 图像分类(Image Classification) | [Shufflenetv1](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/shufflenetv1/src/shufflenetv1.py) | Supported | Doing | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [NASNET](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/nasnet/src/nasnet_a_mobile.py) | Doing | Doing | Supported | Supported | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [ShuffleNetV2](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/shufflenetv2/src/shufflenetv2.py) | Doing | Doing | Supported | Supported | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [EfficientNet-B0](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/efficientnet/src/efficientnet.py) | Doing | Doing | Supported | Supported | Doing | Doing -| 计算机视觉(CV) | 图像分类(Image Classification) | [GhostNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/ghostnet/src/ghostnet.py) | Doing | Doing | Supported | Supported | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | 
[ResNet50-0.65x](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/resnet50_adv_pruning/src/resnet_imgnet.py) | Supported | Supported | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 图像分类(Image Classification) | [SSD-GhostNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/ssd_ghostnet/src/ssd_ghostnet.py) | Supported | Doing | Doing | Doing | Doing | Doing -| 计算机视觉(CV) | 图像分类(Image Classification) | [TinyNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/tinynet/src/tinynet.py) | Supported | Doing | Doing | Doing | Doing | Doing -| 计算机视觉(CV) | 图像分类(Image Classification) |[FaceAttributes](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceAttribute/src/FaceAttribute/resnet18.py) | Supported | Doing | Doing | Doing | Doing | Doing -| 计算机视觉(CV) | 图像分类(Image Classification) |[FaceQualityAssessment](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceQualityAssessment/src/face_qa.py) | Supported | Doing | Doing | Doing | Doing | Doing -| 计算机视觉(CV) | 图像分类(Image Classification) |[FaceRecognitionForTracking](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceRecognitionForTracking/src/reid.py) | Supported | Doing | Doing | Doing | Doing | Doing -| 计算机视觉(CV) | 图像分类(Image Classification) |[SqueezeNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/squeezenet/src/squeezenet.py) | Supported | Doing | Doing | Doing | Doing | Doing +| 计算机视觉(CV) | 图像分类(Image Classification) |[FaceAttributes](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceAttribute/src/FaceAttribute/resnet18.py) | Supported | Supported | Doing | Doing | Doing | Doing +| 计算机视觉(CV) | 图像分类(Image Classification) |[FaceQualityAssessment](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceQualityAssessment/src/face_qa.py) | Supported | Supported | Doing | Doing | Doing | Doing +| 计算机视觉(CV) | 图像分类(Image Classification) |[FaceRecognitionForTracking](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceRecognitionForTracking/src/reid.py) | Supported | Supported | Doing | Doing | Doing | Doing +| 计算机视觉(CV) | 图像分类(Image Classification) |[SqueezeNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/squeezenet/src/squeezenet.py) | Supported | Supported | Doing | Doing | Doing | Doing |计算机视觉(CV) | 目标检测(Object Detection) | [SSD](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/ssd/src/ssd.py) | Supported | Supported |Supported |Supported | Supported | Supported | 计算机视觉(CV) | 目标检测(Object Detection) | [YoloV3-ResNet18](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/yolov3_resnet18/src/yolov3.py) | Supported | Supported | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 目标检测(Object Detection) | [YoloV3-DarkNet53](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/yolov3_darknet53/src/yolo.py) | Supported | Supported | Supported | Supported | Doing | Doing @@ -51,28 +52,31 @@ | 计算机视觉(CV) | 目标检测(Object Detection) | [Retinaface-ResNet50](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/retinaface_resnet50/src/network.py) | Doing | Doing | Supported | Supported | Doing | Doing | 计算机视觉(CV) | 目标检测(Object Detection) | [CenterFace](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/centerface/src/centerface.py) | Supported | Doing | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 目标检测(Object 
Detection) | [FaceDetection](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/cv/FaceDetection/src/FaceDetection/yolov3.py) | Supported | Doing | Doing | Doing | Doing | Doing -| 计算机视觉(CV) | 目标检测(Object Detection) |[MaskRCNN-MobileNetV1](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/maskrcnn_mobilenetv1/src/maskrcnn_mobilenetv1/mobilenetv1.py) | Supported | Doing | Doing | Doing | Doing | Doing +| 计算机视觉(CV) | 目标检测(Object Detection) |[MaskRCNN-MobileNetV1](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/maskrcnn_mobilenetv1/src/maskrcnn_mobilenetv1/mobilenetv1.py) | Supported | Supported | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 目标检测(Object Detection) |[SSD-MobileNetV1-FPN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/ssd/src/mobilenet_v1_fpn.py) | Supported | Doing | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 目标检测(Object Detection) |[YoloV4](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/yolov4/src/yolo.py) | Supported | Doing | Doing | Doing | Doing | Doing | 计算机视觉 (CV) | 文本检测 (Text Detection) | [PSENet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/psenet/src/ETSNET/etsnet.py) | Supported | Supported | Doing | Doing | Doing | Doing | 计算机视觉 (CV) | 文本识别 (Text Recognition) | [CNNCTC](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/cnnctc/src/cnn_ctc.py) | Supported | Supported | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 语义分割(Semantic Segmentation) | [DeeplabV3](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/deeplabv3/src/nets/deeplab_v3/deeplab_v3.py) | Supported | Supported | Doing | Doing | Doing | Doing | 计算机视觉(CV) | 语义分割(Semantic Segmentation) | [UNet2D-Medical](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/unet/src/unet/unet_model.py) | Supported | Supported | Doing | Doing | Doing | Doing -| 计算机视觉(CV) | 语义分割(Semantic Segmentation) |[Openpose](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/openpose/src/openposenet.py) | Supported | Doing | Doing | Doing | Doing | Doing +| 计算机视觉(CV) | 关键点检测(Keypoint Detection) |[Openpose](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/openpose/src/openposenet.py) | Supported | Doing | Doing | Doing | Doing | Doing +| 计算机视觉(CV) | 光学字符识别(Optical Character Recognition) |[CRNN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/cv/crnn/src/crnn.py) | Supported | Doing | Doing | Doing | Doing | Doing | 自然语言处理(NLP) | 自然语言理解(Natural Language Understanding) | [BERT](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/bert/src/bert_model.py) | Supported | Supported | Supported | Supported | Doing | Doing | 自然语言处理(NLP) | 自然语言理解(Natural Language Understanding) | [Transformer](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/transformer/src/transformer_model.py) | Supported | Supported | Supported | Supported | Doing | Doing | 自然语言处理(NLP) | 自然语言理解(Natural Language Understanding) | [SentimentNet](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/lstm/src/lstm.py) | Doing | Doing | Supported | Supported | Supported | Supported | 自然语言处理(NLP) | 自然语言理解(Natural Language Understanding) | [MASS](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/mass/src/transformer/transformer_for_train.py) | Supported | Supported | Supported | Supported | Doing | Doing | 自然语言处理(NLP) | 自然语言理解(Natural 
Language Understanding) | [TinyBert](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/tinybert/src/tinybert_model.py) | Supported | Supported | Supported | Doing | Doing | Doing | 自然语言处理(NLP) | 自然语言理解(Natural Language Understanding) | [GNMT v2](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/gnmt_v2/src/gnmt_model/gnmt.py) | Supported | Doing | Doing | Doing | Doing | Doing -| 自然语言处理(NLP) | 自然语言理解(Natural Language Understanding) | [DS-CNN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/nlp/dscnn/src/ds_cnn.py) | Supported | Doing | Doing | Doing | Doing | Doing -| 推荐(Recommender) | 推荐系统、点击率预估(Recommender System, CTR prediction) | [DeepFM](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/recommend/deepfm/src/deepfm.py) | Supported | Supported | Supported | Supported| Doing | Doing +| 自然语言处理(NLP) | 自然语言理解(Natural Language Understanding) | [DS-CNN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/nlp/dscnn/src/ds_cnn.py) | Supported | Supported | Doing | Doing | Doing | Doing +| 自然语言处理(NLP) | 自然语言理解(Natural Language Understanding) | [TextCNN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/nlp/textcnn/src/textcnn.py) | Supported | Doing | Doing | Doing | Doing | Doing +| 推荐(Recommender) | 推荐系统、点击率预估(Recommender System, CTR prediction) | [DeepFM](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/recommend/deepfm/src/deepfm.py) | Supported | Supported | Supported | Supported| Supported | Doing | 推荐(Recommender) | 推荐系统、搜索、排序(Recommender System, Search, Ranking) | [Wide&Deep](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/recommend/wide_and_deep/src/wide_and_deep.py) | Supported | Supported | Supported | Supported | Doing | Doing +| 推荐(Recommender) | 推荐系统(Recommender System) | [NCF](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/recommend/ncf/src/ncf.py) | Supported | Doing | Supported | Doing| Doing | Doing | 图神经网络(GNN) | 文本分类(Text Classification) | [GCN](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/gnn/gcn/src/gcn.py) | Supported | Supported | Doing | Doing | Doing | Doing | 图神经网络(GNN) | 文本分类(Text Classification) | [GAT](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/gnn/gat/src/gat.py) | Supported | Supported | Doing | Doing | Doing | Doing | 图神经网络(GNN) | 推荐系统(Recommender System) | [BGCF](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/official/gnn/bgcf/src/bgcf.py) | Supported | Doing | Doing | Doing | Doing | Doing |语音(Audio) | 音频标注(Audio Tagging) | [FCN-4](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/audio/fcn-4/src/musictagger.py) | Supported | Supported | Doing | Doing | Doing | Doing -|高性能计算(HPC) | 分子动力学(Molecular Dynamics) | [DeepPotentialH2O](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/hpc/molecular_dynamics/src/network.py) | Supported | Doing | Doing | Doing | Doing | Doing -|高性能计算(HPC) | 海洋模型(Ocean Model) | [GOMO](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/hpc/ocean_model/src/GOMO.py) | Doing | Doing | Supported | Doing | Doing | Doing +|高性能计算(HPC) | 分子动力学(Molecular Dynamics) | [DeepPotentialH2O](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/hpc/molecular_dynamics/src/network.py) | Supported | Supported| Doing | Doing | Doing | Doing +|高性能计算(HPC) | 海洋模型(Ocean Model) | 
[GOMO](https://gitee.com/mindspore/mindspore/blob/master/model_zoo/research/hpc/ocean_model/src/GOMO.py) | Doing | Doing | Supported | Supported | Doing | Doing > 你也可以使用 [MindWizard工具](https://gitee.com/mindspore/mindinsight/tree/master/mindinsight/wizard/) 快速生成经典网络脚本。 diff --git a/docs/programming_guide/source_en/cache.md b/docs/programming_guide/source_en/cache.md new file mode 100644 index 0000000000000000000000000000000000000000..71473379768d27be1f09b9346d59a0aecceeaf94 --- /dev/null +++ b/docs/programming_guide/source_en/cache.md @@ -0,0 +1,5 @@ +# Single Node Data Cache + +No English version available right now, welcome to contribute. + + diff --git a/docs/programming_guide/source_en/context.md b/docs/programming_guide/source_en/context.md index 709e994e1e140c79786e5e2d62d14c7757f8e92b..20bd3e4633f479ace0e27a7abdddbe18ac85ecfd 100644 --- a/docs/programming_guide/source_en/context.md +++ b/docs/programming_guide/source_en/context.md @@ -118,13 +118,25 @@ The system can collect profiling data during training and use the profiling tool - `enable_profiling`: indicates whether to enable the profiling function. If this parameter is set to True, the profiling function is enabled, and profiling options are read from enable_options. If this parameter is set to False, the profiling function is disabled and only training_trace is collected. -- `profiling_options`: profiling collection options. The values are as follows. Multiple data items can be collected. training_trace: collects step trace data, that is, software information about training tasks and AI software stacks, to analyze the performance of training tasks. It focuses on data argumentation, forward and backward computation, and gradient aggregation update. task_trace: collects task trace data, that is, hardware information of the Ascend 910 processor HWTS/AICore and analysis of task start and end information. op_trace: collects performance data of a single operator. Format: ['op_trace','task_trace','training_trace'] +- `profiling_options`: profiling collection options. The values are as follows. Multiple data items can be collected. + result_path: saving the path of the profiling collection result file. The directory spectified by this parameter needs to be created in advance on the training environment (container or host side) and ensure that the running user configured during installation has read and write permissions. It supports the configuration of absolute or relative paths(relative to the current path when executing the command line). The absolute path configuration starts with '/', for example:/home/data/output. The relative path configuration directly starts with the directory name, for example:output; + training_trace: collect iterative trajectory data, that is, the training task and software information of the AI software stack, to achieve performance analysis of the training task, focusing on data enhancement, forward and backward calculation, gradient aggregation update and other related data. The value is on/off; + task_trace: collect task trajectory data, that is, the hardware information of the HWTS/AICore of the Ascend 910 processor, and analyze the information of beginning and ending of the task. The value is on/off; + aicpu_trace: collect profiling data enhanced by aicpu data. The value is on/off; + fp_point: specify the start position of the forward operator of the training network iteration trajectory, which is used to record the start timestamp of the forward calculation. 
The configuration value is the name of the first operator specified in the forward direction. when the value is empty, the system will automatically obtain the forward operator name; + bp_point: specify the end position of the iteration trajectory reversal operator of the training network, record the end timestamp of the backward calculation. The configuration value is the name of the operator after the specified reverse. when the value is empty, the system will automatically obtain the backward operator name; + ai_core_metrics: the values are as follows: + - ArithmeticUtilization: percentage statistics of various calculation indicators; + - PipeUtilization: the time-consuming ratio of calculation unit and handling unit, this item is the default value; + - Memory: percentage of external memory read and write instructions; + - MemoryL0: percentage of internal memory read and write instructions; + - ResourceConflictRatio: proportion of pipline queue instructions. A code example is as follows: ```python from mindspore import context -context.set_context(enable_profiling=True, profiling_options="training_trace") +context.set_context(enable_profiling=True, profiling_options='{"result_path":"/home/data/output","training_trace":"on"}') ``` ### Saving MindIR diff --git a/docs/programming_guide/source_en/data_pipeline.rst b/docs/programming_guide/source_en/data_pipeline.rst index 75d7846d2d8692dc3031b80737d5daaee0c487d4..0e52d9ddf432e0ea22730d34e8ccf448f617c014 100644 --- a/docs/programming_guide/source_en/data_pipeline.rst +++ b/docs/programming_guide/source_en/data_pipeline.rst @@ -11,3 +11,4 @@ Data Pipeline tokenizer dataset_conversion auto_augmentation + cache diff --git a/docs/programming_guide/source_en/probability.md b/docs/programming_guide/source_en/probability.md index 56aa7ea8333d8896d3f5a1740b304123ccf68ac7..f79546780c6f6e6fdd0a904c8f4c9a92aaadb8a6 100644 --- a/docs/programming_guide/source_en/probability.md +++ b/docs/programming_guide/source_en/probability.md @@ -361,23 +361,28 @@ mean_b = Tensor(1.0, dtype=mstype.float32) sd_b = Tensor(2.0, dtype=mstype.float32) kl = my_normal.kl_loss('Normal', mean_b, sd_b) +# get the distribution args as a tuple +dist_arg = my_normal.get_dist_args() + print("mean: ", mean) print("var: ", var) print("entropy: ", entropy) print("prob: ", prob) print("cdf: ", cdf) print("kl: ", kl) +print("dist_arg: ", dist_arg) ``` The output is as follows: ```python -mean: 0.0 -var: 1.0 -entropy: 1.4189385 -prob: [0.35206532, 0.3989423, 0.35206532] -cdf: [0.3085482, 0.5, 0.6914518] -kl: 0.44314718 +mean:  0.0 +var:  1.0 +entropy:  1.4189385 +prob:  [0.35206532 0.3989423  0.35206532] +cdf:  [0.30853754 0.5        0.69146246] +kl:  0.44314718 +dist_arg: (Tensor(shape=[], dtype=Float32, value= 0), Tensor(shape=[], dtype=Float32, value= 1)) ``` ### Probability Distribution Class Application in Graph Mode @@ -463,7 +468,7 @@ tx = Tensor(x, dtype=dtype.float32) cdf = LogNormal.cdf(tx) # generate samples from the distribution -shape = ((3, 2)) +shape = (3, 2) sample = LogNormal.sample(shape) # get information of the distribution @@ -473,26 +478,24 @@ print("underlying distribution:\n", LogNormal.distribution) print("bijector:\n", LogNormal.bijector) # get the computation results print("cdf:\n", cdf) -print("sample:\n", sample) +print("sample shape:\n", sample.shape) ``` The output is as follows: ```python TransformedDistribution< - (_bijector): Exp - (_distribution): Normal - > +  (_bijector): Exp +  (_distribution): Normal +  > underlying distribution: - Normal + Normal 
bijector: - Exp + Exp cdf: - [0.7558914 0.9462397 0.9893489] -sample: - [[ 3.451917 0.645654 ] - [ 0.86533326 1.2023963 ] - [ 2.3343778 11.053896 ]] + [0.7558914 0.9462397 0.9893489] +sample shape: +(3, 2) ``` When the `TransformedDistribution` is constructed to map the transformed `is_constant_jacobian = true` (for example, `ScalarAffine`), the constructed `TransformedDistribution` instance can use the `mean` API to calculate the average value. For example: @@ -544,15 +547,14 @@ x = np.array([2.0, 3.0, 4.0, 5.0]).astype(np.float32) tx = Tensor(x, dtype=dtype.float32) cdf, sample = net(tx) print("cdf: ", cdf) -print("sample: ", sample) +print("sample shape: ", sample.shape) ``` The output is as follows: ```python cdf: [0.7558914 0.86403143 0.9171715 0.9462397 ] -sample: [[0.5361498 0.26627186 2.766659 ] - [1.5831033 0.4096472 2.008679 ]] +sample shape: (2, 3) ``` ## Probability Distribution Mapping @@ -694,11 +696,11 @@ print("inverse_log_jacobian: ", inverse_log_jaco) The output is as follows: ```python -PowerTransform -forward: [2.23606801e+00, 2.64575124e+00, 3.00000000e+00, 3.31662488e+00] -inverse: [1.50000000e+00, 4.00000048e+00, 7.50000000e+00, 1.20000010e+01] -forward_log_jacobian: [-8.04718971e-01, -9.72955048e-01, -1.09861231e+00, -1.19894767e+00] -inverse_log_jacobian: [6.93147182e-01 1.09861231e+00 1.38629436e+00 1.60943794e+00] +PowerTransform +forward:  [2.236068  2.6457515 3.        3.3166249] +inverse:  [ 1.5       4.        7.5      12.000001] +forward_log_jacobian:  [-0.804719  -0.9729551 -1.0986123 -1.1989477] +inverse_log_jacobian:  [0.6931472 1.0986123 1.3862944 1.609438 ] ``` ### Invoking a Bijector Instance in Graph Mode @@ -740,10 +742,10 @@ print("inverse_log_jaco: ", inverse_log_jaco) The output is as follows: ```python -forward: [2.236068 2.6457512 3. 3.3166249] -inverse: [ 1.5 4.0000005 7.5 12.000001 ] -forward_log_jaco: [-0.804719 -0.97295505 -1.0986123 -1.1989477 ] -inverse_log_jaco: [0.6931472 1.0986123 1.3862944 1.609438 ] +forward:  [2.236068  2.6457515 3.        3.3166249] +inverse:  [ 1.5       4.        
7.5      12.000001] +forward_log_jacobian:  [-0.804719  -0.9729551 -1.0986123 -1.1989477] +inverse_log_jacobian:  [0.6931472 1.0986123 1.3862944 1.609438 ] ``` ## Deep Probabilistic Network diff --git a/docs/programming_guide/source_zh_cn/context.md b/docs/programming_guide/source_zh_cn/context.md index 40723c9363079be090f52bad4e4ced5e2e7130e9..f694038a60b12d0e2e6386d2c9e13b23c67820f9 100644 --- a/docs/programming_guide/source_zh_cn/context.md +++ b/docs/programming_guide/source_zh_cn/context.md @@ -122,13 +122,25 @@ context.set_auto_parallel_context(parallel_mode=ParallelMode.AUTO_PARALLEL, grad - `enable_profiling`:是否开启profiling功能。设置为True,表示开启profiling功能,从enable_options读取profiling的采集选项;设置为False,表示关闭profiling功能,仅采集training_trace。 -- `profiling_options`:profiling采集选项,取值如下,支持采集多项数据。training_trace:采集迭代轨迹数据,即训练任务及AI软件栈的软件信息,实现对训练任务的性能分析,重点关注数据增强、前后向计算、梯度聚合更新等相关数据;task_trace:采集任务轨迹数据,即昇腾910处理器HWTS/AICore的硬件信息,分析任务开始、结束等信息;op_trace:采集单算子性能数据。 +- `profiling_options`:profiling采集选项,取值如下,支持采集多项数据。 + result_path: Profiling采集结果文件保存路径。该参数指定的目录需要在启动训练的环境上(容器或Host侧)提前创建且确保安装时配置的运行用户具有读写权限,支持配置绝对路径或相对路径(相对执行命令时的当前路径); + training_trace:采集迭代轨迹数据,即训练任务及AI软件栈的软件信息,实现对训练任务的性能分析,重点关注数据增强、前后向计算、梯度聚合更新等相关数据,取值on/off。 + task_trace:采集任务轨迹数据,即昇腾910处理器HWTS/AICore的硬件信息,分析任务开始、结束等信息,取值on/off; + aicpu_trace: 采集aicpu数据增强的profiling数据。取值on/off; + fp_point: training_trace为on时需要配置。指定训练网络迭代轨迹正向算子的开始位置,用于记录前向算子开始时间戳。配置值为指定的正向第一个算子名字。当该值为空时,系统自动获取正向第一个算子名字; + bp_point: training_trace为on时需要配置。指定训练网络迭代轨迹反向算子的结束位置,用于记录反向算子结束时间戳。配置值为指定的反向最后一个算子名字。当该值为空时,系统自动获取反向最后一个算子名字; + ai_core_metrics: 取值如下: + - ArithmeticUtilization: 各种计算类指标占比统计。 + - PipeUtilization: 计算单元和搬运单元耗时占比,该项为默认值。 + - Memory: 外部内存读写类指令占比。 + - MemoryL0: 内部内存读写类指令占比。 + - ResourceConflictRatio: 流水线队列类指令占比。 代码样例如下: ```python from mindspore import context -context.set_context(enable_profiling=True, profiling_options="training_trace") +context.set_context(enable_profiling=True, profiling_options= '{"result_path":"/home/data/output","training_trace":"on"}') ``` ### 保存MindIR diff --git a/docs/programming_guide/source_zh_cn/probability.md b/docs/programming_guide/source_zh_cn/probability.md index ea6cd8e22217e580648faa1be465ab41cd1c9e20..dafa7d36343456d38b09d822b4bcd1b57934a4c9 100644 --- a/docs/programming_guide/source_zh_cn/probability.md +++ b/docs/programming_guide/source_zh_cn/probability.md @@ -361,23 +361,28 @@ mean_b = Tensor(1.0, dtype=mstype.float32) sd_b = Tensor(2.0, dtype=mstype.float32) kl = my_normal.kl_loss('Normal', mean_b, sd_b) +# get the distribution args as a tuple +dist_arg = my_normal.get_dist_args() + print("mean: ", mean) print("var: ", var) print("entropy: ", entropy) print("prob: ", prob) print("cdf: ", cdf) print("kl: ", kl) +print("dist_arg: ", dist_arg) ``` 输出为: ```text -mean: 0.0 -var: 1.0 -entropy: 1.4189385 -prob: [0.35206532, 0.3989423, 0.35206532] -cdf: [0.3085482, 0.5, 0.6914518] -kl: 0.44314718 +mean:  0.0 +var:  1.0 +entropy:  1.4189385 +prob:  [0.35206532 0.3989423  0.35206532] +cdf:  [0.30853754 0.5        0.69146246] +kl:  0.44314718 +dist_arg: (Tensor(shape=[], dtype=Float32, value= 0), Tensor(shape=[], dtype=Float32, value= 1)) ``` ### 概率分布类在图模式下的应用 @@ -465,7 +470,7 @@ tx = Tensor(x, dtype=dtype.float32) cdf = LogNormal.cdf(tx) # generate samples from the distribution -shape = ((3, 2)) +shape = (3, 2) sample = LogNormal.sample(shape) # get information of the distribution @@ -475,26 +480,24 @@ print("underlying distribution:\n", LogNormal.distribution) print("bijector:\n", LogNormal.bijector) # get the computation results 
print("cdf:\n", cdf) -print("sample:\n", sample) +print("sample shape:\n", sample.shape) ``` 输出为: ```text TransformedDistribution< - (_bijector): Exp - (_distribution): Normal - > +  (_bijector): Exp +  (_distribution): Normal +  > underlying distribution: -Normal -bijector -Exp + Normal +bijector: + Exp cdf: -[7.55891383e-01, 9.46239710e-01, 9.89348888e-01] -sample: -[[7.64315844e-01, 3.01435232e-01], - [1.17166102e+00, 2.60277224e+00], - [7.02699006e-01, 3.91564220e-01]] + [0.7558914 0.9462397 0.9893489] +sample shape: +(3, 2) ``` 当构造 `TransformedDistribution` 映射变换的 `is_constant_jacobian = true` 时(如 `ScalarAffine`),构造的 `TransformedDistribution` 实例可以使用直接使用 `mean` 接口计算均值,例如: @@ -546,15 +549,14 @@ x = np.array([2.0, 3.0, 4.0, 5.0]).astype(np.float32) tx = Tensor(x, dtype=dtype.float32) cdf, sample = net(tx) print("cdf: ", cdf) -print("sample: ", sample) +print("sample shape: ", sample.shape) ``` 输出为: ```text cdf: [0.7558914 0.86403143 0.9171715 0.9462397 ] -sample: [[0.5361498 0.26627186 2.766659 ] - [1.5831033 0.4096472 2.008679 ]] +sample shape: (2, 3) ``` ## 概率分布映射 @@ -695,11 +697,11 @@ print("inverse_log_jacobian: ", inverse_log_jaco) 输出: ```text -PowerTransform -forward: [2.23606801e+00, 2.64575124e+00, 3.00000000e+00, 3.31662488e+00] -inverse: [1.50000000e+00, 4.00000048e+00, 7.50000000e+00, 1.20000010e+01] -forward_log_jacobian: [-8.04718971e-01, -9.72955048e-01, -1.09861231e+00, -1.19894767e+00] -inverse_log_jacobian: [6.93147182e-01 1.09861231e+00 1.38629436e+00 1.60943794e+00] +PowerTransform +forward:  [2.236068  2.6457515 3.        3.3166249] +inverse:  [ 1.5       4.        7.5      12.000001] +forward_log_jacobian:  [-0.804719  -0.9729551 -1.0986123 -1.1989477] +inverse_log_jacobian:  [0.6931472 1.0986123 1.3862944 1.609438 ] ``` ### 图模式下调用Bijector实例 @@ -741,10 +743,10 @@ print("inverse_log_jacobian: ", inverse_log_jaco) 输出为: ```text -forward: [2.236068 2.6457515 3. 3.3166249] -inverse: [ 1.5 4. 7.5 12.000001] -forward_log_jacobian: [-0.804719 -0.9729551 -1.0986123 -1.1989477] -inverse_log_jacobian: [0.6931472 1.0986123 1.3862944 1.609438 ] +forward:  [2.236068  2.6457515 3.        3.3166249] +inverse:  [ 1.5       4.        
7.5      12.000001] +forward_log_jacobian:  [-0.804719  -0.9729551 -1.0986123 -1.1989477] +inverse_log_jacobian:  [0.6931472 1.0986123 1.3862944 1.609438 ] ``` ## 深度概率网络 diff --git a/install/mindspore_ascend310_install_pip.md b/install/mindspore_ascend310_install_pip.md index 31b43e5a4adba665e47ceb3472b3abad3126fc12..96d77a6d6a3b58060d935d2de213b005e7a1cb4b 100644 --- a/install/mindspore_ascend310_install_pip.md +++ b/install/mindspore_ascend310_install_pip.md @@ -43,7 +43,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 - `{system}`表示系统版本,例如使用的欧拉系统ARM架构,`{system}`应写为`euleros_aarch64`,目前Ascend 310版本可支持以下系统`euleros_aarch64`/`centos_aarch64`/`centos_x86`/`ubuntu_aarch64`/`ubuntu_x86`。 diff --git a/install/mindspore_ascend310_install_pip_en.md b/install/mindspore_ascend310_install_pip_en.md index 340b98c4c12345e8591ad23e8af0046c8a36a891..ee759e8e06103b39bcbbc137c5e8efa061431948 100644 --- a/install/mindspore_ascend310_install_pip_en.md +++ b/install/mindspore_ascend310_install_pip_en.md @@ -1 +1,121 @@ -# Installing MindSpore in Ascend 310 by pip +# Installing MindSpore in Ascend 310 by pip + + + +- [Installing MindSpore in Ascend 310 by pip](#installing-mindspore-in-ascend-310-by-pip) + - [Checking System Environment Information](#checking-system-environment-information) + - [Installing MindSpore](#installing-mindspore) + - [Configuring Environment Variables](#configuring-environment-variables) + - [Verifying the Installation](#verifying-the-installation) + - [Installing MindSpore Serving](#installing-mindspore-serving) + + + + + +The following describes how to quickly install MindSpore by pip on Linux in the Ascend 310 environment. + +## Checking System Environment Information + +- Ensure that the 64-bit Ubuntu 18.04, CentOS 7.6, or EulerOS 2.8 is installed. +- Ensure that [GCC 7.3.0](http://ftp.gnu.org/gnu/gcc/gcc-7.3.0/gcc-7.3.0.tar.gz) is installed. +- Ensure that [GMP 6.1.2](https://gmplib.org/download/gmp/gmp-6.1.2.tar.xz) is installed. +- Ensure that [CMake 3.18.3 or later](https://cmake.org/download/) is installed. + - After installation, add the path of CMake to the system environment variables. +- Ensure that Python 3.7.5 is installed. + - If Python 3.7.5 (64-bit) is not installed, download it from the [Python official website](https://www.python.org/ftp/python/3.7.5/Python-3.7.5.tgz) or [HUAWEI CLOUD](https://mirrors.huaweicloud.com/python/3.7.5/Python-3.7.5.tgz) and install it. +- Ensure that the Ascend 310 AI Processor software packages (Atlas Data Center Solution V100R020C10: [A300-3000 1.0.7.SPC103 (aarch64)](https://support.huawei.com/enterprise/en/ascend-computing/a300-3000-pid-250702915/software/251999079?idAbsPath=fixnode01%7C23710424%7C251366513%7C22892968%7C250702915), [A300-3010 1.0.7.SPC103 (x86_64)](https://support.huawei.com/enterprise/en/ascend-computing/a300-3010-pid-251560253/software/251894987?idAbsPath=fixnode01%7C23710424%7C251366513%7C22892968%7C251560253), [CANN V100R020C10](https://support.huawei.com/enterprise/en/ascend-computing/cann-pid-251168373/software/251174283?idAbsPath=fixnode01%7C23710424%7C251366513%7C22892968%7C251168373)) are installed. 
+ - Ensure that you have permissions to access the installation path `/usr/local/Ascend` of the Ascend 310 AI Processor software package. If not, ask the user root to add you to a user group to which `/usr/local/Ascend` belongs. For details about the configuration, see the description document in the software package. + - Ensure that the Ascend 310 AI Processor software package that matches GCC 7.3 is installed. + - Install the .whl package provided with the Ascend 310 AI Processor software package. The .whl package is released with the software package. After the software package is upgraded, you need to reinstall the .whl package. + + ```bash + pip install /usr/local/Ascend/atc/lib64/topi-{version}-py3-none-any.whl + pip install /usr/local/Ascend/atc/lib64/te-{version}-py3-none-any.whl + ``` + +## Installing MindSpore + +```bash +pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSpore/ascend/{system}/mindspore_ascend-{version}-cp37-cp37m-linux_{arch}.whl --trusted-host ms-release.obs.cn-north-4.myhuaweicloud.com -i https://pypi.tuna.tsinghua.edu.cn/simple +``` + +In the preceding information: + +- When the network is connected, dependencies of the MindSpore installation package are automatically downloaded during the .whl package installation. For details about dependencies, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt). In other cases, install the dependencies by yourself. +- `{version}` specifies the MindSpore version number. For example, when installing MindSpore 1.1.0, set `{version}` to 1.1.0. +- `{arch}` specifies the system architecture. For example, if a Linux OS architecture is x86_64, set `{arch}` to `x86_64`. If the system architecture is ARM64, set `{arch}` to `aarch64`. +- `{system}` specifies the system version. For example, if EulerOS ARM64 is used, set `{system}` to `euleros_aarch64`. Currently, Ascend 310 supports the following systems: `euleros_aarch64`, `centos_aarch64`, `centos_x86`, `ubuntu_aarch64`, and `ubuntu_x86`. + +## Configuring Environment Variables + +After MindSpore is installed, export runtime environment variables. In the following command, `/usr/local/Ascend` in `LOCAL_ASCEND=/usr/local/Ascend` indicates the installation path of the software package. Change it to the actual installation path. + +```bash +# control log level. 0-DEBUG, 1-INFO, 2-WARNING, 3-ERROR, default level is WARNING. 
+export GLOG_v=2 + +# Conda environmental options +LOCAL_ASCEND=/usr/local/Ascend # the root directory of run package + +# lib libraries that the run package depends on +export LD_LIBRARY_PATH=${LOCAL_ASCEND}/ascend-toolkit/latest/acllib/lib64:${LOCAL_ASCEND}/ascend-toolkit/latest/atc/lib64:${LOCAL_ASCEND}/driver/lib64:${LOCAL_ASCEND}/opp/op_impl/built-in/ai_core/tbe/op_tiling:${LD_LIBRARY_PATH} + +# lib libraries that the mindspore depends on +export LD_LIBRARY_PATH=`pip3 show mindspore-ascend | grep Location | awk '{print $2"/mindspore/lib"}' | xargs realpath`:${LD_LIBRARY_PATH} + +# Environment variables that must be configured +export TBE_IMPL_PATH=${LOCAL_ASCEND}/ascend-toolkit/latest/opp/op_impl/built-in/ai_core/tbe # TBE operator implementation tool path +export ASCEND_OPP_PATH=${LOCAL_ASCEND}/ascend-toolkit/latest/opp # OPP path +export PATH=${LOCAL_ASCEND}/ascend-toolkit/latest/fwkacllib/ccec_compiler/bin/:${PATH} # TBE operator compilation tool path +export PYTHONPATH=${TBE_IMPL_PATH}:${PYTHONPATH} # Python library that TBE implementation depends on +``` + +## Verifying the Installation + +Create a directory to store the sample code project, for example, `/home/HwHiAiUser/Ascend/ascend-toolkit/20.0.RC1/acllib_linux.arm64/sample/acl_execute_model/ascend310_single_op_sample`. You can obtain the code from the [official website](https://obs.dualstack.cn-north-4.myhuaweicloud.com/mindspore-website/sample_resources/ascend310_single_op_sample.zip). A simple example of adding `[1, 2, 3, 4]` to `[2, 3, 4, 5]` is used and the code project directory structure is as follows: + +```text + +└─ascend310_single_op_sample + ├── CMakeLists.txt // Build script + ├── README.md // Usage description + ├── main.cc // Main function + └── tensor_add.mindir // MindIR model file +``` + +Go to the directory of the sample project and change the path based on the actual requirements. + +```bash +cd /home/HwHiAiUser/Ascend/ascend-toolkit/20.0.RC1/acllib_linux.arm64/sample/acl_execute_model/ascend310_single_op_sample +``` + +Build a project by referring to `README.md`. + +```bash +cmake . -DMINDSPORE_PATH=`pip3 show mindspore-ascend | grep Location | awk '{print $2"/mindspore"}' | xargs realpath` +make +``` + +After the build is successful, execute the case. + +```bash +./tensor_add_sample +``` + +The following information is displayed: + +```text +3 +5 +7 +9 +``` + +The preceding information indicates that MindSpore is successfully installed. + +## Installing MindSpore Serving + +If you want to quickly experience the MindSpore online inference service, you can install MindSpore Serving. + +For details, see [MindSpore Serving](https://gitee.com/mindspore/serving/blob/master/README.md). 
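For reference, here is the `pip install` command from the Installing MindSpore section above with its placeholders filled in. The concrete values (MindSpore 1.1.0 on an Ubuntu x86_64 host with Python 3.7) are only an illustrative assumption; substitute the `{version}`, `{system}`, and `{arch}` values that match your environment.

```bash
# Illustrative substitution: {version}=1.1.0, {system}=ubuntu_x86, {arch}=x86_64.
# Adjust these values to your own environment before running the command.
pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/1.1.0/MindSpore/ascend/ubuntu_x86/mindspore_ascend-1.1.0-cp37-cp37m-linux_x86_64.whl --trusted-host ms-release.obs.cn-north-4.myhuaweicloud.com -i https://pypi.tuna.tsinghua.edu.cn/simple
```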
diff --git a/install/mindspore_ascend310_install_source.md b/install/mindspore_ascend310_install_source.md index 1eef13b06700ddd4e4d36f254fbb6d1adb82a450..f3cb418e881abcb992ed6364c2f4218654a4f6e6 100644 --- a/install/mindspore_ascend310_install_source.md +++ b/install/mindspore_ascend310_install_source.md @@ -76,7 +76,7 @@ pip install output/mindspore-ascend-{version}-cp37-cp37m-linux_{arch}.whl -i htt 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 ## 配置环境变量 diff --git a/install/mindspore_ascend310_install_source_en.md b/install/mindspore_ascend310_install_source_en.md index 4827c91e89727c6d3cc3d430ecf786c06fc0fb1a..316cf1f3695bad4af5b7dabaa7048c593539496a 100644 --- a/install/mindspore_ascend310_install_source_en.md +++ b/install/mindspore_ascend310_install_source_en.md @@ -1 +1,153 @@ -# Installing MindSpore in Ascend 310 by Source Code +# Installing MindSpore in Ascend 310 by Source Code Compilation + + + +- [Installing MindSpore in Ascend 310 by Source Code Compilation](#installing-mindspore-in-ascend-310-by-source-code-compilation) + - [Checking System Environment Information](#checking-system-environment-information) + - [Downloading Source Code from the Code Repository](#downloading-source-code-from-the-code-repository) + - [Building MindSpore](#building-mindspore) + - [Installing MindSpore](#installing-mindspore) + - [Configuring Environment Variables](#configuring-environment-variables) + - [Verifying the Installation](#verifying-the-installation) + - [Installing MindSpore Serving](#installing-mindspore-serving) + + + + + +The following describes how to quickly install MindSpore by compiling the source code on Linux in the Ascend 310 environment. + +## Checking System Environment Information + +- Ensure that the 64-bit Ubuntu 18.04, CentOS 7.6, or EulerOS 2.8 is installed. +- Ensure that [GCC 7.3.0](http://ftp.gnu.org/gnu/gcc/gcc-7.3.0/gcc-7.3.0.tar.gz) is installed. +- Ensure that [GMP 6.1.2](https://gmplib.org/download/gmp/gmp-6.1.2.tar.xz) is installed. +- Ensure that [Python 3.7.5](https://www.python.org/ftp/python/3.7.5/Python-3.7.5.tgz) is installed. +- Ensure that [OpenSSL 1.1.1 or later](https://github.com/openssl/openssl.git) is installed. + - After installation, set the environment variable `export OPENSSL_ROOT_DIR="OpenSSL installation directory"`. +- Ensure that [CMake 3.18.3 or later](https://cmake.org/download/) is installed. + - After installation, add the path of CMake to the system environment variables. +- Ensure that [patch 2.5 or later](http://ftp.gnu.org/gnu/patch/) is installed. + - After installation, add the patch path to the system environment variables. +- Ensure that [wheel 0.32.0 or later](https://pypi.org/project/wheel/) is installed. 
+- Ensure that the Ascend 310 AI Processor software packages (Atlas Data Center Solution V100R020C10: [A300-3000 1.0.7.SPC103 (aarch64)](https://support.huawei.com/enterprise/en/ascend-computing/a300-3000-pid-250702915/software/251999079?idAbsPath=fixnode01%7C23710424%7C251366513%7C22892968%7C250702915), [A300-3010 1.0.7.SPC103 (x86_64)](https://support.huawei.com/enterprise/en/ascend-computing/a300-3010-pid-251560253/software/251894987?idAbsPath=fixnode01%7C23710424%7C251366513%7C22892968%7C251560253), [CANN V100R020C10](https://support.huawei.com/enterprise/en/ascend-computing/cann-pid-251168373/software/251174283?idAbsPath=fixnode01%7C23710424%7C251366513%7C22892968%7C251168373)) are installed. + - Ensure that you have permissions to access the installation path `/usr/local/Ascend` of the Ascend 310 AI Processor software package. If not, ask the user root to add you to a user group to which `/usr/local/Ascend` belongs. For details about the configuration, see the description document in the software package. + - Ensure that the Ascend 310 AI Processor software package that matches GCC 7.3 is installed. + - Install the .whl package provided with the Ascend 310 AI Processor software package. The .whl package is released with the software package. After the software package is upgraded, you need to reinstall the .whl package. + + ```bash + pip install /usr/local/Ascend/atc/lib64/topi-{version}-py3-none-any.whl + pip install /usr/local/Ascend/atc/lib64/te-{version}-py3-none-any.whl + ``` + +- Ensure that the git tool is installed. + If not, run the following command to download and install it: + + ```bash + apt-get install git # ubuntu and so on + yum install git # centos and so on + ``` + +## Downloading Source Code from the Code Repository + +```bash +git clone https://gitee.com/mindspore/mindspore.git +``` + +## Building MindSpore + +Run the following command in the root directory of the source code. + +```bash +bash build.sh -e ascend -V 310 +``` + +In the preceding information: + +The default number of build threads is 8 in `build.sh`. If the compiler performance is poor, build errors may occur. You can add -j{Number of threads} to script to reduce the number of threads. For example, `bash build.sh -e ascend -V 310 -j4`. + +## Installing MindSpore + +```bash +chmod +x output/mindspore-ascend-{version}-cp37-cp37m-linux_{arch}.whl +pip install output/mindspore-ascend-{version}-cp37-cp37m-linux_{arch}.whl -i https://pypi.tuna.tsinghua.edu.cn/simple +``` + +In the preceding information: + +- When the network is connected, dependencies of the MindSpore installation package are automatically downloaded during the .whl package installation. For details about dependencies, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt). In other cases, install the dependencies by yourself. +- `{version}` specifies the MindSpore version number. For example, when installing MindSpore 1.1.0, set `{version}` to 1.1.0. +- `{arch}` specifies the system architecture. For example, if a Linux OS architecture is x86_64, set `{arch}` to `x86_64`. If the system architecture is ARM64, set `{arch}` to `aarch64`. + +## Configuring Environment Variables + +After MindSpore is installed, export runtime environment variables. In the following command, `/usr/local/Ascend` in `LOCAL_ASCEND=/usr/local/Ascend` indicates the installation path of the software package. Change it to the actual installation path. + +```bash +# control log level. 
0-DEBUG, 1-INFO, 2-WARNING, 3-ERROR, default level is WARNING. +export GLOG_v=2 + +# Conda environmental options +LOCAL_ASCEND=/usr/local/Ascend # the root directory of run package + +# lib libraries that the run package depends on +export LD_LIBRARY_PATH=${LOCAL_ASCEND}/ascend-toolkit/latest/acllib/lib64:${LOCAL_ASCEND}/ascend-toolkit/latest/atc/lib64:${LOCAL_ASCEND}/driver/lib64:${LOCAL_ASCEND}/opp/op_impl/built-in/ai_core/tbe/op_tiling:${LD_LIBRARY_PATH} + +# lib libraries that the mindspore depends on +export LD_LIBRARY_PATH=`pip3 show mindspore-ascend | grep Location | awk '{print $2"/mindspore/lib"}' | xargs realpath`:${LD_LIBRARY_PATH} + +# Environment variables that must be configured +export TBE_IMPL_PATH=${LOCAL_ASCEND}/ascend-toolkit/latest/opp/op_impl/built-in/ai_core/tbe # TBE operator implementation tool path +export ASCEND_OPP_PATH=${LOCAL_ASCEND}/ascend-toolkit/latest/opp # OPP path +export PATH=${LOCAL_ASCEND}/ascend-toolkit/latest/fwkacllib/ccec_compiler/bin/:${PATH} # TBE operator compilation tool path +export PYTHONPATH=${TBE_IMPL_PATH}:${PYTHONPATH} # Python library that TBE implementation depends on +``` + +## Verifying the Installation + +Create a directory to store the sample code project, for example, `/home/HwHiAiUser/Ascend/ascend-toolkit/20.0.RC1/acllib_linux.arm64/sample/acl_execute_model/ascend310_single_op_sample`. You can obtain the code from the [official website](https://obs.dualstack.cn-north-4.myhuaweicloud.com/mindspore-website/sample_resources/ascend310_single_op_sample.zip). A simple example of adding `[1, 2, 3, 4]` to `[2, 3, 4, 5]` is used and the code project directory structure is as follows: + +```text + +└─ascend310_single_op_sample + ├── CMakeLists.txt // Build script + ├── README.md // Usage description + ├── main.cc // Main function + └── tensor_add.mindir // MindIR model file +``` + +Go to the directory of the sample project and change the path based on the actual requirements. + +```bash +cd /home/HwHiAiUser/Ascend/ascend-toolkit/20.0.RC1/acllib_linux.arm64/sample/acl_execute_model/ascend310_single_op_sample +``` + +Build a project by referring to `README.md`. + +```bash +cmake . -DMINDSPORE_PATH=`pip3 show mindspore-ascend | grep Location | awk '{print $2"/mindspore"}' | xargs realpath` +make +``` + +After the build is successful, execute the case. + +```bash +./tensor_add_sample +``` + +The following information is displayed: + +```text +3 +5 +7 +9 +``` + +The preceding information indicates that MindSpore is successfully installed. + +## Installing MindSpore Serving + +If you want to quickly experience the MindSpore online inference service, you can install MindSpore Serving. + +For details, see [MindSpore Serving](https://gitee.com/mindspore/serving/blob/master/README.md). 
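Before running the `cmake` command in the verification step, it can help to confirm that the wheel you just built is the package that `pip3` actually resolves, because `MINDSPORE_PATH` is derived from it. A minimal sanity check, assuming the wheel was installed into the active Python 3.7 environment, might look like this:

```bash
# Show the mindspore-ascend package that pip3 sees (name, version, install location).
pip3 show mindspore-ascend

# Resolve the directory that the cmake command above passes as MINDSPORE_PATH;
# it should point to the mindspore folder inside the installed package.
pip3 show mindspore-ascend | grep Location | awk '{print $2"/mindspore"}' | xargs realpath
```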
diff --git a/install/mindspore_ascend_install_conda.md b/install/mindspore_ascend_install_conda.md index 9a51ce0891cb3ffc7395657bb12d96f819549107..6cce0a62dec163f98b87f6c108bf25fad077d4fc 100644 --- a/install/mindspore_ascend_install_conda.md +++ b/install/mindspore_ascend_install_conda.md @@ -71,7 +71,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 - `{system}`表示系统,例如使用的欧拉系统ARM架构,`{system}`应写为`euleros_aarch64`,目前可支持以下系统`euleros_aarch64`/`euleros_x86`/`centos_aarch64`/`centos_x86`/`ubuntu_aarch64`/`ubuntu_x86`。 diff --git a/install/mindspore_ascend_install_pip.md b/install/mindspore_ascend_install_pip.md index bab51e4049fcfcfd926b85970e3ed9e01d102752..61b77dc40244ebd2748285708cfe44d5b841edf0 100644 --- a/install/mindspore_ascend_install_pip.md +++ b/install/mindspore_ascend_install_pip.md @@ -45,7 +45,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 - `{system}`表示系统版本,例如使用的欧拉系统ARM架构,`{system}`应写为`euleros_aarch64`,目前Ascend版本可支持以下系统`euleros_aarch64`/`euleros_x86`/`centos_aarch64`/`centos_x86`/`ubuntu_aarch64`/`ubuntu_x86`。 diff --git a/install/mindspore_ascend_install_pip_en.md b/install/mindspore_ascend_install_pip_en.md index 60d48e276d24dab98436b75084a740bbdec1e900..65e78e316cabac39c2bcc5fffbec7316c6c4edfe 100644 --- a/install/mindspore_ascend_install_pip_en.md +++ b/install/mindspore_ascend_install_pip_en.md @@ -45,7 +45,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)). In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. - `{arch}` denotes the system architecture. For example, the Linux system you are using is x86 architecture 64-bit, `{arch}` should be `x86_64`. If the system is ARM architecture 64-bit, then it should be `aarch64`. - `{system}` denotes the system version. For example, if you are using EulerOS ARM architecture, `{system}` should be `euleros_aarch64`. Currently, the following systems are supported by Ascend: `euleros_aarch64`/`euleros_x86`/`centos_x86`/`ubuntu_aarch64`/`ubuntu_x86`. 
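If you are unsure which `{arch}` and `{system}` values apply to your machine, the standard Linux commands below may help; this is only a quick sketch, and you still need to map the output to one of the supported `{system}` strings listed above by hand.

```bash
# Print the CPU architecture: "x86_64" corresponds to {arch}=x86_64, "aarch64" to {arch}=aarch64.
uname -m

# Print the distribution name and version, which indicates whether {system}
# should start with euleros_, centos_, or ubuntu_.
cat /etc/os-release
```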
diff --git a/install/mindspore_ascend_install_source.md b/install/mindspore_ascend_install_source.md index 721e2ea9e71a91ecda3d962a6c7447b9490cede0..2002c68cdf95ef109983d857f1fd39b186232372 100644 --- a/install/mindspore_ascend_install_source.md +++ b/install/mindspore_ascend_install_source.md @@ -98,7 +98,7 @@ pip install build/package/mindspore_ascend-{version}-cp37-cp37m-linux_{arch}.whl 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 ## 配置环境变量 diff --git a/install/mindspore_ascend_install_source_en.md b/install/mindspore_ascend_install_source_en.md index 7135e8f030b7cb562c0c42be061a1746cef504e8..2505df2eec37373c8e23c4296b51906beecaf22b 100644 --- a/install/mindspore_ascend_install_source_en.md +++ b/install/mindspore_ascend_install_source_en.md @@ -100,7 +100,7 @@ pip install build/package/mindspore_ascend-{version}-cp37-cp37m-linux_{arch}.whl Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)). In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. - `{arch}` denotes the system architecture. For example, the Linux system you are using is x86 architecture 64-bit, `{arch}` should be `x86_64`. If the system is ARM architecture 64-bit, then it should be `aarch64`. 
## Configuring Environment Variables diff --git a/install/mindspore_cpu_install_conda.md b/install/mindspore_cpu_install_conda.md index 27d191004f359d55ac46008f27ba5c2d07ca7e9f..3abc0b48593f21065734786290aff139d54d3523 100644 --- a/install/mindspore_cpu_install_conda.md +++ b/install/mindspore_cpu_install_conda.md @@ -58,7 +58,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 - `{system}`表示系统,例如使用的Ubuntu系统X86架构,`{system}`应写为`ubuntu_x86`,目前CPU版本可支持以下系统`ubuntu_aarch64`/`ubuntu_x86`。 diff --git a/install/mindspore_cpu_install_pip.md b/install/mindspore_cpu_install_pip.md index 80d2d21c99702197a87a86c105350639924786e2..a7abd6c15f149bd5c0b22a05376901a0390b9f7a 100644 --- a/install/mindspore_cpu_install_pip.md +++ b/install/mindspore_cpu_install_pip.md @@ -32,7 +32,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 - `{system}`表示系统,例如使用的Ubuntu系统X86架构,`{system}`应写为`ubuntu_x86`,目前CPU版本可支持以下系统`ubuntu_aarch64`/`ubuntu_x86`。 diff --git a/install/mindspore_cpu_install_pip_en.md b/install/mindspore_cpu_install_pip_en.md index 1174728459384c640ee89d22b9577790d320136e..7aa8c70d5c06c947c90b4e3bd38b3eefb31d8a11 100644 --- a/install/mindspore_cpu_install_pip_en.md +++ b/install/mindspore_cpu_install_pip_en.md @@ -32,7 +32,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)). In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. - `{arch}` denotes the system architecture. For example, the Linux system you are using is x86 architecture 64-bit, `{arch}` should be `x86_64`. If the system is ARM architecture 64-bit, then it should be `aarch64`. - `{system}` denotes the system version. For example, if you are using Ubuntu x86 architecture, `{system}` should be `ubuntu_x86`. Currently, the following systems are supported by CPU: `ubuntu_aarch64`/`ubuntu_x86`. 
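Once the CPU package is installed with the command above, a one-line import check is a quick (though not exhaustive) way to confirm that the wheel matches your Python 3.7 environment; it only verifies that the package loads and reports the expected version.

```bash
# Import MindSpore and print the installed version; it should match the {version} you installed.
python -c "import mindspore; print(mindspore.__version__)"
```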
diff --git a/install/mindspore_cpu_install_source.md b/install/mindspore_cpu_install_source.md index 2e26d4cc416bd423bf36e7f367895816a2043062..5133d9abde16ef728033b4bc0628fc5d8efe0e29 100644 --- a/install/mindspore_cpu_install_source.md +++ b/install/mindspore_cpu_install_source.md @@ -71,7 +71,7 @@ pip install build/package/mindspore-{version}-cp37-cp37m-linux_{arch}.whl -i htt 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARMv8架构64位,则写为`aarch64`。 ## 验证安装是否成功 diff --git a/install/mindspore_cpu_install_source_en.md b/install/mindspore_cpu_install_source_en.md index 80957491277cf9a1d2ae5c217121e139c61d9d78..94280b766b71d9da2882ae15f9a7a8444e90fd2b 100644 --- a/install/mindspore_cpu_install_source_en.md +++ b/install/mindspore_cpu_install_source_en.md @@ -72,7 +72,7 @@ pip install build/package/mindspore-{version}-cp37-cp37m-linux_{arch}.whl -i htt Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. - `{arch}` denotes the system architecture. For example, the Linux system you are using is x86 architecture 64-bit, `{arch}` should be `x86_64`. If the system is ARM architecture 64-bit, then it should be `aarch64`. 
## Installation Verification diff --git a/install/mindspore_cpu_macos_install_conda.md b/install/mindspore_cpu_macos_install_conda.md index 1cee10f453ccb46d4545b61faeb7f160d3701175..6969711dbccf5b4456a4b38bc5f43ddd9afc3f44 100644 --- a/install/mindspore_cpu_macos_install_conda.md +++ b/install/mindspore_cpu_macos_install_conda.md @@ -53,7 +53,7 @@ conda activate mindspore 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 ## 验证是否安装成功 diff --git a/install/mindspore_cpu_macos_install_pip.md b/install/mindspore_cpu_macos_install_pip.md index 7292ebd303976d06a6d077df3df50bef43290aaf..32dd85ba12538f04934c127871694fd78ee97cc9 100644 --- a/install/mindspore_cpu_macos_install_pip.md +++ b/install/mindspore_cpu_macos_install_pip.md @@ -29,7 +29,7 @@ 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 ## 验证是否安装成功 diff --git a/install/mindspore_cpu_macos_install_pip_en.md b/install/mindspore_cpu_macos_install_pip_en.md index a611e95c980ce0ce2e4655a47ff0a8a8519f4320..2889f0d231045dbc6f59c19fdc37638b0c443be9 100644 --- a/install/mindspore_cpu_macos_install_pip_en.md +++ b/install/mindspore_cpu_macos_install_pip_en.md @@ -28,7 +28,7 @@ This document describes how to quickly install MindSpore by pip in a macOS syste Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)). In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. ## Installation Verification diff --git a/install/mindspore_cpu_macos_install_source.md b/install/mindspore_cpu_macos_install_source.md index ecd185cd7947054444803728dbe10fe30659e315..b9a4f0e5c1b3b37105d986c8bf66e5f65f7e2635 100644 --- a/install/mindspore_cpu_macos_install_source.md +++ b/install/mindspore_cpu_macos_install_source.md @@ -57,7 +57,7 @@ pip install build/package/mindspore-{version}-py37-none-any.whl -i https://pypi. 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 ## 验证是否安装成功 diff --git a/install/mindspore_cpu_macos_install_source_en.md b/install/mindspore_cpu_macos_install_source_en.md index e680ba9cf9b97c3f85f56cd84baca505012ab19e..e0d82d5a4e3aa272d3708685e0427b66a85b3f24 100644 --- a/install/mindspore_cpu_macos_install_source_en.md +++ b/install/mindspore_cpu_macos_install_source_en.md @@ -57,7 +57,7 @@ pip install build/package/mindspore-{version}-py37-none-any.whl -i https://pypi. 
Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)). In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. ## Installation Verification diff --git a/install/mindspore_cpu_win_install_conda.md b/install/mindspore_cpu_win_install_conda.md index 3d913dee797aa56817f567b0aa56c9f1b4113dde..2e26bab630af50f82451826fe35f417b256efba2 100644 --- a/install/mindspore_cpu_win_install_conda.md +++ b/install/mindspore_cpu_win_install_conda.md @@ -59,7 +59,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 ## 验证是否安装成功 diff --git a/install/mindspore_cpu_win_install_pip.md b/install/mindspore_cpu_win_install_pip.md index eeadbd9231adaf3d1ad771eb7a2509f8a73142c9..ae5795f907f24c33955849d7de03424959f697b7 100644 --- a/install/mindspore_cpu_win_install_pip.md +++ b/install/mindspore_cpu_win_install_pip.md @@ -32,7 +32,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 ## 验证是否安装成功 diff --git a/install/mindspore_cpu_win_install_pip_en.md b/install/mindspore_cpu_win_install_pip_en.md index 24a0d86bb38546a8261b8cae6dc7c41226437f0a..8a2b1c48b64005391a0912245f0a4d0000d23831 100644 --- a/install/mindspore_cpu_win_install_pip_en.md +++ b/install/mindspore_cpu_win_install_pip_en.md @@ -32,7 +32,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)). In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. 
## Installation Verification diff --git a/install/mindspore_cpu_win_install_source.md b/install/mindspore_cpu_win_install_source.md index c6bfbfd9ec5799a8599961199f7db5d5d753ae9e..600eac130fb46c1d9c557237208563a27919e64c 100644 --- a/install/mindspore_cpu_win_install_source.md +++ b/install/mindspore_cpu_win_install_source.md @@ -21,7 +21,7 @@ - 确认安装Windows 10是x86架构64位操作系统。 - 确认安装[Visual C++ Redistributable for Visual Studio 2015](https://www.microsoft.com/zh-CN/download/details.aspx?id=48145)。 - 确认安装了[git](https://github.com/git-for-windows/git/releases/download/v2.29.2.windows.2/Git-2.29.2.2-64-bit.exe)工具。 - - 如果git没有安装在`ProgramFiles`,在执行上述命令前,需设置环境变量指定`patch.exe`的位置,例如git安装在`D:\git`时,需设置`set MS_PATCH_PATH=D:\git\usr\bin`。 + - 如果git没有安装在`ProgramFiles`,需设置环境变量指定`patch.exe`的位置,例如git安装在`D:\git`时,需设置`set MS_PATCH_PATH=D:\git\usr\bin`。 - 确认安装[MinGW-W64 GCC-7.3.0](https://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Win64/Personal%20Builds/mingw-builds/7.3.0/threads-posix/seh/x86_64-7.3.0-release-posix-seh-rt_v5-rev0.7z)。 - 安装路径中不能出现中文和日文,安装完成后将安装路径下的`MinGW\bin`添加到系统环境变量。例如安装在`D:\gcc`,则需要将`D:\gcc\MinGW\bin`添加到系统环境变量Path中。 - 确认安装[CMake 3.18.3版本](https://github.com/Kitware/Cmake/releases/tag/v3.18.3)。 @@ -54,7 +54,7 @@ pip install build/package/mindspore-{version}-cp37-cp37m-win_amd64.whl -i https: 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 ## 验证是否安装成功 diff --git a/install/mindspore_cpu_win_install_source_en.md b/install/mindspore_cpu_win_install_source_en.md index 5bfdcbb9f2394f8b0b5cd6a9fe1427fb6f36da01..33a60fe6bb131eb1e8369d3e1ecd99a4dd83fe0c 100644 --- a/install/mindspore_cpu_win_install_source_en.md +++ b/install/mindspore_cpu_win_install_source_en.md @@ -53,7 +53,7 @@ pip install build/package/mindspore-{version}-cp37-cp37m-win_amd64.whl -i https: Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)). In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. 
## Installation Verification diff --git a/install/mindspore_gpu_install_conda.md b/install/mindspore_gpu_install_conda.md index 0e0965dd76409b488bd0009d1dd71cd57f9f65e9..1c7b1d1bb137e2e61fa2fbb8c22ab108c97a633c 100644 --- a/install/mindspore_gpu_install_conda.md +++ b/install/mindspore_gpu_install_conda.md @@ -64,7 +64,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 ## 验证是否成功安装 diff --git a/install/mindspore_gpu_install_pip.md b/install/mindspore_gpu_install_pip.md index d9316e4879e5a739fc22bb82b50726b7d530c718..4577c9d9683f63a09cc21a02724f6a4d406c2013 100644 --- a/install/mindspore_gpu_install_pip.md +++ b/install/mindspore_gpu_install_pip.md @@ -39,7 +39,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 ## 验证是否成功安装 diff --git a/install/mindspore_gpu_install_pip_en.md b/install/mindspore_gpu_install_pip_en.md index 687428753629f087e2a9ed1eedec366b3d6466ca..4c6ea5341c10e907b784eea73312ec9958238d6a 100644 --- a/install/mindspore_gpu_install_pip_en.md +++ b/install/mindspore_gpu_install_pip_en.md @@ -39,7 +39,7 @@ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/{version}/MindSp Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)). In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. - `{arch}` denotes the system architecture. For example, the Linux system you are using is x86 architecture 64-bit, `{arch}` should be `x86_64`. If the system is ARM architecture 64-bit, then it should be `aarch64`. 
## Installation Verification diff --git a/install/mindspore_gpu_install_source.md b/install/mindspore_gpu_install_source.md index 1a8ac68847785e17e9c1c64fcc57072274e89e0a..a33187da6ac2912bd130852ec24ef0967b2db01e 100644 --- a/install/mindspore_gpu_install_source.md +++ b/install/mindspore_gpu_install_source.md @@ -83,7 +83,7 @@ pip install build/package/mindspore_gpu-{version}-cp37-cp37m-linux_{arch}.whl -i 其中: - 在联网状态下,安装whl包时会自动下载MindSpore安装包的依赖项(依赖项详情参见[requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)),其余情况需自行安装。 -- `{version}`表示MindSpore版本号,例如下载1.0.1版本MindSpore时,`{version}`应写为1.0.1。 +- `{version}`表示MindSpore版本号,例如安装1.1.0版本MindSpore时,`{version}`应写为1.1.0。 - `{arch}`表示系统架构,例如使用的Linux系统是x86架构64位时,`{arch}`应写为`x86_64`。如果系统是ARM架构64位,则写为`aarch64`。 ## 验证是否成功安装 diff --git a/install/mindspore_gpu_install_source_en.md b/install/mindspore_gpu_install_source_en.md index 35f41e5e3b7ef8619cd6bd1623abcfb716fa0e7a..496b79cc5a7bede856e8c2c9f6ead210d8aacf8b 100644 --- a/install/mindspore_gpu_install_source_en.md +++ b/install/mindspore_gpu_install_source_en.md @@ -82,7 +82,7 @@ pip install build/package/mindspore_gpu-{version}-cp37-cp37m-linux_{arch}.whl -i Of which, - When the network is connected, dependency items are automatically downloaded during .whl package installation. (For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt)). In other cases, you need to manually install dependency items. -- `{version}` denotes the version of MindSpore. For example, when you are downloading MindSpore 1.0.1, `{version}` should be 1.0.1. +- `{version}` denotes the version of MindSpore. For example, when you are installing MindSpore 1.1.0, `{version}` should be 1.1.0. - `{arch}` denotes the system architecture. For example, the Linux system you are using is x86 architecture 64-bit, `{arch}` should be `x86_64`. If the system is ARM architecture 64-bit, then it should be `aarch64`. ## Installation Verification diff --git a/tutorials/inference/source_en/index.rst b/tutorials/inference/source_en/index.rst index 77e779bf19d5c2c7227a3267093afa288ff086fd..352af34cc80aac119b0da4e3afba8ae9bfd25269 100644 --- a/tutorials/inference/source_en/index.rst +++ b/tutorials/inference/source_en/index.rst @@ -24,3 +24,6 @@ Inference Using MindSpore :caption: Inference Service serving_example + serving_grpc + serving_restful + serving_model diff --git a/tutorials/inference/source_en/multi_platform_inference_ascend_310.rst b/tutorials/inference/source_en/multi_platform_inference_ascend_310.rst index d16b94a6134bb498484d11cc4a9535cfddc6f39a..1544dd6a232ca90820288d832336763cff2b3774 100644 --- a/tutorials/inference/source_en/multi_platform_inference_ascend_310.rst +++ b/tutorials/inference/source_en/multi_platform_inference_ascend_310.rst @@ -5,3 +5,4 @@ Inference on Ascend 310 :maxdepth: 1 multi_platform_inference_ascend_310_air + multi_platform_inference_ascend_310_mindir \ No newline at end of file diff --git a/tutorials/inference/source_en/multi_platform_inference_ascend_310_mindir.md b/tutorials/inference/source_en/multi_platform_inference_ascend_310_mindir.md new file mode 100644 index 0000000000000000000000000000000000000000..1a98b0300b6cb18b86e9573f7b7aa719e275dc6e --- /dev/null +++ b/tutorials/inference/source_en/multi_platform_inference_ascend_310_mindir.md @@ -0,0 +1,5 @@ +# Inference on the Ascend 310 AI Processor Using MindIR Model + +No English version is available right now. Contributions are welcome. 
+ + diff --git a/tutorials/inference/source_en/serving_grpc.md b/tutorials/inference/source_en/serving_grpc.md new file mode 100644 index 0000000000000000000000000000000000000000..596ae4d3e37be28fa677746bb360f1b79b54c9bf --- /dev/null +++ b/tutorials/inference/source_en/serving_grpc.md @@ -0,0 +1,5 @@ +# Accessing the MindSpore Serving Service Based on the gRPC Interface + +No English version is available right now. Contributions are welcome. + + diff --git a/tutorials/inference/source_en/serving_model.md b/tutorials/inference/source_en/serving_model.md new file mode 100644 index 0000000000000000000000000000000000000000..260cbf46319ea10e51b54c5ad328da000e108606 --- /dev/null +++ b/tutorials/inference/source_en/serving_model.md @@ -0,0 +1,5 @@ +# Providing a Servable Through Model Configuration + +No English version is available right now. Contributions are welcome. + + diff --git a/tutorials/inference/source_en/serving_restful.md b/tutorials/inference/source_en/serving_restful.md new file mode 100644 index 0000000000000000000000000000000000000000..1f2f22a30fa75ed2750a357a4d0dcf5a59bcdd20 --- /dev/null +++ b/tutorials/inference/source_en/serving_restful.md @@ -0,0 +1,5 @@ +# Accessing the MindSpore Serving Service Based on the RESTful Interface + +No English version is available right now. Contributions are welcome. + + diff --git a/tutorials/lite/source_en/images/side_train_sequence.png b/tutorials/lite/source_en/images/side_train_sequence.png index 16e4af67a46370813760c09a15da756ad87fa643..058f03d3973beab9c8a245d6aa898f938d486315 100644 Binary files a/tutorials/lite/source_en/images/side_train_sequence.png and b/tutorials/lite/source_en/images/side_train_sequence.png differ diff --git a/tutorials/lite/source_en/index.rst b/tutorials/lite/source_en/index.rst index fcddb633dca596106919215c815e3ac44b6e86e5..12b05c6bf3f9bda98aff73da24bcaeeac649759e 100644 --- a/tutorials/lite/source_en/index.rst +++ b/tutorials/lite/source_en/index.rst @@ -274,7 +274,7 @@ Using MindSpore on Mobile and IoT