From 0ce337a4b834ac00ea08b15bab9cba8a0d61245c Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Sun, 26 Apr 2020 20:01:10 +0800
Subject: [PATCH 01/14] update tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md.
---
 .../source_zh_cn/use/saving_and_loading_model_parameters.md | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index 1bd0c24b7c..3f6155e9ee 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -9,6 +9,9 @@
 - [模型参数加载](#模型参数加载)
 - [用于推理验证](#用于推理验证)
 - [用于再训练场景](#用于再训练场景)
+ - [导出GEIR模型和ONNX模型](#导出GEIR模型和ONNX模型)
+ - [导出GEIR模型](#导出GEIR模型)
+ - [导出ONNX模型](#导出ONNX模型)

--
Gitee

From 3f38dd48ae74651f287d4b2389e7ddb397daebd6 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Sun, 26 Apr 2020 21:13:33 +0800
Subject: [PATCH 02/14] update tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md.
---
 .../saving_and_loading_model_parameters.md | 29 +++++++++++++++++--
 1 file changed, 27 insertions(+), 2 deletions(-)

diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index 3f6155e9ee..3fdfaa1668 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -10,8 +10,8 @@
 - [用于推理验证](#用于推理验证)
 - [用于再训练场景](#用于再训练场景)
 - [导出GEIR模型和ONNX模型](#导出GEIR模型和ONNX模型)
+ - [exportexport接口说明](#exportexport接口说明)
+ - [导出GEIR模型和ONNX模型](#导出GEIR模型和ONNX模型)
@@ -138,3 +138,28 @@ model.train(epoch, dataset)
 ```
 `load_checkpoint`方法会返回一个参数字典,`load_param_into_net`会把参数字典中相应的参数加载到网络或优化器中。
+
+###export接口说明
+
+export接口需要导入mindspore.train.serialization,函数的声明如下:
+def export(net, *inputs, file_name, file_format='GEIR')
+参数说明:
+net:导出模型用到的网络
+inputs:指定模型的输入shape和数据类型
+file_name:导出模型的文件名
+file_format:导出的模型格式,可取值:GEIR、ONNX、LITE。
+GEIR:针对昇腾芯片的模型格式
+ONNX:通用的模型格式
+LITE:针对移动端的模型格式
+
+###导出GEIR模型和ONNX模型
+在调用export接口时通过指定file_format导出GEIR和ONNX模型。使用方法如下:
+'''python
+from mindspore.train.serialization import export
+import numpy as np
+net = Resnet50() #创建网络
+param_dict = load_checkpoint("resnet50-2_32.ckpt", net=resnet) #读取checkpoint文件
+load_param_into_net(net) #将参数加载到网络中
+input = np.random.uniform(0.0, 1.0, size = [32, 3, 224, 224]).astype(np.float32)
+export(net, input, file_name = 'resnet50-2_32.pb', file_format = 'GEIR') #调用export接口导出模型,如果导出ONNX模型则指定file_format = 'ONNX'
+'''
\ No newline at end of file

--
Gitee

From 37d47ca8f5392f1fbd253ad9bb793d3ed27002c2 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Sun, 26 Apr 2020 21:17:28 +0800
Subject: [PATCH 03/14] =?UTF-8?q?=E5=AF=BC=E5=87=BAGEIR=E5=92=8CONNX?=
 =?UTF-8?q?=E6=A8=A1=E5=9E=8B=E8=AF=B4=E6=98=8E?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
---
 .../use/saving_and_loading_model_parameters.md | 12 ++++++++----
 1 file changed, 8 insertions(+), 4 deletions(-)

diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index 3fdfaa1668..be9881ba45 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -10,8 +10,8 @@
 - [用于推理验证](#用于推理验证)
 - [用于再训练场景](#用于再训练场景)
 - [导出GEIR模型和ONNX模型](#导出GEIR模型和ONNX模型)
- - [exportexport接口说明](#exportexport接口说明)
- - [导出GEIR模型和ONNX模型](#导出GEIR模型和ONNX模型)
+ - [export接口说明](#export接口说明)
+ - [导出模型](#导出模型)
@@ -139,6 +139,10 @@ model.train(epoch, dataset)
 `load_checkpoint`方法会返回一个参数字典,`load_param_into_net`会把参数字典中相应的参数加载到网络或优化器中。
+##导出GEIR模型和ONNX模型
+
+有了保存的checkpoint文件后就可以导出GEIR模型和ONNX模型。两种模型的导出都是通过调用export接口来实现,根据不同的参数导出不同类型的模型。
+
 ###export接口说明
 export接口需要导入mindspore.train.serialization,函数的声明如下:
@@ -152,7 +156,7 @@ GEIR:针对昇腾芯片的模型格式
 ONNX:通用的模型格式
 LITE:针对移动端的模型格式
-###导出GEIR模型和ONNX模型
+###导出模型
 在调用export接口时通过指定file_format导出GEIR和ONNX模型。使用方法如下:
 '''python
 from mindspore.train.serialization import export
@@ -162,4 +166,4 @@ param_dict = load_checkpoint("resnet50-2_32.ckpt", net=resnet) #读取checkpoin
 load_param_into_net(net) #将参数加载到网络中
 input = np.random.uniform(0.0, 1.0, size = [32, 3, 224, 224]).astype(np.float32)
 export(net, input, file_name = 'resnet50-2_32.pb', file_format = 'GEIR') #调用export接口导出模型,如果导出ONNX模型则指定file_format = 'ONNX'
-'''
\ No newline at end of file
+'''

--
Gitee

From 4e8887fc043c5a67bf960731860eb64d1c30a355 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Mon, 27 Apr 2020 09:39:50 +0800
Subject: [PATCH 04/14] update tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md.
---
 .../source_zh_cn/use/saving_and_loading_model_parameters.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index be9881ba45..ea02bdf476 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -139,11 +139,11 @@ model.train(epoch, dataset)
 `load_checkpoint`方法会返回一个参数字典,`load_param_into_net`会把参数字典中相应的参数加载到网络或优化器中。
-##导出GEIR模型和ONNX模型
+## 导出GEIR模型和ONNX模型
 有了保存的checkpoint文件后就可以导出GEIR模型和ONNX模型。两种模型的导出都是通过调用export接口来实现,根据不同的参数导出不同类型的模型。
-###export接口说明
+### export接口说明
 export接口需要导入mindspore.train.serialization,函数的声明如下:
 def export(net, *inputs, file_name, file_format='GEIR')
 参数说明:
 net:导出模型用到的网络
@@ -156,7 +156,7 @@ GEIR:针对昇腾芯片的模型格式
 ONNX:通用的模型格式
 LITE:针对移动端的模型格式
-###导出模型
+### 导出模型
 在调用export接口时通过指定file_format导出GEIR和ONNX模型。使用方法如下:
 '''python
 from mindspore.train.serialization import export

--
Gitee

From f0e4de6cd4e17e38b392649f82ef8e5e4d0013f3 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Mon, 27 Apr 2020 10:06:27 +0800
Subject: [PATCH 05/14] update tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md.
---
 .../use/saving_and_loading_model_parameters.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index ea02bdf476..d48f32a29f 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -158,12 +158,12 @@ LITE:针对移动端的模型格式
 ### 导出模型
 在调用export接口时通过指定file_format导出GEIR和ONNX模型。使用方法如下:
-'''python
+```python
 from mindspore.train.serialization import export
 import numpy as np
-net = Resnet50() #创建网络
-param_dict = load_checkpoint("resnet50-2_32.ckpt", net=resnet) #读取checkpoint文件
+net = ResNet50() #创建网络
+param_dict = load_checkpoint("resnet50-2_32.ckpt", net=resnet) #读取CheckPoint文件
 load_param_into_net(net) #将参数加载到网络中
 input = np.random.uniform(0.0, 1.0, size = [32, 3, 224, 224]).astype(np.float32)
 export(net, input, file_name = 'resnet50-2_32.pb', file_format = 'GEIR') #调用export接口导出模型,如果导出ONNX模型则指定file_format = 'ONNX'
-'''
+```

--
Gitee

From 44879385567bb19dfbc921daa14a2440a2e2d55d Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Mon, 27 Apr 2020 10:57:15 +0800
Subject: [PATCH 06/14] export GEIR and ONNX model
---
 .../saving_and_loading_model_parameters.md | 32 ++++++-------------
 1 file changed, 10 insertions(+), 22 deletions(-)

diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index d48f32a29f..831504a2f5 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -10,8 +10,6 @@
 - [用于推理验证](#用于推理验证)
 - [用于再训练场景](#用于再训练场景)
 - [导出GEIR模型和ONNX模型](#导出GEIR模型和ONNX模型)
- - [export接口说明](#export接口说明)
- - [导出模型](#导出模型)
@@ -141,29 +139,19 @@ model.train(epoch, dataset)
 ## 导出GEIR模型和ONNX模型
-有了保存的checkpoint文件后就可以导出GEIR模型和ONNX模型。两种模型的导出都是通过调用export接口来实现,根据不同的参数导出不同类型的模型。
-### export接口说明
-export接口需要导入mindspore.train.serialization,函数的声明如下:
-def export(net, *inputs, file_name, file_format='GEIR')
-参数说明:
-net:导出模型用到的网络
-inputs:指定模型的输入shape和数据类型
-file_name:导出模型的文件名
-file_format:导出的模型格式,可取值:GEIR、ONNX、LITE。
-GEIR:针对昇腾芯片的模型格式
-ONNX:通用的模型格式
-LITE:针对移动端的模型格式
-### 导出模型
 在调用export接口时通过指定file_format导出GEIR和ONNX模型。使用方法如下:
+当有了一个CheckPoint文件后如果我们想做推理就需要将网络和CheckPoint导出模型,当前我们支持基于昇腾芯片的GEIR模型导出和基于GPU的通用ONNX模型的导出。
+下面以GEIR为例说明模型导出的实现,代码如下:
 ```python
 from mindspore.train.serialization import export
 import numpy as np
-net = ResNet50() #创建网络
-param_dict = load_checkpoint("resnet50-2_32.ckpt", net=resnet) #读取CheckPoint文件
-load_param_into_net(net) #将参数加载到网络中
+net = ResNet50()
+# return a parameter dict for model
+param_dict = load_checkpoint("resnet50-2_32.ckpt", net=resnet)
+# load the parameter into net
+load_param_into_net(net)
 input = np.random.uniform(0.0, 1.0, size = [32, 3, 224, 224]).astype(np.float32)
 export(net, input, file_name = 'resnet50-2_32.pb', file_format = 'GEIR') #调用export接口导出模型,如果导出ONNX模型则指定file_format = 'ONNX'
 ```
+使用export接口之前需要先导入mindspore.train.serialization。
+input用来指定导出模型的输入shape以及数据类型。
+如果要导出ONNX模型只需要将export接口中的file_format参数指定为OONNX即可:file_format = 'ONNX'

--
Gitee

From 84174f286368ec671fa8cdcb4d78af2f9a44ad0c Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Mon, 27 Apr 2020 10:59:22 +0800
Subject: [PATCH 07/14] export GEIR and ONNX model
---
 .../use/saving_and_loading_model_parameters.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index 831504a2f5..7e6c2970ab 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -150,8 +150,8 @@ param_dict = load_checkpoint("resnet50-2_32.ckpt", net=resnet)
 # load the parameter into net
 load_param_into_net(net)
 input = np.random.uniform(0.0, 1.0, size = [32, 3, 224, 224]).astype(np.float32)
-export(net, input, file_name = 'resnet50-2_32.pb', file_format = 'GEIR') #调用export接口导出模型,如果导出ONNX模型则指定file_format = 'ONNX'
+export(net, input, file_name = 'resnet50-2_32.pb', file_format = 'GEIR')
 ```
-使用export接口之前需要先导入mindspore.train.serialization。
-input用来指定导出模型的输入shape以及数据类型。
-如果要导出ONNX模型只需要将export接口中的file_format参数指定为OONNX即可:file_format = 'ONNX'
+使用`export`接口之前需要先导入`mindspore.train.serialization`。
+`input`用来指定导出模型的输入shape以及数据类型。
+如果要导出ONNX模型只需要将`export`接口中的`file_format`参数指定为ONNX即可:`file_format = 'ONNX'`。

--
Gitee

From 04bfafbc0fb5de73c11cd91f8a5ec77777f01483 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Mon, 27 Apr 2020 14:39:04 +0800
Subject: [PATCH 08/14] export GEIR and ONNX model
---
 .../use/saving_and_loading_model_parameters.md | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index 7e6c2970ab..4498ed169d 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -139,7 +139,7 @@ model.train(epoch, dataset)
 ## 导出GEIR模型和ONNX模型
-当有了一个CheckPoint文件后如果我们想做推理就需要将网络和CheckPoint导出模型,当前我们支持基于昇腾芯片的GEIR模型导出和基于GPU的通用ONNX模型的导出。
+当有了CheckPoint文件后,如果想继续做推理,就需要根据网络和CheckPoint生成对应的模型,当前我们支持基于昇腾芯片的GEIR模型导出和基于GPU的通用ONNX模型的导出。
 下面以GEIR为例说明模型导出的实现,代码如下:
 ```python
 from mindspore.train.serialization import export
@@ -152,6 +152,7 @@ load_param_into_net(net)
 input = np.random.uniform(0.0, 1.0, size = [32, 3, 224, 224]).astype(np.float32)
 export(net, input, file_name = 'resnet50-2_32.pb', file_format = 'GEIR')
 ```
-使用`export`接口之前需要先导入`mindspore.train.serialization`。
+使用`export`接口之前,需要先导入`mindspore.train.serialization`。
 `input`用来指定导出模型的输入shape以及数据类型。
-如果要导出ONNX模型只需要将`export`接口中的`file_format`参数指定为ONNX即可:`file_format = 'ONNX'`。
+如果要导出ONNX模型,只需要将`export`接口中的`file_format`参数指定为ONNX即可:`file_format = 'ONNX'`。
+

--
Gitee

From 1ce744ac9f9e2568a977929aa9e4d671e56a31b9 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Mon, 27 Apr 2020 14:57:17 +0800
Subject: [PATCH 09/14] export GEIR model and ONNX model
---
 .../saving_and_loading_model_parameters.md | 20 +++++++++++++++++++
 1 file changed, 20 insertions(+)

diff --git a/tutorials/source_en/use/saving_and_loading_model_parameters.md b/tutorials/source_en/use/saving_and_loading_model_parameters.md
index 7cad248fb8..a15745469b 100644
--- a/tutorials/source_en/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_en/use/saving_and_loading_model_parameters.md
@@ -9,6 +9,8 @@
 - [Loading Model Parameters](#loading-model-parameters)
 - [For Inference Validation](#for-inference-validation)
 - [For Retraining](#for-retraining)
+ - [Export GEIR Model and ONNX Model](#Export GEIR Model and ONNX Model)
+
@@ -135,3 +137,21 @@ model.train(epoch, dataset)
 ```
 The `load_checkpoint` method returns a parameter dictionary and then the `load_param_into_net` method loads parameters in the parameter dictionary to the network or optimizer.
+## Export GEIR Model and ONNX Model
+When you have a CheckPoint file, if you want to do inference, you need to generate corresponding models based on the network and CheckPoint.
+Currently we support the export of GEIR models based on Ascend chips and the export of ONNX models based on GPU. Taking the export of GEIR model as an example to illustrate the implementation of model export,
+the code is as follows:
+```python
+from mindspore.train.serialization import export
+import numpy as np
+net = ResNet50()
+# return a parameter dict for model
+param_dict = load_checkpoint("resnet50-2_32.ckpt", net=resnet)
+# load the parameter into net
+load_param_into_net(net)
+input = np.random.uniform(0.0, 1.0, size = [32, 3, 224, 224]).astype(np.float32)
+export(net, input, file_name = 'resnet50-2_32.pb', file_format = 'GEIR')
+```
+Before using the `export` interface, you need to import` mindspore.train.serialization`.
+The `input` parameter is used to specify the input shape and data type of the exported model.
+If you want to export the ONNX model, you only need to specify the `file_format` parameter in the` export` interface as ONNX: `file_format = 'ONNX'`.
\ No newline at end of file

--
Gitee

From c001779c8cae9ab49c33dd6475792a859a12c629 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Mon, 27 Apr 2020 15:09:34 +0800
Subject: [PATCH 10/14] export GEIR model and ONNX model
---
 tutorials/source_en/use/saving_and_loading_model_parameters.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tutorials/source_en/use/saving_and_loading_model_parameters.md b/tutorials/source_en/use/saving_and_loading_model_parameters.md
index a15745469b..d8ab957989 100644
--- a/tutorials/source_en/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_en/use/saving_and_loading_model_parameters.md
@@ -9,7 +9,7 @@
 - [Loading Model Parameters](#loading-model-parameters)
 - [For Inference Validation](#for-inference-validation)
 - [For Retraining](#for-retraining)
- - [Export GEIR Model and ONNX Model](#Export GEIR Model and ONNX Model)
+ - [Export GEIR Model and ONNX Model](#Export-GEIR-Model-and-ONNX-Model)

--
Gitee

From 316442b6b7f677538b3ef82b8f7c930d6dd4ba0a Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Mon, 27 Apr 2020 20:58:52 +0800
Subject: [PATCH 11/14] export GEIR and ONNX model
---
 .../source_zh_cn/use/saving_and_loading_model_parameters.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index 4498ed169d..de3396b96f 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -139,7 +139,7 @@ model.train(epoch, dataset)
 ## 导出GEIR模型和ONNX模型
-当有了CheckPoint文件后,如果想继续做推理,就需要根据网络和CheckPoint生成对应的模型,当前我们支持基于昇腾芯片的GEIR模型导出和基于GPU的通用ONNX模型的导出。
+当有了CheckPoint文件后,如果想继续做推理,就需要根据网络和CheckPoint生成对应的模型,当前我们支持基于昇腾AI处理器的GEIR模型导出和基于GPU的通用ONNX模型的导出。
 下面以GEIR为例说明模型导出的实现,代码如下:
 ```python
 from mindspore.train.serialization import export

--
Gitee

From c46bcba420c087e43946c13de999b08c5061211e Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Mon, 27 Apr 2020 21:00:05 +0800
Subject: [PATCH 12/14] export GEIR and ONNX model
---
 tutorials/source_en/use/saving_and_loading_model_parameters.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tutorials/source_en/use/saving_and_loading_model_parameters.md b/tutorials/source_en/use/saving_and_loading_model_parameters.md
index d8ab957989..760e060795 100644
--- a/tutorials/source_en/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_en/use/saving_and_loading_model_parameters.md
@@ -139,7 +139,7 @@ The `load_checkpoint` method returns a parameter dictionary and then the `load_p
 ## Export GEIR Model and ONNX Model
 When you have a CheckPoint file, if you want to do inference, you need to generate corresponding models based on the network and CheckPoint.
-Currently we support the export of GEIR models based on Ascend chips and the export of ONNX models based on GPU. Taking the export of GEIR model as an example to illustrate the implementation of model export,
+Currently we support the export of GEIR models based on Ascend AI processor and the export of ONNX models based on GPU. Taking the export of GEIR model as an example to illustrate the implementation of model export,
 the code is as follows:
 ```python
 from mindspore.train.serialization import export

--
Gitee

From 36ed7034aaf797da605969d358587b33bc7acd8f Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Tue, 28 Apr 2020 10:52:26 +0800
Subject: [PATCH 13/14] export GEIR and ONNX model
---
 tutorials/source_en/use/saving_and_loading_model_parameters.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tutorials/source_en/use/saving_and_loading_model_parameters.md b/tutorials/source_en/use/saving_and_loading_model_parameters.md
index 760e060795..d3bf83428c 100644
--- a/tutorials/source_en/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_en/use/saving_and_loading_model_parameters.md
@@ -9,7 +9,7 @@
 - [Loading Model Parameters](#loading-model-parameters)
 - [For Inference Validation](#for-inference-validation)
 - [For Retraining](#for-retraining)
- - [Export GEIR Model and ONNX Model](#Export-GEIR-Model-and-ONNX-Model)
+ - [Export GEIR Model and ONNX Model](#export-geir-model-and-onnx-model)

--
Gitee

From f344d00d8e7d865b65d2e3afff63a84d6df082b0 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E4=BA=8E=E6=8C=AF=E5=8D=8E?=
Date: Tue, 28 Apr 2020 10:53:40 +0800
Subject: [PATCH 14/14] export GEIR and ONNX model
---
 .../source_zh_cn/use/saving_and_loading_model_parameters.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
index de3396b96f..0fdc6d04f0 100644
--- a/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
+++ b/tutorials/source_zh_cn/use/saving_and_loading_model_parameters.md
@@ -9,7 +9,7 @@
 - [模型参数加载](#模型参数加载)
 - [用于推理验证](#用于推理验证)
 - [用于再训练场景](#用于再训练场景)
- - [导出GEIR模型和ONNX模型](#导出GEIR模型和ONNX模型)
+ - [导出GEIR模型和ONNX模型](#导出geir模型和onnx模型)

--
Gitee
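For reference, the export example committed in this series contains two apparent slips that no later patch corrects: `load_checkpoint` is called with `net=resnet` although only `net` is defined, and the returned `param_dict` is never passed to `load_param_into_net`. The sketch below is not part of the patches; it is a minimal, hedged illustration of the same export flow, assuming a MindSpore environment of that era, with a hypothetical `TinyNet` standing in for the tutorial's ResNet50, a hypothetical checkpoint path, and the dummy input wrapped in a `Tensor` as a precaution.

```python
import numpy as np
import mindspore.nn as nn
from mindspore import Tensor
from mindspore.train.serialization import export, load_checkpoint, load_param_into_net


class TinyNet(nn.Cell):
    """Stand-in network; any trained nn.Cell (e.g. a ResNet50 definition) is exported the same way."""
    def __init__(self):
        super(TinyNet, self).__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.flatten = nn.Flatten()
        self.fc = nn.Dense(8 * 224 * 224, 10)

    def construct(self, x):
        return self.fc(self.flatten(self.conv(x)))


net = TinyNet()

# load_checkpoint returns a parameter dict; load_param_into_net copies it into the network.
# "tinynet.ckpt" is a hypothetical path standing in for resnet50-2_32.ckpt.
param_dict = load_checkpoint("tinynet.ckpt", net=net)
load_param_into_net(net, param_dict)

# A dummy input fixes the exported graph's input shape and dtype.
dummy = np.random.uniform(0.0, 1.0, size=[32, 3, 224, 224]).astype(np.float32)

# file_format='GEIR' targets Ascend; pass file_format='ONNX' for the generic format.
export(net, Tensor(dummy), file_name='tinynet.pb', file_format='GEIR')
```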