From 9e0b6b10b0d1ed49ee9ea9212f5ab6b1e282147c Mon Sep 17 00:00:00 2001
From: wangshuide2020 <7511764+wangshuide2020@user.noreply.gitee.com>
Date: Wed, 22 Jul 2020 17:22:01 +0800
Subject: [PATCH] add TensorSummary example

---
 tutorials/source_en/advanced_use/dashboard_and_lineage.md | 8 +++++---
 .../source_zh_cn/advanced_use/dashboard_and_lineage.md    | 8 +++++---
 2 files changed, 10 insertions(+), 6 deletions(-)

diff --git a/tutorials/source_en/advanced_use/dashboard_and_lineage.md b/tutorials/source_en/advanced_use/dashboard_and_lineage.md
index 39a373a9eb..078a616ecb 100644
--- a/tutorials/source_en/advanced_use/dashboard_and_lineage.md
+++ b/tutorials/source_en/advanced_use/dashboard_and_lineage.md
@@ -203,10 +203,14 @@ class Net(nn.Cell):
         # Init ImageSummary
         self.sm_image = P.ImageSummary()
+        # Init TensorSummary
+        self.sm_tensor = P.TensorSummary()
 
     def construct(self, data):
         # Record image by Summary operator
         self.sm_image("image", data)
+        # Record tensor by Summary operator
+        self.sm_tensor("tensor", data)
         ......
         return out
 ```
 
@@ -327,9 +331,7 @@ In the saved files, `ms_output_after_hwopt.pb` is the computational graph after
 
 Remarks: The method of estimating the space usage of `TensorSummary` is as follows:
 
-    The size of a `TensorSummary` data = the number of values in the tensor * 4 bytes. Assuming that the size of the tensor recorded by `TensorSummary` is 32*1*256*256, then a `TensorSummary` data needs about 32*1*256*256*4 bytes = 8,388,608 bytes = 8MiB.
-    Also suppose that the collect_freq of `SummaryCollector` is set to 1, and 50 iterations are trained. Then the required space when recording these 50 sets of data is about 50*8 MiB = 400MiB.
-    It should be noted that due to the overhead of data structure and other factors, the actual storage space used will be slightly larger than 400MiB.
+    The size of a `TensorSummary` data = the number of values in the tensor * 4 bytes. Assuming that the size of the tensor recorded by `TensorSummary` is 32 * 1 * 256 * 256, then a `TensorSummary` data needs about 32 * 1 * 256 * 256 * 4 bytes = 8,388,608 bytes = 8 MiB. Also suppose that the `collect_freq` of `SummaryCollector` is set to 1, and 50 iterations are trained. Then the required space when recording these 50 sets of data is about 50 * 8 MiB = 400 MiB. It should be noted that, due to the overhead of data structures and other factors, the actual storage space used will be slightly larger than 400 MiB.
 
 ## Visualization Components
 
diff --git a/tutorials/source_zh_cn/advanced_use/dashboard_and_lineage.md b/tutorials/source_zh_cn/advanced_use/dashboard_and_lineage.md
index 44cd1a7ebf..b4e772a406 100644
--- a/tutorials/source_zh_cn/advanced_use/dashboard_and_lineage.md
+++ b/tutorials/source_zh_cn/advanced_use/dashboard_and_lineage.md
@@ -204,10 +204,14 @@ class Net(nn.Cell):
         # Init ImageSummary
         self.sm_image = P.ImageSummary()
+        # Init TensorSummary
+        self.sm_tensor = P.TensorSummary()
 
     def construct(self, data):
         # Record image by Summary operator
         self.sm_image("image", data)
+        # Record tensor by Summary operator
+        self.sm_tensor("tensor", data)
         ......
         return out
 ```
 
@@ -329,9 +333,7 @@ model.train(cnn_network, callbacks=[confusion_martrix])
 
 Remarks: The method of estimating the space usage of `TensorSummary` is as follows:
 
-    The size of a `TensorSummary` data = the number of values in the tensor * 4 bytes. Assuming that the size of the tensor recorded by `TensorSummary` is 32*1*256*256, then a `TensorSummary`
-    data needs about 32*1*256*256*4 bytes = 8,388,608 bytes = 8MiB. Also suppose that the collect_freq of `SummaryCollector` is set to 1, and 50 iterations are trained. Then the space required to record these 50 sets of data
-    is about 50*8 MiB = 400MiB. It should be noted that, due to the overhead of data structures and other factors, the actual storage space used will be slightly larger than 400MiB.
+    The size of a `TensorSummary` data = the number of values in the tensor * 4 bytes. Assuming that the size of the tensor recorded by `TensorSummary` is 32 * 1 * 256 * 256, then a `TensorSummary` data needs about 32 * 1 * 256 * 256 * 4 bytes = 8,388,608 bytes = 8 MiB. Also suppose that the `collect_freq` of `SummaryCollector` is set to 1, and 50 iterations are trained. Then the space required to record these 50 sets of data is about 50 * 8 MiB = 400 MiB. It should be noted that, due to the overhead of data structures and other factors, the actual storage space used will be slightly larger than 400 MiB.
 
 ## Visualization Components
-- 
Gitee
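The storage estimate that this patch rewrites can be sanity-checked with a quick calculation. The sketch below is standalone Python (it does not require MindSpore); the helper name `tensor_summary_bytes` is illustrative, not part of the MindSpore API, and it assumes float32 tensors (4 bytes per value) as the docs do.

```python
# Rough estimate of TensorSummary disk usage, following the docs' formula:
# size of one record = number of values in the tensor * 4 bytes (float32).

def tensor_summary_bytes(shape, bytes_per_value=4):
    """Approximate size in bytes of one TensorSummary record for a tensor of `shape`."""
    count = 1
    for dim in shape:
        count *= dim
    return count * bytes_per_value

one_record = tensor_summary_bytes((32, 1, 256, 256))
print(one_record)                    # 8388608 bytes, i.e. 8 MiB
# With collect_freq=1 over 50 iterations, 50 records are written:
print(50 * one_record / 2**20)       # 400.0 (MiB); actual usage is slightly larger
```

This reproduces the numbers in the rewritten paragraph: 32 * 1 * 256 * 256 * 4 bytes = 8,388,608 bytes = 8 MiB per record, and about 400 MiB for 50 records, before file-format overhead.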