From 0d87042f5bc98348805ca07b178271564d74391d Mon Sep 17 00:00:00 2001
From: lvmingfu <630944715@qq.com>
Date: Fri, 24 Jul 2020 19:17:48 +0800
Subject: [PATCH] Optimize quick_start content in notebook
---
tutorials/notebook/quick_start.ipynb | 9 ++++-----
1 file changed, 4 insertions(+), 5 deletions(-)
diff --git a/tutorials/notebook/quick_start.ipynb b/tutorials/notebook/quick_start.ipynb
index 8f425891f3..7afac1140a 100644
--- a/tutorials/notebook/quick_start.ipynb
+++ b/tutorials/notebook/quick_start.ipynb
@@ -330,9 +330,10 @@
"metadata": {},
"source": [
"其中\n",
- "
`batch_size`:每组包含的数据个数,现设置每组包含32个数据。\n",
- "
`repeat_size`:数据集复制的数量。\n",
- "
先进行`shuffle`、`batch`操作,再进行`repeat`操作,这样能保证1个`epoch`内数据不重复。"
+ "- `batch_size`:每组包含的数据个数,现设置每组包含32个数据。\n",
+ "- `repeat_size`:数据集复制的数量。\n",
+ "\n",
+ "先进行`shuffle`、`batch`操作,再进行`repeat`操作,这样能保证1个`epoch`内数据不重复。"
]
},
{
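A minimal sketch of the pipeline ordering this cell describes, using MindSpore's `mindspore.dataset` API. The dataset path and shuffle buffer size are illustrative assumptions, not values taken from the notebook:

```python
import mindspore.dataset as ds

# Illustrative values; the notebook's actual path and sizes may differ.
mnist_path = "./datasets/MNIST_Data/train"  # hypothetical dataset location
batch_size = 32
repeat_size = 1

mnist_ds = ds.MnistDataset(mnist_path)

# shuffle and batch first, then repeat, so that no sample
# appears twice within a single epoch.
mnist_ds = mnist_ds.shuffle(buffer_size=10000)
mnist_ds = mnist_ds.batch(batch_size, drop_remainder=True)
mnist_ds = mnist_ds.repeat(repeat_size)
```

If `repeat` came before `shuffle` and `batch`, the replicated copies would be mixed together and a batch could contain the same sample more than once within an epoch, which is what the ordering rule avoids.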
@@ -700,8 +701,6 @@
"\n",
"定义了损失函数后,可以得到损失函数关于权重的梯度。梯度用于指示优化器优化权重的方向,以提高模型性能。\n",
"\n",
- "定义损失函数。\n",
- "\n",
"MindSpore支持的损失函数有`SoftmaxCrossEntropyWithLogits`、`L1Loss`、`MSELoss`等。这里使用`SoftmaxCrossEntropyWithLogits`损失函数。"
]
},
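As a brief illustration of the cell above, the `SoftmaxCrossEntropyWithLogits` loss can be instantiated as follows. The `sparse` and `reduction` arguments shown are common settings for integer-labeled MNIST classification, assumed here rather than quoted from this notebook:

```python
import mindspore.nn as nn

# sparse=True expects integer class labels rather than one-hot vectors;
# reduction='mean' averages the per-sample loss over the batch.
net_loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction='mean')
```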
--
Gitee