From 99c2b5e9cfa02383acac750549551691b3910ccd Mon Sep 17 00:00:00 2001
From: purecho <2019500005@e.gzhu.edu.cn>
Date: Fri, 21 May 2021 09:28:23 +0800
Subject: [PATCH 1/2] translate into English
---
.../source_en/initializer.md | 298 +++++++++++++++++-
1 file changed, 296 insertions(+), 2 deletions(-)
diff --git a/docs/programming_guide/source_en/initializer.md b/docs/programming_guide/source_en/initializer.md
index cd934f345f..ff823bcc62 100644
--- a/docs/programming_guide/source_en/initializer.md
+++ b/docs/programming_guide/source_en/initializer.md
@@ -1,5 +1,299 @@
# Initialization of Network Parameters
-No English version right now, welcome to contribute.
+
-
\ No newline at end of file
+## Overview
+
+MindSpore provides a weight initialization module. Users can initialize network parameters either through an encapsulated operator or through the `initializer` interface, in both cases passing a string, an `Initializer` subclass, or a custom `Tensor`. `Initializer` is the basic data structure used for initialization in MindSpore. Its subclasses implement several data distributions (`Zero`, `One`, `XavierUniform`, `HeUniform`, `HeNormal`, `Constant`, `Uniform`, `Normal`, `TruncatedNormal`). The two parameter initialization modes, the encapsulated operator and the `initializer` interface, are described in detail below.
+
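A few of the data distributions listed above can be pictured in plain NumPy. The sketch below is an illustration only, not MindSpore's implementation; the function names (`zero_init`, `normal_init`, `truncated_normal_init`) are hypothetical helpers, and the truncation rule (resample outside two standard deviations) is an assumption for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def zero_init(shape):
    # Zero: all elements are 0
    return np.zeros(shape, dtype=np.float32)

def normal_init(shape, sigma=0.01):
    # Normal(sigma): samples drawn from N(0, sigma^2)
    return (sigma * rng.standard_normal(shape)).astype(np.float32)

def truncated_normal_init(shape, sigma=0.01):
    # TruncatedNormal (sketch): resample any value outside two standard deviations
    out = sigma * rng.standard_normal(shape)
    mask = np.abs(out) > 2 * sigma
    while mask.any():
        out[mask] = sigma * rng.standard_normal(mask.sum())
        mask = np.abs(out) > 2 * sigma
    return out.astype(np.float32)

w = normal_init((64, 3, 3, 3), sigma=0.2)
print(w.shape)  # (64, 3, 3, 3)
```

The encapsulated-operator and `initializer` paths described below both reduce, conceptually, to producing an array like `w` with the requested shape and distribution.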
+## Initializing Parameters by Using an Encapsulated Operator
+
+MindSpore provides a variety of parameter initialization methods and encapsulates the parameter initialization function in some operators. This section describes how operators with this function initialize parameters, taking the `Conv2d` operator as an example, and introduces initializing network parameters with a string, an `Initializer` subclass, and a custom `Tensor`. The following code samples use the `Initializer` subclass `Normal`; it can be replaced by any other `Initializer` subclass.
+
+### String
+
+Network parameters can be initialized with a string whose content must match the name of an `Initializer` subclass. String initialization uses the default arguments of that subclass; for example, using the string `'Normal'` is equivalent to using the `Initializer` subclass `Normal()`. Code sample:
+
+```
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore.common import set_seed
+
+set_seed(1)
+
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init='Normal')
+output = net(input_data)
+print(output)
+[[[[ 3.10382620e-02 4.38603461e-02 4.38603461e-02 ... 4.38603461e-02
+ 4.38603461e-02 1.38719045e-02]
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ ...
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ [ 9.66199022e-03 1.24104535e-02 1.24104535e-02 ... 1.24104535e-02
+ 1.24104535e-02 -1.38977719e-02]]
+
+ ...
+
+ [[ 3.98553275e-02 -1.35465711e-03 -1.35465711e-03 ... -1.35465711e-03
+ -1.35465711e-03 -1.00310734e-02]
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ ...
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ [ 1.33139016e-02 6.74417242e-05 6.74417242e-05 ... 6.74417242e-05
+ 6.74417242e-05 -2.27325838e-02]]]]
+```
+
+### Initializer subclass
+
+Initializing network parameters with an `Initializer` subclass is similar to string initialization, except that string initialization uses the subclass's default arguments. To specify arguments explicitly, the parameters must be initialized with an `Initializer` subclass instance, for example `Normal(0.2)`. The code sample is as follows:
+
+```
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore.common import set_seed
+from mindspore.common.initializer import Normal
+
+set_seed(1)
+
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init=Normal(0.2))
+output = net(input_data)
+print(output)
+[[[[ 6.2076533e-01 8.7720710e-01 8.7720710e-01 ... 8.7720710e-01
+ 8.7720710e-01 2.7743810e-01]
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ ...
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ [ 1.9323981e-01 2.4820906e-01 2.4820906e-01 ... 2.4820906e-01
+ 2.4820906e-01 -2.7795550e-01]]
+
+ ...
+
+ [[ 7.9710668e-01 -2.7093157e-02 -2.7093157e-02 ... -2.7093157e-02
+ -2.7093157e-02 -2.0062150e-01]
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ ...
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ [ 2.6627803e-01 1.3488382e-03 1.3488382e-03 ... 1.3488382e-03
+ 1.3488382e-03 -4.5465171e-01]]]]
+```
+
+### Custom Tensor
+
+Besides the above two methods, when the network needs initialization data that is not available in MindSpore, users can construct a custom `Tensor` to initialize the parameters. Code sample:
+
+```
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore import dtype as mstype
+
+weight = Tensor(np.ones([64, 3, 3, 3]), dtype=mstype.float32)
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init=weight)
+output = net(input_data)
+print(output)
+[[[[12. 18. 18. ... 18. 18. 12.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ ...
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [12. 18. 18. ... 18. 18. 12.]]
+
+ ...
+
+ [[12. 18. 18. ... 18. 18. 12.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ ...
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [12. 18. 18. ... 18. 18. 12.]]]]
+```
+
+## Using the `initializer` Method to Initialize Parameters
+
+The code samples above show how parameters are initialized inside a network: an NN-layer operator such as `Conv2d` encapsulates parameter initialization, and the `weight_init` argument passed to `Conv2d` specifies the initialization, which is completed at construction time by the `Parameter` class and the `Initializer` encapsulated in it. However, some operators do not encapsulate parameter initialization internally the way `Conv2d` does; for example, the weight of the `Conv3d` operator is passed to it as an input, so the weight initialization must be defined manually.
+
+When initializing parameters, you can use the `initializer` interface with different `Initializer` subclasses to generate data with different distributions.
+
+When `initializer` is used for parameter initialization, the supported arguments are `init`, `shape`, and `dtype`:
+
+- `init`: supports a `Tensor`, a `str`, or an `Initializer` subclass.
+- `shape`: supports a `list`, a `tuple`, or an `int`.
+- `dtype`: supports `mindspore.dtype`.
+
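The three accepted `init` forms can be pictured with a small pure-NumPy stand-in. `make_initial_value` below is a hypothetical helper for illustration only; it is not MindSpore's `initializer`, but it dispatches on the same three kinds of `init` value.

```python
import numpy as np

def make_initial_value(init, shape, dtype=np.float32):
    # Sketch of dispatching on the type of `init` (illustration only).
    if isinstance(shape, int):
        shape = (shape,)                      # an int shape means a 1-D length
    if isinstance(init, np.ndarray):          # Tensor-like: use the data as-is
        if tuple(init.shape) != tuple(shape):
            raise ValueError("shape of init does not match requested shape")
        return init.astype(dtype)
    if isinstance(init, str):                 # str: default-configured distribution
        table = {
            'Zero': lambda s: np.zeros(s),
            'One': lambda s: np.ones(s),
            'Normal': lambda s: 0.01 * np.random.randn(*s),
        }
        return table[init](shape).astype(dtype)
    if callable(init):                        # Initializer-subclass stand-in
        return np.asarray(init(shape), dtype=dtype)
    raise TypeError("unsupported init type")

print(make_initial_value('One', [2, 3]).sum())  # prints 6.0
```

The sections below exercise the real `initializer` interface with each of the three forms in turn.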
+### Tensor as init parameter
+
+Code sample:
+
+```
+import numpy as np
+from mindspore import Tensor
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.common.initializer import initializer
+from mindspore.ops.operations import nn_ops as nps
+
+set_seed(1)
+
+input_data = Tensor(np.ones([16, 3, 10, 32, 32]), dtype=mstype.float32)
+weight_init = Tensor(np.ones([32, 3, 4, 3, 3]), dtype=mstype.float32)
+weight = initializer(weight_init, shape=[32, 3, 4, 3, 3])
+conv3d = nps.Conv3D(out_channel=32, kernel_size=(4, 3, 3))
+output = conv3d(input_data, weight)
+print(output)
+```
+
+Output:
+
+```
+[[[[[108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ ...
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]]
+ ...
+ [[108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ ...
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]]]]]
+```
+
+### Str as init parameter
+
+Code sample:
+
+```
+import numpy as np
+from mindspore import Tensor
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.common.initializer import initializer
+from mindspore.ops.operations import nn_ops as nps
+
+set_seed(1)
+
+input_data = Tensor(np.ones([16, 3, 10, 32, 32]), dtype=mstype.float32)
+weight = initializer('Normal', shape=[32, 3, 4, 3, 3], dtype=mstype.float32)
+conv3d = nps.Conv3D(out_channel=32, kernel_size=(4, 3, 3))
+output = conv3d(input_data, weight)
+print(output)
+```
+
+Output:
+
+```
+[[[[[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]]]]
+```
+
+### Initializer subclass as init parameter
+
+Code sample:
+
+```
+import numpy as np
+from mindspore import Tensor
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.ops.operations import nn_ops as nps
+from mindspore.common.initializer import Normal, initializer
+
+set_seed(1)
+
+input_data = Tensor(np.ones([16, 3, 10, 32, 32]), dtype=mstype.float32)
+weight = initializer(Normal(0.2), shape=[32, 3, 4, 3, 3], dtype=mstype.float32)
+conv3d = nps.Conv3D(out_channel=32, kernel_size=(4, 3, 3))
+output = conv3d(input_data, weight)
+print(output)
+[[[[[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]]]]
+```
+
+### Application in Parameter
+
+Code sample:
+
+```
+import numpy as np
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.ops import operations as ops
+from mindspore import Tensor, Parameter, context
+from mindspore.common.initializer import Normal, initializer
+
+set_seed(1)
+
+weight1 = Parameter(initializer('Normal', [5, 4], mstype.float32), name="w1")
+weight2 = Parameter(initializer(Normal(0.2), [5, 4], mstype.float32), name="w2")
+input_data = Tensor(np.arange(20).reshape(5, 4), dtype=mstype.float32)
+net = ops.Add()
+output = net(input_data, weight1)
+output = net(output, weight2)
+print(output)
+[[-0.3305102 1.0412874 2.0412874 3.0412874]
+ [ 4.0412874 4.9479127 5.9479127 6.9479127]
+ [ 7.947912 9.063009 10.063009 11.063009 ]
+ [12.063009 13.536987 14.536987 14.857441 ]
+ [15.751231 17.073082 17.808317 19.364822 ]]
+```
\ No newline at end of file
--
Gitee
From b72e0a77fc13480e543b55b97e016709523aac63 Mon Sep 17 00:00:00 2001
From: purecho <2019500005@e.gzhu.edu.cn>
Date: Fri, 21 May 2021 10:46:10 +0800
Subject: [PATCH 2/2] translate into English
---
.../source_en/initializer.md | 24 +++++++++----------
1 file changed, 11 insertions(+), 13 deletions(-)
diff --git a/docs/programming_guide/source_en/initializer.md b/docs/programming_guide/source_en/initializer.md
index ff823bcc62..a43e2b78dd 100644
--- a/docs/programming_guide/source_en/initializer.md
+++ b/docs/programming_guide/source_en/initializer.md
@@ -14,7 +14,7 @@ MindSpore provides a variety of parameter initialization methods and encapsulate
Network parameters can be initialized with a string whose content must match the name of an `Initializer` subclass. String initialization uses the default arguments of that subclass; for example, using the string `'Normal'` is equivalent to using the `Initializer` subclass `Normal()`. Code sample:
-```
+```python
import numpy as np
import mindspore.nn as nn
from mindspore import Tensor
@@ -61,7 +61,7 @@ print(output)
Initializing network parameters with an `Initializer` subclass is similar to string initialization, except that string initialization uses the subclass's default arguments. To specify arguments explicitly, the parameters must be initialized with an `Initializer` subclass instance, for example `Normal(0.2)`. The code sample is as follows:
-```
+```python
import numpy as np
import mindspore.nn as nn
from mindspore import Tensor
@@ -109,7 +109,7 @@ print(output)
Besides the above two methods, when the network needs initialization data that is not available in MindSpore, users can construct a custom `Tensor` to initialize the parameters. Code sample:
-```
+```python
import numpy as np
import mindspore.nn as nn
from mindspore import Tensor
@@ -140,7 +140,6 @@ print(output)
```
## Using the `initializer` Method to Initialize Parameters
-
The code samples above show how parameters are initialized inside a network: an NN-layer operator such as `Conv2d` encapsulates parameter initialization, and the `weight_init` argument passed to `Conv2d` specifies the initialization, which is completed at construction time by the `Parameter` class and the `Initializer` encapsulated in it. However, some operators do not encapsulate parameter initialization internally the way `Conv2d` does; for example, the weight of the `Conv3d` operator is passed to it as an input, so the weight initialization must be defined manually.
When initializing parameters, you can use the `initializer` interface with different `Initializer` subclasses to generate data with different distributions.
@@ -155,7 +154,7 @@ When `Initializer` is used for parameter initialization, the parameters supporte
Code sample:
-```
+```python
import numpy as np
from mindspore import Tensor
from mindspore import dtype as mstype
@@ -175,7 +174,7 @@ print(output)
Output:
-```
+```text
[[[[[108 108 108 ... 108 108 108]
[108 108 108 ... 108 108 108]
[108 108 108 ... 108 108 108]
@@ -194,10 +193,8 @@ Output:
```
### Str as init parameter
-
Code sample:
-
-```
+```python
import numpy as np
from mindspore import Tensor
from mindspore import dtype as mstype
@@ -216,7 +213,7 @@ print(output)
Output:
-```
+```text
[[[[[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]]
@@ -238,7 +235,7 @@ Output:
Code sample:
-```
+```python
import numpy as np
from mindspore import Tensor
from mindspore import dtype as mstype
@@ -274,7 +271,7 @@ print(output)
Code sample:
-```
+```python
import numpy as np
from mindspore import dtype as mstype
from mindspore.common import set_seed
@@ -296,4 +293,5 @@ print(output)
[ 7.947912 9.063009 10.063009 11.063009 ]
[12.063009 13.536987 14.536987 14.857441 ]
[15.751231 17.073082 17.808317 19.364822 ]]
-```
\ No newline at end of file
+```
+
--
Gitee