diff --git a/docs/programming_guide/source_en/initializer.ipynb b/docs/programming_guide/source_en/initializer.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..1834c40fe4ffc3af9769967b8207010797c2703e
--- /dev/null
+++ b/docs/programming_guide/source_en/initializer.ipynb
@@ -0,0 +1,326 @@
+# Initialization of Network Parameters
+
+
+
+## Overview
+
+MindSpore provides a weight initialization module that allows users to initialize network parameters through wrapped operators or the initializer method by passing a string, an Initializer subclass, or a custom Tensor. The Initializer class is the base class used in MindSpore to initialize network parameters; its subclasses implement several data distributions (Zero, One, XavierUniform, HeUniform, HeNormal, Constant, Uniform, Normal, TruncatedNormal). The following sections describe the two parameter initialization modes in detail: using a wrapped operator and using the initializer method.
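
A minimal sketch of the usual variance-scaling formulas behind subclasses such as XavierUniform, HeUniform, and HeNormal, written in plain NumPy. These are the standard definitions from the literature, assumed here for illustration only; they are not MindSpore's actual implementation:

```python
import numpy as np

def fan_in_out(shape):
    # For a conv weight of shape [out_channels, in_channels, *kernel],
    # the receptive field size multiplies into both fans.
    receptive = int(np.prod(shape[2:])) if len(shape) > 2 else 1
    return shape[1] * receptive, shape[0] * receptive

def xavier_uniform_bound(shape):
    fan_in, fan_out = fan_in_out(shape)
    return np.sqrt(6.0 / (fan_in + fan_out))  # samples from U(-bound, bound)

def he_uniform_bound(shape):
    fan_in, _ = fan_in_out(shape)
    return np.sqrt(6.0 / fan_in)              # samples from U(-bound, bound)

def he_normal_std(shape):
    fan_in, _ = fan_in_out(shape)
    return np.sqrt(2.0 / fan_in)              # samples from N(0, std^2)

# Example: a Conv2d weight of shape [64, 3, 3, 3] has fan_in = 3 * 3 * 3 = 27.
print(round(he_uniform_bound([64, 3, 3, 3]), 4))  # -> 0.4714
```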
+
+## Parameter initialization using a wrapped operator
+
+MindSpore provides various ways to initialize parameters and encapsulates parameter initialization in some operators. This section takes the Conv2d operator as an example to introduce how the parameters of such operators can be initialized with strings, Initializer subclasses, and custom Tensors. The following code examples use Normal, a subclass of Initializer; Normal can be replaced by any other Initializer subclass.
+
+### String
+
+To initialize network parameters with a string, the content of the string must match the name of an Initializer subclass. Initialization with a string uses the default parameters of that subclass; for example, using the string 'Normal' is equivalent to using the Initializer subclass Normal(). A code example is as follows:
+
+```python
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore.common import set_seed
+
+set_seed(1)
+
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init='Normal')
+output = net(input_data)
+print(output)
+```
+
+```python
+[[[[ 3.10382620e-02 4.38603461e-02 4.38603461e-02 ... 4.38603461e-02
+ 4.38603461e-02 1.38719045e-02]
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ ...
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ [ 9.66199022e-03 1.24104535e-02 1.24104535e-02 ... 1.24104535e-02
+ 1.24104535e-02 -1.38977719e-02]]
+
+ ...
+
+ [[ 3.98553275e-02 -1.35465711e-03 -1.35465711e-03 ... -1.35465711e-03
+ -1.35465711e-03 -1.00310734e-02]
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ ...
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ [ 1.33139016e-02 6.74417242e-05 6.74417242e-05 ... 6.74417242e-05
+ 6.74417242e-05 -2.27325838e-02]]]]
+```
+
+### Initializer subclass
+
+Using an Initializer subclass to initialize network parameters is similar to using a string. The difference is that a string applies the default parameters of the Initializer subclass; to set those parameters yourself, pass an instance of the subclass instead. Taking Normal(0.2) as an example, the code sample is as follows:
+
+```python
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore.common import set_seed
+from mindspore.common.initializer import Normal
+
+set_seed(1)
+
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init=Normal(0.2))
+output = net(input_data)
+print(output)
+```
+
+```python
+[[[[ 6.2076533e-01 8.7720710e-01 8.7720710e-01 ... 8.7720710e-01
+ 8.7720710e-01 2.7743810e-01]
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ ...
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ [ 1.9323981e-01 2.4820906e-01 2.4820906e-01 ... 2.4820906e-01
+ 2.4820906e-01 -2.7795550e-01]]
+
+ ...
+
+ [[ 7.9710668e-01 -2.7093157e-02 -2.7093157e-02 ... -2.7093157e-02
+ -2.7093157e-02 -2.0062150e-01]
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ ...
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ [ 2.6627803e-01 1.3488382e-03 1.3488382e-03 ... 1.3488382e-03
+ 1.3488382e-03 -4.5465171e-01]]]]
+```
+
+### Customized Tensor
+
+In addition to the above two initialization methods, when the network needs parameter values that the built-in Initializer subclasses cannot produce, the user can initialize the parameters with a custom Tensor, as shown in the following code example:
+
+```python
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore import dtype as mstype
+
+weight = Tensor(np.ones([64, 3, 3, 3]), dtype=mstype.float32)
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init=weight)
+output = net(input_data)
+print(output)
+```
+
+```python
+[[[[12. 18. 18. ... 18. 18. 12.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ ...
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [12. 18. 18. ... 18. 18. 12.]]
+
+ ...
+
+ [[12. 18. 18. ... 18. 18. 12.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ ...
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [12. 18. 18. ... 18. 18. 12.]]]]
+```
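
The pattern above can be checked by hand: with an all-ones 3x3 kernel over 3 input channels and zero padding that preserves the output size, each interior output element sums 3 x 9 = 27 ones, each edge element 3 x 6 = 18, and each corner element 3 x 4 = 12. A short NumPy sketch, independent of MindSpore, reproduces one output channel:

```python
import numpy as np

h, w, in_channels = 16, 50, 3
plane = np.ones((h, w), dtype=np.float32)
padded = np.pad(plane, 1)  # zero padding keeps the 'same' output size

# Sliding 3x3 sum over one channel, then scale by the number of input channels.
out = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3))
out *= in_channels

print(out[0, 0], out[0, 1], out[1, 1])  # corner, edge, interior -> 12.0 18.0 27.0
```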
+
+## Initializing parameters using the initializer method
+
+The code samples above show how parameters are initialized inside the network: an nn layer such as Conv2d receives the weight_init argument, passes it to the Parameter class, and the initializer method encapsulated in Parameter completes the initialization. However, some operators do not encapsulate parameter initialization internally the way Conv2d does. For example, the weights of the Conv3d operator are passed to it as an input, so their initialization must be defined manually.
+
+When initializing parameters manually, you can use the initializer method with any of the Initializer subclasses to generate data of different distributions and types.
+
+When using initializer for parameter initialization, the parameters passed in are init, shape, and dtype:
+
+- init: supports passing a Tensor, a str, or a subclass of Initializer.
+- shape: supports passing a list, a tuple, or an int.
+- dtype: supports passing a mindspore.dtype value.
+
+
+
+### The init parameter is a Tensor
+
+Sample code is as follows.
+
+```python
+import numpy as np
+from mindspore import Tensor
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.common.initializer import initializer
+from mindspore.ops.operations import nn_ops as nps
+
+set_seed(1)
+
+input_data = Tensor(np.ones([16, 3, 10, 32, 32]), dtype=mstype.float32)
+weight_init = Tensor(np.ones([32, 3, 4, 3, 3]), dtype=mstype.float32)
+weight = initializer(weight_init, shape=[32, 3, 4, 3, 3])
+conv3d = nps.Conv3D(out_channel=32, kernel_size=(4, 3, 3))
+output = conv3d(input_data, weight)
+print(output)
+```
+
+The output is as follows.
+
+```python
+[[[[[108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ ...
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]]
+ ...
+ [[108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ ...
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]]]]]
+```
+
+### The init parameter is a str
+
+Sample code is as follows.
+
+```python
+import numpy as np
+from mindspore import Tensor
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.common.initializer import initializer
+from mindspore.ops.operations import nn_ops as nps
+
+set_seed(1)
+
+input_data = Tensor(np.ones([16, 3, 10, 32, 32]), dtype=mstype.float32)
+weight = initializer('Normal', shape=[32, 3, 4, 3, 3], dtype=mstype.float32)
+conv3d = nps.Conv3D(out_channel=32, kernel_size=(4, 3, 3))
+output = conv3d(input_data, weight)
+print(output)
+```
+
+The output is as follows.
+
+```python
+[[[[[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]]]]
+```
+
+### The init parameter is a subclass of Initializer
+
+Sample code is as follows.
+
+```python
+import numpy as np
+from mindspore import Tensor
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.ops.operations import nn_ops as nps
+from mindspore.common.initializer import Normal, initializer
+
+set_seed(1)
+
+input_data = Tensor(np.ones([16, 3, 10, 32, 32]), dtype=mstype.float32)
+weight = initializer(Normal(0.2), shape=[32, 3, 4, 3, 3], dtype=mstype.float32)
+conv3d = nps.Conv3D(out_channel=32, kernel_size=(4, 3, 3))
+output = conv3d(input_data, weight)
+print(output)
+```
+
+The output is as follows.
+
+```python
+[[[[[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]]]]
+```
+
+## Application in Parameter
+
+The initializer method can also be used to provide the initial value when constructing a Parameter directly. The code example is as follows.
+
+```python
+import numpy as np
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.ops import operations as ops
+from mindspore import Tensor, Parameter
+from mindspore.common.initializer import Normal, initializer
+
+set_seed(1)
+
+weight1 = Parameter(initializer('Normal', [5, 4], mstype.float32), name="w1")
+weight2 = Parameter(initializer(Normal(0.2), [5, 4], mstype.float32), name="w2")
+input_data = Tensor(np.arange(20).reshape(5, 4), dtype=mstype.float32)
+net = ops.Add()
+output = net(input_data, weight1)
+output = net(output, weight2)
+print(output)
+```
+
+```python
+[[-0.3305102 1.0412874 2.0412874 3.0412874]
+ [ 4.0412874 4.9479127 5.9479127 6.9479127]
+ [ 7.947912 9.063009 10.063009 11.063009 ]
+ [12.063009 13.536987 14.536987 14.857441 ]
+ [15.751231 17.073082 17.808317 19.364822 ]]
+```
\ No newline at end of file
diff --git a/docs/programming_guide/source_en/initializer.md b/docs/programming_guide/source_en/initializer.md
index cd934f345ffd31fe2502207806817a392642f10c..1834c40fe4ffc3af9769967b8207010797c2703e 100644
--- a/docs/programming_guide/source_en/initializer.md
+++ b/docs/programming_guide/source_en/initializer.md
@@ -1,5 +1,326 @@
# Initialization of Network Parameters
-No English version right now, welcome to contribute.
+
-
\ No newline at end of file
+## Overview
+
+MindSpore provides a weight initialization module that allows users to initialize network parameters through wrapped operators or the initializer method by passing a string, an Initializer subclass, or a custom Tensor. The Initializer class is the base class used in MindSpore to initialize network parameters; its subclasses implement several data distributions (Zero, One, XavierUniform, HeUniform, HeNormal, Constant, Uniform, Normal, TruncatedNormal). The following sections describe the two parameter initialization modes in detail: using a wrapped operator and using the initializer method.
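
A minimal sketch of the usual variance-scaling formulas behind subclasses such as XavierUniform, HeUniform, and HeNormal, written in plain NumPy. These are the standard definitions from the literature, assumed here for illustration only; they are not MindSpore's actual implementation:

```python
import numpy as np

def fan_in_out(shape):
    # For a conv weight of shape [out_channels, in_channels, *kernel],
    # the receptive field size multiplies into both fans.
    receptive = int(np.prod(shape[2:])) if len(shape) > 2 else 1
    return shape[1] * receptive, shape[0] * receptive

def xavier_uniform_bound(shape):
    fan_in, fan_out = fan_in_out(shape)
    return np.sqrt(6.0 / (fan_in + fan_out))  # samples from U(-bound, bound)

def he_uniform_bound(shape):
    fan_in, _ = fan_in_out(shape)
    return np.sqrt(6.0 / fan_in)              # samples from U(-bound, bound)

def he_normal_std(shape):
    fan_in, _ = fan_in_out(shape)
    return np.sqrt(2.0 / fan_in)              # samples from N(0, std^2)

# Example: a Conv2d weight of shape [64, 3, 3, 3] has fan_in = 3 * 3 * 3 = 27.
print(round(he_uniform_bound([64, 3, 3, 3]), 4))  # -> 0.4714
```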
+
+## Parameter initialization using a wrapped operator
+
+MindSpore provides various ways to initialize parameters and encapsulates parameter initialization in some operators. This section takes the Conv2d operator as an example to introduce how the parameters of such operators can be initialized with strings, Initializer subclasses, and custom Tensors. The following code examples use Normal, a subclass of Initializer; Normal can be replaced by any other Initializer subclass.
+
+### String
+
+To initialize network parameters with a string, the content of the string must match the name of an Initializer subclass. Initialization with a string uses the default parameters of that subclass; for example, using the string 'Normal' is equivalent to using the Initializer subclass Normal(). A code example is as follows:
+
+```python
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore.common import set_seed
+
+set_seed(1)
+
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init='Normal')
+output = net(input_data)
+print(output)
+```
+
+```python
+[[[[ 3.10382620e-02 4.38603461e-02 4.38603461e-02 ... 4.38603461e-02
+ 4.38603461e-02 1.38719045e-02]
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ ...
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+ 3.54298912e-02 -5.54019120e-03]
+ [ 9.66199022e-03 1.24104535e-02 1.24104535e-02 ... 1.24104535e-02
+ 1.24104535e-02 -1.38977719e-02]]
+
+ ...
+
+ [[ 3.98553275e-02 -1.35465711e-03 -1.35465711e-03 ... -1.35465711e-03
+ -1.35465711e-03 -1.00310734e-02]
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ ...
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+ -3.60766202e-02 -2.95619294e-02]
+ [ 1.33139016e-02 6.74417242e-05 6.74417242e-05 ... 6.74417242e-05
+ 6.74417242e-05 -2.27325838e-02]]]]
+```
+
+### Initializer subclass
+
+Using an Initializer subclass to initialize network parameters is similar to using a string. The difference is that a string applies the default parameters of the Initializer subclass; to set those parameters yourself, pass an instance of the subclass instead. Taking Normal(0.2) as an example, the code sample is as follows:
+
+```python
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore.common import set_seed
+from mindspore.common.initializer import Normal
+
+set_seed(1)
+
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init=Normal(0.2))
+output = net(input_data)
+print(output)
+```
+
+```python
+[[[[ 6.2076533e-01 8.7720710e-01 8.7720710e-01 ... 8.7720710e-01
+ 8.7720710e-01 2.7743810e-01]
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ ...
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+ 7.0859784e-01 -1.1080378e-01]
+ [ 1.9323981e-01 2.4820906e-01 2.4820906e-01 ... 2.4820906e-01
+ 2.4820906e-01 -2.7795550e-01]]
+
+ ...
+
+ [[ 7.9710668e-01 -2.7093157e-02 -2.7093157e-02 ... -2.7093157e-02
+ -2.7093157e-02 -2.0062150e-01]
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ ...
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+ -7.2153252e-01 -5.9123868e-01]
+ [ 2.6627803e-01 1.3488382e-03 1.3488382e-03 ... 1.3488382e-03
+ 1.3488382e-03 -4.5465171e-01]]]]
+```
+
+### Customized Tensor
+
+In addition to the above two initialization methods, when the network needs parameter values that the built-in Initializer subclasses cannot produce, the user can initialize the parameters with a custom Tensor, as shown in the following code example:
+
+```python
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore import dtype as mstype
+
+weight = Tensor(np.ones([64, 3, 3, 3]), dtype=mstype.float32)
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init=weight)
+output = net(input_data)
+print(output)
+```
+
+```python
+[[[[12. 18. 18. ... 18. 18. 12.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ ...
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [12. 18. 18. ... 18. 18. 12.]]
+
+ ...
+
+ [[12. 18. 18. ... 18. 18. 12.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ ...
+ [18. 27. 27. ... 27. 27. 18.]
+ [18. 27. 27. ... 27. 27. 18.]
+ [12. 18. 18. ... 18. 18. 12.]]]]
+```
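
The pattern above can be checked by hand: with an all-ones 3x3 kernel over 3 input channels and zero padding that preserves the output size, each interior output element sums 3 x 9 = 27 ones, each edge element 3 x 6 = 18, and each corner element 3 x 4 = 12. A short NumPy sketch, independent of MindSpore, reproduces one output channel:

```python
import numpy as np

h, w, in_channels = 16, 50, 3
plane = np.ones((h, w), dtype=np.float32)
padded = np.pad(plane, 1)  # zero padding keeps the 'same' output size

# Sliding 3x3 sum over one channel, then scale by the number of input channels.
out = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3))
out *= in_channels

print(out[0, 0], out[0, 1], out[1, 1])  # corner, edge, interior -> 12.0 18.0 27.0
```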
+
+## Initializing parameters using the initializer method
+
+The code samples above show how parameters are initialized inside the network: an nn layer such as Conv2d receives the weight_init argument, passes it to the Parameter class, and the initializer method encapsulated in Parameter completes the initialization. However, some operators do not encapsulate parameter initialization internally the way Conv2d does. For example, the weights of the Conv3d operator are passed to it as an input, so their initialization must be defined manually.
+
+When initializing parameters manually, you can use the initializer method with any of the Initializer subclasses to generate data of different distributions and types.
+
+When using initializer for parameter initialization, the parameters passed in are init, shape, and dtype:
+
+- init: supports passing a Tensor, a str, or a subclass of Initializer.
+- shape: supports passing a list, a tuple, or an int.
+- dtype: supports passing a mindspore.dtype value.
+
+
+
+### The init parameter is a Tensor
+
+Sample code is as follows.
+
+```python
+import numpy as np
+from mindspore import Tensor
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.common.initializer import initializer
+from mindspore.ops.operations import nn_ops as nps
+
+set_seed(1)
+
+input_data = Tensor(np.ones([16, 3, 10, 32, 32]), dtype=mstype.float32)
+weight_init = Tensor(np.ones([32, 3, 4, 3, 3]), dtype=mstype.float32)
+weight = initializer(weight_init, shape=[32, 3, 4, 3, 3])
+conv3d = nps.Conv3D(out_channel=32, kernel_size=(4, 3, 3))
+output = conv3d(input_data, weight)
+print(output)
+```
+
+The output is as follows.
+
+```python
+[[[[[108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ ...
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]]
+ ...
+ [[108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ ...
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]
+ [108 108 108 ... 108 108 108]]]]]
+```
+
+### The init parameter is a str
+
+Sample code is as follows.
+
+```python
+import numpy as np
+from mindspore import Tensor
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.common.initializer import initializer
+from mindspore.ops.operations import nn_ops as nps
+
+set_seed(1)
+
+input_data = Tensor(np.ones([16, 3, 10, 32, 32]), dtype=mstype.float32)
+weight = initializer('Normal', shape=[32, 3, 4, 3, 3], dtype=mstype.float32)
+conv3d = nps.Conv3D(out_channel=32, kernel_size=(4, 3, 3))
+output = conv3d(input_data, weight)
+print(output)
+```
+
+The output is as follows.
+
+```python
+[[[[[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]]]]
+```
+
+### The init parameter is a subclass of Initializer
+
+Sample code is as follows.
+
+```python
+import numpy as np
+from mindspore import Tensor
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.ops.operations import nn_ops as nps
+from mindspore.common.initializer import Normal, initializer
+
+set_seed(1)
+
+input_data = Tensor(np.ones([16, 3, 10, 32, 32]), dtype=mstype.float32)
+weight = initializer(Normal(0.2), shape=[32, 3, 4, 3, 3], dtype=mstype.float32)
+conv3d = nps.Conv3D(out_channel=32, kernel_size=(4, 3, 3))
+output = conv3d(input_data, weight)
+print(output)
+```
+
+The output is as follows.
+
+```python
+[[[[[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [[0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]
+ ...
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]
+ [0 0 0 ... 0 0 0]]]]]
+```
+
+## Application in Parameter
+
+The initializer method can also be used to provide the initial value when constructing a Parameter directly. The code example is as follows.
+
+```python
+import numpy as np
+from mindspore import dtype as mstype
+from mindspore.common import set_seed
+from mindspore.ops import operations as ops
+from mindspore import Tensor, Parameter
+from mindspore.common.initializer import Normal, initializer
+
+set_seed(1)
+
+weight1 = Parameter(initializer('Normal', [5, 4], mstype.float32), name="w1")
+weight2 = Parameter(initializer(Normal(0.2), [5, 4], mstype.float32), name="w2")
+input_data = Tensor(np.arange(20).reshape(5, 4), dtype=mstype.float32)
+net = ops.Add()
+output = net(input_data, weight1)
+output = net(output, weight2)
+print(output)
+```
+
+```python
+[[-0.3305102 1.0412874 2.0412874 3.0412874]
+ [ 4.0412874 4.9479127 5.9479127 6.9479127]
+ [ 7.947912 9.063009 10.063009 11.063009 ]
+ [12.063009 13.536987 14.536987 14.857441 ]
+ [15.751231 17.073082 17.808317 19.364822 ]]
+```
\ No newline at end of file