From b6a6ea6c42437d2ba9483ddd54e7c8a57a8916ed Mon Sep 17 00:00:00 2001 From: hzb <1219326125@qq.com> Date: Tue, 1 Jun 2021 14:35:45 +0800 Subject: [PATCH 1/3] my part of translation --- .../Untitled-checkpoint.ipynb | 150 ++++++++++++++++++ .../initializer-checkpoint.ipynb | 150 ++++++++++++++++++ .../source_en/Untitled.ipynb | 150 ++++++++++++++++++ .../source_en/initializer.md | 114 ++++++++++++- 4 files changed, 561 insertions(+), 3 deletions(-) create mode 100644 docs/programming_guide/source_en/.ipynb_checkpoints/Untitled-checkpoint.ipynb create mode 100644 docs/programming_guide/source_en/.ipynb_checkpoints/initializer-checkpoint.ipynb create mode 100644 docs/programming_guide/source_en/Untitled.ipynb diff --git a/docs/programming_guide/source_en/.ipynb_checkpoints/Untitled-checkpoint.ipynb b/docs/programming_guide/source_en/.ipynb_checkpoints/Untitled-checkpoint.ipynb new file mode 100644 index 0000000000..9c6b4b21e0 --- /dev/null +++ b/docs/programming_guide/source_en/.ipynb_checkpoints/Untitled-checkpoint.ipynb @@ -0,0 +1,150 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Initialization of Network Parameters\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "### Overview\n", + "\n", + "Mindsprore servers weight initialization module. Users can initialize network parameters through encapsulating operator and initializer method to use string, initializer subclass or the tensor you maked. Initializer class is the basic data structure type used for initialization in mindsprore. Initializer's subclass contain several different types of data distribution (Zero, One, XavierUniform, Heuniform, Henormal, Honstant, Uniform, Normal, Truncatednormal). Here are two parameter initialization modes of encapsulation operator and initializer method introduction in detail.\n", + "\n", + "### Initializing Parameters by Encapsulating operator\n", + "\n", + "Mindsprore provides a variety of parameter initialization methods, and encapsulates the function of parameter initialization in some operators. This section will introduce the method of initializing parameters by operators with parameter initialization function. Taking `Conv2d` operator as an example, this section introduces how to initialize parameters in the network by string, Subclasses of `Initializer` and user-defined `Tensor`. In the following code examples, the Subclasses normal of `Initializer` is taken as an example, In the code example, `Normal` can be replaced by any one of the Subclasses of `Initializer`.\n", + "\n", + "### String\n", + "\n", + "Using string to initialize the network parameter, the content of the string should be consistent with the name of the Subclasses of `Initializer`. The default parameter in the Subclasses of `Initializer` will be used when using string to initialize. For example, using string normal is equivalent to using the Normal() of the Subclasses of `Initializer`. The code example is as follows:\n", + "\n", + "#### Input\n", + "import numpy as np\n", + "import mindspore.nn as nn\n", + "from mindspore import Tensor\n", + "from mindspore.common import set_seed\n", + "\n", + "set_seed(1)\n", + "\n", + "input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))\n", + "net = nn.Conv2d(3, 64, 3, weight_init='Normal')\n", + "output = net(input_data)\n", + "print(output)\n", + "\n", + "#### Output\n", + "[[[[ 3.10382620e-02 4.38603461e-02 4.38603461e-02 ... 4.38603461e-02\n", + " 4.38603461e-02 1.38719045e-02]\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 
3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " ...\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " [ 9.66199022e-03 1.24104535e-02 1.24104535e-02 ... 1.24104535e-02\n", + " 1.24104535e-02 -1.38977719e-02]]\n", + "\n", + " ...\n", + "\n", + " [[ 3.98553275e-02 -1.35465711e-03 -1.35465711e-03 ... -1.35465711e-03\n", + " -1.35465711e-03 -1.00310734e-02]\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " ...\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " [ 1.33139016e-02 6.74417242e-05 6.74417242e-05 ... 6.74417242e-05\n", + " 6.74417242e-05 -2.27325838e-02]]]]\n", + " \n", + "### Subclasses of Initializer \n", + "\n", + "Initializing network parameters with Subclasses of `Initializer` is similar to initializing parameters with String. The difference is that initializing parameters with string is the default parameter of Subclasses of `Initializer`. If you want to use parameters in Subclasses of `Initializer`, you must initialize parameters with Subclasses of `Initializer`. Take `Normal (0.2)` as an example, The code example is as follows: \n", + "\n", + "#### Input\n", + "import numpy as np\n", + "import mindspore.nn as nn\n", + "from mindspore import Tensor\n", + "from mindspore.common import set_seed\n", + "from mindspore.common.initializer import Normal\n", + "\n", + "set_seed(1)\n", + "\n", + "input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))\n", + "net = nn.Conv2d(3, 64, 3, weight_init=Normal(0.2))\n", + "output = net(input_data)\n", + "print(output)\n", + "\n", + "#### output\n", + "[[[[ 6.2076533e-01 8.7720710e-01 8.7720710e-01 ... 8.7720710e-01\n", + " 8.7720710e-01 2.7743810e-01]\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " ...\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " [ 1.9323981e-01 2.4820906e-01 2.4820906e-01 ... 2.4820906e-01\n", + " 2.4820906e-01 -2.7795550e-01]]\n", + "\n", + " ...\n", + "\n", + " [[ 7.9710668e-01 -2.7093157e-02 -2.7093157e-02 ... -2.7093157e-02\n", + " -2.7093157e-02 -2.0062150e-01]\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " ...\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " [ 2.6627803e-01 1.3488382e-03 1.3488382e-03 ... 
1.3488382e-03\n", + " 1.3488382e-03 -4.5465171e-01]]]]\n", + " \n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.5" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/docs/programming_guide/source_en/.ipynb_checkpoints/initializer-checkpoint.ipynb b/docs/programming_guide/source_en/.ipynb_checkpoints/initializer-checkpoint.ipynb new file mode 100644 index 0000000000..9c6b4b21e0 --- /dev/null +++ b/docs/programming_guide/source_en/.ipynb_checkpoints/initializer-checkpoint.ipynb @@ -0,0 +1,150 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Initialization of Network Parameters\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "### Overview\n", + "\n", + "Mindsprore servers weight initialization module. Users can initialize network parameters through encapsulating operator and initializer method to use string, initializer subclass or the tensor you maked. Initializer class is the basic data structure type used for initialization in mindsprore. Initializer's subclass contain several different types of data distribution (Zero, One, XavierUniform, Heuniform, Henormal, Honstant, Uniform, Normal, Truncatednormal). Here are two parameter initialization modes of encapsulation operator and initializer method introduction in detail.\n", + "\n", + "### Initializing Parameters by Encapsulating operator\n", + "\n", + "Mindsprore provides a variety of parameter initialization methods, and encapsulates the function of parameter initialization in some operators. This section will introduce the method of initializing parameters by operators with parameter initialization function. Taking `Conv2d` operator as an example, this section introduces how to initialize parameters in the network by string, Subclasses of `Initializer` and user-defined `Tensor`. In the following code examples, the Subclasses normal of `Initializer` is taken as an example, In the code example, `Normal` can be replaced by any one of the Subclasses of `Initializer`.\n", + "\n", + "### String\n", + "\n", + "Using string to initialize the network parameter, the content of the string should be consistent with the name of the Subclasses of `Initializer`. The default parameter in the Subclasses of `Initializer` will be used when using string to initialize. For example, using string normal is equivalent to using the Normal() of the Subclasses of `Initializer`. The code example is as follows:\n", + "\n", + "#### Input\n", + "import numpy as np\n", + "import mindspore.nn as nn\n", + "from mindspore import Tensor\n", + "from mindspore.common import set_seed\n", + "\n", + "set_seed(1)\n", + "\n", + "input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))\n", + "net = nn.Conv2d(3, 64, 3, weight_init='Normal')\n", + "output = net(input_data)\n", + "print(output)\n", + "\n", + "#### Output\n", + "[[[[ 3.10382620e-02 4.38603461e-02 4.38603461e-02 ... 4.38603461e-02\n", + " 4.38603461e-02 1.38719045e-02]\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 
3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " ...\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " [ 9.66199022e-03 1.24104535e-02 1.24104535e-02 ... 1.24104535e-02\n", + " 1.24104535e-02 -1.38977719e-02]]\n", + "\n", + " ...\n", + "\n", + " [[ 3.98553275e-02 -1.35465711e-03 -1.35465711e-03 ... -1.35465711e-03\n", + " -1.35465711e-03 -1.00310734e-02]\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " ...\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " [ 1.33139016e-02 6.74417242e-05 6.74417242e-05 ... 6.74417242e-05\n", + " 6.74417242e-05 -2.27325838e-02]]]]\n", + " \n", + "### Subclasses of Initializer \n", + "\n", + "Initializing network parameters with Subclasses of `Initializer` is similar to initializing parameters with String. The difference is that initializing parameters with string is the default parameter of Subclasses of `Initializer`. If you want to use parameters in Subclasses of `Initializer`, you must initialize parameters with Subclasses of `Initializer`. Take `Normal (0.2)` as an example, The code example is as follows: \n", + "\n", + "#### Input\n", + "import numpy as np\n", + "import mindspore.nn as nn\n", + "from mindspore import Tensor\n", + "from mindspore.common import set_seed\n", + "from mindspore.common.initializer import Normal\n", + "\n", + "set_seed(1)\n", + "\n", + "input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))\n", + "net = nn.Conv2d(3, 64, 3, weight_init=Normal(0.2))\n", + "output = net(input_data)\n", + "print(output)\n", + "\n", + "#### output\n", + "[[[[ 6.2076533e-01 8.7720710e-01 8.7720710e-01 ... 8.7720710e-01\n", + " 8.7720710e-01 2.7743810e-01]\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " ...\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " [ 1.9323981e-01 2.4820906e-01 2.4820906e-01 ... 2.4820906e-01\n", + " 2.4820906e-01 -2.7795550e-01]]\n", + "\n", + " ...\n", + "\n", + " [[ 7.9710668e-01 -2.7093157e-02 -2.7093157e-02 ... -2.7093157e-02\n", + " -2.7093157e-02 -2.0062150e-01]\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " ...\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " [ 2.6627803e-01 1.3488382e-03 1.3488382e-03 ... 
1.3488382e-03\n", + " 1.3488382e-03 -4.5465171e-01]]]]\n", + " \n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.5" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/docs/programming_guide/source_en/Untitled.ipynb b/docs/programming_guide/source_en/Untitled.ipynb new file mode 100644 index 0000000000..9c6b4b21e0 --- /dev/null +++ b/docs/programming_guide/source_en/Untitled.ipynb @@ -0,0 +1,150 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Initialization of Network Parameters\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "### Overview\n", + "\n", + "Mindsprore servers weight initialization module. Users can initialize network parameters through encapsulating operator and initializer method to use string, initializer subclass or the tensor you maked. Initializer class is the basic data structure type used for initialization in mindsprore. Initializer's subclass contain several different types of data distribution (Zero, One, XavierUniform, Heuniform, Henormal, Honstant, Uniform, Normal, Truncatednormal). Here are two parameter initialization modes of encapsulation operator and initializer method introduction in detail.\n", + "\n", + "### Initializing Parameters by Encapsulating operator\n", + "\n", + "Mindsprore provides a variety of parameter initialization methods, and encapsulates the function of parameter initialization in some operators. This section will introduce the method of initializing parameters by operators with parameter initialization function. Taking `Conv2d` operator as an example, this section introduces how to initialize parameters in the network by string, Subclasses of `Initializer` and user-defined `Tensor`. In the following code examples, the Subclasses normal of `Initializer` is taken as an example, In the code example, `Normal` can be replaced by any one of the Subclasses of `Initializer`.\n", + "\n", + "### String\n", + "\n", + "Using string to initialize the network parameter, the content of the string should be consistent with the name of the Subclasses of `Initializer`. The default parameter in the Subclasses of `Initializer` will be used when using string to initialize. For example, using string normal is equivalent to using the Normal() of the Subclasses of `Initializer`. The code example is as follows:\n", + "\n", + "#### Input\n", + "import numpy as np\n", + "import mindspore.nn as nn\n", + "from mindspore import Tensor\n", + "from mindspore.common import set_seed\n", + "\n", + "set_seed(1)\n", + "\n", + "input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))\n", + "net = nn.Conv2d(3, 64, 3, weight_init='Normal')\n", + "output = net(input_data)\n", + "print(output)\n", + "\n", + "#### Output\n", + "[[[[ 3.10382620e-02 4.38603461e-02 4.38603461e-02 ... 4.38603461e-02\n", + " 4.38603461e-02 1.38719045e-02]\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 
3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " ...\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02\n", + " 3.54298912e-02 -5.54019120e-03]\n", + " [ 9.66199022e-03 1.24104535e-02 1.24104535e-02 ... 1.24104535e-02\n", + " 1.24104535e-02 -1.38977719e-02]]\n", + "\n", + " ...\n", + "\n", + " [[ 3.98553275e-02 -1.35465711e-03 -1.35465711e-03 ... -1.35465711e-03\n", + " -1.35465711e-03 -1.00310734e-02]\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " ...\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02\n", + " -3.60766202e-02 -2.95619294e-02]\n", + " [ 1.33139016e-02 6.74417242e-05 6.74417242e-05 ... 6.74417242e-05\n", + " 6.74417242e-05 -2.27325838e-02]]]]\n", + " \n", + "### Subclasses of Initializer \n", + "\n", + "Initializing network parameters with Subclasses of `Initializer` is similar to initializing parameters with String. The difference is that initializing parameters with string is the default parameter of Subclasses of `Initializer`. If you want to use parameters in Subclasses of `Initializer`, you must initialize parameters with Subclasses of `Initializer`. Take `Normal (0.2)` as an example, The code example is as follows: \n", + "\n", + "#### Input\n", + "import numpy as np\n", + "import mindspore.nn as nn\n", + "from mindspore import Tensor\n", + "from mindspore.common import set_seed\n", + "from mindspore.common.initializer import Normal\n", + "\n", + "set_seed(1)\n", + "\n", + "input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))\n", + "net = nn.Conv2d(3, 64, 3, weight_init=Normal(0.2))\n", + "output = net(input_data)\n", + "print(output)\n", + "\n", + "#### output\n", + "[[[[ 6.2076533e-01 8.7720710e-01 8.7720710e-01 ... 8.7720710e-01\n", + " 8.7720710e-01 2.7743810e-01]\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " ...\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01\n", + " 7.0859784e-01 -1.1080378e-01]\n", + " [ 1.9323981e-01 2.4820906e-01 2.4820906e-01 ... 2.4820906e-01\n", + " 2.4820906e-01 -2.7795550e-01]]\n", + "\n", + " ...\n", + "\n", + " [[ 7.9710668e-01 -2.7093157e-02 -2.7093157e-02 ... -2.7093157e-02\n", + " -2.7093157e-02 -2.0062150e-01]\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " ...\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01\n", + " -7.2153252e-01 -5.9123868e-01]\n", + " [ 2.6627803e-01 1.3488382e-03 1.3488382e-03 ... 
1.3488382e-03\n",
+    "    1.3488382e-03 -4.5465171e-01]]]]\n",
+    "    \n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.8.5"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/docs/programming_guide/source_en/initializer.md b/docs/programming_guide/source_en/initializer.md
index cd934f345f..0d0fcbbc8e 100644
--- a/docs/programming_guide/source_en/initializer.md
+++ b/docs/programming_guide/source_en/initializer.md
@@ -1,5 +1,113 @@
-# Initialization of Network Parameters
+# Initialization of Network Parameters
-No English version right now, welcome to contribute.
+
+
+
-
\ No newline at end of file
+
+
+### Overview
+
+MindSpore provides a weight initialization module. Users can initialize network parameters with a string, a subclass of `Initializer`, or a user-defined `Tensor`, either through the initialization arguments of encapsulated operators or through the `initializer` method. The `Initializer` class is the basic data structure used for initialization in MindSpore, and its subclasses cover several data distributions (`Zero`, `One`, `XavierUniform`, `HeUniform`, `HeNormal`, `Constant`, `Uniform`, `Normal`, `TruncatedNormal`). The two parameter initialization modes, encapsulated operators and the `initializer` method, are described in detail below.
+
+### Initializing Parameters by Encapsulated Operators
+
+MindSpore provides a variety of parameter initialization methods and encapsulates parameter initialization functionality in some operators. This section describes how to initialize parameters through operators that support parameter initialization. Taking the `Conv2d` operator as an example, it shows how to initialize parameters in the network with a string, a subclass of `Initializer`, or a user-defined `Tensor`. The following code examples use the `Initializer` subclass `Normal`; `Normal` can be replaced by any other subclass of `Initializer`.
+
+### String
+
+When a string is used to initialize network parameters, the content of the string must match the name of a subclass of `Initializer`, and the default arguments of that subclass are used. For example, using the string 'Normal' is equivalent to using the subclass `Normal()`. The code example is as follows:
+
+#### Input
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore.common import set_seed
+
+set_seed(1)
+
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init='Normal')
+output = net(input_data)
+print(output)
+
+#### Output
+[[[[ 3.10382620e-02 4.38603461e-02 4.38603461e-02 ... 4.38603461e-02
+    4.38603461e-02 1.38719045e-02]
+  [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+    3.54298912e-02 -5.54019120e-03]
+  [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+    3.54298912e-02 -5.54019120e-03]
+  ...
+  [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+    3.54298912e-02 -5.54019120e-03]
+  [ 3.26051228e-02 3.54298912e-02 3.54298912e-02 ... 3.54298912e-02
+    3.54298912e-02 -5.54019120e-03]
+  [ 9.66199022e-03 1.24104535e-02 1.24104535e-02 ... 1.24104535e-02
+    1.24104535e-02 -1.38977719e-02]]
+
+  ...
+
+  [[ 3.98553275e-02 -1.35465711e-03 -1.35465711e-03 ... -1.35465711e-03
+    -1.35465711e-03 -1.00310734e-02]
+   [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+    -3.60766202e-02 -2.95619294e-02]
+   [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+    -3.60766202e-02 -2.95619294e-02]
+   ...
+   [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+    -3.60766202e-02 -2.95619294e-02]
+   [ 4.38403059e-03 -3.60766202e-02 -3.60766202e-02 ... -3.60766202e-02
+    -3.60766202e-02 -2.95619294e-02]
+   [ 1.33139016e-02 6.74417242e-05 6.74417242e-05 ... 6.74417242e-05
+    6.74417242e-05 -2.27325838e-02]]]]
+
+### Subclasses of Initializer
+
+Initializing network parameters with a subclass of `Initializer` is similar to initializing them with a string. The difference is that a string always uses the default arguments of the corresponding subclass; to pass your own arguments, you must use the `Initializer` subclass directly. Taking `Normal(0.2)` as an example, the code example is as follows:
+
+#### Input
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+from mindspore.common import set_seed
+from mindspore.common.initializer import Normal
+
+set_seed(1)
+
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+net = nn.Conv2d(3, 64, 3, weight_init=Normal(0.2))
+output = net(input_data)
+print(output)
+
+#### Output
+[[[[ 6.2076533e-01 8.7720710e-01 8.7720710e-01 ... 8.7720710e-01
+    8.7720710e-01 2.7743810e-01]
+  [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+    7.0859784e-01 -1.1080378e-01]
+  [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+    7.0859784e-01 -1.1080378e-01]
+  ...
+  [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+    7.0859784e-01 -1.1080378e-01]
+  [ 6.5210247e-01 7.0859784e-01 7.0859784e-01 ... 7.0859784e-01
+    7.0859784e-01 -1.1080378e-01]
+  [ 1.9323981e-01 2.4820906e-01 2.4820906e-01 ... 2.4820906e-01
+    2.4820906e-01 -2.7795550e-01]]
+
+  ...
+
+  [[ 7.9710668e-01 -2.7093157e-02 -2.7093157e-02 ... -2.7093157e-02
+    -2.7093157e-02 -2.0062150e-01]
+   [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+    -7.2153252e-01 -5.9123868e-01]
+   [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+    -7.2153252e-01 -5.9123868e-01]
+   ...
+   [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+    -7.2153252e-01 -5.9123868e-01]
+   [ 8.7680638e-02 -7.2153252e-01 -7.2153252e-01 ... -7.2153252e-01
+    -7.2153252e-01 -5.9123868e-01]
+   [ 2.6627803e-01 1.3488382e-03 1.3488382e-03 ... 1.3488382e-03
+    1.3488382e-03 -4.5465171e-01]]]]
+
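+### Tensor
+
+The overview also lists a user-defined `Tensor` as an initialization method. The following is only a minimal sketch of that usage for the same `Conv2d(3, 64, 3)` network, assuming the conventional weight shape `[out_channels, in_channels, kernel_height, kernel_width]`; the constant value 0.02 is an arbitrary illustration.
+
+#### Input
+import numpy as np
+import mindspore.nn as nn
+from mindspore import Tensor
+
+# Build an explicit weight tensor shaped [out_channels, in_channels, kernel_height, kernel_width].
+weight = Tensor(np.full([64, 3, 3, 3], 0.02, dtype=np.float32))
+input_data = Tensor(np.ones([1, 3, 16, 50], dtype=np.float32))
+
+# Pass the tensor directly to weight_init; the convolution then starts from exactly these values.
+net = nn.Conv2d(3, 64, 3, weight_init=weight)
+output = net(input_data)
+
+# Print only the shape; with the default pad_mode='same' it should be (1, 64, 16, 50).
+print(output.shape)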