diff --git a/docs/lite/api/source_en/api_java/class_list.md b/docs/lite/api/source_en/api_java/class_list.md
index 091ff4ed3dd9d5598e4879460258dd3d3798cbd5..9df51b5c8511b820f0a1aaf308f539d652df273a 100644
--- a/docs/lite/api/source_en/api_java/class_list.md
+++ b/docs/lite/api/source_en/api_java/class_list.md
@@ -15,3 +15,5 @@
 | com.mindspore.config | [DataType](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/DataType.java) | DataType defines the supported data types. | √ | √ |
 | com.mindspore.config | [Version](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/Version.java) | Version is used to obtain the version information of MindSpore. | ✕ | √ |
 | com.mindspore.config | [ModelType](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/ModelType.java) | ModelType defines the model file type. | √ | √ |
+| com.mindspore.config | [AscendDeviceInfo](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/AscendDeviceInfo.java) | The AscendDeviceInfo class is used to configure MindSpore Lite Ascend device options. | √ | ✕ |
+| com.mindspore.config | [TrainCfg](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/TrainCfg.java) | Configuration parameters used for model training on the device. | ✕ | √ |
diff --git a/docs/lite/api/source_en/api_java/model.md b/docs/lite/api/source_en/api_java/model.md
index 5ae835b760d6bc481e4890c8cbebf0f813401268..aca8789c3a4e0a7c3b17c91adb8c61aef86e53ee 100644
--- a/docs/lite/api/source_en/api_java/model.md
+++ b/docs/lite/api/source_en/api_java/model.md
@@ -26,6 +26,8 @@ Model defines model in MindSpore for compiling and running.
 | [MSTensor getOutputByTensorName(String tensorName)](#getoutputbytensorname) | √ | √ |
 | [List getOutputsByNodeName(String nodeName)](#getoutputsbynodename) | ✕ | √ |
 | [List getOutputTensorNames()](#getoutputtensornames) | √ | √ |
+| [boolean loadConfig(String configPath)](#loadconfig) | √ | √ |
+| [boolean updateConfig(String section, HashMap config)](#updateconfig) | √ | √ |
 | [boolean export(String fileName, int quantizationType, boolean isOnlyExportInfer,List outputTensorNames)](#export) | ✕ | √ |
 | [boolean exportWeightsCollaborateWithMicro(String weightFile, boolean isInference,boolean enableFp16, List changeableWeightNames)](#exportweightscollaboratewithmicro) | ✕ | √ |
 | [List getFeatureMaps()](#getfeaturemaps) | ✕ | √ |
@@ -250,6 +252,39 @@ Get the MSTensors output of MindSpore model by the tensor name.
 
   MindSpore MSTensor.
 
+## loadConfig
+
+```java
+public boolean loadConfig(String configPath)
+```
+
+Load a configuration file.
+
+- Parameters
+
+  - `configPath`: the path of the configuration file.
+
+- Returns
+
+  Whether the configuration is loaded successfully.
+
+## updateConfig
+
+```java
+public boolean updateConfig(String section, HashMap config)
+```
+
+Update the configuration.
+
+- Parameters
+
+  - `section`: the section of the configuration to update.
+  - `config`: the configuration items to be updated.
+
+- Returns
+
+  Whether the configuration is updated successfully.
+
 ## export
 
 ```java
diff --git a/docs/lite/api/source_en/api_java/mscontext.md b/docs/lite/api/source_en/api_java/mscontext.md
index eefb70737081cb059672fe72f35e6f7cb7274911..f16f28d1bd88c30b34b2bdae2dd357111a983ae9 100644
--- a/docs/lite/api/source_en/api_java/mscontext.md
+++ b/docs/lite/api/source_en/api_java/mscontext.md
@@ -14,20 +14,21 @@ MSContext is defined for holding environment variables during runtime.
 | --------------------------------------------------------------------------------------------- |--------|--------|
 | [boolean init()](#init) | √ | √ |
 | [boolean init(int threadNum, int cpuBindMode)](#init) | √ | √ |
-| [boolean init(int threadNum, int cpuBindMode, boolean isEnableParallel)](#init) | ✕ | √ |
+| [boolean init(int threadNum, int cpuBindMode, boolean isEnableParallel)](#init) | √ | √ |
 | [boolean addDeviceInfo(int deviceType, boolean isEnableFloat16)](#adddeviceinfo) | √ | √ |
 | [boolean addDeviceInfo(int deviceType, boolean isEnableFloat16, int npuFreq)](#adddeviceinfo) | ✕ | √ |
+| [boolean addDeviceInfo(AscendDeviceInfo ascendDeviceInfo)](#adddeviceinfo) | √ | ✕ |
 | [void free()](#free) | √ | √ |
 | [long getMSContextPtr()](#getmscontextptr) | √ | √ |
 | [void setThreadNum(int threadNum)](#setenableparallel) | √ | √ |
 | [int getThreadNum()](#getenableparallel) | √ | √ |
-| [void setInterOpParallelNum(int parallelNum)](#setinteropparallelnum) | √ | √ |
-| [int getInterOpParallelNum()](#getinteropparallelnum) | √ | √ |
+| [void setInterOpParallelNum(int parallelNum)](#setinteropparallelnum) | √ | ✕ |
+| [int getInterOpParallelNum()](#getinteropparallelnum) | √ | ✕ |
 | [void setThreadAffinity(int mode)](#setthreadaffinity) | √ | √ |
 | [int getThreadAffinityMode()](#getthreadaffinitycorelist) | √ | √ |
 | [void setThreadAffinity(ArrayList coreList)](#setthreadaffinity-1) | √ | √ |
 | [ArrayList getThreadAffinityCoreList()](#getthreadaffinitycorelist) | √ | √ |
-| [void setEnableParallel(boolean isParallel)](#setenableparallel) | ✕ | √ |
+| [void setEnableParallel(boolean isParallel)](#setenableparallel) | √ | √ |
 | [boolean getEnableParallel()](#getenableparallel) | ✕ | √ |
 | [DeviceType](#devicetype) | √ | √ |
 | [CpuBindMode](#cpubindmode) | √ | √ |
@@ -108,6 +109,20 @@ Add device info for mscontext.
 
   Whether the device info add is successful.
 
+```java
+public boolean addDeviceInfo(AscendDeviceInfo ascendDeviceInfo)
+```
+
+Add device info for mscontext.
+
+- Parameters
+
+  - `ascendDeviceInfo`: Device info for the Ascend backend.
+
+- Returns
+
+  Whether the device info add is successful.
+
 ## getMSContextPtr
 
 ```java
diff --git a/docs/lite/api/source_en/api_java/mstensor.md b/docs/lite/api/source_en/api_java/mstensor.md
index 1e042de09dd8a48438624a84a1c2089c0dc4a998..b7fecbee3ec90d12e05531e80f8280172f46b2fb 100644
--- a/docs/lite/api/source_en/api_java/mstensor.md
+++ b/docs/lite/api/source_en/api_java/mstensor.md
@@ -16,10 +16,12 @@ MSTensor defined tensor in MindSpore.
 | [MSTensor createTensor(String tensorName, Object obj)](#createtensor) | √ | √ |
 | [int[] getShape()](#getshape) | √ | √ |
 | [int getDataType()](#getdatatype) | √ | √ |
+| [Object getData()](#getdata) | √ | √ |
 | [byte[] getByteData()](#getbytedata) | √ | √ |
 | [float[] getFloatData()](#getfloatdata) | √ | √ |
 | [int[] getIntData()](#getintdata) | √ | √ |
 | [long[] getLongData()](#getlongdata) | √ | √ |
+| [boolean setShape(int[] tensorShape)](#setshape) | √ | √ |
 | [boolean setData(byte[] data)](#setdata) | √ | √ |
 | [boolean setData(float[] data)](#setdata) | √ | √ |
 | [boolean setData(int[] data)](#setdata) | √ | √ |
@@ -29,6 +31,7 @@ MSTensor defined tensor in MindSpore.
 | [int elementsNum()](#elementsnum) | √ | √ |
 | [void free()](#free) | √ | √ |
 | [String tensorName()](#tensorname) | √ | √ |
+| [long getMSTensorPtr()](#getmstensorptr) | √ | √ |
 | [DataType](#datatype) | √ | √ |
 
 ## createTensor
@@ -89,6 +92,18 @@ DataType is defined in [com.mindspore.DataType](https://gitee.com/mindspore/mind
 
   The MindSpore data type of the MindSpore MSTensor class.
 
+## getData
+
+```java
+public Object getData()
+```
+
+Get the output data of MSTensor. The data type is the same as the type of the data that was set.
+
+- Returns
+
+  An array containing all MSTensor output data, with the same element type as the data that was set.
+
 ## getByteData
 
 ```java
@@ -137,6 +152,22 @@ Get output data of MSTensor, the data type is long.
 
   The long array containing all MSTensor output data.
 
+## setShape
+
+```java
+public boolean setShape(int[] tensorShape)
+```
+
+Set the shape of MSTensor.
+
+- Parameters
+
+  - `tensorShape`: The tensor shape, of int[] type.
+
+- Returns
+
+  Whether the shape is set successfully.
+
 ## setData
 
 ```java
@@ -253,6 +284,18 @@ Get tensor name.
 
   Tensor name.
 
+## getMSTensorPtr
+
+```java
+public long getMSTensorPtr()
+```
+
+Get the MSTensor pointer.
+
+- Returns
+
+  The MSTensor pointer.
+
 ## DataType
 
 ```java
diff --git a/docs/lite/api/source_zh_cn/api_java/class_list.md b/docs/lite/api/source_zh_cn/api_java/class_list.md
index 2a3c06972fda17b9bb24f46a02501dd021ae2db2..e9ae7e8699eab275a87d39ac8caa86d8ec09e666 100644
--- a/docs/lite/api/source_zh_cn/api_java/class_list.md
+++ b/docs/lite/api/source_zh_cn/api_java/class_list.md
@@ -13,5 +13,8 @@
 | com.mindspore.config | [CpuBindMode](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/CpuBindMode.java) | CpuBindMode定义了CPU绑定模式。 | √ | √ |
 | com.mindspore.config | [DeviceType](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/DeviceType.java) | DeviceType定义了后端设备类型。 | √ | √ |
 | com.mindspore.config | [DataType](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/DataType.java) | DataType定义了所支持的数据类型。 | √ | √ |
-| com.mindspore.config | [Version](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/Version.java) | Version用于获取MindSpore的版本信息。 | ✕ | √ |
+| com.mindspore.config | [Version](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/Version.java) | Version用于获取MindSpore的版本信息。 | √ | √ |
 | com.mindspore.config | [ModelType](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/ModelType.java) | ModelType 定义了模型文件的类型。 | √ | √ |
+| com.mindspore.config | [AscendDeviceInfo](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/AscendDeviceInfo.java) | MindSpore Lite用于昇腾硬件推理的配置参数。 | √ | ✕ |
+| com.mindspore.config | [TrainCfg](https://gitee.com/mindspore/mindspore-lite/blob/master/mindspore-lite/java/src/main/java/com/mindspore/config/TrainCfg.java) | 用于端上模型训练的配置参数。 | ✕ | √ |
+
diff --git a/docs/lite/api/source_zh_cn/api_java/model.md b/docs/lite/api/source_zh_cn/api_java/model.md
index b1223207a5d8035a3461d4e32d4ecb301babd98c..40f0e6eb69798079589222d55b9e21a2276bdb0f 100644
--- a/docs/lite/api/source_zh_cn/api_java/model.md
+++ b/docs/lite/api/source_zh_cn/api_java/model.md
@@ -12,10 +12,10 @@ Model定义了MindSpore中编译和运行的模型。
 
 | function | 云侧推理是否支持 | 端侧推理是否支持 |
 | ------------------------------------------------------------ |--------|--------|
-| [boolean build(final MappedByteBuffer buffer, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)](#build) | ✕ | √ |
+| [boolean build(final MappedByteBuffer buffer, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)](#build) | ✕ | √ |
 | [boolean build(Graph graph, MSContext context, TrainCfg cfg)](#build) | ✕ | √ |
 | [boolean build(MappedByteBuffer buffer, MSContext context)](#build) | √ | √ |
-| [boolean build(String modelPath, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)](#build) | ✕ | √ |
+| [boolean build(String modelPath, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)](#build) | ✕ | √ |
 | [boolean build(String modelPath, int modelType, MSContext context)](#build) | √ | √ |
 | [boolean predict()](#predict) | √ | √ |
 | [boolean runStep()](#runstep) | ✕ | √ |
@@ -26,6 +26,8 @@ Model定义了MindSpore中编译和运行的模型。
 | [MSTensor getOutputByTensorName(String tensorName)](#getoutputbytensorname) | √ | √ |
 | [List getOutputsByNodeName(String nodeName)](#getoutputsbynodename) | ✕ | √ |
 | [List getOutputTensorNames()](#getoutputtensornames) | √ | √ |
+| [boolean loadConfig(String configPath)](#loadconfig) | √ | √ |
+| [boolean updateConfig(String section, HashMap config)](#updateconfig) | √ | √ |
 | [boolean export(String fileName, int quantizationType, boolean isOnlyExportInfer,List outputTensorNames)](#export) | ✕ | √ |
 | [boolean exportWeightsCollaborateWithMicro(String weightFile, boolean isInference,boolean enableFp16, List changeableWeightNames)](#exportweightscollaboratewithmicro) | ✕ | √ |
 | [List getFeatureMaps()](#getfeaturemaps) | ✕ | √ |
@@ -250,6 +252,39 @@ public List getOutputTensorNames()
 
   按顺序排列的输出张量名组成的List。
 
+## loadConfig
+
+```java
+public boolean loadConfig(String configPath)
+```
+
+加载配置文件。
+
+- 参数
+
+  - `configPath`: 配置文件路径。
+
+- 返回值
+
+  加载配置是否成功。
+
+## updateConfig
+
+```java
+public boolean updateConfig(String section, HashMap config)
+```
+
+更新配置。
+
+- 参数
+
+  - `section`: 配置的分类。
+  - `config`: 需要更新的具体配置项。
+
+- 返回值
+
+  更新配置是否成功。
+
 ## export
 
 ```java
diff --git a/docs/lite/api/source_zh_cn/api_java/mscontext.md b/docs/lite/api/source_zh_cn/api_java/mscontext.md
index 516ba04684103a5fdea99b5686303d8c67a8227a..8c412ce4141965fe856d4c2d245749d2dab93fa1 100644
--- a/docs/lite/api/source_zh_cn/api_java/mscontext.md
+++ b/docs/lite/api/source_zh_cn/api_java/mscontext.md
@@ -14,20 +14,21 @@ MSContext类用于配置运行时的上下文配置。
 | --------------------------------------------------------------------------------------------- |--------|--------|
 | [boolean init()](#init) | √ | √ |
 | [boolean init(int threadNum, int cpuBindMode)](#init) | √ | √ |
-| [boolean init(int threadNum, int cpuBindMode, boolean isEnableParallel)](#init) | ✕ | √ |
+| [boolean init(int threadNum, int cpuBindMode, boolean isEnableParallel)](#init) | √ | √ |
 | [boolean addDeviceInfo(int deviceType, boolean isEnableFloat16)](#adddeviceinfo) | √ | √ |
 | [boolean addDeviceInfo(int deviceType, boolean isEnableFloat16, int npuFreq)](#adddeviceinfo) | ✕ | √ |
+| [boolean addDeviceInfo(AscendDeviceInfo ascendDeviceInfo)](#adddeviceinfo) | √ | ✕ |
 | [void free()](#free) | √ | √ |
 | [long getMSContextPtr()](#getmscontextptr) | √ | √ |
 | [void setThreadNum(int threadNum)](#setenableparallel) | √ | √ |
 | [int getThreadNum()](#getenableparallel) | √ | √ |
-| [void setInterOpParallelNum(int parallelNum)](#setinteropparallelnum) | √ | √ |
-| [int getInterOpParallelNum()](#getinteropparallelnum) | √ | √ |
+| [void setInterOpParallelNum(int parallelNum)](#setinteropparallelnum) | √ | ✕ |
+| [int getInterOpParallelNum()](#getinteropparallelnum) | √ | ✕ |
 | [void setThreadAffinity(int mode)](#setthreadaffinity) | √ | √ |
 | [int getThreadAffinityMode()](#getthreadaffinitycorelist) | √ | √ |
 | [void setThreadAffinity(ArrayList coreList)](#setthreadaffinity-1) | √ | √ |
 | [ArrayList getThreadAffinityCoreList()](#getthreadaffinitycorelist) | √ | √ |
-| [void setEnableParallel(boolean isParallel)](#setenableparallel) | ✕ | √ |
+| [void setEnableParallel(boolean isParallel)](#setenableparallel) | √ | √ |
 | [boolean getEnableParallel()](#getenableparallel) | ✕ | √ |
 | [DeviceType](#devicetype) | √ | √ |
 | [CpuBindMode](#cpubindmode) | √ | √ |
@@ -108,6 +109,20 @@ public boolean addDeviceInfo(int deviceType, boolean isEnableFloat16, int npuFre
 
   设备添加是否成功。
 
+```java
+public boolean addDeviceInfo(AscendDeviceInfo ascendDeviceInfo)
+```
+
+添加运行设备信息。
+
+- 参数
+
+  - `ascendDeviceInfo`: Ascend后端设备信息。
+
+- 返回值
+
+  设备添加是否成功。
+
 ## getMSContextPtr
 
 ```java
diff --git a/docs/lite/api/source_zh_cn/api_java/mstensor.md b/docs/lite/api/source_zh_cn/api_java/mstensor.md
index 9cca3b7e759666aad9f76d7df2e2878f0cb3ce30..371f5c68405b0dfcb21ceba590d455e9bf2250b1 100644
--- a/docs/lite/api/source_zh_cn/api_java/mstensor.md
+++ b/docs/lite/api/source_zh_cn/api_java/mstensor.md
@@ -16,10 +16,12 @@ MSTensor定义了MindSpore中的张量。
 | [MSTensor createTensor(String tensorName, Object obj)](#createtensor) | √ | √ |
 | [int[] getShape()](#getshape) | √ | √ |
 | [int getDataType()](#getdatatype) | √ | √ |
+| [Object getData()](#getdata) | √ | √ |
 | [byte[] getByteData()](#getbytedata) | √ | √ |
 | [float[] getFloatData()](#getfloatdata) | √ | √ |
 | [int[] getIntData()](#getintdata) | √ | √ |
 | [long[] getLongData()](#getlongdata) | √ | √ |
+| [boolean setShape(int[] tensorShape)](#setshape) | √ | √ |
 | [boolean setData(byte[] data)](#setdata) | √ | √ |
 | [boolean setData(float[] data)](#setdata) | √ | √ |
 | [boolean setData(int[] data)](#setdata) | √ | √ |
@@ -29,6 +31,7 @@ MSTensor定义了MindSpore中的张量。
 | [int elementsNum()](#elementsnum) | √ | √ |
 | [void free()](#free) | √ | √ |
 | [String tensorName()](#tensorname) | √ | √ |
+| [long getMSTensorPtr()](#getmstensorptr) | √ | √ |
 | [DataType](#datatype) | √ | √ |
 
 ## createTensor
@@ -89,6 +92,18 @@ DataType在[com.mindspore.DataType](https://gitee.com/mindspore/mindspore-lite/b
 
   MindSpore MSTensor类的MindSpore DataType。
 
+## getData
+
+```java
+public Object getData()
+```
+
+获取MSTensor的输出数据,数据类型与设置的数据类型相同。
+
+- 返回值
+
+  包含所有MSTensor输出数据的数组,元素类型与设置的数据类型相同。
+
 ## getByteData
 
 ```java
@@ -137,6 +152,22 @@ public long[] getLongData()
 
   包含所有MSTensor输出数据的long类型数组。
 
+## setShape
+
+```java
+public boolean setShape(int[] tensorShape)
+```
+
+设定MSTensor的形状。
+
+- 参数
+
+  - `tensorShape`: int[]类型的张量形状。
+
+- 返回值
+
+  设置形状是否成功。
+
 ## setData
 
 ```java
@@ -253,6 +284,18 @@ public String tensorName()
 
   tensor的名称。
 
+## getMSTensorPtr
+
+```java
+public long getMSTensorPtr()
+```
+
+获取MSTensor对象的指针。
+
+- 返回值
+
+  MSTensor对象的指针。
+
 ## DataType
 
 ```java
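
Since the hunks above only describe the new Java entries one at a time, a short end-to-end sketch may help reviewers see how they fit together. It is not part of the patch: the file paths, section name, config keys, tensor shape, and the class name `NewApiSketch` are placeholders, and it assumes the rest of the MindSpore Lite Java API (`Model.getInputs()`/`getOutputs()`, `ModelType.MT_MINDIR`, a default `AscendDeviceInfo` constructor) behaves as in the existing docs.

```java
import com.mindspore.MSTensor;
import com.mindspore.Model;
import com.mindspore.config.AscendDeviceInfo;
import com.mindspore.config.MSContext;
import com.mindspore.config.ModelType;

import java.util.HashMap;

public class NewApiSketch {
    public static void main(String[] args) {
        // Context with an Ascend device; per the support matrix above this path is cloud-side only.
        MSContext context = new MSContext();
        context.init();
        context.addDeviceInfo(new AscendDeviceInfo());

        Model model = new Model();

        // loadConfig / updateConfig: the path, section name, and keys below are placeholders.
        model.loadConfig("/path/to/config.ini");
        HashMap<String, String> config = new HashMap<>();
        config.put("example_key", "example_value");
        model.updateConfig("example_section", config);

        if (!model.build("/path/to/model.mindir", ModelType.MT_MINDIR, context)) {
            System.err.println("build failed");
            return;
        }

        // setShape illustrates the new setter; a real dynamic-shape run may also need
        // Model.resize, which this patch does not cover.
        MSTensor input = model.getInputs().get(0);
        input.setShape(new int[]{1, 3, 224, 224});
        input.setData(new float[1 * 3 * 224 * 224]);
        model.predict();

        // getData returns an Object whose element type matches the tensor's data type
        // (a float32 output is assumed here).
        MSTensor output = model.getOutputs().get(0);
        float[] result = (float[]) output.getData();
        System.out.println("first value: " + result[0] + ", tensor ptr: " + output.getMSTensorPtr());

        model.free();
        context.free();
    }
}
```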