diff --git a/docs/lite/api/source_en/api_java/graph.md b/docs/lite/api/source_en/api_java/graph.md
index 6535d6603c2807c5534ce644f0e01c27d6bf4153..5252e7f6fb36c36a36c6bf5fd24f0cb162ce7af2 100644
--- a/docs/lite/api/source_en/api_java/graph.md
+++ b/docs/lite/api/source_en/api_java/graph.md
@@ -19,7 +19,7 @@ Graph defines computational graph in MindSpore.
 ## load
 
 ```java
- boolean load(String file)
+public boolean load(String file)
 ```
 
 Load the MindSpore model from file.
diff --git a/docs/lite/api/source_en/api_java/model.md b/docs/lite/api/source_en/api_java/model.md
index 2fbfc1d816cc908e7c30aed06fadf078aef099eb..4a612af12813c7e03fdcd65dfa47fb9c6ade4095 100644
--- a/docs/lite/api/source_en/api_java/model.md
+++ b/docs/lite/api/source_en/api_java/model.md
@@ -12,17 +12,17 @@ Model defines model in MindSpore for compiling and running.
 | function | Supported At Cloud-side Inference | Supported At Device-side Inference |
 | ------------------------------------------------------------ |--------|--------|
-| [boolean build(MappedByteBuffer buffer, int modelType, MSContext context, char[] dec_key, String dec_mode)](#build) | ✕ | √ |
+| [boolean build(final MappedByteBuffer buffer, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)](#build) | ✕ | √ |
 | [boolean build(Graph graph, MSContext context, TrainCfg cfg)](#build) | ✕ | √ |
 | [boolean build(MappedByteBuffer buffer, MSContext context)](#build) | √ | √ |
-| [boolean build(String modelPath, MSContext context, char[] dec_key, String dec_mode)](#build) | ✕ | √ |
-| [boolean build(String modelPath, MSContext context)](#build) | √ | √ |
+| [boolean build(String modelPath, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)](#build) | ✕ | √ |
+| [boolean build(String modelPath, int modelType, MSContext context)](#build) | √ | √ |
 | [boolean predict()](#predict) | √ | √ |
 | [boolean runStep()](#runstep) | ✕ | √ |
 | [boolean resize(List inputs, int[][] dims)](#resize) | √ | √ |
 | [List getInputs()](#getinputs) | √ | √ |
 | [List getOutputs()](#getoutputs) | √ | √ |
-| [MSTensor getInputsByTensorName(String tensorName)](#getinputsbytensorname) | √ | √ |
+| [MSTensor getInputByTensorName(String tensorName)](#getinputbytensorname) | √ | √ |
 | [MSTensor getOutputByTensorName(String tensorName)](#getoutputbytensorname) | √ | √ |
 | [List getOutputsByNodeName(String nodeName)](#getoutputsbynodename) | ✕ | √ |
 | [List getOutputTensorNames()](#getoutputtensornames) | √ | √ |
@@ -56,7 +56,7 @@ Compile MindSpore model by computational graph.
 Whether the build is successful.
 
 ```java
-public boolean build(MappedByteBuffer buffer, int modelType, MSContext context, char[] dec_key, String dec_mode)
+public boolean build(final MappedByteBuffer buffer, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)
 ```
 
 Compile MindSpore model by computational graph buffer.
@@ -66,8 +66,9 @@ Compile MindSpore model by computational graph buffer.
     - `buffer`: computational graph buffer.
     - `modelType`: computational graph type. Optionally, there are `MT_MINDIR_LITE`, `MT_MINDIR`, corresponding to the `ms` model (exported by the `converter_lite` tool) and the `mindir` model (exported by MindSpore or exported by the `converter_lite` tool), respectively. Only MT_MINDIR_LITE is valid for Device-side Inference and the parameter value is ignored. Cloud-side inference supports `ms` and `mindir` model inference, which requires setting the parameter to the option value corresponding to the model. Cloud-side inference support for `ms` model will be removed in future iterations, and cloud-side inference via `mindir` model is recommended.
     - `context`: compile context.
-    - `dec_key`: define the key used to decrypt the ciphertext model. The key length is 16, 24, or 32.
-    - `dec_mode`: define the decryption mode. Options: AES-GCM, AES-CBC.
+    - `decKey`: define the key used to decrypt the ciphertext model. The key length is 16, 24, or 32.
+    - `decMode`: define the decryption mode. Options: AES-GCM, AES-CBC.
+    - `croptoLibPath`: define the OpenSSL library path.
 
 - Returns
@@ -90,7 +91,7 @@ Compile MindSpore model by computational graph buffer, the default is MindIR mod
 Whether the build is successful.
 
 ```java
-public boolean build(String modelPath, int modelType, MSContext context, char[] dec_key, String dec_mode)
+public boolean build(String modelPath, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)
 ```
 
 Compile MindSpore model by computational graph file.
@@ -100,8 +101,9 @@ Compile MindSpore model by computational graph file.
     - `modelPath`: computational graph file.
    - `modelType`: computational graph type. Optionally, there are `MT_MINDIR_LITE`, `MT_MINDIR`, corresponding to the `ms` model (exported by the `converter_lite` tool) and the `mindir` model (exported by MindSpore or exported by the `converter_lite` tool), respectively. Only MT_MINDIR_LITE is valid for Device-side Inference and the parameter value is ignored. Cloud-side inference supports `ms` and `mindir` model inference, which requires setting the parameter to the option value corresponding to the model. Cloud-side inference support for `ms` model will be removed in future iterations, and cloud-side inference via `mindir` model is recommended.
     - `context`: compile context.
-    - `dec_key`: define the key used to decrypt the ciphertext model. The key length is 16, 24, or 32.
-    - `dec_mode`: define the decryption mode. Options: AES-GCM, AES-CBC.
+    - `decKey`: define the key used to decrypt the ciphertext model. The key length is 16, 24, or 32.
+    - `decMode`: define the decryption mode. Options: AES-GCM, AES-CBC.
+    - `croptoLibPath`: define the OpenSSL library path.
 
 - Returns
@@ -188,10 +190,10 @@ Get the output MSTensors of MindSpore model.
 The MindSpore MSTensor list.
 
-## getInputsByTensorName
+## getInputByTensorName
 
 ```java
-public MSTensor getInputsByTensorName(String tensorName)
+public MSTensor getInputByTensorName(String tensorName)
 ```
 
 Get the input MSTensors of MindSpore model by the tensor name.
@@ -394,4 +396,5 @@ public static final int MT_AIR = 1;
 public static final int MT_OM = 2;
 public static final int MT_ONNX = 3;
 public static final int MT_MINDIR_LITE = 4;
+public static final int MT_MINDIR_OPT = MT_MINDIR_LITE;
 ```
diff --git a/docs/lite/api/source_en/api_java/mscontext.md b/docs/lite/api/source_en/api_java/mscontext.md
index 70d77c483f59ac60baa42b232328ffeb0fbe2053..749eccc8c58f985cb9f7de8c2448ef83ac9f068b 100644
--- a/docs/lite/api/source_en/api_java/mscontext.md
+++ b/docs/lite/api/source_en/api_java/mscontext.md
@@ -93,7 +93,7 @@ Add device info for mscontext.
 Whether the device info add is successful.
 
 ```java
-boolean addDeviceInfo(int deviceType, boolean isEnableFloat16, int npuFreq)
+public boolean addDeviceInfo(int deviceType, boolean isEnableFloat16, int npuFreq)
 ```
 
 Add device info for mscontext.
@@ -131,7 +131,7 @@ Free the memory allocated for MSContext.
 ## setThreadNum
 
 ```java
-void setThreadNum(int threadNum)
+public void setThreadNum(int threadNum)
 ```
 
 Sets the number of threads at runtime.
@@ -144,7 +144,7 @@ If MSContext is not initialized, this function will do nothing and output null p
 ## getThreadNum
 
 ```java
-void int getThreadNum()
+public int getThreadNum()
 ```
 
 Get the current thread number setting.
@@ -157,7 +157,7 @@ If MSContext is not initialized, this function will do nothing and output null p
 ## setInterOpParallelNum
 
 ```java
-void setInterOpParallelNum(int parallelNum)
+public void setInterOpParallelNum(int parallelNum)
 ```
 
 Set the parallel number of operators at runtime.
@@ -170,7 +170,7 @@ If MSContext is not initialized, this function will do nothing and output null p
 ## getInterOpParallelNum
 
 ```java
-int getInterOpParallelNum()
+public int getInterOpParallelNum()
 ```
 
 Get the current operators parallel number setting.
@@ -183,7 +183,7 @@ If MSContext is not initialized, this function will return -1 and output null po
 ## setThreadAffinity
 
 ```java
-void setThreadAffinity(int mode)
+public void setThreadAffinity(int mode)
 ```
 
 Set the thread affinity to CPU cores.
@@ -196,7 +196,7 @@ If MSContext is not initialized, this function will do nothing and output null p
 ## getThreadAffinityMode
 
 ```java
- int getThreadAffinityMode()
+public int getThreadAffinityMode()
 ```
 
 Get the thread affinity of CPU cores.
@@ -209,7 +209,7 @@ If MSContext is not initialized, this function will return -1 and output null po
 ## setThreadAffinity
 
 ```java
-void setThreadAffinity(ArrayList coreList)
+public void setThreadAffinity(ArrayList coreList)
 ```
 
 Set the thread lists to CPU cores. If two different `setThreadAffinity` are set for a single MSContext at the same time, only `coreList` will take effect and `mode` will not.
@@ -222,7 +222,7 @@ If MSContext is not initialized, this function will do nothing and output null p
 ## getThreadAffinityCoreList
 
 ```java
-ArrayList getThreadAffinityCoreList()
+public ArrayList getThreadAffinityCoreList()
 ```
 
 Get the thread lists of CPU cores.
@@ -235,7 +235,7 @@ If MSContext is not initialized, this function will retutn an empty Arraylist an
 ## setEnableParallel
 
 ```java
-void setEnableParallel(boolean isParallel)
+public void setEnableParallel(boolean isParallel)
 ```
 
 Set the status whether to perform model inference or training in parallel.
@@ -248,7 +248,7 @@ If MSContext is not initialized, this function will do nothing and output null p
 ## getEnableParallel
 
 ```java
-boolean getEnableParallel()
+public boolean getEnableParallel()
 ```
 
 Get the status whether to perform model inference or training in parallel.
@@ -272,6 +272,7 @@ Define device type.
 public static final int DT_CPU = 0;
 public static final int DT_GPU = 1;
 public static final int DT_NPU = 2;
+public static final int DT_ASCEND = 3;
 ```
 
 The value of DeviceType is 0, and the specified device type is CPU.
@@ -280,6 +281,8 @@ The value of DeviceType is 1, and the specified device type is GPU.
 
 The value of DeviceType is 2, and the specified device type is NPU.
 
+The value of DeviceType is 3, and the specified device type is Ascend.
+
 ## CpuBindMode
 
 ```java
diff --git a/docs/lite/api/source_en/api_java/mstensor.md b/docs/lite/api/source_en/api_java/mstensor.md
index 8877b15bfc3ddb97e3c28b9ff21074fd67189074..8ccaffbbecefa1c1ef149a285e04c7b182488acc 100644
--- a/docs/lite/api/source_en/api_java/mstensor.md
+++ b/docs/lite/api/source_en/api_java/mstensor.md
@@ -20,11 +20,11 @@ MSTensor defined tensor in MindSpore.
 | [float[] getFloatData()](#getfloatdata) | √ | √ |
 | [int[] getIntData()](#getintdata) | √ | √ |
 | [long[] getLongData()](#getlongdata) | √ | √ |
-| [void setData(byte[] data)](#setdata) | √ | √ |
-| [void setData(float[] data)](#setdata) | √ | √ |
-| [void setData(int[] data)](#setdata) | √ | √ |
-| [void setData(long[] data)](#setdata) | √ | √ |
-| [void setData(ByteBuffer data)](#setdata) | √ | √ |
+| [boolean setData(byte[] data)](#setdata) | √ | √ |
+| [boolean setData(float[] data)](#setdata) | √ | √ |
+| [boolean setData(int[] data)](#setdata) | √ | √ |
+| [boolean setData(long[] data)](#setdata) | √ | √ |
+| [boolean setData(ByteBuffer data)](#setdata) | √ | √ |
 | [long size()](#size) | √ | √ |
 | [int elementsNum()](#elementsnum) | √ | √ |
 | [void free()](#free) | √ | √ |
@@ -140,7 +140,7 @@ Get output data of MSTensor, the data type is long.
 ## setData
 
 ```java
-public void setData(byte[] data)
+public boolean setData(byte[] data)
 ```
 
 Set the input data of MSTensor.
@@ -149,8 +149,12 @@ Set the input data of MSTensor.
     - `data`: Input data of byte[] type.
 
+- Returns
+
+  Whether the data is set successfully.
+
 ```java
-public void setData(float[] data)
+public boolean setData(float[] data)
 ```
 
 Set the input data of MSTensor.
@@ -159,8 +163,12 @@ Set the input data of MSTensor.
     - `data`: Input data of float[] type.
 
+- Returns
+
+  Whether the data is set successfully.
+
 ```java
-public void setData(int[] data)
+public boolean setData(int[] data)
 ```
 
 Set the input data of MSTensor.
@@ -169,8 +177,12 @@ Set the input data of MSTensor.
     - `data`: Input data of int[] type.
 
+- Returns
+
+  Whether the data is set successfully.
+
 ```java
-public void setData(long[] data)
+public boolean setData(long[] data)
 ```
 
 Set the input data of MSTensor.
@@ -179,8 +191,12 @@ Set the input data of MSTensor.
     - `data`: Input data of long[] type.
 
+- Returns
+
+  Whether the data is set successfully.
+
 ```java
-public void setData(ByteBuffer data)
+public boolean setData(ByteBuffer data)
 ```
 
 Set the input data of MSTensor.
@@ -189,6 +205,10 @@ Set the input data of MSTensor.
     - `data`: Input data of ByteBuffer type.
 
+- Returns
+
+  Whether the data is set successfully.
+
 ## size
 
 ```java
diff --git a/docs/lite/api/source_en/api_java/runner_config.md b/docs/lite/api/source_en/api_java/runner_config.md
index 50dceb8bb35a7be8ca28b59fa4a952b30b2918f1..e41024fc5d01162503911583942492c4c8d71298 100644
--- a/docs/lite/api/source_en/api_java/runner_config.md
+++ b/docs/lite/api/source_en/api_java/runner_config.md
@@ -10,9 +10,9 @@ RunnerConfig defines the configuration parameters of MindSpore Lite concurrent i
 | ------------------------------------------------------------ |--------|--------|
 | [boolean init()](#init) | √ | ✕ |
 | [public boolean init(MSContext msContext)](#init) | √ | ✕ |
-| [public void setWorkerNum(int workerNum)](#setworkernum) | √ | ✕ |
+| [public void setWorkersNum(int workersNum)](#setworkersnum) | √ | ✕ |
 | [public void setConfigInfo(String section, HashMap config)](#setconfiginfo) | √ | ✕ |
-| [public void setConfigPath(String configPath)](#setconfigpath) | √ | ✕ |
+| [public void setConfigPath(String config_path)](#setconfigpath) | √ | ✕ |
 | [void getConfigPath()](#getconfigpath) | √ | ✕ |
 | [long getRunnerConfigPtr()](#getrunnerconfigptr) | √ | ✕ |
 | [public void setDeviceIds(ArrayList deviceIds)](#setdeviceids) | √ | ✕ |
@@ -44,17 +44,17 @@ Configuration parameter initialization for parallel inference.
 Whether the initialization is successful.
 
-## setWorkerNum
+## setWorkersNum
 
 ```java
-public void setWorkerNum(int workerNum)
+public void setWorkersNum(int workersNum)
 ```
 
 The parameter setting of the number of models in parallel inference.
 
 - Parameters
 
-    - `workerNum`: Set the number of models in the configuration.
+    - `workersNum`: Set the number of models in the configuration.
 
 ## setConfigInfo
@@ -72,14 +72,14 @@ Model configuration parameter settings in parallel inference.
 ## setConfigPath
 
 ```java
-public void setConfigPath(String configPath)
+public void setConfigPath(String config_path)
 ```
 
 Set the configuration file path in concurrent inference.
 
 - Parameters
 
-    - `configPath`: config path.
+    - `config_path`: config path.
 
 ## getConfigPath
diff --git a/docs/lite/api/source_zh_cn/api_java/graph.md b/docs/lite/api/source_zh_cn/api_java/graph.md
index 1ee7c07dd20bbf097e39ccbbf585f0f2d8faaee4..ad0bcb722aa84adb03d73b3ff5ff1f3b385fcbc5 100644
--- a/docs/lite/api/source_zh_cn/api_java/graph.md
+++ b/docs/lite/api/source_zh_cn/api_java/graph.md
@@ -19,7 +19,7 @@ Graph定义了MindSpore的计算图。
 ## load
 
 ```java
- boolean load(String file)
+public boolean load(String file)
 ```
 
 从指定文件加载MindSpore模型。
diff --git a/docs/lite/api/source_zh_cn/api_java/model.md b/docs/lite/api/source_zh_cn/api_java/model.md
index 833d4d1fde5dab1d89f0f9d7cace2f8ee824d235..f9169d826838f05d165ab3b834ed823d5e63826a 100644
--- a/docs/lite/api/source_zh_cn/api_java/model.md
+++ b/docs/lite/api/source_zh_cn/api_java/model.md
@@ -12,17 +12,17 @@ Model定义了MindSpore中编译和运行的模型。
 | function | 云侧推理是否支持 | 端侧推理是否支持 |
 | ------------------------------------------------------------ |--------|--------|
-| [boolean build(MappedByteBuffer buffer, int modelType, MSContext context, char[] dec_key, String dec_mode)](#build) | ✕ | √ |
+| [boolean build(final MappedByteBuffer buffer, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)](#build) | ✕ | √ |
 | [boolean build(Graph graph, MSContext context, TrainCfg cfg)](#build) | ✕ | √ |
 | [boolean build(MappedByteBuffer buffer, MSContext context)](#build) | √ | √ |
-| [boolean build(String modelPath, MSContext context, char[] dec_key, String dec_mode)](#build) | ✕ | √ |
-| [boolean build(String modelPath, MSContext context)](#build) | √ | √ |
+| [boolean build(String modelPath, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)](#build) | ✕ | √ |
+| [boolean build(String modelPath, int modelType, MSContext context)](#build) | √ | √ |
 | [boolean predict()](#predict) | √ | √ |
 | [boolean runStep()](#runstep) | ✕ | √ |
 | [boolean resize(List inputs, int[][] dims)](#resize) | √ | √ |
 | [List getInputs()](#getinputs) | √ | √ |
 | [List getOutputs()](#getoutputs) | √ | √ |
-| [MSTensor getInputsByTensorName(String tensorName)](#getinputsbytensorname) | √ | √ |
+| [MSTensor getInputByTensorName(String tensorName)](#getinputbytensorname) | √ | √ |
 | [MSTensor getOutputByTensorName(String tensorName)](#getoutputbytensorname) | √ | √ |
 | [List getOutputsByNodeName(String nodeName)](#getoutputsbynodename) | ✕ | √ |
 | [List getOutputTensorNames()](#getoutputtensornames) | √ | √ |
@@ -56,7 +56,7 @@ public boolean build(Graph graph, MSContext context, TrainCfg cfg)
 是否编译成功。
 
 ```java
-public boolean build(MappedByteBuffer buffer, int modelType, MSContext context, char[] dec_key, String dec_mode)
+public boolean build(final MappedByteBuffer buffer, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)
 ```
 
 通过模型计算图内存块编译MindSpore模型。
@@ -66,8 +66,9 @@ public boolean build(MappedByteBuffer buffer, int modelType, MSContext context,
     - `buffer`: 模型计算图内存块。
    - `modelType`: 模型计算图类型,可选有`MT_MINDIR_LITE`、`MT_MINDIR`,分别对应`ms`模型(`converter_lite`工具导出)和`mindir`模型(MindSpore导出或`converter_lite`工具导出)。端侧推理只支持`ms`模型推理,该入参值被忽略。云端推理支持`ms`和`mindir`模型推理,需要将该参数设置为模型对应的选项值。云侧推理对`ms`模型的支持,将在未来的迭代中删除,推荐通过`mindir`模型进行云侧推理。
     - `context`: 运行时Context上下文。
-    - `dec_key`: 模型解密秘钥。
-    - `dec_mode`: 模型解密算法,可选AES-GCM、AES-CBC。
+    - `decKey`: 模型解密秘钥。
+    - `decMode`: 模型解密算法,可选AES-GCM、AES-CBC。
+    - `croptoLibPath`: OpenSSL库路径。
 
 - 返回值
@@ -90,7 +91,7 @@ public boolean build(final MappedByteBuffer buffer, int modelType, MSContext con
 是否编译成功。
 
 ```java
-public boolean build(String modelPath, int modelType, MSContext context, char[] dec_key, String dec_mode)
+public boolean build(String modelPath, int modelType, MSContext context, char[] decKey, String decMode, String croptoLibPath)
 ```
 
 通过模型计算图文件编译MindSpore MindIR模型。
@@ -100,8 +101,9 @@ public boolean build(String modelPath, int modelType, MSContext context, char[]
     - `modelPath`: 模型计算图文件。
    - `modelType`: 模型计算图类型,可选有`MT_MINDIR_LITE`、`MT_MINDIR`,分别对应`ms`模型(`converter_lite`工具导出)和`mindir`模型(MindSpore导出或`converter_lite`工具导出)。端侧推理只支持`ms`模型推理,该入参值被忽略。云端推理支持`ms`和`mindir`模型推理,需要将该参数设置为模型对应的选项值。云侧推理对`ms`模型的支持,将在未来的迭代中删除,推荐通过`mindir`模型进行云侧推理。
     - `context`: 运行时Context上下文。
-    - `dec_key`: 模型解密秘钥。
-    - `dec_mode`: 模型解密算法,可选AES-GCM、AES-CBC。
+    - `decKey`: 模型解密秘钥。
+    - `decMode`: 模型解密算法,可选AES-GCM、AES-CBC。
+    - `croptoLibPath`: OpenSSL库路径。
 
 - 返回值
@@ -188,10 +190,10 @@ public List getOutputs()
 所有输出MSTensor组成的List。
 
-## getInputsByTensorName
+## getInputByTensorName
 
 ```java
-public MSTensor getInputsByTensorName(String tensorName)
+public MSTensor getInputByTensorName(String tensorName)
 ```
 
 通过张量名获取MindSpore模型的输入张量。
@@ -398,4 +400,5 @@ public static final int MT_AIR = 1;
 public static final int MT_OM = 2;
 public static final int MT_ONNX = 3;
 public static final int MT_MINDIR_LITE = 4;
+public static final int MT_MINDIR_OPT = MT_MINDIR_LITE;
 ```
diff --git a/docs/lite/api/source_zh_cn/api_java/mscontext.md b/docs/lite/api/source_zh_cn/api_java/mscontext.md
index b1d00999149140fa8338299ee15c9c4d277b8d04..459640adbad9a97fbe094d48ccede51cd7dd398b 100644
--- a/docs/lite/api/source_zh_cn/api_java/mscontext.md
+++ b/docs/lite/api/source_zh_cn/api_java/mscontext.md
@@ -78,7 +78,7 @@ public boolean init(int threadNum, int cpuBindMode, boolean isEnableParallel)
 ## addDeviceInfo
 
 ```java
-boolean addDeviceInfo(int deviceType, boolean isEnableFloat16)
+public boolean addDeviceInfo(int deviceType, boolean isEnableFloat16)
 ```
 
 添加运行设备信息。
@@ -93,7 +93,7 @@ boolean addDeviceInfo(int deviceType, boolean isEnableFloat16)
 设备添加是否成功。
 
 ```java
-boolean addDeviceInfo(int deviceType, boolean isEnableFloat16, int npuFreq)
+public boolean addDeviceInfo(int deviceType, boolean isEnableFloat16, int npuFreq)
 ```
 
 添加运行设备信息。
@@ -131,7 +131,7 @@ public void free()
 ## setThreadNum
 
 ```java
-void setThreadNum(int threadNum)
+public void setThreadNum(int threadNum)
 ```
 
 设置运行时的线程数量。
@@ -144,7 +144,7 @@ void setThreadNum(int threadNum)
 ## getThreadNum
 
 ```java
-int getThreadNum()
+public int getThreadNum()
 ```
 
 获取当前MSContext的线程数量设置,该选项仅MindSpore Lite有效。
@@ -157,7 +157,7 @@ int getThreadNum()
 ## setInterOpParallelNum
 
 ```java
-void setInterOpParallelNum(int parallelNum)
+public void setInterOpParallelNum(int parallelNum)
 ```
 
 设置运行时的算子并行推理数目。
@@ -170,7 +170,7 @@ void setInterOpParallelNum(int parallelNum)
 ## getInterOpParallelNum
 
 ```java
-int getInterOpParallelNum()
+public int getInterOpParallelNum()
 ```
 
 获取当前算子并行数设置。
@@ -183,7 +183,7 @@ int getInterOpParallelNum()
 ## setThreadAffinity
 
 ```java
-void setThreadAffinity(int mode)
+public void setThreadAffinity(int mode)
 ```
 
 设置运行时的CPU绑核策略。
@@ -196,7 +196,7 @@ void setThreadAffinity(int mode)
 ## getThreadAffinityMode
 
 ```java
- int getThreadAffinityMode()
+public int getThreadAffinityMode()
 ```
 
 获取当前CPU绑核策略。
@@ -209,7 +209,7 @@ void setThreadAffinity(int mode)
 ## setThreadAffinity
 
 ```java
-void setThreadAffinity(ArrayList coreList)
+public void setThreadAffinity(ArrayList coreList)
 ```
 
 设置运行时的CPU绑核列表,如果同时调用了两个不同的`SetThreadAffinity`函数来设置同一个的MSContext,仅`coreList`生效,而`mode`不生效。该选项仅MindSpore Lite有效。
@@ -222,7 +222,7 @@ void setThreadAffinity(ArrayList coreList)
 ## getThreadAffinityCoreList
 
 ```java
-ArrayList getThreadAffinityCoreList()
+public ArrayList getThreadAffinityCoreList()
 ```
 
 获取当前CPU绑核列表。
@@ -235,7 +235,7 @@ ArrayList getThreadAffinityCoreList()
 ## setEnableParallel
 
 ```java
-void setEnableParallel(boolean isParallel)
+public void setEnableParallel(boolean isParallel)
 ```
 
 设置运行时是否使能异构并行。
@@ -248,7 +248,7 @@ void setEnableParallel(boolean isParallel)
 ## getEnableParallel
 
 ```java
-boolean getEnableParallel()
+public boolean getEnableParallel()
 ```
 
 获取当前是否使能异构并行。
@@ -272,6 +272,7 @@ import com.mindspore.config.DeviceType;
 public static final int DT_CPU = 0;
 public static final int DT_GPU = 1;
 public static final int DT_NPU = 2;
+public static final int DT_ASCEND = 3;
 ```
 
 DeviceType的值为0,指定设备类型为CPU。
@@ -280,6 +281,8 @@ DeviceType的值为1,指定设备类型为GPU。
 
 DeviceType的值为2,指定设备类型为NPU。
 
+DeviceType的值为3,指定设备类型为Ascend。
+
 ## CpuBindMode
 
 ```java
diff --git a/docs/lite/api/source_zh_cn/api_java/mstensor.md b/docs/lite/api/source_zh_cn/api_java/mstensor.md
index d29266a1026c6b3e4d3711a3621fc13de4e8fa32..1d620a044fb6a0c7b663a2add636c323a7d99bdd 100644
--- a/docs/lite/api/source_zh_cn/api_java/mstensor.md
+++ b/docs/lite/api/source_zh_cn/api_java/mstensor.md
@@ -20,11 +20,11 @@ MSTensor定义了MindSpore中的张量。
 | [float[] getFloatData()](#getfloatdata) | √ | √ |
 | [int[] getIntData()](#getintdata) | √ | √ |
 | [long[] getLongData()](#getlongdata) | √ | √ |
-| [void setData(byte[] data)](#setdata) | √ | √ |
-| [void setData(float[] data)](#setdata) | √ | √ |
-| [void setData(int[] data)](#setdata) | √ | √ |
-| [void setData(long[] data)](#setdata) | √ | √ |
-| [void setData(ByteBuffer data)](#setdata) | √ | √ |
+| [boolean setData(byte[] data)](#setdata) | √ | √ |
+| [boolean setData(float[] data)](#setdata) | √ | √ |
+| [boolean setData(int[] data)](#setdata) | √ | √ |
+| [boolean setData(long[] data)](#setdata) | √ | √ |
+| [boolean setData(ByteBuffer data)](#setdata) | √ | √ |
 | [long size()](#size) | √ | √ |
 | [int elementsNum()](#elementsnum) | √ | √ |
 | [void free()](#free) | √ | √ |
@@ -140,7 +140,7 @@ public long[] getLongData()
 ## setData
 
 ```java
-public void setData(byte[] data)
+public boolean setData(byte[] data)
 ```
 
 设定MSTensor的输入数据。
@@ -149,8 +149,12 @@ public void setData(byte[] data)
     - `data`: byte[]类型的输入数据。
 
+- 返回值
+
+  设置数据是否成功。
+
 ```java
-public void setData(float[] data)
+public boolean setData(float[] data)
 ```
 
 设定MSTensor的输入数据。
@@ -159,8 +163,12 @@ public void setData(float[] data)
     - `data`: float[]类型的输入数据。
 
+- 返回值
+
+  设置数据是否成功。
+
 ```java
-public void setData(int[] data)
+public boolean setData(int[] data)
 ```
 
 设定MSTensor的输入数据。
@@ -169,8 +177,12 @@ public void setData(int[] data)
     - `data`: int[]类型的输入数据。
 
+- 返回值
+
+  设置数据是否成功。
+
 ```java
-public void setData(long[] data)
+public boolean setData(long[] data)
 ```
 
 设定MSTensor的输入数据。
@@ -179,8 +191,12 @@ public void setData(long[] data)
     - `data`: long[]类型的输入数据。
 
+- 返回值
+
+  设置数据是否成功。
+
 ```java
-public void setData(ByteBuffer data)
+public boolean setData(ByteBuffer data)
 ```
 
 设定MSTensor的输入数据。
@@ -189,6 +205,10 @@ public void setData(ByteBuffer data)
     - `data`: ByteBuffer类型的输入数据。
 
+- 返回值
+
+  设置数据是否成功。
+
 ## size
 
 ```java
diff --git a/docs/lite/api/source_zh_cn/api_java/runner_config.md b/docs/lite/api/source_zh_cn/api_java/runner_config.md
index 3eacf0cd342d3bc7aa6627c0737f2191c6bd7d19..4eb6cb948f6f3f1882d6307f2538a0d0f8a2eeb9 100644
--- a/docs/lite/api/source_zh_cn/api_java/runner_config.md
+++ b/docs/lite/api/source_zh_cn/api_java/runner_config.md
@@ -10,9 +10,9 @@ RunnerConfig定义了MindSpore Lite并发推理的配置参数。
 | ------------------------------------------------------------ |--------|--------|
 | [boolean init()](#init) | √ | ✕ |
 | [public boolean init(MSContext msContext)](#init) | √ | ✕ |
-| [public void setWorkerNum(int workerNum)](#setworkernum) | √ | ✕ |
+| [public void setWorkersNum(int workersNum)](#setworkersnum) | √ | ✕ |
 | [public void setConfigInfo(String section, HashMap config)](#setconfiginfo) | √ | ✕ |
-| [public void setConfigPath(String configPath)](#setconfigpath) | √ | ✕ |
+| [public void setConfigPath(String config_path)](#setconfigpath) | √ | ✕ |
 | [void getConfigPath()](#getconfigpath) | √ | ✕ |
 | [long getRunnerConfigPtr()](#getrunnerconfigptr) | √ | ✕ |
 | [public void setDeviceIds(ArrayList deviceIds)](#setdeviceids) | √ | ✕ |
@@ -45,17 +45,17 @@ public boolean init(MSContext msContext)
 是否初始化成功。
 
-## setWorkerNum
+## setWorkersNum
 
 ```java
-public void setWorkerNum(int workerNum)
+public void setWorkersNum(int workersNum)
 ```
 
 并发推理中模型个数参数设置。
 
 - 参数
 
-    - `workerNum`: 模型个数。
+    - `workersNum`: 模型个数。
 
 ## setConfigInfo
@@ -73,14 +73,14 @@ public void setConfigInfo(String section, HashMap config)
 ## setConfigPath
 
 ```java
-public void setConfigPath(String configPath)
+public void setConfigPath(String config_path)
 ```
 
 并发推理中模型配置文件路径参数设置。
 
 - 参数
 
-    - `configPath`: 配置文件路径。
+    - `config_path`: 配置文件路径。
 
 ## getConfigPath
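The `ModelType` change above adds `MT_MINDIR_OPT` as a backward-compatible alias for `MT_MINDIR_LITE`. A minimal sketch of what this alias means for calling code (the class below only mirrors the constant values visible in the hunk; it is an illustration, not the real `com.mindspore.config.ModelType` class, and `MT_MINDIR` itself is omitted because its value is not shown in the diff):

```java
// Sketch of the documented ModelType constants; values copied from the hunk above.
public class ModelTypeSketch {
    public static final int MT_AIR = 1;
    public static final int MT_OM = 2;
    public static final int MT_ONNX = 3;
    public static final int MT_MINDIR_LITE = 4;
    // New alias added in this change: code that still references MT_MINDIR_OPT
    // resolves to the same value as MT_MINDIR_LITE, so it keeps compiling and
    // behaves identically when passed to a build(...) overload.
    public static final int MT_MINDIR_OPT = MT_MINDIR_LITE;

    public static void main(String[] args) {
        System.out.println(MT_MINDIR_OPT == MT_MINDIR_LITE); // prints "true"
    }
}
```

Because the alias is a compile-time constant, existing callers of the device-side `build` overloads need no source changes after this rename.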