diff --git a/docs/lite/docs/source_en/mindir/build.md b/docs/lite/docs/source_en/mindir/build.md
index 3dc8f99ae48c6ac99b8c986c14fbe2d62c3cd375..be7b5044fe628b3d893eab391d4f485d3919b4a7 100644
--- a/docs/lite/docs/source_en/mindir/build.md
+++ b/docs/lite/docs/source_en/mindir/build.md
@@ -12,7 +12,6 @@ Cloud-side MindSpore Lite contains modules:
 | runtime(cpp, java) | Linux | Model Inference Framework |
 | benchmark | Linux | Benchmarking Tool |
 | minddata | Linux | Image Processing Library |
-| akg | Linux | Polyhedral-based deep learning operator compiler ([Auto Kernel Generator](https://gitee.com/mindspore/akg)) |
 
 ## Environment Requirements
 
@@ -35,9 +34,6 @@ Cloud-side MindSpore Lite contains modules:
 - [Python](https://www.python.org/) >= 3.7.0
 - [NumPy](https://numpy.org/) >= 1.17.0 (If installation with pip fails, please upgrade the pip version first: `python -m pip install -U pip`)
 - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 (If installation with pip fails, please upgrade the pip version first: `python -m pip install -U pip`)
-- Compilation dependency for AKG (optional, compiled by default), which is not compiled if LLVM-12 or Python3 is not installed. To compile the AKG for the Ascend backend, git-lfs must be installed.
-    - [llvm](#installing-llvm-optional) == 12.0.1
-    - [git-lfs](https://git-lfs.com/)
 
 > Gradle recommends using [gradle-6.6.1-complete](https://gradle.org/next-steps/?version=6.6.1&format=all), and configuring other versions of gradle will use the gradle wrapper mechanism to automatically download ` gradle-6.6.1-complete`.
 >
@@ -49,7 +45,7 @@ Cloud-side MindSpore Lite contains modules:
 
 ## Compilation Options
 
-The `build.sh` script in the MindSpore root directory can be used to compile cloud-side MindSpore Lite.
+The `build.sh` script in the MindSpore Lite root directory can be used to compile cloud-side MindSpore Lite.
 
 ### Instructions for Using the Parameters of `build.sh`
 
@@ -59,7 +55,6 @@ The `build.sh` script in the MindSpore root directory can be used to compile clo
 | -d | Set this parameter to compile the Debug version, otherwise compile the Release version | None | None |
 | -i | Set this parameter for incremental compilation, otherwise for full compilation | None | None |
 | -j[n] | Set the number of threads used at compile time, otherwise the default setting is 8 threads | Integer | 8 |
-| -K | Set whether to compile AKG during compilation, otherwise the default setting is on | on, off | on |
 
 > - If the JAVA_HOME environment variable is configured and Gradle is installed, the JAR package is compiled at the same time.
 > - Add the `-i` parameter for incremental compilation does not take effect when the `-I` parameter changes, e.g. `-I x86_64` becomes `-I arm64`.
@@ -73,7 +68,6 @@ General module compilation options:
 
 | Options | Description of the parameters | Range of values | Default values |
 | -------- | ----- | ---- | ---- |
-| MSLITE_GPU_BACKEND | Set GPU backend, only tensorrt is valid at `-I x86_64`. | tensorrt, off | off in `-I x86_64` |
 | MSLITE_ENABLE_TOOLS | Whether to compile the accompanying benchmarking tool | on, off | on |
 | MSLITE_ENABLE_TESTCASES | Whether to compile test cases | on, off | off |
 | MSLITE_ENABLE_ACL | Whether to enable Ascend ACL | on, off | off |
@@ -86,10 +80,10 @@ General module compilation options:
 
 ## Compilation Examples
 
-First, you need to download the source code from the MindSpore code repository before compiling.
+First, you need to download the source code from the MindSpore Lite code repository before compiling.
 
 ```bash
-git clone https://gitee.com/mindspore/mindspore.git
+git clone https://gitee.com/mindspore/mindspore-lite.git
 ```
 
 ### Environment Preparation
@@ -100,9 +94,9 @@ git clone https://gitee.com/mindspore/mindspore.git
 - The Ascend package is available in both commercial and community versions.
 
-    1. Please refer to the [Ascend Data Center Solution 23.0.RC3 Installation Guide document](https://support.huawei.com/enterprise/en/doc/EDOC1100336282) for the download link and installation method of the commercial version.
+    1. Downloading the commercial edition requires applying for permission; the download link will be published soon.
 
-    2. There is no restriction on downloading the Community Edition. Please go to [CANN Community Edition](https://www.hiascend.com/developer/download/community/result?module=cann), select `7.0.RC1.beta1` version, and get the corresponding firmware and driver installation packages from the [Firmware and Driver](https://www.hiascend.com/hardware/firmware-drivers?tag=community). For package selection and installation, please refer to the commercial version installation guide document above.
+    2. There is no restriction on downloading the Community Edition. Please go to [CANN Community Edition](https://www.hiascend.com/developer/download/community/result?module=cann), select the `8.2.RC1` version, and get the corresponding firmware and driver installation packages from the [Firmware and Driver](https://www.hiascend.com/hardware/firmware-drivers?tag=community). For package selection and installation, please refer to the commercial version installation guide document above.
 
 - The default installation path for the installation package is `/usr/local/Ascend`. After installation, make sure the current user has permission to access the installation path of the Ascend AI processor companion package. If you don't have permission, the root user needs to add the current user to the user group where `/usr/local/Ascend` is located.
 
 - Install the whl package included with the Ascend AI processor companion software. If you have previously installed the package included with the Ascend AI processor, you need to uninstall the corresponding whl package first by using the following command.
@@ -138,21 +132,13 @@ git clone https://gitee.com/mindspore/mindspore.git
 export PYTHONPATH=${TBE_IMPL_PATH}:${PYTHONPATH} # Python library that TBE implementation depends on
 ```
 
-#### GPU
-
-GPU environment compilation. Using TensorRT requires integration with CUDA, TensorRT. The current version is adapted to [CUDA 11.1](https://developer.nvidia.com/cuda-11.1.1-download-archive) and [TensorRT 8.5.1](https://developer.nvidia.com/nvidia-tensorrt-8x-download).
-
-Install the appropriate version of CUDA and set the installed directory to the environment variable `${CUDA_HOME}`. The build script will use this environment variable to find CUDA.
-
-Download the corresponding version of the TensorRT archive and set the directory where the archive was unzipped to the environment variable `${TENSORRT_PATH}`. The build script will use this environment variable to find TensorRT.
-
 #### CPU
 
 Use x86_64 or ARM64 environment.
 
 #### Installing LLVM-optional
 
-The CPU backend of the graph kernel fusion in the converter needs to rely on LLVM-12. Run the following commands to install [LLVM](https://llvm.org/) to enable CPU backend. If LLVM-12 is not installed, the graph kernel fusion can only support GPU and Ascend backend.
+The CPU backend of graph kernel fusion in the converter relies on LLVM-12. Run the following commands to install [LLVM](https://llvm.org/) to enable the CPU backend. If LLVM-12 is not installed, graph kernel fusion supports only the Ascend backend.
 
 ```shell
 wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add -
@@ -163,16 +149,14 @@ sudo apt-get install llvm-12-dev -y
 
 ### Executing Compilation
 
-Three-backend-unification packages need to configure the following environment variables:
+The dual-backend unified package requires the following environment variables to be configured:
 
 ```bash
 export MSLITE_ENABLE_CLOUD_INFERENCE=on
-export MSLITE_GPU_BACKEND=tensorrt
 export MSLITE_ENABLE_ACL=on
 ```
 
 > - If you don't need Ascend backend, you can configure ``export MSLITE_ENABLE_ACL=off``.
-> - If you don't need GPU backend, you can configure ``export MSLITE_GPU_BACKEND=off``.
 
 Execute the following command in the source root directory to compile different versions of MindSpore Lite.
 
@@ -188,12 +172,6 @@ Execute the following command in the source root directory to compile different
     bash build.sh -I arm64 -j32
     ```
 
-- Compile the x86_64 architecture version while setting the number of threads, but do not compile AKG.
-
-    ```bash
-    bash build.sh -I x86_64 -j32 -K off
-    ```
-
 Finally, the following file will be generated in the `output/` directory:
 
 - `mindspore-lite-{version}-{os}-{arch}.tar.gz`: contains runtime and companion tools.
@@ -217,12 +195,6 @@ After installation, you can use the following command to check whether the insta
 python -c "import mindspore_lite"
 ```
 
-After installation, you can use the following command to check if the built-in AKG in MindSpore Lite is installed successfully: if no error is reported, the installation is successful.
-
-```bash
-python -c "import mindspore_lite.akg"
-```
-
 After successful installation, you can use the `pip show mindspore_lite` command to see where the Python modules for MindSpore Lite are installed.
 
 ## Directory Structure
@@ -247,8 +219,6 @@ mindspore-lite-{version}-linux-{arch}
 │   ├── libjpeg-turbo
 │   └── securec
 └── tools
-    ├── akg
-    |   └── akg-{version}-{python}-linux-{arch}.whl # AKG Python whl package
     ├── benchmark # Benchmarking Tools
     │   └── benchmark # Benchmarking tool executable file
     └── converter # Model converter
diff --git a/docs/lite/docs/source_zh_cn/mindir/build.md b/docs/lite/docs/source_zh_cn/mindir/build.md
index 65e8cee20c49fe38eeb42442b3d8be3105df3c22..4a984edd831c89accf34fb871ddf09884c2e403d 100644
--- a/docs/lite/docs/source_zh_cn/mindir/build.md
+++ b/docs/lite/docs/source_zh_cn/mindir/build.md
@@ -12,7 +12,6 @@
 | runtime(cpp、java) | Linux | 模型推理框架 |
 | benchmark | Linux | 基准测试工具 |
 | minddata | Linux | 图像处理库 |
-| akg | Linux | 基于Polyhedral的算子编译器([Auto Kernel Generator](https://gitee.com/mindspore/akg)) |
 
 ## 环境要求
 
@@ -35,9 +34,6 @@
 - [Python](https://www.python.org/) >= 3.7.0
 - [NumPy](https://numpy.org/) >= 1.17.0 (如果用pip安装失败,请先升级pip版本:`python -m pip install -U pip`)
 - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 (如果用pip安装失败,请先升级pip版本:`python -m pip install -U pip`)
-- AKG(可选,默认编译),未安装LLVM-12或者Python3则不编译akg,未安装git-lfs则无法编译ascend后端的akg。
-    - [llvm](#安装llvm-可选) == 12.0.1
-    - [git-lfs](https://git-lfs.com/)
 
 > Gradle建议采用[gradle-6.6.1-complete](https://gradle.org/next-steps/?version=6.6.1&format=all)版本,配置其他版本gradle将会采用gradle wrapper机制自动下载`gradle-6.6.1-complete`。
 >
@@ -49,7 +45,7 @@
 
 ## 编译选项
 
-MindSpore根目录下的`build.sh`脚本可用于云侧MindSpore Lite的编译。
+MindSpore Lite根目录下的`build.sh`脚本可用于云侧MindSpore Lite的编译。
 
 ### `build.sh`的参数使用说明
 
@@ -59,7 +55,6 @@ MindSpore根目录下的`build.sh`脚本可用于云侧MindSpore Lite的编译
 | -d | 设置该参数,则编译Debug版本,否则编译Release版本 | 无 | 无 |
 | -i | 设置该参数,则进行增量编译,否则进行全量编译 | 无 | 无 |
 | -j[n] | 设定编译时所用的线程数,否则默认设定为8线程 | Integer | 8 |
-| -K | 设定编译时是否编译akg,否则默认编译akg | on、off | on |
 
 > - 若配置了JAVA_HOME环境变量并安装了Gradle,则同时编译JAR包。
 > - 在`-I`参数变动时,如`-I x86_64`变为`-I arm64`,添加`-i`参数进行增量编译不生效。
@@ -73,7 +68,6 @@ MindSpore根目录下的`build.sh`脚本可用于云侧MindSpore Lite的编译
 
 | 选项 | 参数说明 | 取值范围 | 默认值 |
 | ------------------------------------ | ------------------------------------------ | ------------- | -------------------- |
-| MSLITE_GPU_BACKEND | 设置GPU后端,在`-I x86_64`时仅tensorrt有效 | tensorrt、off | 在`-I x86_64`时为off |
 | MSLITE_ENABLE_TOOLS | 是否编译配套Benchmark基准测试工具 | on、off | on |
 | MSLITE_ENABLE_TESTCASES | 是否编译测试用例 | on、off | off |
 | MSLITE_ENABLE_ACL | 是否使能昇腾ACL | on、off | off |
@@ -86,10 +80,10 @@ MindSpore根目录下的`build.sh`脚本可用于云侧MindSpore Lite的编译
 
 ## 编译示例
 
-首先,在进行编译之前,需从MindSpore代码仓下载源码。
+首先,在进行编译之前,需从MindSpore Lite代码仓下载源码。
 
 ```bash
-git clone https://gitee.com/mindspore/mindspore.git
+git clone https://gitee.com/mindspore/mindspore-lite.git
 ```
 
 ### 环境准备
@@ -100,9 +94,9 @@ git clone https://gitee.com/mindspore/mindspore.git
 - 昇腾软件包提供商用版和社区版两种下载途径:
 
-    1. 商用版下载需要申请权限,下载链接与安装方式请参考[Ascend Data Center Solution 23.0.RC3安装指引文档](https://support.huawei.com/enterprise/zh/doc/EDOC1100336282)。
+    1. 商用版下载需要申请权限,下载链接即将发布。
 
-    2. 社区版下载不受限制,下载链接请前往[CANN社区版](https://www.hiascend.com/developer/download/community/result?module=cann),选择`7.0.RC1.beta1`版本,以及在[固件与驱动](https://www.hiascend.com/hardware/firmware-drivers?tag=community)链接中获取对应的固件和驱动安装包,安装包的选择与安装方式请参照上述的商用版安装指引文档。
+    2. 社区版下载不受限制,下载链接请前往[CANN社区版](https://www.hiascend.com/developer/download/community/result?module=cann),选择`8.2.RC1`版本,以及在[固件与驱动](https://www.hiascend.com/hardware/firmware-drivers?tag=community)链接中获取对应的固件和驱动安装包,安装包的选择与安装方式请参照上述的商用版安装指引文档。
 
 - 安装包默认安装路径为`/usr/local/Ascend`。安装后确认当前用户有权限访问昇腾AI处理器配套软件包的安装路径,若无权限,需要root用户将当前用户添加到`/usr/local/Ascend`所在的用户组。
 
 - 安装昇腾AI处理器配套软件所包含的whl包。如果之前已经安装过昇腾AI处理器配套软件包,需要先使用如下命令卸载对应的whl包。
@@ -138,21 +132,13 @@ git clone https://gitee.com/mindspore/mindspore.git
 export PYTHONPATH=${TBE_IMPL_PATH}:${PYTHONPATH} # Python library that TBE implementation depends on
 ```
 
-#### GPU
-
-GPU环境编译,使用TensorRT需要集成CUDA、TensorRT。当前版本适配[CUDA 11.1](https://developer.nvidia.com/cuda-11.1.1-download-archive) 和 [TensorRT 8.5.1](https://developer.nvidia.com/nvidia-tensorrt-8x-download)。
-
-安装相应版本的CUDA,并将安装后的目录设置为环境变量`${CUDA_HOME}`。构建脚本将使用这个环境变量寻找CUDA。
-
-下载对应版本的TensorRT压缩包,并将压缩包解压后的目录设置为环境变量`${TENSORRT_PATH}`。构建脚本将使用这个环境变量寻找TensorRT。
-
 #### CPU
 
 使用x86_64或ARM64环境。
 
 #### 安装LLVM-可选
 
-模型转换工具中图算融合功能的CPU后端需要依赖LLVM-12,可以通过以下命令安装[LLVM](https://llvm.org/)。如果没有安装LLVM-12则图算融合功能仅能支持GPU和Ascend后端。
+模型转换工具中图算融合功能的CPU后端需要依赖LLVM-12,可以通过以下命令安装[LLVM](https://llvm.org/)。如果没有安装LLVM-12则图算融合功能仅能支持Ascend后端。
 
 ```shell
 wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add -
@@ -163,16 +149,14 @@ sudo apt-get install llvm-12-dev -y
 
 ### 执行编译
 
-三后端合一包需配置如下环境变量:
+双后端合一包需配置如下环境变量:
 
 ```bash
 export MSLITE_ENABLE_CLOUD_INFERENCE=on
-export MSLITE_GPU_BACKEND=tensorrt
 export MSLITE_ENABLE_ACL=on
 ```
 
 > - 如无需Ascend后端,可配置``export MSLITE_ENABLE_ACL=off``
-> - 如无需GPU后端,可配置``export MSLITE_GPU_BACKEND=off``
 
 在源码根目录下执行如下命令,可编译不同版本的MindSpore Lite。
 
@@ -188,12 +172,6 @@ export MSLITE_ENABLE_ACL=on
     bash build.sh -I arm64 -j32
     ```
 
-- 编译x86_64架构版本,同时设定线程数,但是不编译AKG。
-
-    ```bash
-    bash build.sh -I x86_64 -j32 -K off
-    ```
-
 最后,会在`output/`目录中生成如下文件:
 
 - `mindspore-lite-{version}-{os}-{arch}.tar.gz`:包含runtime和配套工具。
@@ -217,12 +195,6 @@ pip install mindspore-lite-{version}-{python}-{os}-{arch}.whl
 python -c "import mindspore_lite"
 ```
 
-安装后可以使用以下命令检查mindspore_lite内置的AKG是否安装成功:若无报错,则表示安装成功。
-
-```bash
-python -c "import mindspore_lite.akg"
-```
-
 安装成功后,可使用`pip show mindspore_lite`命令查看MindSpore Lite的Python模块的安装位置。
 
 ## 目录结构
@@ -247,8 +219,6 @@ mindspore-lite-{version}-linux-{arch}
 │   ├── libjpeg-turbo
 │   └── securec
 └── tools
-    ├── akg
-    |   └── akg-{version}-{python}-linux-{arch}.whl # AKG的whl包
     ├── benchmark # 基准测试工具
     │   └── benchmark # 基准测试工具可执行文件
     └── converter # 模型转换工具