diff --git a/ACL_PyTorch/built-in/cv/SwinTransformer_for_Pytorch/README.md b/ACL_PyTorch/built-in/cv/SwinTransformer_for_Pytorch/README.md
index 8bf6feb7fe02430364f58d01194694c97ddf17eb..6d5c3941aafdb6bbfe657a9a293eba6f2461358a 100644
--- a/ACL_PyTorch/built-in/cv/SwinTransformer_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/cv/SwinTransformer_for_Pytorch/README.md
@@ -7,8 +7,6 @@
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -47,21 +45,6 @@
| logits | FLOAT16 | batchsize x num_class | ND |
-# 推理环境准备
-
-- 该模型需要以下插件与驱动:
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.8.0+ | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
# 快速上手
diff --git a/ACL_PyTorch/built-in/cv/T2vec_for_Pytorch/README.md b/ACL_PyTorch/built-in/cv/T2vec_for_Pytorch/README.md
index f61b47a241fba5538d1069a9c8b996121ea6ed21..7aeefebe9df0081d6a5666b90ca5efcbbfdc0ab7 100644
--- a/ACL_PyTorch/built-in/cv/T2vec_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/cv/T2vec_for_Pytorch/README.md
@@ -4,8 +4,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
- [准备数据集](#section183221994411)
@@ -45,20 +43,6 @@ t2vec是一种基于深度表征学习的轨迹相似性计算方法,通过学
| -------- | -------- | ------------- | ------------ |
| h | FLOAT32 | 3 x 256 x 256 | ND |
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://gitee.com/link?target=https%3A%2F%2Fwww.hiascend.com%2Fdocument%2Fdetail%2Fzh%2FModelZoo%2Fpytorchframework%2Fpies) |
- | CANN | 6.0.RC2 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.11.0 | - |
- | Julia | 1.6.1 | |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
# 快速上手
diff --git a/ACL_PyTorch/built-in/cv/TSM_sthv2_for_Pytorch/README.md b/ACL_PyTorch/built-in/cv/TSM_sthv2_for_Pytorch/README.md
index 1e924e158c944e5e5d1845bd4e8afa8853ee3b29..a53cf608276f861e377a3f8ad11e31247516164a 100644
--- a/ACL_PyTorch/built-in/cv/TSM_sthv2_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/cv/TSM_sthv2_for_Pytorch/README.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -33,21 +31,6 @@ TSM是一种通用且有效的时间偏移模块,它具有高效率和高性
| -------- | -------- | ------- | ------------ |
| output | FLOAT32 | batchsize x 174 | ND |
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | --------------------------------------------------------------- | ------- | ----------------------------------------------------------------------------------------------------- |
- | 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.9.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/cv/Twins-SVT-L/README.md b/ACL_PyTorch/built-in/cv/Twins-SVT-L/README.md
index 6ed0b5ea88633126c2ac6f800a622e799bb9884b..25aa8f92dc1db5162f03fb70a98aea05878db099 100644
--- a/ACL_PyTorch/built-in/cv/Twins-SVT-L/README.md
+++ b/ACL_PyTorch/built-in/cv/Twins-SVT-L/README.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -44,21 +42,6 @@
| output | FLOAT32 | batchsize x 1000 | ND |
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ---------- | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.2 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.0 | - |
- | Python | 3.7.5 | - |
- | Pytorch | 1.7.0 | - |
-
-
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/cv/U2-Net_for_PyTorch/README.md b/ACL_PyTorch/built-in/cv/U2-Net_for_PyTorch/README.md
index 833f5bf254112bbf467fc5c2b035aeb9fc0c089b..f3cce58317631c0e5240c6a7ea2c9f2fe34f22b9 100644
--- a/ACL_PyTorch/built-in/cv/U2-Net_for_PyTorch/README.md
+++ b/ACL_PyTorch/built-in/cv/U2-Net_for_PyTorch/README.md
@@ -6,9 +6,6 @@
- [输入输出数据](#section540883920406)
-
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -51,22 +48,6 @@ U-2-Net是基于UNet提出的一种新的网络结构,网络基于encode-decod
| output | FLOAT32 | batchsize x 3 x 320 x 320 | NCHW |
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/cv/VGG16_SSD_for_PyTorch/ReadMe.md b/ACL_PyTorch/built-in/cv/VGG16_SSD_for_PyTorch/ReadMe.md
index 29ae5537686d32248a95115b90f112f9a0559c3d..78b74169f928a2a38c859d394fd07e8c33c0963b 100644
--- a/ACL_PyTorch/built-in/cv/VGG16_SSD_for_PyTorch/ReadMe.md
+++ b/ACL_PyTorch/built-in/cv/VGG16_SSD_for_PyTorch/ReadMe.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -51,24 +49,6 @@ SSD网络是继YOLO之后的one-stage目标检测网络,是为了改善YOLO网
| boxes | FLOAT32 | batchsize x 8732 x 4 | ND |
-
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.6.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
@@ -259,4 +239,8 @@ SSD网络是继YOLO之后的one-stage目标检测网络,是为了改善YOLO网
| 310P3 | 32 | VOC2007 | acc:0.7726 | 730 |
| 310P3 | 64 | VOC2007 | acc:0.7726 | 718 |
-说明:精度是所有类别的平均值
\ No newline at end of file
+说明:精度是所有类别的平均值
+
+
+# 公网地址说明
+代码涉及的公网地址请参考 public_address_statement.md
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/cv/VGG16_SSD_for_PyTorch/public_address_statement.md b/ACL_PyTorch/built-in/cv/VGG16_SSD_for_PyTorch/public_address_statement.md
new file mode 100644
index 0000000000000000000000000000000000000000..53b3426c8a186e69e88b51de334d91e4758276f8
--- /dev/null
+++ b/ACL_PyTorch/built-in/cv/VGG16_SSD_for_PyTorch/public_address_statement.md
@@ -0,0 +1,3 @@
+| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
+| ---- | ------------ | ------ | ------------------------------------ | -------- |
+| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git | vgg16_ssd_pth2onnx.py | https://github.com/qfgaohao/pytorch-ssd | 源码引用说明 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/cv/Wide_ResNet50_2_for_Pytorch/README.md b/ACL_PyTorch/built-in/cv/Wide_ResNet50_2_for_Pytorch/README.md
index 3cd0ddf3ebb0487b48bba136707911e23e5ac8ac..baecb25cfa61f058706e5e56457c594bb8d7dcf5 100644
--- a/ACL_PyTorch/built-in/cv/Wide_ResNet50_2_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/cv/Wide_ResNet50_2_for_Pytorch/README.md
@@ -54,21 +54,9 @@
# 推理环境准备
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.8.1 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
- 该模型需要以下依赖
- **表 2** 依赖列表
+ **表 1** 依赖列表
| 依赖名称 | 版本 |
| ------------- | -------- |
diff --git a/ACL_PyTorch/built-in/cv/YolactEdge_for_PyTorch/README.md b/ACL_PyTorch/built-in/cv/YolactEdge_for_PyTorch/README.md
index 59eef40425276988f08ccff7684e8e41f448f25c..73235a47d51cc8ba8444aab3fdfe4fee50762ea0 100644
--- a/ACL_PyTorch/built-in/cv/YolactEdge_for_PyTorch/README.md
+++ b/ACL_PyTorch/built-in/cv/YolactEdge_for_PyTorch/README.md
@@ -5,10 +5,6 @@
- [输入输出数据](#section540883920406)
-
-
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -55,23 +51,6 @@ YolactEdge模型是一个边缘设备上的实时实例分割模型。YolactEdge
| output4 | FLOAT32 | batchsize x 256 x 5 x 5 | NCHW |
| output5 | FLOAT32 | batchsize x 138 x 138 x 32 | NCHW |
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.2 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 5.1.RC2 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.10.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
@@ -290,4 +269,8 @@ YolactEdge模型是一个边缘设备上的实时实例分割模型。YolactEdge
| 310P | 8 | COCO | | 175.41 |
| 310P | 16 | COCO | | 178.71 |
| 310P | 32 | COCO | | 183.77 |
-| 310P | 64 | COCO | | 183.11 |
\ No newline at end of file
+| 310P | 64 | COCO | | 183.11 |
+
+
+# 公网地址说明
+代码涉及的公网地址请参考 public_address_statement.md
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/cv/YolactEdge_for_PyTorch/public_address_statement.md b/ACL_PyTorch/built-in/cv/YolactEdge_for_PyTorch/public_address_statement.md
new file mode 100644
index 0000000000000000000000000000000000000000..b20f0b7518fd265b7adcd04da61485d3c67c2bef
--- /dev/null
+++ b/ACL_PyTorch/built-in/cv/YolactEdge_for_PyTorch/public_address_statement.md
@@ -0,0 +1,3 @@
+| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
+| ---- | ------------ | ------ | ------------------------------------ | -------- |
+| 开源代码引入 | https://github.com/haotian-liu/yolact_edge.git | yolact_edge.patch | https://github.com/pytorch/pytorch/issues/17108 | 代码备注说明 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/cv/YoloF_for_Pytorch/readme.md b/ACL_PyTorch/built-in/cv/YoloF_for_Pytorch/readme.md
index e6fcd9203cfd50ae8233833e33b411259f751c0f..0b8d682a142babc9ff18c1be4612635dd6bb63b6 100644
--- a/ACL_PyTorch/built-in/cv/YoloF_for_Pytorch/readme.md
+++ b/ACL_PyTorch/built-in/cv/YoloF_for_Pytorch/readme.md
@@ -2,7 +2,6 @@
- [概述](#概述)
- [输入输出数据](#输入输出数据)
-- [推理环境](#推理环境)
- [快速上手](#快速上手)
- [获取源码](#获取源码)
- [准备数据集](#准备数据集)
@@ -35,22 +34,8 @@ YOLOF引入了一种解决该优化问题的替代方案而无需使用复杂的
| ----------- | ---------- | ----------- | ----------- |
| output1 | FLOAT32 | ND | batchsize x num_dets x 5 |
| output1 | INT32 | ND | batchsize x num_dets |
-
----
-# 推理环境
-
-- 该模型推理所需配套的软件如下:
-
- | 配套 | 版本 | 环境准备指导 |
- | --------- | ------- | ---------- |
- | 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
-
- 说明:请根据推理卡型号与 CANN 版本选择相匹配的固件与驱动版本。
-
-----
# 快速上手
## 安装
diff --git a/ACL_PyTorch/built-in/cv/YoloX_Tiny_for_Pytorch/README.md b/ACL_PyTorch/built-in/cv/YoloX_Tiny_for_Pytorch/README.md
index ffbac45bcffef93502a4ddc775843f3b659d50a1..a85c68023807c28f14e6e93a0de8c00a8b455a58 100644
--- a/ACL_PyTorch/built-in/cv/YoloX_Tiny_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/cv/YoloX_Tiny_for_Pytorch/README.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -51,24 +49,6 @@ YOLOX对YOLO系列进行了一些有经验的改进,将YOLO检测器转换为
| labels | INT32 | batchsize x num_dets | ND |
-
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.2 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.8.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
@@ -232,3 +212,7 @@ YOLOX对YOLO系列进行了一些有经验的改进,将YOLO检测器转换为
| 310P3 | 16 | coco2017 | bbox_map:0.3336 | 636fps |
| 310P3 | 32 | coco2017 | bbox_map:0.3336 | 617fps |
| 310P3 | 64 | coco2017 | bbox_map:0.3336 | 621fps |
+
+
+# 公网地址说明
+代码涉及的公网地址请参考 public_address_statement.md
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/cv/YoloX_Tiny_for_Pytorch/public_address_statement.md b/ACL_PyTorch/built-in/cv/YoloX_Tiny_for_Pytorch/public_address_statement.md
new file mode 100644
index 0000000000000000000000000000000000000000..b627ccd2297d57edb00ebb557456a412547bbdc7
--- /dev/null
+++ b/ACL_PyTorch/built-in/cv/YoloX_Tiny_for_Pytorch/public_address_statement.md
@@ -0,0 +1,3 @@
+| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
+| ---- | ------------ | ------ | ------------------------------------ | -------- |
+| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | yolox_head.py | `YOLOX <https://arxiv.org/abs/2107.08430>`_. | 论文地址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/cv/YoloXs_for_Pytorch/readme.md b/ACL_PyTorch/built-in/cv/YoloXs_for_Pytorch/readme.md
index 8f4779fa03104838e24b3f4c72354f1964824714..4a3369cc1eb35e27e27c7a1a238136adfc7a1285 100644
--- a/ACL_PyTorch/built-in/cv/YoloXs_for_Pytorch/readme.md
+++ b/ACL_PyTorch/built-in/cv/YoloXs_for_Pytorch/readme.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -49,25 +47,6 @@ YOLOX对YOLO系列进行了一些有经验的改进,将YOLO检测器转换为
| -------- | -------- | ----------------------- | ------------ |
| output | FLOAT32 | batchsize x dim1 x dim2 | ND |
-
-
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- |---------| ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.4 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.3.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.7.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/cv/Yolov4_for_Pytorch/README.md b/ACL_PyTorch/built-in/cv/Yolov4_for_Pytorch/README.md
index 7728ea68836a35db3aa390b962788c3883bb1900..dcc2f51acad120f369d2d28e8b881bdb54c85e74 100755
--- a/ACL_PyTorch/built-in/cv/Yolov4_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/cv/Yolov4_for_Pytorch/README.md
@@ -5,8 +5,6 @@
- [输入输出数据](#ZH-CN_TOPIC_0000001126281702)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -48,20 +46,6 @@ YOLO是一个经典的物体检查网络,将物体检测作为回归问题求
| feature_map_2 | FLOAT32 | batchsize x 255 x 38 x 38 | NCHW |
| feature_map_3 | FLOAT32 | batchsize x 255 x 38 x 38 | NCHW |
-# 推理环境准备\[所有版本\]
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
-| 配套 | 版本 | 环境准备指导 |
-| --------------------------------------------------------------- | ------- | ----------------------------------------------------------------------------------------------------- |
-| 固件与驱动 | 22.0.3 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
-| CANN | 6.0.RC1 | - |
-| Python | 3.7.5 | - |
-| PyTorch | 1.8.0 | - |
-| 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/foundation_models/ControlNet_for_PyTorch/README.md b/ACL_PyTorch/built-in/foundation_models/ControlNet_for_PyTorch/README.md
index 1faed815578d64b5226ddee9c0a4baefd44e00f7..d51e951cb0d266a544cce6052703ae7e95dfe11c 100644
--- a/ACL_PyTorch/built-in/foundation_models/ControlNet_for_PyTorch/README.md
+++ b/ACL_PyTorch/built-in/foundation_models/ControlNet_for_PyTorch/README.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -45,18 +43,6 @@
| -------- | -------- | -------- | ------------ |
| text_outs | 1 x 4 x 64 x 72 | FLOAT32 | NCHW |
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 23.0.rc3 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 7.0.0 | - |
- | Python | 3.8.5 | - | |
-
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/foundation_models/stable_diffusion/README.md b/ACL_PyTorch/built-in/foundation_models/stable_diffusion/README.md
index caf81d2f5137f337c79d119e5f19fb0cf7c83e7e..4bf528a095ee149a2c0bb5558c72654d16e6cb86 100755
--- a/ACL_PyTorch/built-in/foundation_models/stable_diffusion/README.md
+++ b/ACL_PyTorch/built-in/foundation_models/stable_diffusion/README.md
@@ -46,15 +46,7 @@
# 推理环境准备
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 23.0.rc1 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN(+AscendIE) | 7.0.RC1 | - |
- | Python | 3.9 | - | |
-如在优化模型时--FA不为None或--TOME_num不为0,需要安装与CANN包配套版本的AscendIE
+- 如在优化模型时--FA不为None或--TOME_num不为0,需要安装与CANN包配套版本的AscendIE
- 该模型性能受CPU规格影响,建议使用96核(2x48核)CPU(arm)以复现性能
@@ -387,7 +379,7 @@
4. 计算CLIP-score
```bash
- python clip_score.py \
+ python3 clip_score.py \
--device=cpu \
--image_info="image_info.json" \
--model_name="ViT-H-14" \
@@ -430,4 +422,8 @@
[Produce & Plants], average score: 0.374
[Outdoor Scenes], average score: 0.370
[Indoor Scenes], average score: 0.387
- ```
\ No newline at end of file
+ ```
+
+
+# 公网地址说明
+代码涉及的公网地址请参考 public_address_statement.md
\ No newline at end of file
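CLIP-score 通常指生成图像与其 prompt 在 CLIP 特征空间中的余弦相似度。下面给出一个基于 open_clip 的最小计算示意(并非仓库中 clip_score.py 的实现;权重名称、图片路径与 prompt 均为假设值):

```python
# 最小示意:用 open_clip 计算单张图片与其 prompt 的 CLIP 相似度
# 注意:权重名称 laion2b_s32b_b79k、图片路径与 prompt 均为假设值
import torch
import open_clip
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-H-14", pretrained="laion2b_s32b_b79k")
tokenizer = open_clip.get_tokenizer("ViT-H-14")
model.eval()

image = preprocess(Image.open("sample.png")).unsqueeze(0)  # 假设的图片路径
text = tokenizer(["a photo of a cat"])                     # 假设的 prompt

with torch.no_grad():
    image_feat = model.encode_image(image)
    text_feat = model.encode_text(text)
    image_feat = image_feat / image_feat.norm(dim=-1, keepdim=True)
    text_feat = text_feat / text_feat.norm(dim=-1, keepdim=True)
    score = (image_feat @ text_feat.T).item()  # 余弦相似度,越高表示图文越匹配

print(f"CLIP score: {score:.3f}")
```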
diff --git a/ACL_PyTorch/built-in/foundation_models/stable_diffusion/public_address_statement.md b/ACL_PyTorch/built-in/foundation_models/stable_diffusion/public_address_statement.md
new file mode 100644
index 0000000000000000000000000000000000000000..44a78e5880e57df0d547582b77a3de20f72994c1
--- /dev/null
+++ b/ACL_PyTorch/built-in/foundation_models/stable_diffusion/public_address_statement.md
@@ -0,0 +1,8 @@
+| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
+| ---- | ------------ | ------ | ------------------------------------ | -------- |
+|开源代码引入| https://huggingface.co/stabilityai/stable-diffusion-2-1-base | pipeline_ascend_stable_diffusion.py |[Classifier-Free Diffusion Guidance](https://arxiv.org/abs/2207.12598). |论文地址|
+|开源代码引入| https://huggingface.co/stabilityai/stable-diffusion-2-1-base | pipeline_ascend_stable_diffusion.py |[Imagen Paper](https://arxiv.org/pdf/2205.11487.pdf). |论文地址|
+|开源代码引入| https://huggingface.co/stabilityai/stable-diffusion-2-1-base | pipeline_ascend_stable_diffusion.py | DDIM paper: https://arxiv.org/abs/2010.02502. |论文地址|
+|开源代码引入| https://huggingface.co/stabilityai/stable-diffusion-2-1-base | pipeline_ascend_stable_diffusion.py |[torch generator](https://pytorch.org/docs/stable/generated/torch.Generator.html) |文档地址|
+|开源代码引入| https://huggingface.co/stabilityai/stable-diffusion-2-1-base | pipeline_ascend_stable_diffusion.py |[PIL](https://pillow.readthedocs.io/en/stable/) |文档地址|
+|开源代码引入| https://huggingface.co/stabilityai/stable-diffusion-2-1-base | pipeline_ascend_stable_diffusion.py |Imagen paper: https://arxiv.org/pdf/2205.11487.pdf . |论文地址|
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/foundation_models/stable_diffusionxl/README.md b/ACL_PyTorch/built-in/foundation_models/stable_diffusionxl/README.md
index 03413742db6942d4d8c8ec3aa58c58d183790339..e6290851ec809d0847583d2d194f2894f79d5296 100644
--- a/ACL_PyTorch/built-in/foundation_models/stable_diffusionxl/README.md
+++ b/ACL_PyTorch/built-in/foundation_models/stable_diffusionxl/README.md
@@ -42,18 +42,8 @@
# 推理环境准备
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 23.0.rc1 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN(+AscendIE) | 7.0.RC1 | - |
- | Python | 3.9 | - | |
-如在优化模型时--FA不为None或--TOME_num不为0,需要安装与CANN包配套版本的AscendIE
-该模型性能受CPU规格影响,建议使用64核CPU(arm)以复现性能
-
-
+- 如在优化模型时--FA不为None或--TOME_num不为0,需要安装与CANN包配套版本的AscendIE
+- 该模型性能受CPU规格影响,建议使用64核CPU(arm)以复现性能
# 快速上手
diff --git a/ACL_PyTorch/built-in/nlp/Bert_Base_Cased_SST2/README.md b/ACL_PyTorch/built-in/nlp/Bert_Base_Cased_SST2/README.md
index c38c3744539cfbcd0b806b613db79a1879bd2893..983da90f537a6c02f30029f191935eb0acee1ffd 100644
--- a/ACL_PyTorch/built-in/nlp/Bert_Base_Cased_SST2/README.md
+++ b/ACL_PyTorch/built-in/nlp/Bert_Base_Cased_SST2/README.md
@@ -4,8 +4,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -43,21 +41,6 @@
| -------- | -------- | ------------- | ------------ |
| output | FLOAT32 |batch_size x 2 | ND |
-
-## 推理环境准备\[所有版本\]
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
-| 配套 | 版本 | 环境准备指导 |
-| ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
-| 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
-| CANN | 6.0.RC1 | - |
-| Python | 3.7.5 | - |
-| PyTorch | 1.8.0 | - |
-| 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
## 快速上手
### 获取源码
diff --git a/ACL_PyTorch/built-in/nlp/Bert_Base_Chinese_for_Pytorch/README.md b/ACL_PyTorch/built-in/nlp/Bert_Base_Chinese_for_Pytorch/README.md
index 5274b09d443adc5fa17442727a276f535ecdc56d..e5b76ecfa3e15e350425059d68de428c016837a9 100644
--- a/ACL_PyTorch/built-in/nlp/Bert_Base_Chinese_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/nlp/Bert_Base_Chinese_for_Pytorch/README.md
@@ -4,8 +4,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -43,21 +41,6 @@
| -------- | -------- | -------- | ------------ |
| output | batch_size x class | FLOAT32 | ND |
-
-## 推理环境准备\[所有版本\]
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
-| 配套 | 版本 | 环境准备指导 |
-| ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
-| 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
-| CANN | 6.0.RC1 | - |
-| Python | 3.7.5 | - |
-| PyTorch | 1.5.0+ | - |
-| 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
## 快速上手
### 获取源码
diff --git a/ACL_PyTorch/built-in/nlp/Bert_Base_Uncased_for_Pytorch/ReadMe.md b/ACL_PyTorch/built-in/nlp/Bert_Base_Uncased_for_Pytorch/ReadMe.md
index f5e28789f17b2815520863c6140398197ad3c59c..38c799023213746dbecf9cde864e5df71c57aa77 100644
--- a/ACL_PyTorch/built-in/nlp/Bert_Base_Uncased_for_Pytorch/ReadMe.md
+++ b/ACL_PyTorch/built-in/nlp/Bert_Base_Uncased_for_Pytorch/ReadMe.md
@@ -5,10 +5,6 @@
- [输入输出数据](#section540883920406)
-
-
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -54,20 +50,6 @@ BERT,即Bidirectional Encoder Representations from Transformers,是一种基
| :-------: | :----: | :-------------: | :-------: |
| output | INT64 | batchsize × 512 | ND |
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.4 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.8.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | | |
-
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/nlp/Bert_Uncased_Huggingface/README.md b/ACL_PyTorch/built-in/nlp/Bert_Uncased_Huggingface/README.md
index 6a61fed71b0077d70db2eee2ff5d2078bd5a2603..794dc620e7e8b4474cbe867d86d7d154e99a5e95 100644
--- a/ACL_PyTorch/built-in/nlp/Bert_Uncased_Huggingface/README.md
+++ b/ACL_PyTorch/built-in/nlp/Bert_Uncased_Huggingface/README.md
@@ -4,8 +4,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
@@ -62,20 +60,6 @@ BERT(Bidirectional Encoder Representations from Transformers)是一种预训
| start_logits | FLOAT32 | batchsize x seq_len | ND |
| end_logits | FLOAT32 | batchsize x seq_len | ND |
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 23.0.RC2 | [Pytorch框架推理环境准备](https://gitee.com/link?target=https%3A%2F%2Fwww.hiascend.com%2Fdocument%2Fdetail%2Fzh%2FModelZoo%2Fpytorchframework%2Fpies) |
- | CANN | 6.3.RC2 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.8.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/nlp/BiLSTM_CRF_PyTorch/README.md b/ACL_PyTorch/built-in/nlp/BiLSTM_CRF_PyTorch/README.md
index 40a20d2bfb2f6a1bbb126ba631b814567434605c..30027fc5bb355348c6c2819dbbeb7697a7b837f4 100644
--- a/ACL_PyTorch/built-in/nlp/BiLSTM_CRF_PyTorch/README.md
+++ b/ACL_PyTorch/built-in/nlp/BiLSTM_CRF_PyTorch/README.md
@@ -2,7 +2,6 @@
- [概述](#概述)
- [输入输出数据](#输入输出数据)
-- [推理环境](#推理环境)
- [快速上手](#快速上手)
- [获取源码](#获取源码)
- [准备数据集](#准备数据集)
@@ -36,21 +35,6 @@ CLUENER 细粒度命名实体识别
| --------- | ---------- | ----- | ------------ |
| features | FLOAT32 | ND | bs x 50 x 33 |
-
-----
-# 推理环境
-
-- 该模型推理所需配套的软件如下:
-
- | 配套 | 版本 | 环境准备指导 |
- | --------- | ------- | ---------- |
- | 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.1 | - |
- | Python | 3.7.5 | - |
-
- 说明:请根据推理卡型号与 CANN 版本选择相匹配的固件与驱动版本。
-
-
----
# 快速上手
diff --git a/ACL_PyTorch/built-in/nlp/CNN_Transformer_for_Pytorch/ReadMe.md b/ACL_PyTorch/built-in/nlp/CNN_Transformer_for_Pytorch/ReadMe.md
index 603270eda76f3c11db68be33938c82e1e6e627c1..b1712570663166a4533334c60769602a9733ef16 100644
--- a/ACL_PyTorch/built-in/nlp/CNN_Transformer_for_Pytorch/ReadMe.md
+++ b/ACL_PyTorch/built-in/nlp/CNN_Transformer_for_Pytorch/ReadMe.md
@@ -23,7 +23,7 @@
运行`preprocess.py`脚本,会自动在线下载所需的分词器模型、Librispeech数据集(下载过程可能比较长),并把数据处理为bin文件,同时生成数据集的info文件。
```
-python3.7 preprocess.py --pre_data_save_path=./pre_data/clean --which_dataset=clean
+python3 preprocess.py --pre_data_save_path=./pre_data/clean --which_dataset=clean
```
参数说明:
@@ -47,7 +47,7 @@ python3.7 preprocess.py --pre_data_save_path=./pre_data/clean --which_dataset=cl
运行`export_onnx.py`脚本,会自动在线下载pth模型,并把pth模型转换为onnx模型。
```
- python3.7 export_onnx.py --model_save_dir=./models
+ python3 export_onnx.py --model_save_dir=./models
```
运行完之后,会在当前目录的`models`目录下生成`wav2vec2-base-960h.onnx`模型文件。
@@ -100,7 +100,7 @@ python3.7 preprocess.py --pre_data_save_path=./pre_data/clean --which_dataset=cl
2. 运行`pyacl_infer.py`进行推理,同时输出推理性能数据。
```
- python3.7 pyacl_infer.py \
+ python3 pyacl_infer.py \
--model_path=./models/wav2vec2-base-960h.om \
--device_id=0 \
--cpu_run=True \
@@ -131,7 +131,7 @@ python3.7 preprocess.py --pre_data_save_path=./pre_data/clean --which_dataset=cl
运行`postprocess.py`,会进行推理数据后处理,并进行精度统计。
```
- python3.7 postprocess.py \
+ python3 postprocess.py \
--bin_file_path=./om_infer_res_clean \
--res_save_path=./om_infer_res_clean/transcriptions.txt \
--which_dataset=clean
@@ -150,7 +150,7 @@ python3.7 preprocess.py --pre_data_save_path=./pre_data/clean --which_dataset=cl
在GPU环境上运行`pth_online_infer.py`脚本,得到pytorch在线推理性能。
```
- python pth_online_infer.py \
+ python3 pth_online_infer.py \
--pred_res_save_path=./pth_online_infer_res/clean/transcriptions.txt \
--which_dataset=clean
```
diff --git a/ACL_PyTorch/built-in/nlp/Ernie3_for_Pytorch/readme.md b/ACL_PyTorch/built-in/nlp/Ernie3_for_Pytorch/readme.md
index 7918a4168cbd267f8ba864b5aa90ad3980f818b0..7dbc22910eb70ab3eb7117db87e614c927d77638 100644
--- a/ACL_PyTorch/built-in/nlp/Ernie3_for_Pytorch/readme.md
+++ b/ACL_PyTorch/built-in/nlp/Ernie3_for_Pytorch/readme.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -42,25 +40,6 @@ Ernie通过训练数据中的词法结构,语法结构,语义信息从而进
| -------- | -------- | -------- | ------------ |
| output | FLOAT32 | batchsize x 2 | ND |
-
-
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.3 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.6.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/nlp/GPT2_for_Pytorch/readme.md b/ACL_PyTorch/built-in/nlp/GPT2_for_Pytorch/readme.md
index 8f7cab0aff56c64d7372fe98932cef53cdab7c34..24dd541714de92a99f79549c6896b33d487e18f6 100644
--- a/ACL_PyTorch/built-in/nlp/GPT2_for_Pytorch/readme.md
+++ b/ACL_PyTorch/built-in/nlp/GPT2_for_Pytorch/readme.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -52,24 +50,6 @@ GPT-2 模型只使用了多个Masked Self-Attention和Feed Forward Neural Networ
| output | FLOAT16 | batchsize x 512 x 21128 | ND |
-
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.3 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.11.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
diff --git a/ACL_PyTorch/built-in/nlp/Pet_for_Pytorch/readme.md b/ACL_PyTorch/built-in/nlp/Pet_for_Pytorch/readme.md
index 8eeb2ad716531be4a5e55225563961c7ee8880fa..584ed622087efe4b01a2db11ea711573ff2c8945 100644
--- a/ACL_PyTorch/built-in/nlp/Pet_for_Pytorch/readme.md
+++ b/ACL_PyTorch/built-in/nlp/Pet_for_Pytorch/readme.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -53,23 +51,6 @@
| out_put | float32 | batchsize x max_seq_len x 18000 | ND |
-
-
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- | 配套 | 版本 | 环境准备指导 |
- |---------| ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.4 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.3.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.8.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ | |
-
-
-
# 快速上手
## 获取源码
@@ -111,7 +92,7 @@
通过如下命令,指定 GPU 0 卡, 在 FewCLUE 的 `eprstmt` 数据集上进行训练&评估
```
- python -u -m paddle.distributed.launch --gpus "0" run_train.py \
+ python3 -u -m paddle.distributed.launch --gpus "0" run_train.py \
--output_dir checkpoint_eprstmt \
--task_name eprstmt \
--split_id few_all \
@@ -220,7 +201,7 @@
执行以下脚本测试数据集
```
- python evaluate_om.py \
+ python3 evaluate_om.py \
--task_name eprstmt \
--split_id few_all \
--prompt_path prompt/eprstmt.json \
diff --git a/ACL_PyTorch/built-in/nlp/TransformerXL_for_Pytorch/README.md b/ACL_PyTorch/built-in/nlp/TransformerXL_for_Pytorch/README.md
index 24c2ec713830bbee47932530372cc77c0f01d6ea..2a28ff9645fe34ab615433532a8d78132a92f71c 100644
--- a/ACL_PyTorch/built-in/nlp/TransformerXL_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/nlp/TransformerXL_for_Pytorch/README.md
@@ -35,20 +35,20 @@ git reset --hard 44781ed
执行命令:
```shell
-python3.7 -m onnxsim model.onnx model_sim.onnx
+python3 -m onnxsim model.onnx model_sim.onnx
```
6. 修改模型。
进入om_gener目录,执行以下命令安装改图工具。
```shell
-pip3.7 install .
+pip3 install .
```
对模型进行修改,执行脚本。
```shell
-python3.7 modify_model.py model_sim.onnx
+python3 modify_model.py model_sim.onnx
```
7. 执行atc.sh脚本,将.onnx文件转为离线推理模型文件.om文件。请以实际安装环境配置环境变量。
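上文中 `python3 -m onnxsim model.onnx model_sim.onnx` 的简化步骤,也可以通过 onnx-simplifier 的 Python 接口完成,下面是一个最小示意(文件名沿用上文示例,仅作参考):

```python
# 最小示意:用 onnx-simplifier 的 Python 接口完成与上述命令行等价的简化
import onnx
from onnxsim import simplify

model = onnx.load("model.onnx")
model_sim, check_ok = simplify(model)  # check_ok 为 True 表示简化前后输出一致性校验通过
assert check_ok, "simplified ONNX model could not be validated"
onnx.save(model_sim, "model_sim.onnx")
```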
diff --git a/ACL_PyTorch/built-in/nlp/Uie_for_Pytorch/readme.md b/ACL_PyTorch/built-in/nlp/Uie_for_Pytorch/readme.md
index 77095e85de52a925304af422151137696aa1d70e..0c83bf77791eeccc55aac2e3c0bb8c20a95209c8 100644
--- a/ACL_PyTorch/built-in/nlp/Uie_for_Pytorch/readme.md
+++ b/ACL_PyTorch/built-in/nlp/Uie_for_Pytorch/readme.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -53,22 +51,6 @@ Yaojie Lu等人在ACL-2022中提出了通用信息抽取统一框架UIE。该框
| end_prob | float32 | batchsize x 1 | ND |
-
-
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- | 配套 | 版本 | 环境准备指导 |
- |---------| ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.4 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.3.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.8.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ | |
-
-
-
# 快速上手
## 获取源码
@@ -107,7 +89,7 @@ Yaojie Lu等人在ACL-2022中提出了通用信息抽取统一框架UIE。该框
下载标记好的[数据集](https://gitee.com/link?target=https%3A%2F%2Fbj.bcebos.com%2Fpaddlenlp%2Fdatasets%2Fuie%2Fdoccano_ext.json)放入${paddleNLP}/model_zoo/uie/data
```
# 执行以下脚本进行数据转换,执行后会在./data目录下生成训练/验证/测试集文件
- python doccano.py \
+ python3 doccano.py \
--doccano_file ./data/doccano_ext.json \
--task_type ext \
--save_dir ./data \
@@ -118,7 +100,7 @@ Yaojie Lu等人在ACL-2022中提出了通用信息抽取统一框架UIE。该框
```
export finetuned_model=./checkpoint/model_best
- python -u -m paddle.distributed.launch --gpus "0,1" finetune.py \
+ python3 -u -m paddle.distributed.launch --gpus "0,1" finetune.py \
--device gpu \
--logging_steps 10 \
--save_steps 100 \
@@ -164,9 +146,9 @@ Yaojie Lu等人在ACL-2022中提出了通用信息抽取统一框架UIE。该框
根据当前设备环境情况,选择执行以下推理脚本得到对应onnx模型(${finetuned_model}/model.onnx)
```
# cpu
- python deploy/python/infer_cpu.py --model_path_prefix ${finetuned_model}/model
+ python3 deploy/python/infer_cpu.py --model_path_prefix ${finetuned_model}/model
# gpu
- python deploy/python/infer_gpu.py --model_path_prefix ${finetuned_model}/model --device_id 0
+ python3 deploy/python/infer_gpu.py --model_path_prefix ${finetuned_model}/model --device_id 0
```
可配置参数说明:
@@ -281,7 +263,7 @@ Yaojie Lu等人在ACL-2022中提出了通用信息抽取统一框架UIE。该框
执行以下脚本测试数据集
```
- python evaluate_om.py \
+ python3 evaluate_om.py \
--om_path uie_bs${batch_size}.om \
--test_path ./data/dev.txt \
--batch_size ${batch_size} \
diff --git a/ACL_PyTorch/built-in/nlp/VilBert_for_Pytorch/README.md b/ACL_PyTorch/built-in/nlp/VilBert_for_Pytorch/README.md
index 2c1c93082e3738fcdb3fa3a1a8964c2a5d1de8d0..486e8e1f8b1f3b0aa811b36e0b11326786033403 100644
--- a/ACL_PyTorch/built-in/nlp/VilBert_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/nlp/VilBert_for_Pytorch/README.md
@@ -5,10 +5,6 @@
- [输入输出数据](#section540883920406)
-
-
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -54,22 +50,6 @@
| probs | FLOAT32 | batch_size x 10026 | ND |
-# 推理环境准备
-
-- 该模型需要以下插件与驱动:
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 1.0.17 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.12.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
@@ -170,7 +150,7 @@
```
# 以bs1为例
- python -m onnxsim vqa-vilbert.onnx vqa-vilbert_bs1_sim.onnx --input-shape "box_features:1,100,1024" "box_coordinates:1,100,4" "box_mask:1,100" "token_ids:1,32" "mask:1,32" "type_ids:1,32"
+ python3 -m onnxsim vqa-vilbert.onnx vqa-vilbert_bs1_sim.onnx --input-shape "box_features:1,100,1024" "box_coordinates:1,100,4" "box_mask:1,100" "token_ids:1,32" "mask:1,32" "type_ids:1,32"
python3 fix_onnx.py vqa-vilbert_bs1_sim.onnx vqa-vilbert_bs1_sim_fix.onnx
```
diff --git a/ACL_PyTorch/built-in/nlp/textcnn/README.md b/ACL_PyTorch/built-in/nlp/textcnn/README.md
index ba3bd5e097b2fce3a5ce3972f6057821036c18a1..0060c5bcbc4394958229ef9be3985dcbd1ef77bf 100644
--- a/ACL_PyTorch/built-in/nlp/textcnn/README.md
+++ b/ACL_PyTorch/built-in/nlp/textcnn/README.md
@@ -4,9 +4,6 @@
- [输入输出数据](#section540883920406)
-
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -43,22 +40,6 @@ Textcnn是NLP模型,主要用于文本分析,使用预训练的word2vec初
| output | FLOAT32 | batchsize x 10 | ND |
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.2 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | [CANN推理架构准备](https://www/hiascend.com/software/cann/commercial) |
- | Python | 3.7.5 | 创建anaconda环境时指定python版本即可,conda create -n ${your_env_name} python==3.7.5 |
- | PyTorch | 1.11.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
@@ -203,7 +184,7 @@ Textcnn是NLP模型,主要用于文本分析,使用预训练的word2vec初
```
mkdir ./output_data
- python3.7 -m ais_bench --model mg_om_dir/textcnn_64bs.om --input ./ascend-textcnn/bin --output ./output_data --device 0
+ python3 -m ais_bench --model mg_om_dir/textcnn_64bs.om --input ./ascend-textcnn/bin --output ./output_data --device 0
```
- 参数说明:
@@ -237,7 +218,7 @@ Textcnn是NLP模型,主要用于文本分析,使用预训练的word2vec初
可使用ais_bench推理工具的纯推理模式验证不同batch_size的om模型的性能,参考命令如下:
```
- python3.7 -m ais_bench --model mg_om_dir/textcnn_64bs_mg.om
+ python3 -m ais_bench --model mg_om_dir/textcnn_64bs_mg.om
```
- 参数说明:
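如需一次性验证多个 batch size 的纯推理性能,也可以用脚本循环调用上述命令,下面是一个最小示意(om 文件的命名规则为假设值,请按实际生成的文件名调整):

```python
# 最小示意:批量调用 ais_bench 纯推理模式,依次测试多个 batch size 的 om 模型
import subprocess

for bs in (1, 4, 8, 16, 32, 64):
    om_path = f"mg_om_dir/textcnn_{bs}bs_mg.om"  # 假设按 batch size 命名,请按实际调整
    cmd = ["python3", "-m", "ais_bench", "--model", om_path]
    print(">>>", " ".join(cmd))
    subprocess.run(cmd, check=True)  # 吞吐率等性能数据由 ais_bench 打印
```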
diff --git a/ACL_PyTorch/built-in/ocr/CRNN/CRNN_Sierkinhane_for_Pytorch/README.md b/ACL_PyTorch/built-in/ocr/CRNN/CRNN_Sierkinhane_for_Pytorch/README.md
index f7fd55f1b8d51597e8abb906e22dac71336e4dd1..b6e90ba0d3ed8b000ed0ea4d99c4b08a497c00e3 100644
--- a/ACL_PyTorch/built-in/ocr/CRNN/CRNN_Sierkinhane_for_Pytorch/README.md
+++ b/ACL_PyTorch/built-in/ocr/CRNN/CRNN_Sierkinhane_for_Pytorch/README.md
@@ -69,25 +69,16 @@ commit_id=a565687c4076b729d4059593b7570dd388055af4
### 搭建环境
-1. 环境版本。
-
- | 环境 | 版本 | 安装指导 |
- | ---- | ---- | ---- |
- | NPU 驱动和固件 | 23.0.rc1 | [安装指导](https://www.hiascend.com/document/detail/zh/CANNCommunityEdition/63RC2alpha002/softwareinstall/instg/instg_000018.html) |
- | CANN | 6.2.T200 | [安装指导](https://www.hiascend.com/document/detail/zh/CANNCommunityEdition/63RC2alpha002/softwareinstall/instg/instg_000036.html) |
- | Python | 3.7.5 | |
- | PyTorch | 1.13.1 | |
-
-2. 新建 conda 环境,安装依赖。
+1. 新建 conda 环境,安装依赖。
```bash
# conda 环境可选
- conda create -n crnn python=3.7.5
+ conda create -n crnn python=3.10
conda activate crnn
pip3 install -r my/requirements.txt
```
-3. 设置环境变量。
+2. 设置环境变量。
```bash
source /usr/local/Ascend/ascend-toolkit/set_env.sh
diff --git a/ACL_PyTorch/built-in/ocr/DBNET/README.md b/ACL_PyTorch/built-in/ocr/DBNET/README.md
index 460eab78cf5c565d80071d433ecc234285709f3a..816ef087bcb061133117e8b2c0e1773c1ec81cb7 100644
--- a/ACL_PyTorch/built-in/ocr/DBNET/README.md
+++ b/ACL_PyTorch/built-in/ocr/DBNET/README.md
@@ -5,8 +5,6 @@
- [输入输出数据](#section540883920406)
-- [推理环境准备](#ZH-CN_TOPIC_0000001126281702)
-
- [快速上手](#ZH-CN_TOPIC_0000001126281700)
- [获取源码](#section4622531142816)
@@ -49,22 +47,6 @@
| output1 | FLOAT32 | batchsize x 1 x 736 x 1280 | ND |
-# 推理环境准备
-
-- 该模型需要以下插件与驱动
-
- **表 1** 版本配套表
-
- | 配套 | 版本 | 环境准备指导 |
- | ------------------------------------------------------------ | ------- | ------------------------------------------------------------ |
- | 固件与驱动 | 22.0.2 | [Pytorch框架推理环境准备](https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/pies) |
- | CANN | 6.0.RC1 | - |
- | Python | 3.7.5 | - |
- | PyTorch | 1.6.0 | - |
- | 说明:Atlas 300I Duo 推理卡请以CANN版本选择实际固件与驱动版本。 | \ | \ |
-
-
-
# 快速上手
## 获取源码
@@ -269,4 +251,8 @@
| 芯片型号 | 数据集 | precision:精度 | 性能 |
| :------: | :-------: | :--: | :---: |
-| 310P3 | icdar2015 | 0.88 | 16.57 |
\ No newline at end of file
+| 310P3 | icdar2015 | 0.88 | 16.57 |
+
+
+# 公网地址说明
+代码涉及的公网地址请参考 public_address_statement.md
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/ocr/DBNET/public_address_statement.md b/ACL_PyTorch/built-in/ocr/DBNET/public_address_statement.md
new file mode 100644
index 0000000000000000000000000000000000000000..de4a72c9ac3ed032cfd72a50ec00284ae87d28df
--- /dev/null
+++ b/ACL_PyTorch/built-in/ocr/DBNET/public_address_statement.md
@@ -0,0 +1,3 @@
+| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
+| ---- | ------------ | ------ | ------------------------------------ | -------- |
+| 开发引入 | https://github.com/MhLiao/DB | db_pth2onnx.py | https://github.com/MhLiao/DB | 注释说明 |
\ No newline at end of file