diff --git a/README.en.md b/README.en.md
deleted file mode 100644
index 1960f9229c1674c4d804d8a606b3cdfebb982300..0000000000000000000000000000000000000000
--- a/README.en.md
+++ /dev/null
@@ -1,36 +0,0 @@
-# oec-application
-
-#### Description
-It is used for checking application compatibility with openEuler.
-
-#### Software Architecture
-Software architecture description
-
-#### Installation
-
-1. xxxx
-2. xxxx
-3. xxxx
-
-#### Instructions
-
-1. xxxx
-2. xxxx
-3. xxxx
-
-#### Contribution
-
-1. Fork this repository.
-2. Create Feat_xxx branch.
-3. Commit your code.
-4. Create a new Pull Request.
-
-
-#### Gitee Feature
-
-1. Readme\_XXX.md supports different languages. The file name examples are as follows: Readme\_en.md and Readme\_zh.md.
-2. Get information or seek help from the Gitee official blogs by visiting https://blog.gitee.com.
-3. Explore open source projects by visiting https://gitee.com/explore.
-4. Get information about the most valuable open source project GVP by visiting https://gitee.com/gvp.
-5. Obtain Gitee user guide by visiting https://gitee.com/help.
-6. Get to know the most popular members by visiting https://gitee.com/gitee-stars/.
diff --git a/README.md b/README.md
index dc35a3862932dd44f2eba263f8d17f395af6598f..bfc97e34e6050f7110b74a958d0e3f8fea3f8eed 100644
--- a/README.md
+++ b/README.md
@@ -1,37 +1,38 @@
-# oec-application
-
-#### 介绍
-Use for check application compatibility with openEuler
-
-#### 软件架构
-软件架构说明
-
-
-#### 安装教程
-
-1. xxxx
-2. xxxx
-3. xxxx
-
-#### 使用说明
-
-1. xxxx
-2. xxxx
-3. xxxx
-
-#### 参与贡献
-
-1. Fork 本仓库
-2. 新建 Feat_xxx 分支
-3. 提交代码
-4. 新建 Pull Request
-
-
-#### 码云特技
-
-1. 使用 Readme\_XXX.md 来支持不同的语言,例如 Readme\_en.md, Readme\_zh.md
-2. 码云官方博客 [blog.gitee.com](https://blog.gitee.com)
-3. 你可以 [https://gitee.com/explore](https://gitee.com/explore) 这个地址来了解码云上的优秀开源项目
-4. [GVP](https://gitee.com/gvp) 全称是码云最有价值开源项目,是码云综合评定出的优秀开源项目
-5. 码云官方提供的使用手册 [https://gitee.com/help](https://gitee.com/help)
-6. 码云封面人物是一档用来展示码云会员风采的栏目 [https://gitee.com/gitee-stars/](https://gitee.com/gitee-stars/)
+# oec-application
+
+#### Introduction
+Used for checking application compatibility with openEuler.
+check_build: npm build gate check (documented in docbuild/check_build.md)
+
+#### Software Architecture
+Software architecture description
+
+
+#### Installation
+
+1. xxxx
+2. xxxx
+3. xxxx
+
+#### Instructions
+
+1. xxxx
+2. xxxx
+3. xxxx
+
+#### Contribution
+
+1. Fork this repository
+2. Create a Feat_xxx branch
+3. Commit your code
+4. Create a new Pull Request
+
+
+#### Gitee Feature
+
+1. Use Readme\_XXX.md to support different languages, for example Readme\_en.md and Readme\_zh.md
+2. Official Gitee blog [blog.gitee.com](https://blog.gitee.com)
+3. Explore excellent open source projects on Gitee at [https://gitee.com/explore](https://gitee.com/explore)
+4. [GVP](https://gitee.com/gvp) stands for Gitee's most valuable open source projects, selected by comprehensive evaluation
+5. The official Gitee user guide: [https://gitee.com/help](https://gitee.com/help)
+6. Gitee cover stars, a column showcasing outstanding Gitee members: [https://gitee.com/gitee-stars/](https://gitee.com/gitee-stars/)
diff --git a/docbuild/README.md b/docbuild/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..7a18f158da18661494fdf780dfd47d9cb8c27098
--- /dev/null
+++ b/docbuild/README.md
@@ -0,0 +1,54 @@
+# Gate checks
+
+## How to add a check item
+1. Create a new directory under ci_check/src/ac to hold the code of the check item
+2. Add a configuration entry in ac_conf.yaml
+
+### Configuration file
+
+```yaml
+Example =>
+spec: # name of the AC check item
+ hint: check_spec # name of the check shown on Gitee, defaults to check_ + item name
+ module: spec.check_spec # module of the AC check item, defaults to "item name + check_ + item name"
+ entry: Entry # entry class of the AC check item; it inherits BaseCheck and may define its own __call__ method
+ exclude: true # skip this check
+ ignored: [] # sub-checks ignored inside this item; their failure does not affect the final result of the item
+ allow_list: [] # the check only runs for packages listed in allow_list
+ deny_list: [] # the check is skipped for packages listed in deny_list
+```
+
+### Entry class template
+
+```python
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, SUCCESS, WARNING
+
+
+class Entry(BaseCheck):
+ def __call__(self, *args, **kwargs):
+ # do the work
+ ...
+
+ def check_case_a(self):
+ # do the check
+
+ return SUCCESS
+```
+
+### Check results
+
+| Return code | Description | emoji |
+| --- | --- | --- |
+| 0 | SUCCESS | :white_check_mark:|
+| 1 | WARNING | :bug: |
+| 2 | FAILED | :x:|
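+
+As a quick illustration (a sketch only, not framework code), the return codes above can be mapped to the emoji shown in the PR comment like this:
+
+```python
+# Mapping of check return codes to the emoji used in the gate comment (values follow the table above).
+RESULT_EMOJI = {0: ":white_check_mark:", 1: ":bug:", 2: ":x:"}
+
+def result_emoji(code):
+    # Unknown codes are treated as failures.
+    return RESULT_EMOJI.get(code, ":x:")
+```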
+
+## Supported check items
+| Check item | Directory | Description |
+| --- | --- | --- |
+| spec file | spec | checks that the homepage is reachable, the version number increases monotonically, and the patch files exist |
+| code style | code | checks the tarball, checks whether patches apply, runs linter tools |
+| yaml file | package_yaml | |
+| license check | package_license | |
+| code snippet check | sca | currently only for self-developed projects |
\ No newline at end of file
diff --git a/docbuild/check_build.md b/docbuild/check_build.md
new file mode 100644
index 0000000000000000000000000000000000000000..2b6bb3fe7cb7fe5f5c814ee6e344da4e06def675
--- /dev/null
+++ b/docbuild/check_build.md
@@ -0,0 +1,423 @@
+[TOC]
+
+
+
+## I. npm build check: check_build
+
+### 1. Function
+
+The npm build check is one of the src-openeuler gate checks; it is triggered by the first push of a PR or by commenting /retest.
+
+The gate checks include:
+
+Static checks:
+
+
+
+npm build checks (currently running against an osc environment configured on a local server, which is not stable):
+
+
+
+
+
+The npm build covers two target architectures: aarch64 and x86_64.
+
+OEPKGS is currently used for the build check. The overall flow is: the gate is triggered, the OEPKGS API is called to run the build, the returned parameters are checked for the build result, and Jenkins posts a comment (the log can be inspected).
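+
+Put together, the flow can be sketched roughly as follows, wired from the functions documented in section II below (illustration only, not the production entry point):
+
+```python
+# Rough sketch of the OEPKGS build-check flow described above.
+def run_npm_build_check(repo_url, tbranch, build_id):
+    token = get_Oepkgs_Token()  # obtain a valid X-auth-token (section II.3)
+    cre_tag, jobName, repoId, jobId = Create_Job(token, repo_url, 'x86_64', tbranch, build_id)
+    if cre_tag != 'success':
+        return 'FAILED'
+    if Builde_Job(token, jobName, repoId) != 'success':
+        return 'FAILED'
+    res = get_Build_Record(token, jobName)  # query the build record
+    return 'SUCCESS' if res == 'success' else 'FAILED'
+```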
+
+## II. Execution flow
+
+
+
+### 1. Jenkins entry point: test.py
+
+check_build is invoked according to the arguments passed to test.py:
+
+```python
+parser.add_argument("-c", type=str, dest="community", default="src-oepkgs", help="src-openeuler or openeuler") //
+parser.add_argument("-w", type=str, dest="workspace", help="workspace where to find source") //
+parser.add_argument("-r", type=str, dest="repo", help="repo name") //
+parser.add_argument("-b", type=str, dest="tbranch", help="branch merge to") //
+parser.add_argument("-o", type=str, dest="output", help="output file to save result")
+parser.add_argument("-p", type=str, dest="pr", help="pull request number")
+parser.add_argument("-t", type=str, dest="token", help="gitee api token")
+parser.add_argument("-a", type=str, dest="account", help="gitee account")
+parser.add_argument("-i", type=str, dest="build_id", help="build_id") //
+
+# dataset
+parser.add_argument("-m", type=str, dest="comment", help="trigger comment")
+parser.add_argument("-e", type=str, dest="committer", help="committer")
+parser.add_argument("-x", type=str, dest="pr_ctime", help="pr create time")
+parser.add_argument("-z", type=str, dest="trigger_time", help="job trigger time")
+parser.add_argument("-l", type=str, dest="trigger_link", help="job trigger link")
+```
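+
+A minimal sketch of how these arguments might be handed to the check (the `Dataset` container here is hypothetical; the real test.py wires this up itself, and `CheckBuild` is the entry class shown in the next section):
+
+```python
+args = parser.parse_args()
+
+# Hypothetical container mirroring the fields CheckBuild reads (dd.community, dd.tbranch, dd.build_id, ...).
+class Dataset:
+    def __init__(self, a):
+        self.community, self.tbranch, self.build_id = a.community, a.tbranch, a.build_id
+
+check = CheckBuild(args.workspace, args.repo, conf=None, dataset=Dataset(args))
+check()                       # fetch an OEPKGS token
+result = check.check_build()  # SUCCESS or FAILED
+```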
+
+### 2. AC check entry class: check_build
+
+Implementation of the AC entry class:
+
+```python
+import build_job
+import create_job
+import get_build_record
+import get_oepkgs_token
+import logging
+
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, SUCCESS, WARNING
+
+logger = logging.getLogger("ac")
+
+class CheckBuild(BaseCheck):
+ def __init__(self, workspace, repo, conf, dataset):
+ super(CheckBuild, self).__init__(workspace, repo, conf)
+ self.dd = dataset
+ self.repo_url = "https://gitee.com/{}/{}.git".format(self.dd.community, repo)
+
+ def __call__(self, *args, **kwargs):
+ self.token = get_oepkgs_token.get_Oepkgs_Token()
+
+ def check_build(self):
+ cre_tag, jobName, repoId, jobId = create_job.Create_Job(self.token, self.repo_url, 'x86_64', self.dd.tbranch, self.dd.build_id)
+ if(cre_tag == 'success'):
+ bui_tag = build_job.Builde_Job(self.token, jobName, repoId)
+ if(bui_tag == 'success'):
+ res = get_build_record.get_Build_Record(self.token, jobName)
+ if(res == 'success'):
+ return SUCCESS
+ else:
+ return FAILED
+ else:
+ return FAILED
+ else:
+ return FAILED
+```
+
+### 3. Obtaining a valid token: get_Oepkgs_Token
+
+Because OEPKGS uses a JWT token mechanism, no long-lived token is available. The recommended approach is to scrape the web page directly: use Selenium to simulate the login clicks and capture the token, as shown in the flow chart:
+
+
+
+Implementation:
+
+```python
+from selenium import webdriver
+from selenium.webdriver.common.by import By
+
+def get_Oepkgs_Token():
+    options = webdriver.ChromeOptions()
+    options.add_argument("--auto-open-devtools-for-tabs")
+
+    driver = webdriver.Chrome(options=options)
+
+    # open the login page
+    driver.get("https://build.dev.oepkgs.net/rpm/task")  # replace with your own login page URL
+
+    auth_token = ''
+    try:
+        driver.implicitly_wait(10)
+
+        # locate the username and password inputs and fill in the credentials
+        username_input = driver.find_element(By.NAME, "username")
+        password_input = driver.find_element(By.NAME, "password")
+
+        username_input.send_keys("mengling_cui@163.com")
+        password_input.send_keys("123456Aa!")
+
+        # locate the login button by its text ("立即登录")
+        login_button = driver.find_element(By.XPATH, "//button[contains(text(), '立即登录')]")
+        login_button.click()
+
+        # read the cookies and pick out the auth_token that is set after login
+        cookies = driver.get_cookies()
+
+        for cookie in cookies:
+            print(cookie['name'], cookie['value'])
+            if cookie['name'] == 'auth_token':
+                auth_token = cookie['value']
+
+    finally:
+        # shut down the WebDriver
+        driver.quit()
+
+    return auth_token
+```
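+
+On a Jenkins build agent there is usually no display, so the browser would have to run headless; a sketch of the extra Chrome options this typically needs:
+
+```python
+from selenium import webdriver
+
+options = webdriver.ChromeOptions()
+options.add_argument("--headless")               # no display on the build agent
+options.add_argument("--no-sandbox")             # commonly needed when running as root in a container
+options.add_argument("--disable-dev-shm-usage")  # avoid the small /dev/shm of containers
+```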
+
+
+
+### 4. Creating a job: Create_Job(opekgstoken, repo_url, ccArch, tbranch, build_id)
+
+Endpoint: /task/create
+
+
+
+
+
+Function parameters:
+
+| **opekgstoken** | a valid X-auth-token |
+| --------------- | ------------------ |
+| **repo_url** | repository URL |
+| **ccArch** | target architecture |
+| **tbranch** | target branch |
+| **build_id** | job number |
+
+Request example:
+
+```python
+import json
+import uuid
+
+import requests
+
+
+def Create_Job(opekgstoken, repo_url, ccArch, tbranch, build_id):
+
+ url = 'https://build.dev.oepkgs.net/api/build/task/create'
+
+ headers = {
+ 'authority': 'build.dev.oepkgs.net',
+ 'accept': '*/*',
+ 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaWEvU2hhbmdoYWkiLCJleHAiOjE2OTMzNzAwMzMsImlhdCI6MTY5MzM2ODIzM30.szb3CB_U3DJOVG94JzHkUBqTjD0x8WM0JCJz3fNODuc',
+ 'dnt': '1',
+ 'origin': 'https://build.dev.oepkgs.net',
+ 'pragma': 'no-cache',
+ 'referer': 'https://build.dev.oepkgs.net/rpm/task/create',
+ 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"Windows"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36',
+ 'x-auth-token': opekgstoken,
+ }
+
+ data = {
+ 'builder': 1,
+ 'jobName': str(uuid.uuid4()),
+ 'os': 'openEuler',
+ 'osFull': '22.03-LTS',
+ 'ccArch': ccArch,
+ 'repoId': build_id,
+ 'scmRepo': repo_url,
+ 'branch': tbranch
+ }
+
+ response = requests.post(url, headers=headers, json=data)
+
+ print(response.status_code)
+ print(response.text)
+
+ res_data = json.loads(response.text)
+
+ print(res_data)
+
+    if response.status_code == 200:
+ return 'success', data['jobName'], data['repoId'], res_data['data']['jobId']
+ else:
+ return 'false', 'nodata', 'nodata', 'nodata'
+```
+
+Return values:
+
+| cre_tag | whether creation succeeded: 'success' or 'false' |
+| ------- | ---------------------------------------------- |
+| jobName | job name, a generated UUID, used for the subsequent manual build |
+| repoId | repository id, used for the subsequent manual build |
+| jobId | job id, used for the subsequent build-record query |
+
+Response status codes:
+
+| Status code | Description | schema |
+| :----- | :----------- | :----- |
+| 200 | OK | Task |
+| 201 | Created | |
+| 401 | Unauthorized | |
+| 403 | Forbidden | |
+| 404 | Not Found | |
+
+### 5. Starting a build: Builde_Job(opekgstoken, jobName, repoId)
+
+Endpoint: /task/buildJob
+
+
+
+Function parameters:
+
+| **opekgstoken** | a valid X-auth-token |
+| --------------- | -------------------------- |
+| jobName | job name, used to locate the build job |
+| repoId | repository id, used to locate the build job |
+
+Request example:
+
+```python
+import requests
+
+
+def Builde_Job(opekgstoken, jobName, repoId):
+ url2 = 'https://build.dev.oepkgs.net/api/build/task/buildJob'
+
+ headers2 = {
+ 'authority': 'build.dev.oepkgs.net',
+ 'accept': '*/*',
+ 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaIAvU2hhbmdoYWkiLCJleHAiOjE2OTM1Mzk4MjgsImlhdCI6MTY5MzUzODAyOH0.sFZYWM-YiKKGcD2udonHmCM_kLxyb3Dt0P6xQZs49u8',
+ 'dnt': '1',
+ 'origin': 'https://build.dev.oepkgs.net',
+ 'pragma': 'no-cache',
+ 'referer': 'https://build.dev.oepkgs.net/rpm/task',
+ 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+ 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36',
+ 'x-auth-token': opekgstoken,
+ }
+
+ data2 = {
+ 'jobName': jobName,
+ 'jobId': repoId,
+ }
+
+ response2 = requests.post(url2, headers=headers2, json=data2)
+
+ print(response2.status_code)
+ print(response2.text)
+
+    if response2.status_code == 200:
+ return 'success'
+ else:
+ return 'false'
+
+```
+
+Return value:
+
+| bui_tag | whether the build was started successfully: 'success' or 'false' |
+| ------- | ------------------------------------ |
+
+Response status codes:
+
+| Status code | Description | schema |
+| :----- | :----------- | :----- |
+| 200 | OK | |
+| 201 | Created | |
+| 401 | Unauthorized | |
+| 403 | Forbidden | |
+| 404 | Not Found | |
+
+### 6. Querying the build record: get_Build_Record(opekgstoken, jobId) (not working yet)
+
+Endpoint: /task/getBuildRecord/{jobId}
+
+
+
+Function parameters:
+
+| **opekgstoken** | a valid X-auth-token |
+| --------------- | ------------------------ |
+| jobId | job id, used to locate the build job |
+
+Request example:
+
+```python
+import requests
+import json
+
+def get_Build_Record(opekgstoken, jobName):
+    # NOTE: the job id (955) is hard-coded from a captured request and still needs to be parameterised
+    url = 'https://build.dev.oepkgs.net/api/build/task/getBuildRecord/955?pageNum=1&pageSize=5'
+
+ headers = {
+ 'authority': 'build.dev.oepkgs.net',
+ 'accept': '*/*',
+ 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaWEvU2hhbmdoYWkiLCJleHAiOjE2OTM1Mzk4MjgsImlhdCI6MTY5MzUzODAyOH0.sFZYWM-YiKKGcD2udonHmCM_kLxyb3Dt0P6xQZs49u8',
+ 'dnt': '1',
+ 'pragma': 'no-cache',
+ 'referer': 'https://build.dev.oepkgs.net/rpm/task',
+ 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+ 'x-auth-token': opekgstoken,
+ }
+
+ params = {
+ "jobName": jobName,
+ }
+
+ response = requests.get(url, headers=headers, params=params)
+ res_data = json.loads(response.text)
+
+ return res_data['success']
+
+```
+
+Return value:
+
+| res | result of the build-record query: 'success' or 'false' |
+| ---- | ------------------------------------ |
+
+Response status codes:
+
+| Status code | Description | schema |
+| :----- | :----------- | :----- |
+| 200 | OK | object |
+| 401 | Unauthorized | |
+| 403 | Forbidden | |
+| 404 | Not Found | |
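+
+Because the build runs asynchronously, a single query will usually return before the build has finished. A possible polling loop (a sketch only; this step is marked as not working yet, so treating the query result as "build finished" is an assumption):
+
+```python
+import time
+
+def wait_for_build(opekgstoken, jobName, timeout=1800, interval=60):
+    # Poll the build record until it reports a result or the timeout expires.
+    deadline = time.time() + timeout
+    while time.time() < deadline:
+        if get_Build_Record(opekgstoken, jobName):
+            return 'success'
+        time.sleep(interval)
+    return 'false'
+```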
+
+### 7. Posting the comment: comment
+
+Configuration file ac.yaml:
+
+```yaml
+src-oepkgs:
+ spec:
+ hint: check_spec_file
+ module: spec.check_spec
+ entry: CheckSpec
+ ignored: ["homepage"]
+ code:
+ hint: check_code_style
+ module: code.check_code_style
+ entry: CheckCodeStyle
+ exclude: True
+ ignored: ["patch"]
+ package_yaml:
+ hint: check_package_yaml_file
+ module: package_yaml.check_yaml
+ entry: CheckPackageYaml
+ ignored: ["fields"]
+ package_license:
+ hint: check_package_license
+ module: package_license.check_license
+ entry: CheckLicense
+ binary:
+ hint: check_binary_file
+ module: binary.check_binary_file
+ entry: CheckBinaryFile
+ source_consistency:
+ hint: check_consistency
+ module: source_consistency.check_consistency
+ entry: CheckSourceConsistency
+ npmbuild:
+ hint: check_build
+ module: npmbuild.check_build
+ entry: CheckBuild
+ sca:
+ exclude: True
+ openlibing:
+ exclude: True
+ commit_msg:
+ exclude: True
+```
+
+Configuration notes: the file specifies the path of every check item and which checks the openeuler and src-openeuler repositories have to run.
+
+```yaml
+Example =>
+spec: # name of the AC check item
+ hint: check_spec # name of the check shown on Gitee, defaults to check_ + item name; displayed under the PR comment
+ module: spec.check_spec # module of the AC check item, defaults to "item name + check_ + item name"
+ entry: Entry # entry class of the AC check item; it inherits BaseCheck and may define its own __call__ method
+ exclude: true # skip this check
+ ignored: [] # sub-checks ignored inside this item; their failure does not affect the final result of the item
+ allow_list: [] # the check only runs for packages listed in allow_list
+ deny_list: [] # the check is skipped for packages listed in deny_list
+```
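+
+For illustration, a minimal sketch (not the framework's actual loader; the package prefix is an assumption) of how the `module` and `entry` fields could be resolved into a check class:
+
+```python
+import importlib
+
+def load_entry(name, item):
+    # Apply the documented default: module = "<name>.check_<name>"; the entry class name comes from "entry".
+    module_path = item.get("module", "{}.check_{}".format(name, name))
+    module = importlib.import_module("src.ac.acl." + module_path)  # package prefix assumed
+    return getattr(module, item["entry"])
+```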
+
diff --git a/docbuild/check_build.png b/docbuild/check_build.png
new file mode 100644
index 0000000000000000000000000000000000000000..d02275c6d9b40f7f3ceda4b499c9875f68b5a203
Binary files /dev/null and b/docbuild/check_build.png differ
diff --git a/docbuild/check_build.xmind b/docbuild/check_build.xmind
new file mode 100644
index 0000000000000000000000000000000000000000..296ca1ec6d0e62dcf7d74665fb68d2dffac1ca6a
Binary files /dev/null and b/docbuild/check_build.xmind differ
diff --git "a/docbuild/npm\346\236\204\345\273\272\346\243\200\346\237\245.JPG" "b/docbuild/npm\346\236\204\345\273\272\346\243\200\346\237\245.JPG"
new file mode 100644
index 0000000000000000000000000000000000000000..0f49c079dfd0a931dd579f62712db1b71336f49f
Binary files /dev/null and "b/docbuild/npm\346\236\204\345\273\272\346\243\200\346\237\245.JPG" differ
diff --git a/docbuild/start.png b/docbuild/start.png
new file mode 100644
index 0000000000000000000000000000000000000000..e7376248f523c20ce32c10d7cd672aaaced6048c
Binary files /dev/null and b/docbuild/start.png differ
diff --git a/docbuild/start.xmind b/docbuild/start.xmind
new file mode 100644
index 0000000000000000000000000000000000000000..83c65cd46f52b370e94b83292dac40db8ce3aa19
Binary files /dev/null and b/docbuild/start.xmind differ
diff --git "a/docbuild/\346\236\204\345\273\272.png" "b/docbuild/\346\236\204\345\273\272.png"
new file mode 100644
index 0000000000000000000000000000000000000000..55833ea19910b36fc0d093bb09a1b98efff429e0
Binary files /dev/null and "b/docbuild/\346\236\204\345\273\272.png" differ
diff --git "a/docbuild/\347\273\223\346\236\234.JPG" "b/docbuild/\347\273\223\346\236\234.JPG"
new file mode 100644
index 0000000000000000000000000000000000000000..d4cfd6d04a2a113cbb93e7fcd15485663e5bf2ef
Binary files /dev/null and "b/docbuild/\347\273\223\346\236\234.JPG" differ
diff --git "a/docbuild/\350\276\223\345\205\245.png" "b/docbuild/\350\276\223\345\205\245.png"
new file mode 100644
index 0000000000000000000000000000000000000000..98be458c0b5b9f843be65c68aa00b4a319abb225
Binary files /dev/null and "b/docbuild/\350\276\223\345\205\245.png" differ
diff --git "a/docbuild/\351\235\231\346\200\201\346\243\200\346\237\245\351\241\271.JPG" "b/docbuild/\351\235\231\346\200\201\346\243\200\346\237\245\351\241\271.JPG"
new file mode 100644
index 0000000000000000000000000000000000000000..1532a14153aef9a81af781489a6fda4b3be36e3b
Binary files /dev/null and "b/docbuild/\351\235\231\346\200\201\346\243\200\346\237\245\351\241\271.JPG" differ
diff --git a/openeuler-ci-src/ci_projects_builders/Dockerfile b/openeuler-ci-src/ci_projects_builders/Dockerfile
new file mode 100644
index 0000000000000000000000000000000000000000..a30bc47f4f2bd207b36960fa0948975fa892bece
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/Dockerfile
@@ -0,0 +1,17 @@
+FROM openeuler/openeuler:21.03
+
+RUN yum update && \
+ yum install -y vim wget git xz tar make automake autoconf libtool gcc gcc-c++ kernel-devel libmaxminddb-devel pcre-devel openssl openssl-devel tzdata \
+ readline-devel libffi-devel python3-devel mariadb-devel python3-pip net-tools.x86_64 iputils
+
+RUN pip3 install uwsgi
+
+WORKDIR /work/ci_projects_builders
+
+COPY . /work/ci_projects_builders
+
+RUN pip3 install -r requirements.txt
+
+ENV LANG=en_US.UTF-8
+
+ENTRYPOINT ["uwsgi", "--ini", "/work/ci_projects_builders/deploy/production/uwsgi.ini"]
diff --git a/openeuler-ci-src/ci_projects_builders/README.md b/openeuler-ci-src/ci_projects_builders/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..3dc7a81d97b79a74722a49fd92397ae2ba94f5f8
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/README.md
@@ -0,0 +1,33 @@
+## Gate configuration for openEuler code repositories
+
+Currently all package repositories under the openEuler organization (repositories under src-openeuler) have a Pull Request gate configured, while code repositories (repositories under openeuler) have no Pull Request gate by default. This document describes how to configure a Pull Request gate for an openEuler code repository.
+
+### Account authorization
+The gates of all package and code repositories under the openEuler organization are hosted on Jenkins at https://openeulerjenkins.osinfra.cn . Access to the Jenkins projects is authorized through authing: click `Sign in with authing` and choose Gitee authorization. If the email bound to your Gitee account has been registered on authing and added to the openeuler-jenkins group (the authing configuration is done by submitting a Pull Request to https://gitee.com/openeuler/openeuler-jenkins ), you will be able to access the Jenkins projects after authorization.
+If you also need to view or modify the configuration of a project, you additionally need a Jenkins account. Contact the author for account registration and project permissions.
+
+### Submitting a Pull Request
+You can now have the gate projects for an openEuler code repository created on Jenkins automatically by submitting a single Pull Request. The Pull Request goes to https://gitee.com/openeuler/openeuler-jenkins , and the path inside the repository is openeuler-ci/{repo}.yaml, i.e. you create a yaml file named after the repository under the openeuler-ci directory. Taking openeuler/website as an example, the content looks like this:
+
+```
+repo_name: website
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: xxx
+ name: xyz
+ email: xxx@yyy.com
+ gitee_id: xxx
+```
+In the configuration file, repo_name is the name of the repository to configure; container_level is the memory/disk class of the container, where l1 is 2 cores/4 GB to 4 cores/8 GB and l2 is 4 cores/6 GB or above; init_shell is the initialization shell run by the x86-64 and aarch64 projects; users is a list in which each item configures one user. login_name is the Jenkins login account, name is the account's display name on Jenkins, and email is the Gitee-bound email authorized through authing.
+After the Pull Request is merged, a Gitee webhook triggers the automatic creation of the corresponding openEuler code-repository projects on Jenkins.
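+
+As a quick illustration of what the automation reads from such a file (a sketch, not the service's actual code):
+
+```python
+import yaml
+
+def read_ci_config(path):
+    # Load openeuler-ci/<repo>.yaml and return the fields used to create the Jenkins projects.
+    with open(path, 'r') as f:
+        config = yaml.safe_load(f)
+    assert config.get('container_level') in ('l1', 'l2'), 'container_level must be l1 or l2'
+    assert isinstance(config.get('users', []), list), 'users must be a list'
+    return config['repo_name'], config['container_level'], config.get('init_shell'), config.get('users', [])
+```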
+
+### Gate workflow
+The Pull Request gate of a code repository needs a project named after the repository in each of the four directories `multiarch/openeuler/trigger/`, `multiarch/openeuler/x86/`, `multiarch/openeuler/aarch64/` and `multiarch/openeuler/comment/` to cover triggering, building and commenting. The Pull Request flow described above creates exactly these four projects.
+
+Once the projects for an openEuler code repository exist on Jenkins and are configured, submitting a Pull Request to the repository, or commenting `/retest` on an existing Pull Request, triggers the project with the repository's name under `multiarch/openeuler/trigger/`. When that project finishes, it triggers the projects with the same name under `multiarch/openeuler/x86/` and `multiarch/openeuler/aarch64/` (or further architectures). When all of those projects have finished, they trigger the project with the same name under `multiarch/openeuler/comment/`, and finally the comment project calls the Gitee comment API to post the gate results under the PR.
+
+To customize the gate, you only need to edit the shell in the project configurations under `multiarch/openeuler/x86/`, `multiarch/openeuler/aarch64/` and the other architecture directories, but please avoid commands such as **chroot** in that shell.
+
+### Supporting other architectures
+The above covers the configuration of most projects. If you need more specific customization, for example running the project in a container of another architecture, contact the author and provide the corresponding Docker image or Dockerfile.
diff --git a/openeuler-ci-src/ci_projects_builders/ci_projects_builders/__init__.py b/openeuler-ci-src/ci_projects_builders/ci_projects_builders/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/openeuler-ci-src/ci_projects_builders/ci_projects_builders/settings.py b/openeuler-ci-src/ci_projects_builders/ci_projects_builders/settings.py
new file mode 100644
index 0000000000000000000000000000000000000000..0d6c2ab7c955963a56381e3ce84ebb9f948c9063
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/ci_projects_builders/settings.py
@@ -0,0 +1,233 @@
+"""
+Django settings for ci_projects_builders project.
+
+Generated by 'django-admin startproject' using Django 2.1.15.
+
+For more information on this file, see
+https://docs.djangoproject.com/en/2.1/topics/settings/
+
+For the full list of settings and their values, see
+https://docs.djangoproject.com/en/2.1/ref/settings/
+"""
+
+import datetime
+import os
+import time
+
+# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
+BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
+
+
+# Quick-start development settings - unsuitable for production
+# See https://docs.djangoproject.com/en/2.1/howto/deployment/checklist/
+
+# SECURITY WARNING: keep the secret key used in production secret!
+SECRET_KEY = os.getenv('SECRET_KEY', '')
+
+# SECURITY WARNING: don't run with debug turned on in production!
+DEBUG = False
+
+ALLOWED_HOSTS = ['*']
+
+
+# Application definition
+
+INSTALLED_APPS = [
+ 'django.contrib.admin',
+ 'django.contrib.auth',
+ 'django.contrib.contenttypes',
+ 'django.contrib.sessions',
+ 'django.contrib.messages',
+ 'django.contrib.staticfiles',
+    'corsheaders',  # required by the CorsMiddleware configured below
+    'projects_builders.apps.ProjectsBuildersConfig',
+    'rest_framework'
+]
+
+CORS_ALLOW_METHODS = (
+ 'GET',
+ 'POST',
+ 'PUT',
+ 'PATCH',
+ 'DELETE',
+ 'OPTIONS'
+)
+CORS_ALLOW_HEADERS = (
+ 'XMLHttpRequest',
+ 'X_FILENAME',
+ 'accept-encoding',
+ 'content-type',
+ 'Authorization',
+ 'dnt',
+ 'origin',
+ 'user-agent',
+ 'x-csrftoken',
+ 'x-requested-with',
+ 'Pragma',
+)
+CORS_ALLOW_CREDENTIALS = True
+
+CORS_ORIGIN_ALLOW_ALL = True
+
+MIDDLEWARE = [
+ 'django.middleware.security.SecurityMiddleware',
+ 'django.contrib.sessions.middleware.SessionMiddleware',
+ 'corsheaders.middleware.CorsMiddleware',
+ 'django.middleware.common.CommonMiddleware',
+ 'django.middleware.csrf.CsrfViewMiddleware',
+ 'django.contrib.auth.middleware.AuthenticationMiddleware',
+ 'django.contrib.messages.middleware.MessageMiddleware',
+ 'django.middleware.clickjacking.XFrameOptionsMiddleware',
+]
+
+ROOT_URLCONF = 'ci_projects_builders.urls'
+
+TEMPLATES = [
+ {
+ 'BACKEND': 'django.template.backends.django.DjangoTemplates',
+ 'DIRS': [],
+ 'APP_DIRS': True,
+ 'OPTIONS': {
+ 'context_processors': [
+ 'django.template.context_processors.debug',
+ 'django.template.context_processors.request',
+ 'django.contrib.auth.context_processors.auth',
+ 'django.contrib.messages.context_processors.messages',
+ ],
+ },
+ },
+]
+
+WSGI_APPLICATION = 'ci_projects_builders.wsgi.application'
+
+
+# Database
+# https://docs.djangoproject.com/en/2.1/ref/settings/#databases
+
+DATABASES = {
+ 'default': {
+ 'ENGINE': 'django.db.backends.sqlite3',
+ 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
+ }
+}
+
+
+# Password validation
+# https://docs.djangoproject.com/en/2.1/ref/settings/#auth-password-validators
+
+AUTH_PASSWORD_VALIDATORS = [
+ {
+ 'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
+ },
+ {
+ 'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
+ },
+ {
+ 'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
+ },
+ {
+ 'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
+ },
+]
+
+
+# Internationalization
+# https://docs.djangoproject.com/en/2.1/topics/i18n/
+
+LANGUAGE_CODE = 'en-us'
+
+TIME_ZONE = 'Asia/Shanghai'
+
+USE_I18N = True
+
+USE_L10N = True
+
+USE_TZ = False
+
+
+# Static files (CSS, JavaScript, Images)
+# https://docs.djangoproject.com/en/2.1/howto/static-files/
+
+STATIC_ROOT = os.path.join(BASE_DIR, 'static')
+
+STATIC_URL = '/static/'
+
+
+
+cur_path = os.path.dirname(os.path.realpath(__file__))  # log_path below is the directory where log files are stored
+
+log_path = os.path.join(os.path.dirname(cur_path), 'logs')
+
+if not os.path.exists(log_path):
+    os.mkdir(log_path)  # create the logs directory automatically if it does not exist
+
+LOGGING = {
+ 'version': 1,
+ 'disable_existing_loggers': True,
+ 'formatters': {
+        # log format
+ 'standard': {
+ 'format': '[%(asctime)s] [%(filename)s:%(lineno)d] [%(module)s:%(funcName)s] '
+ '[%(levelname)s]- %(message)s'},
+        'simple': {  # simple format
+ 'format': '%(levelname)s %(message)s'
+ },
+ },
+    # filters
+    'filters': {
+    },
+    # define how each handler processes logs
+    'handlers': {
+        # default handler, records all logs
+ 'default': {
+ 'level': 'INFO',
+ 'class': 'logging.handlers.RotatingFileHandler',
+ 'filename': os.path.join(log_path, 'all-{}.log'.format(time.strftime('%Y-%m-%d'))),
+            'maxBytes': 1024 * 1024 * 5,  # max file size
+            'backupCount': 5,  # number of backup files
+            'formatter': 'standard',  # output format
+            'encoding': 'utf-8',  # default encoding, otherwise Chinese characters in the logs are garbled
+ },
+        # error log output
+ 'error': {
+ 'level': 'ERROR',
+ 'class': 'logging.handlers.RotatingFileHandler',
+ 'filename': os.path.join(log_path, 'error-{}.log'.format(time.strftime('%Y-%m-%d'))),
+            'maxBytes': 1024 * 1024 * 5,  # max file size
+            'backupCount': 5,  # number of backup files
+            'formatter': 'standard',  # output format
+            'encoding': 'utf-8',  # default encoding
+ },
+        # console output
+ 'console': {
+ 'level': 'DEBUG',
+ 'class': 'logging.StreamHandler',
+ 'formatter': 'standard'
+ },
+        # info log output
+ 'info': {
+ 'level': 'INFO',
+ 'class': 'logging.handlers.RotatingFileHandler',
+ 'filename': os.path.join(log_path, 'info-{}.log'.format(time.strftime('%Y-%m-%d'))),
+ 'maxBytes': 1024 * 1024 * 5,
+ 'backupCount': 5,
+ 'formatter': 'standard',
+            'encoding': 'utf-8',  # default encoding
+ },
+ },
+    # configure which handlers each logger uses
+    'loggers': {
+        # the 'django' logger handles all Django logs and is used by default
+ 'django': {
+ 'handlers': ['default', 'console'],
+ 'level': 'INFO',
+ 'propagate': False
+ },
+        # the 'log' logger must be requested by name, e.g. logging.getLogger('log')
+ 'log': {
+ 'handlers': ['error', 'info', 'console', 'default'],
+ 'level': 'INFO',
+ 'propagate': True
+ },
+ }
+}
+
diff --git a/openeuler-ci-src/ci_projects_builders/ci_projects_builders/urls.py b/openeuler-ci-src/ci_projects_builders/ci_projects_builders/urls.py
new file mode 100644
index 0000000000000000000000000000000000000000..7501e5d2f570e174e5c5825b39f56959d219b7d7
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/ci_projects_builders/urls.py
@@ -0,0 +1,22 @@
+"""ci_projects_builders URL Configuration
+
+The `urlpatterns` list routes URLs to views. For more information please see:
+ https://docs.djangoproject.com/en/2.1/topics/http/urls/
+Examples:
+Function views
+ 1. Add an import: from my_app import views
+ 2. Add a URL to urlpatterns: path('', views.home, name='home')
+Class-based views
+ 1. Add an import: from other_app.views import Home
+ 2. Add a URL to urlpatterns: path('', Home.as_view(), name='home')
+Including another URLconf
+ 1. Import the include() function: from django.urls import include, path
+ 2. Add a URL to urlpatterns: path('blog/', include('blog.urls'))
+"""
+from django.contrib import admin
+from django.urls import path, include
+
+urlpatterns = [
+ path('admin/', admin.site.urls),
+ path('', include('projects_builders.urls'))
+]
diff --git a/openeuler-ci-src/ci_projects_builders/ci_projects_builders/wsgi.py b/openeuler-ci-src/ci_projects_builders/ci_projects_builders/wsgi.py
new file mode 100644
index 0000000000000000000000000000000000000000..3b836360f04f2f9bbe2a42182cfbe036c9dea4cb
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/ci_projects_builders/wsgi.py
@@ -0,0 +1,16 @@
+"""
+WSGI config for ci_projects_builders project.
+
+It exposes the WSGI callable as a module-level variable named ``application``.
+
+For more information on this file, see
+https://docs.djangoproject.com/en/2.1/howto/deployment/wsgi/
+"""
+
+import os
+
+from django.core.wsgi import get_wsgi_application
+
+os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ci_projects_builders.settings')
+
+application = get_wsgi_application()
diff --git a/openeuler-ci-src/ci_projects_builders/deploy/production/uwsgi.ini b/openeuler-ci-src/ci_projects_builders/deploy/production/uwsgi.ini
new file mode 100644
index 0000000000000000000000000000000000000000..369996bb24ee44b04e430697c413d10a88336da8
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/deploy/production/uwsgi.ini
@@ -0,0 +1,16 @@
+[uwsgi]
+env=DJANGO_SETTINGS_MODULE=ci_projects_builders.settings
+chdir=/work/ci_projects_builders
+module=ci_projects_builders.wsgi:application
+socket=/work/ci_projects_builders/uwsgi.sock
+workers=5
+pidfile=/work/ci_projects_builders/uwsgi.pid
+http=0.0.0.0:80
+uid=root
+gid=root
+master=true
+vacuum=true
+thunder-lock=true
+enable-threads=true
+harakiri=30
+post-buffering=4096
diff --git a/openeuler-ci-src/ci_projects_builders/manage.py b/openeuler-ci-src/ci_projects_builders/manage.py
new file mode 100644
index 0000000000000000000000000000000000000000..1fb9f54385075a075bf7b34733cadcc956085335
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/manage.py
@@ -0,0 +1,15 @@
+#!/usr/bin/env python
+import os
+import sys
+
+if __name__ == '__main__':
+ os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ci_projects_builders.settings')
+ try:
+ from django.core.management import execute_from_command_line
+ except ImportError as exc:
+ raise ImportError(
+ "Couldn't import Django. Are you sure it's installed and "
+ "available on your PYTHONPATH environment variable? Did you "
+ "forget to activate a virtual environment?"
+ ) from exc
+ execute_from_command_line(sys.argv)
diff --git a/openeuler-ci-src/ci_projects_builders/projects_builders/__init__.py b/openeuler-ci-src/ci_projects_builders/projects_builders/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/openeuler-ci-src/ci_projects_builders/projects_builders/admin.py b/openeuler-ci-src/ci_projects_builders/projects_builders/admin.py
new file mode 100644
index 0000000000000000000000000000000000000000..8c38f3f3dad51e4585f3984282c2a4bec5349c1e
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/projects_builders/admin.py
@@ -0,0 +1,3 @@
+from django.contrib import admin
+
+# Register your models here.
diff --git a/openeuler-ci-src/ci_projects_builders/projects_builders/apps.py b/openeuler-ci-src/ci_projects_builders/projects_builders/apps.py
new file mode 100644
index 0000000000000000000000000000000000000000..04ff0aaa3bf8bddcb7343d04b2f158d839fcd355
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/projects_builders/apps.py
@@ -0,0 +1,5 @@
+from django.apps import AppConfig
+
+
+class ProjectsBuildersConfig(AppConfig):
+ name = 'projects_builders'
diff --git a/openeuler-ci-src/ci_projects_builders/projects_builders/migrations/__init__.py b/openeuler-ci-src/ci_projects_builders/projects_builders/migrations/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/openeuler-ci-src/ci_projects_builders/projects_builders/models.py b/openeuler-ci-src/ci_projects_builders/projects_builders/models.py
new file mode 100644
index 0000000000000000000000000000000000000000..71a836239075aa6e6e4ecb700e9c42c95c022d91
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/projects_builders/models.py
@@ -0,0 +1,3 @@
+from django.db import models
+
+# Create your models here.
diff --git a/openeuler-ci-src/ci_projects_builders/projects_builders/permissions.py b/openeuler-ci-src/ci_projects_builders/projects_builders/permissions.py
new file mode 100644
index 0000000000000000000000000000000000000000..de262dca06df3dbb63b3eb361b81c8fb809f8961
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/projects_builders/permissions.py
@@ -0,0 +1,10 @@
+from rest_framework import permissions
+
+
+class HookPermission(permissions.BasePermission):
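+    # Only accept Gitee webhook payloads whose hook_name/action indicate a merged Pull Request.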
+ def has_permission(self, request, view):
+ data = request.data
+ hook_name = data.get('hook_name', '')
+ action = data.get('action', '')
+        if hook_name == 'merge_request_hooks' and action == 'merge':
+            return True
+        return False
diff --git a/openeuler-ci-src/ci_projects_builders/projects_builders/urls.py b/openeuler-ci-src/ci_projects_builders/projects_builders/urls.py
new file mode 100644
index 0000000000000000000000000000000000000000..bf0df5d8520b975e139cc471450faed4e8537ad1
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/projects_builders/urls.py
@@ -0,0 +1,7 @@
+from django.urls import path
+from projects_builders.views import HookView
+
+
+urlpatterns = [
+ path('hooks/', HookView.as_view()),
+]
diff --git a/openeuler-ci-src/ci_projects_builders/projects_builders/views.py b/openeuler-ci-src/ci_projects_builders/projects_builders/views.py
new file mode 100644
index 0000000000000000000000000000000000000000..e5697d97bcc8a8b638105e8757e1fc4693dd70f8
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/projects_builders/views.py
@@ -0,0 +1,185 @@
+import logging
+import jenkins
+import os
+import random
+import requests
+import subprocess
+import sys
+import time
+import yaml
+from django.http import JsonResponse
+from multiprocessing import Process
+from rest_framework.generics import GenericAPIView
+from projects_builders.permissions import HookPermission
+from utils.authing import get_token, search_member, create_authing_user, add_member
+from utils.jenkins import JenkinsLib, add_user_permissions, config_image_level, config_init_shell
+from utils.send_login_info import sendmail
+
+
+logger = logging.getLogger('log')
+BASEURL = os.getenv('BASEURL', '')
+JENKINS_URL = os.getenv('JENKINS_URL', '')
+JENKINS_USERNAME = os.getenv('JENKINS_USERNAME', '')
+JENKINS_PASSWORD = os.getenv('JENKINS_PASSWORD', '')
+AUTHING_USERID = os.getenv('AUTHING_USERID', '')
+AUTHING_SECRET = os.getenv('AUTHING_SECRET', '')
+if not (JENKINS_URL and JENKINS_USERNAME and JENKINS_PASSWORD and AUTHING_USERID and AUTHING_SECRET and BASEURL):
+ logger.error('Please check environment variables, exit...')
+ sys.exit(1)
+
+
+def get_diff_files(organization, repo, number):
+ url = 'https://gitee.com/{}/{}/pulls/{}.diff'.format(organization, repo, number)
+ r = requests.get(url)
+ if r.status_code != 200:
+ logger.error('Error! Cannot locate difference file of Pull Request. status code: {}'.format(r.status_code))
+ sys.exit(1)
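+    # Each entry after 'diff --git ' begins with "a/<path> b/<path>"; keep the first token and strip the leading "a/".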
+ diff_files = [x.split(' ')[0][2:] for x in r.text.split('diff --git ')[1:]]
+ return diff_files
+
+
+def conn_jenkins(url, username, password):
+ """
+ Connect to jenkins server
+ :return: jenkins server
+ """
+ server = jenkins.Jenkins(url=url, username=username, password=password, timeout=120)
+ return server
+
+
+def base_log(pr_url, hook_name, action):
+ logger.info('URL of Pull Request: {}'.format(pr_url))
+ logger.info('Hook Name: {}'.format(hook_name))
+ logger.info('Action: {}'.format(action))
+
+
+def pull_code(owner, repo):
+ if repo in os.listdir():
+ subprocess.call('rm -rf {}'.format(repo), shell=True)
+ subprocess.call('git clone https://gitee.com/{}/{}.git'.format(owner, repo), shell=True)
+
+
+def run(owner, repo, number):
+ logger.info('Step 1: Get list of repos ready to build projects')
+    # Get the list of changed file names
+ diff_files = get_diff_files(owner, repo, number)
+
+    # Collect the paths of new yaml files under openeuler-ci/
+ waiting_repos = []
+ for diff_file in diff_files:
+ file_path = os.path.join(repo, diff_file)
+ if len(diff_file.split('/')) == 2 and \
+ diff_file.split('/')[0] == 'openeuler-ci' and \
+ diff_file.split('/')[-1].endswith('.yaml'):
+ waiting_repos.append(file_path)
+ if not waiting_repos:
+ logger.info('Notice there is no repo needs to build jenkins projects, exit...')
+ return
+
+ logger.info('Step 2: Pull code')
+    # Clone the target repository
+ pull_code(owner, repo)
+
+ logger.info('Step 3: Build Jenkins projects')
+    # Connect to the Jenkins server
+ server = conn_jenkins(JENKINS_URL, JENKINS_USERNAME, JENKINS_PASSWORD)
+
+ for waiting_repo in waiting_repos:
+        # Read the key fields from the yaml file
+ f = open(waiting_repo, 'r')
+ ci_config = yaml.safe_load(f)
+ f.close()
+ repo_name = ci_config.get('repo_name')
+ init_shell = ci_config.get('init_shell')
+ container_level = ci_config.get('container_level')
+ users = ci_config.get('users')
+ parameters = {
+ 'action': 'create',
+ 'template': 'openeuler-jenkins',
+ 'jobs': repo_name,
+ 'repo_server': 'repo-service.dailybuild'
+ }
+        # Create the Jenkins projects
+ server.build_job(name='multiarch/openeuler/jobs-crud/_entry', parameters=parameters)
+ logger.info('Build Jenkins projects for openeuler/{}'.format(repo_name))
+
+        # Check whether the projects were created successfully
+ x86_project = 'multiarch/openeuler/x86-64/{}'.format(repo_name)
+ aarch64_project = 'multiarch/openeuler/aarch64/{}'.format(repo_name)
+ projects_created = False
+ retest = 5
+ while retest > 0:
+ time.sleep(60)
+ if server.get_job_name(x86_project) == repo_name and server.get_job_name(aarch64_project) == repo_name:
+ logger.info('Notice projects x86-64 & aarch64 for {} had been created.'.format(repo_name))
+ projects_created = True
+ break
+ retest -= 1
+ if not projects_created:
+ logger.info('Fail to create all projects for repo {}'.format(repo_name))
+ continue
+
+ if not users:
+ logger.info('Notice no users need to create and config, continue...')
+ continue
+ if not isinstance(users, list):
+ logger.error('ERROR! The field `users` must be a List Type, continue...')
+ continue
+ for user in users:
+ if not isinstance(user, dict):
+ logger.error('ERROR! The user must be a Dict Type, which content is :\n{}'.format(user))
+ continue
+ try:
+ login_name = user['login_name']
+ name = user.get('name')
+ email_addr = user['email']
+ except KeyError:
+ logger.error('ERROR! A user must declare its username and email_address.')
+ continue
+            # Authing authorization and group assignment
+ logger.info('Create Authing users and add users to group')
+ authing_token = get_token(AUTHING_USERID, AUTHING_SECRET)
+ member_id = create_authing_user(authing_token, AUTHING_USERID, email_addr)
+ add_member(authing_token, AUTHING_USERID, 'openeuler-jenkins', member_id)
+            # Create the Jenkins user, send the login email and grant permissions
+ logger.info('Create and config Jenkins users.')
+ login_password = str(random.randint(100000, 999999))
+ jenkinslib = JenkinsLib(BASEURL, JENKINS_USERNAME, JENKINS_PASSWORD, useCrumb=True, timeout=180)
+ res = jenkinslib.create_user(login_name, login_password, name, email_addr)
+ if res.status_code == 200:
+ sendmail(login_name, login_password, email_addr)
+            # Grant project permissions to the users
+ user_list = [user['login_name'] for user in users]
+ add_user_permissions(server, user_list, x86_project)
+ add_user_permissions(server, user_list, aarch64_project)
+        # Update the assigned node of the x86-64 and aarch64 projects
+ logger.info('Config container level')
+ with open('utils/container_level_mapping.yaml', 'r') as f:
+ container_level_mapping = yaml.safe_load(f)
+ x86_node = container_level_mapping.get('x86').get(container_level)
+ aarch64_node = container_level_mapping.get('aarch64').get(container_level)
+ config_image_level(server, x86_node, x86_project)
+ config_image_level(server, aarch64_node, aarch64_project)
+        # Update the init shell of the x86-64 and aarch64 projects
+ logger.info('Config init shell')
+ config_init_shell(server, init_shell, x86_project)
+ config_init_shell(server, init_shell, aarch64_project)
+ logger.info('Finish dealing with the Merge Hook, waiting next Merge Hook.')
+
+
+class HookView(GenericAPIView):
+ permission_classes = (HookPermission,)
+
+ def post(self, request, *args, **kwargs):
+ data = self.request.data
+ hook_name = data['hook_name']
+ action = data['action']
+ pr_url = data['pull_request']['html_url']
+ base_log(pr_url, hook_name, action)
+ owner = pr_url.split('/')[-4]
+ repo = pr_url.split('/')[-3]
+ number = pr_url.split('/')[-1]
+ p1 = Process(target=run, args=(owner, repo, number))
+ p1.start()
+ return JsonResponse({'code': 200, 'msg': 'OK'})
+
diff --git a/openeuler-ci-src/ci_projects_builders/requirements.txt b/openeuler-ci-src/ci_projects_builders/requirements.txt
new file mode 100644
index 0000000000000000000000000000000000000000..7fc270db002adc243de741a899d4b0566279d203
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/requirements.txt
@@ -0,0 +1,7 @@
+Django==2.1.15
+django-cors-headers==3.2.0
+djangorestframework==3.11.0
+jenkinsapi==0.3.11
+requests==2.27.1
+python-jenkins==1.7.0
+PyYAML==5.3.1
diff --git a/openeuler-ci-src/ci_projects_builders/utils/authing.py b/openeuler-ci-src/ci_projects_builders/utils/authing.py
new file mode 100644
index 0000000000000000000000000000000000000000..e1b16d6fd5e6ec8bd6ba9d09676024521a96c17c
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/utils/authing.py
@@ -0,0 +1,87 @@
+import logging
+import json
+import requests
+
+
+logger = logging.getLogger('log')
+
+
+def get_token(userId, secret):
+ url = 'https://core.authing.cn/api/v2/userpools/access-token'
+ headers = {
+ 'Content-Type': 'application/json'
+ }
+ payload = {
+ 'userPoolId': userId,
+ 'secret': secret
+ }
+ r = requests.post(url, headers=headers, data=json.dumps(payload))
+ if r.status_code != 201:
+ logger.info('Fail to get authing access token, please check userId and secret.')
+ return None
+ logger.info('Success to get authing access token.')
+ return r.json()['accessToken']
+
+
+def search_member(token, userId, email_address):
+ url = 'https://core.authing.cn/api/v2/users/search'
+ headers = {
+ 'Content-Type': 'application/json',
+ 'Authorization': 'Bearer {}'.format(token),
+ 'x-authing-userpool-id': userId
+ }
+ params = {
+ 'query': email_address
+ }
+ r = requests.get(url, headers=headers, params=params)
+ if r.status_code == 200 and r.json()['data']['totalCount'] > 0:
+ for item in r.json()['data']['list']:
+ if item['email'] == email_address:
+ logger.info('Success to search member {} whose member_id is {}.'.format(email_address, item['id']))
+ return item['id']
+ else:
+ logger.error('ERROR! Cannot search member {}.'.format(email_address))
+ return
+
+
+def create_authing_user(token, userId, email_address):
+ url = 'https://core.authing.cn/api/v2/users'
+ headers = {
+ 'Content-Type': 'application/json',
+ 'Authorization': 'Bearer {}'.format(token),
+ 'x-authing-userpool-id': userId
+ }
+ payload = {
+ 'userInfo': {
+ 'email': email_address
+ }
+ }
+ r = requests.post(url, headers=headers, data=json.dumps(payload))
+ if r.json()['code'] == 200:
+ member_id = r.json()['data']['id']
+ logger.info('Success to create authing user for {} whose member_id is {}.'.format(email_address, member_id))
+ return member_id
+ elif r.json()['code'] == 2026:
+ member_id = search_member(token, userId, email_address)
+ logger.info('The user {} still exists and its member_id is {}.'.format(email_address, member_id))
+ return member_id
+ else:
+ logger.error('ERROR! Fail to create member, the status code is {}.'.format(r.json()['code']))
+ return
+
+
+def add_member(token, userId, group, member_id):
+ url = 'https://core.authing.cn/api/v2/groups/{}/add-users'.format(group)
+ headers = {
+ 'Content-Type': 'application/json',
+ 'Authorization': 'Bearer {}'.format(token),
+ 'x-authing-userpool-id': userId
+ }
+ payload = {
+ 'userIds': [str(member_id)]
+ }
+ r = requests.post(url, headers=headers, data=json.dumps(payload))
+ if r.status_code != 201:
+ logger.error('ERROR! Fail to add member to group {} whose member_id is {}.'.format(group, member_id))
+ else:
+ logger.info('Success to add member to group {} whose member_id is {}.'.format(group, member_id))
diff --git a/openeuler-ci-src/ci_projects_builders/utils/container_level_mapping.yaml b/openeuler-ci-src/ci_projects_builders/utils/container_level_mapping.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..5e001f9bc966a4257ec3ff9ec29906f1e4560d2c
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/utils/container_level_mapping.yaml
@@ -0,0 +1,8 @@
+x86:
+ l1: k8s-x86-oe-level1
+ l2: k8s-x86-oe-level2
+
+aarch64:
+ l1: k8s-aarch64-openeuler-level1
+ l2: k8s-aarch64-openeuler-level2
+
diff --git a/openeuler-ci-src/ci_projects_builders/utils/jenkins.py b/openeuler-ci-src/ci_projects_builders/utils/jenkins.py
new file mode 100644
index 0000000000000000000000000000000000000000..d9ea3ea7a4953d2e4dca37a3a57040194a4d509b
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/utils/jenkins.py
@@ -0,0 +1,91 @@
+from jenkinsapi import jenkins as jenkins_api
+
+
+class JenkinsLib(jenkins_api.Jenkins):
+ def __init__(self, *args, **kwargs):
+ super(JenkinsLib, self).__init__(*args, **kwargs)
+
+ def create_user(self, username, password, fullname, email):
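+        # Post the same form that the Jenkins "create user by admin" page submits ($redact marks the password fields).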
+ body = {
+ "username": username,
+ "$redact": ["password1", "password2"],
+ "password1": password,
+ "password2": password,
+ "fullname": fullname,
+ "email": email}
+        url = "%s/securityRealm/createAccountByAdmin" % self.baseurl
+ valid = self.requester.VALID_STATUS_CODES + [302, ]
+ resp = self.requester.post_and_confirm_status(url, data=body,
+ valid=valid)
+ return resp
+
+
+def add_user_permissions(server, users, project):
+    permission_names = [
+        'com.cloudbees.plugins.credentials.CredentialsProvider.Create',
+        'com.cloudbees.plugins.credentials.CredentialsProvider.Delete',
+        'com.cloudbees.plugins.credentials.CredentialsProvider.ManageDomains',
+        'com.cloudbees.plugins.credentials.CredentialsProvider.Update',
+        'com.cloudbees.plugins.credentials.CredentialsProvider.View',
+        'hudson.model.Item.Build',
+        'hudson.model.Item.Cancel',
+        'hudson.model.Item.Configure',
+        'hudson.model.Item.Delete',
+        'hudson.model.Item.Discover',
+        'hudson.model.Item.Move',
+        'hudson.model.Item.Read',
+        'hudson.model.Item.Workspace',
+        'hudson.model.Run.Delete',
+        'hudson.model.Run.Update',
+        'hudson.scm.SCM.Tag',
+    ]
+    # Build one <permission>NAME:user</permission> entry per permission and user.
+    users_permissions = ''
+    for permission in permission_names:
+        for user in users:
+            users_permissions += '    <permission>{}:{}</permission>\n'.format(permission, user)
+    conf = server.get_job_config(project)  # e.g. multiarch/openeuler/trigger/kernel
+    # Insert the permission entries right after the matrix authorization property opens in the job config.xml.
+    newconf = conf.replace(
+        '<hudson.security.AuthorizationMatrixProperty>\n',
+        '<hudson.security.AuthorizationMatrixProperty>\n' + users_permissions)
+    server.reconfig_job(project, newconf)
+
+
+def config_image_level(server, node, project):
+    arch = project.split('/')[2]
+    conf = server.get_job_config(project)
+    newconf = None
+    # Replace the default <assignedNode> label in the job config.xml with the requested container level.
+    if arch == 'x86-64':
+        newconf = conf.replace('<assignedNode>k8s-x86-openeuler-20.03-lts-sp1</assignedNode>\n',
+                               '<assignedNode>{}</assignedNode>\n'.format(node))
+    elif arch == 'aarch64':
+        newconf = conf.replace('<assignedNode>k8s-aarch64-openeuler-20.09</assignedNode>\n',
+                               '<assignedNode>{}</assignedNode>\n'.format(node))
+    server.reconfig_job(project, newconf)
+
+
+def config_init_shell(server, init_shell, project):
+    conf = server.get_job_config(project)
+    # Write the init shell into the <command> element of the job config.xml, which may be serialized
+    # either as an empty tag pair or as a self-closing tag.
+    if '<command></command>' in conf:
+        newconf = conf.replace('<command></command>', '<command>{}</command>'.format(init_shell))
+    else:
+        newconf = conf.replace('<command/>', '<command>{}</command>'.format(init_shell))
+    server.reconfig_job(project, newconf)
diff --git a/openeuler-ci-src/ci_projects_builders/utils/send_login_info.py b/openeuler-ci-src/ci_projects_builders/utils/send_login_info.py
new file mode 100644
index 0000000000000000000000000000000000000000..584acb2dc789a0e2515db0bc932bfafe7f9d717e
--- /dev/null
+++ b/openeuler-ci-src/ci_projects_builders/utils/send_login_info.py
@@ -0,0 +1,39 @@
+import os
+import smtplib
+from email.mime.text import MIMEText
+
+
+def msg_body(account, password):
+ body = """
+ 你好,如果你已有该Jenkins账号,请忽略这封邮件。以下是为你自动创建的Jenkins账号,密码为随机生成,请在首次登录后更改密码
+
+ 账号: {0}
+ 密码: {1}
+
+
+ Hi, ignore the message if you had the Jenkins account. The following is the Jenkins account automatically created
+ for you. The password is randomly generated. Please change the password after the first login.
+
+ Account: {0}
+ Password: {1}
+ """.format(account, int(password))
+ return body
+
+
+def sendmail(account, password, receiver):
+ smtp_server_host = os.getenv('SMTP_SERVER_HOST', '')
+ smtp_server_port = os.getenv('SMTP_SERVER_POST', '')
+ smtp_server_user = os.getenv('SMTP_SERVER_USER', '')
+ smtp_server_pass = os.getenv('SMTP_SERVER_PASS', '')
+
+ body = msg_body(account, password)
+ msg = MIMEText(body, 'plain', 'utf-8')
+ msg['Subject'] = 'openEuler门禁工程Jenkins账号配置'
+ msg['From'] = '{}'.format(smtp_server_user)
+ msg['To'] = receiver
+
+ server = smtplib.SMTP(smtp_server_host, smtp_server_port)
+ server.ehlo()
+ server.starttls()
+ server.login(smtp_server_user, smtp_server_pass)
+ server.sendmail(smtp_server_user, receiver, msg.as_string())
diff --git a/openeuler-ci/BiShengCLanguage.yaml b/openeuler-ci/BiShengCLanguage.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..651763de297d3435ee8eb91bb30e7fdd1ef081ad
--- /dev/null
+++ b/openeuler-ci/BiShengCLanguage.yaml
@@ -0,0 +1,8 @@
+repo_name: BiShengCLanguage
+container_level: l2
+init_shell: "echo Hello BiShengC\necho $?"
+users:
+ - login_name: EdwardWang
+ name: EdwardWang
+ email: wangyantao4@huawei.com
+ gitee_id: plt42
diff --git a/openeuler-ci/Kmesh.yaml b/openeuler-ci/Kmesh.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..de7f94603aa9f7b1124094f65bcb7712c935a5e1
--- /dev/null
+++ b/openeuler-ci/Kmesh.yaml
@@ -0,0 +1,8 @@
+repo_name: Kmesh
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: kongweibin
+ name: kongweibin
+ email: kwb0523@163.com
+ gitee_id: kwb0523
diff --git a/openeuler-ci/aops-apollo.yaml b/openeuler-ci/aops-apollo.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..513b176269cf5ded2f1169126d75e9e1558052ef
--- /dev/null
+++ b/openeuler-ci/aops-apollo.yaml
@@ -0,0 +1,8 @@
+repo_name: aops-apollo
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/aops-ceres.yaml b/openeuler-ci/aops-ceres.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..68d6a1aa00e692db2ecc7f5f8192ff7ef6ef9858
--- /dev/null
+++ b/openeuler-ci/aops-ceres.yaml
@@ -0,0 +1,8 @@
+repo_name: aops-ceres
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/aops-diana.yaml b/openeuler-ci/aops-diana.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..0bbd71877ec6bcd44a9d408c556744d42f04ab01
--- /dev/null
+++ b/openeuler-ci/aops-diana.yaml
@@ -0,0 +1,8 @@
+repo_name: aops-diana
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/aops-hermes.yaml b/openeuler-ci/aops-hermes.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..cdfeb639d79fd58b74e1ee1915c1ba2c286635f9
--- /dev/null
+++ b/openeuler-ci/aops-hermes.yaml
@@ -0,0 +1,8 @@
+repo_name: aops-hermes
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/aops-vulcanus.yaml b/openeuler-ci/aops-vulcanus.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..c91202e66724361db94d99447556e680c4a3159f
--- /dev/null
+++ b/openeuler-ci/aops-vulcanus.yaml
@@ -0,0 +1,8 @@
+repo_name: aops-vulcanus
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/aops-zeus.yaml b/openeuler-ci/aops-zeus.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..2fa219be42e9a6a2f0d77d22c8220b02118dcfe3
--- /dev/null
+++ b/openeuler-ci/aops-zeus.yaml
@@ -0,0 +1,8 @@
+repo_name: aops-zeus
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/bishengjdk-build.yaml b/openeuler-ci/bishengjdk-build.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..4f953c92805af242324dbfd31d81baee001832c2
--- /dev/null
+++ b/openeuler-ci/bishengjdk-build.yaml
@@ -0,0 +1,8 @@
+repo_name: bishengjdk-build
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: wanghao
+ name: wanghao
+ email: wanghao564@huawei.com
+ gitee_id: wanghao_hw
\ No newline at end of file
diff --git a/openeuler-ci/gala-anteater.yaml b/openeuler-ci/gala-anteater.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..e9deaed7d5a6ec32dea419ee6bc2178e899a1ec0
--- /dev/null
+++ b/openeuler-ci/gala-anteater.yaml
@@ -0,0 +1,8 @@
+repo_name: gala-anteater
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/gala-gopher.yaml b/openeuler-ci/gala-gopher.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..b016bc4842f7ecd655cf9e7953ba68adc857a762
--- /dev/null
+++ b/openeuler-ci/gala-gopher.yaml
@@ -0,0 +1,8 @@
+repo_name: gala-gopher
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/gala-ragdoll.yaml b/openeuler-ci/gala-ragdoll.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..1ebebe3e37fc9853c0b234a576e835a65c5007be
--- /dev/null
+++ b/openeuler-ci/gala-ragdoll.yaml
@@ -0,0 +1,8 @@
+repo_name: gala-ragdoll
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/gala-spider.yaml b/openeuler-ci/gala-spider.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..a058ab82c73dda2b48a3c73db3fc10890bccd692
--- /dev/null
+++ b/openeuler-ci/gala-spider.yaml
@@ -0,0 +1,8 @@
+repo_name: gala-spider
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: dowzyx
+ name: dowzyx
+ email: zhaoyuxing2@huawei.com
+ gitee_id: dowzyx
diff --git a/openeuler-ci/hikptool.yaml b/openeuler-ci/hikptool.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..851b30a916eeea41a8f1befd27d17b09e50cf451
--- /dev/null
+++ b/openeuler-ci/hikptool.yaml
@@ -0,0 +1,8 @@
+repo_name: hikptool
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: veega2022
+ name: veega2022
+ email: zhuweijia@huawei.com
+ gitee_id: veega2022
diff --git a/openeuler-ci/llvm-project.yaml b/openeuler-ci/llvm-project.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..132af8efa0f21ac852e9957457fbbd906848aaa3
--- /dev/null
+++ b/openeuler-ci/llvm-project.yaml
@@ -0,0 +1,8 @@
+repo_name: llvm-project
+container_level: l2
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: xin-yinglin
+ name: xin-yinglin
+ email: xinyinglin@huawei.com
+ gitee_id: xin-yinglin
\ No newline at end of file
diff --git a/openeuler-ci/oec-application.yaml b/openeuler-ci/oec-application.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..fbdadd34a5351ee03de4d24672373f5fb7c7b681
--- /dev/null
+++ b/openeuler-ci/oec-application.yaml
@@ -0,0 +1,8 @@
+repo_name: oec-application
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: meitingli
+ name: meitingli
+ email: bubble_mt@outlook.com
+ gitee_id: meitingli
diff --git a/openeuler-ci/re2-rust.yaml b/openeuler-ci/re2-rust.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..3b7acdb385bd300e3f3b49ba200800f8c2f8a7a2
--- /dev/null
+++ b/openeuler-ci/re2-rust.yaml
@@ -0,0 +1,8 @@
+repo_name: re2-rust
+container_level: l1
+init_shell: "echo re2-rust\necho $?"
+users:
+ - login_name: zhaowei
+ name: zhaowei
+ email: zhaowei23@huawei.com
+ gitee_id: seuzw
diff --git a/openeuler-ci/skylark.yaml b/openeuler-ci/skylark.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..68c3d7e9483cdfe80e8099630d1c18e6d1f7237a
--- /dev/null
+++ b/openeuler-ci/skylark.yaml
@@ -0,0 +1,8 @@
+repo_name: skylark
+container_level: l1
+init_shell: "echo hello\necho $?"
+users:
+ - login_name: yezengruan
+ name: yezengruan
+ email: yezengruan@huawei.com
+ gitee_id: yezengruan
diff --git a/src/README.md b/src/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..f371a1e68eac3059247fba7a2f00132c76c67b4b
--- /dev/null
+++ b/src/README.md
@@ -0,0 +1,77 @@
+# 基于K8s集群的打包方案
+
+## 单包构建任务
+
+### 设计逻辑
+
+- 部署x86-64和aarch64架构下的k8s集群
+- 将集群配置为**Jenkins slave**
+- **Jenkins master** 运行在x86-64架构k8s集群内
+
+### 流水线任务
+
+> 相同任务只运行一个实例
+
+#### trigger
+
+- 码云触发
+- 并行跑门禁任务,cpu架构不限,失败则中止任务并对pr评论
+- 成功传递参数给下游 **job**
+ - 项目名(**repo**)
+ - 分支(**branch**)
+ - pull request id(**prid**)
+ - 发起者(**committer**)
+
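+下游job读取这些参数的一个示意(仅为草图:假设Jenkins以环境变量形式注入参数,键名与上文一致):
+
+```python
+# 示意:下游job中读取trigger传入的参数(环境变量注入方式为假设)
+import os
+
+repo = os.environ.get("repo")
+branch = os.environ.get("branch")
+prid = os.environ.get("prid")
+committer = os.environ.get("committer")
+```
+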
+#### multiarch
+
+- 支持x86_64和aarch64架构
+- trigger成功后触发
+- 执行[**python osc_build_k8s.py $repo $arch $WORKSPACE**](https://gitee.com/src-openeuler/ci_check/blob/k8s/private_build/build/osc_build_k8s.py)进行构建
+
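+该调用的一个最小示意(仅为草图:脚本取自上面链接所在目录,repo/arch/workspace取值为假设,实际由上游trigger传入):
+
+```python
+# 示意:multiarch job中调用单包构建脚本(参数取值为假设)
+import subprocess
+
+repo, arch, workspace = "oec-application", "aarch64", "/home/jenkins/workspace"
+subprocess.check_call(["python", "osc_build_k8s.py", repo, arch, workspace])
+```
+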
+#### comment
+
+- 收集门禁、build结果
+- 调用接口[**提交Pull Request评论**](https://gitee.com/wuyu15255872976/gitee-python-client/blob/master/docs/PullRequestsApi.md#post_v5_repos_owner_repo_pulls)反馈结果给码云
+- cpu架构不限
+
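+调用上述评论接口的一个示意(草图:owner、repo、token均为占位,接口细节以上面链接的文档为准):
+
+```python
+# 示意:向PR提交评论,反馈门禁/构建结果(owner/repo/token为占位)
+import requests
+
+def comment_pr(owner, repo, prid, token, body):
+    url = "https://gitee.com/api/v5/repos/{}/{}/pulls/{}/comments".format(owner, repo, prid)
+    resp = requests.post(url, json={"access_token": token, "body": body}, timeout=30)
+    return resp.ok
+```
+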
+## 制作jenkins/obs镜像
+
+### 机制
+
+- k8s集群中部署docker service 服务,对外提供的内部服务地址为tcp://docker.jenkins:2376
+- jenkins安装docker插件,并配置连接到k8s集群docker service服务
+- jenkins中配置制作镜像流水线任务obs-image
+- 触发方式:代码仓库ci_check打tag后手动触发,jenkins需安装build with parameters插件支持
+
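+验证该docker service地址可达的一个示意(草图:假设使用docker的Python SDK,且该端口未启用TLS;若启用TLS需额外传入tls参数):
+
+```python
+# 示意:连接集群内docker service并探活(TLS配置为假设)
+import docker
+
+client = docker.DockerClient(base_url="tcp://docker.jenkins:2376")
+print(client.ping())  # True表示服务可达
+```
+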
+### 流水线任务obs-image
+
+> 运行该任务的K8s agent需带docker client
+
+#### 任务:_trigger
+
+- 检查Dockerfile文件【optional】
+- 设置参数 【环境变量?】
+ - name 【jenkins/obs】
+ - version 【取自tag】
+
+#### 任务:build-image-aarch64 & build-image-x86-64
+
+- 构建过程选择 **Build/Publish Docker Image**
+- 配置推送镜像的 **Registry Credentials**
+
+#### 任务:manifest
+
+多arch支持
+> docker manifest push时Registry Credentials?
+
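+多arch聚合的一个示意(草图:镜像名与tag为占位,推送前需按上文配置Registry Credentials并完成docker login):
+
+```python
+# 示意:用docker manifest把x86-64与aarch64镜像聚合为同一个tag(镜像名为占位)
+import subprocess
+
+image, tag = "registry.example.com/openeuler/obs", "latest"
+subprocess.check_call(["docker", "manifest", "create", "{}:{}".format(image, tag),
+                       "{}:{}-x86-64".format(image, tag), "{}:{}-aarch64".format(image, tag)])
+subprocess.check_call(["docker", "manifest", "push", "{}:{}".format(image, tag)])
+```
+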
+## 目录结构
+| 目录 | 描述 |
+| --- | --- |
+|ac/framework | 门禁框架 |
+|ac/acl | 门禁任务,每个门禁项对应一个目录 |
+|ac/common | 门禁通用代码 |
+|build| 单包构建|
+|jobs| jenkins任务管理|
+|conf|配置|
+|proxy|第三方接口代理|
+|utils|通用代码,日志等|
diff --git a/src/__init__.py b/src/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/README.md b/src/ac/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..7a18f158da18661494fdf780dfd47d9cb8c27098
--- /dev/null
+++ b/src/ac/README.md
@@ -0,0 +1,54 @@
+# 门禁检查
+
+## 如何加入检查项
+1. 在ci_check/src/ac目录下新建文件夹放置检查项代码
+2. 在ac_conf.yaml中增加配置项
+
+### 配置文件说明
+
+```yaml
+示例=>
+spec: # ac项目名称
+ hint: check_spec # gitee中显示检查项名称,缺省使用check_+项目名称
+ module: spec.check_spec # ac项目模块名称,缺省使用"项目名称+check_+项目名称"
+ entry: Entry # ac项目入口类名称,继承BaseCheck类,可自定义__call__方法
+ exclude: true # 忽略该项检查
+ ignored: [] # ac项目内忽略的检查项,就算失败也不影响最终ac项目结果
+ allow_list: [] # 只有出现在allow_list的包才执行当前检查项
+ deny_list: [] # 出现在deny_list的包不执行当前检查项
+```
+
+### entry实现模板
+
+```python
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, SUCCESS, WARNING
+
+
+class Entry(BaseCheck):
+ def __call__(self, *args, **kwargs):
+ # do the work
+ ...
+
+ def check_case_a(self):
+ # do the check
+
+ return SUCCESS
+```
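+
+结合本仓库已有检查项(如check_binary_file、check_code_style)的写法,__call__中通常调用start_check_with_order按顺序执行若干check_xxx方法(传入的名称即去掉check_前缀的方法名),示意如下(仅为草图):
+
+```python
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, SUCCESS, WARNING
+
+
+class Entry(BaseCheck):
+    def __call__(self, *args, **kwargs):
+        # 依次执行check_case_a、check_case_b
+        return self.start_check_with_order("case_a", "case_b")
+
+    def check_case_a(self):
+        return SUCCESS
+
+    def check_case_b(self):
+        # 检查不通过返回FAILED,仅告警返回WARNING
+        return WARNING
+```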
+
+### 检查结果
+
+| 返回码 | 描述 | emoji |
+| --- | --- | --- |
+| 0 | SUCCESS | :white_check_mark:|
+| 1 | WARNING | :bug: |
+| 2 | FAILED | :x:|
+
+## 支持的检查项
+| 检查项 | 目录 | 描述 |
+| --- | --- | --- |
+| spec文件 | spec | 检查homepage是否可以访问、版本号单调递增、检查补丁文件是否存在|
+| 代码风格 | code | 检查压缩包文件、检查补丁是否可以使用、执行linter工具 |
+| yaml文件 | package_yaml | 检查yaml中version_control、src_repo等配置,并尝试获取上游社区release tags |
+| license检查 | package_license | 检查spec与源码中的license是否在许可范围内且相互一致、copyright是否合规 |
+| 代码片段检查 | sca | 目前只针对自研项目 |
\ No newline at end of file
diff --git a/src/ac/__init__.py b/src/ac/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/acl/__init__.py b/src/ac/acl/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/acl/binary/__init__.py b/src/ac/acl/binary/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/acl/binary/check_binary_file.py b/src/ac/acl/binary/check_binary_file.py
new file mode 100644
index 0000000000000000000000000000000000000000..61f2e3ff9ed6567285d175e1dbd206d4c6aecf7e
--- /dev/null
+++ b/src/ac/acl/binary/check_binary_file.py
@@ -0,0 +1,143 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2022-02-09
+# Description: check binary file
+# **********************************************************************************
+
+import os
+import shutil
+import logging
+
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, SUCCESS
+from src.ac.common.gitee_repo import GiteeRepo
+from pyrpm.spec import Spec, replace_macros
+
+from src.utils.shell_cmd import shell_cmd
+
+logger = logging.getLogger("ac")
+
+
+class CheckBinaryFile(BaseCheck):
+ """
+ check binary file
+ """
+ # 二进制文件后缀集
+ BINARY_LIST = {".pyc", ".jar", ".o", ".ko"}
+
+ def __init__(self, workspace, repo, conf):
+ super(CheckBinaryFile, self).__init__(workspace, repo, conf)
+ self._work_tar_dir = os.path.join(workspace, "code") # 解压缩目标目录
+ self._gr = GiteeRepo(self._repo, self._work_dir, self._work_tar_dir)
+ self._tarball_in_spec = set()
+ self._upstream_community_tarball_in_spec()
+
+ def __call__(self, *args, **kwargs):
+ """
+ 入口函数
+ :param args:
+ :param kwargs:
+ :return:
+ """
+ logger.info("check %s binary files ...", self._repo)
+
+ _ = not os.path.exists(self._work_tar_dir) and os.mkdir(self._work_tar_dir)
+ try:
+ return self.start_check_with_order("compressed_file", "binary")
+ finally:
+ shutil.rmtree(self._work_tar_dir)
+
+ def check_compressed_file(self):
+ """
+ 解压缩包
+ """
+ need_compress_files = []
+ for decompress_file in self._gr.get_compress_files():
+ if decompress_file not in self._tarball_in_spec:
+ need_compress_files.append(decompress_file)
+ self._gr.set_compress_files(need_compress_files)
+ return SUCCESS if 0 == self._gr.decompress_all() else FAILED
+
+ def check_binary(self):
+ """
+ 检查二进制文件
+ """
+ suffixes_list = self._get_all_file_suffixes(self._work_tar_dir)
+ if suffixes_list:
+ logger_con = ["%s: \n%s" % (key, value) for suffix_list in suffixes_list for key, value in
+ suffix_list.items()]
+            logger.error("binary files exist:\n%s", "\n".join(logger_con))
+ return FAILED
+ else:
+ return SUCCESS
+
+ @staticmethod
+ def _is_compress_tar_file(filename):
+ """
+ 判断文件名是否是指定压缩文件
+ filename:文件名
+ 返回值:bool
+ """
+        return filename.endswith((".tar.gz", ".tar.bz", ".tar.bz2", ".tar.xz", ".tgz", ".zip"))
+
+ def _upstream_community_tarball_in_spec(self):
+ """
+ spec指定的上游社区压缩包
+ """
+ if self._gr.spec_file is None:
+            logger.error("spec file not found")
+ return
+ with open(os.path.join(self._work_dir, self._gr.spec_file), "r", encoding="utf-8") as fp:
+ adapter = Spec.from_string(fp.read())
+ for filename in adapter.__dict__.get("sources"):
+ spec_src_name = replace_macros(filename, adapter)
+ if "http" in spec_src_name and self._is_compress_tar_file(spec_src_name):
+ self._tarball_in_spec.add(spec_src_name.split("/")[-1] if filename else "")
+ info_msg = "spec指定的上游社区压缩包:%s" % self._tarball_in_spec if self._tarball_in_spec else "暂无spec指定的上游社区压缩包"
+ logger.info(info_msg)
+ return self._tarball_in_spec
+
+ def _get_all_file_suffixes(self, path):
+ """
+ 获取当前文件中所有文件名后缀,并判断
+ """
+ binary_list = []
+ if not os.path.exists(path):
+ return binary_list
+ for root, _, files in os.walk(path):
+ binary_file = []
+ for single_file in files:
+ file_suffixes = os.path.splitext(single_file)[1]
+ if file_suffixes in self.BINARY_LIST:
+ if self._get_file_type(os.path.join(root, single_file)):
+ binary_file.append(single_file)
+ if binary_file:
+ binary_list.append({root.split("code")[-1]: binary_file})
+ return binary_list
+
+ @staticmethod
+ def _get_file_type(filepath):
+ """
+ run the file command to check whether a file is a binary file
+ :param filepath:
+ :return:
+ """
+ file_cmd = "file -b {}".format(filepath)
+ ret, out, _ = shell_cmd(file_cmd)
+ if ret:
+ logger.error("file command error, %s", ret)
+ return False
+ if out and "text" not in str(out).lower():
+ logger.debug("%s not a text file", filepath)
+ return True
+ return False
diff --git a/src/ac/acl/code/__init__.py b/src/ac/acl/code/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/acl/code/check_code_style.py b/src/ac/acl/code/check_code_style.py
new file mode 100644
index 0000000000000000000000000000000000000000..5cd6d782a911d24dc85f66d35a4b0c85b4b8e070
--- /dev/null
+++ b/src/ac/acl/code/check_code_style.py
@@ -0,0 +1,159 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: check code style
+# **********************************************************************************
+
+import os
+import shutil
+import logging
+
+from src.proxy.git_proxy import GitProxy
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, WARNING, SUCCESS
+from src.ac.common.gitee_repo import GiteeRepo
+from src.ac.common.linter import LinterCheck
+from src.ac.common.rpm_spec_adapter import RPMSpecAdapter
+
+logger = logging.getLogger("ac")
+
+
+class CheckCodeStyle(BaseCheck):
+ """
+ check code style
+ """
+ def __init__(self, workspace, repo, conf):
+ super(CheckCodeStyle, self).__init__(workspace, repo, conf)
+
+ self._work_tar_dir = os.path.join(workspace, "code") # 解压缩目标目录
+
+ self._gr = GiteeRepo(self._repo, self._work_dir, self._work_tar_dir)
+
+ def check_compressed_file(self):
+ """
+ 解压缩包
+ """
+ return SUCCESS if 0 == self._gr.decompress_all() else FAILED
+
+ def check_patch(self):
+ """
+ 应用所有patch
+ """
+ patches = []
+ if self._gr.spec_file:
+ spec = RPMSpecAdapter(os.path.join(self._work_dir, self._gr.spec_file))
+ patches = spec.patches
+
+ rs = self._gr.apply_all_patches(*patches)
+
+ if 0 == rs:
+ return SUCCESS
+
+ return WARNING if 1 == rs else FAILED
+
+ def check_code_style(self):
+ """
+ 检查代码风格
+ :return:
+ """
+ gp = GitProxy(self._work_dir)
+ diff_files = gp.diff_files_between_commits("HEAD~1", "HEAD~0")
+ logger.debug("diff files: %s", diff_files)
+
+ diff_code_files = [] # 仓库中变更的代码文件
+ diff_patch_code_files = [] # patch内的代码文件
+ for diff_file in diff_files:
+ if GiteeRepo.is_code_file(diff_file):
+ diff_code_files.append(diff_file)
+ elif GiteeRepo.is_patch_file(diff_file):
+ patch_dir = self._gr.patch_dir_mapping.get(diff_file)
+ logger.debug("diff patch %s apply at dir %s", diff_file, patch_dir)
+ if patch_dir is not None:
+ files_in_patch = gp.extract_files_path_of_patch(diff_file)
+ patch_code_files = [os.path.join(patch_dir, file_in_patch)
+ for file_in_patch in files_in_patch
+ if GiteeRepo.is_code_file(file_in_patch)]
+ # care about deleted file in patch, filter with "git patch --summary" maybe better
+ diff_patch_code_files.extend([code_file
+ for code_file in patch_code_files
+ if os.path.exists(code_file)])
+
+ logger.debug("diff code files: %s", diff_code_files)
+ logger.debug("diff patch code files: %s", diff_patch_code_files)
+
+ rs_1 = self.check_file_under_work_dir(diff_code_files)
+ logger.debug("check_file_under_work_dir: %s", rs_1)
+ rs_2 = self.check_files_inner_patch(diff_patch_code_files)
+ logger.debug("check_files_inner_patch: %s", rs_2)
+
+ return rs_1 + rs_2
+
+ def check_file_under_work_dir(self, diff_code_files):
+ """
+ 检查仓库中变更的代码
+ :return:
+ """
+ rs = [self.__class__.check_code_file(filename) for filename in set(diff_code_files)]
+
+ return sum(rs, SUCCESS) if rs else SUCCESS
+
+ def check_files_inner_patch(self, diff_patch_code_files):
+ """
+ 检查仓库的patch内的代码
+ :return:
+ """
+ rs = [self.__class__.check_code_file(os.path.join(self._work_tar_dir, filename))
+ for filename in set(diff_patch_code_files)]
+
+ return sum(rs, SUCCESS) if rs else SUCCESS
+
+ @classmethod
+ def check_code_file(cls, file_path):
+ """
+ 检查代码风格
+ :param file_path:
+ :return:
+ """
+ if GiteeRepo.is_py_file(file_path):
+ rs = LinterCheck.check_python(file_path)
+ elif GiteeRepo.is_go_file(file_path):
+ rs = LinterCheck.check_golang(file_path)
+ elif GiteeRepo.is_c_cplusplus_file(file_path):
+ rs = LinterCheck.check_c_cplusplus(file_path)
+ else:
+            logger.error("should not arrive here, unsupported file %s", file_path)
+ return SUCCESS
+
+ logger.info("Linter: %s %s", file_path, rs)
+ if rs.get("F", 0) > 0:
+ return FAILED
+
+ if rs.get("W", 0) > 0 or rs.get("E", 0) > 0:
+ return WARNING
+
+ return SUCCESS
+
+ def __call__(self, *args, **kwargs):
+ """
+ 入口函数
+ :param args:
+ :param kwargs:
+ :return:
+ """
+ logger.info("check %s repo ...", self._repo)
+
+ _ = not os.path.exists(self._work_tar_dir) and os.mkdir(self._work_tar_dir)
+ try:
+ return self.start_check_with_order("compressed_file", "patch", "code_style")
+ finally:
+ shutil.rmtree(self._work_tar_dir)
diff --git a/src/ac/acl/commit_msg/__init__.py b/src/ac/acl/commit_msg/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..60fc9c4fa75a9a286851448688e79fd222895e3e
--- /dev/null
+++ b/src/ac/acl/commit_msg/__init__.py
@@ -0,0 +1,17 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v1.
+# You can use this software according to the terms and conditions of the Mulan PSL v1.
+# You may obtain a copy of Mulan PSL v1 at:
+# http://license.coscl.org.cn/MulanPSL
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR
+# PURPOSE.
+# See the Mulan PSL v1 for more details.
+# Author:
+# Create: 2021-08-03
+# Description: check commit message
+# **********************************************************************************
+"""
\ No newline at end of file
diff --git a/src/ac/acl/commit_msg/check_commit_msg.py b/src/ac/acl/commit_msg/check_commit_msg.py
new file mode 100644
index 0000000000000000000000000000000000000000..7257d7a0db1424ceae0264e2c357fe0369f1dcd4
--- /dev/null
+++ b/src/ac/acl/commit_msg/check_commit_msg.py
@@ -0,0 +1,100 @@
+import logging
+import subprocess
+import os
+
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, SUCCESS
+
+logger = logging.getLogger("ac")
+
+
+class CheckCommitMsg(BaseCheck):
+ """
+ check commit msg
+ """
+
+ def __init__(self, workspace, repo, conf=None):
+ """
+
+ :param workspace:
+ :param repo:
+ :param conf:
+ """
+ super(CheckCommitMsg, self).__init__(workspace, repo, conf)
+
+ # wait to initial
+ self._pr_number = None
+ self._tbranch = None
+ self._repo = repo
+ self._work_dir = workspace
+
+ def __call__(self, *args, **kwargs):
+ """
+ 入口函数
+ :param args:
+ :param kwargs:
+ :return:
+ """
+ logger.info("check %s commit msg ...", self._repo)
+ logger.debug("args: %s, kwargs: %s", args, kwargs)
+ checkcommit = kwargs.get("codecheck", {})
+
+ self._pr_number = checkcommit.get("pr_number", "")
+ self._tbranch = kwargs.get("tbranch", None)
+ return self.start_check()
+
+ @staticmethod
+ def get_commit_msg_result(commit_list, work_dir, gitlint_dir):
+ commit_check_dist = {}
+ try:
+ for commit in commit_list:
+ commit = str(commit, encoding='utf-8')
+ commit = commit.replace('\r', '')
+ commit = commit.replace('\n', '')
+ # get commit and add to dist for follow using
+ command = "gitlint --commit " + commit + " -C " + gitlint_dir + "/.gitlint"
+ res = subprocess.Popen(command, cwd=work_dir, shell=True, stderr=subprocess.PIPE)
+ out = res.stderr.readlines()
+ if len(out) > 0 :
+ out_str = ""
+ for line in out:
+ out_str += str(line, encoding='utf-8')
+ commit_check_dist[commit] = out_str
+ except Exception as e:
+ logger.warning(e)
+ return commit_check_dist
+
+ def check_commit_msg(self):
+ """
+ 开始进行commitmsgcheck检查
+ """
+ # prepare git branch environment
+ repo_dir = self._work_dir + "/" + self._repo
+ logger.info("repoDir: %s", repo_dir)
+ branch_log_cmd = "git fetch origin +refs/heads/{}:refs/remotes/origin/{}".format(self._tbranch, self._tbranch)
+ branch_log_pro = subprocess.Popen(branch_log_cmd, cwd=repo_dir, shell=True, stdout=subprocess.PIPE)
+ logger.info("git featch res: ")
+ logger.info(branch_log_pro.stdout.read())
+ branch_log_cmd = "git log HEAD...origin/" + self._tbranch + " --no-merges --pretty=%H"
+ branch_log_pro = subprocess.Popen(branch_log_cmd, cwd=repo_dir, shell=True, stdout=subprocess.PIPE)
+ branch_log_res = branch_log_pro.stdout.readlines()
+ conf_dir = os.path.realpath(os.path.join(os.path.dirname(os.path.realpath(__file__)), "../../../conf/"))
+ commit_check_dist = self.get_commit_msg_result(branch_log_res, repo_dir, conf_dir)
+ if len(commit_check_dist) > 0:
+ log_format = """
+the commit message specification is:
+script: title
+
+this is commit body
+
+Signed-off-by: example example@xx.com
+the following commits do not conform to the specifications:
+ """
+ logger.info(log_format)
+ logger.info("==============================================================")
+ for commit, check_res in commit_check_dist.items():
+ logger.info("commit: %s", commit)
+ logger.info("check result: \n\r %s", check_res)
+ logger.info("==============================================================")
+ return FAILED
+ return SUCCESS
diff --git a/src/ac/acl/npmbuild/__init__.py b/src/ac/acl/npmbuild/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/acl/npmbuild/build_job.py b/src/ac/acl/npmbuild/build_job.py
new file mode 100644
index 0000000000000000000000000000000000000000..79953b47656ea4633a68d314b77466443ce9e807
--- /dev/null
+++ b/src/ac/acl/npmbuild/build_job.py
@@ -0,0 +1,36 @@
+import requests
+
+
+def Builde_Job(opekgstoken, jobName, repoId):
+ url2 = 'https://build.dev.oepkgs.net/api/build/task/buildJob'
+
+ headers2 = {
+ 'authority': 'build.dev.oepkgs.net',
+ 'accept': '*/*',
+ 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaIAvU2hhbmdoYWkiLCJleHAiOjE2OTM1Mzk4MjgsImlhdCI6MTY5MzUzODAyOH0.sFZYWM-YiKKGcD2udonHmCM_kLxyb3Dt0P6xQZs49u8',
+ 'dnt': '1',
+ 'origin': 'https://build.dev.oepkgs.net',
+ 'pragma': 'no-cache',
+ 'referer': 'https://build.dev.oepkgs.net/rpm/task',
+ 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+ 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36',
+ 'x-auth-token': opekgstoken,
+ }
+
+ data2 = {
+ 'jobName': jobName,
+ 'jobId': repoId,
+ }
+
+ response2 = requests.post(url2, headers=headers2, json=data2)
+
+ print(response2.status_code)
+ print(response2.text)
+
+    if response2.status_code == 200:
+ return 'success'
+ else:
+ return 'false'
diff --git a/src/ac/acl/npmbuild/check_build.py b/src/ac/acl/npmbuild/check_build.py
new file mode 100644
index 0000000000000000000000000000000000000000..8a5a5873397a9a1801d619633e3453a20e45f3de
--- /dev/null
+++ b/src/ac/acl/npmbuild/check_build.py
@@ -0,0 +1,40 @@
+from src.ac.acl.npmbuild import build_job
+from src.ac.acl.npmbuild import create_job
+from src.ac.acl.npmbuild import get_build_record
+from src.ac.acl.npmbuild import get_oepkgs_token
+import logging
+
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, SUCCESS, WARNING
+
+logger = logging.getLogger("ac")
+
+class CheckBuild(BaseCheck):
+ def __init__(self, workspace, repo, conf, dataset):
+ super(CheckBuild, self).__init__(workspace, repo, conf)
+ self.dd = dataset
+ self.repo_url = "https://gitee.com/{}/{}.git".format(self.dd.community, repo)
+
+    def __call__(self, *args, **kwargs):
+        self.token = get_oepkgs_token.get_Oepkgs_Token()
+        return self.start_check()
+
+    def check_build(self):
+        cre_tag, jobName, repoId, jobId = create_job.Create_Job(
+            self.token, self.repo_url, 'x86_64', self.dd.tbranch, self.dd.build_id)
+        if cre_tag != 'success':
+            return FAILED
+        bui_tag = build_job.Builde_Job(self.token, jobName, repoId)
+        if bui_tag != 'success':
+            return FAILED
+        # get_Build_Record返回接口响应中的success字段
+        res = get_build_record.get_Build_Record(self.token, jobName)
+        return SUCCESS if res else FAILED
+
+
+
+
+
+
diff --git a/src/ac/acl/npmbuild/create_job.py b/src/ac/acl/npmbuild/create_job.py
new file mode 100644
index 0000000000000000000000000000000000000000..b223074fc089cb77ff5b6865dd3bdbf8116c482e
--- /dev/null
+++ b/src/ac/acl/npmbuild/create_job.py
@@ -0,0 +1,53 @@
+import requests
+import uuid
+import json
+
+def Create_Job(opekgstoken, repo_url, ccArch, tbranch, build_id):
+
+ url = 'https://build.dev.oepkgs.net/api/build/task/create'
+
+ headers = {
+ 'authority': 'build.dev.oepkgs.net',
+ 'accept': '*/*',
+ 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaWEvU2hhbmdoYWkiLCJleHAiOjE2OTMzNzAwMzMsImlhdCI6MTY5MzM2ODIzM30.szb3CB_U3DJOVG94JzHkUBqTjD0x8WM0JCJz3fNODuc',
+ 'dnt': '1',
+ 'origin': 'https://build.dev.oepkgs.net',
+ 'pragma': 'no-cache',
+ 'referer': 'https://build.dev.oepkgs.net/rpm/task/create',
+ 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"Windows"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36',
+ 'x-auth-token': opekgstoken,
+ }
+
+ data = {
+ 'builder': 1,
+ 'jobName': str(uuid.uuid4()),
+ 'os': 'openEuler',
+ 'osFull': '22.03-LTS',
+ 'ccArch': ccArch,
+ 'repoId': build_id,
+ 'scmRepo': repo_url,
+ 'branch': tbranch
+ }
+
+ response = requests.post(url, headers=headers, json=data)
+
+ print(response.status_code)
+ print(response.text)
+
+ res_data = json.loads(response.text)
+
+ print(res_data)
+
+    if response.status_code == 200:
+ return 'success', data['jobName'], data['repoId'], res_data['data']['jobId']
+ else:
+ return 'false', 'nodata', 'nodata', 'nodata'
\ No newline at end of file
diff --git a/src/ac/acl/npmbuild/get_build_record.py b/src/ac/acl/npmbuild/get_build_record.py
new file mode 100644
index 0000000000000000000000000000000000000000..1832fc8e6f91a85772c2b2bd455324e337cf8d9a
--- /dev/null
+++ b/src/ac/acl/npmbuild/get_build_record.py
@@ -0,0 +1,28 @@
+import requests
+import json
+
+def get_Build_Record(opekgstoken, jobName):
+ url = 'https://build.dev.oepkgs.net/api/build/task/getBuildRecord/955?pageNum=1&pageSize=5'
+
+ headers = {
+ 'authority': 'build.dev.oepkgs.net',
+ 'accept': '*/*',
+ 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaWEvU2hhbmdoYWkiLCJleHAiOjE2OTM1Mzk4MjgsImlhdCI6MTY5MzUzODAyOH0.sFZYWM-YiKKGcD2udonHmCM_kLxyb3Dt0P6xQZs49u8',
+ 'dnt': '1',
+ 'pragma': 'no-cache',
+ 'referer': 'https://build.dev.oepkgs.net/rpm/task',
+ 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+ 'x-auth-token': opekgstoken,
+ }
+
+ params = {
+ "jobName": jobName,
+ }
+
+ response = requests.get(url, headers=headers, params=params)
+ res_data = json.loads(response.text)
+
+ return res_data['success']
diff --git a/src/ac/acl/npmbuild/get_oepkgs_token.py b/src/ac/acl/npmbuild/get_oepkgs_token.py
new file mode 100644
index 0000000000000000000000000000000000000000..356059c09e1f18e921eaba56db6294fe55e3ba3b
--- /dev/null
+++ b/src/ac/acl/npmbuild/get_oepkgs_token.py
@@ -0,0 +1,37 @@
+from selenium import webdriver
+from selenium.webdriver.common.by import By
+
+def get_Oepkgs_Token():
+ options = webdriver.ChromeOptions()
+ options.add_argument("--auto-open-devtools-for-tabs")
+
+ driver = webdriver.Chrome(options=options)
+
+ # 打开登录页面
+ driver.get("https://build.dev.oepkgs.net/rpm/task") # 替换成你的登录页面地址
+
+ try:
+ driver.implicitly_wait(10)
+
+ # 找到用户名和密码输入框,并输入相应的信息
+ username_input = driver.find_element(By.NAME, "username")
+ password_input = driver.find_element(By.NAME, "password")
+
+ username_input.send_keys("mengling_cui@163.com")
+ password_input.send_keys("123456Aa!")
+
+ # 根据按钮文本来定位立即登录按钮
+ login_button = driver.find_element(By.XPATH, "//button[contains(text(), '立即登录')]")
+ login_button.click()
+
+        # 获取cookie,从中取出登录后的auth_token作为返回的token
+        token = ""
+        cookies = driver.get_cookies()
+
+        for cookie in cookies:
+            print(cookie['name'], cookie['value'])
+            if cookie['name'] == 'auth_token':
+                token = cookie['value']
+
+    finally:
+        # 关闭WebDriver
+        driver.quit()
+
+    return token
\ No newline at end of file
diff --git a/src/ac/acl/openlibing/__init__.py b/src/ac/acl/openlibing/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..60fc9c4fa75a9a286851448688e79fd222895e3e
--- /dev/null
+++ b/src/ac/acl/openlibing/__init__.py
@@ -0,0 +1,17 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v1.
+# You can use this software according to the terms and conditions of the Mulan PSL v1.
+# You may obtain a copy of Mulan PSL v1 at:
+# http://license.coscl.org.cn/MulanPSL
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR
+# PURPOSE.
+# See the Mulan PSL v1 for more details.
+# Author:
+# Create: 2021-08-03
+# Description: check code static
+# **********************************************************************************
+"""
\ No newline at end of file
diff --git a/src/ac/acl/openlibing/check_code.py b/src/ac/acl/openlibing/check_code.py
new file mode 100644
index 0000000000000000000000000000000000000000..d4aa21fc08133b87191a834398857156c1fcfc62
--- /dev/null
+++ b/src/ac/acl/openlibing/check_code.py
@@ -0,0 +1,137 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2021-08-03
+# Description: check code static
+# **********************************************************************************
+
+import logging
+import time
+
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, WARNING, SUCCESS
+from src.proxy.requests_proxy import do_requests
+
+logger = logging.getLogger("ac")
+
+
+class CheckCode(BaseCheck):
+ """
+ code check
+ """
+
+ def __init__(self, workspace, repo, conf=None):
+ """
+
+ :param workspace:
+ :param repo:
+ :param conf:
+ """
+ super(CheckCode, self).__init__(workspace, repo, conf)
+
+ # wait to initial
+ self._pr_url = None
+ self._pr_number = None
+ self._codecheck_api_url = None
+ self._codecheck_api_key = None
+
+ def __call__(self, *args, **kwargs):
+ """
+ 入口函数
+ :param args:
+ :param kwargs:
+ :return:
+ """
+ logger.info("check %s code ...", self._repo)
+ logger.debug("args: %s, kwargs: %s", args, kwargs)
+ codecheck_conf = kwargs.get("codecheck", {})
+
+ self._pr_url = codecheck_conf.get("pr_url", "")
+ self._pr_number = codecheck_conf.get("pr_number", "")
+ self._codecheck_api_url = codecheck_conf.get("codecheck_api_url", "")
+ self._codecheck_api_key = codecheck_conf.get('codecheck_api_key', "")
+
+ return self.start_check()
+
+ @staticmethod
+ def get_codecheck_result(pr_url, codecheck_api_url, codecheck_api_key):
+ """
+ 通过api调用codecheck
+ """
+ # get codecheck Api Token
+ codecheck_token_api_url = '{}/token/{}'.format(codecheck_api_url, codecheck_api_key)
+ token_resp = {}
+ rs = do_requests("get", codecheck_token_api_url, obj=token_resp)
+ if rs != 0 or token_resp.get("code", "") != "200":
+ logger.error("get dynamic token failed")
+ return 'false', {}
+
+ token = token_resp.get("data")
+ data = {"pr_url": pr_url, "token": token}
+ response_content = {}
+ # 创建codecheck检查任务
+ codecheck_task_api_url = "{}/task".format(codecheck_api_url)
+ rs = do_requests("get", codecheck_task_api_url, querystring=data, obj=response_content)
+ if rs != 0 or response_content.get('code', '') != '200':
+ if response_content.get('msg').find("There is no proper set of languages") != -1:
+ response_content.update(code="200", msg="success", state="pass")
+ return 0, response_content
+ logger.error("create codecheck task failed; %s", response_content.get('msg', ''))
+ return 'false', {}
+
+ uuid = response_content.get('uuid')
+ task_id = response_content.get('task_id')
+ data = {"uuid": uuid, "token": token}
+ codecheck_status_api_url = '{}/{}/status'.format(codecheck_api_url, task_id)
+ current_time = 0
+ logger.info("codecheck probably need to 5min")
+ # 定时5min
+ while current_time < 300:
+ time.sleep(10)
+ response_content = {}
+ # 检查codecheck任务的执行状态
+ rs = do_requests("get", codecheck_status_api_url, querystring=data, obj=response_content)
+ if rs == 0 and response_content.get('code') == '100':
+ current_time = current_time + 10
+ continue
+ else:
+ break
+ return rs, response_content
+
+ def check_code(self):
+ """
+ 开始进行codecheck检查
+ """
+ # 等待计算结果
+ rs, response_content = self.get_codecheck_result(self._pr_url, self._codecheck_api_url, self._codecheck_api_key)
+
+ # 判断是否计算完成
+ if rs != 0:
+ return FAILED
+
+ if response_content.get('msg') == 'success':
+ """
+ # 返回结果 {
+ "code": "200",
+ "msg": "success",
+ "data": "http://{ip}:{port}/inc/{projectId}/reports/{taskId}/detail" 一个可以看到codecheck检查结果详情的地址
+ "state": "pass(通过)/no pass(不通过)"
+ }
+ """
+ logger.warning("click %s view code check detail", response_content.get('data'))
+ # 只有codecheck完成且codecheck检查的代码中存在bug,返回检查项失败的结果,以detail结尾,会显示具体的代码bug所在位置。
+ if response_content.get("state") == "no pass":
+ return FAILED
+ else:
+ logger.error("code check failed, info : %s", response_content.get('msg'))
+
+ return SUCCESS
diff --git a/src/ac/acl/package_license/__init__.py b/src/ac/acl/package_license/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..36e95ff3db7d173ff656de1605ce299b4e77f17a
--- /dev/null
+++ b/src/ac/acl/package_license/__init__.py
@@ -0,0 +1,17 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v1.
+# You can use this software according to the terms and conditions of the Mulan PSL v1.
+# You may obtain a copy of Mulan PSL v1 at:
+# http://license.coscl.org.cn/MulanPSL
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR
+# PURPOSE.
+# See the Mulan PSL v1 for more details.
+# Author:
+# Create: 2020-10-16
+# Description: check spec file
+# **********************************************************************************
+"""
\ No newline at end of file
diff --git a/src/ac/acl/package_license/check_license.py b/src/ac/acl/package_license/check_license.py
new file mode 100644
index 0000000000000000000000000000000000000000..cbe411a81ea5c575c4945685e0a33700cfcb5849
--- /dev/null
+++ b/src/ac/acl/package_license/check_license.py
@@ -0,0 +1,134 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-10-16
+# Description: check spec file
+# **********************************************************************************
+"""
+
+import logging
+import os
+import shutil
+
+from src.ac.acl.package_license.package_license import PkgLicense
+from src.ac.common.gitee_repo import GiteeRepo
+from src.ac.common.rpm_spec_adapter import RPMSpecAdapter
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, WARNING, SUCCESS
+from src.proxy.git_proxy import GitProxy
+
+logger = logging.getLogger("ac")
+
+
+class CheckLicense(BaseCheck):
+ """
+ check license in spec and src-code
+ """
+
+ def __init__(self, workspace, repo, conf=None):
+ super(CheckLicense, self).__init__(workspace, repo, conf)
+
+ self._gp = GitProxy(self._work_dir)
+ self._work_tar_dir = os.path.join(workspace, "code")
+ self._gr = GiteeRepo(self._repo, self._work_dir, self._work_tar_dir)
+ if self._gr.spec_file:
+ self._spec = RPMSpecAdapter(os.path.join(self._work_dir, self._gr.spec_file))
+ else:
+ self._spec = None
+
+ self._pkg_license = PkgLicense()
+ self._license_in_spec = set()
+ self._license_in_src = set()
+
+ def __call__(self, *args, **kwargs):
+ """
+ 入口函数
+ :param args:
+ :param kwargs:
+ :return:
+ """
+ logger.info("check %s license ...", self._repo)
+
+ if not os.path.exists(self._work_tar_dir):
+ os.mkdir(self._work_tar_dir)
+ self._gr.decompress_all() # decompress all compressed file into work_tar_dir
+ codecheck = kwargs.get("codecheck", {})
+ pr_url = codecheck.get("pr_url", "")
+ self.response_content = self._pkg_license.get_license_info(pr_url)
+
+ try:
+ return self.start_check_with_order("license_in_spec", "license_in_src", "license_is_same",
+ "copyright_in_repo")
+ finally:
+ shutil.rmtree(self._work_tar_dir)
+
+ def check_license_in_spec(self):
+ """
+ check whether the license in spec file is in white list
+ :return
+ """
+ if self._spec is None:
+            logger.error("spec file not found")
+ return FAILED
+ spec_license_legal = self.response_content.get("spec_license_legal")
+
+ if not spec_license_legal:
+ logger.warning("No spec license data is obtained")
+ return WARNING
+
+ res = spec_license_legal.get("pass")
+ if res:
+ logger.info("the license in spec is free")
+ return SUCCESS
+ else:
+ notice_content = spec_license_legal.get("notice")
+ logger.warning("License notice: %s", notice_content)
+ black_reason = spec_license_legal.get("detail").get("is_white").get("blackReason")
+ if black_reason:
+ logger.error("License black reason: %s", black_reason)
+ return FAILED
+
+ def check_license_in_src(self):
+ """
+ check whether the license in src file is in white list
+ :return
+ """
+ return self._pkg_license.check_license_in_scope()
+
+ def check_license_is_same(self):
+ """
+ check whether the license in spec file and in src file is same
+ :return
+ """
+ if self._spec is None:
+            logger.error("spec file not found")
+ return FAILED
+ self._license_in_spec = self._spec.license
+ self._license_in_src = self._pkg_license.scan_licenses_in_license(self._work_tar_dir)
+ self._license_in_src = self._pkg_license.translate_license(self._license_in_src)
+ if self._pkg_license.check_licenses_is_same(self._license_in_spec, self._license_in_src,
+ self._pkg_license.later_support_license):
+ logger.info("licenses in src:%s and in spec:%s are same", self._license_in_src,
+ self._license_in_spec)
+ return SUCCESS
+ else:
+ logger.error("licenses in src:%s and in spec:%s are not same", self._license_in_src,
+ self._license_in_spec)
+ return WARNING
+
+ def check_copyright_in_repo(self):
+ """
+ check whether the copyright in src file
+ :return
+ """
+ return self._pkg_license.check_repo_copyright_legal()
diff --git a/src/ac/acl/package_license/check_openeuler_license.py b/src/ac/acl/package_license/check_openeuler_license.py
new file mode 100644
index 0000000000000000000000000000000000000000..4f4faee15ee4a70ee0b906026e488d72366b6964
--- /dev/null
+++ b/src/ac/acl/package_license/check_openeuler_license.py
@@ -0,0 +1,99 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2022-8-26
+# Description: check openeuler license file
+# **********************************************************************************
+"""
+
+import logging
+import os
+import shutil
+
+from src.ac.acl.package_license.package_license import PkgLicense
+from src.ac.common.gitee_repo import GiteeRepo
+from src.ac.common.rpm_spec_adapter import RPMSpecAdapter
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, WARNING, SUCCESS
+from src.proxy.git_proxy import GitProxy
+
+logger = logging.getLogger("ac")
+
+
+class CheckOpeneulerLicense(BaseCheck):
+ """
+ check license and copyright in repo
+ """
+
+ def __init__(self, workspace, repo, conf=None):
+ super(CheckOpeneulerLicense, self).__init__(workspace, repo, conf)
+
+ self._pkg_license = PkgLicense()
+
+ def __call__(self, *args, **kwargs):
+ """
+ 入口函数
+ :param args:
+ :param kwargs:
+ :return:
+ """
+ logger.info("check %s license ...", self._repo)
+ codecheck = kwargs.get("codecheck", {})
+ pr_url = codecheck.get("pr_url", "")
+ self.response_content = self._pkg_license.get_license_info(pr_url)
+
+ return self.start_check_with_order("license_in_repo", "license_in_scope", "copyright_in_repo")
+
+ def check_license_in_repo(self):
+ """
+ check whether the license in repo is in white list
+ :return
+ """
+ repo_license_legal = self.response_content.get("repo_license_legal")
+ if not repo_license_legal:
+ logger.warning("No repo license data is obtained")
+ return WARNING
+ res = repo_license_legal.get("pass")
+ if res:
+ logger.info("the license in repo is free")
+ return SUCCESS
+ else:
+ notice_content = repo_license_legal.get("notice")
+ logger.warning("License notice: %s", notice_content)
+
+ is_legal = repo_license_legal.get("is_legal")
+ if is_legal:
+ is_legal_pass = is_legal.get("pass")
+ if not is_legal_pass:
+ detail = is_legal.get("detail")
+ if detail:
+ black_reason = detail.get("is_white").get("blackReason")
+ if black_reason:
+ logger.error("License black reason: %s", black_reason)
+
+ return FAILED
+
+ def check_license_in_scope(self):
+ """
+ check whether the license in src file is in white list
+ :return
+ """
+ return self._pkg_license.check_license_in_scope()
+
+ def check_copyright_in_repo(self):
+ """
+ check whether the copyright in src file
+ :return
+ """
+ return self._pkg_license.check_repo_copyright_legal()
+
diff --git a/src/ac/acl/package_license/package_license.py b/src/ac/acl/package_license/package_license.py
new file mode 100644
index 0000000000000000000000000000000000000000..34d0972783e585f839210881b19576199bc5cc92
--- /dev/null
+++ b/src/ac/acl/package_license/package_license.py
@@ -0,0 +1,226 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-10-16
+# Description: check spec file
+# **********************************************************************************
+"""
+
+import logging
+import os
+import re
+import json
+import chardet
+
+from src.proxy.requests_proxy import do_requests
+from src.ac.framework.ac_result import FAILED, WARNING, SUCCESS
+
+logger = logging.getLogger("ac")
+
+
+class PkgLicense(object):
+ """
+ 解析获取软件包中源码、spec中的license
+ 进行白名单校验、一致性检查
+ """
+
+ LICENSE_FILE_TARGET = ["apache-2.0",
+ "artistic",
+ "artistic.txt",
+ "libcurllicense",
+ "gpl.txt",
+ "gpl2.txt",
+ "gplv2.txt",
+ "lgpl.txt",
+ "notice",
+ "about_bsd.txt",
+ "mit",
+ "pom.xml",
+ "meta.yml",
+ "meta.json",
+ "pkg-info"]
+
+ LICENSE_TARGET_PAT = re.compile(r"^(copying)|(copyright)|(copyrights)|(licenses)|(licen[cs]e)(\.(txt|xml))?")
+
+ def __init__(self):
+ self._license_translation = {}
+ self._later_support_license = {}
+ self.response_content = {}
+ self.license_url = "https://compliance.openeuler.org/sca"
+
+ def get_license_info(self, pr_url):
+ """
+ get license info from compliance2.openeuler.org
+ """
+
+ def analysis(response):
+ """
+ requests回调
+ :param response: requests response object
+ :return:
+ """
+ if json.loads(response.text):
+ self.response_content.update(json.loads(response.text))
+
+ if pr_url:
+ data = {"prUrl": pr_url}
+ rs = do_requests("get", url=self.license_url, timeout=360, querystring=data, obj=analysis)
+ if rs != 0:
+ logger.error("Failed to obtain %s information through service", pr_url)
+ return self.response_content
+
+ def check_license_in_scope(self):
+ """
+ check whether the license in src file is in white list
+ :return
+ """
+ license_in_scope = self.response_content.get("license_in_scope")
+
+ if not license_in_scope:
+ logger.warning("No src license data is obtained")
+ return WARNING
+
+ res = license_in_scope.get("pass")
+ if res:
+ logger.info("the license in scope is free")
+ return SUCCESS
+ else:
+ notice_content = license_in_scope.get("notice")
+ logger.warning("the license in scope is not pass, notice: %s", notice_content)
+ return FAILED
+
+ def check_repo_copyright_legal(self):
+ """
+ check whether the copyright in src file
+ :return
+ """
+ repo_copyright_legal = self.response_content.get("repo_copyright_legal")
+
+ if not repo_copyright_legal:
+ logger.warning("No copyright data is obtained")
+ return WARNING
+
+ res = repo_copyright_legal.get("pass")
+ if res:
+ logger.info("the copyright in repo is pass")
+ return SUCCESS
+ else:
+ notice_content = repo_copyright_legal.get("notice")
+ logger.warning("the copyright in repo is not pass, notice: %s", notice_content)
+ return WARNING
+
+ def translate_license(self, licenses):
+ """
+ Convert license to uniform format
+ """
+ result = set()
+ for lic in licenses:
+ real_license = self._license_translation.get(lic, lic)
+ result.add(real_license)
+ return result
+
+ # 以下为从license文件中获取license
+ def scan_licenses_in_license(self, srcdir):
+ """
+ Find LICENSE files and scan.
+ """
+ licenses_in_file = set()
+ if not os.path.exists(srcdir):
+ logger.error("%s not exist.", srcdir)
+ return licenses_in_file
+
+ for root, _, filenames in os.walk(srcdir):
+ for filename in filenames:
+ if (filename.lower() in self.LICENSE_FILE_TARGET
+ or self.LICENSE_TARGET_PAT.search(filename.lower())):
+ logger.info("scan the license target file: %s", os.path.join(root, filename).replace(srcdir, ""))
+ licenses_in_file.update(
+ self.scan_licenses(
+ os.path.join(root, filename)))
+ logger.info("all licenses from src: %s", ", ".join([data for data in licenses_in_file]))
+ return licenses_in_file
+
+ def scan_licenses(self, copying):
+ """
+ Scan licenses from copying file and add to licenses_for_source_files.
+ if get contents failed or decode data failed, return nothing.
+ """
+ licenses_in_file = set()
+
+ if not os.path.exists(copying):
+ logger.warning("file: %s not exist", copying)
+ return licenses_in_file
+
+ for word in self._license_translation:
+ if word in copying:
+ licenses_in_file.add(word)
+
+ with open(copying, "rb") as f:
+ data = f.read()
+ data = PkgLicense._auto_decode_str(data)
+ blank_pat = re.compile(r"\s+")
+ data = blank_pat.sub(" ", data)
+ if not data:
+ return licenses_in_file
+ for word in self._license_translation:
+ try:
+ if word in data:
+ pattern_str = r'(^{word}$)|(^{word}(\W+))|((\W+){word}$)|((\W+){word}(\W+))' \
+ .format(word=word)
+ if re.search(pattern_str, data):
+ licenses_in_file.add(word)
+ except UnicodeDecodeError as e:
+ logger.exception("decode error: %s", str(e))
+ return licenses_in_file
+
+ @staticmethod
+ def _decode_str(data, charset):
+ """
+ Decode the license string. return the license string or nothing.
+ """
+ if not charset:
+ return ""
+ try:
+ return data.decode(charset)
+ except UnicodeDecodeError as e:
+ logger.exception("decode error: %s", str(e))
+ return ""
+
+ @staticmethod
+ def _auto_decode_str(data):
+ return PkgLicense._decode_str(data, chardet.detect(data)["encoding"])
+
+ @staticmethod
+ def check_licenses_is_same(licenses_for_spec, licenses_for_source_files, later_support_license):
+ """
+ Check if the licenses from SPEC is the same as the licenses from LICENSE file.
+ if same, return True. if not same return False.
+ """
+ all_licenses_for_spec = set()
+ for spec_license in licenses_for_spec:
+ if "-or-later" in spec_license:
+ [license_name, least_version] = spec_license.split("-or-later")[0].split("-", 1)
+ if license_name not in later_support_license:
+ all_licenses_for_spec.add(spec_license)
+ continue
+ for version in later_support_license[license_name]["versions"]:
+ if version >= least_version:
+ all_licenses_for_spec.add("{}-{}-or-later".format(license_name, version))
+ all_licenses_for_spec.add("{}-{}-only".format(license_name, version))
+ else:
+ all_licenses_for_spec.add(spec_license)
+ return all_licenses_for_spec.issuperset(licenses_for_source_files)
+
+ @property
+ def later_support_license(self):
+ return self._later_support_license
diff --git a/src/ac/acl/package_yaml/check_repo.py b/src/ac/acl/package_yaml/check_repo.py
new file mode 100644
index 0000000000000000000000000000000000000000..858ca0008fae2b459ab3d3dfe8de80f30eb3f830
--- /dev/null
+++ b/src/ac/acl/package_yaml/check_repo.py
@@ -0,0 +1,549 @@
+# -*- encoding=utf-8 -*-
+"""
+# ***********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author: DisNight
+# Create: 2020-09-22
+# Description: check yaml file in software package
+# ***********************************************************************************/
+"""
+# \ / /| /| /| /
+# \ / /_| / | / | /
+# / / | / | / | /
+# / / | / |/ | /_____
+
+import abc
+import json
+import logging
+import os
+import re
+import subprocess
+import urllib.parse as urlparse
+
+import requests
+import tldextract
+
+logging.getLogger("ac")
+
+
+class AbsReleaseTags(object):
+ """
+ 获取release tags的抽象类
+ """
+
+ __metaclass__ = abc.ABCMeta
+
+ def __init__(self, version_control):
+ self.version_control = version_control
+
+ @abc.abstractmethod
+ def url(self, repo):
+ """
+ 抽象方法
+ """
+ pass
+
+ @abc.abstractmethod
+ def get_tags(self, repo):
+ """
+ 抽象方法
+ """
+ pass
+
+
+class DefaultReleaseTags(AbsReleaseTags):
+ """
+ 获取release tags的基类
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return ""
+
+ def get_tags(self, repo):
+ """
+ 通过url获取上游社区的release tags
+ return: list
+ """
+ logging.info("unsupported version control: %s", self.version_control)
+ return []
+
+
+class HttpReleaseTagsMixin(object):
+ """
+ 通过web请求形式获取release tags
+ """
+ DEFAULT_REQ_HEADER = {
+ 'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64)'
+ }
+
+ def get_redirect_resp(self, url, response):
+ """
+ 获取重定向的url和cookie
+ return: bool, str, list
+ """
+ cookie = set()
+ href = ""
+ need_redirect = False
+ for line in response.text.splitlines():
+ line = line.strip()
+ if line.startswith("Redirecting"):
+ logging.debug("Redirecting with document.cookie")
+ need_redirect = True
+ search_result = re.search(r"document\.cookie=\"(.*)\";", line)
+ if search_result:
+ cookie = cookie | set(search_result.group(1).split(';'))
+ search_result = re.search(r"document\.location\.href=\"(.*)\";", line)
+ if search_result:
+ href = search_result.group(1)
+ new_url = urlparse.urljoin(url, href)
+ if "" in cookie:
+ cookie.remove("")
+ return need_redirect, new_url, list(cookie)
+
+ def get_request_response(self, url, timeout=30, headers=None):
+ """
+ 获取url请求获取response
+        return: response
+ """
+ headers = self.DEFAULT_REQ_HEADER if headers is None else headers
+ try:
+ response = requests.get(url, headers=headers, timeout=timeout)
+ need_redirect, new_url, cookies = self.get_redirect_resp(url, response)
+ if tldextract.extract(url).domain != tldextract.extract(new_url).domain: # 判断域名是否一致 预防csrf攻击
+ logging.warning("domain of redirection link is different: %s", new_url)
+ return ""
+ if need_redirect:
+ cookie_dict = {}
+ for cookie in cookies:
+ key, val = cookie.split("=")
+ cookie_dict[key] = val
+ url = new_url
+ response = requests.get(url, headers=headers, cookies=cookie_dict, timeout=timeout)
+ except requests.exceptions.SSLError as e:
+ logging.warning("requests %s ssl exception, %s", url, e)
+ return ""
+ except requests.exceptions.Timeout as e:
+ logging.warning("requests timeout")
+ return ""
+ except requests.exceptions.RequestException as e:
+ logging.warning("requests exception, %s", e)
+ return ""
+ return response
+
+
+class HgReleaseTags(AbsReleaseTags, HttpReleaseTagsMixin):
+ """
+ 获取hg上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return urlparse.urljoin(repo + "/", "json-tags") if repo else ""
+
+ def get_tags(self, repo):
+ """
+ 通过url获取上游社区的release tags
+ return: list
+ """
+ url = self.url(repo)
+ logging.debug("%s : get %s tags", url, self.version_control)
+ if not url:
+ logging.warning("illegal url: \"\"")
+ return []
+ response = self.get_request_response(url)
+ if not response:
+ logging.warning("unable to get response:")
+ return []
+ try:
+ tags_json = json.loads(response.text)
+            if not (tags_json and isinstance(tags_json, dict)):
+ return []
+ temp_tags = tags_json["tags"]
+ temp_tags.sort(reverse=True, key=lambda x: x["date"][0])
+ release_tags = [tag["tag"] for tag in temp_tags]
+ except (json.JSONDecodeError, IOError, KeyError, IndexError) as e:
+ logging.error("exception, %s", e)
+ return []
+ return release_tags
+
+
+class HgRawReleaseTags(AbsReleaseTags, HttpReleaseTagsMixin):
+ """
+ 获取hg raw上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return urlparse.urljoin(repo + "/", "raw-tags") if repo else ""
+
+ def get_tags(self, repo):
+ """
+ 通过url获取上游社区的release tags
+ return: list
+ """
+ url = self.url(repo)
+ logging.debug("%s : get %s tags", url, self.version_control)
+ if not url:
+ logging.warning("illegal url: \"\"")
+ return []
+        response = self.get_request_response(url)
+        if not response:
+            return []
+ release_tags = []
+ for line in response.text.splitlines():
+ release_tags.append(line.split()[0])
+ return release_tags
+
+
+class MetacpanReleaseTags(AbsReleaseTags, HttpReleaseTagsMixin):
+ """
+ 获取metacpan上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return urlparse.urljoin("https://metacpan.org/release/", repo) if repo else ""
+
+ def get_tags(self, repo):
+ """
+ 通过url获取上游社区的release tags
+ return: list
+ """
+ url = self.url(repo)
+ logging.debug("%s : get %s tags", url, self.version_control)
+ if not url:
+ logging.warning("illegal url: \"\"")
+ return []
+ response = self.get_request_response(url)
+ if not response:
+ return []
+ resp_lines = response.text.splitlines()
+ release_tags = []
+ tag_condition = "value=\"/release"
+ for index in range(len(resp_lines) - 1):
+ if tag_condition in resp_lines[index]:
+ tag = resp_lines[index + 1]
+ index += 1
+ if "DEV" in tag:
+ continue
+ tag = tag.strip()
+ release_tags.append(tag)
+ return release_tags
+
+
+class PypiReleaseTags(AbsReleaseTags, HttpReleaseTagsMixin):
+ """
+ 获取pypi上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+        return urlparse.urljoin("https://pypi.org/pypi/", os.path.join(repo, "json")) if repo else ""
+
+ def get_tags(self, repo):
+ """
+ 通过url获取上游社区的release tags
+ return: list
+ """
+ url = self.url(repo)
+ logging.debug("%s : get %s tags", url, self.version_control)
+ if not url:
+ logging.warning("illegal url: \"\"")
+ return []
+        response = self.get_request_response(url)
+        if not response:
+            return []
+ try:
+ tags_json = response.json()
+ release_tags = [tag for tag in tags_json.get("releases", [])]
+ except (UnicodeDecodeError, json.JSONDecodeError) as e:
+ logging.error("exception, %s", e)
+ return []
+ return release_tags
+
+
+class RubygemReleaseTags(AbsReleaseTags, HttpReleaseTagsMixin):
+ """
+ 获取rubygem上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return urlparse.urljoin("https://rubygems.org/api/v1/versions/", repo + ".json") if repo else ""
+
+ def get_tags(self, repo):
+ """
+ 通过url获取上游社区的release tags
+ return: list
+ """
+ url = self.url(repo)
+ logging.debug("%s : get %s tags", url, self.version_control)
+ if not url:
+ logging.warning("illegal url: \"\"")
+ return []
+        response = self.get_request_response(url)
+        if not response:
+            return []
+ try:
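+            # rubygems.org /api/v1/versions/<gem>.json returns a list of objects,
+            # each carrying the version string in its "number" field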
+ tags_json = response.json()
+ release_tags = []
+ for element in tags_json:
+ if element.get("number"):
+ release_tags.append(element.get("number"))
+ except (UnicodeDecodeError, json.JSONDecodeError) as e:
+ logging.error("exception, %s", e)
+ return []
+ return release_tags
+
+
+class GnuftpReleaseTags(AbsReleaseTags, HttpReleaseTagsMixin):
+ """
+ 获取gnu-ftp上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return urlparse.urljoin("https://ftp.gnu.org/gnu/", repo) if repo else ""
+
+ def get_tags(self, repo):
+ """
+ 通过url获取上游社区的release tags
+ return: list
+ """
+ url = self.url(repo)
+ logging.debug("%s : get %s tags", url, self.version_control)
+ if not url:
+ logging.warning("illegal url: \"\"")
+ return []
+ response = self.get_request_response(url)
+ pattern = re.compile("href=\"(.*)\">(.*)")
+ release_tags = []
+ if not response:
+ return []
+ for line in response.text.splitlines():
+ search_result = pattern.search(line)
+ if search_result:
+                release_tags.append(search_result.group(1))  # group(1) is the href target (the tag name)
+ return release_tags
+
+
+class FtpReleaseTags(AbsReleaseTags, HttpReleaseTagsMixin):
+ """
+ 获取ftp上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return urlparse.urljoin('ftp', repo + "/") if repo else ""
+
+ def get_tags(self, repo):
+ """
+ 通过url获取上游社区的release tags
+ return: list
+ """
+ url = self.url(repo)
+ logging.debug("%s : get %s tags", url, self.version_control)
+ if not url:
+ logging.warning("illegal url: \"\"")
+ return []
+        response = self.get_request_response(url)
+        if not response:
+            return []
+ pattern = re.compile("href=\"(.*)\">(.*)")
+ release_tags = []
+ for line in response.text.splitlines():
+ search_result = pattern.search(line)
+ if search_result:
+                release_tags.append(search_result.group(1))  # group(1) is the href target (the tag name)
+ return release_tags
+
+
+class CmdReleaseTagsMixin(object):
+ """
+ 通过shell命令获取上游社区的release tags
+ """
+ def get_cmd_response(self, cmd_list):
+ """
+        获取shell命令的response
+        return: response
+ """
+ sub_proc = subprocess.Popen(cmd_list, stdout=subprocess.PIPE)
+ response = sub_proc.stdout.read().decode("utf-8")
+ if sub_proc.wait():
+ logging.warning("%s > encount errors", " ".join(cmd_list))
+ return response
+
+
+class SvnReleaseTags(AbsReleaseTags, CmdReleaseTagsMixin):
+ """
+ 通过shell svn命令获取上游社区的release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return urlparse.urljoin(repo + "/", "tags") if repo else ""
+
+ def get_response(self, url):
+ """
+        生成svn命令并获取response
+ return: response
+ """
+ cmd_list = ["/usr/bin/svn", "ls", "-v", url]
+ return self.get_cmd_response(cmd_list)
+
+ def get_tags(self, repo):
+ """
+ 通过shell cmd访问远端获取上游社区的release tags
+ return: list
+ """
+ url = self.url(repo)
+ logging.debug("%s : get svn tags", url)
+ if not url:
+ logging.warning("illegal url: \"\"")
+ return []
+ response = self.get_response(url)
+ release_tags = []
+ for line in response.splitlines():
+ for item in line.split():
+ if item and item[-1] == "/":
+ release_tags.append(item[:-1])
+ break
+ return release_tags
+
+
+class GitReleaseTags(AbsReleaseTags, CmdReleaseTagsMixin):
+ """
+ 通过shell git命令获取上游社区的release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return repo
+
+ def get_response(self, url):
+ """
+        生成git命令并获取response
+ return: response
+ """
+ cmd_list = ["git", "ls-remote", "--tags", url]
+ return self.get_cmd_response(cmd_list)
+
+ def trans_reponse_tags(self, reponse):
+ """
+ 解析git命令返回值为纯数字形式的tag
+ return: list
+ """
+ release_tags = []
+ pattern = re.compile(r"^([^ \t]*)[ \t]*refs\/tags\/([^ \t]*)")
+ for line in reponse.splitlines():
+ match_result = pattern.match(line)
+ if match_result:
+ tag = match_result.group(2)
+ if not tag.endswith("^{}"):
+ release_tags.append(tag)
+ return release_tags
+
+ def get_tags(self, repo):
+ """
+ 通过shell cmd访问远端获取上游社区的release tags
+ return: list
+ """
+ url = self.url(repo)
+ logging.debug("%s : get %s tags", url, self.version_control)
+ if not url:
+ logging.warning("illegal url: \"\"")
+ return []
+ response = self.get_response(url)
+ return self.trans_reponse_tags(response)
+
+
+class GithubReleaseTags(GitReleaseTags):
+ """
+ 获取github上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return urlparse.urljoin("https://github.com/", repo + ".git") if repo else ""
+
+
+class GiteeReleaseTags(GitReleaseTags):
+ """
+ 获取gitee上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ return urlparse.urljoin("https://gitee.com/", repo) if repo else ""
+
+
+class GitlabReleaseTags(GitReleaseTags):
+ """
+ 获取gitlab.gnome上游社区release tags
+ """
+ def url(self, repo):
+ """
+ 通过src_repo生成url
+ return: str
+ """
+ if not repo:
+ return ""
+ src_repos = repo.split("/")
+ if len(src_repos) == 1:
+ return urlparse.urljoin("https://gitlab.gnome.org/GNOME/", repo + ".git")
+ else:
+ return urlparse.urljoin("https://gitlab.gnome.org/", repo + ".git")
+
+
+class ReleaseTagsFactory(object):
+ """
+ ReleaseTags及其子类的工厂类
+ """
+ VERSION_CTRL_GETTER_MAPPING = {
+ "hg": HgReleaseTags,
+ "hg-raw": HgRawReleaseTags,
+ "github": GithubReleaseTags,
+ "git": GitReleaseTags,
+ "gitlab.gnome": GitlabReleaseTags,
+ "svn": SvnReleaseTags,
+ "metacpan": MetacpanReleaseTags,
+ "pypi": PypiReleaseTags,
+ "rubygem": RubygemReleaseTags,
+ "gitee": GiteeReleaseTags,
+ "gnu-ftp": GnuftpReleaseTags,
+ "ftp": FtpReleaseTags
+ }
+
+ @staticmethod
+ def get_release_tags(version_control):
+ """
+ 通过version control返回对应的ReleaseTags的子类
+ return: class
+ """
+ release_tags = ReleaseTagsFactory.VERSION_CTRL_GETTER_MAPPING.get(version_control, DefaultReleaseTags)
+ return release_tags(version_control)
diff --git a/src/ac/acl/sca/__init__.py b/src/ac/acl/sca/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/acl/sca/__pycache__/__init__.cpython-38.pyc b/src/ac/acl/sca/__pycache__/__init__.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..63ba1f52d02beba8bcc6f6f2114d7af8e71f2cca
Binary files /dev/null and b/src/ac/acl/sca/__pycache__/__init__.cpython-38.pyc differ
diff --git a/src/ac/acl/sca/__pycache__/check_sca.cpython-38.pyc b/src/ac/acl/sca/__pycache__/check_sca.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..aac8d6de1c027ef1a04387debacd9df945e40092
Binary files /dev/null and b/src/ac/acl/sca/__pycache__/check_sca.cpython-38.pyc differ
diff --git a/src/ac/acl/sca/check_sca.py b/src/ac/acl/sca/check_sca.py
new file mode 100644
index 0000000000000000000000000000000000000000..b5a4e3e824013fe74e8e55f85f6782ad5df35e8f
--- /dev/null
+++ b/src/ac/acl/sca/check_sca.py
@@ -0,0 +1,75 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2021-09-09
+# Description: check sca (software composition analysis)
+# **********************************************************************************
+
+import os
+import shutil
+import logging
+import json
+
+from src.proxy.git_proxy import GitProxy
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, WARNING, SUCCESS
+from src.ac.common.scanoss import ScanOSS
+
+logger = logging.getLogger("ac")
+
+
+class CheckSCA(BaseCheck):
+ """
+ check software composition analysis
+ """
+ def __init__(self, workspace, repo, conf):
+ """
+
+ :param workspace:
+ :param repo:
+ :param conf:
+ """
+ super(CheckSCA, self).__init__(workspace, repo, conf)
+
+ def check_scanoss(self):
+ """
+ Obtain scanoss logs and result
+ """
+ # Describes the reportUrl result jenkinsJobName jenkinsBuildNum prNo repoUrl of scanoss
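+        # hypothetical example of the result file content:
+        # {"result": "success", "reportUrl": "http://...", "jenkinsJobName": "...", "prNo": "..."}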
+ try:
+            with open(self._scanoss_result_output, 'r') as f:
+                result_dict = json.load(f)
+        except IOError:
+            logger.error("%s not found, make sure this file exists", self._scanoss_result_output)
+            return FAILED
+
+        result = result_dict.get('result')
+
+        # 保存详细结果到web server
+        logger.warning("click %s to view the scanoss detail", result_dict.get('reportUrl'))
+
+ return SUCCESS if result == "success" else FAILED
+
+ def __call__(self, *args, **kwargs):
+ """
+ 入口函数
+ :param args:
+ :param kwargs:
+ :return:
+ """
+ logger.info("check %s sca ...", self._repo)
+
+ logger.debug("args: %s, kwargs: %s", args, kwargs)
+ scanoss_conf = kwargs.get("scanoss", {})
+ self._scanoss_result_output = scanoss_conf.get("output", "scanoss_result")
+
+ return self.start_check()
diff --git a/src/ac/acl/source_consistency/__init__.py b/src/ac/acl/source_consistency/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/acl/source_consistency/__pycache__/__init__.cpython-38.pyc b/src/ac/acl/source_consistency/__pycache__/__init__.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..dfc7d0469cba58a975b13c190f6a46a8fa9ec32b
Binary files /dev/null and b/src/ac/acl/source_consistency/__pycache__/__init__.cpython-38.pyc differ
diff --git a/src/ac/acl/source_consistency/check_consistency.py b/src/ac/acl/source_consistency/check_consistency.py
new file mode 100644
index 0000000000000000000000000000000000000000..d97db71a0d43eb4a33bd6bc4ae7ccff5a8a0b707
--- /dev/null
+++ b/src/ac/acl/source_consistency/check_consistency.py
@@ -0,0 +1,368 @@
+import difflib
+import hashlib
+import logging
+import os
+import re
+import stat
+import shutil
+# import sqlite3
+import subprocess
+# from sqlite3 import Error
+
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.framework.ac_result import FAILED, SUCCESS, WARNING
+
+logger = logging.getLogger("ac")
+
+
+class CheckSourceConsistency(BaseCheck):
+ """
+ check source consistence
+ """
+
+ def __init__(self, workspace, repo, conf=None):
+ super(CheckSourceConsistency, self).__init__(workspace, repo, conf)
+
+ self._work_dir = os.path.join(workspace, "source_consistency")
+ shutil.copytree(os.path.join(workspace, repo), self._work_dir)
+ self._repo = repo
+ self.temp_txt_path = os.path.join(self._work_dir, "temp.txt")
+ self.rpmbuild_dir = os.path.join(workspace, "rpmbuild")
+ self.rpmbuild_build_path = os.path.join(self.rpmbuild_dir, "BUILD")
+ self.rpmbuild_sources_path = os.path.join(self.rpmbuild_dir, "SOURCES")
+ self.database_path = "source_clean.db"
+ self.con = None
+ self.ask_warning = "If you have some questions, you can ask q_qiutangke@163.com!"
+
+ def __call__(self, *args, **kwargs):
+ """
+ 入口函数
+ :param args:
+ :param kwargs:
+ :return:
+ """
+ logger.info("check %s source consistency. If sha256sum of repo source package is different from Official"
+ " website, check fail", self._repo)
+ _ = not os.path.exists("log") and os.mkdir("log")
+ try:
+ return self.start_check_with_order("source_consistency")
+ finally:
+ if self.con is not None:
+ self.con.close()
+ self.clear_temp()
+
+ @staticmethod
+ def get_sha256sum(package):
+ """
+ 计算文件的sha256sum值
+ :param package:包路径
+ :return:
+ """
+ logger.info("getting sha256sum of native source package...")
+ native_sha256sum = ""
+ try:
+ with open(package, "rb") as f:
+ sha256obj = hashlib.sha256()
+ sha256obj.update(f.read())
+ native_sha256sum = sha256obj.hexdigest()
+ except Exception as e:
+ logger.warning(e)
+ if native_sha256sum == "":
+ try:
+ native_sha256sum = os.popen("sha256sum {0}".format(package)).read().split()[0]
+ except Exception as e:
+ logger.warning(e)
+ return native_sha256sum.strip()
+
+ def get_package_name(self, url):
+ """
+ 从文件列表或者url中获取包名
+ """
+ package_name = os.popen("ls -S {0} |grep -v '\\.spec' |grep -v '\\.yaml' |grep -v '\\.patch' |grep -v '\\.md' |"
+ "head -n 1".format(self._work_dir)).read().split()[0]
+ if package_name == "":
+ package_name = os.path.basename(url)
+ return package_name
+
+ def get_native_sha256sum_again(self, repo):
+ base_path = os.getcwd()
+ os.chdir(self._work_dir)
+ line_list = os.popen("/usr/bin/file -i * |grep application/. |awk '{print $1}'").read().split(os.linesep)
+ os.chdir(base_path)
+ if len(line_list) == 0:
+ return "", ""
+ elif len(line_list) == 1:
+ package_name = line_list[0].strip().rstrip(":")
+ return self.get_sha256sum(os.path.join(self._work_dir, package_name)), package_name
+ else:
+ file_dict = {}
+ for line in line_list:
+ file_dict[line.strip().rstrip(":")] = difflib.SequenceMatcher(None, repo, line).quick_ratio()
+ sorted_file_dict = sorted(file_dict.items(), key=lambda x: x[1], reverse=True)
+ converted_dict = dict(sorted_file_dict)
+ package_name = next(iter(converted_dict))
+ return self.get_sha256sum(os.path.join(self._work_dir, package_name)), package_name
+
+ def check_source_consistency(self):
+ """
+ 检查源码包是否一致
+ :return:
+ """
+ if not os.path.exists(os.path.join(self.rpmbuild_dir, "SOURCES")):
+ os.makedirs(os.path.join(self.rpmbuild_dir, "SOURCES"))
+ source_url = self.get_source_url()
+ if source_url == "":
+ logger.warning("Source keywords of spec content are invalid or spec content is illegal. " +
+ self.ask_warning)
+ return WARNING
+
+ package_name = self.get_package_name(source_url)
+ if package_name not in os.listdir(self._work_dir):
+ logger.warning("no source package file in the repo, the package name is " + package_name + ". " +
+ self.ask_warning)
+ return WARNING
+
+ native_sha256sum = self.get_sha256sum(os.path.join(self._work_dir, package_name))
+ if native_sha256sum == "":
+ logger.warning("get sha256sum of native source package failed, internal error. " + self.ask_warning)
+ return WARNING
+
+ remote_sha256sum = self.get_sha256sum_from_url(source_url)
+ if remote_sha256sum == "":
+ logger.warning("Failed to get sha256sum of official website source package, there is no sha256sum in "
+ "the system database")
+ detail_info = self.check_spec_validity()
+ if detail_info == "maybe url is unreachable":
+ logger.warning("Maybe url is unreachable, you need to check the connectivity of url. If connectivity is"
+ " good, please ignore this warning! " + self.ask_warning)
+ return WARNING
+ else:
+ logger.warning(detail_info + ", please check spec file! " + self.ask_warning)
+ return WARNING
+
+ if native_sha256sum != remote_sha256sum:
+ new_native_sha256sum, new_package_name = self.get_native_sha256sum_again(self._repo)
+ logger.info("The sha256sum of new source package is " + new_native_sha256sum + ", package name is " +
+ new_package_name)
+ if new_native_sha256sum == remote_sha256sum:
+ return SUCCESS
+ logger.info("The sha256sum of source package is " + native_sha256sum + ", package name is " +
+ package_name)
+ logger.info("The sha256sum of official website source package is " + remote_sha256sum + ", package name is "
+ + package_name)
+ logger.error("The sha256sum of source package is inconsistency, maybe you modified source code, "
+ "you must let the source package keep consistency with official website source package. " +
+ self.ask_warning)
+ return FAILED
+
+ return SUCCESS
+
+ def get_source_url(self):
+ """
+ 获取spec文件中的Source URL
+ :return:
+ """
+ spec_name = ""
+ files_list = os.listdir(self._work_dir)
+ if len(files_list) == 0:
+ logger.error("copy repo error, please check!")
+ return ""
+ if self._repo + ".spec" not in files_list:
+ logger.warning("no such spec file: " + self._repo + ".spec")
+ for file_name in files_list:
+ if file_name.endswith(".spec"):
+ spec_name = file_name
+ break
+ if spec_name == "":
+ logger.error("no spec file, please check!")
+ return ""
+ source_url = self.get_source_from_spec(spec_name)
+ # If program can't get source url from spec, try to get source url by rpmbuild
+ if source_url == "":
+ source_url = self.get_source_from_rpmbuild(spec_name)
+ # source_url = None
+ return source_url
+
+ def check_spec_validity(self):
+ """
+ 检查spec文件的合法性
+ :return:
+ """
+ spec_name = ""
+ files_list = os.listdir(self._work_dir)
+ if self._repo + ".spec" not in files_list:
+ spec_file_list = []
+ for temp_file in files_list:
+ if temp_file.endswith(".spec"):
+ spec_file_list.append(temp_file)
+ if len(spec_file_list) == 1:
+ spec_name = os.path.join(self._work_dir, spec_file_list[0])
+ elif len(spec_file_list) > 1:
+ for s_file in spec_file_list:
+ if self._repo in s_file or s_file in self._repo:
+ spec_name = os.path.join(self._work_dir, s_file)
+ break
+ if spec_name == "":
+ return "no spec file"
+ else:
+ spec_name = os.path.join(self._work_dir, self._repo + ".spec")
+ source_url = self.spectool_check_source(spec_name)
+ if not source_url.startswith("http"):
+ return "source url format error"
+ return "maybe url is unreachable"
+
+ def get_source_from_rpmbuild(self, spec_name=""):
+ """
+ rpmbuild解析出可查询的Source URL
+ :param spec_name:spec文件名
+ :return:
+ """
+ if spec_name == "":
+ spec_name = self._repo + ".spec"
+ spec_file = os.path.join(self._work_dir, spec_name)
+ self.generate_new_spec(spec_file)
+ source_url = self.do_rpmbuild()
+ return source_url
+ # return None
+
+ def get_source_from_spec(self, spec_name=""):
+ """
+ spec文件中得到可查询的Source URL
+ :param spec_name:spec文件名
+ :return:
+ """
+ if spec_name == "":
+ spec_name = self._repo + ".spec"
+ spec_file = os.path.join(self._work_dir, spec_name)
+ if not os.path.exists(spec_file):
+ temp_file_list = os.listdir(self._work_dir)
+ spec_file_list = []
+ for temp_file in temp_file_list:
+ if temp_file.endswith(".spec"):
+ spec_file_list.append(temp_file)
+ if len(spec_file_list) == 1:
+ spec_file = os.path.join(self._work_dir, spec_file_list[0])
+ elif len(spec_file_list) > 1:
+ for s_file in spec_file_list:
+ if self._repo in s_file or s_file in self._repo:
+ spec_file = os.path.join(self._work_dir, s_file)
+ break
+ else:
+ return ""
+ source_url = self.spectool_check_source(spec_file)
+ return source_url
+
+ def spectool_check_source(self, spec_file):
+ """
+ 执行spectool命令
+ :param spec_file:spec文件名
+ :return:
+ """
+        try:
+            ret = subprocess.check_output(["/usr/bin/spectool", "-S", spec_file], shell=False)
+        except (subprocess.CalledProcessError, OSError) as err:
+            logger.warning("spectool command failed: %s", err)
+            return ""
+ content = ret.decode('utf-8').strip()
+ source_url = content.split(os.linesep)[0].strip() if os.linesep in content else content.strip()
+ if ":" in source_url:
+ source_url = ":".join(source_url.split(":")[1:]).strip()
+ elif "No such file or directory" in source_url:
+ return ""
+ return source_url
+
+ def generate_new_spec(self, spec_file):
+ """
+ 读取spec文件并生成新的spec文件
+ :param spec_file:spec文件名
+ :return:
+ """
+ logger.info("reading spec file : %s ...", os.path.basename(spec_file))
+
+ new_spec_file = os.path.join(self.rpmbuild_sources_path, "get_source.spec")
+ cond_source = re.compile("^Source0*")
+ source_url = ""
+ new_spec_content = ""
+ with open(spec_file) as f:
+ for line in f:
+ line = line.strip()
+ if line.startswith("%prep"):
+ break
+ elif cond_source.match(line) or re.match("^Source.*", line):
+ if source_url == "":
+ if ":" in line:
+ source_url = ":".join(line.split(":")[1:]).strip()
+ new_spec_content += line + os.linesep
+ new_spec_content += self.get_prep_function(source_url)
+ logger.info("generating new spec file ...")
+ flags = os.O_WRONLY | os.O_CREAT | os.O_EXCL
+ modes = stat.S_IWUSR | stat.S_IRUSR
+ with os.fdopen(os.open(new_spec_file, flags, modes), 'w') as f:
+ f.write(new_spec_content)
+
+ def get_prep_function(self, url):
+ """
+ 生成spec文件%prep部分的内容
+ :param url:source0的值
+ :return:
+ """
+ logger.info("generating %prep function")
+ function_content = "%prep" + os.linesep
+ function_content += "source={0}".format(url) + os.linesep
+ function_content += "cd {0}".format(self.rpmbuild_sources_path) + os.linesep
+ function_content += "echo $source > {0}".format(self.temp_txt_path) + os.linesep
+ return function_content
+
+ def do_rpmbuild(self):
+ """
+ 对新生成的spec文件执行rpmbuild
+ :return:
+ """
+ # logger.info("start to do rpmbuild")
+ # new_spec_file = os.path.join(self.rpmbuild_sources_path, "get_source.spec")
+ # res = os.system("rpmbuild -bp --nodeps {0} --define \"_topdir {1}\"".format(new_spec_file, self.rpmbuild_dir))
+ res = 1
+ if res != 0:
+ # logger.error("do rpmbuild fail")
+ pass
+ if not os.path.exists(self.temp_txt_path):
+ return ""
+ with open(self.temp_txt_path, "r") as f:
+ source_url = f.read().strip()
+ return source_url
+ #
+ # def create_connection(self):
+ # """
+ # 与数据库建立连接
+ # :return:
+ # """
+ # logger.info("getting connection with source_clean.db ...")
+ # try:
+ # if os.path.exists(self.database_path):
+ # con = sqlite3.connect(self.database_path)
+ # return con
+ # except Error:
+ # logger.error(Error)
+ # return None
+
+ def get_sha256sum_from_url(self, url):
+ """
+ 查询数据库,获取url的sha256sum值
+ :param url:source0的值
+ :return:
+ """
+ logger.info("getting sha256sum of remote source package from source_clean.db ...")
+ if self.con is None:
+ logger.warning("failed to connect to database")
+ return ""
+ cursor_obj = self.con.cursor()
+ cursor_obj.execute("SELECT sha256sum FROM source_package WHERE url = ?", (url,))
+ row = cursor_obj.fetchone()
+ if row:
+ return row[0]
+ return ""
+
+ def clear_temp(self):
+ """
+ 清理生成的中间文件
+ :return:
+ """
+        if os.path.exists(self._work_dir):
+            shutil.rmtree(self._work_dir)
+        if os.path.exists(self.rpmbuild_dir):
+            shutil.rmtree(self.rpmbuild_dir)
diff --git a/src/ac/acl/spec/__init__.py b/src/ac/acl/spec/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/acl/spec/__pycache__/__init__.cpython-310.pyc b/src/ac/acl/spec/__pycache__/__init__.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..f24f8794507c06f25bd26984b1a22809f31719b9
Binary files /dev/null and b/src/ac/acl/spec/__pycache__/__init__.cpython-310.pyc differ
diff --git a/src/ac/acl/spec/__pycache__/__init__.cpython-38.pyc b/src/ac/acl/spec/__pycache__/__init__.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..45c764393fff46c9865af7aea1fb76c2427635b7
Binary files /dev/null and b/src/ac/acl/spec/__pycache__/__init__.cpython-38.pyc differ
diff --git a/src/ac/acl/spec/__pycache__/check_spec.cpython-310.pyc b/src/ac/acl/spec/__pycache__/check_spec.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..32cf64daf8cd57401496dcec0268ae6fd3dcd5cc
Binary files /dev/null and b/src/ac/acl/spec/__pycache__/check_spec.cpython-310.pyc differ
diff --git a/src/ac/acl/spec/__pycache__/check_spec.cpython-38.pyc b/src/ac/acl/spec/__pycache__/check_spec.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..4c46bb2a5a8d8ebebea9b2f187930b0d34fd4326
Binary files /dev/null and b/src/ac/acl/spec/__pycache__/check_spec.cpython-38.pyc differ
diff --git a/src/ac/acl/spec/check_spec.py b/src/ac/acl/spec/check_spec.py
new file mode 100644
index 0000000000000000000000000000000000000000..24ecb8a3ecd7c738678e8f7126771f30f8d906f5
--- /dev/null
+++ b/src/ac/acl/spec/check_spec.py
@@ -0,0 +1,402 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: check spec file
+# **********************************************************************************
+"""
+import os
+import calendar
+import logging
+import time
+import re
+from datetime import datetime, timezone
+import yaml
+
+from src.proxy.git_proxy import GitProxy
+from src.proxy.requests_proxy import do_requests
+from src.ac.framework.ac_result import FAILED, SUCCESS, WARNING
+from src.ac.framework.ac_base import BaseCheck
+from src.ac.common.rpm_spec_adapter import RPMSpecAdapter
+from src.ac.common.gitee_repo import GiteeRepo
+from pyrpm.spec import Spec
+from src.constant import Constant
+
+logger = logging.getLogger("ac")
+
+
+class CheckSpec(BaseCheck):
+ """
+ check spec file
+ """
+
+ def __init__(self, workspace, repo, conf=None):
+ super(CheckSpec, self).__init__(workspace, repo, conf)
+
+ self._gp = GitProxy(self._work_dir)
+ self._gr = GiteeRepo(self._repo, self._work_dir, None) # don't care about decompress
+
+ self._fp = self._gp.get_content_of_file_with_commit(self._gr.spec_file)
+ if self._fp is None:
+ logger.error("no spec file")
+ else:
+ self._spec = RPMSpecAdapter(self._fp)
+ self._latest_commit = self._gp.commit_id_of_reverse_head_index(0)
+ self._tbranch = None
+
+ def __call__(self, *args, **kwargs):
+ logger.info("check %s spec ...", self._repo)
+ if self._fp is None:
+ logger.error("no spec file")
+ else:
+ self._ex_support_arch()
+ self._tbranch = kwargs.get("tbranch", None)
+ # 因门禁系统限制外网访问权限,将涉及外网访问的检查功能check_homepage暂时关闭
+ return self.start_check_with_order("patches", "changelog", "version_pr_changelog")
+
+ def _only_change_package_yaml(self):
+ """
+ 如果本次提交只变更yaml,则无需检查version
+ :return: boolean
+ """
+ diff_files = self._gp.diff_files_between_commits("HEAD~1", "HEAD~0")
+ package_yaml = "{}.yaml".format(self._repo) # package yaml file name
+
+ if len(diff_files) == 1 and diff_files[0] == package_yaml:
+ logger.debug("diff files: %s", diff_files)
+ return True
+
+ return False
+
+ def _is_lts_branch(self):
+ """
+ 检查lts分支是否是告警分支
+ :return boolean
+ """
+ if self._tbranch:
+ if self._tbranch.lower() in Constant.ALARM_LTS_BRANCH:
+ return True
+ return False
+
+ def check_version_pr_changelog(self):
+ """
+ 检查当前版本号是否比上一个commit新,每次提交pr的changelog
+ :return:
+ """
+ # need check version?
+ if self._only_change_package_yaml():
+ logger.debug("only change package yaml")
+ return SUCCESS
+
+ self._gp.checkout_to_commit_force("HEAD~1")
+ try:
+ gr = GiteeRepo(self._repo, self._work_dir, None) # don't care about decompress
+ logger.info("gr.spec_file:%s", gr.spec_file)
+ fp = self._gp.get_content_of_file_with_commit(gr.spec_file)
+ if fp is None:
+ # last commit has no spec file
+ return SUCCESS
+ spec_o = RPMSpecAdapter(fp)
+ finally:
+ self._gp.checkout_to_commit_force(self._latest_commit) # recover whatever
+
+ self._ex_pkgship(spec_o)
+
+        # version update is forbidden on alarm lts branches (Constant.ALARM_LTS_BRANCH)
+ if self._is_lts_branch():
+ logger.debug("lts branch %s", self._tbranch)
+ if RPMSpecAdapter.compare_version(self._spec.version, spec_o.version) == 1:
+ logger.error("version update of lts branch is forbidden")
+ return FAILED
+
+ def every_pr_changelog(changelog):
+ """
+ 提取最新的一次changelog
+ """
+ return next(need_str for need_str in changelog.split("*") if need_str)
+
+ changelog_new = every_pr_changelog(self._spec.changelog)
+ changelog_old = every_pr_changelog(spec_o.changelog)
+ if changelog_new == changelog_old:
+ logger.error("Every pr commit requires a changelog!")
+ return FAILED
+ if self._spec > spec_o:
+ return SUCCESS
+ elif self._spec < spec_o:
+ if self._gp.is_revert_commit(depth=5): # revert, version back, ignore
+ logger.debug("revert commit")
+ return SUCCESS
+
+ logger.warning("current version: %s-r%s, last version: %s-r%s",
+ self._spec.version, self._spec.release, spec_o.version, spec_o.release)
+ return WARNING
+
+ def check_homepage(self, timeout=30, retrying=3, interval=1):
+ """
+ 检查主页是否可访问
+ :param timeout: 超时时间
+ :param retrying: 重试次数
+ :param interval: 重试间隔
+ :return:
+ """
+ homepage = self._spec.url
+ logger.debug("homepage: %s", homepage)
+ if not homepage:
+ return SUCCESS
+
+ for _ in range(retrying):
+ if 0 == do_requests("get", homepage, timeout=timeout):
+ return SUCCESS
+ time.sleep(interval)
+
+ return FAILED
+
+ def check_changelog(self):
+ """
+ 检查changelog中的日期错误
+ :return:
+ """
+ ret = self._parse_spec()
+ if not ret:
+ return FAILED
+ return SUCCESS
+
+ def check_patches(self):
+ """
+ 检查spec中的patch是否存在,及patch的使用情况
+ :return:
+ """
+ if self._fp is None:
+ logger.error("spec file not find")
+ return FAILED
+ patches_spec = set(self._spec.patches)
+ patches_file = set(self._gr.patch_files_not_recursive())
+ logger.debug("spec patches: %s", patches_spec)
+ logger.debug("file patches: %s", patches_file)
+
+ result = SUCCESS
+
+ def equivalent_patch_number(patch_con):
+ """
+ 处理spec文件中patch序号
+ :param patch_con:spec文件中patch内容
+ :return:
+ """
+ patch_number = re.search(r"\d+", patch_con)
+ new_patch_number = "patch" + str(int(patch_number.group()))
+ return new_patch_number
+
+ def patch_adaptation(spec_con, patches_dict):
+ """
+ 检查spec文件中patch在prep阶段的使用情况
+ :param spec_con:spec文件内容
+ :param patches_dict:spec文件中补丁具体信息
+ :return:
+ """
+ if not patches_dict:
+ return True
+ miss_patches_dict = {}
+ prep_obj = re.search(r"%prep[\s\S]*%changelog", spec_con, re.I)
+ if not prep_obj:
+ logger.error("%prep part lost")
+ return False
+ prep_str = prep_obj.group().lower()
+ if prep_str.find("autosetup") != -1 or \
+ prep_str.find("autopatch") != -1:
+ return True
+ prep_patch = [equivalent_patch_number(single_prep_patch)
+ for single_prep_patch in re.findall(r"patch\d+", prep_str)]
+ for single_key, single_patch in patches_dict.items():
+ single_number = equivalent_patch_number(single_key)
+ if single_number not in prep_patch:
+ miss_patches_dict[single_key] = single_patch
+ if miss_patches_dict:
+ logger_con = ["%s: %s" % (key, value) for key, value in miss_patches_dict.items()]
+ logger.error("The following patches in the spec file are not used: \n%s", "\n".join(logger_con))
+ return False
+ return True
+
+ for patch in patches_spec - patches_file:
+ logger.error("patch %s lost", patch)
+ result = FAILED
+ for patch in patches_file - patches_spec:
+ logger.warning("patch %s redundant", patch)
+ result = WARNING
+ with open(os.path.join(self._work_dir, self._gr.spec_file), "r", encoding="utf-8") as fp:
+ all_str = fp.read()
+ adapter = Spec.from_string(all_str)
+ patch_dict = adapter.__dict__.get("patches_dict")
+ if not patch_adaptation(all_str, patch_dict):
+ result = FAILED
+ return result
+
+ def _ex_support_arch(self):
+ """
+ 保存spec中exclusivearch字段信息
+ :return:
+ """
+ exclusive_arch = self._spec.get_exclusivearch()
+ if exclusive_arch:
+ obj_s = list(set(exclusive_arch).intersection(("x86_64", "aarch64", "noarch")))
+ logger.info("support arch:%s", " ".join(obj_s))
+ if obj_s and "noarch" not in obj_s:
+ content = " ".join(obj_s)
+ try:
+ with open("support_arch", "w") as f:
+ f.write(content)
+ except IOError:
+ logger.exception("save support arch exception")
+
+ def _ex_pkgship(self, spec):
+ """
+ pkgship需求
+ :param spec: 上一个版本spec对应的RPMSpecAdapter对象
+ :return:
+ """
+ if not self._repo == "pkgship":
+ return
+
+ logger.debug("special repo \"pkgship\"")
+ compare_version = RPMSpecAdapter.compare_version(self._spec.version, spec.version)
+ compare_release = RPMSpecAdapter.compare_version(self._spec.release, spec.release)
+ compare = self._spec.compare(spec)
+
+ rs = {"repo": "pkgship", "curr_version": self._spec.version, "curr_release": self._spec.release,
+ "last_version": spec.version, "last_release": spec.release,
+ "compare_version": compare_version, "compare_release": compare_release, "compare": compare}
+
+ logger.info("%s", rs)
+ try:
+ with open("pkgship_notify", "w") as f:
+ yaml.safe_dump(rs, f)
+ except IOError:
+ logger.exception("save pkgship exception")
+
+ def _parse_spec(self):
+ """
+ 获取最新提交的spec文件
+ :return:
+ """
+ weeks = ["MON", "TUE", "WED", "THU", "FRI", "SAT", "SUN"]
+ months = ["JAN", "FEB", "MAR", "APR", "MAY", "JUN", "JUL", "AUG", "SEP", "OCT", "NOV", "DEC"]
+ week = 0
+ month = 1
+ day = 2
+ year = 3
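+        # rpm changelog header layout: "* <Week> <Month> <Day> <Year> <author> <email> - <version>-<release>"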
+
+ def judgment_date(date_obj):
+ """
+ 检查日期合法性
+ """
+ if date_obj[week].upper() not in weeks:
+ return False
+ if date_obj[month].upper() not in months:
+ return False
+ # 日期,取1-当前月份最大天数
+ if not 0 < int(date_obj[day]) <= calendar.monthrange(int(date_obj[year]),
+ months.index(date_obj[month].upper()) + 1)[1]:
+ return False
+ return True
+
+ def bogus_date(date_obj):
+ """
+ 匹配年月日对应的星期
+ """
+ try:
+ week_num = calendar.weekday(int(date_obj[year]), months.index(date_obj[month].upper()) + 1,
+ int(date_obj[day]))
+ except (ValueError, IndexError) as error:
+ logger.error(error)
+ return False
+ if weeks[week_num] != date_obj[week].upper():
+ return False
+ return True
+
+ def release_and_version(changelog_con, version, release):
+ """
+ 检查changelog中的版本号,release号是否和spec的版本号,release号一致
+ """
+ # 排除邮箱格式中“-”的影响
+ new_str = re.sub(r"<[\w._-]+@[\w\-_]+[.a-zA-Z]+>", "", changelog_con)
+ if self._spec.epoch: # 检查spec文件中是否存在epoch字段
+ obj_s = re.search(r"\w+:(\w+(.\w+){0,9})-[\w.]+", new_str)
+ if not obj_s:
+                    logger.error(
+                        "There is a non-standard format in %s, please keep it consistent: Epoch:version-release \n"
+                        "e.g: 1:1.0.0-1", changelog_con)
+ return False
+ version = "".join([self._spec.epoch, ":", version])
+ else:
+ obj_s = re.search(r"(\w+(.\w+){0,9})-[\w.]+", new_str)
+ if not obj_s:
+ logger.error("%s release or version incorrect format,please keep it consistent: version-release \n"
+ "e.g: 1.0.0-1", changelog_con)
+ return False
+ try:
+ version_num, release_num = obj_s.group(0).split("-")
+ except (ValueError, IOError, KeyError, IndexError) as e:
+ logger.error("%s release or version incorrect format,please keep it consistent: version-release \n"
+ "e.g: 1.0.0-1", changelog_con)
+ return False
+ if version_num != version:
+ logger.error("version error in changelog: %s is different from %s", version_num, version)
+ return False
+ if release_num != release:
+ logger.error("release error in changelog: %s is different from %s", release_num, release)
+ return False
+ return True
+
+ def check_mailbox(changelog):
+ """
+ 检查changelog中邮箱格式
+ """
+ mail_obj = re.findall(r"[\w._-]+@[\w\-_]+[.a-zA-Z]+", changelog)
+ if not mail_obj:
+ return False
+ return True
+
+ def check_changelog_entries_start(changelog):
+ """
+ %changelog 条目必须以 * 开头
+ """
+ changelog_entries_obj = re.match(r"\*", changelog)
+ if not changelog_entries_obj:
+ return False
+ return True
+ if self._fp is None:
+ logger.error("spec file not find")
+ return FAILED
+ if not check_changelog_entries_start(self._spec.changelog):
+ logger.error("%changelog entries must start with *")
+ return False
+ changelog = self._spec.changelog.split("*")
+ # 取最新一条changelog
+ changelog_con = next(need_str for need_str in changelog if need_str)
+ # 检查changelog中邮箱格式
+ if not check_mailbox(changelog_con):
+ logger.error("bad mailbox in changelog:%s", changelog_con)
+ return False
+ # date_obj是字符串列表,样例:['Tue', 'Mar', '21', '2022', 'xxx', '', '-', '2.9.24-5-', 'test', '2.9.24-5']
+ date_obj = [con for con in changelog_con.strip(" ").split(" ") if con] # 列表中的空字符串已处理
+ if len(date_obj) < 4: # 列表中的字符串至少四个,包含年、月、日、星期 ['Tue', 'Mar', '21', '2022']
+ logger.error("bad data in changelog:%s", changelog_con)
+ return False
+ if not judgment_date(date_obj):
+ logger.error("bad date in changelog:%s", changelog_con)
+ return False
+ if not release_and_version(changelog_con, self._spec.version, self._spec.release):
+ return False
+ if not bogus_date(date_obj):
+ logger.error("bogus date in changelog:%s", changelog_con)
+ return False
+ return True
diff --git a/src/ac/common/__init__.py b/src/ac/common/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/common/__pycache__/__init__.cpython-310.pyc b/src/ac/common/__pycache__/__init__.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..90f4c6a580be161b7d5b5f02aaee9e01f2e6e91e
Binary files /dev/null and b/src/ac/common/__pycache__/__init__.cpython-310.pyc differ
diff --git a/src/ac/common/__pycache__/__init__.cpython-38.pyc b/src/ac/common/__pycache__/__init__.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..922588f83b6d33088bea6ea6fb39ac81e9d7a91c
Binary files /dev/null and b/src/ac/common/__pycache__/__init__.cpython-38.pyc differ
diff --git a/src/ac/common/__pycache__/gitee_repo.cpython-310.pyc b/src/ac/common/__pycache__/gitee_repo.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..abe9ce0191cd24d407ddd7caaa7f322e06f4625e
Binary files /dev/null and b/src/ac/common/__pycache__/gitee_repo.cpython-310.pyc differ
diff --git a/src/ac/common/__pycache__/gitee_repo.cpython-38.pyc b/src/ac/common/__pycache__/gitee_repo.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..90c2e0e13708bee8c03eaf9b2a574dd2a4090a45
Binary files /dev/null and b/src/ac/common/__pycache__/gitee_repo.cpython-38.pyc differ
diff --git a/src/ac/common/__pycache__/rpm_spec_adapter.cpython-310.pyc b/src/ac/common/__pycache__/rpm_spec_adapter.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..d38b263b817e615c7ecdf802f01010627bf5683a
Binary files /dev/null and b/src/ac/common/__pycache__/rpm_spec_adapter.cpython-310.pyc differ
diff --git a/src/ac/common/__pycache__/rpm_spec_adapter.cpython-38.pyc b/src/ac/common/__pycache__/rpm_spec_adapter.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..dd2ac23f5f7e1ee9a520f624ce1a8cd4eac8166b
Binary files /dev/null and b/src/ac/common/__pycache__/rpm_spec_adapter.cpython-38.pyc differ
diff --git a/src/ac/common/__pycache__/scanoss.cpython-38.pyc b/src/ac/common/__pycache__/scanoss.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..33c62f07ce5fbb9f5405d1cdc6ef6be74aea9586
Binary files /dev/null and b/src/ac/common/__pycache__/scanoss.cpython-38.pyc differ
diff --git a/src/ac/common/gitee_repo.py b/src/ac/common/gitee_repo.py
new file mode 100644
index 0000000000000000000000000000000000000000..e00c11091fa845cbf51511d7d611699823a7dbad
--- /dev/null
+++ b/src/ac/common/gitee_repo.py
@@ -0,0 +1,276 @@
+# -*- encoding=utf-8 -*-
+"""
+# ***********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: Gitee api proxy
+# ***********************************************************************************/
+"""
+
+import os
+import logging
+
+from src.proxy.git_proxy import GitProxy
+from src.utils.shell_cmd import shell_cmd_live
+
+logger = logging.getLogger("ac")
+
+
+class GiteeRepo(object):
+ """
+ analysis src-openeuler repo
+ """
+
+ def __init__(self, repo, work_dir, decompress_dir):
+ self._repo = repo
+ self._work_dir = work_dir
+ self._decompress_dir = decompress_dir
+
+ self._patch_files = []
+ self._compress_files = []
+
+ self.spec_file = None
+ self.yaml_file = None
+ self.patch_dir_mapping = {}
+
+ self.find_file_path()
+
+ def find_file_path(self):
+ """
+ compress file, patch file, diff file, spec file
+ """
+ spec_files = []
+ for dirpath, _, filenames in os.walk(self._work_dir):
+ for filename in filenames:
+ rel_file_path = os.path.join(dirpath, filename).replace(self._work_dir, "").lstrip("/")
+ if self.is_compress_file(filename):
+ logger.debug("find compress file: %s", rel_file_path)
+ self._compress_files.append(rel_file_path)
+ elif self.is_patch_file(filename):
+ logger.debug("find patch file: %s", rel_file_path)
+ self._patch_files.append(rel_file_path)
+ elif self.is_spec_file(filename):
+ logger.debug("find spec file: %s", rel_file_path)
+ spec_files.append(filename)
+ elif self.is_package_yaml_file(filename):
+ logger.debug("find yaml file: %s", rel_file_path)
+ self.yaml_file = rel_file_path
+
+ def guess_real_spec_file():
+ """
+ maybe multi spec file of repo
+ :return:
+ """
+ if not spec_files: # closure
+ # logger.warning("no spec file")
+ return None
+
+ if len(spec_files) == 1:
+ return spec_files[0]
+
+ # file prefix
+ for spec_file in spec_files:
+ prefix = spec_file.split(".")[0]
+ if prefix == self._repo:
+ return spec_file
+
+ # will not happen
+ # logger.warning("no spec file")
+ return None
+
+ self.spec_file = guess_real_spec_file()
+
+ def patch_files_not_recursive(self):
+ """
+ 获取当前目录下patch文件
+ """
+ return [filename for filename in os.listdir(self._work_dir)
+ if os.path.isfile(os.path.join(self._work_dir, filename)) and self.is_patch_file(filename)]
+
+ def decompress_file(self, file_path):
+ """
+ 解压缩文件
+ :param file_path:
+ :return:
+ """
+ if self._is_compress_zip_file(file_path):
+ decompress_cmd = "cd {}; timeout 120s unzip -o -d {} {}".format(
+ self._work_dir, self._decompress_dir, file_path)
+ elif self._is_compress_tar_file(file_path):
+ decompress_cmd = "cd {}; timeout 120s tar -C {} -xavf {}".format(
+ self._work_dir, self._decompress_dir, file_path)
+ else:
+ logger.warning("unsupport compress file: %s", file_path)
+ return False
+
+ ret, _, _ = shell_cmd_live(decompress_cmd)
+ if ret:
+ logger.debug("decompress failed")
+ return False
+
+ return True
+
+ def decompress_all(self):
+ """
+ 解压缩所有文件
+ :return: 0/全部成功,1/部分成功,-1/全部失败
+ """
+ if not self._compress_files:
+ logger.warning("no compressed source file")
+ rs = [self.decompress_file(filepath) for filepath in self._compress_files]
+
+ return 0 if all(rs) else (1 if any(rs) else -1)
+
+ def apply_patch(self, patch, max_leading=5):
+ """
+ 尝试所有路径和leading
+ :param patch: 补丁
+ :param max_leading: leading path
+ """
+ logger.debug("apply patch %s", patch)
+ for patch_dir in [filename for filename in os.listdir(self._decompress_dir)
+ if os.path.isdir(os.path.join(self._decompress_dir, filename))] + ["."]:
+ if patch_dir.startswith(".git"):
+ continue
+ for leading in range(max_leading + 1):
+ logger.debug("try dir %s -p%s", patch_dir, leading)
+ if GitProxy.apply_patch_at_dir(os.path.join(self._decompress_dir, patch_dir),
+ os.path.join(self._work_dir, patch), leading):
+ logger.debug("patch success")
+ self.patch_dir_mapping[patch] = os.path.join(self._decompress_dir, patch_dir)
+ return True
+
+ logger.info("apply patch %s failed", patch)
+ return False
+
+ def apply_all_patches(self, *patches):
+ """
+ 打补丁通常是有先后顺序的
+ :param patches: 需要打的补丁
+ """
+ if not self._compress_files:
+ logger.debug("no compress source file, not need apply patch")
+ return 0
+
+ rs = []
+ for patch in patches:
+ if patch in set(self._patch_files):
+ rs.append(self.apply_patch(patch))
+ else:
+ logger.error("patch %s not exist", patch)
+ rs.append(False)
+
+ return 0 if all(rs) else (1 if any(rs) else -1)
+
+ def get_compress_files(self):
+ """
+ get compress files
+ """
+ return self._compress_files
+
+ def set_compress_files(self, compress_files):
+ """
+ set compress files
+ """
+ self._compress_files = compress_files
+
+ @staticmethod
+ def is_py_file(filename):
+ """
+ 功能描述:判断文件是否是python文件
+ 参数:文件名
+ 返回值:bool
+ """
+ return filename.endswith((".py",))
+
+ @staticmethod
+ def is_go_file(filename):
+ """
+ 功能描述:判断文件名是否是go文件
+ 参数:文件名
+ 返回值:bool
+ """
+ return filename.endswith((".go",))
+
+ @staticmethod
+ def is_c_cplusplus_file(filename):
+ """
+ 功能描述:判断文件名是否是c++文件
+ 参数:文件名
+ 返回值:bool
+ """
+ return filename.endswith((".c", ".cpp", ".cc", ".cxx", ".c++", ".h", ".hpp", "hxx"))
+
+ @staticmethod
+ def is_code_file(filename):
+ """
+ 功能描述:判断文件名是否是源码文件
+ 参数:文件名
+ 返回值:bool
+ """
+ return GiteeRepo.is_py_file(filename) \
+ or GiteeRepo.is_go_file(filename) \
+ or GiteeRepo.is_c_cplusplus_file(filename)
+
+ @staticmethod
+ def is_patch_file(filename):
+ """
+ 功能描述:判断文件名是否是补丁文件
+ 参数:文件名
+ 返回值:bool
+ """
+ return filename.endswith((".patch", ".diff"))
+
+ @staticmethod
+ def is_compress_file(filename):
+ """
+ 功能描述:判断文件名是否是压缩文件
+ 参数:文件名
+ 返回值:bool
+ """
+ return GiteeRepo._is_compress_tar_file(filename) or GiteeRepo._is_compress_zip_file(filename)
+
+ @staticmethod
+ def _is_compress_zip_file(filename):
+ """
+ 功能描述:判断文件名是否是zip压缩文件
+ 参数:文件名
+ 返回值:bool
+ """
+ return filename.endswith((".zip",))
+
+ @staticmethod
+ def _is_compress_tar_file(filename):
+ """
+ 功能描述:判断文件名是否是tar压缩文件
+ 参数:文件名
+ 返回值:bool
+ """
+ return filename.endswith((".tar.gz", ".tar.bz", ".tar.bz2", ".tar.xz", "tgz"))
+
+ @staticmethod
+ def is_spec_file(filename):
+ """
+ 功能描述:判断文件名是否以.spec结尾
+ 参数:文件名
+ 返回值:bool
+ """
+ return filename.endswith((".spec",))
+
+ @staticmethod
+ def is_package_yaml_file(filename):
+ """
+ 功能描述:判断文件名是否以.yaml结尾
+ 参数:文件名
+ 返回值:bool
+ """
+ return filename.endswith((".yaml",))
diff --git a/src/ac/common/linter.py b/src/ac/common/linter.py
new file mode 100644
index 0000000000000000000000000000000000000000..a7dd24e95e75ea148eef71c9f9b81e222753eadd
--- /dev/null
+++ b/src/ac/common/linter.py
@@ -0,0 +1,112 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: check code with linter tools
+# **********************************************************************************
+
+import re
+import logging
+
+from src.utils.shell_cmd import shell_cmd_live, shell_cmd
+
+logger = logging.getLogger("ac")
+
+
+class LinterCheck(object):
+ """
+ run linter check code
+ """
+ PYLINTRESULTPREFIX = ["C", "R", "W", "E", "F"]
+
+ @classmethod
+ def get_summary_of_pylint(cls, message):
+ """
+ parser message for summary and details
+ """
+ summary = {}
+ for prefix in cls.PYLINTRESULTPREFIX:
+ m = re.findall("{}: *[0-9]+, *[0-9]+:".format(prefix), "\n".join(message))
+ summary[prefix] = len(m)
+
+ return summary
+
+ @classmethod
+ def get_summary_of_golint(cls, message):
+ """
+ 所有都当作WARNING
+ """
+ m = re.findall(r"\.go:[0-9]+:[0-9]+:", "\n".join(message))
+ return {"W": len(m)}
+
+ @classmethod
+ def get_summary_of_splint(cls, message):
+ """
+ parser message for summary
+ """
+ logger.debug(message)
+ summary = {}
+
+ return summary
+
+ @classmethod
+ def check_python(cls, filepath):
+ """
+ Check python script by pylint
+ Using the default text output, the message format is :
+ MESSAGE_TYPE: LINE_NUM:[OBJECT:] MESSAGE
+ There are 5 kind of message types :
+ * (C) convention, for programming standard violation
+ * (R) refactor, for bad code smell
+ * (W) warning, for python specific problems
+ * (E) error, for probable bugs in the code
+ * (F) fatal, if an error occurred which prevented pylint from doing
+ """
+ logger.debug("check python file: %s", filepath)
+ # E0401: import module error
+ pylint_cmd = "pylint3 --disable=E0401 {}".format(filepath)
+ ret, out, _ = shell_cmd_live(pylint_cmd, cap_out=True, verbose=True)
+
+ if ret:
+ logger.debug("pylint ret, %s", ret)
+
+ return cls.get_summary_of_pylint(out)
+
+ @classmethod
+ def check_golang(cls, filepath):
+ """
+ Check golang code by golint
+ """
+ logger.debug("check go file: %s", filepath)
+ golint_cmd = "golint {}".format(filepath)
+ ret, out, _ = shell_cmd_live(golint_cmd, cap_out=True, verbose=True)
+
+ if ret:
+ logger.debug("golint error, %s", ret)
+ return {}
+
+ return cls.get_summary_of_golint(out)
+
+ @classmethod
+ def check_c_cplusplus(cls, filepath):
+ """
+ Check c/c++ code by splint
+ """
+ logger.debug("check c/c++ file: %s", filepath)
+ splint_cmd = "splint {}".format(filepath)
+ ret, out, _ = shell_cmd(splint_cmd)
+
+ if ret:
+ logger.debug("splint error, %s", ret)
+ return {}
+
+ return cls.get_summary_of_splint(out)
diff --git a/src/ac/common/rpm_spec_adapter.py b/src/ac/common/rpm_spec_adapter.py
new file mode 100644
index 0000000000000000000000000000000000000000..d602d252fcbd7bebeccfcb8fedbb5d68f7db6bbe
--- /dev/null
+++ b/src/ac/common/rpm_spec_adapter.py
@@ -0,0 +1,146 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: rpm spec analysis adapter
+# **********************************************************************************
+"""
+
+import re
+import logging
+
+from pyrpm.spec import Spec, replace_macros
+
+from src.utils.shell_cmd import shell_cmd
+
+logger = logging.getLogger("ac")
+
+
+class RPMSpecAdapter(object):
+ """
+ rpm spec file object
+ """
+ def __init__(self, fp):
+ if isinstance(fp, str):
+ with open(fp, "r") as fp:
+ self._adapter = Spec.from_string(fp.read())
+ else:
+ self._adapter = Spec.from_string(fp.read())
+ fp.close()
+
+ def __getattr__(self, item):
+ """
+
+ :param item:
+ :return
+ """
+ value = getattr(self._adapter, item)
+ if isinstance(value, list):
+ return [replace_macros(item, self._adapter) for item in value]
+ return replace_macros(value, self._adapter) if value else ""
+
+ def get_source(self, key):
+ """
+ get source url from spec.source_dict by key
+ :return:
+ """
+ src_url = self._adapter.sources_dict.get(key, "")
+ return replace_macros(src_url, self._adapter) if src_url else ""
+
+ def get_patch(self, key):
+ """
+ get source url from spec.source_dict by key
+ :return:
+ """
+ patch = self._adapter.patches_dict.get(key, "")
+ return replace_macros(patch, self._adapter) if patch else ""
+
+ def get_exclusivearch(self):
+ """
+ get exclusive arch
+ :return:
+ """
+ exclusive_arch_list = []
+ exclusive_arch = self.exclusivearch
+ logger.info("exclusive_arch \"%s\"", exclusive_arch)
+ if exclusive_arch:
+ macros_list = exclusive_arch.split()
+ for macros_mem in macros_list:
+ if macros_mem.startswith("%"):
+ cmd = 'rpm --eval "{}"'.format(macros_mem)
+ ret, out, _ = shell_cmd(cmd)
+ if out:
+ exclusive_arch_list.extend(bytes.decode(out).strip().split())
+ else:
+ exclusive_arch_list.append(macros_mem)
+ return exclusive_arch_list
+
+ @staticmethod
+ def compare_version(version_n, version_o):
+ """
+ :param version_n:
+ :param version_o:
+ :return: 0~eq, 1~gt, -1~lt
+ """
+ # replace continued chars to dot
+ version_n = re.sub("[a-zA-Z_-}]+", ".", version_n).strip().strip(".")
+ version_o = re.sub("[a-zA-Z_-}]+", ".", version_o).strip().strip(".")
+ # replace continued dots to a dot
+ version_n = re.sub("\.+", ".", version_n)
+ version_o = re.sub("\.+", ".", version_o)
+ # same partitions with ".0" padding
+ # "..." * -n = ""
+ version_n = "{}{}".format(version_n, '.0' * (len(version_o.split('.')) - len(version_n.split('.'))))
+ version_o = "{}{}".format(version_o, '.0' * (len(version_n.split('.')) - len(version_o.split('.'))))
+
+ logger.debug("compare versions: %s vs %s", version_n, version_o)
+ z = zip(version_n.split("."), version_o.split("."))
+
+ for p in z:
+ try:
+ if int(p[0]) < int(p[1]):
+ return -1
+ elif int(p[0]) > int(p[1]):
+ return 1
+ except ValueError as exc:
+ logger.debug("check version exception, %s", exc)
+ continue
+
+ return 0
+
+ def compare(self, other):
+ """
+ 比较spec的版本号和发布号
+ :param other:
+ :return: 0~eq, 1~gt, -1~lt
+ """
+ if self.__class__.compare_version(self.version, other.version) == 1:
+ return 1
+ if self.__class__.compare_version(self.version, other.version) == -1:
+ return -1
+
+ if self.__class__.compare_version(self.release, other.release) == 1:
+ return 1
+ if self.__class__.compare_version(self.release, other.release) == -1:
+ return -1
+
+ return 0
+
+ def __lt__(self, other):
+ return -1 == self.compare(other)
+
+ def __eq__(self, other):
+ return 0 == self.compare(other)
+
+ def __gt__(self, other):
+ return 1 == self.compare(other)
diff --git a/src/ac/common/scanoss.py b/src/ac/common/scanoss.py
new file mode 100644
index 0000000000000000000000000000000000000000..ce0a628fe7e3f3bad7fe9b80275e6fedaffd8474
--- /dev/null
+++ b/src/ac/common/scanoss.py
@@ -0,0 +1,174 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2021-04-15
+# Description: sca with scanoss tools
+# **********************************************************************************
+
+import json
+import logging
+
+from src.utils.shell_cmd import shell_cmd_live, shell_cmd
+
+logger = logging.getLogger("ac")
+
+
+class ScanOSS(object):
+ """
+ scanoss 工具代理
+ """
+ def __init__(self, key, api_url, blacklist_sbom):
+ """
+
+ :param key:
+ :param api_url:
+ :param blacklist_sbom:
+ """
+ self._key = key
+ self._api_url = api_url
+ self._blacklist_sbom = blacklist_sbom
+
+ self._html_content = ""
+
+ def result_analysis(self, result):
+ """
+ 分析结果
+ 1.是否发生代码片段引用
+ 2.如果有则转换成html格式
+ :param result:
+ :return: True/没有代码引用,否则False
+ """
+ try:
+ json_format = json.loads(result)
+ except ValueError:
+ logger.exception("illegal scanoss result, \"%s\"", result)
+ return True
+
+ snippets = 0
+ files = 0
+ detail_trs = []
+ for filename, items in json_format.items():
+ for item in items:
+ if item["id"] == "none":
+ continue
+ if item["id"] == "snippet":
+ snippets += 1
+ elif item["id"] == "file":
+ files += 1
+
+ detail_trs.append(self.__class__.detail_trs(filename, item))
+
+ logger.debug("snippets: %s, files: %s", snippets, files)
+ detail = "".format(
+ th=self.__class__.detail_th(), trs="\n".join(detail_trs))
+
+ summary = "".format(th=self.__class__.summary_th(),
+ trs="".join(self.__class__.summary_trs(snippets, files)))
+
+ self._html_content = "{summary}{details}".format(summary=summary, details=detail)
+
+ return False if snippets or files else True
+
+ @staticmethod
+ def detail_th():
+ """
+ 详细结果table header
+ :return:
+ """
+ return "" \
+ "| filename | " \
+ "id | " \
+ "lines | " \
+ "oss_lines | " \
+ "matched | " \
+ "vendor | " \
+ "component | " \
+ "version | " \
+ "url | " \
+ "file | " \
+ "file_id | " \
+ "
"
+
+ @staticmethod
+ def detail_trs(filename, item):
+ """
+ 详细结果table rows
+ :param filename: 文件名
+ :param item:
+ :return:
+ """
+ return ""\
+ "| {filename} | " \
+ "{id} | " \
+ "{lines} | " \
+ "{oss_lines} | " \
+ "{matched} | " \
+ "{vendor} | " \
+ "{component} | " \
+ "{version} | " \
+ "{url} | " \
+ "{file} | " \
+ "{file_hash} | " \
+ "
".format(filename=filename, **item)
+
+ @staticmethod
+ def summary_th():
+ """
+ 归纳结果table header
+ :return:
+ """
+ return "" \
+ "| snippet | " \
+ "file | " \
+ "total | " \
+ "
"
+
+ @staticmethod
+ def summary_trs(snippets, files):
+ """
+ 归纳结果table rows
+ :param snippets:
+ :param files:
+ :return:
+ """
+ return "" \
+ "| {snippets} | " \
+ "{files} | " \
+ "{total} | " \
+ "
".format(snippets=snippets, files=files, total=snippets + files)
+
+ @property
+ def html(self):
+ """
+ 内容
+ :return:
+ """
+ return self._html_content
+
+ def scan(self, directory):
+ """
+        Run the scan
+        :param directory: directory to scan
+ :return:
+ """
+ logger.debug("scan dir: %s", directory)
+ scanoss_cmd = "scanner.py --blacklist {} --format {} {} --apiurl {} {}".format(
+ self._blacklist_sbom, "plain", "--key {}".format(self._key) if self._key else "", self._api_url, directory)
+ ret, out, err = shell_cmd(scanoss_cmd)
+
+ if ret:
+ logger.error("scanoss error, %s", ret)
+ logger.error("%s", err)
+ return True
+
+ return self.result_analysis(out)
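+
+# A minimal usage sketch (all values are hypothetical; scanner.py must be installed, and the
+# api_url, key and blacklist sbom below are placeholders, not defaults shipped with this repo):
+#
+#     scanner = ScanOSS(key="", api_url="https://scanoss.example.com/api/scan",
+#                       blacklist_sbom="blacklist_sbom.json")
+#     passed = scanner.scan("/path/to/workspace/repo")
+#     if not passed:
+#         print(scanner.html)    # HTML tables describing the snippet/file matches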
diff --git a/src/ac/framework/__init__.py b/src/ac/framework/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/framework/__pycache__/ac_base.cpython-310.pyc b/src/ac/framework/__pycache__/ac_base.cpython-310.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..a9eb4fa116a2183ae6cc1276c9a14a5dba7e0c79
Binary files /dev/null and b/src/ac/framework/__pycache__/ac_base.cpython-310.pyc differ
diff --git a/src/ac/framework/__pycache__/ac_base.cpython-38.pyc b/src/ac/framework/__pycache__/ac_base.cpython-38.pyc
new file mode 100644
index 0000000000000000000000000000000000000000..9cd235fac010b7f87bfdc6c01fc18f3d3c4f8a9f
Binary files /dev/null and b/src/ac/framework/__pycache__/ac_base.cpython-38.pyc differ
diff --git a/src/ac/framework/ac.py b/src/ac/framework/ac.py
new file mode 100644
index 0000000000000000000000000000000000000000..e3ad4982a0cb1986f7a06ef1800243e30fcb474d
--- /dev/null
+++ b/src/ac/framework/ac.py
@@ -0,0 +1,351 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: access control list entrypoint
+# **********************************************************************************
+
+import argparse
+import datetime
+import importlib
+import json
+import logging.config
+import os
+import sys
+import warnings
+
+import yaml
+from yaml.error import YAMLError
+
+from src.proxy.git_proxy import GitProxy
+from src.proxy.gitee_proxy import GiteeProxy
+from src.proxy.jenkins_proxy import JenkinsProxy
+from src.proxy.kafka_proxy import KafkaProducerProxy
+from src.utils.dist_dataset import DistDataset
+from src.constant import Constant
+
+
+class AC(object):
+ """
+ ac entrypoint
+ """
+
+ def __init__(self, conf, community="src-openeuler"):
+ """
+
+        :param conf: path to the configuration file
+ :param community: src-openeuler or openeuler
+ :return:
+ """
+        self._ac_check_elements = {}        # access control check items
+        self._ac_check_result = []          # access control check results
+
+ acl_path = os.path.realpath(os.path.join(os.path.dirname(__file__), "../acl"))
+        self._acl_package = "src.ac.acl"    # note: check modules are imported relative to this package
+ self.load_check_elements_from_acl_directory(acl_path)
+ if community != "src-openeuler" and community != "openeuler":
+ self.load_check_elements_from_conf(conf, "src-openeuler")
+ else:
+ self.load_check_elements_from_conf(conf, community)
+
+ logger.debug("check list: %s", self._ac_check_elements)
+
+ @staticmethod
+ def comment_jenkins_url(gp, jp, pr):
+ """
+        Post a comment on the pull request with links to the running build jobs
+        :param gp: gitee proxy
+        :param jp: jenkins proxy
+        :param pr: pull request number
+ :return:
+ """
+ comments = ["门禁正在运行, 您可以通过以下链接查看实时门禁检查结果."]
+
+ trigger_job_name = os.environ.get("JOB_NAME")
+ trigger_build_id = os.environ.get("BUILD_ID")
+ trigger_job_info = jp.get_job_info(trigger_job_name)
+ trigger_job_url = trigger_job_info.get("url")
+        comments.append("门禁入口及编码规范检查: <a href={}>{}</a>, 当前构建号为 {}".format(
+            trigger_job_url, jp.get_job_path_from_job_url(trigger_job_url), trigger_build_id))
+
+ down_projects = trigger_job_info.get("downstreamProjects", [])
+ build_job_name_list = []
+ build_job_link_list = []
+ for project in down_projects:
+ build_job_url = project.get("url", "")
+ if build_job_url:
+ build_job_name = jp.get_job_path_from_job_url(build_job_url)
+ build_job_name_list.append(build_job_name)
+                build_job_link_list.append("<a href={}>{}</a>".format(build_job_url, build_job_name))
+ comments.append("构建及构建后检查: {}".format(", ".join(build_job_link_list)))
+
+ if build_job_name_list:
+ build_job_info = jp.get_job_info(build_job_name_list[0])
+ down_down_projects = build_job_info.get("downstreamProjects", [])
+ for project in down_down_projects:
+ comment_job_url = project.get("url")
+            comments.append("门禁结果回显: <a href={}>{}</a>".format(
+                comment_job_url, jp.get_job_path_from_job_url(comment_job_url)))
+ comments.append("若您对门禁结果含义不清晰或者遇到问题不知如何解决,可参考"
+ "门禁指导手册")
+ gp.comment_pr(pr, "\n".join(comments))
+
+ @staticmethod
+ def is_repo_support_check(repo, check_element):
+ """
+        Whether the repository supports this check.
+        The check is skipped if the repo is not in allow_list (when allow_list is set) or is in deny_list.
+        :param repo:
+        :param check_element:
+        :return: True if the check should run, otherwise False
+ """
+ allow_list = check_element.get("allow_list") or []
+ deny_list = check_element.get("deny_list") or []
+
+ return False if (allow_list and repo not in allow_list) or repo in deny_list else True
+
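+    # A quick illustration with hypothetical configuration values:
+    #   is_repo_support_check("pkgship", {"allow_list": ["pkgship"]})   -> True
+    #   is_repo_support_check("kernel", {"allow_list": ["pkgship"]})    -> False
+    #   is_repo_support_check("kernel", {"deny_list": ["kernel"]})      -> False
+    #   is_repo_support_check("anything", {})                           -> True
+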
+ def check_all(self, workspace, repo, dataset, **kwargs):
+ """
+        Run all access control checks
+ :param workspace:
+ :param repo:
+ :return:
+ """
+ target_branch = kwargs.get("tbranch")
+        if target_branch and target_branch.lower() in Constant.STOPPED_MAINTENANCE_BRANCH:
+ logger.error("%s is no longer maintained, and access control is no longer checked.", target_branch)
+ self._ac_check_result.append({"name": "Branch is not maintained", "result": 2})
+ return
+
+ for element in self._ac_check_elements:
+ check_element = self._ac_check_elements.get(element)
+ logger.debug("check %s", element)
+
+ # show in gitee, must starts with "check_"
+ hint = check_element.get("hint", "check_{}".format(element))
+ if not hint.startswith("check_"):
+ hint = "check_{}".format(hint)
+
+ if not self.__class__.is_repo_support_check(repo, check_element):
+ logger.debug("%s not support check", repo)
+ continue
+
+ # import module
+ module_path = check_element.get("module", "{}.check_{}".format(element, element)) # eg: spec.check_spec
+ try:
+ module = importlib.import_module("." + module_path, self._acl_package)
+ logger.debug("load module %s succeed", module_path)
+ except ImportError as exc:
+ logger.exception("import module %s exception, %s", module_path, exc)
+ continue
+
+ # import entry
+ entry_name = check_element.get("entry", "Check{}".format(element.capitalize()))
+ try:
+ entry = getattr(module, entry_name)
+ logger.debug("load entry \"%s\" succeed", entry_name)
+ except AttributeError as exc:
+ logger.warning("entry \"%s\" not exist in module %s, %s", entry_name, module_path, exc)
+ continue
+
+            # create an instance if the entry is a class
+            if isinstance(entry, type):     # class object
+                entry = entry(workspace, repo, check_element)
+
+ if not callable(entry): # check callable
+ logger.warning("entry %s not callable", entry_name)
+ continue
+
+ # do ac check
+ result = entry(**kwargs)
+ logger.debug("check result %s %s", element, result)
+
+ self._ac_check_result.append({"name": hint, "result": result.val})
+ dataset.set_attr("access_control.build.acl.{}".format(element), result.hint)
+
+ dataset.set_attr("access_control.build.content", self._ac_check_result)
+ logger.debug("ac result: %s", self._ac_check_result)
+
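+    # After check_all runs, self._ac_check_result holds entries such as (hypothetical values)
+    # [{"name": "check_spec_file", "result": 0}, {"name": "check_binary_file", "result": 1}],
+    # where "result" is the ACResult value: 0 success, 1 warning, 2 failed, 3 exclude.
+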
+ def load_check_elements_from_acl_directory(self, acl_dir):
+ """
+        Load all check items found in the acl directory
+ :return:
+ """
+ for filename in os.listdir(acl_dir):
+ if filename != "__pycache__" and os.path.isdir(os.path.join(acl_dir, filename)):
+                self._ac_check_elements[filename] = {}   # empty conf; defaults are applied when checking
+
+ def load_check_elements_from_conf(self, conf_file, community):
+ """
+        Load check items from the configuration file (only yaml format is supported)
+        :param conf_file: path to the configuration file
+ :param community: src-openeuler or openeuler
+ :return:
+ """
+ try:
+ with open(conf_file, "r") as f:
+ content = yaml.safe_load(f)
+ except IOError:
+ logger.exception("ac conf file %s not exist", conf_file)
+ return
+ except YAMLError:
+ logger.exception("illegal conf file format")
+ return
+
+ elements = content.get(community, {})
+ logger.debug("community \"%s\" conf: %s", community, elements)
+ for name in elements:
+ if name in self._ac_check_elements:
+ if elements[name].get("exclude"):
+ logger.debug("exclude: %s", name)
+ self._ac_check_elements.pop(name)
+ else:
+ self._ac_check_elements[name] = elements[name]
+
+ def save(self, ac_file):
+ """
+ save result
+ :param ac_file:
+ :return:
+ """
+ logger.debug("save ac result to file %s", ac_file)
+ with open(ac_file, "w") as f:
+ f.write("ACL={}".format(json.dumps(self._ac_check_result)))
+
+
+def init_args():
+ """
+ init args
+ :return:
+ """
+ parser = argparse.ArgumentParser()
+
+ parser.add_argument("-c", type=str, dest="community", default="src-openeuler", help="src-openeuler or openeuler")
+ parser.add_argument("-w", type=str, dest="workspace", help="workspace where to find source")
+ parser.add_argument("-r", type=str, dest="repo", help="repo name")
+ parser.add_argument("-b", type=str, dest="tbranch", help="branch merge to")
+ parser.add_argument("-o", type=str, dest="output", help="output file to save result")
+ parser.add_argument("-p", type=str, dest="pr", help="pull request number")
+ parser.add_argument("-t", type=str, dest="token", help="gitee api token")
+ parser.add_argument("-a", type=str, dest="account", help="gitee account")
+
+ # dataset
+ parser.add_argument("-m", type=str, dest="comment", help="trigger comment")
+ parser.add_argument("-i", type=str, dest="comment_id", help="trigger comment id")
+ parser.add_argument("-e", type=str, dest="committer", help="committer")
+ parser.add_argument("-x", type=str, dest="pr_ctime", help="pr create time")
+ parser.add_argument("-z", type=str, dest="trigger_time", help="job trigger time")
+ parser.add_argument("-l", type=str, dest="trigger_link", help="job trigger link")
+
+ # scanoss
+ parser.add_argument("--scanoss-output", type=str, dest="scanoss_output",
+ default="scanoss_result", help="scanoss result output")
+
+ parser.add_argument("--codecheck-api-key", type=str, dest="codecheck_api_key", help="codecheck api key")
+ parser.add_argument("--codecheck-api-url", type=str, dest="codecheck_api_url",
+ default="https://majun.osinfra.cn:8384/api/openlibing/codecheck", help="codecheck api url")
+
+ parser.add_argument("--jenkins-base-url", type=str, dest="jenkins_base_url",
+ default="https://openeulerjenkins.osinfra.cn/", help="jenkins base url")
+    parser.add_argument("--jenkins-user", type=str, dest="jenkins_user", help="jenkins user name")
+ parser.add_argument("--jenkins-api-token", type=str, dest="jenkins_api_token", help="jenkins api token")
+
+ return parser.parse_args()
+
+
+if "__main__" == __name__:
+ args = init_args()
+
+ # init logging
+ _ = not os.path.exists("log") and os.mkdir("log")
+ logger_conf_path = os.path.realpath(os.path.join(
+ os.path.dirname(os.path.realpath(__file__)), "../../conf/logger.conf"))
+ logging.config.fileConfig(logger_conf_path)
+ logger = logging.getLogger("ac")
+
+ logger.info("using credential %s", args.account.split(":")[0])
+ logger.info("cloning repository https://gitee.com/%s/%s.git ", args.community, args.repo)
+ logger.info("clone depth 4")
+ logger.info("checking out pull request %s", args.pr)
+
+ dd = DistDataset()
+ dd.set_attr_stime("access_control.job.stime")
+
+ # info from args
+ dd.set_attr("id", args.comment_id)
+ dd.set_attr("pull_request.package", args.repo)
+ dd.set_attr("pull_request.number", args.pr)
+ dd.set_attr("pull_request.author", args.committer)
+ dd.set_attr("pull_request.target_branch", args.tbranch)
+ dd.set_attr("pull_request.ctime", args.pr_ctime)
+ dd.set_attr("access_control.trigger.link", args.trigger_link)
+ dd.set_attr("access_control.trigger.reason", args.comment)
+ ctime = datetime.datetime.strptime(args.trigger_time.split("+")[0], "%Y-%m-%dT%H:%M:%S")
+ dd.set_attr_ctime("access_control.job.ctime", ctime)
+
+ # suppress python warning
+ warnings.filterwarnings("ignore")
+ logging.getLogger("elasticsearch").setLevel(logging.WARNING)
+ logging.getLogger("kafka").setLevel(logging.WARNING)
+
+ kp = KafkaProducerProxy(brokers=os.environ["KAFKAURL"].split(","))
+
+ # download repo
+ dd.set_attr_stime("access_control.scm.stime")
+ git_proxy = GitProxy.init_repository(args.repo, work_dir=args.workspace)
+ repo_url = "https://{}@gitee.com/{}/{}.git".format(args.account, args.community, args.repo)
+ if not git_proxy.fetch_pull_request(repo_url, args.pr, depth=4):
+ dd.set_attr("access_control.scm.result", "failed")
+ dd.set_attr_etime("access_control.scm.etime")
+
+ dd.set_attr_etime("access_control.job.etime")
+ kp.send("openeuler_statewall_ci_ac", key=args.comment_id, value=dd.to_dict())
+ logger.info("fetch finished -")
+ sys.exit(-1)
+ else:
+ git_proxy.checkout_to_commit_force("pull/{}/MERGE".format(args.pr))
+ logger.info("fetch finished +")
+ dd.set_attr("access_control.scm.result", "successful")
+ dd.set_attr_etime("access_control.scm.etime")
+
+ logger.info("--------------------AC START---------------------")
+
+ # build start
+ dd.set_attr_stime("access_control.build.stime")
+
+ # gitee comment jenkins url
+ gitee_proxy_inst = GiteeProxy(args.community, args.repo, args.token)
+ if all([args.jenkins_base_url, args.jenkins_user, args.jenkins_api_token]):
+ jenkins_proxy_inst = JenkinsProxy(args.jenkins_base_url, args.jenkins_user, args.jenkins_api_token)
+ AC.comment_jenkins_url(gitee_proxy_inst, jenkins_proxy_inst, args.pr)
+
+ # gitee pr tag
+ gitee_proxy_inst.delete_tag_of_pr(args.pr, "ci_successful")
+ gitee_proxy_inst.delete_tag_of_pr(args.pr, "ci_failed")
+ gitee_proxy_inst.create_tags_of_pr(args.pr, "ci_processing")
+
+ # scanoss conf
+ scanoss = {"output": args.scanoss_output}
+
+ codecheck = {"pr_url": "https://gitee.com/{}/{}/pulls/{}".format(args.community, args.repo, args.pr),
+ "pr_number": args.pr, "codecheck_api_url": args.codecheck_api_url,
+ "codecheck_api_key": args.codecheck_api_key}
+
+ # build
+ ac = AC(os.path.join(os.path.dirname(os.path.realpath(__file__)), "ac.yaml"), args.community)
+ ac.check_all(workspace=args.workspace, repo=args.repo, dataset=dd, tbranch=args.tbranch, scanoss=scanoss,
+ codecheck=codecheck)
+ dd.set_attr_etime("access_control.build.etime")
+ ac.save(args.output)
+
+ dd.set_attr_etime("access_control.job.etime")
+ kp.send("openeuler_statewall_ci_ac", key=args.comment_id, value=dd.to_dict())
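+
+# A hypothetical invocation for reference (all values are placeholders; the KAFKAURL environment
+# variable must point at reachable brokers, and the dataset arguments such as -i/-z are required):
+#   KAFKAURL=broker:9092 python3 src/ac/framework/ac.py -c src-openeuler -w /tmp/workspace \
+#       -r some-repo -p 123 -b master -t <gitee-token> -a <user:password> -o ac.out \
+#       -i 10001 -z 2023-03-23T10:50:00+08:00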
diff --git a/src/ac/framework/ac.yaml b/src/ac/framework/ac.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..1d4599c95207fc5f3171848819683bbee2a8bd25
--- /dev/null
+++ b/src/ac/framework/ac.yaml
@@ -0,0 +1,80 @@
+src-oepkgs:
+ spec:
+ hint: check_spec_file
+ module: spec.check_spec
+ entry: CheckSpec
+ ignored: ["homepage"]
+ code:
+ hint: check_code_style
+ module: code.check_code_style
+ entry: CheckCodeStyle
+ exclude: True
+ ignored: ["patch"]
+ package_yaml:
+ hint: check_package_yaml_file
+ module: package_yaml.check_yaml
+ entry: CheckPackageYaml
+ ignored: ["fields"]
+ package_license:
+ hint: check_package_license
+ module: package_license.check_license
+ entry: CheckLicense
+ binary:
+ hint: check_binary_file
+ module: binary.check_binary_file
+ entry: CheckBinaryFile
+ source_consistency:
+ hint: check_consistency
+ module: source_consistency.check_consistency
+ entry: CheckSourceConsistency
+ npmbuild:
+ hint: check_build
+ module: npmbuild.check_build
+ entry: CheckBuild
+ sca:
+ exclude: True
+ openlibing:
+ exclude: True
+ commit_msg:
+ exclude: True
+# source_consistency:
+# exclude: True
+#oepkgs:
+# spec:
+# exclude: True
+# code:
+# exclude: True
+# package_yaml:
+# exclude: True
+# package_license:
+# hint: check_package_license
+# module: package_license.check_openeuler_license
+# entry: CheckOpeneulerLicense
+# binary:
+# exclude: True
+# sca:
+# hint: check_sca
+# module: sca.check_sca
+# entry: CheckSCA
+# deny_list: ["bishengjdk-17", "bishengjdk-8", "bishengjdk-11", "bishengjdk-riscv", "gcc", "kernel"]
+# openlibing:
+# hint: code
+# module: openlibing.check_code
+# entry: CheckCode
+# allow_list: ["pkgship", "kunpengsecl", "release-tools", "yocto-meta-openeuler", "yocto-embedded-tools",
+# "gcc", "gcc-anti-sca", "A-Ops", "openeuler-jenkins", "lcr", "eggo", "oecp", "etmem", "A-Tune",
+# "libkae", "KubeOS", "ci-bot", "iSulad", "gazelle", "clibcni", "secGear", "eulerfs", "oemaker",
+# "go-gitee", "secpaver", "pyporter", "radiaTest", "stratovirt", "iSulad-img", "kae_driver",
+# "isula-build", "cve-manager", "attest-tools", "oec-hardware", "itrustee_sdk", "wisdom-advisor",
+# "isula-transform", " itrustee_client", "A-Tune-Collector", "itrustee_tzdriver", "website-v2",
+# "yocto-poky", "bishengjdk-17", "bishengjdk-8", "bishengjdk-11", "bishengjdk-riscv", "powerapi",
+# "eagle", "dcs", "astream", "QARobot", "oec-application", "gala-gopher", "gala-anteater",
+# "gala-spider", "gala-ragdoll", "aops-zeus", "aops-ceres", "aops-apollo", "aops-diana", "aops-hermes",
+# "aops-vulcanus"]
+# commit_msg:
+# hint: commit_msg
+# module: commit_msg.check_commit_msg
+# entry: CheckCommitMsg
+# allow_list: ["yocto-meta-openeuler"]
+# source_consistency:
+# exclude: True
\ No newline at end of file
diff --git a/src/ac/framework/ac_base.py b/src/ac/framework/ac_base.py
new file mode 100644
index 0000000000000000000000000000000000000000..2fa6dcf38ce6bb1240c957662dd1342aaff6f222
--- /dev/null
+++ b/src/ac/framework/ac_base.py
@@ -0,0 +1,97 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: access control list base class
+# **********************************************************************************
+"""
+
+import abc
+import inspect
+import logging
+import os
+
+from src.ac.framework.ac_result import SUCCESS, WARNING, FAILED
+
+logger = logging.getLogger("ac")
+
+
+class BaseCheck(object, metaclass=abc.ABCMeta):
+    """
+    acl check base class
+    """
+
+ def __init__(self, workspace, repo, conf=None):
+ """
+
+ :param repo:
+ :param workspace:
+ :param conf:
+ """
+ self._repo = repo
+ self._workspace = workspace
+ self._conf = conf
+
+ self._work_dir = os.path.join(workspace, repo)
+
+ @abc.abstractmethod
+ def __call__(self, *args, **kwargs):
+ raise NotImplementedError("subclasses must override __call__!")
+
+ def start_check_with_order(self, *items):
+ """
+        Run the checks in the order given by items
+ """
+ result = SUCCESS
+ for name in items:
+ try:
+ logger.debug("check %s", name)
+ method = getattr(self, "check_{}".format(name))
+ rs = method()
+ # logger.info(rs)
+ logger.debug("%s -> %s", name, rs)
+ except Exception as e:
+                # ignore internal coding errors and keep checking the remaining items
+ logger.exception("internal error: %s", e)
+ continue
+
+ ignored = True if self._conf and name in self._conf.get("ignored", []) else False
+ logger.debug("%s ignore: %s", name, ignored)
+
+ if rs is SUCCESS:
+ logger.info("check %s pass", name)
+ elif rs is WARNING:
+ logger.warning("check %s warning %s", name, " [ignored]" if ignored else "")
+ elif rs is FAILED:
+ logger.error("check %s fail %s", name, " [ignored]" if ignored else "")
+ else:
+ # never here
+ logger.exception("check %s exception %s", name, " [ignored]" if ignored else "")
+ continue
+
+ if not ignored:
+ result += rs
+
+ return result
+
+ def start_check(self):
+ """
+        Run every method whose name starts with check_
+ """
+ members = inspect.getmembers(self, inspect.ismethod)
+ items = [member[0].replace("check_", "") for member in members if member[0].startswith("check_")]
+ logger.debug("check items: %s", items)
+
+ return self.start_check_with_order(*items)
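+
+# A minimal sketch of a concrete check (hypothetical names; real checks live under src/ac/acl):
+#
+#     class CheckExample(BaseCheck):
+#         def check_something(self):
+#             return SUCCESS              # or WARNING / FAILED
+#
+#         def __call__(self, *args, **kwargs):
+#             return self.start_check()   # runs every check_* method defined on the class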
diff --git a/src/ac/framework/ac_result.py b/src/ac/framework/ac_result.py
new file mode 100644
index 0000000000000000000000000000000000000000..b6cce84000a4f46ec7c84cff0852ad8a8e5ce902
--- /dev/null
+++ b/src/ac/framework/ac_result.py
@@ -0,0 +1,72 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: access control check result
+# **********************************************************************************
+"""
+
+
+class ACResult(object):
+ """
+    Use the module-level instances (FAILED, WARNING, SUCCESS, EXCLUDE) most of the time,
+    and don't create a new ACResult unless you have specific needs.
+ """
+ def __init__(self, val):
+ self._val = val
+
+ def __add__(self, other):
+ return self if self.val >= other.val else other
+
+ def __str__(self):
+ return self.hint
+
+ def __repr__(self):
+ return self.__str__()
+
+ @classmethod
+ def get_instance(cls, val):
+ """
+
+ :param val: 0/1/2/3/True/False/success/fail/warn
+ :return: instance of ACResult
+ """
+        if isinstance(val, bool):   # bool must be checked before int: bool is a subclass of int
+            return SUCCESS if val else FAILED
+        if isinstance(val, int):
+            return {0: SUCCESS, 1: WARNING, 2: FAILED, 3: EXCLUDE}.get(val)
+
+ try:
+ val = int(val)
+ return {0: SUCCESS, 1: WARNING, 2: FAILED, 3: EXCLUDE}.get(val)
+ except ValueError:
+ return {"success": SUCCESS, "fail": FAILED, "failed": FAILED, "failure": FAILED, "exclude": EXCLUDE,
+ "warn": WARNING, "warning": WARNING}.get(val.lower(), FAILED)
+
+ @property
+ def val(self):
+ return self._val
+
+ @property
+ def hint(self):
+ return ["SUCCESS", "WARNING", "FAILED", "EXCLUDE"][self.val]
+
+ @property
+ def emoji(self):
+ return [":white_check_mark:", ":bug:", ":x:", ":ballot_box_with_check:"][self.val]
+
+
+EXCLUDE = ACResult(3)
+FAILED = ACResult(2)
+WARNING = ACResult(1)
+SUCCESS = ACResult(0)
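+
+# How results combine (follows directly from __add__ above, which keeps the worse value):
+#     SUCCESS + WARNING             is WARNING
+#     WARNING + FAILED              is FAILED
+#     ACResult.get_instance(True)   is SUCCESS
+#     ACResult.get_instance("warn") is WARNING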
diff --git a/src/ac/framework/acfile b/src/ac/framework/acfile
new file mode 100644
index 0000000000000000000000000000000000000000..eb655f0c4e25bcaba93a199f674b78e74b41ba7d
--- /dev/null
+++ b/src/ac/framework/acfile
@@ -0,0 +1 @@
+ACL=[]
\ No newline at end of file
diff --git a/src/ac/framework/bmp.log b/src/ac/framework/bmp.log
new file mode 100644
index 0000000000000000000000000000000000000000..8f4ba44d6b8ecf5c11352dd7b0eeae5a2992b9ea
Binary files /dev/null and b/src/ac/framework/bmp.log differ
diff --git a/src/ac/framework/log/ac.log b/src/ac/framework/log/ac.log
new file mode 100644
index 0000000000000000000000000000000000000000..5d8f32e0989b93931a3ee1f8b827509a283dc64c
--- /dev/null
+++ b/src/ac/framework/log/ac.log
@@ -0,0 +1,936 @@
+2023-03-23 10:50:42,967 test.py[line:278] INFO : --------------------AC START---------------------
+2023-03-23 10:50:43,692 test.py[line:163] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:50:43,692 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:50:43,692 test.py[line: 92] DEBUG : check binary
+2023-03-23 10:50:43,693 test.py[line:106] ERROR : import module binary.check_binary exception, No module named 'src.ac.acl.binary.check_binary'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.binary.check_binary'
+2023-03-23 10:50:43,701 test.py[line: 92] DEBUG : check code
+2023-03-23 10:50:43,702 test.py[line:106] ERROR : import module code.check_code exception, No module named 'src.ac.acl.code.check_code'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.code.check_code'
+2023-03-23 10:50:43,702 test.py[line: 92] DEBUG : check commit_msg
+2023-03-23 10:50:43,705 test.py[line:104] DEBUG : load module commit_msg.check_commit_msg succeed
+2023-03-23 10:50:43,705 test.py[line:115] WARNING : entry "CheckCommit_msg" not exist in module commit_msg.check_commit_msg, module 'src.ac.acl.commit_msg.check_commit_msg' has no attribute 'CheckCommit_msg'
+2023-03-23 10:50:43,705 test.py[line: 92] DEBUG : check openlibing
+2023-03-23 10:50:43,706 test.py[line:106] ERROR : import module openlibing.check_openlibing exception, No module named 'src.ac.acl.openlibing.check_openlibing'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.openlibing.check_openlibing'
+2023-03-23 10:50:43,706 test.py[line: 92] DEBUG : check package_license
+2023-03-23 10:50:43,707 test.py[line:106] ERROR : import module package_license.check_package_license exception, No module named 'src.ac.acl.package_license.check_package_license'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.package_license.check_package_license'
+2023-03-23 10:50:43,707 test.py[line: 92] DEBUG : check package_yaml
+2023-03-23 10:50:43,708 test.py[line:106] ERROR : import module package_yaml.check_package_yaml exception, No module named 'src.ac.acl.package_yaml.check_package_yaml'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.package_yaml.check_package_yaml'
+2023-03-23 10:50:43,713 test.py[line: 92] DEBUG : check sca
+2023-03-23 10:50:43,716 test.py[line:104] DEBUG : load module sca.check_sca succeed
+2023-03-23 10:50:43,716 test.py[line:115] WARNING : entry "CheckSca" not exist in module sca.check_sca, module 'src.ac.acl.sca.check_sca' has no attribute 'CheckSca'
+2023-03-23 10:50:43,716 test.py[line: 92] DEBUG : check source_consistency
+2023-03-23 10:50:43,717 test.py[line:106] ERROR : import module source_consistency.check_source_consistency exception, No module named 'src.ac.acl.source_consistency.check_source_consistency'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.source_consistency.check_source_consistency'
+2023-03-23 10:50:43,717 test.py[line: 92] DEBUG : check spec
+2023-03-23 10:50:43,723 test.py[line:104] DEBUG : load module spec.check_spec succeed
+2023-03-23 10:50:43,723 test.py[line:113] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 10:51:31,091 test.py[line:278] INFO : --------------------AC START---------------------
+2023-03-23 10:51:31,108 test.py[line:163] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:51:31,108 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:51:31,108 test.py[line: 92] DEBUG : check binary
+2023-03-23 10:51:31,109 test.py[line:106] ERROR : import module binary.check_binary exception, No module named 'src.ac.acl.binary.check_binary'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.binary.check_binary'
+2023-03-23 10:51:31,109 test.py[line: 92] DEBUG : check code
+2023-03-23 10:51:31,109 test.py[line:106] ERROR : import module code.check_code exception, No module named 'src.ac.acl.code.check_code'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.code.check_code'
+2023-03-23 10:51:31,109 test.py[line: 92] DEBUG : check commit_msg
+2023-03-23 10:51:31,109 test.py[line:104] DEBUG : load module commit_msg.check_commit_msg succeed
+2023-03-23 10:51:31,109 test.py[line:115] WARNING : entry "CheckCommit_msg" not exist in module commit_msg.check_commit_msg, module 'src.ac.acl.commit_msg.check_commit_msg' has no attribute 'CheckCommit_msg'
+2023-03-23 10:51:31,109 test.py[line: 92] DEBUG : check openlibing
+2023-03-23 10:51:31,109 test.py[line:106] ERROR : import module openlibing.check_openlibing exception, No module named 'src.ac.acl.openlibing.check_openlibing'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.openlibing.check_openlibing'
+2023-03-23 10:51:31,109 test.py[line: 92] DEBUG : check package_license
+2023-03-23 10:51:31,109 test.py[line:106] ERROR : import module package_license.check_package_license exception, No module named 'src.ac.acl.package_license.check_package_license'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.package_license.check_package_license'
+2023-03-23 10:51:31,109 test.py[line: 92] DEBUG : check package_yaml
+2023-03-23 10:51:31,109 test.py[line:106] ERROR : import module package_yaml.check_package_yaml exception, No module named 'src.ac.acl.package_yaml.check_package_yaml'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.package_yaml.check_package_yaml'
+2023-03-23 10:51:31,126 test.py[line: 92] DEBUG : check sca
+2023-03-23 10:51:31,128 test.py[line:104] DEBUG : load module sca.check_sca succeed
+2023-03-23 10:51:31,128 test.py[line:115] WARNING : entry "CheckSca" not exist in module sca.check_sca, module 'src.ac.acl.sca.check_sca' has no attribute 'CheckSca'
+2023-03-23 10:51:31,128 test.py[line: 92] DEBUG : check source_consistency
+2023-03-23 10:51:31,128 test.py[line:106] ERROR : import module source_consistency.check_source_consistency exception, No module named 'src.ac.acl.source_consistency.check_source_consistency'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.source_consistency.check_source_consistency'
+2023-03-23 10:51:31,128 test.py[line: 92] DEBUG : check spec
+2023-03-23 10:51:31,134 test.py[line:104] DEBUG : load module spec.check_spec succeed
+2023-03-23 10:51:31,134 test.py[line:113] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 10:52:08,886 test.py[line:278] INFO : --------------------AC START---------------------
+2023-03-23 10:52:08,889 test.py[line:163] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:52:08,889 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:52:08,889 test.py[line: 92] DEBUG : check binary
+2023-03-23 10:52:08,889 test.py[line:106] ERROR : import module binary.check_binary exception, No module named 'src.ac.acl.binary.check_binary'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.binary.check_binary'
+2023-03-23 10:52:08,889 test.py[line: 92] DEBUG : check code
+2023-03-23 10:52:08,889 test.py[line:106] ERROR : import module code.check_code exception, No module named 'src.ac.acl.code.check_code'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.code.check_code'
+2023-03-23 10:52:08,889 test.py[line: 92] DEBUG : check commit_msg
+2023-03-23 10:52:08,889 test.py[line:104] DEBUG : load module commit_msg.check_commit_msg succeed
+2023-03-23 10:52:08,889 test.py[line:115] WARNING : entry "CheckCommit_msg" not exist in module commit_msg.check_commit_msg, module 'src.ac.acl.commit_msg.check_commit_msg' has no attribute 'CheckCommit_msg'
+2023-03-23 10:52:08,889 test.py[line: 92] DEBUG : check openlibing
+2023-03-23 10:52:08,889 test.py[line:106] ERROR : import module openlibing.check_openlibing exception, No module named 'src.ac.acl.openlibing.check_openlibing'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.openlibing.check_openlibing'
+2023-03-23 10:52:08,889 test.py[line: 92] DEBUG : check package_license
+2023-03-23 10:52:08,903 test.py[line:106] ERROR : import module package_license.check_package_license exception, No module named 'src.ac.acl.package_license.check_package_license'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.package_license.check_package_license'
+2023-03-23 10:52:08,904 test.py[line: 92] DEBUG : check package_yaml
+2023-03-23 10:52:08,905 test.py[line:106] ERROR : import module package_yaml.check_package_yaml exception, No module named 'src.ac.acl.package_yaml.check_package_yaml'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.package_yaml.check_package_yaml'
+2023-03-23 10:52:08,905 test.py[line: 92] DEBUG : check sca
+2023-03-23 10:52:08,905 test.py[line:104] DEBUG : load module sca.check_sca succeed
+2023-03-23 10:52:08,905 test.py[line:115] WARNING : entry "CheckSca" not exist in module sca.check_sca, module 'src.ac.acl.sca.check_sca' has no attribute 'CheckSca'
+2023-03-23 10:52:08,905 test.py[line: 92] DEBUG : check source_consistency
+2023-03-23 10:52:08,905 test.py[line:106] ERROR : import module source_consistency.check_source_consistency exception, No module named 'src.ac.acl.source_consistency.check_source_consistency'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.source_consistency.check_source_consistency'
+2023-03-23 10:52:08,905 test.py[line: 92] DEBUG : check spec
+2023-03-23 10:52:08,914 test.py[line:104] DEBUG : load module spec.check_spec succeed
+2023-03-23 10:52:08,914 test.py[line:113] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 10:53:40,655 test.py[line:278] INFO : --------------------AC START---------------------
+2023-03-23 10:53:40,671 test.py[line:163] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:53:40,671 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:53:40,671 test.py[line: 92] DEBUG : check binary
+2023-03-23 10:53:40,672 test.py[line:106] ERROR : import module binary.check_binary exception, No module named 'src.ac.acl.binary.check_binary'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.binary.check_binary'
+2023-03-23 10:53:40,672 test.py[line: 92] DEBUG : check code
+2023-03-23 10:53:40,672 test.py[line:106] ERROR : import module code.check_code exception, No module named 'src.ac.acl.code.check_code'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.code.check_code'
+2023-03-23 10:53:40,672 test.py[line: 92] DEBUG : check commit_msg
+2023-03-23 10:53:40,672 test.py[line:104] DEBUG : load module commit_msg.check_commit_msg succeed
+2023-03-23 10:53:40,672 test.py[line:115] WARNING : entry "CheckCommit_msg" not exist in module commit_msg.check_commit_msg, module 'src.ac.acl.commit_msg.check_commit_msg' has no attribute 'CheckCommit_msg'
+2023-03-23 10:53:40,672 test.py[line: 92] DEBUG : check openlibing
+2023-03-23 10:53:40,672 test.py[line:106] ERROR : import module openlibing.check_openlibing exception, No module named 'src.ac.acl.openlibing.check_openlibing'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.openlibing.check_openlibing'
+2023-03-23 10:53:40,672 test.py[line: 92] DEBUG : check package_license
+2023-03-23 10:53:40,672 test.py[line:106] ERROR : import module package_license.check_package_license exception, No module named 'src.ac.acl.package_license.check_package_license'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.package_license.check_package_license'
+2023-03-23 10:53:40,672 test.py[line: 92] DEBUG : check package_yaml
+2023-03-23 10:53:40,672 test.py[line:106] ERROR : import module package_yaml.check_package_yaml exception, No module named 'src.ac.acl.package_yaml.check_package_yaml'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.package_yaml.check_package_yaml'
+2023-03-23 10:53:40,686 test.py[line: 92] DEBUG : check sca
+2023-03-23 10:53:40,688 test.py[line:104] DEBUG : load module sca.check_sca succeed
+2023-03-23 10:53:40,688 test.py[line:115] WARNING : entry "CheckSca" not exist in module sca.check_sca, module 'src.ac.acl.sca.check_sca' has no attribute 'CheckSca'
+2023-03-23 10:53:40,688 test.py[line: 92] DEBUG : check source_consistency
+2023-03-23 10:53:40,689 test.py[line:106] ERROR : import module source_consistency.check_source_consistency exception, No module named 'src.ac.acl.source_consistency.check_source_consistency'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 103, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.source_consistency.check_source_consistency'
+2023-03-23 10:53:40,689 test.py[line: 92] DEBUG : check spec
+2023-03-23 10:53:40,690 test.py[line:104] DEBUG : load module spec.check_spec succeed
+2023-03-23 10:53:40,690 test.py[line:113] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 10:53:59,059 test.py[line:278] INFO : --------------------AC START---------------------
+2023-03-23 10:53:59,059 test.py[line:163] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:53:59,059 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:53:59,059 test.py[line: 92] DEBUG : check binary
+2023-03-23 10:53:59,059 test.py[line: 92] DEBUG : check code
+2023-03-23 10:53:59,059 test.py[line: 92] DEBUG : check commit_msg
+2023-03-23 10:53:59,059 test.py[line: 92] DEBUG : check openlibing
+2023-03-23 10:53:59,059 test.py[line: 92] DEBUG : check package_license
+2023-03-23 10:53:59,059 test.py[line: 92] DEBUG : check package_yaml
+2023-03-23 10:53:59,059 test.py[line: 92] DEBUG : check sca
+2023-03-23 10:53:59,059 test.py[line: 92] DEBUG : check source_consistency
+2023-03-23 10:53:59,059 test.py[line: 92] DEBUG : check spec
+2023-03-23 10:53:59,082 test.py[line:178] DEBUG : save ac result to file None
+2023-03-23 10:54:16,646 test.py[line:278] INFO : --------------------AC START---------------------
+2023-03-23 10:54:16,652 test.py[line:163] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:54:16,652 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:54:16,652 test.py[line: 92] DEBUG : check binary
+2023-03-23 10:54:16,652 test.py[line: 92] DEBUG : check code
+2023-03-23 10:54:16,652 test.py[line: 92] DEBUG : check commit_msg
+2023-03-23 10:54:16,652 test.py[line: 92] DEBUG : check openlibing
+2023-03-23 10:54:16,652 test.py[line: 92] DEBUG : check package_license
+2023-03-23 10:54:16,652 test.py[line: 92] DEBUG : check package_yaml
+2023-03-23 10:54:16,652 test.py[line: 92] DEBUG : check sca
+2023-03-23 10:54:16,652 test.py[line: 92] DEBUG : check source_consistency
+2023-03-23 10:54:16,652 test.py[line: 92] DEBUG : check spec
+2023-03-23 10:55:00,966 test.py[line:279] INFO : --------------------AC START---------------------
+2023-03-23 10:55:00,972 test.py[line:164] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:55:00,972 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:55:00,972 test.py[line: 93] DEBUG : check binary
+2023-03-23 10:55:00,972 test.py[line: 93] DEBUG : check code
+2023-03-23 10:55:00,972 test.py[line: 93] DEBUG : check commit_msg
+2023-03-23 10:55:00,972 test.py[line: 93] DEBUG : check openlibing
+2023-03-23 10:55:00,977 test.py[line: 93] DEBUG : check package_license
+2023-03-23 10:55:00,977 test.py[line: 93] DEBUG : check package_yaml
+2023-03-23 10:55:00,977 test.py[line: 93] DEBUG : check sca
+2023-03-23 10:55:00,977 test.py[line: 93] DEBUG : check source_consistency
+2023-03-23 10:55:00,977 test.py[line: 93] DEBUG : check spec
+2023-03-23 10:55:46,626 test.py[line:280] INFO : --------------------AC START---------------------
+2023-03-23 10:55:46,631 test.py[line:165] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:55:46,631 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:55:46,631 test.py[line: 93] DEBUG : check binary
+2023-03-23 10:55:46,631 test.py[line: 93] DEBUG : check code
+2023-03-23 10:55:46,631 test.py[line: 93] DEBUG : check commit_msg
+2023-03-23 10:55:46,631 test.py[line: 93] DEBUG : check openlibing
+2023-03-23 10:55:46,631 test.py[line: 93] DEBUG : check package_license
+2023-03-23 10:55:46,631 test.py[line: 93] DEBUG : check package_yaml
+2023-03-23 10:55:46,631 test.py[line: 93] DEBUG : check sca
+2023-03-23 10:55:46,631 test.py[line: 93] DEBUG : check source_consistency
+2023-03-23 10:55:46,631 test.py[line: 93] DEBUG : check spec
+2023-03-23 10:56:13,084 test.py[line:281] INFO : --------------------AC START---------------------
+2023-03-23 10:56:13,086 test.py[line:166] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:56:13,086 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:56:13,086 test.py[line: 94] DEBUG : check binary
+2023-03-23 10:56:13,086 test.py[line: 94] DEBUG : check code
+2023-03-23 10:56:13,086 test.py[line: 94] DEBUG : check commit_msg
+2023-03-23 10:56:13,086 test.py[line: 94] DEBUG : check openlibing
+2023-03-23 10:56:13,086 test.py[line: 94] DEBUG : check package_license
+2023-03-23 10:56:13,086 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 10:56:13,086 test.py[line: 94] DEBUG : check sca
+2023-03-23 10:56:13,086 test.py[line: 94] DEBUG : check source_consistency
+2023-03-23 10:56:13,086 test.py[line: 94] DEBUG : check spec
+2023-03-23 10:56:26,456 test.py[line:281] INFO : --------------------AC START---------------------
+2023-03-23 10:56:26,457 test.py[line:166] DEBUG : community "src-openeuler" conf: {}
+2023-03-23 10:56:26,457 test.py[line: 57] DEBUG : check list: {'binary': {}, 'code': {}, 'commit_msg': {}, 'openlibing': {}, 'package_license': {}, 'package_yaml': {}, 'sca': {}, 'source_consistency': {}, 'spec': {}}
+2023-03-23 10:56:26,457 test.py[line: 94] DEBUG : check binary
+2023-03-23 10:56:26,457 test.py[line: 94] DEBUG : check code
+2023-03-23 10:56:26,457 test.py[line: 94] DEBUG : check commit_msg
+2023-03-23 10:56:26,457 test.py[line: 94] DEBUG : check openlibing
+2023-03-23 10:56:26,457 test.py[line: 94] DEBUG : check package_license
+2023-03-23 10:56:26,457 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 10:56:26,457 test.py[line: 94] DEBUG : check sca
+2023-03-23 10:56:26,457 test.py[line: 94] DEBUG : check source_consistency
+2023-03-23 10:56:26,457 test.py[line: 94] DEBUG : check spec
+2023-03-23 10:58:21,716 test.py[line:281] INFO : --------------------AC START---------------------
+2023-03-23 10:58:21,716 test.py[line:166] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 10:58:21,716 test.py[line:170] DEBUG : exclude: code
+2023-03-23 10:58:21,716 test.py[line:170] DEBUG : exclude: sca
+2023-03-23 10:58:21,716 test.py[line:170] DEBUG : exclude: openlibing
+2023-03-23 10:58:21,716 test.py[line:170] DEBUG : exclude: commit_msg
+2023-03-23 10:58:21,716 test.py[line:170] DEBUG : exclude: source_consistency
+2023-03-23 10:58:21,716 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 10:58:21,716 test.py[line: 94] DEBUG : check binary
+2023-03-23 10:58:21,716 test.py[line: 94] DEBUG : check package_license
+2023-03-23 10:58:21,716 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 10:58:21,716 test.py[line: 94] DEBUG : check spec
+2023-03-23 11:27:36,852 test.py[line:281] INFO : --------------------AC START---------------------
+2023-03-23 11:27:36,854 test.py[line:166] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:27:36,859 test.py[line:170] DEBUG : exclude: code
+2023-03-23 11:27:36,859 test.py[line:170] DEBUG : exclude: sca
+2023-03-23 11:27:36,859 test.py[line:170] DEBUG : exclude: openlibing
+2023-03-23 11:27:36,859 test.py[line:170] DEBUG : exclude: commit_msg
+2023-03-23 11:27:36,859 test.py[line:170] DEBUG : exclude: source_consistency
+2023-03-23 11:27:36,859 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:27:36,859 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:27:36,859 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:27:36,859 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 11:27:36,859 test.py[line: 94] DEBUG : check spec
+2023-03-23 11:28:35,960 test.py[line:281] INFO : --------------------AC START---------------------
+2023-03-23 11:28:35,960 test.py[line:166] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:28:35,960 test.py[line:170] DEBUG : exclude: code
+2023-03-23 11:28:35,960 test.py[line:170] DEBUG : exclude: sca
+2023-03-23 11:28:35,960 test.py[line:170] DEBUG : exclude: openlibing
+2023-03-23 11:28:35,960 test.py[line:170] DEBUG : exclude: commit_msg
+2023-03-23 11:28:35,960 test.py[line:170] DEBUG : exclude: source_consistency
+2023-03-23 11:28:35,960 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:28:35,960 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:28:35,974 test.py[line:107] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 11:28:35,974 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:28:35,981 test.py[line:107] DEBUG : load module package_license.check_license succeed
+2023-03-23 11:28:35,981 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 11:28:35,984 test.py[line:109] ERROR : import module package_yaml.check_yaml exception, No module named 'tldextract'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 106, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 975, in _find_and_load_unlocked
+ File "", line 671, in _load_unlocked
+ File "", line 783, in exec_module
+ File "", line 219, in _call_with_frames_removed
+ File "D:\pythonProject\src-oepkgs\src\ac\acl\package_yaml\check_yaml.py", line 28, in
+ from src.ac.acl.package_yaml.check_repo import ReleaseTagsFactory
+ File "D:\pythonProject\src-oepkgs\src\ac\acl\package_yaml\check_repo.py", line 32, in
+ import tldextract
+ModuleNotFoundError: No module named 'tldextract'
+2023-03-23 11:28:35,986 test.py[line: 94] DEBUG : check spec
+2023-03-23 11:28:35,987 test.py[line:107] DEBUG : load module spec.check_spec succeed
+2023-03-23 11:30:24,637 test.py[line:281] INFO : --------------------AC START---------------------
+2023-03-23 11:30:24,643 test.py[line:166] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:30:24,643 test.py[line:170] DEBUG : exclude: code
+2023-03-23 11:30:24,643 test.py[line:170] DEBUG : exclude: sca
+2023-03-23 11:30:24,643 test.py[line:170] DEBUG : exclude: openlibing
+2023-03-23 11:30:24,643 test.py[line:170] DEBUG : exclude: commit_msg
+2023-03-23 11:30:24,643 test.py[line:170] DEBUG : exclude: source_consistency
+2023-03-23 11:30:24,643 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:30:24,643 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:30:24,653 test.py[line:107] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 11:30:24,653 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:30:24,654 test.py[line:107] DEBUG : load module package_license.check_license succeed
+2023-03-23 11:30:24,654 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 11:30:24,668 test.py[line:107] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 11:30:24,668 test.py[line: 94] DEBUG : check spec
+2023-03-23 11:30:24,668 test.py[line:107] DEBUG : load module spec.check_spec succeed
+2023-03-23 11:31:16,947 test.py[line:281] INFO : --------------------AC START---------------------
+2023-03-23 11:31:16,952 test.py[line:166] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:31:16,952 test.py[line:170] DEBUG : exclude: code
+2023-03-23 11:31:16,952 test.py[line:170] DEBUG : exclude: sca
+2023-03-23 11:31:16,952 test.py[line:170] DEBUG : exclude: openlibing
+2023-03-23 11:31:16,952 test.py[line:170] DEBUG : exclude: commit_msg
+2023-03-23 11:31:16,952 test.py[line:170] DEBUG : exclude: source_consistency
+2023-03-23 11:31:16,952 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:31:16,953 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:31:16,961 test.py[line:107] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 11:31:16,961 test.py[line:116] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 11:31:16,961 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:31:16,961 test.py[line:107] DEBUG : load module package_license.check_license succeed
+2023-03-23 11:31:16,961 test.py[line:116] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 11:31:16,961 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 11:31:16,961 test.py[line:107] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 11:31:16,961 test.py[line:116] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 11:31:16,961 test.py[line: 94] DEBUG : check spec
+2023-03-23 11:31:16,961 test.py[line:107] DEBUG : load module spec.check_spec succeed
+2023-03-23 11:31:16,961 test.py[line:116] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 11:31:33,627 test.py[line:282] INFO : --------------------AC START---------------------
+2023-03-23 11:31:33,635 test.py[line:167] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:31:33,635 test.py[line:171] DEBUG : exclude: code
+2023-03-23 11:31:33,635 test.py[line:171] DEBUG : exclude: sca
+2023-03-23 11:31:33,635 test.py[line:171] DEBUG : exclude: openlibing
+2023-03-23 11:31:33,635 test.py[line:171] DEBUG : exclude: commit_msg
+2023-03-23 11:31:33,635 test.py[line:171] DEBUG : exclude: source_consistency
+2023-03-23 11:31:33,635 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:31:33,635 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:31:33,635 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 11:31:33,635 test.py[line:117] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 11:31:33,635 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:31:33,635 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 11:31:33,635 test.py[line:117] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 11:31:33,635 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 11:31:33,657 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 11:31:33,657 test.py[line:117] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 11:31:33,657 test.py[line: 94] DEBUG : check spec
+2023-03-23 11:31:33,658 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 11:31:33,658 test.py[line:117] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 11:32:32,665 test.py[line:282] INFO : --------------------AC START---------------------
+2023-03-23 11:32:32,667 test.py[line:167] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:32:32,667 test.py[line:171] DEBUG : exclude: code
+2023-03-23 11:32:32,667 test.py[line:171] DEBUG : exclude: sca
+2023-03-23 11:32:32,667 test.py[line:171] DEBUG : exclude: openlibing
+2023-03-23 11:32:32,667 test.py[line:171] DEBUG : exclude: commit_msg
+2023-03-23 11:32:32,667 test.py[line:171] DEBUG : exclude: source_consistency
+2023-03-23 11:32:32,667 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:32:32,667 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:32:32,667 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 11:32:32,667 test.py[line:117] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 11:32:32,667 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:32:32,667 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 11:32:32,667 test.py[line:117] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 11:32:32,667 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 11:32:32,685 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 11:32:32,685 test.py[line:117] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 11:32:32,685 test.py[line: 94] DEBUG : check spec
+2023-03-23 11:32:32,685 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 11:32:32,685 test.py[line:117] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 11:32:55,986 test.py[line:283] INFO : --------------------AC START---------------------
+2023-03-23 11:32:55,988 test.py[line:168] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:32:55,988 test.py[line:172] DEBUG : exclude: code
+2023-03-23 11:32:55,988 test.py[line:172] DEBUG : exclude: sca
+2023-03-23 11:32:55,988 test.py[line:172] DEBUG : exclude: openlibing
+2023-03-23 11:32:55,988 test.py[line:172] DEBUG : exclude: commit_msg
+2023-03-23 11:32:55,988 test.py[line:172] DEBUG : exclude: source_consistency
+2023-03-23 11:32:55,988 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:32:55,988 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:32:55,988 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 11:32:55,988 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 11:32:55,988 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:32:56,004 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 11:32:56,004 test.py[line:118] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 11:32:56,004 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 11:32:56,013 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 11:32:56,013 test.py[line:118] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 11:32:56,014 test.py[line: 94] DEBUG : check spec
+2023-03-23 11:32:56,015 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 11:32:56,015 test.py[line:118] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 11:33:18,521 test.py[line:283] INFO : --------------------AC START---------------------
+2023-03-23 11:33:18,527 test.py[line:168] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:33:18,527 test.py[line:172] DEBUG : exclude: code
+2023-03-23 11:33:18,527 test.py[line:172] DEBUG : exclude: sca
+2023-03-23 11:33:18,527 test.py[line:172] DEBUG : exclude: openlibing
+2023-03-23 11:33:18,527 test.py[line:172] DEBUG : exclude: commit_msg
+2023-03-23 11:33:18,527 test.py[line:172] DEBUG : exclude: source_consistency
+2023-03-23 11:33:18,527 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:33:18,527 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:33:18,535 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 11:33:18,535 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 11:33:18,537 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:33:18,539 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 11:33:18,539 test.py[line:118] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 11:33:18,539 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 11:33:18,546 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 11:33:18,548 test.py[line:118] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 11:33:18,548 test.py[line: 94] DEBUG : check spec
+2023-03-23 11:33:18,549 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 11:33:18,549 test.py[line:118] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 11:33:35,075 test.py[line:282] INFO : --------------------AC START---------------------
+2023-03-23 11:33:35,082 test.py[line:167] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:33:35,082 test.py[line:171] DEBUG : exclude: code
+2023-03-23 11:33:35,082 test.py[line:171] DEBUG : exclude: sca
+2023-03-23 11:33:35,082 test.py[line:171] DEBUG : exclude: openlibing
+2023-03-23 11:33:35,082 test.py[line:171] DEBUG : exclude: commit_msg
+2023-03-23 11:33:35,082 test.py[line:171] DEBUG : exclude: source_consistency
+2023-03-23 11:33:35,082 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:33:35,082 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:33:35,095 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 11:33:35,095 test.py[line:117] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 11:33:49,355 test.py[line:282] INFO : --------------------AC START---------------------
+2023-03-23 11:33:49,357 test.py[line:167] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:33:49,357 test.py[line:171] DEBUG : exclude: code
+2023-03-23 11:33:49,357 test.py[line:171] DEBUG : exclude: sca
+2023-03-23 11:33:49,357 test.py[line:171] DEBUG : exclude: openlibing
+2023-03-23 11:33:49,357 test.py[line:171] DEBUG : exclude: commit_msg
+2023-03-23 11:33:49,357 test.py[line:171] DEBUG : exclude: source_consistency
+2023-03-23 11:33:49,357 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:33:49,357 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:33:49,370 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 11:33:49,371 test.py[line:117] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 11:36:10,826 test.py[line:282] INFO : --------------------AC START---------------------
+2023-03-23 11:36:10,828 test.py[line:167] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:36:10,828 test.py[line:171] DEBUG : exclude: code
+2023-03-23 11:36:10,828 test.py[line:171] DEBUG : exclude: sca
+2023-03-23 11:36:10,828 test.py[line:171] DEBUG : exclude: openlibing
+2023-03-23 11:36:10,828 test.py[line:171] DEBUG : exclude: commit_msg
+2023-03-23 11:36:10,828 test.py[line:171] DEBUG : exclude: source_consistency
+2023-03-23 11:36:10,828 test.py[line: 57] DEBUG : check list: {'binary': {}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:36:10,828 test.py[line: 94] DEBUG : check binary
+2023-03-23 11:36:10,828 test.py[line:110] ERROR : import module binary.check_binary exception, No module named 'src.ac.acl.binary.check_binary'
+Traceback (most recent call last):
+ File "D:/pythonProject/src-oepkgs/src/ac/framework/test.py", line 106, in check_all
+ module = importlib.import_module("." + module_path, self._acl_package)
+ File "D:\Anaconda\lib\importlib\__init__.py", line 127, in import_module
+ return _bootstrap._gcd_import(name[level:], package, level)
+ File "", line 1014, in _gcd_import
+ File "", line 991, in _find_and_load
+ File "", line 973, in _find_and_load_unlocked
+ModuleNotFoundError: No module named 'src.ac.acl.binary.check_binary'
+2023-03-23 11:36:10,828 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:36:10,828 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 11:36:10,828 test.py[line:117] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 11:37:09,060 test.py[line:282] INFO : --------------------AC START---------------------
+2023-03-23 11:37:09,064 test.py[line:167] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 11:37:09,064 test.py[line:171] DEBUG : exclude: code
+2023-03-23 11:37:09,064 test.py[line:171] DEBUG : exclude: sca
+2023-03-23 11:37:09,064 test.py[line:171] DEBUG : exclude: openlibing
+2023-03-23 11:37:09,064 test.py[line:171] DEBUG : exclude: commit_msg
+2023-03-23 11:37:09,064 test.py[line:171] DEBUG : exclude: source_consistency
+2023-03-23 11:37:09,064 test.py[line: 57] DEBUG : check list: {'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 11:37:09,064 test.py[line: 94] DEBUG : check package_license
+2023-03-23 11:37:09,064 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 11:37:09,064 test.py[line:117] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 14:15:42,496 test.py[line:282] INFO : --------------------AC START---------------------
+2023-03-23 14:15:42,499 test.py[line:167] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:15:42,499 test.py[line:171] DEBUG : exclude: code
+2023-03-23 14:15:42,499 test.py[line:171] DEBUG : exclude: sca
+2023-03-23 14:15:42,499 test.py[line:171] DEBUG : exclude: openlibing
+2023-03-23 14:15:42,499 test.py[line:171] DEBUG : exclude: commit_msg
+2023-03-23 14:15:42,499 test.py[line:171] DEBUG : exclude: source_consistency
+2023-03-23 14:15:42,499 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:15:42,499 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:15:42,520 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:15:42,520 test.py[line:117] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:33:53,293 test.py[line:282] INFO : --------------------AC START---------------------
+2023-03-23 14:33:53,299 test.py[line:167] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:33:53,299 test.py[line:171] DEBUG : exclude: code
+2023-03-23 14:33:53,299 test.py[line:171] DEBUG : exclude: sca
+2023-03-23 14:33:53,299 test.py[line:171] DEBUG : exclude: openlibing
+2023-03-23 14:33:53,299 test.py[line:171] DEBUG : exclude: commit_msg
+2023-03-23 14:33:53,299 test.py[line:171] DEBUG : exclude: source_consistency
+2023-03-23 14:33:53,299 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:33:53,299 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:33:53,308 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:33:53,309 test.py[line:117] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:33:53,309 test.py[line: 94] DEBUG : check package_license
+2023-03-23 14:33:53,310 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 14:33:53,310 test.py[line:117] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 14:33:53,311 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 14:33:53,318 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 14:33:53,318 test.py[line:117] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 14:33:53,318 test.py[line: 94] DEBUG : check spec
+2023-03-23 14:33:53,318 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 14:33:53,318 test.py[line:117] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 14:34:17,331 test.py[line:283] INFO : --------------------AC START---------------------
+2023-03-23 14:34:17,335 test.py[line:168] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:34:17,335 test.py[line:172] DEBUG : exclude: code
+2023-03-23 14:34:17,335 test.py[line:172] DEBUG : exclude: sca
+2023-03-23 14:34:17,335 test.py[line:172] DEBUG : exclude: openlibing
+2023-03-23 14:34:17,335 test.py[line:172] DEBUG : exclude: commit_msg
+2023-03-23 14:34:17,335 test.py[line:172] DEBUG : exclude: source_consistency
+2023-03-23 14:34:17,335 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:34:17,335 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:34:17,347 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:34:17,347 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:34:17,347 test.py[line: 94] DEBUG : check package_license
+2023-03-23 14:34:17,350 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 14:34:17,350 test.py[line:118] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 14:34:17,350 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 14:34:17,358 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 14:34:17,358 test.py[line:118] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 14:34:17,358 test.py[line: 94] DEBUG : check spec
+2023-03-23 14:34:17,359 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 14:34:17,359 test.py[line:118] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 14:35:10,456 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:35:10,457 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:35:10,457 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:35:10,457 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:35:10,457 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:35:10,457 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:35:10,457 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:35:10,457 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:35:10,457 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:35:10,457 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:35:10,457 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:35:22,273 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:35:22,274 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:35:22,274 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:35:22,274 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:35:22,274 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:35:22,274 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:35:22,274 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:35:22,274 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:35:22,274 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:35:22,274 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:35:22,274 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:35:22,274 test.py[line: 94] DEBUG : check package_license
+2023-03-23 14:35:22,290 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 14:35:22,290 test.py[line:118] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 14:35:22,290 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 14:35:22,294 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 14:35:22,294 test.py[line:118] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 14:35:22,294 test.py[line: 94] DEBUG : check spec
+2023-03-23 14:35:22,294 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 14:35:22,294 test.py[line:118] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 14:36:22,696 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:36:22,713 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:36:22,713 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:36:22,713 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:36:22,713 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:36:22,714 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:36:22,714 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:36:22,714 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:36:22,714 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:36:22,716 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:36:22,716 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:37:11,391 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:37:11,397 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:37:11,398 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:37:11,398 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:37:11,398 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:37:11,398 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:37:11,398 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:37:11,398 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:37:11,398 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:37:11,406 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:37:11,406 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:37:24,759 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:37:24,764 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:37:24,764 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:37:24,764 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:37:24,764 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:37:24,764 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:37:24,764 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:37:24,764 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:37:24,764 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:37:24,774 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:37:24,774 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:38:56,275 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:38:56,286 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:38:56,286 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:38:56,286 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:38:56,286 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:38:56,286 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:38:56,286 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:38:56,286 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:38:56,286 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:38:56,291 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:38:56,291 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:39:51,536 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:39:51,540 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:39:51,542 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:39:51,542 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:39:51,542 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:39:51,542 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:39:51,542 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:39:51,542 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:39:51,542 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:39:51,552 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:39:51,552 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:42:51,118 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:42:51,118 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:42:51,118 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:42:51,118 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:42:51,118 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:42:51,118 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:42:51,118 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:42:51,118 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:42:51,118 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:42:51,136 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:42:51,136 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:44:03,924 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:44:03,940 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:44:03,940 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:44:03,940 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:44:03,940 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:44:03,940 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:44:03,940 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:44:03,941 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:44:03,941 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:44:03,947 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:44:03,947 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:44:43,425 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:44:43,441 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:44:43,441 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:44:43,441 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:44:43,441 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:44:43,441 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:44:43,441 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:44:43,441 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:44:43,442 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:44:43,447 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:44:43,447 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:52:03,644 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:52:03,652 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:52:03,652 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:52:03,652 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:52:03,652 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:52:03,652 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:52:03,652 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:52:03,652 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:52:03,652 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:52:03,664 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:52:03,664 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:52:03,664 test.py[line: 94] DEBUG : check package_license
+2023-03-23 14:52:03,666 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 14:52:03,666 test.py[line:118] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 14:52:03,666 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 14:52:03,682 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 14:52:03,682 test.py[line:118] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 14:52:03,682 test.py[line: 94] DEBUG : check spec
+2023-03-23 14:52:03,683 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 14:52:03,683 test.py[line:118] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 14:53:09,341 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:53:09,342 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:53:09,342 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:53:09,342 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:53:09,342 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:53:09,342 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:53:09,342 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:53:09,342 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:53:09,342 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:53:09,358 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:53:09,358 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:53:50,971 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:53:50,977 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:53:50,977 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:53:50,977 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:53:50,977 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:53:50,977 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:53:50,977 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:53:50,977 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:53:50,978 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:53:50,987 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:53:50,987 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:53:50,987 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:53:50,988 check_binary_file.py[line: 98] ERROR : spec file not find
+2023-03-23 14:53:50,988 test.py[line: 94] DEBUG : check package_license
+2023-03-23 14:53:50,990 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 14:53:50,990 test.py[line:118] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 14:53:50,990 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:53:50,990 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 14:53:50,998 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 14:53:50,998 test.py[line:118] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 14:53:50,998 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:53:50,998 test.py[line: 94] DEBUG : check spec
+2023-03-23 14:53:50,998 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 14:53:50,998 test.py[line:118] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 14:53:50,998 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:54:07,456 test.py[line:284] INFO : --------------------AC START---------------------
+2023-03-23 14:54:07,465 test.py[line:169] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:54:07,465 test.py[line:173] DEBUG : exclude: code
+2023-03-23 14:54:07,465 test.py[line:173] DEBUG : exclude: sca
+2023-03-23 14:54:07,465 test.py[line:173] DEBUG : exclude: openlibing
+2023-03-23 14:54:07,465 test.py[line:173] DEBUG : exclude: commit_msg
+2023-03-23 14:54:07,465 test.py[line:173] DEBUG : exclude: source_consistency
+2023-03-23 14:54:07,465 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:54:07,466 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:54:07,475 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:54:07,475 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:54:07,475 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:54:07,475 check_binary_file.py[line: 98] ERROR : spec file not find
+2023-03-23 14:54:07,475 test.py[line: 94] DEBUG : check package_license
+2023-03-23 14:54:07,475 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 14:54:07,475 test.py[line:118] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 14:54:07,475 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:54:07,475 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 14:54:07,489 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 14:54:07,489 test.py[line:118] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 14:54:07,489 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:54:07,489 test.py[line: 94] DEBUG : check spec
+2023-03-23 14:54:07,491 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 14:54:07,491 test.py[line:118] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 14:54:07,491 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:54:36,505 test.py[line:285] INFO : --------------------AC START---------------------
+2023-03-23 14:54:36,525 test.py[line:170] DEBUG : community "src-oepkgs" conf: {'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}, 'code': {'hint': 'check_code_style', 'module': 'code.check_code_style', 'entry': 'CheckCodeStyle', 'exclude': True, 'ignored': ['patch']}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'sca': {'exclude': True}, 'openlibing': {'exclude': True}, 'commit_msg': {'exclude': True}, 'source_consistency': {'exclude': True}}
+2023-03-23 14:54:36,525 test.py[line:174] DEBUG : exclude: code
+2023-03-23 14:54:36,525 test.py[line:174] DEBUG : exclude: sca
+2023-03-23 14:54:36,525 test.py[line:174] DEBUG : exclude: openlibing
+2023-03-23 14:54:36,525 test.py[line:174] DEBUG : exclude: commit_msg
+2023-03-23 14:54:36,525 test.py[line:174] DEBUG : exclude: source_consistency
+2023-03-23 14:54:36,525 test.py[line: 57] DEBUG : check list: {'binary': {'hint': 'check_binary_file', 'module': 'binary.check_binary_file', 'entry': 'CheckBinaryFile'}, 'package_license': {'hint': 'check_package_license', 'module': 'package_license.check_license', 'entry': 'CheckLicense'}, 'package_yaml': {'hint': 'check_package_yaml_file', 'module': 'package_yaml.check_yaml', 'entry': 'CheckPackageYaml', 'ignored': ['fields']}, 'spec': {'hint': 'check_spec_file', 'module': 'spec.check_spec', 'entry': 'CheckSpec', 'ignored': ['homepage']}}
+2023-03-23 14:54:36,525 test.py[line: 94] DEBUG : check binary
+2023-03-23 14:54:36,525 test.py[line:108] DEBUG : load module binary.check_binary_file succeed
+2023-03-23 14:54:36,525 test.py[line:118] DEBUG : load entry "CheckBinaryFile" succeed
+2023-03-23 14:54:36,525 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:54:36,525 check_binary_file.py[line: 98] ERROR : spec file not find
+2023-03-23 14:54:36,525 test.py[line: 94] DEBUG : check package_license
+2023-03-23 14:54:36,525 test.py[line:108] DEBUG : load module package_license.check_license succeed
+2023-03-23 14:54:36,525 test.py[line:118] DEBUG : load entry "CheckLicense" succeed
+2023-03-23 14:54:36,525 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:54:36,525 test.py[line: 94] DEBUG : check package_yaml
+2023-03-23 14:54:36,543 test.py[line:108] DEBUG : load module package_yaml.check_yaml succeed
+2023-03-23 14:54:36,543 test.py[line:118] DEBUG : load entry "CheckPackageYaml" succeed
+2023-03-23 14:54:36,543 gitee_repo.py[line: 74] WARNING : no spec file
+2023-03-23 14:54:36,543 test.py[line: 94] DEBUG : check spec
+2023-03-23 14:54:36,543 test.py[line:108] DEBUG : load module spec.check_spec succeed
+2023-03-23 14:54:36,543 test.py[line:118] DEBUG : load entry "CheckSpec" succeed
+2023-03-23 14:54:36,543 gitee_repo.py[line: 74] WARNING : no spec file
+2023-09-10 11:31:39,915 test.py[line:245] INFO : None
+2023-09-10 11:31:39,916 test.py[line:246] INFO : None
+2023-09-10 11:31:39,916 test.py[line:247] INFO : None
+2023-09-10 11:31:39,916 test.py[line:248] INFO : None
+2023-09-10 11:31:39,916 test.py[line:249] INFO : None
+2023-09-10 11:31:39,917 test.py[line:250] INFO : None
+2023-09-10 11:31:45,534 test.py[line:245] INFO : None
+2023-09-10 11:31:45,534 test.py[line:246] INFO : None
+2023-09-10 11:31:45,534 test.py[line:247] INFO : None
+2023-09-10 11:31:45,535 test.py[line:248] INFO : None
+2023-09-10 11:31:45,535 test.py[line:249] INFO : None
+2023-09-10 11:31:45,535 test.py[line:250] INFO : None
+2023-09-10 11:31:52,927 test.py[line:245] INFO : None
+2023-09-10 11:31:52,927 test.py[line:246] INFO : None
+2023-09-10 11:31:52,927 test.py[line:247] INFO : None
+2023-09-10 11:31:52,928 test.py[line:248] INFO : None
+2023-09-10 11:31:52,928 test.py[line:249] INFO : None
+2023-09-10 11:31:52,928 test.py[line:250] INFO : None
+2023-09-10 18:51:06,228 test.py[line:245] INFO : None
+2023-09-10 18:51:06,228 test.py[line:246] INFO : None
+2023-09-10 18:51:06,228 test.py[line:247] INFO : None
+2023-09-10 18:51:06,229 test.py[line:248] INFO : None
+2023-09-10 18:51:06,229 test.py[line:249] INFO : None
+2023-09-10 18:51:06,229 test.py[line:250] INFO : None
+2023-09-10 21:56:39,509 test.py[line:245] INFO : None
+2023-09-10 21:56:39,509 test.py[line:246] INFO : None
+2023-09-10 21:56:39,510 test.py[line:247] INFO : None
+2023-09-10 21:56:39,510 test.py[line:248] INFO : None
+2023-09-10 21:56:39,510 test.py[line:249] INFO : None
+2023-09-10 21:56:39,510 test.py[line:250] INFO : None
+2023-09-10 21:57:12,226 test.py[line:245] INFO : None
+2023-09-10 21:57:12,227 test.py[line:246] INFO : None
+2023-09-10 21:57:12,227 test.py[line:247] INFO : None
+2023-09-10 21:57:12,227 test.py[line:248] INFO : None
+2023-09-10 21:57:12,227 test.py[line:249] INFO : None
+2023-09-10 21:57:12,228 test.py[line:250] INFO : None
+2023-09-13 17:01:27,546 test.py[line:245] INFO : None
+2023-09-13 17:01:27,547 test.py[line:246] INFO : None
+2023-09-13 17:01:27,548 test.py[line:247] INFO : None
+2023-09-13 17:01:27,548 test.py[line:248] INFO : None
+2023-09-13 17:01:27,548 test.py[line:249] INFO : None
+2023-09-13 17:01:27,549 test.py[line:250] INFO : None
+2023-09-14 09:42:05,381 test.py[line:245] INFO : None
+2023-09-14 09:42:05,382 test.py[line:246] INFO : None
+2023-09-14 09:42:05,382 test.py[line:247] INFO : None
+2023-09-14 09:42:05,383 test.py[line:248] INFO : None
+2023-09-14 09:42:05,383 test.py[line:249] INFO : None
+2023-09-14 09:42:05,384 test.py[line:250] INFO : None
diff --git a/src/ac/framework/log/build.log b/src/ac/framework/log/build.log
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/framework/log/common.log b/src/ac/framework/log/common.log
new file mode 100644
index 0000000000000000000000000000000000000000..f5522821f1e1355799c4ac63bba0a77c1be31ca5
--- /dev/null
+++ b/src/ac/framework/log/common.log
@@ -0,0 +1,49 @@
+2023-03-23 10:50:42,974 gitee_proxy.py[line:102] DEBUG : delete tag ci_successful of pull request None
+2023-03-23 10:50:42,974 requests_proxy.py[line: 41] DEBUG : http requests, delete https://gitee.com/api/v5/repos/src-openeuler/None/pulls/None/labels/ci_successful?access_token=None 10
+2023-03-23 10:50:42,974 requests_proxy.py[line: 42] DEBUG : querystring: None
+2023-03-23 10:50:42,974 requests_proxy.py[line: 43] DEBUG : body: None
+2023-03-23 10:50:43,257 requests_proxy.py[line: 63] DEBUG : status_code 401
+2023-03-23 10:50:43,258 gitee_proxy.py[line:109] WARNING : delete tags:ci_successful failed
+2023-03-23 10:50:43,258 gitee_proxy.py[line:102] DEBUG : delete tag ci_failed of pull request None
+2023-03-23 10:50:43,258 requests_proxy.py[line: 41] DEBUG : http requests, delete https://gitee.com/api/v5/repos/src-openeuler/None/pulls/None/labels/ci_failed?access_token=None 10
+2023-03-23 10:50:43,258 requests_proxy.py[line: 42] DEBUG : querystring: None
+2023-03-23 10:50:43,258 requests_proxy.py[line: 43] DEBUG : body: None
+2023-03-23 10:50:43,474 requests_proxy.py[line: 63] DEBUG : status_code 401
+2023-03-23 10:50:43,474 gitee_proxy.py[line:109] WARNING : delete tags:ci_failed failed
+2023-03-23 10:50:43,474 gitee_proxy.py[line: 61] DEBUG : create tags ('ci_processing',) of pull request None
+2023-03-23 10:50:43,475 requests_proxy.py[line: 41] DEBUG : http requests, post https://gitee.com/api/v5/repos/src-openeuler/None/pulls/None/labels?access_token=None 10
+2023-03-23 10:50:43,475 requests_proxy.py[line: 42] DEBUG : querystring: None
+2023-03-23 10:50:43,475 requests_proxy.py[line: 43] DEBUG : body: ['ci_processing']
+2023-03-23 10:50:43,682 requests_proxy.py[line: 63] DEBUG : status_code 401
+2023-03-23 10:50:43,682 gitee_proxy.py[line: 68] WARNING : create tags:('ci_processing',) failed
+2023-03-23 10:53:59,082 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 10:54:16,652 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 10:55:00,979 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 10:55:46,631 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 10:56:13,086 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 10:56:26,457 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 10:58:21,716 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 11:27:36,859 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 11:28:35,987 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 11:30:24,668 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 11:31:16,961 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 11:31:33,659 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 11:32:32,685 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 11:32:56,017 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 11:33:18,551 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 14:33:53,323 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 14:34:17,359 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 14:35:22,294 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 14:52:03,684 dist_dataset.py[line:111] DEBUG : no correspond ctime
+2023-03-23 14:53:50,998 shell_cmd.py[line: 55] DEBUG : exec cmd -- cd D:\pythonProject\src-oepkgs\src\ac\framework\mysql; git show HEAD~0:None
+2023-03-23 14:53:51,012 shell_cmd.py[line: 77] DEBUG : total 0 lines output
+2023-03-23 14:53:51,012 shell_cmd.py[line: 83] DEBUG : return code 1
+2023-03-23 14:53:51,024 git_proxy.py[line: 89] WARNING : get file content of commit failed, 1
+2023-03-23 14:54:07,491 shell_cmd.py[line: 55] DEBUG : exec cmd -- cd D:\pythonProject\src-oepkgs\src\ac\framework\mysql; git show HEAD~0:None
+2023-03-23 14:54:07,504 shell_cmd.py[line: 77] DEBUG : total 0 lines output
+2023-03-23 14:54:07,504 shell_cmd.py[line: 83] DEBUG : return code 1
+2023-03-23 14:54:07,507 git_proxy.py[line: 89] WARNING : get file content of commit failed, 1
+2023-03-23 14:54:36,543 shell_cmd.py[line: 55] DEBUG : exec cmd -- cd D:\pythonProject\src-oepkgs\src\ac\framework\mysql; git show HEAD~0:None
+2023-03-23 14:54:36,558 shell_cmd.py[line: 77] DEBUG : total 0 lines output
+2023-03-23 14:54:36,558 shell_cmd.py[line: 83] DEBUG : return code 1
+2023-03-23 14:54:36,559 git_proxy.py[line: 89] WARNING : get file content of commit failed, 1
diff --git a/src/ac/framework/log/jobs.log b/src/ac/framework/log/jobs.log
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/src/ac/framework/npm_test.py b/src/ac/framework/npm_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..34b5b8a067a803c6f95f3f8561790f2147d828cb
--- /dev/null
+++ b/src/ac/framework/npm_test.py
@@ -0,0 +1,98 @@
+import requests
+
+# url = 'https://build.dev.oepkgs.net/api/build/task/create'
+#
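+# Hardcoded JWT passed as the x-auth-token header in the requests below; it will eventually expire and has to be refreshed manually.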
+tmp = 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxNTMsInVzZXJfbmFtZSI6Im1lbmdsaW5nX2N1aUAxNjMuY29tIiwiVFoiOiJBc2lhL1NoYW5naGFpIiwiZXhwIjoxNjk1MjI0NTg3LCJpYXQiOjE2OTUyMjI3ODd9.R8MZeJHEXXMdDakV7_Z-kex1SvFClFa6yIc53c879AM'
+#
+# headers = {
+# 'authority': 'build.dev.oepkgs.net',
+# 'accept': '*/*',
+# 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+# 'cache-control': 'no-cache',
+# 'content-type': 'application/json',
+# 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaWEvU2hhbmdoYWkiLCJleHAiOjE2OTMzNzAwMzMsImlhdCI6MTY5MzM2ODIzM30.szb3CB_U3DJOVG94JzHkUBqTjD0x8WM0JCJz3fNODuc',
+# 'dnt': '1',
+# 'origin': 'https://build.dev.oepkgs.net',
+# 'pragma': 'no-cache',
+# 'referer': 'https://build.dev.oepkgs.net/rpm/task/create',
+# 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+# 'sec-ch-ua-mobile': '?0',
+# 'sec-ch-ua-platform': '"Windows"',
+# 'sec-fetch-dest': 'empty',
+# 'sec-fetch-mode': 'cors',
+# 'sec-fetch-site': 'same-origin',
+# 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36',
+# 'x-auth-token': tmp,
+# }
+#
+# data = {
+# 'builder': 1,
+# 'jobName': 'task0901-3',
+# 'os': 'openEuler',
+# 'osFull': '22.03-LTS',
+# 'ccArch': 'x86_64',
+# 'repoId': 251,
+# 'scmRepo': 'https://gitee.com/src-oepkgs/OCK',
+# 'branch': 'master'
+# }
+#
+# response = requests.post(url, headers=headers, json=data)
+#
+# print(response.status_code)
+# print(response.text)
+
+
+#
+# url2 = 'https://build.dev.oepkgs.net/api/build/task/buildJob'
+#
+# headers2 = {
+# 'authority': 'build.dev.oepkgs.net',
+# 'accept': '*/*',
+# 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+# 'cache-control': 'no-cache',
+# 'content-type': 'application/json',
+# 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaIAvU2hhbmdoYWkiLCJleHAiOjE2OTM1Mzk4MjgsImlhdCI6MTY5MzUzODAyOH0.sFZYWM-YiKKGcD2udonHmCM_kLxyb3Dt0P6xQZs49u8',
+# 'dnt': '1',
+# 'origin': 'https://build.dev.oepkgs.net',
+# 'pragma': 'no-cache',
+# 'referer': 'https://build.dev.oepkgs.net/rpm/task',
+# 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+# 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36',
+# 'x-auth-token': tmp,
+# }
+#
+# data2 = {
+# 'jobName': 'task0901-3',
+# 'jobId': 251,
+# }
+#
+# response2 = requests.post(url2, headers=headers2, json=data2)
+#
+# print(response2.status_code)
+# print(response2.text)
+#
+#
+#
+url3 = 'https://build-api.dev.oepkgs.net/task/buildHis'
+
+headers3 = {
+ 'authority': 'build.dev.oepkgs.net',
+ 'accept': '*/*',
+ 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaIAvU2hhbmdoYWkiLCJleHAiOjE2OTM1Mzk4MjgsImlhdCI6MTY5MzUzODAyOH0.sFZYWM-YiKKGcD2udonHmCM_kLxyb3Dt0P6xQZs49u8',
+ 'dnt': '1',
+ 'pragma': 'no-cache',
+ 'referer': 'https://build.dev.oepkgs.net/rpm/task',
+ 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+ 'x-auth-token': tmp,
+}
+
+
+# Send the GET request
+response3 = requests.get(url3, headers=headers3)
+
+print(response3.status_code)
+print(response3.text)
+
diff --git a/src/ac/framework/server.log b/src/ac/framework/server.log
new file mode 100644
index 0000000000000000000000000000000000000000..41e99be73adf29f80b8306f9cd638f96ae2db9d4
Binary files /dev/null and b/src/ac/framework/server.log differ
diff --git a/src/ac/framework/spider.py b/src/ac/framework/spider.py
new file mode 100644
index 0000000000000000000000000000000000000000..f08f930a23ec4e5ec44a60950ae94b207f0e00f5
--- /dev/null
+++ b/src/ac/framework/spider.py
@@ -0,0 +1,28 @@
+import requests
+import json
+tmp = 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxNTMsInVzZXJfbmFtZSI6Im1lbmdsaW5nX2N1aUAxNjMuY29tIiwiVFoiOiJBc2lhL1NoYW5naGFpIiwiZXhwIjoxNjk1MTgxNjAwLCJpYXQiOjE2OTUxNzk4MDB9.kLMhHJenl4H63BwtyVfvLL835p6dl72RLCv3k5ARhsQ'
+
+url = 'https://build.dev.oepkgs.net/api/build/task/getBuildRecord/955?pageNum=1&pageSize=5'
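+# Query one page (pageNum=1, pageSize=5) of build records for task 955 from the oepkgs build service.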
+
+headers = {
+ 'authority': 'build.dev.oepkgs.net',
+ 'accept': '*/*',
+ 'accept-language': 'en-US,en;q=0.9,zh;q=0.8,zh-CN;q=0.7',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'cookie': 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoxMDksInVzZXJfbmFtZSI6ImRpZGkiLCJUWiI6IkFzaWEvU2hhbmdoYWkiLCJleHAiOjE2OTM1Mzk4MjgsImlhdCI6MTY5MzUzODAyOH0.sFZYWM-YiKKGcD2udonHmCM_kLxyb3Dt0P6xQZs49u8',
+ 'dnt': '1',
+ 'pragma': 'no-cache',
+ 'referer': 'https://build.dev.oepkgs.net/rpm/task',
+ 'sec-ch-ua': '"Chromium";v="116", "Not)A;Brand";v="24", "Google Chrome";v="116"',
+ 'x-auth-token': tmp,
+}
+
+params = {
+ "jobName": "fb2031f5-dafd-4b68-902f-7f772c196d3f",
+}
+
+response = requests.get(url, headers=headers, params=params)
+res_data = json.loads(response.text)
+
+print(res_data['success'])
diff --git a/src/ac/framework/test-token.py b/src/ac/framework/test-token.py
new file mode 100644
index 0000000000000000000000000000000000000000..57328bf04aa0c370a0406359cab92d7fe917ab1a
--- /dev/null
+++ b/src/ac/framework/test-token.py
@@ -0,0 +1,74 @@
+import requests
+
+url = 'https://build-api.dev.oepkgs.net/genURL'
+
+headers = {
+ 'Accept': '*/*',
+ 'Content-Type': 'application/json',
+}
+
+response = requests.get(url, headers=headers)
+
+print(response.status_code)
+print(response.text)
+
+
+status = None  # make sure "status" exists even if the request above fails or returns no data
+if response.status_code == 200:
+    # Try to parse the JSON response
+    try:
+        json_data = response.json()
+        # Extract the "status" field from the "data" object
+        data = json_data.get("data") or {}
+        status = data.get("status")
+ if status:
+ print(f"Status: {status}")
+ else:
+ print("No 'status' field found in the JSON data.")
+ except ValueError:
+ print("Failed to parse JSON data.")
+else:
+ print(f"Failed to retrieve data. Status code: {response.status_code}")
+
+# import requests
+
+# BUILDURL = status  # replace with the actual URL
+# state = "your_state_value"  # replace with the actual value
+# session_state = "your_session_state_value"  # replace with the actual value
+# code = "your_code_value"  # replace with the actual value
+# redirectPath = "your_redirect_path"  # replace with the actual value
+#
+# url = f"{BUILDURL}/callback"
+# params = {
+# 'state': state,
+# 'session_state': session_state,
+# 'code': code,
+# 'redirectPath': redirectPath
+# }
+#
+# response = requests.get(url, params=params)
+#
+# print(response.status_code)
+# print(response.text)
+
+import requests
+
+url2 = "https://build-api.dev.oepkgs.net/callback"
+params2 = {
+ "code": "",
+ "redirectPath": "",
+ "session_state": "",
+ "state": status
+}
+
+headers2 = {
+ "Accept": "*/*",
+ "Content-Type": "application/x-www-form-urlencoded",
+}
+
+response2 = requests.get(url2, headers=headers2, params=params2)
+print(response2.status_code)
+if response2.status_code == 200:
+ print(response2.text)
+else:
+ print(f"Failed to retrieve data. Status code: {response2.status_code}")
+
diff --git a/src/ac/framework/test.py b/src/ac/framework/test.py
new file mode 100644
index 0000000000000000000000000000000000000000..3c99d2dfa38f0e1fe3586015cc551a88aac2da64
--- /dev/null
+++ b/src/ac/framework/test.py
@@ -0,0 +1,338 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: access control list entrypoint
+# **********************************************************************************
+
+import argparse
+import datetime
+import importlib
+import json
+import logging.config
+import os
+import sys
+import warnings
+
+import yaml
+from yaml.error import YAMLError
+
+sys.path.append("/var/jenkins_home/src-oepkgs/")
+
+from src.proxy.git_proxy import GitProxy
+from src.proxy.gitee_proxy import GiteeProxy
+from src.build.gitee_comment import Comment
+from src.utils.dist_dataset import DistDataset
+from src.ac.framework.ac_result import SUCCESS
+
+
+class AC(object):
+ """
+ ac entrypoint
+ """
+
+ def __init__(self, conf, community="src-oepkgs"):
+ """
+
+        :param conf: path to the configuration file
+ :param community: src-oepkgs or oepkgs
+ :return:
+ """
+        self._ac_check_elements = {}  # access-control check items
+        self._ac_check_result = []  # access-control check results
+
+ acl_path = os.path.realpath(os.path.join(os.path.dirname(__file__), "../acl"))
+        self._acl_package = "src.ac.acl"  # note: check modules are imported relative to this package
+ self.load_check_elements_from_acl_directory(acl_path)
+ self.load_check_elements_from_conf(conf, community)
+
+ logger.debug("check list: %s", self._ac_check_elements)
+
+ @staticmethod
+ def comment_jenkins_url(gp, pr, build_id):
+ """
+        Show the build job links in a PR comment
+        :param gp: gitee proxy
+        :param build_id: build_id
+        :param pr: PR number
+ :return:
+ """
+ comments = ["门禁正在运行, 您可以通过以下链接查看实时门禁检查结果.",
+ "门禁入口及编码规范检查: {}".format(
+ build_id, "jenkins"),
+ "若您对门禁结果含义不清晰或者遇到问题不知如何解决,可参考"
+ "门禁指导手册"]
+
+ gp.comment_pr(pr, "\n".join(comments))
+
+ def check_all(self, workspace, repo, dataset, **kwargs):
+ """
+        Run the access-control checks
+ :param workspace:
+ :param repo:
+ :return:
+ """
+ for element in self._ac_check_elements:
+ check_element = self._ac_check_elements.get(element)
+ logger.debug("check %s", element)
+
+            # shown in gitee; must start with "check_"
+ hint = check_element.get("hint", "check_{}".format(element))
+
+ if not hint.startswith("check_"):
+ hint = "check_{}".format(hint)
+
+ # import module
+ # eg: spec.check_spec
+ module_path = check_element.get("module", "{}.check_{}".format(element, element))
+
+ try:
+ module = importlib.import_module("." + module_path, self._acl_package)
+ logger.debug("load module %s succeed", module_path)
+ except ImportError as exc:
+ logger.exception("import module %s exception, %s", module_path, exc)
+ continue
+
+ # import entry
+ entry_name = check_element.get("entry", "Check{}".format(element.capitalize()))
+ try:
+ entry = getattr(module, entry_name)
+ logger.debug("load entry \"%s\" succeed", entry_name)
+ except AttributeError as exc:
+ logger.warning("entry \"%s\" not exist in module %s, %s", entry_name, module_path, exc)
+ continue
+
+            # instantiate the entry if it is a class object
+            if isinstance(entry, type):  # class object
+                logger.info(entry)
+                entry = entry(workspace, repo, check_element, dataset)  # create an instance
+
+ if not callable(entry): # check callable
+ logger.warning("entry %s not callable", entry_name)
+ continue
+
+ # do ac check
+ result = entry(**kwargs)
+ logger.debug("check result %s %s", element, result)
+
+ self._ac_check_result.append({"name": hint, "result": result.val})
+ dataset.set_attr("access_control.build.acl.{}".format(element), result.hint)
+
+ dataset.set_attr("access_control.build.content", self._ac_check_result)
+ logger.debug("ac result: %s", self._ac_check_result)
+
+ def load_check_elements_from_acl_directory(self, acl_dir):
+ """
+        Load all check items found under the acl directory
+ :return:
+ """
+ for filename in os.listdir(acl_dir):
+ if filename != "__pycache__" and os.path.isdir(os.path.join(acl_dir, filename)):
+ self._ac_check_elements[filename] = {} # don't worry, using default when checking
+
+ def load_check_elements_from_conf(self, conf_file, community):
+ """
+        Load check items from the configuration file (yaml format only)
+        :param conf_file: path to the configuration file
+ :param community: src-openeuler or openeuler
+ :return:
+ """
+ try:
+ with open(conf_file, "r") as f:
+ content = yaml.safe_load(f)
+ except IOError:
+ logger.exception("ac conf file %s not exist", conf_file)
+ return
+ except YAMLError:
+ logger.exception("illegal conf file format")
+ return
+
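+        # Community-specific settings override the defaults discovered from the acl directory;
+        # an element configured with "exclude: true" is dropped from the check list.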
+ elements = content.get(community, {})
+ logger.debug("community \"%s\" conf: %s", community, elements)
+ for name in elements:
+ if name in self._ac_check_elements:
+ if elements[name].get("exclude"):
+ logger.debug("exclude: %s", name)
+ self._ac_check_elements.pop(name)
+ else:
+ self._ac_check_elements[name] = elements[name]
+
+ def save(self):
+ """
+ save result
+ :return:
+ """
+ logger.debug("save ac result to file %s")
+ result = json.dumps(self._ac_check_result)
+ os.environ['ACL'] = result
+
+
+def init_args():
+ """
+ init args
+ :return:
+ """
+ parser = argparse.ArgumentParser()
+
+ parser.add_argument("-c", type=str, dest="community", default="src-oepkgs", help="src-openeuler or openeuler")
+ parser.add_argument("-w", type=str, dest="workspace", help="workspace where to find source")
+ parser.add_argument("-r", type=str, dest="repo", help="repo name")
+ parser.add_argument("-b", type=str, dest="tbranch", help="branch merge to")
+ parser.add_argument("-o", type=str, dest="output", help="output file to save result")
+ parser.add_argument("-p", type=str, dest="pr", help="pull request number")
+ parser.add_argument("-t", type=str, dest="token", help="gitee api token")
+ parser.add_argument("-a", type=str, dest="account", help="gitee account")
+ parser.add_argument("-i", type=str, dest="build_id", help="build_id")
+
+ # dataset
+ parser.add_argument("-m", type=str, dest="comment", help="trigger comment")
+ # parser.add_argument("-i", type=str, dest="comment_id", help="trigger comment id")
+ parser.add_argument("-e", type=str, dest="committer", help="committer")
+ parser.add_argument("-x", type=str, dest="pr_ctime", help="pr create time")
+ parser.add_argument("-z", type=str, dest="trigger_time", help="job trigger time")
+ parser.add_argument("-l", type=str, dest="trigger_link", help="job trigger link")
+
+ parser.add_argument("-f", type=str, dest="check_result_file", default="", help="compare package check item result")
+ parser.add_argument("-d", type=str, dest="check_item_comment_files", nargs="*", help="check item comment files")
+
+ # scanoss
+ parser.add_argument("--scanoss-output", type=str, dest="scanoss_output",
+ default="scanoss_result", help="scanoss result output")
+
+ parser.add_argument("--codecheck-api-key", type=str, dest="codecheck_api_key", help="codecheck api key")
+ parser.add_argument("--codecheck-api-url", type=str, dest="codecheck_api_url",
+ default="https://majun.osinfra.cn:8384/api/openlibing/codecheck", help="codecheck api url")
+
+ parser.add_argument("--jenkins-base-url", type=str, dest="jenkins_base_url",
+ default="https://openeulerjenkins.osinfra.cn/", help="jenkins base url")
+ parser.add_argument("--jenkins-user", type=str, dest="jenkins_user", help="repo name")
+ parser.add_argument("--jenkins-api-token", type=str, dest="jenkins_api_token", help="jenkins api token")
+
+ return parser.parse_args()
+
+
+if "__main__" == __name__:
+ args = init_args()
+
+ # init logging
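+    # create the log directory if it does not exist yet (the boolean-and below is shorthand for a conditional mkdir)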
+ _ = not os.path.exists("log") and os.mkdir("log")
+ logger_conf_path = os.path.realpath(os.path.join(
+ os.path.dirname(os.path.realpath(__file__)), "../../conf/logger.conf"))
+ logging.config.fileConfig(logger_conf_path)
+ logger = logging.getLogger("ac")
+
+ dd = DistDataset()
+ dd.set_attr_stime("access_control.job.stime")
+
+ community = "src-oepkgs"
+ repo = args.repo
+ workspace = os.path.join(os.path.dirname(os.path.realpath(__file__)), community)
+ pr = args.pr
+ token = args.token
+ build_id = args.build_id
+ tbranch = args.tbranch
+ committer = args.committer
+
+ logger.info(repo)
+ logger.info(pr)
+ logger.info(token)
+ logger.info(build_id)
+ logger.info(tbranch)
+ logger.info(committer)
+
+ # info from args
+ dd.set_attr("id", build_id)
+ dd.set_attr("pull_request.package", repo)
+ dd.set_attr("pull_request.number", pr)
+ dd.set_attr("pull_request.author", committer)
+ dd.set_attr("pull_request.target_branch", tbranch)
+ dd.set_attr("pull_request.ctime", args.pr_ctime)
+ dd.set_attr("access_control.trigger.link", args.trigger_link)
+ dd.set_attr("access_control.trigger.reason", args.comment)
+ ctime = datetime.datetime.strptime(datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%S"), "%Y-%m-%dT%H:%M:%S")
+ dd.set_attr_ctime("access_control.job.ctime", ctime)
+
+ # suppress python warning
+ warnings.filterwarnings("ignore")
+ logging.getLogger("elasticsearch").setLevel(logging.WARNING)
+ logging.getLogger("kafka").setLevel(logging.WARNING)
+
+ # download repo
+ dd.set_attr_stime("access_control.scm.stime")
+ git_proxy = GitProxy.init_repository(repo, work_dir=workspace)
+ repo_url = "https://gitee.com/{}/{}.git".format(community, repo)
+ "https://gitee.com/src-oepkgs/OCK-test.git"
+ if not git_proxy.fetch_pull_request(repo_url, pr, depth=4):
+ dd.set_attr("access_control.scm.result", "failed")
+ dd.set_attr_etime("access_control.scm.etime")
+
+ dd.set_attr_etime("access_control.job.etime")
+ logger.info("fetch finished -")
+ sys.exit(-1)
+ else:
+ git_proxy.checkout_to_commit_force("pull/{}/MERGE".format(pr))
+ logger.info("fetch finished +")
+ dd.set_attr("access_control.scm.result", "successful")
+ dd.set_attr_etime("access_control.scm.etime")
+
+ logger.info("--------------------AC START---------------------")
+
+ # build start
+ dd.set_attr_stime("access_control.build.stime")
+
+ # gitee comment jenkins url
+ gp = GiteeProxy(community, repo, token)
+ AC.comment_jenkins_url(gp, pr, build_id)
+ #
+ # # gitee pr tag
+ # gp.delete_tag_of_pr(pr, "ci_successful")
+ # gp.delete_tag_of_pr(pr, "ci_failed")
+ # gp.create_tags_of_pr(pr, "ci_processing")
+
+ # scanoss conf
+ scanoss = {"output": args.scanoss_output}
+
+ codecheck = {"pr_url": "https://gitee.com/{}/{}/pulls/{}".format(community, repo, pr),
+ "pr_number": pr, "codecheck_api_url": args.codecheck_api_url,
+ "codecheck_api_key": args.codecheck_api_key}
+
+ # build
+ ac = AC(os.path.join(os.path.dirname(os.path.realpath(__file__)), "ac.yaml"), community)
+ ac.check_all(workspace=workspace, repo=repo, dataset=dd, tbranch=tbranch, scanoss=scanoss,
+ codecheck=codecheck)
+ dd.set_attr_etime("access_control.build.etime")
+ ac.save()
+
+ comment = Comment(pr, *args.check_item_comment_files) \
+ if args.check_item_comment_files else Comment(pr)
+ logger.info("comment: build result......")
+ comment_content = comment.comment_build(gp, build_id)
+ dd.set_attr_etime("comment.build.etime")
+ dd.set_attr("comment.build.content.html", comment_content)
+
+ if comment.check_build_result() == SUCCESS:
+ gp.delete_tag_of_pr(pr, "ci_failed")
+ gp.create_tags_of_pr(pr, "ci_successful")
+ dd.set_attr("comment.build.tags", ["ci_successful"])
+ dd.set_attr("comment.build.result", "successful")
+ if args.check_result_file:
+ comment.comment_compare_package_details(gp, args.check_result_file)
+ else:
+ gp.delete_tag_of_pr(pr, "ci_successful")
+ gp.create_tags_of_pr(pr, "ci_failed")
+ dd.set_attr("comment.build.tags", ["ci_failed"])
+ dd.set_attr("comment.build.result", "failed")
+
+ logger.info("comment: at committer......")
+ comment.comment_at(committer, gp)
+ dd.set_attr_etime("comment.job.etime")
+ dd.set_attr_etime("access_control.job.etime")
diff --git a/src/ac/framework/token_test2.py b/src/ac/framework/token_test2.py
new file mode 100644
index 0000000000000000000000000000000000000000..813489df542e60f0b9b80f9f3be62be5af710cd7
--- /dev/null
+++ b/src/ac/framework/token_test2.py
@@ -0,0 +1,40 @@
+from selenium import webdriver
+from selenium.webdriver.common.by import By
+from selenium.webdriver.common.keys import Keys
+from selenium.webdriver.common.action_chains import ActionChains
+import requests
+
+options = webdriver.ChromeOptions()
+options.add_argument("--auto-open-devtools-for-tabs")
+
+# Create a Chrome WebDriver instance; any other supported browser could be used instead
+driver = webdriver.Chrome(options=options)
+
+# Open the login page
+driver.get("https://build.dev.oepkgs.net/rpm/task")  # replace with your own login page URL
+
+try:
+    # Wait for the page to finish loading; adjust the timeout as needed
+ driver.implicitly_wait(10)
+
+    # Locate the username and password inputs and enter the credentials
+    username_input = driver.find_element(By.NAME, "username")  # locate the username input by name
+    password_input = driver.find_element(By.NAME, "password")  # locate the password input by name
+
+ username_input.send_keys("mengling_cui@163.com")
+ password_input.send_keys("123456Aa!")
+
+    # Locate the "立即登录" (log in) button by its text
+    login_button = driver.find_element(By.XPATH, "//button[contains(text(), '立即登录')]")
+    login_button.click()  # simulate clicking the login button
+
+    # Collect the cookies from the logged-in session
+ cookies = driver.get_cookies()
+
+ for cookie in cookies:
+ print(cookie['name'], cookie['value'])
+
+
+finally:
+    # Quit the WebDriver
+ driver.quit()
diff --git a/src/ac/framework/yaml_test.py b/src/ac/framework/yaml_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..d3482302ea3e7ffcb2213a2833716c22af571ced
--- /dev/null
+++ b/src/ac/framework/yaml_test.py
@@ -0,0 +1,48 @@
+import abc
+import inspect
+import logging
+import os
+from src.ac.common.gitee_repo import GiteeRepo
+
+from src.ac.framework.ac_result import SUCCESS, WARNING, FAILED
+
+logger = logging.getLogger("ac")
+
+
+class BaseCheck(object):
+ """
+ acl check base class
+ """
+
+ __metaclass__ = abc.ABCMeta
+
+ def __init__(self, workspace, repo, conf=None):
+ """
+
+ :param repo:
+ :param workspace:
+ :param conf:
+ """
+ self._repo = repo
+ self._workspace = workspace
+ self._conf = conf
+
+ self._work_dir = os.path.join(workspace, repo)
+
+
+class CheckBinaryFile(BaseCheck):
+ """
+ check binary file
+ """
+    # file suffixes treated as binary
+ BINARY_LIST = {".pyc", ".jar", ".o", ".ko"}
+
+ def __init__(self, workspace, repo, conf):
+ super(CheckBinaryFile, self).__init__(workspace, repo, conf)
+        self._work_tar_dir = os.path.join(workspace, "code")  # directory where source tarballs are extracted
+ self._gr = GiteeRepo(self._repo, self._work_dir, self._work_tar_dir)
+ self._tarball_in_spec = set()
+ # self._upstream_community_tarball_in_spec()
+
+
+# CheckBinaryFile needs workspace, repo and conf arguments; calling it bare raises TypeError.
+# Example with placeholder values: CheckBinaryFile("/tmp/workspace", "some-repo", {})
\ No newline at end of file
diff --git a/src/build/build_rpm_package.py b/src/build/build_rpm_package.py
new file mode 100644
index 0000000000000000000000000000000000000000..0a9d5e27b6adacb1b1215a232c49678b68581fde
--- /dev/null
+++ b/src/build/build_rpm_package.py
@@ -0,0 +1,201 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: build result rpm package info
+# **********************************************************************************
+"""
+
+import os
+import re
+
+
+class BuildRPMPackage(object):
+ """
+ build result rpm package info
+ """
+
+    LINKMAGIC = "0X080480000XC0000000"  # must not be identical to any gitee user name
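+    # LINKMAGIC stands in for the committer path segment when building "last build" links
+    # (see last_main_package and last_debuginfo_package below).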
+
+ def __init__(self, repo, rpmbuild_dir):
+ """
+
+        :param repo: package name
+        :param rpmbuild_dir: path to the rpmbuild directory
+ """
+ self._repo = repo
+ self._rpmbuild_dir = rpmbuild_dir
+
+ self._rpm_packages = {"srpm": {}, "rpm": {}}
+ self._package_structure(rpmbuild_dir)
+
+ def main_package_local(self):
+ """
+        Return the local path of the main rpm package
+ :return:
+ """
+ package = self._rpm_packages["rpm"].get(self._repo)
+ if not package:
+ # not exist
+ return None
+
+ return os.path.join(self._rpmbuild_dir, "RPMS", package["arch"], package["fullname"])
+
+ def main_package_in_repo(self, committer, arch, rpm_repo_url):
+ """
+        Return the path of the main rpm package in the repo mirror
+ :param committer:
+ :param arch:
+ :param rpm_repo_url:
+ :return:
+ """
+ return self.get_package_path(committer, arch, self._repo, rpm_repo_url)
+
+ def last_main_package(self, arch, rpm_repo_url):
+ """
+        Return the link path of the main package in the repo mirror (the rpm built last time)
+        :param arch:
+        :param rpm_repo_url: remote address where the built rpm packages are stored
+ :return:
+ """
+ return os.path.join(rpm_repo_url, self.LINKMAGIC, arch, self._repo)
+
+ def debuginfo_package_local(self):
+ """
+        Return the local path of the debuginfo package
+ :return:
+ """
+ package = self._rpm_packages["rpm"].get("{}-debuginfo".format(self._repo))
+ if not package:
+ # not exist
+ return None
+
+ return os.path.join(self._rpmbuild_dir, "RPMS", package["arch"], package["fullname"])
+
+ def debuginfo_package_in_repo(self, committer, arch, rpm_repo_url):
+ """
+        Return the path of the debuginfo package in the repo mirror
+ :param committer:
+ :param arch:
+ :return:
+ """
+ return self.get_package_path(committer, arch, "{}-debuginfo".format(self._repo), rpm_repo_url)
+
+ def last_debuginfo_package(self, arch, rpm_repo_url):
+ """
+        Return the link path of the debuginfo package in the repo mirror (the rpm built last time)
+ :param arch:
+ :return:
+ """
+ return os.path.join(rpm_repo_url, self.LINKMAGIC, arch, "{}-debuginfo".format(self._repo))
+
+ @staticmethod
+ def checkabi_md_in_repo(committer, repo, arch, md, rpm_repo_url):
+ """
+        Return the path of the check_abi result in the repo mirror
+ :param committer:
+ :param arch:
+ :param md:
+ :param rpm_repo_url:
+ :return:
+ """
+ return os.path.join(rpm_repo_url, committer, repo, arch, md)
+
+ def get_package_path(self, committer, arch, name, remote_url):
+ """
+        Return the path of a package in the repo mirror
+        :param committer:
+        :param arch:
+        :param name: package name
+        :param remote_url: remote address of the repository
+ :return:
+ """
+ package = self._rpm_packages["rpm"].get(name)
+ if not package:
+ # not exist
+ return None
+
+ if arch == "noarch":
+ return os.path.join(remote_url, committer, name, arch, package["fullname"])
+
+ return os.path.join(remote_url, committer, name, arch, "noarch", package["fullname"])
+
+ def get_package_fullname(self, name):
+ """
+        Get the full rpm file name of a package
+ :param name:
+ :return:
+ """
+ package = self._rpm_packages["rpm"].get(name)
+ return package["fullname"] if package else name
+
+ def get_srpm_path(self):
+ """
+ for future
+ :return:
+ """
+ raise NotImplementedError
+
+ @staticmethod
+ def extract_rpm_name(rpm_fullname):
+ """
+        Extract the name part from a full rpm file name
+ :param rpm_fullname:
+ :return:
+ """
+ match_name = ''
+ if rpm_fullname:
+ match_name = "-".join(rpm_fullname.split("-")[:-2])
+ return match_name if match_name else rpm_fullname
+
+ def _package_structure(self, rpmbuild_dir):
+ """
+        Build the rpm package index from the rpmbuild directory
+        :param rpmbuild_dir: path to the rpmbuild directory
+ :return:
+ """
+ rpms_dir = os.path.join(rpmbuild_dir, "RPMS")
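+        # walk RPMS/<arch>/ and index every binary rpm by its extracted package name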
+ for dirname, _, filenames in os.walk(rpms_dir):
+ arch = dirname.split("/")[-1]
+ if arch == "i386":
+ arch = "x86-64"
+ for filename in filenames:
+ name = self.extract_rpm_name(filename)
+ self._rpm_packages["rpm"][name] = {"name": name, "fullname": filename, "arch": arch}
+
+ srpms = os.path.join(rpmbuild_dir, "SRPMS")
+ for dirname, _, filenames in os.walk(srpms):
+ for filename in filenames:
+ name = self.extract_rpm_name(filename)
+ self._rpm_packages["srpm"][name] = {"name": name, "fullname": filename}
+
+ def iter_all_rpm(self):
+ """
+        Iterate over all binary rpm packages, yielding each package name and its local path
+ :return:
+ """
+ packages = self._rpm_packages.get("rpm", {})
+ for name, package in packages.items():
+ yield name, os.path.join(self._rpmbuild_dir, "RPMS", package["arch"], package["fullname"])
+
+ def iter_all_srpm(self):
+ """
+        Iterate over all source rpm packages, yielding each package name and its local path
+ :return:
+ """
+        packages = self._rpm_packages.get("srpm", {})  # source rpms, not binary rpms
+
+ for name in packages:
+ package = packages[name]
+ yield name, os.path.join(self._rpmbuild_dir, "SRPMS", package["fullname"])
diff --git a/src/build/comment_to_dashboard.py b/src/build/comment_to_dashboard.py
new file mode 100644
index 0000000000000000000000000000000000000000..0a0414204a518ffd6b20c341ae7f070af515b3ad
--- /dev/null
+++ b/src/build/comment_to_dashboard.py
@@ -0,0 +1,158 @@
+# -*- coding: utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: comment pr with build result to dashboard
+# **********************************************************************************
+"""
+
+import os
+import argparse
+import stat
+import time
+from datetime import datetime
+
+import yaml
+
+from src.proxy.gitee_proxy import GiteeProxy
+from src.proxy.kafka_proxy import KafkaProducerProxy
+from src.logger import logger
+
+
+class CommentToDashboard(object):
+ """
+ comments process
+ """
+
+ @staticmethod
+ def output_build_num(args_list, base_dict):
+ """
+ output_build_num
+ :param args_list:
+ :param base_dict:
+ :return:
+ """
+ build_num_list = []
+ build_num_file = "{}_{}_{}_build_num.yaml".format(args_list.owner, args_list.repo, args_list.prid)
+ try:
+ if os.path.exists(build_num_file):
+ with open(build_num_file, "r") as f:
+ build_num_list = yaml.safe_load(f)
+ except yaml.MarkedYAMLError:
+ logger.exception("Read trigger build number file exception, yaml format error")
+ if args_list.trigger_build_id not in build_num_list:
+ build_num_list.append(args_list.trigger_build_id)
+ else:
+ base_dict["build_time"] = 0
+ logger.info("build_num_list = %s", build_num_list)
+ flags = os.O_WRONLY | os.O_CREAT
+ modes = stat.S_IWUSR | stat.S_IRUSR
+ try:
+ with os.fdopen(os.open(build_num_file, flags, modes), "w") as f:
+ yaml.safe_dump(build_num_list, f)
+ except IOError:
+ logger.exception("save build number file exception")
+
+ def get_all_result_to_kafka(self, args_list):
+ """
+        Field           Type    Required  Description
+        pr_url          string  yes       PR address to report; (pr_url, build_no) identifies one CI run
+        pr_title        string  yes       PR title
+        pr_create_at    number  yes       PR creation timestamp
+        pr_committer    string  yes       PR committer
+        pr_branch       string  yes       PR target branch
+        build_no        number  yes       build number of the comment job, distinguishes multiple CI runs of one PR
+        build_at        number  yes       timestamp when the CI run was triggered
+        update_at       number  yes       current timestamp
+        build_exception bool    yes       whether the CI run hit an exception; some fields may be empty in that case
+        build_urls      dict    yes       links and display text of the CI jobs
+        build_time      number  yes       overall build time in seconds, from trigger time to comment time
+        check_total     string  yes       overall CI result
+        check_details   dict    yes       result of each CI check item
+ :return:
+ """
+ pr_create_time = round(datetime.timestamp(datetime.strptime(args_list.pr_create_time,
+ '%Y-%m-%dT%H:%M:%S%z')), 1)
+ trigger_time = round(datetime.timestamp(datetime.strptime(args_list.trigger_time, '%Y-%m-%dT%H:%M:%S%z')), 1)
+ current_time = round(time.time(), 1)
+
+ base_dict = {"pr_title": args_list.pr_title,
+ "pr_url": args_list.pr_url,
+ "pr_create_at": pr_create_time,
+ "pr_committer": args_list.committer,
+ "pr_branch": args_list.tbranch,
+ "build_at": trigger_time,
+ "update_at": current_time,
+ "build_no": args_list.trigger_build_id
+ }
+ build_time = round(current_time - trigger_time, 1)
+ base_dict["build_time"] = build_time
+
+ self.output_build_num(args_list, base_dict)
+ build_file = "build_result.yaml"
+ try:
+ if os.path.exists(build_file):
+ base_dict["build_exception"] = False
+ with open(build_file, "r") as f:
+ comments = yaml.safe_load(f)
+ base_dict.update(comments)
+ else:
+ base_dict["build_exception"] = True
+ except yaml.MarkedYAMLError:
+ logger.exception("Read build result file exception, yaml format error")
+
+ logger.info("base_dict = %s", base_dict)
+        # publish the result to kafka (consumed downstream, e.g. by es)
+ kp = KafkaProducerProxy(brokers=os.environ["KAFKAURL"].split(","))
+ kp.send("openeuler_statewall_ci_result", key=args_list.comment_id, value=base_dict)
+
+ comment_tips = "门禁常见失败问题及解决方案, 可参考" \
+ "" \
+ "门禁问题排查手册\n" \
+ "若门禁存在误报,您可以评论/ci_mistake {}进行误报标记,{}表示本次构建号\n" \
+ "也可带上误报的门禁检查项以及误报类型(ci、obs、infra)," \
+ "比如/ci_mistake {} obs npmbuild check_install表示的是check_build和check_install存在误报," \
+ "误报类型为obs\n若想取消误报标记,可以评论/ci_unmistake {}取消\n" \
+ "也可在评论后加上一段文字描述,但请另起一行".format(
+ args_list.trigger_build_id, args_list.trigger_build_id,
+ args_list.trigger_build_id, args_list.trigger_build_id)
+ gp = GiteeProxy(args_list.owner, args_list.repo, args_list.gitee_token)
+ gp.comment_pr(args_list.prid, comment_tips)
+
+
+def init_args():
+ """
+ init args
+ :return:
+ """
+ parser = argparse.ArgumentParser()
+ parser.add_argument("-r", type=str, dest="repo", help="repo name")
+ parser.add_argument("-c", type=str, dest="committer", help="commiter")
+ parser.add_argument("-m", type=str, dest="comment_id", help="uniq comment id")
+ parser.add_argument("-g", type=str, dest="trigger_time", help="job trigger time")
+ parser.add_argument("-k", type=str, dest="pr_title", help="pull request title")
+ parser.add_argument("-t", type=str, dest="pr_create_time", help="pull request create time")
+ parser.add_argument("-b", type=str, dest="tbranch", help="target branch")
+ parser.add_argument("-u", type=str, dest="pr_url", help="pull request url")
+ parser.add_argument("-p", type=str, dest="prid", help="pull request id")
+ parser.add_argument("-o", type=str, dest="owner", help="gitee owner")
+ parser.add_argument("-i", type=int, dest="trigger_build_id", help="trigger build id")
+ parser.add_argument("--gitee_token", type=str, dest="gitee_token", help="gitee api token")
+
+ return parser.parse_args()
+
+
+if "__main__" == __name__:
+ args = init_args()
+ comment = CommentToDashboard()
+ comment.get_all_result_to_kafka(args)
diff --git a/src/build/extra_work.py b/src/build/extra_work.py
new file mode 100644
index 0000000000000000000000000000000000000000..e7a7740e4e329d868c2a950ba2e62b3b3dfaffaa
--- /dev/null
+++ b/src/build/extra_work.py
@@ -0,0 +1,350 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: pkgship and check_abi
+# **********************************************************************************
+"""
+
+import os
+import argparse
+import logging.config
+import logging
+import yaml
+
+from src.proxy.obs_proxy import OBSProxy
+from src.proxy.requests_proxy import do_requests
+from src.constant import Constant
+from src.build.obs_repo_source import OBSRepoSource
+from src.build.build_rpm_package import BuildRPMPackage
+from src.build.related_rpm_package import RelatedRpms
+from src.utils.shell_cmd import shell_cmd_live
+from src.utils.check_abi import CheckAbi
+from src.utils.compare_package import ComparePackage
+from src.utils.check_conf import CheckConfig
+
+
+class ExtraWork(object):
+ """
+ pkgship
+ check_abi
+ """
+ def __init__(self, package, rpmbuild_dir="/home/jenkins/agent/buildroot/home/abuild/rpmbuild"):
+ """
+
+ :param package: obs package
+ :param rpmbuild_dir: rpmbuild 路径
+ """
+ self._repo = package
+ self._rpm_package = BuildRPMPackage(package, rpmbuild_dir)
+
+ def is_pkgship_need_notify(self, pkgship_meta_path):
+ """
+ 是否需要发起notify
+ :param pkgship_meta_path: 保存门禁中解析的pkgship spec版本元信息文件路径
+ :return:
+ """
+ if self._repo == "pkgship": # 只有pkgship包需要通知
+ try:
+ with open(pkgship_meta_path, "r") as f:
+ pkgship_meta = yaml.safe_load(f)
+ logger.debug("pkgship meta: %s", pkgship_meta)
+ if pkgship_meta.get("compare_version") == 1: # version upgrade
+ logger.debug("pkgship: notify")
+ return True
+ except IOError:
+ # file not exist, bug
+ logger.warning("pkgship meta file not exist!")
+ return True
+
+ return False
+
+ def pkgship_notify(
+ self, notify_url, notify_token, package_url, package_arch, notify_jenkins_user, notify_jenkins_password):
+ """
+ notify
+ :param notify_url: notify url
+ :param notify_token: notify token
+ :param package_url: package addr
+ :param package_arch: cpu arch
+ :param notify_jenkins_user:
+ :param notify_jenkins_password:
+ :return:
+ """
+ package = self._rpm_package.last_main_package(package_arch, package_url)
+ querystring = {"token": notify_token, "PACKAGE_URL": package, "arch": package_arch}
+ ret = do_requests("get", notify_url, querystring=querystring,
+ auth={"user": notify_jenkins_user, "password": notify_jenkins_password}, timeout=1)
+ if ret in [0, 2]:
+ # send async, don't care about response, timeout will be ok
+ logger.info("notify ...ok")
+ else:
+ logger.error("notify ...fail")
+
+ def check_rpm_abi(self, package_url, package_arch, output, committer, comment_file, obs_addr,
+ branch_name="master", obs_repo_url=None):
+ """
+        Compare the ABI between two versions of the rpm package and find the rpm packages affected by the differences.
+        :param package_url: rpm repo url where the last released packages are saved
+        :param package_arch: cpu arch
+        :param output: check_abi result output file
+        :param committer: pr committer
+        :param comment_file: comment file the result is appended to
+        :param obs_addr: obs address
+        :param branch_name: target branch
+        :param obs_repo_url: obs repo url used to find related rpms
+        :return:
+ """
+ #get rpms
+ curr_rpm = self._rpm_package.main_package_local()
+ last_rpm = self._rpm_package.last_main_package(package_arch, package_url)
+ logger.debug("curr_rpm: %s", curr_rpm)
+ logger.debug("last_rpm: %s", last_rpm)
+ if not curr_rpm or not last_rpm:
+ logger.info("no rpms")
+ return
+
+ #check configs
+ check_conf = CheckConfig(last_rpm, curr_rpm, output_file=output)
+ check_conf.conf_check()
+ rpms = [last_rpm, curr_rpm]
+
+ #get debuginfos
+ debuginfos = None
+ curr_rpm_debug = self._rpm_package.debuginfo_package_local()
+ last_rpm_debug = self._rpm_package.last_debuginfo_package(package_arch, package_url)
+ logger.debug("curr_rpm_debug: %s", curr_rpm_debug)
+ logger.debug("last_rpm_debug: %s", last_rpm_debug)
+ if curr_rpm_debug and last_rpm_debug:
+ debuginfos = [last_rpm_debug, curr_rpm_debug]
+
+ #get related rpms url
+ related_rpms_url = None
+ if obs_repo_url:
+ rp = RelatedRpms(obs_addr, obs_repo_url, branch_name, package_arch)
+ related_rpms_url = rp.get_related_rpms_url(curr_rpm)
+
+ #check abi
+ check_abi = CheckAbi(result_output_file=output, input_rpms_path=related_rpms_url)
+ ret = check_abi.process_with_rpm(rpms, debuginfos)
+ if ret == 1:
+ logger.error("check abi error: %s", ret)
+ else:
+ logger.debug("check abi ok: %s", ret)
+
+ if os.path.exists(output):
+ # change of abi
+ comment = {"name": "check_abi/{}/{}".format(package_arch, self._repo), "result": "WARNING",
+ "link": self._rpm_package.checkabi_md_in_repo(committer, self._repo, package_arch,
+ os.path.basename(output), package_url)}
+ else:
+ comment = {"name": "check_abi/{}/{}".format(package_arch, self._repo), "result": "SUCCESS"}
+
+ logger.debug("check abi comment: %s", comment)
+ comments = []
+ try:
+ if os.path.exists(comment_file):
+ with open(comment_file, "r") as f: # one repo with multi build package
+ comments = yaml.safe_load(f)
+ logger.debug("check abi comments: %s", comments)
+ comments.append(comment)
+ with open(comment_file, "w") as f:
+ yaml.safe_dump(comments, f) # list
+ except IOError:
+ logger.exception("save check abi comment file exception")
+ except yaml.MarkedYAMLError:
+ logger.exception("save check abi comment file exception, yaml format error")
+
+ def check_install_rpm(self, config):
+ """
+ 检查生成的rpm是否可以安装
+ :param config:
+ :return:
+ """
+ logger.info("*** start check install start ***")
+
+ # 1. prepare install root directory
+ _ = not os.path.exists(config.install_root) and os.makedirs(config.install_root)
+ logger.info("create install root directory: %s", config.install_root)
+
+ # 2. prepare repo
+ repo_source = OBSRepoSource() # obs 实时构建repo地址
+ obs_branch_list = Constant.GITEE_BRANCH_PROJECT_MAPPING.get(config.branch_name, [])
+ repo_config = repo_source.generate_repo_info(obs_branch_list, config.arch, "check_install")
+ logger.info("repo source config:\n%s", repo_config)
+
+ # write to /etc/yum.repos.d
+ with open("obs_realtime.repo", "w+") as f:
+ f.write(repo_config)
+
+ # 3. dnf install using repo name start with "check_install"
+ names = []
+ packages = []
+ for name, package in self._rpm_package.iter_all_rpm():
+ # ignore debuginfo rpm
+ if "debuginfo" in name or "debugsource" in name:
+ logger.debug("ignore debug rpm: %s", name)
+ continue
+ names.append(name)
+ packages.append(package)
+
+ logger.info("install rpms: %s", names)
+ if packages:
+ check_install_cmd = "sudo dnf install -y --installroot={} --setopt=reposdir=. {}".format(
+ config.install_root, " ".join(packages))
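+            # e.g. (illustrative): sudo dnf install -y --installroot=/tmp/install_root \
+            #     --setopt=reposdir=. ./foo-1.0-1.aarch64.rpm ./foo-libs-1.0-1.aarch64.rpm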
+ ret, _, err = shell_cmd_live(check_install_cmd, verbose=True)
+ if ret:
+ logger.error("install rpms error, %s, %s", ret, err)
+ comment = {"name": "check_install", "result": "FAILED"}
+ else:
+ logger.info("install rpm success")
+ comment = {"name": "check_install", "result": "SUCCESS"}
+
+ logger.info("check install rpm comment: %s", comment)
+ comments = []
+ try:
+ if os.path.exists(config.comment_file):
+ with open(config.comment_file, "r") as f: # one repo with multi build package
+ comments = yaml.safe_load(f)
+ comments.append(comment)
+ with open(config.comment_file, "w") as f:
+ yaml.safe_dump(comments, f) # list
+ except IOError:
+ logger.exception("save check install comment file exception")
+
+
+def notify(config, extrawork):
+ """
+ notify, run after copy rpm to rpm repo
+ :param config: args
+ :param extrawork:
+ :return:
+ """
+ if extrawork.is_pkgship_need_notify(config.pkgship_meta):
+ extrawork.pkgship_notify(config.notify_url, config.token, config.rpm_repo_url,
+ config.arch, config.notify_user, config.notify_password)
+
+
+def checkabi(config, extrawork):
+ """
+ check abi, run before copy rpm to rpm repo
+ :param config: args
+ :param extrawork:
+ :return:
+ """
+ extrawork.check_rpm_abi(config.rpm_repo_url, config.arch, config.output, config.committer, config.comment_file,
+ config.obs_addr, config.branch_name, config.obs_repo_url)
+
+
+def comparepackage(config, extrawork):
+ """
+ 对比两个版本rpm包之间的差异,根据差异找到受影响的rpm包
+ :param config: args
+ :return:
+ """
+ logger.info("compare package start")
+ compare_package = ComparePackage(logger=logger)
+ result = compare_package.output_result_to_console(config.json_path, config.pr_link, config.ignore, config.package,
+ config.check_result_file, config.pr_commit_json_file)
+ logger.info("compare package result:%s", result)
+ logger.info("compare package finish")
+
+
+def checkinstall(config, extrawork):
+ """
+ check install
+ :param config: args
+ :param extrawork:
+ :return:
+ """
+ extrawork.check_install_rpm(config)
+
+
+def getrelatedrpm(config, extrawork):
+ """
+ get related rpm package
+ :param config:
+ :param extrawork:
+ :return:
+ """
+    for project in Constant.GITEE_BRANCH_PROJECT_MAPPING.get(config.branch_name, []):
+        logger.debug("find project %s, package: %s, arch: %s", project, config.package, config.arch)
+        result = OBSProxy.get_binaries(project, config.package, config.arch)
+        if result:
+            logger.info("get binaries: %s", result)
+ break
+
+
+if "__main__" == __name__:
+ parser = argparse.ArgumentParser()
+ parser.add_argument("-p", "--package", type=str, default="src-openeuler", help="obs package")
+ parser.add_argument("-d", "--rpmbuild_dir", type=str,
+ default="/home/jenkins/agent/buildroot/home/abuild/rpmbuild", help="rpmbuild dir")
+ subparsers = parser.add_subparsers(help='sub-command help')
+
+ # 添加子命令 notify
+ parser_notify = subparsers.add_parser('notify', help='add help')
+ parser_notify.add_argument("-n", type=str, dest="notify_url", help="target branch that merged to ")
+ parser_notify.add_argument("-t", type=str, dest="token", default=os.getcwd(), help="obs workspace dir path")
+ parser_notify.add_argument("-l", type=str, dest="rpm_repo_url", help="rpm repo where rpm saved")
+ parser_notify.add_argument("-a", type=str, dest="arch", help="build arch")
+ parser_notify.add_argument("-u", type=str, dest="notify_user", default="trigger", help="notify trigger user")
+ parser_notify.add_argument("-w", type=str, dest="notify_password", help="notify trigger password")
+ parser_notify.set_defaults(func=notify)
+
+ # 添加子命令 checkabi
+ parser_checkabi = subparsers.add_parser('checkabi', help='add help')
+ parser_checkabi.add_argument("-l", type=str, dest="rpm_repo_url", help="rpm repo where rpm saved")
+ parser_checkabi.add_argument("-a", type=str, dest="arch", help="build arch")
+ parser_checkabi.add_argument("-o", type=str, dest="output", help="checkabi result")
+ parser_checkabi.add_argument("-c", type=str, dest="committer", help="committer")
+ parser_checkabi.add_argument("-s", type=str, dest="obs_addr", help="obs address")
+ parser_checkabi.add_argument("-r", type=str, dest="branch_name", help="obs project name")
+ parser_checkabi.add_argument("-b", type=str, dest="obs_repo_url", help="obs repo where rpm saved")
+ parser_checkabi.add_argument("-p", "--package", type=str, help="obs package")
+ parser_checkabi.add_argument("-e", type=str, dest="comment_file", help="check abi result comment")
+ parser_checkabi.set_defaults(func=checkabi)
+
+ # 添加子命令 compare_package
+ parser_comparepackage = subparsers.add_parser('comparepackage', help='add help')
+ parser_comparepackage.add_argument("-f", type=str, dest="check_result_file",
+ help="compare package check item result")
+ parser_comparepackage.add_argument("-j", type=str, dest="json_path", help="compare package json path")
+ parser_comparepackage.add_argument("-i", "--ignore", action="store_true", default=False, help="ignore or not")
+ parser_comparepackage.add_argument("-pr", type=str, dest="pr_link", help="PR link")
+ parser_comparepackage.add_argument("-pr_commit", type=str, dest="pr_commit_json_file",
+ help="PR commit file difference")
+ parser_comparepackage.add_argument("-p", "--package", type=str, help="obs package")
+ parser_comparepackage.set_defaults(func=comparepackage)
+
+ # 添加子命令 checkinstall
+ parser_checkinstall = subparsers.add_parser('checkinstall', help='add help')
+ parser_checkinstall.add_argument("-r", type=str, dest="branch_name", help="obs project name")
+ parser_checkinstall.add_argument("-a", type=str, dest="arch", help="build arch")
+ parser_checkinstall.add_argument("-e", type=str, dest="comment_file", help="check install result comment")
+ parser_checkinstall.add_argument("--obs_rpm_host", type=str, dest="obs_rpm_host", default="", help="obs rpm host")
+ parser_checkinstall.add_argument("--install-root", type=str, dest="install_root",
+ help="check install root dir")
+ parser_checkinstall.set_defaults(func=checkinstall)
+
+ # 添加子命令 getrelatedrpm
+ parser_getrelatedrpm = subparsers.add_parser('getrelatedrpm', help='add help')
+ parser_getrelatedrpm.add_argument("-r", type=str, dest="branch_name", help="obs project name")
+ parser_getrelatedrpm.add_argument("-a", type=str, dest="arch", help="build arch")
+ parser_getrelatedrpm.add_argument("-p", "--package", type=str, help="obs package")
+ parser_getrelatedrpm.set_defaults(func=getrelatedrpm)
+
+ args = parser.parse_args()
+
+ _ = not os.path.exists("log") and os.mkdir("log")
+ logger_conf_path = os.path.realpath(os.path.join(os.path.realpath(__file__),
+ "../../conf/logger.conf"))
+ logging.config.fileConfig(logger_conf_path)
+ logger = logging.getLogger("build")
+
+ ew = ExtraWork(args.package, args.rpmbuild_dir)
+ args.func(args, ew)
diff --git a/src/build/extract_file b/src/build/extract_file
new file mode 100644
index 0000000000000000000000000000000000000000..55d30a972cf15475f18cecc3c09d2cab6bae841a
--- /dev/null
+++ b/src/build/extract_file
@@ -0,0 +1,84 @@
+#!/bin/bash
+
+# A simple script to extract files from an archive, used as an OBS source service
+
+# defaults
+MYARCHIVE=""
+MYFILES=""
+OUTFILE="."
+FILES=""
+
+while test $# -gt 0; do
+ case $1 in
+ *-archive)
+ MYARCHIVE="${2##*/}"
+ shift
+ ;;
+ *-file|*-files)
+ MYFILES="$MYFILES ${2}"
+ FILES=${2}
+ shift
+ ;;
+ *-outfilename)
+ OUTFILE="${2}"
+ shift
+ ;;
+ *-outdir)
+ MYOUTDIR="$2"
+ shift
+ ;;
+ *)
+ echo Unknown parameter $1.
+ echo 'Usage: extract_file --archive $ARCHIVE --file $FILE --outdir $OUT'
+ exit 1
+ ;;
+ esac
+ shift
+done
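+# Example invocation (illustrative):
+#   extract_file --archive foo-1.0.tar.gz --files '*.spec' --outdir /tmp/out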
+
+if [ -z "$MYARCHIVE" ]; then
+ echo "ERROR: no archive specified!"
+ exit 1
+fi
+if [ -z "$MYFILES" ]; then
+ echo "ERROR: no checkout URL is given via --file parameter!"
+ exit 1
+fi
+if [ -z "$MYOUTDIR" ]; then
+ echo "ERROR: no output directory is given via --outdir parameter!"
+ exit 1
+fi
+set -x
+
+if [ "${FILES}" == '*' ];then
+ MYFILES=" "
+fi
+
+existing_archive="$MYOUTDIR/$(echo $MYARCHIVE)"
+cd "$MYOUTDIR"
+
+existing_archive=`ls $existing_archive`
+if [ -e "$existing_archive" ]; then
+ if [ "${existing_archive%.tar.gz}" != "$existing_archive" ]; then
+ tar xfz "$existing_archive" --wildcards $MYFILES || exit 1
+ elif [ "${existing_archive%.tar.bz2}" != "$existing_archive" ]; then
+ tar xfj "$existing_archive" --wildcards $MYFILES || exit 1
+ elif [ "${existing_archive%.tar.xz}" != "$existing_archive" ]; then
+ tar xfJ "$existing_archive" --wildcards $MYFILES || exit 1
+ elif [ "${existing_archive%.tar}" != "$existing_archive" ]; then
+ tar xf "$existing_archive" --wildcards $MYFILES || exit 1
+ elif [ "${existing_archive%.zip}" != "$existing_archive" ]; then
+ unzip "$existing_archive" $MYFILES || exit 1
+ else
+ echo "ERROR: unknown archive format $existing_archive"
+ exit 1
+ fi
+ for i in $MYFILES; do
+ mv "$i" "$OUTFILE"
+ done
+else
+ echo "ERROR: archive not found: $existing_archive"
+ exit 1
+fi
+
+exit 0
diff --git a/src/build/gitee_comment.py b/src/build/gitee_comment.py
new file mode 100644
index 0000000000000000000000000000000000000000..beb5dc3c9fa8e91948a5bb6fc5518d6f62e01cd3
--- /dev/null
+++ b/src/build/gitee_comment.py
@@ -0,0 +1,495 @@
+# -*- coding: utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: comment pr with build result
+# **********************************************************************************
+"""
+
+import os
+import re
+import stat
+import sys
+import logging.config
+import json
+import argparse
+import warnings
+import yaml
+
+sys.path.append("/var/jenkins_home/src-oepkgs/")
+
+from yaml.error import YAMLError
+from src.ac.framework.ac_result import ACResult, SUCCESS
+from src.proxy.gitee_proxy import GiteeProxy
+# # from src.proxy.kafka_proxy import KafkaProducerProxy
+# # from src.proxy.jenkins_proxy import JenkinsProxy
+from src.utils.dist_dataset import DistDataset
+
+_ = not os.path.exists("log") and os.mkdir("log")
+logger_conf_path = os.path.realpath(os.path.join(os.path.realpath(__file__), "../../conf/logger.conf"))
+logging.config.fileConfig(logger_conf_path)
+logger = logging.getLogger("build")
+
+
+class Comment(object):
+ """
+ comments process
+ """
+
+ def __init__(self, pr, *check_item_comment_files):
+ """
+
+ :param pr: pull request number
+ """
+ self._pr = pr
+ self._check_item_comment_files = check_item_comment_files
+ self._up_builds = []
+ self._up_up_builds = []
+ # self._get_upstream_builds(jenkins_proxy)
+ self.ac_result = {}
+ self.compare_package_result = {}
+ self.check_item_result = {}
+
+ def comment_build(self, gitee_proxy, build_id):
+ """
+ 构建结果
+ :param build_id:
+ :param gitee_proxy:
+ :return:
+ """
+ comments = self._comment_build_html_format(build_id)
+ gitee_proxy.comment_pr(self._pr, "\n".join(comments))
+
+ return "\n".join(comments)
+
+ def comment_compare_package_details(self, gitee_proxy, check_result_file):
+ """
+ compare package结果上报
+
+ :param gitee_proxy:
+ :param check_result_file:
+ :return:
+ """
+ comments = self._comment_of_compare_package_details(check_result_file)
+ gitee_proxy.comment_pr(self._pr, "\n".join(comments))
+
+ return "\n".join(comments)
+
+ def comment_at(self, committer, gitee_proxy):
+ """
+ 通知committer
+ @committer
+ :param committer:
+ :param gitee_proxy:
+ :return:
+ """
+ gitee_proxy.comment_pr(self._pr, "@{}".format(committer))
+
+ def check_build_result(self):
+ """
+ build result check
+ :return:
+ """
+ build_result = sum([ACResult.get_instance(build["result"]) for build in self._up_builds], SUCCESS)
+ return build_result
+
+ # def _get_upstream_builds(self, jenkins_proxy):
+ # """
+ # get upstream builds
+ # :param jenkins_proxy:
+ # :return:
+ # """
+ # base_job_name = os.environ.get("JOB_NAME")
+ # base_build_id = os.environ.get("BUILD_ID")
+ # base_build_id = int(base_build_id)
+ # logger.debug("base_job_name: %s, base_build_id: %s", base_job_name, base_build_id)
+ # base_build = jenkins_proxy.get_build_info(base_job_name, base_build_id)
+ # logger.debug("get base build")
+ # self._up_builds = jenkins_proxy.get_upstream_builds(base_build)
+ # if self._up_builds:
+ # logger.debug("get up_builds")
+ # self._up_up_builds = jenkins_proxy.get_upstream_builds(self._up_builds[0])
+
+ def _comment_build_html_format(self, build_id):
+ """
+ 组装构建信息,并评论pr
+        :param build_id: build id of the comment job, used to compose the console link
+ :return:
+ """
+ comments = ["", self.comment_html_table_th()]
+
+ # logger.debug("get up_up_builds")
+ comments.extend(self._comment_of_ac(build_id))
+
+ # if self._up_up_builds:
+ # logger.debug("get up_up_builds")
+ # comments.extend(self._comment_of_ac(self._up_up_builds[0]))
+ # if self._up_builds:
+ # comments.extend(self._comment_of_check_item(self._up_builds))
+
+ comments.append("
")
+ return comments
+
+ def _comment_of_ac(self, build_id):
+ """
+ 组装门禁检查结果
+        :param build_id: build id of the ac check jenkins job
+ :return:
+ """
+ if "ACL" not in os.environ:
+ logger.debug("no ac check")
+ return []
+ #
+ try:
+ acl = json.loads(os.environ["ACL"])
+ logger.debug("ac result: %s", acl)
+ except ValueError:
+ logger.exception("invalid ac result format")
+ return []
+
+ comments = []
+
+ for index, item in enumerate(acl):
+ ac_result = ACResult.get_instance(item["result"])
+ if index == 0:
+ # build_url = build["url"]
+ comments.append(self.__class__.comment_html_table_tr(
+ item["name"], ac_result.emoji, ac_result.hint,
+ "{}{}{}".format("https://jenkins.openeuler.isrc.ac.cn/job/src-oepkgs/", build_id, "/console"),
+ build_id,
+ rowspan=len(acl)))
+ else:
+ comments.append(self.__class__.comment_html_table_tr_rowspan(
+ item["name"], ac_result.emoji, ac_result.hint))
+ self.ac_result[item["name"]] = ac_result.hint
+ logger.info("ac comment: %s", comments)
+
+ return comments
+
+ def _comment_of_compare_package_details(self, check_result_file):
+ """
+ compare package details
+ :param:
+ :return:
+ """
+ comments = []
+        comments_title = ["<table><tr><th>Arch Name</th> <th>Check Items</th> <th>Rpm Name</th> "
+                          "<th>Check Result</th> <th>Build Details</th></tr>"]
+
+ def match(name, comment_file):
+ if "aarch64" in name and "aarch64" in comment_file:
+ return True, "aarch64"
+ if "x86-64" in name and "x86_64" in comment_file:
+ return True, "x86_64"
+ return False, ""
+
+ for result_file in check_result_file.split(","):
+ logger.info("check_result_file: %s", result_file)
+ if not os.path.exists(result_file):
+ logger.info("%s not exists", result_file)
+ continue
+ for build in self._up_builds:
+ arch_cmp_result = "SUCCESS"
+ # name = JenkinsProxy.get_job_path_from_job_url(build["url"])
+ logger.info("check build %s", "src-oepkgs")
+ arch_result, arch_name = match("src-oepkgs", result_file)
+ if not arch_result: # 找到匹配的jenkins build
+ continue
+ logger.info("build \"%s\" match", "src-oepkgs")
+
+ status = build["result"]
+ logger.info("build state: %s", status)
+ content = {}
+ if ACResult.get_instance(status) == SUCCESS: # 保证build状态成功
+ with open(result_file, "r") as f:
+ try:
+ content = yaml.safe_load(f)
+ except YAMLError: # yaml base exception
+ logger.exception("illegal yaml format of compare package comment file ")
+ logger.info("comment: %s", content)
+ for index, item in enumerate(content):
+ rpm_name = content.get(item)
+ check_item = item.replace(" ", "_")
+ result = "FAILED" if rpm_name else "SUCCESS"
+ if result == "FAILED":
+ arch_cmp_result = "FAILED"
+ compare_result = ACResult.get_instance(result)
+ if index == 0:
+                        comments.append("<tr><td rowspan={}>compare_package({})</td> <td>{}</td> <td>{}</td> "
+                                        "<td>{}{}</td> <td rowspan={}><a href={}>{}{}</a></td></tr>"
+                                        .format(len(content), arch_name, check_item, "<br>".join(rpm_name),
+                                                compare_result.emoji, compare_result.hint, len(content),
+                                                "{}{}".format(build["url"], "console"), "#", build["number"]))
+ else:
+                        comments.append("<tr><td>{}</td> <td>{}</td> <td>{}{}</td></tr>".format(
+                            check_item, "<br>".join(rpm_name), compare_result.emoji, compare_result.hint))
+ self.compare_package_result[arch_name] = arch_cmp_result
+ if comments:
+ comments = comments_title + comments
+ comments.append("
")
+ logger.info("compare package comment: %s", comments)
+
+ return comments
+
+ def _comment_of_check_item(self, builds):
+ """
+ check item comment
+ :param builds:
+ :return:
+ """
+ comments = []
+
+ def match(name, comment_file):
+ if "aarch64" in name and "aarch64" in comment_file:
+ return True
+ if "x86-64" in name and "x86_64" in comment_file:
+ return True
+ return False
+
+ for build in builds:
+ # name, _ = JenkinsProxy.get_job_path_build_no_from_build_url(build["url"])
+ status = build["result"]
+ ac_result = ACResult.get_instance(status)
+ build_url = build["url"]
+ if "x86-64" in "x86-64":
+ arch = "x86_64"
+ elif "aarch64" in "x86-64":
+ arch = "aarch64"
+ else:
+ arch = "x86-64".split("/")[-2]
+ arch_dict = {}
+ check_item_result = {}
+ for check_item_comment_file in self._check_item_comment_files:
+ if not os.path.exists(check_item_comment_file):
+ logger.info("%s not exists", check_item_comment_file)
+ continue
+ if ACResult.get_instance(status) == SUCCESS and match("x86-64", check_item_comment_file): # 保证build状态成功
+ with open(check_item_comment_file, "r") as f:
+ try:
+ content = yaml.safe_load(f)
+ except YAMLError: # yaml base exception
+ logger.exception("illegal yaml format of check item comment file ")
+ logger.debug("comment: %s", content)
+ for item in content:
+ check_item_result[item.get("name")] = ACResult.get_instance(item.get("result"))
+ break
+ item_num = 1 + len(check_item_result)
+ if os.path.exists("support_arch"):
+ with open("support_arch", "r") as s_file:
+ if arch not in s_file.readline():
+ ac_result = ACResult.get_instance("EXCLUDE")
+ item_num = 2
+            comments.append("<tr><td rowspan={}>{}</td> <td>{}</td> <td>{}{}</td> "
+                            "<td rowspan={}><a href={}>#{}</a></td></tr>".format(
+                item_num, arch, "npmbuild", ac_result.emoji, ac_result.hint, item_num,
+                "{}{}".format(build_url, "console"), build["number"]))
+ arch_dict["npmbuild"] = ac_result.hint
+ if ac_result.hint == "EXCLUDE":
+                comments.append("<tr><td>{}</td> <td>{}{}</td></tr>".format(
+                    "check_install", ac_result.emoji, ac_result.hint))
+ arch_dict["check_install"] = ac_result.hint
+ else:
+ for check_item, check_result in check_item_result.items():
+ comments.append("
| {} | {}{} | ".format(
+ check_item, check_result.emoji, check_result.hint))
+ arch_dict[check_item] = check_result.hint
+ self.check_item_result[arch] = arch_dict
+ logger.info("check item comment: %s", comments)
+
+ return comments
+
+ @classmethod
+ def comment_html_table_th(cls):
+ """
+ table header
+ """
+ return "
| Check Name | Build Result | Build Details |
"
+
+ @classmethod
+ def comment_html_table_tr(cls, name, icon, status, href, build_no, hashtag=True, rowspan=1):
+ """
+ one row or span row
+ """
+ return "| {} | {}{} | " \
+ "{}{} |
".format(
+ name, icon, status, rowspan, href, "#" if hashtag else "", build_no)
+
+ @classmethod
+ def comment_html_table_tr_rowspan(cls, name, icon, status):
+ """
+ span row
+ """
+ return "| {} | {}{} |
".format(name, icon, status)
+
+ def _get_job_url(self, comment_url):
+ """
+ get_job_url
+ :param url:
+ :return:
+ """
+ build_urls = {"trigger": self._up_up_builds[0]["url"],
+ "comment": os.path.join(comment_url, os.environ.get("BUILD_ID"))
+ }
+ for build in self._up_builds:
+ arch = ""
+ try:
+ arch_index = 3
+ list_step = 2
+ if build["url"]:
+ job_path = re.sub(r"http[s]?://", "", build["url"])
+ arch = job_path.split("/")[::list_step][arch_index]
+ except IndexError:
+ logger.info("get arch from job failed, index error.")
+ except KeyError:
+ logger.info("not find build url key")
+ if arch:
+ build_urls[arch] = build["url"]
+
+ return build_urls
+
+ def _get_all_job_result(self, check_details):
+ """
+ get_all_job_result
+ :return:
+ """
+
+ check_details["static_code"] = self.ac_result
+ for arch, arch_result in self.check_item_result.items():
+ if self.compare_package_result.get(arch):
+ arch_result["compare_package"] = self.compare_package_result.get(arch)
+ check_details[arch] = arch_result
+
+ return check_details
+
+ # def get_all_result_to_kafka(self, comment_url):
+ # """
+ # 名称 类型 必选 说明
+ # build_urls 字典 是 包含多个门禁工程链接和显示文本
+ # check_total 字符串 是 门禁整体结果
+ # check_details 字典 是 门禁各个检查项结果
+ # :return:
+ # """
+ # check_details = {}
+ # build_urls = self._get_job_url(comment_url)
+ # self._get_all_job_result(check_details)
+ #
+ # if self.check_build_result() == SUCCESS:
+ # check_total = 'SUCCESS'
+ # else:
+ # check_total = 'FAILED'
+ #
+ # all_dict = {"build_urls": build_urls,
+ # "check_total": check_total,
+ # "check_details": check_details
+ # }
+ # logger.info("all_dict = %s", all_dict)
+ # flags = os.O_WRONLY | os.O_CREAT | os.O_EXCL
+ # modes = stat.S_IWUSR | stat.S_IRUSR
+ # try:
+ # with os.fdopen(os.open("build_result.yaml", flags, modes), "w") as f:
+ # yaml.safe_dump(all_dict, f)
+ # except IOError:
+ # logger.exception("save build result file exception")
+
+
+def init_args():
+ """
+ init args
+ :return:
+ """
+ parser = argparse.ArgumentParser()
+ parser.add_argument("-p", type=int, dest="pr", help="pull request number")
+ parser.add_argument("-m", type=str, dest="comment_id", help="uniq comment id")
+ parser.add_argument("-c", type=str, dest="committer", help="commiter")
+ parser.add_argument("-o", type=str, dest="owner", help="gitee owner")
+ parser.add_argument("-r", type=str, dest="repo", help="repo name")
+ parser.add_argument("-t", type=str, dest="gitee_token", help="gitee api token")
+
+ parser.add_argument("-b", type=str, dest="jenkins_base_url", default="https://openeulerjenkins.osinfra.cn/",
+ help="jenkins base url")
+ parser.add_argument("-u", type=str, dest="jenkins_user", help="repo name")
+ parser.add_argument("-j", type=str, dest="jenkins_api_token", help="jenkins api token")
+ parser.add_argument("-f", type=str, dest="check_result_file", default="", help="compare package check item result")
+ parser.add_argument("-a", type=str, dest="check_item_comment_files", nargs="*", help="check item comment files")
+
+ parser.add_argument("--disable", dest="enable", default=True, action="store_false", help="comment to gitee switch")
+
+ return parser.parse_args()
+
+
+if "__main__" == __name__:
+ args = init_args()
+ if not args.enable:
+ sys.exit(0)
+
+ # community = "src-oepkgs"
+ # repo = "OCK-test"
+ # workspace = "/var/jenkins_home/src-oepkgs/src/ac/framework"
+ # pr = "5"
+ # token = "c951fee688f4b037d27602d7461b81fc"
+ # build_id = args.build_id
+ # tbranch = args.tbranch
+ # committer = args.committer
+
+ # dd = DistDataset()
+ # dd.set_attr_stime("comment.job.stime")
+
+ # gitee pr tag
+ # gp = GiteeProxy(community, repo, "c951fee688f4b037d27602d7461b81fc")
+ # gp.delete_tag_of_pr("5", "ci_processing")
+ #
+ # # jp = JenkinsProxy(args.jenkins_base_url, args.jenkins_user, args.jenkins_api_token)
+ # # url, build_time, reason = jp.get_job_build_info(os.environ.get("JOB_NAME"), int(os.environ.get("BUILD_ID")))
+ # dd.set_attr_ctime("comment.job.ctime", build_time)
+ # dd.set_attr("comment.job.link", url)
+ # dd.set_attr("comment.trigger.reason", reason)
+
+ # dd.set_attr_stime("comment.build.stime")
+ #
+ # comment = Comment(pr, *args.check_item_comment_files) \
+ # if args.check_item_comment_files else Comment(pr)
+ # logger.info("comment: build result......")
+ # comment_content = comment.comment_build(gp)
+ # dd.set_attr_etime("comment.build.etime")
+ # dd.set_attr("comment.build.content.html", comment_content)
+ #
+ # if comment.check_build_result() == SUCCESS:
+ # gp.delete_tag_of_pr(args.pr, "ci_failed")
+ # gp.create_tags_of_pr(args.pr, "ci_successful")
+ # dd.set_attr("comment.build.tags", ["ci_successful"])
+ # dd.set_attr("comment.build.result", "successful")
+ # if args.check_result_file:
+ # comment.comment_compare_package_details(gp, args.check_result_file)
+ # else:
+ # gp.delete_tag_of_pr(args.pr, "ci_successful")
+ # gp.create_tags_of_pr(args.pr, "ci_failed")
+ # dd.set_attr("comment.build.tags", ["ci_failed"])
+ # dd.set_attr("comment.build.result", "failed")
+ # # if args.owner != "openeuler":
+ # # comment.get_all_result_to_kafka(url)
+ #
+ # logger.info("comment: at committer......")
+ # comment.comment_at(args.committer, gp)
+ #
+ # dd.set_attr_etime("comment.job.etime")
+ #
+ # # suppress python warning
+ # warnings.filterwarnings("ignore")
+ # logging.getLogger("elasticsearch").setLevel(logging.WARNING)
+ # logging.getLogger("kafka").setLevel(logging.WARNING)
+ #
+ # # upload to es
+ # # kp = KafkaProducerProxy(brokers=os.environ["KAFKAURL"].split(","))
+ # query = {"term": {"id": args.comment_id}}
+ # script = {"lang": "painless", "source": "ctx._source.comment = params.comment", "params": dd.to_dict()}
+ # kp.send("openeuler_statewall_ci_ac", key=args.comment_id, value=dd.to_dict())
diff --git a/src/build/obs_repo_source.py b/src/build/obs_repo_source.py
new file mode 100644
index 0000000000000000000000000000000000000000..e85473be5fc2e6c27c34e8df232e18b2b7692974
--- /dev/null
+++ b/src/build/obs_repo_source.py
@@ -0,0 +1,78 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2021-05-27
+# Description: obs repo as dnf source
+# **********************************************************************************
+"""
+import logging
+import os
+
+from src.proxy.requests_proxy import do_requests
+from src.utils.file_operator import FileOperator
+
+logger = logging.getLogger("common")
+
+
+class OBSRepoSource(object):
+ """
+ 生成obs实时仓作为rpm源的配置
+ """
+ def __init__(self):
+ """
+
+ :param repo_host: obs仓库host
+ """
+ cur_path = os.path.abspath(os.path.dirname(__file__))
+ config_file = os.path.join(cur_path, "../conf/project_host_mapping.yaml")
+ self.project_host_map = FileOperator.filereader(config_file, "yaml")
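+        # Assumed layout of the mapping file (sketch, not the real config), based on how it is
+        # read in generate_repo_info():
+        #   <backend name>:
+        #     host: http://<obs-backend-host>:82
+        #     project_list: ["openEuler:Mainline", ...]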
+
+ @staticmethod
+ def repo_format(repo_id, repo_name, repo_baseurl, priority=None):
+ """
+ repo内容格式
+ :param repo_id:
+ :param repo_name:
+ :param repo_baseurl:
+ :param priority:
+ :return:
+ """
+ if priority:
+ return "[{}]\nname={}\nbaseurl={}\nenabled=1\ngpgcheck=0\npriority={}\n".format(repo_id, repo_name, repo_baseurl, priority)
+ else:
+ return "[{}]\nname={}\nbaseurl={}\nenabled=1\ngpgcheck=0\n".format(repo_id, repo_name,repo_baseurl)
+
+ def generate_repo_info(self, obs_branch_list, arch, repo_name_prefix):
+ """
+ 不同的分支生成不同的repo
+ :param obs_branch_list:
+ :param arch:
+ :param repo_name_prefix:
+ :return:
+ """
+ repo_config = ""
+ priority = 1
+ for obs_branch in obs_branch_list:
+ host = ""
+ for backend_name, backend_conf in self.project_host_map.items():
+ if obs_branch in backend_conf.get("project_list"):
+ host = backend_conf.get("host")
+ break
+ branch = obs_branch.replace(":", ":/")
+ url = "{}/{}/standard_{}".format(host, branch, arch)
+ if do_requests("GET", url) == 0:
+ logger.debug("add openstack base repo: %s", url)
+ repo_config += self.repo_format(obs_branch, repo_name_prefix + "_" + branch, url, priority)
+ priority += 1
+
+ return repo_config
diff --git a/src/build/osc_build_k8s.py b/src/build/osc_build_k8s.py
new file mode 100644
index 0000000000000000000000000000000000000000..881e792fe4d49ec71d7bce33290ace55e6fb5f31
--- /dev/null
+++ b/src/build/osc_build_k8s.py
@@ -0,0 +1,317 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: build single package using obs
+# **********************************************************************************
+
+import os
+import re
+import sys
+import logging.config
+import logging
+import argparse
+import warnings
+from xml.etree import ElementTree
+
+from src.constant import Constant
+
+
+class SinglePackageBuild(object):
+ """
+ build single package using obs
+ """
+ BUILD_IGNORED_GITEE_BRANCH = ["riscv"]
+ PACKAGES_USE_ROOT = ["iproute", "libaio", "A-Ops", "multipath-tools", "libnetfilter_conntrack", "mcelog", "openEuler_chroot", "conntrack-tools"]
+
+ def __init__(self, package, arch, target_branch):
+ """
+
+ :param package: package name
+ :param arch: x86_64 or aarch64
+ :param target_branch: branch pull request apply
+ """
+ self._package = package
+ self._arch = arch
+ self._branch = target_branch
+
+ def get_need_build_obs_repos(self, project):
+ """
+ 需要构建obs repo列表
+ :return: list
+ """
+ return OBSProxy.list_repos_of_arch(project, self._package, self._arch, show_exclude=True)
+
+ def build_obs_repos(self, project, repos, spec, work_dir, code_dir):
+ """
+ build
+ :param project: 项目名
+ :param repos: obs repo
+ :param spec: 指定spec文件
+ :param code_dir: 码云代码在本地路径
+ :param work_dir:
+ :return:
+ """
+ # osc co
+ if not OBSProxy.checkout_package(project, self._package):
+ logger.error("checkout ... failed")
+ return 1
+
+ logger.info("checkout ... ok")
+
+ # update package meta file "_service"
+ self._handle_package_meta(project, work_dir, code_dir)
+ logger.debug("prepare \"_service\" ... ok")
+
+ # process_service.pl
+ if not self._prepare_build_environ(project, work_dir):
+ logger.error("prepare environ ... failed")
+ return 2
+
+ logger.info("prepare environ ... ok")
+
+ # osc build
+ for repo in repos:
+ if repo["state"] == "excluded" and repo["mpac"] == "raspberrypi-kernel":
+ logger.info("repo %s:%s excluded", repo["repo"], repo["mpac"])
+ continue
+ root_build = repo["mpac"] in self.PACKAGES_USE_ROOT
+ if not OBSProxy.build_package(
+ project, self._package, repo["repo"], self._arch, spec, repo["mpac"],
+ root_build=root_build, disable_cpio=True):
+ logger.error("build %s ... failed", repo["repo"])
+ return 3
+
+ logger.info("build %s ... ok", repo["repo"])
+
+ logger.debug("build all repos ... finished")
+
+ return 0
+
+ def _handle_package_meta(self, project, obs_work_dir, code_path):
+ """
+        Rebuild the "_service" meta file; an original file looks roughly like:
+        <services>
+            <service name="tar_scm_repo">
+                <param name="scm">repo</param>
+                <param name="url">next/openEuler/perl-Archive-Zip</param>
+            </service>
+        </services>
+        :param project: obs project
+        :param obs_work_dir: obs working directory
+        :param code_path: code directory
+        :return:
+ """
+ _service_file_path = os.path.join(obs_work_dir, project, self._package, "_service")
+ tree = ElementTree.parse(_service_file_path)
+
+ logger.info("before update meta------")
+ ElementTree.dump(tree)
+ sys.stdout.flush()
+
+ services = tree.findall("service")
+
+ for service in services:
+ if service.get("name") == "tar_scm_repo_docker":
+ service.set("name", "tar_local")
+ elif service.get("name") == "tar_scm_repo":
+ service.set("name", "tar_local")
+ elif service.get("name") == "tar_scm_kernel_repo":
+ service.set("name", "tar_local_kernel")
+ elif service.get("name") == "tar_scm_kernels_repo":
+ service.set("name", "tar_local_kernels")
+ elif service.get("name") == "tar_scm":
+ service.set("name", "tar_local_kernel")
+
+ for param in service.findall("param"):
+ if param.get("name") == "scm":
+ param.text = "local"
+ elif param.get("name") == "tar_scm":
+ param.text = "tar_local"
+ elif param.get("name") == "url":
+ if "openEuler_kernel" in param.text or "LTS_kernel" in param.text \
+ or "openEuler-kernel" in param.text \
+ or "openEuler-20.09_kernel" in param.text:
+ param.text = "{}/{}".format(code_path, "code") # kernel special logical
+ else:
+ gitee_repo = re.sub(r"\.git", "", param.text.split("/")[-1])
+ param.text = "{}/{}".format(code_path, gitee_repo)
+
+ logger.info("after update meta------")
+
+ ElementTree.dump(tree)
+ sys.stdout.flush()
+ tree.write(_service_file_path)
+
+ def _prepare_build_environ(self, project, obs_work_dir):
+ """
+ 准备obs build环境
+ :param project: obs项目
+ :param obs_work_dir: obs工作目录
+ :return:
+ """
+ _process_perl_path = os.path.realpath(os.path.join(os.path.realpath(__file__), "../process_service.pl"))
+ _service_file_path = os.path.join(obs_work_dir, project, self._package, "_service")
+ _obs_package_path = os.path.join(obs_work_dir, project, self._package)
+
+ cmd = "perl {} -f {} -p {} -m {} -w {}".format(
+ _process_perl_path, _service_file_path, project, self._package, _obs_package_path)
+
+ ret, _, _ = shell_cmd_live(cmd, verbose=True)
+
+ if ret:
+ logger.error("prepare build environ error, %s", ret)
+ return False
+
+ return True
+
+ def build(self, spec, work_dir, code_dir):
+ """
+ 入口
+ :param spec: 指定spec文件
+ :param work_dir: obs工作目录
+ :param code_dir: 代码目录
+ :return:
+ """
+ if self._branch in self.BUILD_IGNORED_GITEE_BRANCH:
+ logger.error("branch \"%s\" ignored", self._branch)
+ return 0
+ if self._branch.lower() in Constant.STOPPED_MAINTENANCE_BRANCH:
+ logger.error("branch \"%s\" is no longer maintained!", self._branch)
+ return 1
+ if self._branch not in Constant.GITEE_BRANCH_PROJECT_MAPPING:
+ logger.error("branch \"%s\" not support yet", self._branch)
+ return 1
+
+ has_any_repo_build = False
+ for project in Constant.GITEE_BRANCH_PROJECT_MAPPING.get(self._branch):
+ logger.debug("start build project %s", project)
+
+ obs_repos = self.get_need_build_obs_repos(project)
+ if not obs_repos:
+ logger.info("all repos ignored of project %s", project)
+ continue
+
+ logger.debug("build obs repos: %s", obs_repos)
+ has_any_repo_build = True
+ ret = self.build_obs_repos(project, obs_repos, spec, work_dir, code_dir)
+ if ret > 0:
+ logger.debug("build run return %s", ret)
+ logger.error("build %s %s %s ... %s", project, self._package, self._arch, "failed")
+ return 1 # finish if any error
+ else:
+ logger.info("build %s %s %s ... %s", project, self._package, self._arch, "ok")
+ break
+
+ # if no repo build, regard as fail
+ if not has_any_repo_build:
+ logger.error("package not in any obs projects, please add package into obs")
+ return 1
+
+ return 0
+
+
+def init_args():
+ """
+ init args
+ :return:
+ """
+ parser = argparse.ArgumentParser()
+
+ parser.add_argument("-p", type=str, dest="package", help="obs package")
+ parser.add_argument("-a", type=str, dest="arch", help="build arch")
+ parser.add_argument("-b", type=str, dest="branch", help="target branch that merged to ")
+ parser.add_argument("-c", type=str, dest="code", help="code dir path")
+ parser.add_argument("-w", type=str, dest="workspace", default=os.getcwd(), help="obs workspace dir path")
+
+ parser.add_argument("-m", type=str, dest="comment_id", help="uniq comment id")
+ parser.add_argument("-r", type=str, dest="repo", help="repo")
+ parser.add_argument("--pr", type=str, dest="pr", help="pull request")
+ parser.add_argument("-t", type=str, dest="account", help="gitee account")
+
+ parser.add_argument("-o", type=str, dest="owner", default="src-openeuler", help="gitee owner")
+ parser.add_argument("--spec", type=str, dest="spec", default="", help="spec files")
+
+ return parser.parse_args()
+
+
+if "__main__" == __name__:
+ args = init_args()
+
+ _ = not os.path.exists("log") and os.mkdir("log")
+ logger_conf_path = os.path.realpath(os.path.join(os.path.realpath(__file__), "../../conf/logger.conf"))
+ logging.config.fileConfig(logger_conf_path)
+ logger = logging.getLogger("build")
+
+ logger.info("using credential %s", args.account.split(":")[0])
+ logger.info("cloning repository https://gitee.com/%s/%s.git ", args.owner, args.repo)
+ logger.info("clone depth 1")
+ logger.info("checking out pull request %s", args.pr)
+
+ from src.utils.dist_dataset import DistDataset
+ from src.proxy.git_proxy import GitProxy
+ from src.proxy.obs_proxy import OBSProxy
+ from src.proxy.es_proxy import ESProxy
+ from src.proxy.kafka_proxy import KafkaProducerProxy
+ from src.utils.shell_cmd import shell_cmd_live
+
+ dd = DistDataset()
+ dd.set_attr_stime("spb.job.stime")
+ dd.set_attr("spb.job.link", os.environ["BUILD_URL"])
+ dd.set_attr("spb.trigger.reason", os.environ["BUILD_CAUSE"])
+
+ # suppress python warning
+ warnings.filterwarnings("ignore")
+ logging.getLogger("elasticsearch").setLevel(logging.WARNING)
+ logging.getLogger("kafka").setLevel(logging.WARNING)
+
+ kp = KafkaProducerProxy(brokers=os.environ["KAFKAURL"].split(","))
+
+ # download repo
+ dd.set_attr_stime("spb.scm.stime")
+ gp = GitProxy.init_repository(args.repo, work_dir=args.workspace)
+ repo_url = "https://{}@gitee.com/{}/{}.git".format(args.account, args.owner, args.repo)
+ if not gp.fetch_pull_request(repo_url, args.pr, depth=1):
+ logger.info("fetch finished -")
+
+ dd.set_attr("spb.scm.result", "failed")
+ dd.set_attr_etime("spb.scm.etime")
+ dd.set_attr_etime("spb.job.etime")
+ #dd.set_attr("spb.job.result", "failed")
+
+ # upload to es
+ query = {"term": {"id": args.comment_id}}
+ script = {"lang": "painless", "source": "ctx._source.spb_{}=params.spb".format(args.arch),
+ "params": dd.to_dict()}
+ kp.send("openeuler_statewall_ci_ac", key=args.comment_id, value=dd.to_dict())
+ sys.exit(-1)
+ else:
+ gp.checkout_to_commit_force("pull/{}/MERGE".format(args.pr))
+ logger.info("fetch finished +")
+ dd.set_attr("spb.scm.result", "successful")
+ dd.set_attr_etime("spb.scm.etime")
+
+ dd.set_attr_stime("spb.build.stime")
+ spb = SinglePackageBuild(args.package, args.arch, args.branch)
+ rs = spb.build(args.spec, args.workspace, args.code)
+ dd.set_attr("spb.build.result", "failed" if rs else "successful")
+ dd.set_attr_etime("spb.build.etime")
+
+ dd.set_attr_etime("spb.job.etime")
+
+ # upload to es
+ query = {"term": {"id": args.comment_id}}
+ script = {"lang": "painless", "source": "ctx._source.spb_{}=params.spb".format(args.arch), "params": dd.to_dict()}
+ kp.send("openeuler_statewall_ci_ac", key=args.comment_id, value=dd.to_dict())
+ sys.exit(rs)
diff --git a/src/build/process_service.pl b/src/build/process_service.pl
new file mode 100644
index 0000000000000000000000000000000000000000..24c299611d785838dfcfe496d121ca3a6e705fe8
--- /dev/null
+++ b/src/build/process_service.pl
@@ -0,0 +1,95 @@
+#!/usr/bin/perl -w
+
+
+use File::Spec::Functions qw(rel2abs);
+use File::Basename qw(dirname);
+use Getopt::Std;
+use POSIX;
+use Data::Dumper;
+use XML::Structured;
+use strict;
+
+our $services = [
+ 'services' =>
+ [[ 'service' =>
+ 'name',
+ 'mode', # "localonly" is skipping this service on server side, "trylocal" is trying to merge changes directly in local files, "disabled" is just skipping it
+ [[ 'param' =>
+ 'name',
+ '_content'
+ ]],
+ ]],
+];
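+# Example of a "_service" file matching the structure above (illustrative):
+#   <services>
+#     <service name="tar_local">
+#       <param name="scm">local</param>
+#       <param name="url">/path/to/code/repo</param>
+#     </service>
+#   </services>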
+
+die " USAGE: $0 -f service_file -p product -c code_dir -m module -w workdir\n" if (@ARGV < 5);
+
+our ($opt_f,$opt_p,$opt_c,$opt_m,$opt_w) =("","","","","");
+
+&getopts("Hf:p:c:m:w:");
+
+my $service_file = $opt_f if ($opt_f);
+my $product = $opt_p if ($opt_p);
+my $code_dir = $opt_c if ($opt_c);
+my $module = $opt_m if ($opt_m);
+my $myworkdir = $opt_w if ($opt_w);
+
+#open lg, ">/home/test.log";
+
+my $xml_file = readstr($service_file);
+my $serviceinfo = XMLin($services, $xml_file);
+for my $service (@{$serviceinfo->{'service'}}) {
+ #print lg "Run for ".getcwd. "/$service->{'name'}"."\n";
+ my @run;
+
+ push @run, dirname(rel2abs($0))."/$service->{'name'}";
+ for my $param (@{$service->{'param'}}) {
+ if ($service->{'name'} eq 'recompress') {
+ push @run, "--$param->{'name'}";
+ if ($param->{'name'} eq 'file') {
+ push @run, $myworkdir.'/'.$param->{'_content'};
+# print lg '--'. $param->{'name'} . " ".$myworkdir.'/'.$param->{'_content'}."\n";
+ }
+ else {
+ push @run, $param->{'_content'};
+# print lg '--'. $param->{'name'}. " " .$param->{'_content'}."\n";
+ }
+# print lg '--outdir '. $myworkdir."\n";
+ } else {
+ if ($param->{'name'} eq 'submodules'){
+ print 'skip submodules para';
+ }else{
+ next if $param->{'name'} eq 'outdir';
+ next unless $param->{'_content'};
+ push @run, "--$param->{'name'}";
+ push @run, $param->{'_content'};
+ }
+ }
+ }
+
+ push @run, "--outdir";
+ push @run, "$myworkdir";
+
+ if ($service->{'name'} =~ /tar/) {
+ push @run, "--project";
+ push @run, "$product";
+
+ push @run, "--package";
+ push @run, "$module";
+ }
+
+ print @run;
+ system(@run);
+}
+
+sub readstr {
+ my ($fn, $nonfatal) = @_;
+ local *F;
+ if (!open(F, '<', $fn)) {
+ die("$fn: $!\n") unless $nonfatal;
+ return undef;
+ }
+ my $d = '';
+ 1 while sysread(F, $d, 8192, length($d));
+ close F;
+ return $d;
+}
diff --git a/src/build/recompress b/src/build/recompress
new file mode 100644
index 0000000000000000000000000000000000000000..4daa0b5692a3bb3164d357f19d22ddf7f612f741
--- /dev/null
+++ b/src/build/recompress
@@ -0,0 +1,140 @@
+#!/bin/bash
+
+# A simple script to recompress source archives, used as an OBS source service
+#
+# (C) 2010 by Adrian Schröter
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+# See http://www.gnu.org/licenses/gpl-2.0.html for full license text.
+
+# defaults
+MYCOMPRESSION=""
+FILES=""
+SRCDIR=""
+
+while test $# -gt 0; do
+ case $1 in
+ *-compression)
+ MYCOMPRESSION="$2"
+ shift
+ ;;
+ *-file)
+      SRCDIR="${2%/*}/"
+ FILES="$FILES ${2##*/}"
+
+ echo 'SRCDIR ' $SRCDIR
+ echo 'FILES ' $FILES
+ shift
+ ;;
+ *-outdir)
+ MYOUTDIR="$2"
+ shift
+ ;;
+ *)
+ echo Unknown parameter $1.
+ echo 'Usage: recompress --compression $COMPRESSION --file $FILE --outdir $OUT'
+ exit 1
+ ;;
+ esac
+ shift
+done
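+# Example invocation (illustrative):
+#   recompress --compression xz --file /path/to/_service:tar_local:foo.tar --outdir "$PWD"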
+
+if [ -z "$MYCOMPRESSION" ]; then
+ MYCOMPRESSION="bz2"
+fi
+if [ -z "$FILES" ]; then
+ echo "ERROR: no inputs files are given via --file parameter!"
+ exit 1
+fi
+if [ -z "$MYOUTDIR" ]; then
+ echo "ERROR: no output directory is given via --outdir parameter!"
+ exit 1
+fi
+
+cd $SRCDIR
+echo `pwd`
+echo `ls`
+echo `ls $FILES`
+for i in `ls $FILES`; do
+#for i in "ls $SRCIDR"; do
+ FILE=`ls -1 "$i" || ls -1 "_service:*:$i"`
+ #FILE=`ls -1 "$i" || ls -1 "$i"`
+ if [ ! -f "$FILE" ]; then
+ echo "Unknown file $i"
+ exit 1
+ fi
+ UNCOMPRESS="cat"
+ BASENAME="$FILE"
+ if [ "${FILE%.gz}" != "$FILE" ]; then
+ UNCOMPRESS="gunzip -c"
+ BASENAME="${FILE%.gz}"
+ elif [ "${FILE%.tgz}" != "$FILE" ]; then
+ UNCOMPRESS="gunzip -c"
+ BASENAME="${FILE%.tgz}.tar"
+ elif [ "${FILE%.bz2}" != "$FILE" ]; then
+ UNCOMPRESS="bunzip2 -c"
+ BASENAME="${FILE%.bz2}"
+ elif [ "${FILE%.xz}" != "$FILE" ]; then
+ UNCOMPRESS="xz -dc"
+ BASENAME="${FILE%.xz}"
+ fi
+
+ if [ "$MYCOMPRESSION" == "gz" ]; then
+ COMPRESS="gzip -c -n --rsyncable -"
+ NEWFILE="${BASENAME#_service:}.gz"
+ elif [ "$MYCOMPRESSION" == "bz2" ]; then
+ COMPRESS="bzip2 -c -"
+ NEWFILE="${BASENAME#_service:}.bz2"
+ elif [ "$MYCOMPRESSION" == "xz" ]; then
+ COMPRESS="xz -c -"
+ NEWFILE="${BASENAME#_service:}.xz"
+ elif [ "$MYCOMPRESSION" == "none" ]; then
+ COMPRESS="cat -"
+ NEWFILE="${BASENAME#_service:}"
+ else
+ echo "ERROR: Unknown compression"
+ exit 1
+ fi
+
+ echo "pwd: ". `pwd`;
+ # do the real work
+ echo "UnCompress". $UNCOMPRESS
+ echo "file ". $FILE
+ echo "Compress". $COMPRESS
+ echo "NEWFILE ". $NEWFILE
+ $UNCOMPRESS "$FILE" | $COMPRESS > "$MYOUTDIR/$NEWFILE" || exit 1
+
+ # Check if the (compressed) target file already exists in the directory where
+ # the service is invoked and drop the newly generated one. Avoids overwriting
+ # otherwise identical files which only have different timestamps. Note that
+ # zdiff and co all fail to do that properly...
+ echo "pwd: ". `pwd`;
+ if [ -f $NEWFILE ] ; then
+ DIFF_TMPDIR=$(mktemp -d)
+ SRC_DIR="$PWD"
+ echo "SRC_DIR ". $SRC_DIR
+ echo "MYOUTDIR ". $MYOUTDIR
+ cd $DIFF_TMPDIR
+ mkdir new old
+ $(cd new ; tar -xxf "$MYOUTDIR/$NEWFILE" 2> /dev/null || mv "$MYOUTDIR/$NEWFILE" .)
+ $(cd old ; tar -xxf "$SRC_DIR/$NEWFILE" 2> /dev/null || mv "$SRC_DIR/$NEWFILE" .)
+ if diff -r new old > /dev/null ; then
+ echo "Identical target file $NEWFILE already exists, skipping.."
+ #rm -r "$MYOUTDIR/$NEWFILE"
+ else
+ echo "Compressed $FILE to $NEWFILE"
+ fi
+ cd $SRC_DIR
+ rm -r $DIFF_TMPDIR
+ else
+ echo "Compressed $FILE to $NEWFILE"
+ fi
+
+ # we can remove service files, no need to store them twice
+ rm -f "$FILE"
+done
+
+exit 0
diff --git a/src/build/related_rpm_package.py b/src/build/related_rpm_package.py
new file mode 100644
index 0000000000000000000000000000000000000000..2752431de1503119ce151a6192f9f85f0eb3f098
--- /dev/null
+++ b/src/build/related_rpm_package.py
@@ -0,0 +1,183 @@
+# -*- encoding=utf-8 -*-
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2020-09-23
+# Description: This module is used to find related rpms
+# **********************************************************************************
+
+"""
+This module is used to find related rpms
+"""
+import os
+import subprocess
+import logging
+import requests
+import tempfile
+import shutil
+from src.proxy.obs_proxy import OBSProxy
+from src.constant import Constant
+logging.basicConfig(format='%(message)s', level=logging.INFO)
+
+
+class RelatedRpms(object):
+ """
+ Found related rpms
+ """
+ def __init__(self, obs_addr, obs_repo_url, branch_name, package_arch):
+ """
+ :param repo: obs address
+ :param obs_repo_url: obs repo
+ :param project_name: project name
+ :param package_arch: aarch64/x86_64
+ """
+ self._obs_addr = obs_addr
+ self._obs_repo_url = obs_repo_url
+ self._project_name = "openEuler:Mainline"
+ self._package_arch = package_arch
+ self._arch_names = {}
+ self._branch_name = branch_name
+
+ def get_src_name(self, temp_path, rpm_name):
+ """
+ Get src name by rpm_name
+ """
+ rpm_qi_info_file = os.path.join(temp_path, "rpmqi.txt")
+ subprocess.run("rpm -qpi {} > {}".format(rpm_name, rpm_qi_info_file), shell=True, stderr=subprocess.PIPE)
+ with open(rpm_qi_info_file, 'r') as fd:
+ lines = fd.readlines()
+ fd.close()
+ for line in lines:
+ if line.startswith("Source RPM : "):
+ src_name = line.split("Source RPM : ")[-1].split("\n")[0]
+ return src_name.rsplit("-", 2)[0]
+
+ def get_requeset_by_name(self, temp_path, rpm_name, src_name):
+ """
+ Get requeset by name
+ """
+ if self._package_arch == "x86_64":
+ self._arch_names["standard_x86_64/x86_64"] = os.path.join(temp_path, "x86_64.html")
+ self._arch_names["standard_x86_64/noarch"] = os.path.join(temp_path, "x86_noarch.html")
+ else:
+ self._arch_names["standard_aarch64/aarch64"] = os.path.join(temp_path, "aarch64.html")
+ self._arch_names["standard_aarch64/noarch"] = os.path.join(temp_path, "noarch.html")
+ download_project_name = self._project_name.replace(":", ":/")
+ rpm_base_name = os.path.basename(rpm_name).rsplit("-", 2)[0]
+ rpm_arch_name = os.path.basename(rpm_name).split(".oe1")[-1]
+ rpm_final_name = ""
+ has_found = False
+ for name in self._arch_names:
+ download_req_addr = os.path.join(self._obs_repo_url, download_project_name, name)
+ logging.info("downloading index of %s", download_req_addr)
+ subprocess.run("wget -t 5 -c {} -O {} > /dev/null 2>&1".format(download_req_addr, self._arch_names[name]),
+ shell=True)
+ if not has_found:
+ with open(self._arch_names[name], "r") as fd:
+ lines = fd.readlines()
+ fd.close()
+ for lin in lines:
+ if rpm_base_name in lin and rpm_arch_name in lin:
+ if "title=\"" in lin:
+ find_rpm_name = lin.split("title=\"")[-1].split("\"")[0]
+ else:
+ find_rpm_name = lin.split("href=\"")[-1].split("\"")[0]
+ if find_rpm_name.rsplit("-", 2)[0] == rpm_base_name:
+ if find_rpm_name.split(".oe1")[-1] == rpm_arch_name:
+ rpm_final_name = find_rpm_name
+ has_found = True
+ logging.info("------------rpm_name:%s rpm_final_name:%s------------------", rpm_name, rpm_final_name)
+ rpm_requeset_info = os.path.join(temp_path, "rpm_requeset.html")
+
+ req_addr = os.path.join(self._obs_addr, "package/binary", self._project_name, src_name,
+ "standard_" + self._package_arch, self._package_arch, os.path.basename(rpm_final_name))
+
+ logging.info("find required_by info from: %s", req_addr)
+ req = requests.get(req_addr)
+        with open(rpm_requeset_info, "wb+") as fd:
+            fd.write(req.content)
+        with open(rpm_requeset_info, "r") as fd:
+            lines = fd.readlines()
+
+ required_by = set()
+ has_found_required_by = False
+ for line in lines:
+ if "Required by" in line:
+ has_found_required_by = True
+ continue
+ if has_found_required_by and "package/dependency/" in line:
+ required_by_name = line.split("\">")[-1].split("<")[0]
+ required_by.add(required_by_name)
+ continue
+ if has_found_required_by and "
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+# See http://www.gnu.org/licenses/gpl-2.0.html for full license text.
+
+SERVICE='tar_scm'
+
+set_default_params () {
+ MYSCM=""
+ MYURL=""
+ MYVERSION="_auto_"
+ MYFORMAT=""
+ MYPREFIX=""
+ MYFILENAME=""
+ MYREVISION=""
+ MYPACKAGEMETA=""
+# MYHISTORYDEPTH=""
+ INCLUDES=""
+}
+
+get_config_options () {
+ # config options for this host ?
+ if [ -f /etc/obs/services/$SERVICE ]; then
+ . /etc/obs/services/$SERVICE
+ fi
+ # config options for this user ?
+ if [ -f "$HOME"/.obs/$SERVICE ]; then
+ . "$HOME"/.obs/$SERVICE
+ fi
+}
+
+parse_params () {
+ while test $# -gt 0; do
+ case $1 in
+ *-scm)
+ MYSCM="$2"
+ shift
+ ;;
+ *-url)
+ MYURL="$2"
+ CI_PRO_NAME=${MYURL%%/*}
+ TEMP_URL="$MYURL"
+ MYURL=$TEMP_URL
+ shift
+ ;;
+ *-subdir)
+ MYSUBDIR="$2"
+ shift
+ ;;
+ *-revision)
+ MYREVISION="$2"
+ shift
+ ;;
+ *-version)
+ MYVERSION="$2"
+ shift
+ ;;
+ *-include)
+ INCLUDES="$INCLUDES $2"
+ shift
+ ;;
+ *-versionformat)
+ MYFORMAT="$2"
+ shift
+ ;;
+ *-versionprefix)
+ MYPREFIX="$2"
+ shift
+ ;;
+ *-exclude)
+ EXCLUDES="$EXCLUDES --exclude=${2#/}"
+ shift
+ ;;
+ *-filename)
+ MYFILENAME="${2#/}"
+ shift
+ ;;
+ *-package-meta)
+ MYPACKAGEMETA="${2#/}"
+ shift
+ ;;
+ *-outdir)
+ MYOUTDIR="$2"
+ shift
+ ;;
+ *-history-depth)
+ echo "history-depth parameter is obsolete and will be ignored"
+ shift
+ ;;
+ *-project)
+ MYPROJECT="$2"
+ shift
+ ;;
+ *-package)
+ MYPACKAGE="$2"
+ shift
+ ;;
+ *)
+ echo "Unknown parameter: $1"
+ echo 'Usage: $SERVICE --scm $SCM --url $URL [--subdir $SUBDIR] [--revision $REVISION] [--version $VERSION] [--include $INCLUDE]* [--exclude $EXCLUDE]* [--versionformat $FORMAT] [--versionprefix $PREFIX] [--filename $FILENAME] [--package-meta $META] --outdir $OUT'
+ exit 1
+ ;;
+ esac
+ shift
+ done
+}
+
+error () {
+ echo "ERROR: $*"
+ exit 1
+}
+
+debug () {
+ [ -n "$DEBUG_TAR_SCM" ] && echo "$*"
+}
+
+safe_run () {
+ if ! "$@"; then
+ error "$* failed; aborting!"
+ fi
+}
+
+sanitise_params () {
+ TAR_VERSION="$MYVERSION"
+
+ if [ -z "$MYSCM" ]; then
+ error "no scm is given via --scm parameter (git/svn/hg/bzr)!"
+ fi
+ if [ -z "$MYURL" ]; then
+ error "no checkout URL is given via --url parameter!"
+ fi
+ if [ -z "$MYOUTDIR" ]; then
+ error "no output directory is given via --outdir parameter!"
+ fi
+ if [ -z "$MYPROJECT" ]; then
+ error "no project is given via --project parameter!"
+ fi
+ if [ -z "$MYPACKAGE" ]; then
+ error "no package is given via --package parameter!"
+ fi
+
+ FILE="$MYFILENAME"
+ WD_VERSION="$MYVERSION"
+ if [ -z "$MYPACKAGEMETA" ]; then
+ EXCLUDES="$EXCLUDES --exclude=.svn"
+ fi
+ # if [ "$MYHISTORYDEPTH" == "full" ]; then
+ # MYHISTORYDEPTH="999999999"
+ # fi
+}
+
+detect_default_filename_param () {
+ if [ -n "$FILE" ]; then
+ return
+ fi
+
+ case "$MYSCM" in
+ git)
+ FILE="${MYURL%/}"
+ FILE="${FILE##*/}"
+ FILE="${FILE%.git}"
+ FILE="${FILE#*@*:}"
+ ;;
+ svn|hg|bzr)
+ FILE="${MYURL%/}"
+ FILE="${FILE##*/}"
+ ;;
+ local)
+ FILE="temp_dir"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
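+# fetch_upstream: hash the checkout URL, look for a cached copy under CACHEDIRECTORY and
+# reuse it via check_cache, otherwise do a fresh clone; finally detect the version when
+# "_auto_" or a version format was requested.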
+fetch_upstream () {
+ TOHASH="$MYURL"
+ [ "$MYSCM" = 'svn' ] && TOHASH="$TOHASH/$MYSUBDIR"
+ HASH=`echo "$TOHASH" | sha256sum | cut -d\ -f 1`
+ REPOCACHE=
+ CACHEDIRECTORY=/tmp/local_code/xdf
+ if [ -n "$CACHEDIRECTORY" ]; then
+ REPOCACHEINCOMING="$CACHEDIRECTORY/incoming"
+ REPOCACHEROOT="$CACHEDIRECTORY/repo"
+ REPOCACHE="$REPOCACHEROOT/$MYPROJECT/$MYPACKAGE"
+ REPOURLCACHE="$CACHEDIRECTORY/repourl/$HASH"
+ fi
+
+
+ debug "check local cache if configured"
+ if [ -n "$CACHEDIRECTORY" -a -d "$REPOCACHE/" ]; then
+ debug "cache hit: $REPOCACHE"
+ check_cache
+ else
+ if [ -n "$CACHEDIRECTORY" ]; then
+ debug "cache miss: $REPOCACHE/"
+ else
+ debug "cache not enabled"
+ fi
+
+ calc_dir_to_clone_to
+ debug "new $MYSCM checkout to $CLONE_TO"
+ initial_clone
+
+ if [ -n "$CACHEDIRECTORY" ]; then
+ #cache_repo
+ REPOPATH="$REPOCACHE"
+ else
+ REPOPATH="$MYOUTDIR/$FILE"
+ fi
+
+ if [ "$TAR_VERSION" == "_auto_" -o -n "$MYFORMAT" ]; then
+ detect_version
+ fi
+ #exit 22
+ fi
+
+}
+
+calc_dir_to_clone_to () {
+ if [ -n "$CACHEDIRECTORY" ]; then
+        if [ ! -d "$REPOCACHE" ]; then
+ mkdir -p "$REPOCACHE"
+ fi
+ safe_run cd "$REPOCACHE"
+ # Use dry-run mode because git/hg refuse to clone into
+ # an empty directory on SLES11
+ #debug mktemp -u -d "tmp.XXXXXXXXXX"
+ #CLONE_TO=`mktemp -u -d "tmp.XXXXXXXXXX"`
+ CLONE_TO="$REPOCACHE"
+ else
+ CLONE_TO="$FILE"
+ fi
+}
+
+initial_clone () {
+ echo "Fetching from $MYURL ..."
+
+ case "$MYSCM" in
+ git)
+ # Clone with full depth; so that the revision can be found if specified
+ safe_run git clone "$MYURL" "$CLONE_TO"
+ ;;
+ svn)
+ args=
+ [ -n "$MYREVISION" ] && args="-r$MYREVISION"
+ if [[ $(svn --version --quiet) > "1.5.99" ]]; then
+ TRUST_SERVER_CERT="--trust-server-cert"
+ fi
+ safe_run svn checkout --non-interactive $TRUST_SERVER_CERT \
+ $args "$MYURL/$MYSUBDIR" "$CLONE_TO"
+ MYSUBDIR= # repo root is subdir
+ ;;
+ local)
+ echo "xdffff: $MYURL ---- $CLONE_TO --- `pwd`"
+ safe_run ls -A $MYURL | grep -v .git | xargs -I {} cp -a $MYURL/{} .
+ if [ -e $MYURL/.git ]; then
+ safe_run rm -f $MYURL/.git/shallow
+ safe_run cp -aL $MYURL/.git .
+ fi
+ if [ -d "$MYURL/.svn" ]; then
+ safe_run cp -av $MYURL/.svn ./
+ fi
+ ;;
+ hg)
+ safe_run hg clone "$MYURL" "$CLONE_TO"
+ ;;
+ bzr)
+ args=
+ [ -n "$MYREVISION" ] && args="-r $MYREVISION"
+ safe_run bzr checkout $args "$MYURL" "$CLONE_TO"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+cache_repo () {
+ if [ -e "$REPOCACHE" ]; then
+ error "Somebody else beat us to populating the cache for $MYURL ($REPOCACHE)"
+ else
+ # FIXME: small race window here; do source services need to be thread-safe?
+ if [ ! -d $REPOCACHE ]; then
+ mkdir -p $REPOCACHE
+ fi
+ debug mv2 "$CLONE_TO" "$REPOCACHE"
+ safe_run mv "$CLONE_TO" "$REPOCACHE"
+ echo "$MYURL" > "$REPOURLCACHE"
+ echo "Cached $MYURL at $REPOCACHE"
+ fi
+}
+
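+# NOTE: the new/old version comparison below is commented out, so check_cache currently
+# always discards the cached copy and performs a fresh clone.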
+check_cache () {
+ if [ -d "$MYURL/.svn" ]; then
+ new_version=`LC_ALL=C svn info "$MYURL" | sed -n 's,^Last Changed Rev: \(.*\),\1,p'`
+ else
+ new_version="new_version"
+ fi
+ if echo "$MYURL" | grep '/$' &> /dev/null; then
+ new_version="new_version"
+ fi
+ if [ -d "$REPOCACHE/.svn" ]; then
+ old_version=`LC_ALL=C svn info "$REPOCACHE" | sed -n 's,^Last Changed Rev: \(.*\),\1,p'`
+ else
+ old_version="old_version"
+ fi
+ #echo "xdf: $new_version $old_version"
+ #if [ "$new_version" != "$old_version" ]; then
+ echo "The code has changed for $MYPROJECT/$MYPACKAGE"
+ rm -rf "$REPOCACHE"
+
+ calc_dir_to_clone_to
+ debug "new $MYSCM checkout to $CLONE_TO"
+ initial_clone
+
+ if [ -n "$CACHEDIRECTORY" ]; then
+ #cache_repo
+ REPOPATH="$REPOCACHE"
+ else
+ REPOPATH="$MYOUTDIR/$FILE"
+ fi
+
+ safe_run cd "$REPOPATH"
+ switch_to_revision
+ if [ "$TAR_VERSION" == "_auto_" -o -n "$MYFORMAT" ]; then
+ detect_version
+ fi
+}
+
+update_cache () {
+ safe_run cd "$REPOCACHE"
+
+ case "$MYSCM" in
+ git)
+ safe_run git fetch
+ ;;
+ svn)
+ args=
+ [ -n "$MYREVISION" ] && args="-r$MYREVISION"
+ safe_run svn update $args > svnupdate_info
+ isupdate=`cat svnupdate_info | wc -l`
+ if [ $isupdate -eq 1 ]; then
+ rm -f svnupdate_info
+ echo "There is no code update, so exit 22"
+ exit 22
+ fi
+ MYSUBDIR= # repo root is subdir
+ ;;
+ hg)
+ if ! out=`hg pull`; then
+ if [[ "$out" == *'no changes found'* ]]; then
+ # Contrary to the docs, hg pull returns exit code 1 when
+ # there are no changes to pull, but we don't want to treat
+ # this as an error.
+ :
+ else
+ error "hg pull failed; aborting!"
+ fi
+ fi
+ ;;
+ bzr)
+ args=
+ [ -n "$MYREVISION" ] && args="-r$MYREVISION"
+ safe_run bzr update $args
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+switch_to_revision () {
+ case "$MYSCM" in
+ git)
+ safe_run git checkout "$MYREVISION"
+ if git branch | grep -q '^\* (no branch)$'; then
+ echo "$MYREVISION does not refer to a branch, not attempting git pull"
+ else
+ safe_run git pull
+ fi
+ ;;
+ svn|bzr|local)
+ : # should have already happened via checkout or update
+ ;;
+ hg)
+ safe_run hg update "$MYREVISION"
+ ;;
+ # bzr)
+ # safe_run bzr update
+ # if [ -n "$MYREVISION" ]; then
+ # safe_run bzr revert -r "$MYREVISION"
+ # fi
+ # ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+detect_version () {
+ if [ -z "$MYFORMAT" ]; then
+ case "$MYSCM" in
+ git)
+ MYFORMAT="%at"
+ ;;
+ hg)
+ MYFORMAT="{rev}"
+ ;;
+ svn|bzr)
+ MYFORMAT="%r"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ ;;
+ esac
+ fi
+
+ safe_run cd "$REPOPATH"
+ if [ -n "$MYFORMAT" ];then
+ MYPREFIX="$MYFORMAT"
+ else
+ get_version
+ fi
+ TAR_VERSION="$MYPREFIX$version"
+}
+
+get_version () {
+ case "$MYSCM" in
+ git)
+ #version=`safe_run git show --pretty=format:"$MYFORMAT" | head -n 1`
+ version=`safe_run git log -n1 --pretty=format:"$MYFORMAT"`
+ ;;
+ svn)
+ #rev=`LC_ALL=C safe_run svn info | awk '/^Revision:/ { print $2 }'`
+ rev=`LC_ALL=C safe_run svn info | sed -n 's,^Last Changed Rev: \(.*\),\1,p'`
+ version="${MYFORMAT//%r/$rev}"
+ ;;
+ hg)
+ rev=`safe_run hg id -n`
+ version=`safe_run hg log -l1 -r$rev --template "$MYFORMAT"`
+ ;;
+ bzr)
+ #safe_run bzr log -l1 ...
+ rev=`safe_run bzr revno`
+ version="${MYFORMAT//%r/$rev}"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+prep_tree_for_tar () {
+ if [ ! -e "$REPOPATH/$MYSUBDIR" ]; then
+ error "directory does not exist: $REPOPATH/$MYSUBDIR"
+ fi
+
+ if [ -z "$TAR_VERSION" ]; then
+ TAR_BASENAME="$FILE"
+ else
+ TAR_BASENAME="${FILE}-${TAR_VERSION}"
+ fi
+
+ MYINCLUDES=""
+
+ for INC in $INCLUDES; do
+ MYINCLUDES="$MYINCLUDES $INC"
+ done
+ #if [ -z "$MYINCLUDES" ]; then
+ # MYINCLUDES="*"
+ #fi
+
+ safe_run cd "$MYOUTDIR"
+
+ if [ -n "$CACHEDIRECTORY" ]; then
+ debug cp -a "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ safe_run cp -a "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ if [ -e $REPOPATH/$MYSUBDIR/.git ]; then
+           # copying .git may fail partially; ignore failures here for now
+ cp -a "$REPOPATH/$MYSUBDIR/.git" "$TAR_BASENAME"
+ safe_run pushd "$TAR_BASENAME";git reset --hard HEAD;popd
+ fi
+ else
+ debug mv3 "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ safe_run mv "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ fi
+ if [ -z "$MYINCLUDES" ]; then
+ MYINCLUDES=`ls -A $TAR_BASENAME`
+ fi
+}
+
+create_tar () {
+ safe_run cd "$TAR_BASENAME"
+
+    # NOTE: TARFILE is not set yet at this point, so this effectively reads "$MYOUTDIR/_service";
+    # collect the "compression" and "file" values defined in the _service file.
+    compression_array=(`cat $MYOUTDIR/$TARFILE/_service | egrep '"compression"' | awk -F'>' '{print $2}' | awk -F'<' '{print $1}'`)
+    file_array=`cat $MYOUTDIR/$TARFILE/_service | egrep '"file"' | awk -F'>' '{print $2}' | awk -F'<' '{print $1}' | sed 's/\.tar$//'`
+ index=0
+ for file in $file_array
+ do
+ if echo "$TAR_BASENAME" | egrep "$file"; then
+ break
+ else
+ ((index=index+1))
+ fi
+ done
+ compression_type=${compression_array[index]}
+ if [ -e .git ]; then
+ MYINCLUDES="$MYINCLUDES .git"
+ fi
+
+ TARFILE="${TAR_BASENAME}.tar"
+ TARPATH="$MYOUTDIR/$TARFILE"
+ debug tar Pcf "$TARPATH" $EXCLUDES $MYINCLUDES
+ safe_run tar Pcf "$TARPATH" $EXCLUDES $MYINCLUDES
+
+
+ echo "Created $TARFILE"
+ safe_run cd "$MYOUTDIR"
+}
+
+cleanup () {
+ debug rm -rf "$TAR_BASENAME" "$FILE"
+ rm -rf "$TAR_BASENAME" "$FILE"
+}
+
+main () {
+ set_default_params
+ #xdf
+ DEBUG_TAR_SCM=1
+
+ if [ -z "$DEBUG_TAR_SCM" ]; then
+ get_config_options
+ else
+ # We're in test-mode, so don't let any local site-wide
+ # or per-user config impact the test suite.
+ :
+ fi
+ parse_params "$@"
+ sanitise_params
+
+ SRCDIR=$(pwd)
+ cd "$MYOUTDIR"
+ #echo "$SRCDIR $MYOUTDIR"
+ detect_default_filename_param
+
+ #xdf
+ #LOGFILE=/srv/local_code/xdf/log/$MYPROJECT/$MYPACKAGE
+ #mkdir -p "/srv/local_code/xdf/log/$MYPROJECT"
+
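+    # Crude PID lock: with LOGFILE left unset above this resolves to ".lock" in the current
+    # directory; wait for the process recorded there to exit before taking the lock.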
+ lockfile=$LOGFILE".lock"
+ if [ -f $lockfile ]; then
+ mypid=`cat $lockfile`
+ while ps -p $mypid -o comm= &> /dev/null
+ do
+ sleep 10
+ mypid=`cat $lockfile`
+ done
+ rm -f $lockfile
+ fi
+ touch $lockfile
+ echo "$$" > $lockfile
+
+ #exec 6>&1
+ #exec > $LOGFILE
+ echo "$@"
+ echo "myurl === $MYURL"
+ fetch_upstream
+
+ prep_tree_for_tar
+ create_tar
+
+ cleanup
+ rm -f $lockfile
+}
+
+main "$@"
+
+exit 0
diff --git a/src/build/tar_local_kernel b/src/build/tar_local_kernel
new file mode 100644
index 0000000000000000000000000000000000000000..fa946a93bd8a305074d1999e4372e40e950d44e8
--- /dev/null
+++ b/src/build/tar_local_kernel
@@ -0,0 +1,578 @@
+#!/bin/bash
+
+# A simple script to checkout or update a svn or git repo as source service
+#
+# (C) 2010 by Adrian Schröter
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+# See http://www.gnu.org/licenses/gpl-2.0.html for full license text.
+
+SERVICE='tar_scm'
+
+set_default_params () {
+ MYSCM=""
+ MYURL=""
+ #MYVERSION="_auto_"
+ MYVERSION="222"
+ MYFORMAT=""
+ MYPREFIX=""
+ MYFILENAME=""
+ MYREVISION=""
+ MYPACKAGEMETA=""
+# MYHISTORYDEPTH=""
+ INCLUDES=""
+}
+
+get_config_options () {
+ # config options for this host ?
+ if [ -f /etc/obs/services/$SERVICE ]; then
+ . /etc/obs/services/$SERVICE
+ fi
+ # config options for this user ?
+ if [ -f "$HOME"/.obs/$SERVICE ]; then
+ . "$HOME"/.obs/$SERVICE
+ fi
+}
+
+parse_params () {
+ while test $# -gt 0; do
+ case $1 in
+ *-scm)
+ MYSCM="$2"
+ shift
+ ;;
+ *-url)
+ MYURL="$2"
+ CI_PRO_NAME=${MYURL%%/*}
+ TEMP_URL="$MYURL"
+ MYURL=$TEMP_URL
+ shift
+ ;;
+ *-subdir)
+ MYSUBDIR="$2"
+ shift
+ ;;
+ *-revision)
+ MYREVISION="$2"
+ shift
+ ;;
+ *-version)
+ MYVERSION="$2"
+ shift
+ ;;
+ *-include)
+ INCLUDES="$INCLUDES $2"
+ shift
+ ;;
+ *-versionformat)
+ MYFORMAT="$2"
+ shift
+ ;;
+ *-versionprefix)
+ MYPREFIX="$2"
+ shift
+ ;;
+ *-exclude)
+ EXCLUDES="$EXCLUDES --exclude=${2#/}"
+ shift
+ ;;
+ *-filename)
+ MYFILENAME="${2#/}"
+ shift
+ ;;
+ *-package-meta)
+ MYPACKAGEMETA="${2#/}"
+ shift
+ ;;
+ *-outdir)
+ MYOUTDIR="$2"
+ shift
+ ;;
+ *-history-depth)
+ echo "history-depth parameter is obsolete and will be ignored"
+ shift
+ ;;
+ *-project)
+ MYPROJECT="$2"
+ shift
+ ;;
+ *-package)
+ MYPACKAGE="$2"
+ shift
+ ;;
+ *-extract)
+ shift
+ ;;
+ *)
+ echo "Unknown parameter: $1"
+ echo 'Usage: $SERVICE --scm $SCM --url $URL [--subdir $SUBDIR] [--revision $REVISION] [--version $VERSION] [--include $INCLUDE]* [--exclude $EXCLUDE]* [--versionformat $FORMAT] [--versionprefix $PREFIX] [--filename $FILENAME] [--package-meta $META] --outdir $OUT'
+ exit 1
+ ;;
+ esac
+ shift
+ done
+}
+
+error () {
+ echo "ERROR: $*"
+ exit 1
+}
+
+debug () {
+ [ -n "$DEBUG_TAR_SCM" ] && echo "$*"
+}
+
+safe_run () {
+ if ! "$@"; then
+ error "$* failed; aborting!"
+ fi
+}
+
+sanitise_params () {
+ TAR_VERSION="$MYVERSION"
+
+ if [ -z "$MYSCM" ]; then
+ error "no scm is given via --scm parameter (git/svn/hg/bzr)!"
+ fi
+ if [ -z "$MYURL" ]; then
+ error "no checkout URL is given via --url parameter!"
+ fi
+ if [ -z "$MYOUTDIR" ]; then
+ error "no output directory is given via --outdir parameter!"
+ fi
+ if [ -z "$MYPROJECT" ]; then
+ error "no project is given via --project parameter!"
+ fi
+ if [ -z "$MYPACKAGE" ]; then
+ error "no package is given via --package parameter!"
+ fi
+
+ FILE="$MYFILENAME"
+ WD_VERSION="$MYVERSION"
+ if [ -z "$MYPACKAGEMETA" ]; then
+ EXCLUDES="$EXCLUDES --exclude=.svn"
+ fi
+ # if [ "$MYHISTORYDEPTH" == "full" ]; then
+ # MYHISTORYDEPTH="999999999"
+ # fi
+}
+
+detect_default_filename_param () {
+ if [ -n "$FILE" ]; then
+ return
+ fi
+
+ case "$MYSCM" in
+ git)
+ FILE="${MYURL%/}"
+ FILE="${FILE##*/}"
+ FILE="${FILE%.git}"
+ FILE="${FILE#*@*:}"
+ ;;
+ svn|hg|bzr)
+ FILE="${MYURL%/}"
+ FILE="${FILE##*/}"
+ ;;
+ local)
+ FILE="temp_dir"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+fetch_upstream () {
+ TOHASH="$MYURL"
+ [ "$MYSCM" = 'svn' ] && TOHASH="$TOHASH/$MYSUBDIR"
+ HASH=`echo "$TOHASH" | sha256sum | cut -d\ -f 1`
+ REPOCACHE=
+ CACHEDIRECTORY=/tmp/local_code/xdf
+ if [ -n "$CACHEDIRECTORY" ]; then
+ REPOCACHEINCOMING="$CACHEDIRECTORY/incoming"
+ REPOCACHEROOT="$CACHEDIRECTORY/repo"
+ REPOCACHE="$REPOCACHEROOT/$MYPROJECT/$MYPACKAGE"
+ REPOURLCACHE="$CACHEDIRECTORY/repourl/$HASH"
+ fi
+
+
+ debug "check local cache if configured"
+ if [ -n "$CACHEDIRECTORY" -a -d "$REPOCACHE/" ]; then
+ debug "cache hit: $REPOCACHE"
+ check_cache
+ else
+ if [ -n "$CACHEDIRECTORY" ]; then
+ debug "cache miss: $REPOCACHE/"
+ else
+ debug "cache not enabled"
+ fi
+
+ calc_dir_to_clone_to
+ debug "new $MYSCM checkout to $CLONE_TO"
+ initial_clone
+
+ if [ -n "$CACHEDIRECTORY" ]; then
+ #cache_repo
+ REPOPATH="$REPOCACHE"
+ else
+ REPOPATH="$MYOUTDIR/$FILE"
+ fi
+ if [ "$TAR_VERSION" == "_auto_" -o -n "$MYFORMAT" ]; then
+ detect_version
+ fi
+ #exit 22
+ fi
+
+}
+
+calc_dir_to_clone_to () {
+ if [ -n "$CACHEDIRECTORY" ]; then
+        if [ ! -d "$REPOCACHE" ]; then
+ mkdir -p "$REPOCACHE"
+ fi
+ safe_run cd "$REPOCACHE"
+ # Use dry-run mode because git/hg refuse to clone into
+ # an empty directory on SLES11
+ #debug mktemp -u -d "tmp.XXXXXXXXXX"
+ #CLONE_TO=`mktemp -u -d "tmp.XXXXXXXXXX"`
+ CLONE_TO="$REPOCACHE"
+ else
+ CLONE_TO="$FILE"
+ fi
+}
+
+initial_clone () {
+ echo "Fetching from $MYURL ..."
+
+ case "$MYSCM" in
+ git)
+ # Clone with full depth; so that the revision can be found if specified
+ safe_run git clone "$MYURL" "$CLONE_TO"
+ ;;
+ svn)
+ args=
+ [ -n "$MYREVISION" ] && args="-r$MYREVISION"
+ if [[ $(svn --version --quiet) > "1.5.99" ]]; then
+ TRUST_SERVER_CERT="--trust-server-cert"
+ fi
+ safe_run svn checkout --non-interactive $TRUST_SERVER_CERT \
+ $args "$MYURL/$MYSUBDIR" "$CLONE_TO"
+ MYSUBDIR= # repo root is subdir
+ ;;
+ local)
+ echo "xdffff: $MYURL ---- $CLONE_TO --- `pwd`"
+ safe_run cp -av $MYURL/* ./
+ if [ -d "$MYURL/.svn" ]; then
+ safe_run cp -av $MYURL/.svn ./
+ fi
+ ;;
+ hg)
+ safe_run hg clone "$MYURL" "$CLONE_TO"
+ ;;
+ bzr)
+ args=
+ [ -n "$MYREVISION" ] && args="-r $MYREVISION"
+ safe_run bzr checkout $args "$MYURL" "$CLONE_TO"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+cache_repo () {
+ if [ -e "$REPOCACHE" ]; then
+ error "Somebody else beat us to populating the cache for $MYURL ($REPOCACHE)"
+ else
+ # FIXME: small race window here; do source services need to be thread-safe?
+ if [ ! -d $REPOCACHE ]; then
+ mkdir -p $REPOCACHE
+ fi
+ debug mv2 "$CLONE_TO" "$REPOCACHE"
+ safe_run mv "$CLONE_TO" "$REPOCACHE"
+ echo "$MYURL" > "$REPOURLCACHE"
+ echo "Cached $MYURL at $REPOCACHE"
+ fi
+}
+
+check_cache () {
+ if [ -d "$MYURL/.svn" ]; then
+ new_version=`LC_ALL=C svn info "$MYURL" | sed -n 's,^Last Changed Rev: \(.*\),\1,p'`
+ else
+ new_version="new_version"
+ fi
+ if echo "$MYURL" | grep '/$' &> /dev/null; then
+ new_version="new_version"
+ fi
+ if [ -d "$REPOCACHE/.svn" ]; then
+ old_version=`LC_ALL=C svn info "$REPOCACHE" | sed -n 's,^Last Changed Rev: \(.*\),\1,p'`
+ else
+ old_version="old_version"
+ fi
+ #echo "xdf: $new_version $old_version"
+ #if [ "$new_version" != "$old_version" ]; then
+ echo "The code has changed for $MYPROJECT/$MYPACKAGE"
+ rm -rf "$REPOCACHE"
+
+ calc_dir_to_clone_to
+ debug "new $MYSCM checkout to $CLONE_TO"
+ initial_clone
+
+ if [ -n "$CACHEDIRECTORY" ]; then
+ #cache_repo
+ REPOPATH="$REPOCACHE"
+ else
+ REPOPATH="$MYOUTDIR/$FILE"
+ fi
+
+ safe_run cd "$REPOPATH"
+ switch_to_revision
+ if [ "$TAR_VERSION" == "_auto_" -o -n "$MYFORMAT" ]; then
+ detect_version
+ fi
+ #else
+ # echo "No code is changed, so exit 22"
+ # exit 22
+ #fi
+}
+
+update_cache () {
+ safe_run cd "$REPOCACHE"
+
+ case "$MYSCM" in
+ git)
+ safe_run git fetch
+ ;;
+ svn)
+ args=
+ [ -n "$MYREVISION" ] && args="-r$MYREVISION"
+ safe_run svn update $args > svnupdate_info
+ isupdate=`cat svnupdate_info | wc -l`
+ if [ $isupdate -eq 1 ]; then
+ rm -f svnupdate_info
+ echo "There is no code update, so exit 22"
+ exit 22
+ fi
+ MYSUBDIR= # repo root is subdir
+ ;;
+ hg)
+ if ! out=`hg pull`; then
+ if [[ "$out" == *'no changes found'* ]]; then
+ # Contrary to the docs, hg pull returns exit code 1 when
+ # there are no changes to pull, but we don't want to treat
+ # this as an error.
+ :
+ else
+ error "hg pull failed; aborting!"
+ fi
+ fi
+ ;;
+ bzr)
+ args=
+ [ -n "$MYREVISION" ] && args="-r$MYREVISION"
+ safe_run bzr update $args
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+switch_to_revision () {
+ case "$MYSCM" in
+ git)
+ safe_run git checkout "$MYREVISION"
+ if git branch | grep -q '^\* (no branch)$'; then
+ echo "$MYREVISION does not refer to a branch, not attempting git pull"
+ else
+ safe_run git pull
+ fi
+ ;;
+ svn|bzr|local)
+ : # should have already happened via checkout or update
+ ;;
+ hg)
+ safe_run hg update "$MYREVISION"
+ ;;
+ # bzr)
+ # safe_run bzr update
+ # if [ -n "$MYREVISION" ]; then
+ # safe_run bzr revert -r "$MYREVISION"
+ # fi
+ # ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+detect_version () {
+ if [ -z "$MYFORMAT" ]; then
+ case "$MYSCM" in
+ git)
+ MYFORMAT="%at"
+ ;;
+ hg)
+ MYFORMAT="{rev}"
+ ;;
+ svn|bzr)
+ MYFORMAT="%r"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ ;;
+ esac
+ fi
+
+ safe_run cd "$REPOPATH"
+ if [ -n "$MYFORMAT" ];then
+ MYPREFIX="$MYFORMAT"
+ else
+ get_version
+ fi
+ TAR_VERSION="$MYPREFIX$version"
+}
+
+get_version () {
+ case "$MYSCM" in
+ git)
+ #version=`safe_run git show --pretty=format:"$MYFORMAT" | head -n 1`
+ version=`safe_run git log -n1 --pretty=format:"$MYFORMAT"`
+ ;;
+ svn)
+ #rev=`LC_ALL=C safe_run svn info | awk '/^Revision:/ { print $2 }'`
+ rev=`LC_ALL=C safe_run svn info | sed -n 's,^Last Changed Rev: \(.*\),\1,p'`
+ version="${MYFORMAT//%r/$rev}"
+ ;;
+ hg)
+ rev=`safe_run hg id -n`
+ version=`safe_run hg log -l1 -r$rev --template "$MYFORMAT"`
+ ;;
+ bzr)
+ #safe_run bzr log -l1 ...
+ rev=`safe_run bzr revno`
+ version="${MYFORMAT//%r/$rev}"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+prep_tree_for_tar () {
+ if [ ! -e "$REPOPATH/$MYSUBDIR" ]; then
+ error "directory does not exist: $REPOPATH/$MYSUBDIR"
+ fi
+
+ if [ -z "$TAR_VERSION" ]; then
+ TAR_BASENAME="$FILE"
+ else
+ TAR_BASENAME="${FILE}-${TAR_VERSION}"
+ fi
+
+ MYINCLUDES=""
+
+ for INC in $INCLUDES; do
+ MYINCLUDES="$MYINCLUDES $INC"
+ done
+ if [ -z "$MYINCLUDES" ]; then
+ MYINCLUDES="*"
+ fi
+
+ safe_run cd "$MYOUTDIR"
+
+ if [ -n "$CACHEDIRECTORY" ]; then
+ debug cp -a "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ safe_run cp -a "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ else
+ debug mv3 "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ safe_run mv "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ fi
+}
+
+create_tar () {
+ safe_run cd "$TAR_BASENAME"
+
+ TARFILE="${TAR_BASENAME}.tar.bz2"
+ TARPATH="$MYOUTDIR/$TARFILE"
+
+ for INC in $MYINCLUDES; do
+ if [ "$INC" = ".$MYSCM" ]; then
+ continue
+ fi
+
+ if echo "$EXCLUDES" | grep -w $INC >/dev/null
+ then
+ continue
+ fi
+
+ if [ -d $INC ]; then
+ #safe_run tar jcf "$MYOUTDIR/$INC.tar.bz2" --exclude=.$MYSCM --exclude=.svn $INC
+ safe_run tar Pcf "$MYOUTDIR/$INC.tar" --exclude=.$MYSCM --exclude=.svn $INC
+ continue
+ fi
+
+ safe_run cp $INC "$MYOUTDIR/"
+ done
+
+ echo "Created $TARFILE"
+ safe_run cd "$MYOUTDIR"
+}
+
+cleanup () {
+ debug rm -rf "$TAR_BASENAME" "$FILE"
+ #rm -rf "$TAR_BASENAME" "$FILE"
+ rm -rf "$TAR_BASENAME"
+}
+
+main () {
+ set_default_params
+ #xdf
+ DEBUG_TAR_SCM=1
+
+ if [ -z "$DEBUG_TAR_SCM" ]; then
+ get_config_options
+ else
+ # We're in test-mode, so don't let any local site-wide
+ # or per-user config impact the test suite.
+ :
+ fi
+ parse_params "$@"
+ sanitise_params
+
+ SRCDIR=$(pwd)
+ cd "$MYOUTDIR"
+ #echo "$SRCDIR $MYOUTDIR"
+ detect_default_filename_param
+
+ #xdf
+ #LOGFILE=/srv/local_code/xdf/log/$MYPROJECT/$MYPACKAGE
+ #mkdir -p "/srv/local_code/xdf/log/$MYPROJECT"
+
+ lockfile=$LOGFILE".lock"
+ if [ -f $lockfile ]; then
+ mypid=`cat $lockfile`
+ # while ps -p $mypid -o comm= &> /dev/null
+ # do
+ # sleep 10
+ # mypid=`cat $lockfile`
+ # done
+ rm -f $lockfile
+ fi
+ touch $lockfile
+ echo "$$" > $lockfile
+
+ #exec 6>&1
+ #exec > $LOGFILE
+ echo "$@"
+ echo "myurl === $MYURL"
+ fetch_upstream
+
+ prep_tree_for_tar
+ create_tar
+
+ cleanup
+ rm -f $lockfile
+}
+
+main "$@"
+
+exit 0
diff --git a/src/build/tar_local_kernels b/src/build/tar_local_kernels
new file mode 100644
index 0000000000000000000000000000000000000000..2e26233df814a1dc04e55ddf58ada74666f803fe
--- /dev/null
+++ b/src/build/tar_local_kernels
@@ -0,0 +1,585 @@
+#!/bin/bash
+
+# A simple script to checkout or update a svn or git repo as source service
+#
+# (C) 2010 by Adrian Schröter
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+# See http://www.gnu.org/licenses/gpl-2.0.html for full license text.
+
+set -x
+SERVICE='tar_scm'
+
+set_default_params () {
+ MYSCM=""
+ MYURL=""
+ #MYVERSION="_auto_"
+ MYVERSION="222"
+ MYFORMAT=""
+ MYPREFIX=""
+ MYFILENAME=""
+ MYREVISION=""
+ MYPACKAGEMETA=""
+# MYHISTORYDEPTH=""
+ INCLUDES=""
+}
+
+get_config_options () {
+ # config options for this host ?
+ if [ -f /etc/obs/services/$SERVICE ]; then
+ . /etc/obs/services/$SERVICE
+ fi
+ # config options for this user ?
+ if [ -f "$HOME"/.obs/$SERVICE ]; then
+ . "$HOME"/.obs/$SERVICE
+ fi
+}
+
+parse_params () {
+ while test $# -gt 0; do
+ case $1 in
+ *-scm)
+ MYSCM="$2"
+ shift
+ ;;
+ *-url)
+ MYURL="$2"
+ CI_PRO_NAME=${MYURL%%/*}
+ TEMP_URL="$MYURL"
+ MYURL=$TEMP_URL
+ shift
+ ;;
+ *-subdir)
+ MYSUBDIR="$2"
+ shift
+ ;;
+ *-revision)
+ MYREVISION="$2"
+ shift
+ ;;
+ *-version)
+ MYVERSION="$2"
+ shift
+ ;;
+ *-include)
+ INCLUDES="$INCLUDES $2"
+ shift
+ ;;
+ *-versionformat)
+ MYFORMAT="$2"
+ shift
+ ;;
+ *-versionprefix)
+ MYPREFIX="$2"
+ shift
+ ;;
+ *-exclude)
+ EXCLUDES="$EXCLUDES --exclude=${2#/}"
+ shift
+ ;;
+ *-filename)
+ MYFILENAME="${2#/}"
+ shift
+ ;;
+ *-package-meta)
+ MYPACKAGEMETA="${2#/}"
+ shift
+ ;;
+ *-outdir)
+ MYOUTDIR="$2"
+ shift
+ ;;
+ *-history-depth)
+ echo "history-depth parameter is obsolete and will be ignored"
+ shift
+ ;;
+ *-project)
+ MYPROJECT="$2"
+ shift
+ ;;
+ *-package)
+ MYPACKAGE="$2"
+ shift
+ ;;
+ *)
+ echo "Unknown parameter: $1"
+ echo 'Usage: $SERVICE --scm $SCM --url $URL [--subdir $SUBDIR] [--revision $REVISION] [--version $VERSION] [--include $INCLUDE]* [--exclude $EXCLUDE]* [--versionformat $FORMAT] [--versionprefix $PREFIX] [--filename $FILENAME] [--package-meta $META] --outdir $OUT'
+ exit 1
+ ;;
+ esac
+ shift
+ done
+}
+
+error () {
+ echo "ERROR: $*"
+ exit 1
+}
+
+debug () {
+ [ -n "$DEBUG_TAR_SCM" ] && echo "$*"
+}
+
+safe_run () {
+ if ! "$@"; then
+ error "$* failed; aborting!"
+ fi
+}
+
+sanitise_params () {
+ TAR_VERSION="$MYVERSION"
+
+ if [ -z "$MYSCM" ]; then
+ error "no scm is given via --scm parameter (git/svn/hg/bzr)!"
+ fi
+ if [ -z "$MYURL" ]; then
+ error "no checkout URL is given via --url parameter!"
+ fi
+ if [ -z "$MYOUTDIR" ]; then
+ error "no output directory is given via --outdir parameter!"
+ fi
+ if [ -z "$MYPROJECT" ]; then
+ error "no project is given via --project parameter!"
+ fi
+ if [ -z "$MYPACKAGE" ]; then
+ error "no package is given via --package parameter!"
+ fi
+
+ FILE="$MYFILENAME"
+ WD_VERSION="$MYVERSION"
+ if [ -z "$MYPACKAGEMETA" ]; then
+ EXCLUDES="$EXCLUDES --exclude=.svn"
+ fi
+ # if [ "$MYHISTORYDEPTH" == "full" ]; then
+ # MYHISTORYDEPTH="999999999"
+ # fi
+}
+
+detect_default_filename_param () {
+ if [ -n "$FILE" ]; then
+ return
+ fi
+
+ case "$MYSCM" in
+ git)
+ FILE="${MYURL%/}"
+ FILE="${FILE##*/}"
+ FILE="${FILE%.git}"
+ FILE="${FILE#*@*:}"
+ ;;
+ svn|hg|bzr)
+ FILE="${MYURL%/}"
+ FILE="${FILE##*/}"
+ ;;
+ local)
+ FILE="temp_dir"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+fetch_upstream () {
+ TOHASH="$MYURL"
+ [ "$MYSCM" = 'svn' ] && TOHASH="$TOHASH/$MYSUBDIR"
+ HASH=`echo "$TOHASH" | sha256sum | cut -d\ -f 1`
+ REPOCACHE=
+ CACHEDIRECTORY=/tmp/local_code/xdf
+ if [ -n "$CACHEDIRECTORY" ]; then
+ REPOCACHEINCOMING="$CACHEDIRECTORY/incoming"
+ REPOCACHEROOT="$CACHEDIRECTORY/repo"
+ REPOCACHE="$REPOCACHEROOT/$MYPROJECT/$MYPACKAGE"
+ REPOURLCACHE="$CACHEDIRECTORY/repourl/$HASH"
+ fi
+
+
+ debug "check local cache if configured"
+ if [ -n "$CACHEDIRECTORY" -a -d "$REPOCACHE/" ]; then
+ debug "cache hit: $REPOCACHE"
+ check_cache
+ else
+ if [ -n "$CACHEDIRECTORY" ]; then
+ debug "cache miss: $REPOCACHE/"
+ else
+ debug "cache not enabled"
+ fi
+
+ calc_dir_to_clone_to
+ debug "new $MYSCM checkout to $CLONE_TO"
+ initial_clone
+
+ if [ -n "$CACHEDIRECTORY" ]; then
+ #cache_repo
+ REPOPATH="$REPOCACHE"
+ else
+ REPOPATH="$MYOUTDIR/$FILE"
+ fi
+ if [ "$TAR_VERSION" == "_auto_" -o -n "$MYFORMAT" ]; then
+ detect_version
+ fi
+
+ #exit 22
+ fi
+
+}
+
+calc_dir_to_clone_to () {
+ if [ -n "$CACHEDIRECTORY" ]; then
+        if [ ! -d "$REPOCACHE" ]; then
+ mkdir -p "$REPOCACHE"
+ fi
+ safe_run cd "$REPOCACHE"
+ # Use dry-run mode because git/hg refuse to clone into
+ # an empty directory on SLES11
+ #debug mktemp -u -d "tmp.XXXXXXXXXX"
+ #CLONE_TO=`mktemp -u -d "tmp.XXXXXXXXXX"`
+ CLONE_TO="$REPOCACHE"
+ else
+ CLONE_TO="$FILE"
+ fi
+}
+
+initial_clone () {
+ echo "Fetching from $MYURL ..."
+
+ case "$MYSCM" in
+ git)
+ # Clone with full depth; so that the revision can be found if specified
+ safe_run git clone "$MYURL" "$CLONE_TO"
+ ;;
+ svn)
+ args=
+ [ -n "$MYREVISION" ] && args="-r$MYREVISION"
+ if [[ $(svn --version --quiet) > "1.5.99" ]]; then
+ TRUST_SERVER_CERT="--trust-server-cert"
+ fi
+ safe_run svn checkout --non-interactive $TRUST_SERVER_CERT \
+ $args "$MYURL/$MYSUBDIR" "$CLONE_TO"
+ MYSUBDIR= # repo root is subdir
+ ;;
+ local)
+ echo "xdffff: $MYURL ---- $CLONE_TO --- `pwd`"
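+            # Normalize the local path, copy the sources into a <pkgname> directory, move
+            # the spec file up, read Version from it, and rename the directory to
+            # <pkgname>-<version> so the tarball gets a versioned top-level directory.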
+ MYURL=`echo $MYURL | sed 's#\./##g' | sed 's/[ /]*$//g'`
+ pkgname=`basename $MYURL`
+ safe_run mkdir $pkgname
+ safe_run cp -av $MYURL/* $pkgname
+ safe_run mv $pkgname/*.spec .
+ if [ -f /usr/bin/rpmspec ]
+ then
+ version=`rpmspec -q --srpm --qf %{Version} *.spec`
+ else
+ version=`grep "^Version:*" *.spec | awk -F: '{print $2}' | sed 's/[ ]*//g'`
+ fi
+ pkg="${pkgname}-${version}"
+ safe_run mv $pkgname $pkg
+ if [ -d "$MYURL/.svn" ]; then
+ safe_run cp -av $MYURL/.svn ./
+ fi
+ ;;
+ hg)
+ safe_run hg clone "$MYURL" "$CLONE_TO"
+ ;;
+ bzr)
+ args=
+ [ -n "$MYREVISION" ] && args="-r $MYREVISION"
+ safe_run bzr checkout $args "$MYURL" "$CLONE_TO"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+cache_repo () {
+ if [ -e "$REPOCACHE" ]; then
+ error "Somebody else beat us to populating the cache for $MYURL ($REPOCACHE)"
+ else
+ # FIXME: small race window here; do source services need to be thread-safe?
+ if [ ! -d $REPOCACHE ]; then
+ mkdir -p $REPOCACHE
+ fi
+ debug mv2 "$CLONE_TO" "$REPOCACHE"
+ safe_run mv "$CLONE_TO" "$REPOCACHE"
+ echo "$MYURL" > "$REPOURLCACHE"
+ echo "Cached $MYURL at $REPOCACHE"
+ fi
+}
+
+check_cache () {
+ if [ -d "$MYURL/.svn" ]; then
+ new_version=`LC_ALL=C svn info "$MYURL" | sed -n 's,^Last Changed Rev: \(.*\),\1,p'`
+ else
+ new_version="new_version"
+ fi
+ if echo "$MYURL" | grep '/$' &> /dev/null; then
+ new_version="new_version"
+ fi
+ if [ -d "$REPOCACHE/.svn" ]; then
+ old_version=`LC_ALL=C svn info "$REPOCACHE" | sed -n 's,^Last Changed Rev: \(.*\),\1,p'`
+ else
+ old_version="old_version"
+ fi
+ #echo "xdf: $new_version $old_version"
+ #if [ "$new_version" != "$old_version" ]; then
+ echo "The code has changed for $MYPROJECT/$MYPACKAGE"
+ rm -rf "$REPOCACHE"
+
+ calc_dir_to_clone_to
+ debug "new $MYSCM checkout to $CLONE_TO"
+ initial_clone
+
+ if [ -n "$CACHEDIRECTORY" ]; then
+ #cache_repo
+ REPOPATH="$REPOCACHE"
+ else
+ REPOPATH="$MYOUTDIR/$FILE"
+ fi
+
+ safe_run cd "$REPOPATH"
+ switch_to_revision
+ if [ "$TAR_VERSION" == "_auto_" -o -n "$MYFORMAT" ]; then
+ detect_version
+ fi
+ #else
+ # echo "No code is changed, so exit 22"
+ # exit 22
+ #fi
+}
+
+update_cache () {
+ safe_run cd "$REPOCACHE"
+
+ case "$MYSCM" in
+ git)
+ safe_run git fetch
+ ;;
+ svn)
+ args=
+ [ -n "$MYREVISION" ] && args="-r$MYREVISION"
+ safe_run svn update $args > svnupdate_info
+ isupdate=`cat svnupdate_info | wc -l`
+ if [ $isupdate -eq 1 ]; then
+ rm -f svnupdate_info
+ echo "There is no code update, so exit 22"
+ exit 22
+ fi
+ MYSUBDIR= # repo root is subdir
+ ;;
+ hg)
+ if ! out=`hg pull`; then
+ if [[ "$out" == *'no changes found'* ]]; then
+ # Contrary to the docs, hg pull returns exit code 1 when
+ # there are no changes to pull, but we don't want to treat
+ # this as an error.
+ :
+ else
+ error "hg pull failed; aborting!"
+ fi
+ fi
+ ;;
+ bzr)
+ args=
+ [ -n "$MYREVISION" ] && args="-r$MYREVISION"
+ safe_run bzr update $args
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+switch_to_revision () {
+ case "$MYSCM" in
+ git)
+ safe_run git checkout "$MYREVISION"
+ if git branch | grep -q '^\* (no branch)$'; then
+ echo "$MYREVISION does not refer to a branch, not attempting git pull"
+ else
+ safe_run git pull
+ fi
+ ;;
+ svn|bzr|local)
+ : # should have already happened via checkout or update
+ ;;
+ hg)
+ safe_run hg update "$MYREVISION"
+ ;;
+ # bzr)
+ # safe_run bzr update
+ # if [ -n "$MYREVISION" ]; then
+ # safe_run bzr revert -r "$MYREVISION"
+ # fi
+ # ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+detect_version () {
+ if [ -z "$MYFORMAT" ]; then
+ case "$MYSCM" in
+ git)
+ MYFORMAT="%at"
+ ;;
+ hg)
+ MYFORMAT="{rev}"
+ ;;
+ svn|bzr)
+ MYFORMAT="%r"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ ;;
+ esac
+ fi
+
+ safe_run cd "$REPOPATH"
+ [ -n "$MYPREFIX" ] && MYPREFIX="$MYPREFIX."
+ get_version
+ TAR_VERSION="$MYPREFIX$version"
+}
+
+get_version () {
+ case "$MYSCM" in
+ git)
+ #version=`safe_run git show --pretty=format:"$MYFORMAT" | head -n 1`
+ version=`safe_run git log -n1 --pretty=format:"$MYFORMAT"`
+ ;;
+ svn)
+ #rev=`LC_ALL=C safe_run svn info | awk '/^Revision:/ { print $2 }'`
+ rev=`LC_ALL=C safe_run svn info | sed -n 's,^Last Changed Rev: \(.*\),\1,p'`
+ version="${MYFORMAT//%r/$rev}"
+ ;;
+ hg)
+ rev=`safe_run hg id -n`
+ version=`safe_run hg log -l1 -r$rev --template "$MYFORMAT"`
+ ;;
+ bzr)
+ #safe_run bzr log -l1 ...
+ rev=`safe_run bzr revno`
+ version="${MYFORMAT//%r/$rev}"
+ ;;
+ *)
+ error "unknown SCM '$MYSCM'"
+ esac
+}
+
+prep_tree_for_tar () {
+ if [ ! -e "$REPOPATH/$MYSUBDIR" ]; then
+ error "directory does not exist: $REPOPATH/$MYSUBDIR"
+ fi
+
+ if [ -z "$TAR_VERSION" ]; then
+ TAR_BASENAME="$FILE"
+ else
+ TAR_BASENAME="${FILE}-${TAR_VERSION}"
+ fi
+
+ MYINCLUDES=""
+
+ for INC in $INCLUDES; do
+ MYINCLUDES="$MYINCLUDES $INC"
+ done
+ if [ -z "$MYINCLUDES" ]; then
+ MYINCLUDES="*"
+ fi
+
+ safe_run cd "$MYOUTDIR"
+
+ if [ -n "$CACHEDIRECTORY" ]; then
+ debug cp -a "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ safe_run cp -a "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ else
+ debug mv3 "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ safe_run mv "$REPOPATH/$MYSUBDIR" "$TAR_BASENAME"
+ fi
+}
+
+create_tar () {
+ safe_run cd "$TAR_BASENAME"
+
+ TARFILE="${TAR_BASENAME}.tar.bz2"
+ TARPATH="$MYOUTDIR/$TARFILE"
+
+ for INC in $MYINCLUDES; do
+ if [ "$INC" = ".$MYSCM" ]; then
+ continue
+ fi
+
+ if echo "$EXCLUDES" | grep -w $INC >/dev/null
+ then
+ continue
+ fi
+
+ if [ -d $INC ]; then
+ #safe_run tar jcf "$MYOUTDIR/$INC.tar.bz2" --exclude=.$MYSCM --exclude=.svn $INC
+ safe_run tar Pcf "$MYOUTDIR/$INC.tar" --exclude=.$MYSCM --exclude=.svn $INC
+ continue
+ fi
+
+ safe_run cp $INC "$MYOUTDIR/"
+ done
+
+ echo "Created $TARFILE"
+ safe_run cd "$MYOUTDIR"
+}
+
+cleanup () {
+ debug rm -rf "$TAR_BASENAME" "$FILE"
+ rm -rf "$TAR_BASENAME" "$FILE"
+}
+
+main () {
+ set_default_params
+ #xdf
+ DEBUG_TAR_SCM=1
+
+ if [ -z "$DEBUG_TAR_SCM" ]; then
+ get_config_options
+ else
+ # We're in test-mode, so don't let any local site-wide
+ # or per-user config impact the test suite.
+ :
+ fi
+ parse_params "$@"
+ sanitise_params
+
+ SRCDIR=$(pwd)
+ cd "$MYOUTDIR"
+ #echo "$SRCDIR $MYOUTDIR"
+ detect_default_filename_param
+
+ #xdf
+ #LOGFILE=/srv/local_code/xdf/log/$MYPROJECT/$MYPACKAGE
+ #mkdir -p "/srv/local_code/xdf/log/$MYPROJECT"
+
+ lockfile=$LOGFILE".lock"
+ if [ -f $lockfile ]; then
+ mypid=`cat $lockfile`
+ # while ps -p $mypid -o comm= &> /dev/null
+ # do
+ # sleep 10
+ # mypid=`cat $lockfile`
+ # done
+ rm -f $lockfile
+ fi
+ touch $lockfile
+ echo "$$" > $lockfile
+
+ #exec 6>&1
+ #exec > $LOGFILE
+ echo "$@"
+ echo "myurl === $MYURL"
+ fetch_upstream
+
+ prep_tree_for_tar
+ create_tar
+
+ cleanup
+ rm -f $lockfile
+}
+
+main "$@"
+
+exit 0
diff --git a/src/constant.py b/src/constant.py
new file mode 100644
index 0000000000000000000000000000000000000000..d948d8683478489240321b90f2279609f0ebfe78
--- /dev/null
+++ b/src/constant.py
@@ -0,0 +1,117 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2021-12-22
+# Description: class Constant
+# **********************************************************************************
+"""
+
+
+class Constant(object):
+ """
+ class Constant
+ """
+
+ SUPPORT_ARCH = ["x86_64", "aarch64"]
+
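+    # Maps a gitee branch name to the OBS projects that are searched and built for that branch.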
+ GITEE_BRANCH_PROJECT_MAPPING = {
+ "master": ["bringInRely", "openEuler:Extras", "openEuler:Factory", "openEuler:Mainline", "openEuler:Epol",
+ "openEuler:BaseTools", "openEuler:C", "openEuler:Common_Languages_Dependent_Tools",
+ "openEuler:Erlang", "openEuler:Golang", "openEuler:Java", "openEuler:KernelSpace", "openEuler:Lua",
+ "openEuler:Meson", "openEuler:MultiLanguage", "openEuler:Nodejs", "openEuler:Ocaml",
+ "openEuler:Perl", "openEuler:Python", "openEuler:Qt", "openEuler:Ruby"],
+ "openEuler-20.03-LTS": ["openEuler:20.03:LTS"],
+ "openEuler-20.03-LTS-Next": ["openEuler:20.03:LTS:Next", "openEuler:20.03:LTS:Next:Epol"],
+ "openEuler-EPOL-LTS": ["bringInRely"],
+ "openEuler-20.09": ["openEuler:20.09", "openEuler:20.09:Epol", "openEuler:20.09:Extras"],
+ "mkopeneuler-20.03": ["openEuler:Extras"],
+ "openEuler-20.03-LTS-SP1": ["openEuler:20.03:LTS:SP1", "openEuler:20.03:LTS:SP1:Epol",
+ "openEuler:20.03:LTS:SP1:Extras"],
+ "openEuler-20.03-LTS-SP2": ["openEuler:20.03:LTS:SP2", "openEuler:20.03:LTS:SP2:Epol",
+ "openEuler:20.03:LTS:SP2:Extras"],
+ "openEuler-21.03": ["openEuler:21.03", "openEuler:21.03:Epol", "openEuler:21.03:Extras"],
+ "openEuler-21.09": ["openEuler:21.09", "openEuler:21.09:Epol", "openEuler:21.09:Extras"],
+ "openEuler-20.03-LTS-SP3": ["openEuler:20.03:LTS:SP3", "openEuler:20.03:LTS:SP3:Epol"],
+ "openEuler-22.03-LTS-Next": ["openEuler:22.03:LTS:Next", "openEuler:22.03:LTS:Next:Epol"],
+ "openEuler-22.03-LTS": ["openEuler:22.03:LTS", "openEuler:22.03:LTS:Epol"],
+ "openEuler-22.03-LTS-SP1": ["openEuler:22.03:LTS:SP1", "openEuler:22.03:LTS:SP1:Epol"],
+ "openEuler-22.09": ["openEuler:22.09", "openEuler:22.09:Epol"],
+ "Multi-Version_obs-server-2.10.11_openEuler-22.09": [
+ "openEuler:22.09:Epol:Multi-Version:obs-server:2.10.11",
+ "openEuler:22.09", "openEuler:22.09:Epol"],
+ "oepkg_openstack-train_oe-20.03-LTS-SP1": ["openEuler:20.03:LTS:SP1:oepkg:openstack:train",
+ "openEuler:20.03:LTS:SP1",
+ "openEuler:20.03:LTS:SP1:Epol"],
+ "oepkg_openstack-common_oe-20.03-LTS-SP2": ["openEuler:20.03:LTS:SP2:oepkg:openstack:common",
+ "openEuler:20.03:LTS:SP2"],
+ "oepkg_openstack-queens_oe-20.03-LTS-SP2": ["openEuler:20.03:LTS:SP2:oepkg:openstack:queens",
+ "openEuler:20.03:LTS:SP2:oepkg:openstack:common",
+ "openEuler:20.03:LTS:SP2"],
+ "oepkg_openstack-rocky_oe-20.03-LTS-SP2": ["openEuler:20.03:LTS:SP2:oepkg:openstack:rocky",
+ "openEuler:20.03:LTS:SP2:oepkg:openstack:common",
+ "openEuler:20.03:LTS:SP2"],
+ "oepkg_openstack-common_oe-20.03-LTS-Next": ["openEuler:20.03:LTS:Next:oepkg:openstack:common",
+ "openEuler:20.03:LTS:Next"],
+ "oepkg_openstack-queens_oe-20.03-LTS-Next": ["openEuler:20.03:LTS:Next:oepkg:openstack:queens",
+ "openEuler:20.03:LTS:Next:oepkg:openstack:common",
+ "openEuler:20.03:LTS:Next"],
+ "oepkg_openstack-rocky_oe-20.03-LTS-Next": ["openEuler:20.03:LTS:Next:oepkg:openstack:rocky",
+ "openEuler:20.03:LTS:Next:oepkg:openstack:common",
+ "openEuler:20.03:LTS:Next"],
+ "oepkg_openstack-common_oe-20.03-LTS-SP3": ["openEuler:20.03:LTS:SP3:oepkg:openstack:common",
+ "openEuler:20.03:LTS:SP3"],
+ "oepkg_openstack-queens_oe-20.03-LTS-SP3": ["openEuler:20.03:LTS:SP3:oepkg:openstack:queens",
+ "openEuler:20.03:LTS:SP3:oepkg:openstack:common",
+ "openEuler:20.03:LTS:SP3"],
+ "oepkg_openstack-rocky_oe-20.03-LTS-SP3": ["openEuler:20.03:LTS:SP3:oepkg:openstack:rocky",
+ "openEuler:20.03:LTS:SP3:oepkg:openstack:common",
+ "openEuler:20.03:LTS:SP3"],
+ "Multi-Version_OpenStack-Train_openEuler-22.03-LTS-Next": [
+ "openEuler:22.03:LTS:Next:Epol:Multi-Version:OpenStack:Train",
+ "openEuler:22.03:LTS:Next", "openEuler:22.03:LTS:Next:Epol"],
+ "Multi-Version_OpenStack-Wallaby_openEuler-22.03-LTS-Next": [
+ "openEuler:22.03:LTS:Next:Epol:Multi-Version:OpenStack:Wallaby",
+ "openEuler:22.03:LTS:Next", "openEuler:22.03:LTS:Next:Epol"],
+ "Multi-Version_OpenStack-Train_openEuler-22.03-LTS": [
+ "openEuler:22.03:LTS:Epol:Multi-Version:OpenStack:Train",
+ "openEuler:22.03:LTS", "openEuler:22.03:LTS:Epol"],
+ "Multi-Version_OpenStack-Wallaby_openEuler-22.03-LTS": [
+ "openEuler:22.03:LTS:Epol:Multi-Version:OpenStack:Wallaby",
+ "openEuler:22.03:LTS", "openEuler:22.03:LTS:Epol"],
+ "Multi-Version_OpenStack-Train_openEuler-22.03-LTS-SP1": [
+ "openEuler:22.03:LTS:SP1:Epol:Multi-Version:OpenStack:Train",
+ "openEuler:22.03:LTS:SP1", "openEuler:22.03:LTS:SP1:Epol"],
+ "Multi-Version_OpenStack-Wallaby_openEuler-22.03-LTS-SP1": [
+ "openEuler:22.03:LTS:SP1:Epol:Multi-Version:OpenStack:Wallaby",
+ "openEuler:22.03:LTS:SP1", "openEuler:22.03:LTS:SP1:Epol"],
+ "Multi-Version_obs-server-2.10.11_openEuler-22.03-LTS": [
+ "openEuler:22.03:LTS:Epol:Multi-Version:obs-server:2.10.11",
+ "openEuler:22.03:LTS", "openEuler:22.03:LTS:Epol"],
+ "openEuler-22.03-LTS-LoongArch": [
+ "openEuler:22.03:LTS:LoongArch", "openEuler:22.03:LTS", "openEuler:22.03:LTS:Epol"],
+ "openEuler-22.03-LTS-performance": [
+ "gcc-performance", "openEuler:22.03:LTS", "openEuler:22.03:LTS:Epol"],
+ "Multi-Version_obs-server-2.10.11_openEuler-22.03-LTS-SP1": [
+ "openEuler:22.03:LTS:SP1:Epol:Multi-Version:obs-server:2.10.11",
+ "openEuler:22.03:LTS:SP1", "openEuler:22.03:LTS:SP1:Epol"]
+ }
+
+ COMPARE_PACKAGE_BLACKLIST = [
+ r'^/etc/ima/digest_lists/0-metadata_list-compact*',
+ r'^/etc/ima/digest_lists.tlv/0-metadata_list-compact_tlv*'
+ ]
+
+ STOPPED_MAINTENANCE_BRANCH = ["openeuler-20.03-lts-next", "openeuler-20.03-lts", "openeuler-20.03-lts-sp2",
+ "openeuler-21.03", "openeuler-21.09"]
+
+ ALARM_LTS_BRANCH = ["openeuler-20.03-lts-sp1", "openeuler-20.03-lts-sp3", "openeuler-22.03-lts"]
diff --git a/src/logger.py b/src/logger.py
new file mode 100644
index 0000000000000000000000000000000000000000..8e70244dd06d20ab06ca3fbaffb3022ab0f1167f
--- /dev/null
+++ b/src/logger.py
@@ -0,0 +1,54 @@
+# -*- encoding=utf-8 -*-
+"""
+# **********************************************************************************
+# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
+# [openeuler-jenkins] is licensed under the Mulan PSL v2.
+# You can use this software according to the terms and conditions of the Mulan PSL v2.
+# You may obtain a copy of Mulan PSL v2 at:
+# http://license.coscl.org.cn/MulanPSL2
+# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+# EITHER EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT,
+# MERCHANTABILITY OR FIT FOR A PARTICULAR PURPOSE.
+# See the Mulan PSL v2 for more details.
+# Author:
+# Create: 2021-3-21
+# Description: unified console and file logger for ci scripts
+# **********************************************************************************
+"""
+import os
+import logging
+import sys
+
+from src.utils.color_log import CusColoredFormatter
+
+
+def singleton(cls):
+ instances = {}
+
+ def _singleton(*args, **kwargs):
+ if cls not in instances:
+ instances[cls] = cls(*args, **kwargs)
+ return instances[cls]
+
+ return _singleton
+
+
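+# With @singleton applied, every Logger() call returns the same instance, so importing
+# "logger" from this module anywhere in the project yields one shared, configured logger.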
+@singleton
+class Logger:
+ def __init__(self):
+ self.logger = logging.getLogger('MyLogger')
+ self.logger.setLevel(logging.DEBUG)
+ console_handler = logging.StreamHandler(sys.stdout)
+ console_handler.setLevel(logging.INFO)
+ console_formatter = CusColoredFormatter(fmt="%(log_color)s%(asctime)s [%(levelname)7s] : %(message)s")
+ console_handler.setFormatter(console_formatter)
+ self.logger.addHandler(console_handler)
+
+ file_handler = logging.FileHandler("{}/ci.log".format(os.path.dirname(__file__)))
+ file_handler.setLevel(logging.DEBUG)
+ file_formatter = logging.Formatter("%(asctime)s %(filename)20s[line:%(lineno)3d] %(levelname)7s : %(message)s")
+ file_handler.setFormatter(file_formatter)
+ self.logger.addHandler(file_handler)
+
+
+logger = Logger().logger
diff --git a/src/requirements b/src/requirements
new file mode 100644
index 0000000000000000000000000000000000000000..7d34070afd36a0242b173d47d037f55727678ba0
--- /dev/null
+++ b/src/requirements
@@ -0,0 +1,14 @@
+requests
+jenkinsapi
+colorlog
+threadpool
+PyYAML
+gevent
+jsonpath
+mock
+tldextract
+chardet
+kafka-python
+elasticsearch
+retrying
+scanoss
diff --git a/start.png b/start.png
deleted file mode 100644
index bbd69d6259a61d01bdb54276a06d2f78e27afbca..0000000000000000000000000000000000000000
Binary files a/start.png and /dev/null differ