diff --git a/packageship/.gitignore b/packageship/.gitignore deleted file mode 100644 index 2027f52cd9ebda8e294616082ebbf3cec258f56a..0000000000000000000000000000000000000000 --- a/packageship/.gitignore +++ /dev/null @@ -1,4 +0,0 @@ -.DS_Store -*/.DS_Store -*.pyc -*.vscode \ No newline at end of file diff --git a/packageship/README.md b/packageship/README.md deleted file mode 100644 index d349b57d295f1537b3319f0941c4515318690eba..0000000000000000000000000000000000000000 --- a/packageship/README.md +++ /dev/null @@ -1,382 +0,0 @@ -# pkgship - - - -- [pkgship](#pkgship) - - [介绍](#介绍) - - [架构](#架构) - - [软件下载](#软件下载) - - [运行环境](#运行环境) - - [安装工具](#安装工具) - - [配置参数](#配置参数) - - [服务启动和停止](#服务启动和停止) - - [工具使用](#工具使用) - - - -## 介绍 -pkgship是一款管理OS软件包依赖关系,提供依赖和被依赖关系完整图谱的查询工具,pkgship提供软件包依赖查询、生命周期管理、补丁查询等功能。 - -1. 软件包依赖查询:方便社区人员在软件包引入、更新和删除的时候了解软件的影响范围。 -2. 生命周期管理:跟踪上游软件包发布状态,方便维护人员了解当前软件状态,及时升级到合理的版本。 -3. 补丁查询:方便社区人员了解openEuler软件包的补丁情况以及提取补丁内容,详细内容请参见[patch-tracking](../patch-tracking/README.md)。 - -## 架构 - -系统采用flask-restful开发,使用SQLAlchemy ORM查询框架。 - - - -## 软件下载 - -* Repo源挂载正式发布地址: -* 源码获取地址: -* rpm包版本获取地址: - -## 运行环境 - -* 可用内存700M以上 -* python版本 3.8及以上 -* sqlite版本 3.32及以上 - -## 安装工具 -工具安装可通过以下两种方式中的任意一种实现。 - -* 方法一,通过dnf挂载repo源实现。 - 先使用dnf挂载pkgship软件在所在repo源(具体方法可参考[应用开发指南](https://openeuler.org/zh/docs/20.09/docs/ApplicationDev/%E5%BC%80%E5%8F%91%E7%8E%AF%E5%A2%83%E5%87%86%E5%A4%87.html)),然后执行如下指令下载以及安装pkgship及其依赖。 - - ```bash - dnf install pkgship - ``` -* 方法二,通过安装rpm包实现。 - 先下载pkgship的rpm包,然后执行如下命令进行安装(其中“x.x-x”表示版本号,请用实际情况代替)。 - - ```bash - rpm -ivh pkgship-x.x-x.oe1.noarch.rpm - ``` - - 或者 - - ```bash - dnf install pkgship-x.x-x.oe1.noarch.rpm - ``` - -## 配置参数 - -1. 
在配置文件中对相应参数进行配置,系统的默认配置文件存放在 /etc/pkgship/package.ini,请根据实际情况进行配置更改。 - - ```bash - vim /etc/pkgship/package.ini - ``` - - ```ini - [系统配置] - - ; 初始化数据库时导入的yaml文件存放位置,该yaml中记录导入的sqlite文件位置 - init_conf_path=/etc/pkgship/conf.yaml - - ; 存放成功导入的sqlite文件的地址 - data_base_path=/var/run/pkgship_dbs - - ; 写接口 - write_port=8080 - - ; 读接口 - query_port=8090 - - ; 写权限访问ip - write_ip_addr=127.0.0.1 - - ; 读权限访问ip - query_ip_addr=127.0.0.1 - - ; 远程服务的地址,命令行可以直接调用远程服务来完成数据请求, 只需在每个命令行后加 -remote参数 - remote_host=https://api.openeuler.org/pkgmanage - - [LOG] - - ; 日志存放路径 - log_path=/var/log/pkgship/ - - ; 打印日志级别,支持如下: - ; INFO DEBUG WARNING ERROR CRITICAL - log_level=INFO - - ; 日志名称 - log_name=log_info.log - - ; 日志文件大小达到上限后动态创建的日志的数量 - backup_count=10 - - ; 每个日志文件的大小 - max_bytes=314572800 - - [UWSGI服务配置] - - ; uwsgi 日志路径 - daemonize=/var/log/uwsgi.log - - ; 前后端传输数据大小 - buffer-size=65536 - - ; HTTP Connection time - http-timeout=600 - - ; Server response time - harakiri=600 - - [TIMEDTASK] - - ; 是否开启定时任务 - open=True - - ; 设定定时任务触发的时间 - hour=3 - minute=0 - - [LIFECYCLE] - ; 每个包的yaml地址的存储远程地址 - warehouse_remote=https://gitee.com/openeuler/openEuler-Advisor/raw/master/upstream-info/ - - ; 在执行定时任务时,可以打开多线程执行,并且可以根据服务器的配置设置线程池中的线程数 - pool_workers=10 - - ; 仓库的名字 - warehouse=src-openeuler - - ``` - -2. 创建初始化数据库的yaml配置文件: - conf.yaml 文件默认存放在 /etc/pkgship/ 路径下,pkgship会通过该配置读取要建立的数据库名称以及需要导入的sqlite文件。conf.yaml 示例如下所示。 - - ```yaml - - dbname: openEuler-20.09 - src_db_file: /etc/pkgship/src.sqlite - bin_db_file: /etc/pkgship/bin.sqlite - lifecycle: enable - priority: 1 - ``` - - > 如需更改存放路径,请更改package.ini下的 init_conf_path 选项。 - -## 服务启动和停止 - -pkgship使用uWSGI web服务器,启动和停止命令如下所示,可指定只启动读(写)服务,或同时启动。 - -```bash -pkgshipd start [manage/selfpkg] - -pkgshipd stop [manage/selfpkg] -``` - -## 工具使用 - -1. 
数据库初始化。 - > 使用场景:服务启动后,为了能查询对应的数据库(比如mainline, openEuler-20.09)中的包信息及包依赖关系,需要将这些数据库通过createrepo生成的sqlite(分为源码库和二进制库)导入进服务内,生成对应的db文件。当conf.yaml里配置数据库的参数项lifecycle声明为enable的时候,在lifecycle.db中会生成一张对应的表,用于记录数据库信息,后续需要读取数据库表名称(tablename)的操作会从此文件读取,[-filepath]为可选参数。 - - ```bash - pkgship init [-filepath path] - ``` - - > 参数说明: - > -filepath:指定初始化配置文件的路径,可以使用相对路径和绝对路径,不带参数则使用默认配置初始化。 - -2. 单包查询。 - - 用户可查询具体源码包(packagename)在指定数据库表(tablename)中的信息。 - > 使用场景:用户可查询具体源码包在指定数据库中的信息,packagename,tablename为必选参数。 - - ```bash - pkgship single packagename tablename - ``` - - > 参数说明: - > packagename:指定要查询的源码包名。 - > tablename:指定具体的数据库名称。 - -3. 所有包查询。 - - 查询数据库下包含的所有包的信息。 - > 使用场景:用户可查询指定数据库下包含的所有软件包信息。其中tablename为必选参数,[-packagename],[-maintainer]为可选参数。 - - ```bash - pkgship list tablename [-packagename pkgName] [-maintainer maintainer] - ``` - - > 参数说明: - > tablename:指定具体的数据库名称。 - > -packagename:可以匹配到包名中包含参数字符串的包。 - > -maintainer:可以匹配到maintainer为参数的包。 - -4. 安装依赖查询。 - - 查询二进制包(binaryName)的安装依赖。 - > 使用场景:用户需要安装某个二进制包A时,需要将该二进制包A的安装依赖B,及B的安装依赖C等等,直至所有的安装依赖全部安装到系统才能成功安装二进制包A。因此,在用户安装二进制包A之前,可能会需要查询二进制包A的所有安装依赖。该命令提供了此功能,允许用户根据平台默认的优先级在多个数据库之间进行查询;同时也支持用户自定义数据库查询优先级。 - - ```bash - pkgship installdep binaryName [-dbs dbName1 dbName2...] - ``` - - > 参数说明: - > -dbs:具体指定查询数据库的顺序优先级,dbName为具体的数据库名称。 - -5. 编译依赖查询。 - - 查询源码包(sourceName)的所有编译依赖。 - > 使用场景:用户要编译某个源码包A的时候,需要安装源码包A的编译依赖B, 要成功安装编译依赖B需要获取B的所有安装依赖。因此,在用户编译源码包A之前,可能会需要查询源码包的编译依赖以及这些编译依赖的所有安装依赖。该命令提供了此功能,允许用户根据平台默认的优先级在多个数据库之间进行查询;同时也支持用户自定义数据库查询优先级。 - - ```bash - pkgship builddep sourceName [-dbs dbName1 dbName2...] - ``` - - > 参数说明: - > -dbs:具体指定查询数据库的顺序优先级,dbName为具体的数据库名称。 - -6. 
自编译自安装依赖查询。 - - 查询指定二进制包(binaryName)或源码包(sourceName )的安装及编译依赖,其中[pkgName]为查询的二进制包或者源码包的名称。当查询二进制包时,可以查询到该二进制包的所有安装依赖以及该二进制包对应的源码包的编译依赖,及这些编译依赖的所有安装依赖;当查询源码包时,可以查询该源码包的编译依赖,及这些编译依赖的所有安装依赖,并且查询该源码包生成的所有二进制包的所有安装依赖。同时,配合对应参数使用,该命令也支持查询软件包的自编译依赖查询,和包含子包的依赖查询。 - - > 使用场景:如果开发者想在现有的版本库的基础上引入新的软件包,应同时引入该软件包的所有编译、安装依赖。该命令提供开发者一个同时查询这两种依赖关系的功能,能让开发者知晓该软件包会引入哪些其他的包,该命令支持查询二进制包和源码包。 - - ```bash - pkgship selfbuild [pkgName] [-dbs dbName1 dbName2 ] [-t source] [-s 1] [-w 1] - ``` - - > 参数说明: - > -dbs 指定数据库优先级,dbName为具体的数据库名称,使用示例如下。 - - > ``` bash - > pkgship selfbuild pkgName -dbs dbName1 dbName2 - > ``` - - > -t source/binary 指定查询包名pkgName为源码包还是二进制包,不加-t时,默认为二进制包。 - > -s 增加该参数表示查询软件包的所有安装依赖和所有编译依赖(即编译依赖的源码包的编译依赖),以及所有编译依赖的安装依赖。其中-s参数后面的0表示不查询自编译依赖,1表示查询自编译依赖,默认为0,可以指定1。如果不增加-s参数表示只查询软件包的所有安装依赖和一层编译依赖,以及一层编译依赖的所有安装依赖,查询自编译使用示例如下。 - - > ```bash - > pkgship selfbuild pkgName -t source -s 1 - > ``` - - > -w 增加该参数表示引入某个二进制包的时候,查询结果会显示出该二进制包对应的源码包以及该源码包生成的所有二进制包。其中-w参数后面的0表示不查询对应子包,1表示查询对应子包,默认为0,可以指定1。如果不增加-w参数表示引入某个二进制包的时候,查询结果只显示对应的源码包,查询子包使用示例如下。 - - > ```bash - > pkgship selfbuild pkgName -w 1 - > ``` - -7. 被依赖查询。 - 查询源码包(sourceName)在某数据库(dbName)中被哪些包所依赖。 - > 使用场景:针对软件源码包A,在升级或删除的情况下会影响哪些软件包,可通过该命令查询。该命令会显示源码包A生成的所有二进制包被哪些源码包(比如B)编译依赖,被哪些二进制包(比如C1)安装依赖;以及B生成的二进制包及C1被哪些源码包(比如D)编译依赖,被哪些二进制包(比如E1)安装依赖,以此类推,遍历这些二进制包的被依赖,[-w 0/1]为可选参数,使用示例如下。 - ```bash - pkgship bedepend sourceName dbName [-w 1] - ``` - - > 参数说明: - > -w (0/1):当命令后不带配置参数或者[-w 0] 时,查询结果默认不包含对应二进制包的子包;当命令后带配置参数[-w 1] 时,不仅会查询二进制包C1的被依赖关系,还会进一步去查询C1对应的源码包C生成的其他二进制包(比如:C2,C3)的被依赖关系。 - -8. 
包信息记录修改。 - > 使用场景: 用户可以修改指定源码包的维护人和维护级别。[-packagename],[-maintainer],[-maintainlevel],[-filefolder],[--batch]为可选参数。 - - 当前有两种修改方式: - 第一种,通过指定源码包名(packagename),修改源码包的维护人(Newmaintainer)和维护级别(Newmaintainlevel),示例如下。 - ```bash - pkgship updatepkg [-packagename packagename] [-maintainer Newmaintainer] [-maintainlevel Newmaintainlevel] - ``` - > 参数说明: - > -packagename:指定需要维护的包名。 - > -maintainer:指定更新包的维护人。 - > -maintainlevel:指定更新包的维护级别,值在1~4之间,默认为1。 - - 第二种,通过指定文件路径(path),批量更新包的维护人和维护级别,该命令必须添加--batch参数,示例如下。 - ```bash - pkgship updatepkg [--batch] [-filefolder path] - ``` - > 参数说明: - > -filefolder: 指定存放包信息的yaml文件,指定的目录仅能包含更新的yaml文件。 - > --batch:指定批量更新,需要和[-filefolder]参数项一起使用。 - - 用户可以通过创建文件名A.yaml指定包名为A,指定yaml内容来修改包信息。 - 包信息的yaml格式如下: - ``` - maintainer: Newmaintainer - maintainlevel: Newmaintainlevel - ``` - -9. 数据库删除。 - > 使用场景: 删除指定数据库(dbName)。 - - ```bash - pkgship rm dbName - ``` - -10. 表信息查询。 - > 使用场景: 查看当前生命周期数据库中存在的所有数据表。 - - ```bash - pkgship tables - ``` - -11. issue查询。 - > 使用场景: 查看所有的源码包下的所有issue的信息。可选参数[-packagename],[-issue_type],[-issue_status],[-maintainer],[-page N],[-pagesize pageSize]。 - - ```bash - pkgship issue [-packagename pkgName] [-issue_type issueType] [-issue_status issueStatus] [-maintainer maintainer] [-page N] [-pagesize pageSize] - ``` - - > 参数说明: - > -packagename: 指定包名进行模糊查询。 - > -issue_type: 指定issue类型进行查询。 - > -issue_status: 指定issue状态进行查询。 - > -maintainer: 指定维护人进行查询。 - > -page: 指定查询第N页的数据。 - > -pagesize: 指定每页显示的数据条目数pageSize。 - - ```bash - 指定包名进行模糊查询示例: - pkgship issue -packagename pkgName - ``` - - ```bash - 指定issue类型进行查询示例: - pkgship issue -issue_type issueType - ``` - - ```bash - 指定issue状态进行查询示例: - pkgship issue -issue_status issueStatus - ``` - - ```bash - 指定维护人进行查询示例: - pkgship issue -maintainer maintainer - ``` - - ```bash - 指定查询第N页的数据示例: - pkgship issue -page N - ``` - - ```bash - 指定每页显示的数据条目数pageSize示例: - pkgship issue -pagesize pageSize - ``` - -12. 
更新软件包的生命周期。 - - > 使用场景: 用户可指定更新生命周期表中所有软件包的issue信息,维护人和维护级别。可选参数[--issue],[--package]。 - - ```bash - pkgship update [--issue] [--package] - ``` - - > 参数说明: - > --issue: 指定更新生命周期表中所有软件包的issue信息,根据生命周期中表的软件包名去gitee爬取软件包对应的issue信息。 - > --package: 指定更新生命周期表中所有软件包的生命周期,维护人和维护级别。 - - ```bash - 更新生命周期表中所有软件包的issue信息示例: - pkgship update --issue - ``` - - ```bash - 更新生命周期表中所有软件包的生命周期,维护人和维护级别示例: - pkgship update --package - ``` diff --git a/packageship/conf.yaml b/packageship/conf.yaml deleted file mode 100644 index b4d29ee3ccb517e9a394af6d9bb0f1869ca73a4c..0000000000000000000000000000000000000000 --- a/packageship/conf.yaml +++ /dev/null @@ -1,21 +0,0 @@ -# During initialization, you can initialize multiple databases at once - -# dbname: The name of the database to be initialized -# It is recommended to use the version number as the name of the database - -# src_db_file:SQLite files related to the source package provided in the repo source - -# bin_db_file:SQLite files related to the binary package provided in the repo source - -# lifecycle:Does the imported source package data need to be stored in the life cycle database -# When you need to store in the life cycle, you need to set the value to enable -# When you do not need to store in the life cycle, you need to set the value to disable - -# priority: The priority of the database,When querying dependencies, which library is -# the first to find data the value can only be 1、 2、 3、 4 - -- dbname: $database_name - src_db_file: $path.sqlite - bin_db_file: $path.sqlite - lifecycle: enable/disable - priority: 1/2/3/4 \ No newline at end of file diff --git a/packageship/doc/design/packageLifeCycle.md b/packageship/doc/design/packageLifeCycle.md deleted file mode 100644 index bbf7e10431a840c5085277c0a0dfddeff9125cf0..0000000000000000000000000000000000000000 --- a/packageship/doc/design/packageLifeCycle.md +++ /dev/null @@ -1,773 +0,0 @@ -# pkgship 2.0 —— 生命周期 设计文档 - -## 特性描述 - -- SR-PKG-MANAGE02-AR01: 包静态信息建档管理 -- 
SR-PKG-MANAGE02-AR02: 包动态信息跟踪(生命周期) -- SR-PKG-MANAGE01-AR11: 包补丁分类管理 -- SR-PKG-MANAGE03-AR01: 特性与包关联模块 -- SR-PKG-MANAGE01-AR12: 支持网页前端显示 -- SR-PKG-MANAGE02-AR02:支持软件责任人及软件维护级别的存储和更改 - -## 依赖组件 - -- git, svn, pypi -- openEuler-Advisor/upstream-info - -## License - -Mulan V2 - -## 流程分析 - -### 外部接口清单 - -| 序号 | 接口名称 | 类型 | 说明 | 特性号 | 涉及内部函数 | -| - | - | - | - | - | - | -| 1 | /packages | GET | 支持查看所有软件包静态信息、对应特性、动态信息及issue数量统计信息 | all | - | -| 2 | /packages/tablecol | GET | 支持获取前端显示的列表信息名称 | MANAGE01-AR12 | - | -| 3 | /lifeCycle/tables | GET | 支持查看package-info数据库中所有表名 | MANAGE01-AR12 | - | -| 4 | /lifeCycle/maintainer | GET | 支持查看package-info数据库中指定表名内的所有maintainer | MANAGE02-AR02 | - | -| 5 | /lifeCycle/download/packages | GET | 支持下载package-info数据库中指定表名内的所有包信息 | MANAGE02-AR02 & SR-PKG-MANAGE02-AR01 | - | -| 6 | /lifeCycle/download/issues | GET | 支持下载package-info数据库中issue表中对应版本的所有信息 | MANAGE01-AR11 | - | -| 7 | /packages/packageInfo | GET | 支持查看指定软件包静态信息、对应特性、动态信息及issue数量统计信息以及一层依赖关系 | all | - | -| 8 | /lifeCycle/issueTrace | GET | 支持查看package-info数据库中issue表中对应版本的信息 | MANAGE01-AR11 | - | -| 9 | /lifeCycle/issueType | GET | 支持查看issue的类型 | MANAGE01-AR12 | - | -| 10 | /lifeCycle/issueStatus | GET | 支持查看issue的状态 | MANAGE01-AR12 | - | -| 11 | /lifeCycle/updatePkgInfo | PUT | 支持更新指定软件包信息字段 | MANAGE02-AR02 | - | -| 12 | /lifeCycle/importdata | POST | 批量更新或导入生命周期软件包 | MANAGE02-AR02 | - | -| 13 | /lifeCycle/issueCatch | PUT | 支持获取gitee上src_openEuler仓库中的新增issue信息 | MANAGE01-AR11 | - | - -#### 1. 
包关键信息获取接口 - -- API: /packages - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数: - - | 参数名 | 必选 | 类型 | 说明 | - | - | - | - | - | - | table_name | True | string | 数据库pkginfo下的表名,如:mainline, bringInRely| - | page_num | True | int | 当前所在页数| - | page_size | True | int | 每页显示的条数| - | query_pkg_name | False | string | 源码包名,模糊匹配 | - | maintainner | False | string | 维护人名称 | - | maintainlevel | False | int | 软件包维护级别 | - | maintain_status | False | string | 软件包维护状态 | - -- 请求参数示例: - - ```json - { - "table_name" : "Mainline", - "page_num": "1", - "page_size": "2", - "query_pkg_name" : "dnf", - "maintainner": "ruebb", - "maintainlevel": "4" - } - ``` - -- 返回体参数: - - | 参数名 | 类型 | 说明 | - | - | - | - | - | total_count | int | 总条数 | - | total_page | int | 总页数 | - | id | int | | - | name | string | 包名 | - | version | string | 版本号 | - | release | string | release号 | - | url | string | url地址 | - | rpm_license | string | license | - | feature | string | 特性 | - | maintainer | string | 维护人 | - | maintainlevel | int | 软件包维护级别 | - | release_time | string | 当前版本发布时间 | - | used_time | string | 当天减去所用版本的发布日期 | - | latest_version | string | 最新版本号 | - | latest_version_time | string | 最新版本发布时间 | - | issue | int | 该软件包仓库下的issue总数 | - -- 返回体参数示例: - - ```json - { - "code": "200", - "total_count": 10309, - "total_page": 10, - "data": [{ - "id":1, - "name": "dnf", - "version": "3.0.1", - "release": "oe1.3", - "url": "www.gitee.com/src_openEuler/dnf", - "rpm_license":"Mulan" , - "feature": "", - "maintainer": "ruebb", - "maintainlevel": "4", - "release_time":"2020-01-02", - "used_time":"180d", - "maintainer_status": "available", - "latest_version":"3.4.2", - "latest_version_time ":"2020--5-26", - "issue":"13" - }, - { - "id":2, - "name": "", - "version": "", - "release": "", - "url":"", - "rpm_license":"" , - "feature": "", - "maintainer": "", - "maintainlevel": "", - "release_time":"", - "end_of_lifecycle":"", - "maintainer_status":"", - "latest_version":"", - "latest_version_time ":"", - 
"issue":"14" - } - ], - "msg": "" - } - ``` - - | 状态码 | 场景 | 提示信息 | - | - | - | - | - | 200 | 成功 | | - | 400 | 失败 | | - | 500 | 服务器内部错误 | | - -#### 2. 列表信息获取接口 - -- API:/packages/tablecol - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数: - - | 参数名 | 必选 | 类型 | 说明 | - | - | - | - | - | - | table_name | True | string | 数据库pkginfo下的表名,如:mainline, bringInRely| - -- 请求参数示例: - - ```json - { - "table_name" : "Mainline" - } - ``` - -- 返回体参数: - - | 参数名 | 类型 | 说明 | - | - | - | - | - | data | list | 包展示列信息列表 | - | column_name | string | 列名 | - | default_selected | boolean | 默认显示在列中 | - -- 返回体参数示例: - - ```json - { - "code": "200", - "data": [{ - "column_name": "name", - "default_selected": true, - "label": "Name" - }, - { - "column_name": "version", - "default_selected": true, - "label": "Version" - }, - { - "column_name": "release", - "default_selected": true, - "label": "Release" - }, - { - "column_name": "url", - "default_selected": true, - "label": "Url" - }, - { - "column_name": "rpm_license", - "default_selected": false, - "label": "License" - }, - { - "column_name": "feature", - "default_selected": false, - "label": "Feature" - }, - { - "column_name": "maintainer", - "default_selected": true, - "label": "Maintainer" - }, - { - "column_name": "maintainlevel", - "default_selected": true, - "label": "Maintenance Level" - }, - { - "column_name": "release_time", - "default_selected": false, - "label": "Release Time" - }, - { - "column_name": "used_time", - "default_selected": true, - "label": "Used Time" - }, - { - "column_name": "latest_version", - "default_selected": false, - "label": "Latest Version" - }, - { - "column_name": "latest_version_time", - "default_selected": false, - "label": "Latest Version Release Time" - }, - { - "column_name": "issue", - "default_selected": true, - "label": "Issue" - } - ], - "msg": "" - } - ``` - -#### 3. 
版本库列表获取接口 - -- API: /lifeCycle/tables - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数:null - -- 返回体参数: - - | 参数名 | 类型 | 说明 | - | - | - | - | - | data | list | 版本库名称列表 | - -- 返回体参数示例: - - ```json - { - "code": "200", - "data":["openEuler-20.03", "openEuler-20.09", "master"], - "msg": "" - } - ``` - -#### 4.maintainer列表获取接口 - -- API:/lifeCycle/maintainer - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数: - - | 参数名 | 必选 | 类型 | 说明 | - | - | - | - | - | - | table_name | True | string | 数据库pkginfo下的表名,如:mainline, bringInRely| - -- 请求参数示例: - - ```json - { - "table_name" : "Mainline" - } - ``` - -- 返回体参数: - - | 参数名 | 类型 | 说明 | - | - | - | - | - | data | list | 维护人列表 | - -- 返回体参数示例: - - ```json - { - "code": "200", - "data":["ruebb", "jinjin", "solar-hu", "mayunbaba"] - } - ``` - -#### 5. 包所有信息Excel获取接口(待更新) - -- API: /lifeCycle/download/packages - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数: - -| 参数名 | 类型 | 说明 | -| - | - | - | -| table_name | str | 下载的版本库名称 | - -- 返回体:二进制流(excel表格) - -#### 6. 包所有issue Excel获取接口(待更新) - -- API: /lifeCycle/download/issues - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数:null - -- 返回体:二进制流(excel表格) - -#### 7. 
包详细信息获取接口 - -- API:/packages/packageInfo - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数: - - | 参数名 | 必选 | 类型 | 说明 | - | - | - | - | - | - | table_name | True | string | 数据库pkginfo下的表名,如:mainline, bringInRely| - | pkg_name | True | string | 源码包名 | - -- 请求参数示例: - - ```json - { - "table_name" : "Mainline", - "pkg_name" : "dnf" - } - ``` - -- 返回体参数: - - | 参数名 | 类型 | 说明 | - | - | - | - | - | pkg_name | string | 源码包包名 | - | version | string | 版本号 | - | release | string | release号 | - | url | string | 上游社区链接 | - | license | string | license | - | feature | string | 特性 | - | maintainer | string | 维护人 | - | maintainlevel | int | 软件包维护级别 | - | issue | int | issue数量 | - | gitee_url | string | 软件包在码云的链接 | - | summary | string | summary | - | decription | string | description | - | buildrequired | list | 源码包的编译依赖 | - | | | | - | subpack | list | 源码包对应的二进制包列表 | - | id | int | 二进制包id | - | name | string | 二进制包名 | - | | | | - | provides | list |二进制包提供的组件列表 | - | id | int | 提供的组件id | - | name | string | 提供的组件名称 | - | requiredby | list | 依赖该组件的二进制包列表 | - | | | | - | requires | list | 此二进制包依赖的组件列表 | - | id | int | 依赖的组件id | - | name | string | 依赖的组件名称 | - | providedby | string | 提供该组件的二进制包 | - -- 返回体参数示例: - - ```json - { - "code":200, - "msg":"", - "data":{ - "pkg_name": "dnf", - "version":"3.0.2", - "release":"oe.13", - "url":"github.com", - "license":"GLV", - "feature":"rpm-management", - "maintainer":"ruebb", - "maintainlevel": 4, - "gitee_url": "www.gitee.com/src_openEuler/dnf", - "issue":12, - "summary":"xxxxxxxxxxxxxxxxx", - "description":"xxxxxxxxxx", - "buildrequired":["gcc","make"], - "subpack":[ - { - "id":1, - "name":"dnf", - "provides":[ - { - "id":1, - "name":"dnf = 4.2.15-8.oe1", - "requiredby":["yum","supermin","kiwi-systemdeps","dnf-plugins-core","mock"] - } - ], - "requires":[ - { - "id":1, - "name":"libreport-filesystem", - "providedby":"libreport" - } - ] - } - ] - } - } - ``` - -#### 8. 
issue获取接口 - -- API: /lifeCycle/issueTrace - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数: - - | 参数名 | 必选 | 类型 | 说明 | - | - | - | - | - | - | table_name | True | string | 数据库pkginfo下的表名,如:mainline, bringInRely| - | page_num | True | Int | 当前所在页数 | - | page_size | True | Int | 每页显示条数 | - | pkg_name | False | string | 源码包名 | - | issue_type | False | string | issue 类型 | - | issue_status | False | string | issue 状态 | - | maintainer | False | string | 软件包负责人 | - - - 请求参数示例: - - ```json - { - "table_name" : "Mainline", - "pkg_name" : "dnf", - "page_num": "1", - "page_size": "2" - } - ``` - -- 返回体参数: - - | 参数名 | 类型 | 说明 | - | - | - | - | - | total_count | int | 总条数 | - | total_page | int | 总页数 | - | issue_id | int | issue id | - | pkg_name | string | 包名 | - | table_name | string |版本名称 | - | issue_url | string | 码云issue链接 | - | issue_title | string | issue 主题 | - | issue_content | string | issue 内容 | - | issue_status | string | issue 状态 | - | issue_type | string | issue 类型 | - | maintainer | string | 软件包维护人 | - -- 返回体参数示例: - - ```json - { - "code":200, - "msg":'', - "data":[ - { - "total_count": 10309, - "total_page": 10, - "issue_id": "I1PGWQ", - "pkg_name": "PyYaml", - "table_name": "Mainline", - "issue_url": "https://gitee.com/openeuler/openEuler-Advisor/issues/I1PGWQ", - "issue_title":"get_yaml 接口的返回值类型有str和bool,对上层调用判断不太友好,建议修", - "issue_content":"def get_yaml(self, pkg): pass\n", - "issue_status": "open", - "issue_type": "demand", - "maintainer": "ruebb" - }, - { - "issue_id": "I1OQW0", - "pkg_name": "PyYaml", - "table_name": "Mainline", - "issue_url": "https://gitee.com/openeuler/openEuler-Advisor/issues/I1OQW0", - "issue_title":"【CI加固】对识别修改对周边组件和升级影响,提交构建支持接口变更检查", - "issue_content": "1.支持C/C++接口变更检查\n2.支持/etc下的配置文件变化检查\n\n检查结果以提示信息反馈给提交人", - "issue_status": "open", - "issue_type": "bug", - "maintainer": "ruebb" - - }, - { - "issue_id": "I1O6OE", - "pkg_name": "PyYaml", - "table_name": "Mainline", - "issue_url": 
"https://gitee.com/openeuler/openEuler-Advisor/issues/I1O6OE", - "issue_title": "【pkgship】数据库结构变化,包依赖查询的sql语句需要整改。同时需要考虑provides和requires在同一个数据库中不一一对应的情况。", - "issue_content": "涉及到的包依赖查询逻辑有:\n1.包的安装依赖查询语句\n2.包的编译依赖查询语句\n3.包的被依赖查询语句\n4.包的信息查询语句", - "issue_status": "closed", - "issue_download": "", - "issue_type": "CVE", - "maintainer": "ruebb" - - } - ] - } - ``` - -#### 9. issue type列表获取接口 - -- API: /lifeCycle/issueType - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数:null -- 请求参数示例:null - -- 返回体参数: - - | 参数名 | 类型 | 说明 | - | - | - | - | - | data | list | issue type列表 | - -- 返回体参数示例: - - ```json - { - "code": "200", - "data":["CVE", "demand", "bug"] - } - ``` - -#### 10. issue status列表获取接口 - -- API: /lifeCycle/issueStatus - -- HTTP请求方式: GET - -- 数据提交方式: application/json - -- 请求参数:null -- 请求参数示例:null - -- 返回体参数: - - | 参数名 | 类型 | 说明 | - | - | - | - | - | data | list | issue status 列表 | - -- 返回体参数示例: - - ```json - { - "code": "200", - "data":["Closed", "Open", "In progress", "Rejected"] - } - ``` - -#### 11. 更新软件包信息接口 - -- API: /lifeCycle/updatePkgInfo - -- HTTP请求方式: PUT - -- 数据提交方式: application/json - -- 请求参数: - - | 参数名 | 必选 | 类型 | 说明 | - | - | - | - | - | - | table_name | True | string | 数据库pkginfo下的表名,如:mainline, bringInRely| - | pkg_name | False | string | 源码包名(当为单包更新时需要传递,批量更新时无需传递) | - | maintainer | False | string | 软件包责任人 | - | maintainlevel | False | string | 软件包维护级别 | - | batch | True | boolean | true: 批量更新 false: 单个包更新 | - | filepath | False | str | 批量更新时,存放yaml文件的文件夹路径 | - -- 请求参数示例: - - ```json - { - "table_name" : "Mainline", - "pkg_name" : "dnf", - "end_of_life": "2020-08-26", - "maintainer": "ruebb", - "maintainlevel": 4, - "batch": true, - "filepath": "/etc/upstream-info/" - } - ``` - -#### 12. 
导入包信息或批量更新包(弃用,合并到pkgship init 接口中) - -- API: /lifeCycle/importdata - -- HTTP请求方式:POST - -- 数据提交方式:application/json - -- 请求参数: - - | 参数名 | 必选 | 类型 | 说明 | - | - | - | - | - | - | dbpath | True | str | 导入的数据库文件路径 | - | tablename | True | str | 版本库的名称(表名) | - - - conf.yaml 参数示例: - - ```yaml - - dbname: bringInRely - src_db_file: /etc/database/bringInRely_src.sqlite - bin_db_file: /etc/database/bringInRely_bin.sqlite - lifecycle: enable - priority: 2 -- dbname: fedora30 - src_db_file: /etc/database/fedora30_src.sqlite - bin_db_file: /etc/database/fedora30_bin.sqlite - lifecycle: disable - priority: 3 -- dbname: mainline - src_db_file: /etc/database/mainline_src.sqlite - bin_db_file: /etc/database/mainline_bin.sqlite - lifecycle: enable - priority: 1 -- dbname: openEuler_LTS_20.03 - src_db_file: /etc/database/openEuler-src.sqlite - bin_db_file: /etc/database/openEuler-bin.sqlite - lifecycle: enable - priority: 3 - ``` - - #### 13. 获取新增issue接口 - -- API: /lifeCycle/IssueCatch - -- HTTP请求方式:PUT - -- 数据提交方式:application/json - -- 请求参数: - - | 参数名 | 必选 | 类型 | 说明 | - | - | - | - | - | - | issue_id| True | str | Issue ID | - | pkg_name| True | str | 包名称 | - - - 请求参数示例: - - ```json - { - "issue_id" : "I1PHSL", - "pkg_name" : "units" - } - ``` - -- 返回体参数: - | 状态码 | 场景 | 提示信息 | - | - | - | - | - | 2001 | 成功 | | - | 4001 | 失败 | | - | 5001 | 服务器内部错误 | | - -### python内部函数接口清单 - -| 序号 | 接口名称 | 说明 | 入参 | 出参 | -| - | - | - | - | - | -| 1 | _blurry_pack_info| 获取满足条件的包关键信息 | table_name,page_num,page_size,querypkg_name,maintainner,maintainlevel | 列表['{包1}','{包1}'] | -| 2 | _get_table_column | 获取table的列名 | table_name | 列表['列1','列2'] | -| 3 | _get_tables | 获取lifecycle数据库下tables的名字 | 无 | 列表['表名1','表名2'] | -| 4 | _get_maintainers | 获取lifecycle数据库下对应table内的maintainer列表 | table_name | 列表['维护人1','维护人2'] | -| 5 | _all_pack_info | 获取lifecycle数据库下对应table(版本库)内的所有包信息 | table_name | 列表['{包1}','{包1}'] | -| 6 | _get_issue_contents | 获取lifecycle数据库下对应版本库的所有issue信息 | table_name | 列表[{issue内容1},{issue内容2}] 
| -| 7 | _download_excel_content | 下载excel表格 | type | 文本二进制流 | -| 8 | _sub_pack | 包的一层依赖关系信息获取 | srcname,table_name | 列表['{子包1}','{子包2}'] | -| 9 | _get_specified_issue_content | 获取满足搜索条件的issue信息 | table_name,pkg_name,page_num | 列表[{issue内容1},{issue内容2}] | -| 10 | _get_issue_type | 返回issue类型 | - | 列表[issue type] | -| 11 | _get_issue_status | 返回issue状态 | - | 列表[issue status] | - -#### 外部接口请求、回显格式 - -*需和前台对齐回显格式 - -- *packages-info*: - - 静态信息:name, version, release, url, rpm_license, feature, maintainer, maintainlevel; - - 动态信息&动态信息统计:name, version,release, published time, end time, maintainer status, latest version, latest publish time - - 动态信息统计:name, version,release, 需求, cve&安全问题, 缺陷 -- *packages-info-detailed*: name, version, release, url, rpm_license, maintainer, maintainlevel, summary, description, required, subpack, subpack-provides, subpack-requires-component, subpack-requires-binary(if exist) -- *packages-issue*: list: issudId, issue-url, issue-content, issue-status, issue-download - -## 功能设计 - -### 主体流程分析 - -计算生命周期结束日期: - - -### 数据表设计 - -针对不同的版本,设计多个字段相同,表名不同的table (注:表名应于对应依赖数据库名称相同): - -- Mainline - -| 序号 | 名称 | 说明 | 类型 | 键 | 允许空 | 默认值 | -| - | - | - | - | - | - | - | -| 1 | id | 条目序号 | Int | Primary | NO | - | -| 2 | name | 源码包名 | String | NO | YES | - | -| 3 | url | URL | String | NO | YES | - | -| 4 | rpm_license | license | String | NO | YES | - | -| 5 | version | 版本号 | String | NO | YES | - | -| 6 | release | release号 | String | NO | YES | - | -| 7 | version_time | 当前版本发布时间 | String | NO | YES | - | -| 8 | end_time | 结束当前版本生命周期的时间 | String | NO | YES | - | -| 9 | maintainer_status | 生命周期状态 | String | NO | YES | "Available" | -| 10 | latest_version | 最新版本号 | String | NO | YES | - | -| 11 | latest_version_time | 最新版本发布时间 | String | NO | YES | - | -| 12 | demand | 需求 | Int | NO | NO | 0 | -| 13 | cve | cve及安全漏洞 | Int | NO | NO | 0 | -| 14 | defect | 缺陷 | Int | NO | NO | 0 | -| 15 | maintainer | 维护人 | String | NO | YES | - | -| 16 | maintainlevel | 维护级别 | 
Int | NO | YES | - | -| 17 | feature | 对应特性 | String | NO | YES | - | -| 18 | version_control | 版本控制(git,svn) | String | NO | YES | - | -| 19 | src_repo | 上游社区repo源 | String | NO | YES | - | -| 20 | tag_prefix | 版本标签 | String | NO | YES | - | -| 21 | summary | rpm包的summary | String | NO | YES | - | -| 22 | description | rpm包的description | String | NO | YES | - | - -回显简单事例: - - -生命周期终止时间定义: - -1. 若最新版本和当前版本一致,生命周期终止时间为最新发布日期的6个月后; -2. 若最新版本高于当前版本,生命周期终止时间为最新发布日期的3个月后。 - - - -## 遗留问题 diff --git a/packageship/doc/design/packageManagerDesigen.md b/packageship/doc/design/packageManagerDesigen.md deleted file mode 100644 index be6278c637dda3266e02faa494b979d312c7b546..0000000000000000000000000000000000000000 --- a/packageship/doc/design/packageManagerDesigen.md +++ /dev/null @@ -1,223 +0,0 @@ -#特性描述 -管理OS软件包依赖关系,提供依赖和被依赖关系的完整图谱查询功能,方便开发者识别软件包范围,减少依赖梳理复杂度。 -##原始需求-软件包依赖管理 -- 输入软件包A,支持查询A的所有编译依赖(新增软件包) -- 输入软件包A,支持查询A的所有安装依赖(新增软件包) -- 输入软件包A,支持查询所有安装依赖A的软件(升级,删除软件包场景) -- 输入软件包A,支持查询所有编译依赖A的软件(升级,删除软件包场景) - -#依赖组件 -- createrepo - -#License -Mulan V2 - -#流程分析 -##软件包依赖管理 - -###功能清单 -- SR-PKG-MANAGE01-AR01:支持repo数据库导入 -- SR-PKG-MANAGE01-AR02:支持对多个数据库分级查询(内部接口) -- SR-PKG-MANAGE01-AR03:支持软件包安装/编译依赖查询 -- SR-PKG-MANAGE01-AR04:支持软件包自编译/自安装依赖查询 -- SR-PKG-MANAGE01-AR05:支持被依赖查询 -- SR-PKG-MANAGE01-AR06:支持编译被依赖查询 -- SR-PKG-MANAGE01-AR07:支持前台查询和显示软件依赖关系 -在线评审意见平台,支持查询评审意见及溯源 - -##外部接口清单 - -| 序号 | 接口名称 | 类型 | 说明 | 入参 | 出参 | 特性号 | -| - | - | - | - | - | - | - | -| 1 | /packages | GET | 支持查看所有软件包信息 | dbName | *packages* | AR01 & AR02 | -| 2 | /packages | PUT | 支持更新指定软件包的信息 | *packages* | null | AR01 | -| 3 | /packages/findByPackName | GET | 支持查询指定软件包的信息 | packageName,dbName,version(option) | *packages* | AR01 & AR02 | -| 4 | /packages/findInstallDepend | POST | 支持查询指定软件包安装依赖(在一个或多个数据库中分级查询) | packageName,version(option),dbPreority | *response* | AR02 & AR03 | -| 5 | /packages/findBuildDepend | POST | 支持查询指定软件包的编译依赖(在一个或多个数据库中分级查询) | packageName、version、repoPreority | *response* | AR02 & 
AR03 | -| 6 | /packages/findSelfDepend | POST | 支持查询指定软件包的自安装/自编译依赖(在一个或多个数据库中分级查询) | packageName、version、repoPreority、withSubPack、withSelfBuild | packageName、installDepend、buildDepend、parentNode | AR02 & AR04 | -| 7 | /packages/findBeDepend | POST | 支持在数据库中查询指定软件包的所有被依赖 | packageName、version、repoPreority、withSubPack | packageName、installBeDepend、buildBeDepend、parentNode | AR05 | -| 8 | /repodatas | GET | 支持获取所有引入的版本库 | null | *Repodatas | AR01 | -| 9 | /repodatas | POST | 支持repo数据库的导入 | dbName、dbPath、priority、dbStatus、repofiles | null | AR01 | -| 10 | /repodatas | PUT | 支持版本库的更新 | dbName、dbPath、priority、dbStatus | null | AR01 | - -###python函数接口清单 - -| 序号 | 接口名称 | 说明 | 入参 | 出参 | -| - | - | - | - | - | -| 1 | get_packages | 支持查看所有软件包信息 | dbName | *packages* | -| 2 | update_package | 支持更新指定软件包信息 | *package* | null | -| 3 | query_package | 支持查询指定软件包的信息 | source_name,dbname,version(option) | *package* | -| 4 | query_install_depend | 支持查询指定软件包安装依赖(在一个或多个数据库中分级查询) | binary_name,version(option),db_preority | *response* | -| 5 | query_build_depend | 支持查询指定软件包的编译依赖(在一个或多个数据库中分级查询) | source_name,version(option),db_preority,selfbuild=1/0 | *response* | -| 6 | query_subpack | 支持查询指定源码包的子包 | source_name,version(option),db_preority | *subpack_list* | -| 7 | query_srcpack | 支持查询指定二进制包的源码包 | binary_name,version(option),dbname | source_name | -| 8 | query_self_depend | 支持查询指定软件包的自安装/自编译依赖(在一个或多个数据库中分级查询) | package_name,version(option),db_preority,withsubpack(default:0),withselfbuild(default:0) | *response* | -| 9 | query_self_be_depend | 支持在数据库中查询指定源码包的所有被依赖(在一个或多个数据库中分级查询) | source_name,version(option),db_preority,withsubpack(default:0) | *response* | -| 10 | get_db | 支持获取所有引入的版本库 | null | *dbinfo* | -| 11 | import_db | 支持repo数据库的导入 | *dbinfo* | null | -| 12 | update_db | 支持版本库的更新 | *dbinfo* | null | - -###外部config文件输入格式清单 -1.初始化配置文件(init_db.config) -``` -# dbname - 数据库名称,unique,不可重复 -# src_db_file - 包含源码包信息的sqlite 文件 -# bin_db_file - 包含二进制包信息的sqlite 文件 -# status - 
数据库状态,enable表示可用,disable表示不可用 -# priority - 1~100 default priority for user to query the information in databases - -- dbname: openEuler-20.03-LTS - src_db_file: /etc/pkgmng/dbname/primary_src.sqlite - bin_db_file: /etc/pkgmng/dbname/primary_binary.sqlite - status: enable - priority: 1 - -- dbname: openEuler-20.04-LTS - src_db_file: testdb/src - bin_db_file: testdb/bin - status: enable - priority: 2 - -- dbname: openEuler-20.05-LTS - src_db_file: testdb/src - bin_db_file: testdb/bin - status: enable - priority: 3 -``` -2.更新数据库信息(update_db.config) -``` -- dbname: openEuler-20.03-LTS - changeDBname: openEuler-LTS - addDBFile: /etc/pkgmng/dbname/primary1.sqlite - removeDBFile: /etc/pkgmng/dbname/primary2.sqlite - status: disable - priority: 4 -``` - -3.更新包的信息(package.config) -``` -#level: 维护的优先级,1-4 -- dbname: openEuler-20.03-LTS - packageName: openssh - version: 2.99 - maintainer: solar-hu - level: 3 -``` -###object -``` - - openssh - 1.3.2 - 2-66 - GLv2 - solar-hu - http://linuxcontainers.org - lxc-4.0.1.tar.gz - openEuler-20.03-LTS - - zip-devel - libmediaart-devel - - - openssh-devel - - maven-public - tomcat - - openssh-static - openssh-help - - -``` - -``` - - openEuler - 4 - enable - -``` - -#数据表设计 -- src-pack - -| 序号 | 名称 | 说明 | 类型 | 键 | 允许空 | 默认值 | -| - | - | - | - | - | - | - | -| 1 | id | 源码包条目序号 | Int | Primary | NO | - | -| 2 | name | 源码包包名 | String | | NO | - | -| 3 | version | 版本号 | String | | NO | - | -| 4 | license | 证书 | String | | NO | - | -| 5 | sourceURL | 源码包获取地址 | String | | YES | - | -| 6 | downloadURL | 下载地址获取 | String | | YES | - | -| 7 | Maintaniner | 维护责任人 | String | | YES | - | -| 8 | MaintainLevel | 维护优先级 | String | | YES | - | - -- bin-pack - -| 序号 | 名称 | 说明 | 类型 | 键 | 允许空 | 默认值 | -| - | - | - | - | - | - | - | -| 1 | id | 二进制包条目序号 | Int | Primary | NO | - | -| 2 | name | 二进制包包名 | String | | NO | - | -| 3 | version | 版本号 | String | | NO | - | -| 4 | srcIDkey | 源码包包名ID | Int | foreignkey | NO | - | - -- pack-requires - -| 序号 | 名称 | 说明 | 
类型 | 键 | 允许空 | 默认值 | -| - | - | - | - | - | - | - | -| 1 | id | 依赖组件条目序号 | Int | Primary | NO | - | -| 2 | name | 依赖组件名 | String | | NO | - | -| 3 | depProIDkey | 依赖组件对应的ID | Int | foreignkey | NO | - | -| 4 | srcIDkey | 若为源码包该值不为空,列出来的是编译依赖 | Int | foreignkey | YES | - | -| 5 | binIDkey | 若为安装包该值不为空,列出来的是安装依赖 | Int | foreignkey | YES | - | - -- pack-provides - -| 序号 | 名称 | 说明 | 类型 | 键 | 允许空 | 默认值 | -| - | - | - | - | - | - | - | -| 1 | id | 组件条目序号 | Int | Primary | NO | - | -| 2 | name | 组件名 | Int | Primary | NO | - | -| 3 | binIDkey | 提供组件的二进制包ID | Int | foreignkey | NO | - | - - -- repoCheckSame - -| 序号 | 名称 | 说明 | 类型 | 键 | 允许空 | 默认值 | -| - | - | - | - | - | - | - | -| 1 | id | repoFile条目序号 | Int | Primary | NO | - | -| 2 | name | repoFile名称 | String | | NO | - | -| 3 | md5sum | md5sum指 | String | | NO | - | - - - -#功能设计 -##主体流程分析 - - -##依赖关系梳理 -findInstallDepend: - -findBuildDepend: - -findBeDepend(withSubPack = 0): -删除源码包A造成的影响: -1.影响他的子包(A1,A2) -2.安装依赖A1,A2的二进制包 -3.编译依赖A1,A2的源码包 - -findBeDepend(withSubPack = 1): -删除源码包A造成的影响: -1.影响他的子包(A1,A2) -2.安装依赖A1,A2的二进制包(B1) -3.编译依赖A1,A2的源码包 -4.删除B1的源码包B,影响B的其他子包B2,B3 - - -#遗留问题 -- repo数据库分析,如何做数据组织 汪奕如 -- 嵌套依赖查询流程整理 汪奕如 -- svn/git监控原型验证 陈燕潘 -- gitee机机接口对齐 陈燕潘 -- 版本升级如何更新到补丁获取系统中 陈燕潘 -- web前台拓扑图UCD设计 NA -- 数据表设计 汪奕如 diff --git a/packageship/doc/design/pkgimg/Package.JPG b/packageship/doc/design/pkgimg/Package.JPG deleted file mode 100644 index 775833d113cc0d3c975fe397abb8fa73034f24da..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/Package.JPG and /dev/null differ diff --git a/packageship/doc/design/pkgimg/Repodatas.JPG b/packageship/doc/design/pkgimg/Repodatas.JPG deleted file mode 100644 index b835f470ed32da06374a1f3330d5b1b817840fb3..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/Repodatas.JPG and /dev/null differ diff --git a/packageship/doc/design/pkgimg/beDepend_1.JPG b/packageship/doc/design/pkgimg/beDepend_1.JPG deleted file mode 100644 
index 8dfd56a92936f14f2afe587926ca06f5c027f2e5..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/beDepend_1.JPG and /dev/null differ diff --git a/packageship/doc/design/pkgimg/beDepend_2.JPG b/packageship/doc/design/pkgimg/beDepend_2.JPG deleted file mode 100644 index 4db41a623a7e32f01d1353e4f966a89755b151ac..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/beDepend_2.JPG and /dev/null differ diff --git a/packageship/doc/design/pkgimg/buildDepend_1.JPG b/packageship/doc/design/pkgimg/buildDepend_1.JPG deleted file mode 100644 index cd01369fbe5e3a4075c603a62adf5252b24a2401..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/buildDepend_1.JPG and /dev/null differ diff --git a/packageship/doc/design/pkgimg/depend_flowchart.png b/packageship/doc/design/pkgimg/depend_flowchart.png deleted file mode 100644 index 7c6447afa562b3d981df8a355b043d4ee2dadef9..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/depend_flowchart.png and /dev/null differ diff --git a/packageship/doc/design/pkgimg/installDepend.JPG b/packageship/doc/design/pkgimg/installDepend.JPG deleted file mode 100644 index 0f35928860281ca6ca249200e6313dc63dab621f..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/installDepend.JPG and /dev/null differ diff --git a/packageship/doc/design/pkgimg/issue_display.png b/packageship/doc/design/pkgimg/issue_display.png deleted file mode 100644 index 7e645889998d0f9a0a474319e41527daf7ab86ae..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/issue_display.png and /dev/null differ diff --git a/packageship/doc/design/pkgimg/lifecycle.png b/packageship/doc/design/pkgimg/lifecycle.png deleted file mode 100644 index 6fd2fa672b849a08545702b3b6d966f1d3175010..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/lifecycle.png and 
/dev/null differ diff --git a/packageship/doc/design/pkgimg/lifecycle_2.png b/packageship/doc/design/pkgimg/lifecycle_2.png deleted file mode 100644 index ac2959a38a3128aa03f9ad16431edbc92c141dfd..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/lifecycle_2.png and /dev/null differ diff --git a/packageship/doc/design/pkgimg/lifecycle_display.png b/packageship/doc/design/pkgimg/lifecycle_display.png deleted file mode 100644 index 565e6381eb3cf93fbaace5b59bf4e062b38d6f68..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/lifecycle_display.png and /dev/null differ diff --git a/packageship/doc/design/pkgimg/packagemanagement.JPG b/packageship/doc/design/pkgimg/packagemanagement.JPG deleted file mode 100644 index df7151a74e051051eaeb3fb16a3abb5f0e0d2e37..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/packagemanagement.JPG and /dev/null differ diff --git a/packageship/doc/design/pkgimg/pkgship-logo.png b/packageship/doc/design/pkgimg/pkgship-logo.png deleted file mode 100644 index 53905d2ef94063ef52a8a0069448ba5a42fa2720..0000000000000000000000000000000000000000 Binary files a/packageship/doc/design/pkgimg/pkgship-logo.png and /dev/null differ diff --git a/packageship/example/annotation_specifications.py b/packageship/example/annotation_specifications.py deleted file mode 100644 index 169f1cc34d664c974212d2c995ec3f02be28786e..0000000000000000000000000000000000000000 --- a/packageship/example/annotation_specifications.py +++ /dev/null @@ -1,88 +0,0 @@ -""" -description: Function and class annotation specifications in the project -functions: test -""" -# In the PY file, if all are functions, the format of the top information is as above, -# the description information is filled in, and the function name is filled in functions -# Args: -# List the name of each parameter with a colon and a space after the name, -# Separate the description of this parameter. 
-# If the description is longer than 80 characters on a single line,
-# use a hanging indent of 2 or 4 spaces (consistent with the rest of the file).
-# The description should include the required type and meaning.
-# Returns:
-# Describes the type and semantics of the return value. If the function returns None,
-# this part can be omitted
-# Raises:
-# Possible exceptions raised
-
-
-def test(name, age):
-    """
-    Description: Function description information
-    Args:
-        name: name information
-        age: age information
-    Returns:
-        Returned information
-    Raises:
-        IOError: An error occurred accessing the bigtable.Table object.
-    """
-    name = 'tom'
-    age = 11
-    return name, age
-
-
-# description: Function and class annotation specifications in the project
-# class: SampleClass
-# In the PY file, if all are classes, the top information format is as above:
-# description fills in the description information, class fills in the class name,
-# using triple quotation marks rather than "#" comments
-# Class should have a docstring under its definition that describes
-# the class
-# If your class has attributes,
-# then there should be an Attributes section in the docstring,
-# and it should follow the same format as function parameters
-class SampleClass():
-    """
-    Summary of class here.
-    Longer class information....
-    Attributes:
-        likes_spam: A boolean indicating if we like SPAM or not.
-        eggs: An integer count of the eggs we have laid.
- """ - - def __init__(self, likes_spam=False): - """Inits SampleClass with blah.""" - self.likes_spam = likes_spam - self.eggs = "eggs" - - def public_method_one(self, egg, fun): - """ - Description: Function description information - Args: - egg: egg information - fun: fun information - Returns: - Returned information - Raises: - AttributeError - """ - self.eggs = "eggs" - egg = "egg" - fun = "fun" - return egg, fun - - def public_method_two(self, tom): - """ - Description: Function description information - Args: - tom: tom information - Returns: - Returned information - Raises: - Error - """ - self.likes_spam = True - tom = 'cat' - return tom diff --git a/packageship/images/pkgship_outline.png b/packageship/images/pkgship_outline.png deleted file mode 100644 index 6fe1247c22c6b12a83aa01a5812c444f1667b952..0000000000000000000000000000000000000000 Binary files a/packageship/images/pkgship_outline.png and /dev/null differ diff --git a/packageship/packageship/__init__.py b/packageship/packageship/__init__.py deleted file mode 100644 index 626a3f18b65858f5ad3e8e2551a606b60bd48355..0000000000000000000000000000000000000000 --- a/packageship/packageship/__init__.py +++ /dev/null @@ -1,27 +0,0 @@ -#!/usr/bin/python3 -# ****************************************************************************** -# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved. -# licensed under the Mulan PSL v2. -# You can use this software according to the terms and conditions of the Mulan PSL v2. -# You may obtain a copy of Mulan PSL v2 at: -# http://license.coscl.org.cn/MulanPSL2 -# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR -# PURPOSE. -# See the Mulan PSL v2 for more details. 
-# ******************************************************************************/ -""" -The root path of the project -""" -import os -import sys - -if "SETTINGS_FILE_PATH" not in os.environ: - os.environ["SETTINGS_FILE_PATH"] = '/etc/pkgship/package.ini' - - -# The root directory where the system is running -if getattr(sys, 'frozen', False): - BASE_PATH = os.path.dirname(os.path.realpath(sys.argv[0])) -else: - BASE_PATH = os.path.abspath(os.path.dirname(__file__)) diff --git a/packageship/packageship/application/__init__.py b/packageship/packageship/application/__init__.py deleted file mode 100644 index 8a5b157718b4031b0290d74fbf47325b46ab2588..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/__init__.py +++ /dev/null @@ -1,76 +0,0 @@ -#!/usr/bin/python3 -""" - Initial operation and configuration of the flask project -""" -from flask import Flask -from flask_session import Session -from flask_apscheduler import APScheduler -from packageship.application.settings import Config -from packageship.libs.log import setup_log -from packageship.libs.conf import configuration - -OPERATION = None - - -def _timed_task(app): - """ - Timed task function - """ - # disable=import-outside-toplevel Avoid circular import problems,so import inside the function - # pylint: disable=import-outside-toplevel - from packageship.application.apps.lifecycle.function.download_yaml import update_pkg_info - _hour = configuration.HOUR - _minute = configuration.MINUTE - if _hour < 0 or _hour > 23: - _hour = 3 - if _minute < 0 or _minute > 59: - _minute = 0 - - # disable=no-member Dynamic variable pylint is not recognized - app.apscheduler.add_job( # pylint: disable=no-member - func=update_pkg_info, id="update_package_data", trigger="cron", hour=_hour, minute=_minute) - app.apscheduler.add_job( # pylint: disable=no-member - func=update_pkg_info, - id="issue_catch", - trigger="cron", - hour=_hour, - minute=_minute, - args=(False,)) - - -def init_app(operation): - 
""" - Project initialization function - """ - app = Flask(__name__) - - # log configuration - # disable=no-member Dynamic variable pylint is not recognized - app.logger.addHandler(setup_log(Config())) # pylint: disable=no-member - - # Load configuration items - - app.config.from_object(Config()) - - # Register a scheduled task - scheduler = APScheduler() - scheduler.init_app(app) - scheduler.start() - - # Open session function - Session(app) - - # Variables OPERATION need to be modified within the function and imported in other modules - global OPERATION # pylint: disable=global-statement - OPERATION = operation - - # Register Blueprint - # disable=import-outside-toplevel Avoid circular import problems,so import inside the function - from packageship.application import apps # pylint: disable=import-outside-toplevel - for blue, api in apps.blue_point: - api.init_app(app) - app.register_blueprint(blue) - - _timed_task(app) - - return app diff --git a/packageship/packageship/application/app_global.py b/packageship/packageship/application/app_global.py deleted file mode 100644 index 0e74a7fd662139d294f088c2c8f9a8a8b4fbd224..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/app_global.py +++ /dev/null @@ -1,33 +0,0 @@ -#!/usr/bin/python3 -""" -Description: Interception before request -""" -from flask import request -from packageship import application -from packageship.application.apps.package.url import urls as package_urls -from packageship.application.apps.lifecycle.url import urls as lifecycle_urls -from packageship.application.apps.dependinfo.url import urls as dependinfo_urls - - -__all__ = ['identity_verification'] - -URLS = package_urls + lifecycle_urls + dependinfo_urls - - -def identity_verification(): - """ - Description: Requested authentication - Args: - Returns: - Raises: - """ - if request.url_rule: - url_rule = request.url_rule.rule - for _view, url, authentication in URLS: - if url.lower() == url_rule.lower() and 
application.OPERATION in authentication.keys(): - if request.method not in authentication.get(application.OPERATION): - return False - break - return True - - return False diff --git a/packageship/packageship/application/apps/__init__.py b/packageship/packageship/application/apps/__init__.py deleted file mode 100644 index 01356ef619794f38922a3f62dfb17df987c3022f..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/__init__.py +++ /dev/null @@ -1,15 +0,0 @@ -#!/usr/bin/python3 -""" -Blueprint collection trying to page -""" -from packageship.application.apps import package -from packageship.application.apps import lifecycle -from packageship.application.apps import dependinfo - -blue_point = [ # pylint: disable=invalid-name - (package.package, package.api), - (lifecycle.lifecycle, lifecycle.api), - (dependinfo.dependinfo, dependinfo.api) -] - -__all__ = ['blue_point'] diff --git a/packageship/packageship/application/apps/dependinfo/__init__.py b/packageship/packageship/application/apps/dependinfo/__init__.py deleted file mode 100644 index c54604862abc864be2a4d570ce72e62b4e9fe124..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/dependinfo/__init__.py +++ /dev/null @@ -1,28 +0,0 @@ -#!/usr/bin/python3 -# ****************************************************************************** -# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved. -# licensed under the Mulan PSL v2. -# You can use this software according to the terms and conditions of the Mulan PSL v2. -# You may obtain a copy of Mulan PSL v2 at: -# http://license.coscl.org.cn/MulanPSL2 -# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR -# PURPOSE. -# See the Mulan PSL v2 for more details. 
-# ******************************************************************************/ -from flask.blueprints import Blueprint -from flask_restful import Api -from packageship import application -from .url import urls - -dependinfo = Blueprint('dependinfo', __name__) - -# init restapi -api = Api() - -for view, url, operation in urls: - if application.OPERATION and application.OPERATION in operation.keys(): - api.add_resource(view, url) - - -__all__ = ['dependinfo', 'api'] diff --git a/packageship/packageship/application/apps/dependinfo/function/__init__.py b/packageship/packageship/application/apps/dependinfo/function/__init__.py deleted file mode 100644 index 6851032853d1af6ec5065266cf9d86c2bd1c0b3c..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/dependinfo/function/__init__.py +++ /dev/null @@ -1,12 +0,0 @@ -#!/usr/bin/python3 -# ****************************************************************************** -# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved. -# licensed under the Mulan PSL v2. -# You can use this software according to the terms and conditions of the Mulan PSL v2. -# You may obtain a copy of Mulan PSL v2 at: -# http://license.coscl.org.cn/MulanPSL2 -# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR -# PURPOSE. -# See the Mulan PSL v2 for more details. 
-# ******************************************************************************/ diff --git a/packageship/packageship/application/apps/dependinfo/function/graphcache.py b/packageship/packageship/application/apps/dependinfo/function/graphcache.py deleted file mode 100644 index 3ea54e67197fa5689a0f50facb4585134448d28b..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/dependinfo/function/graphcache.py +++ /dev/null @@ -1,130 +0,0 @@ -#!/usr/bin/python3 -# ****************************************************************************** -# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved. -# licensed under the Mulan PSL v2. -# You can use this software according to the terms and conditions of the Mulan PSL v2. -# You may obtain a copy of Mulan PSL v2 at: -# http://license.coscl.org.cn/MulanPSL2 -# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR -# PURPOSE. -# See the Mulan PSL v2 for more details. 
-# ******************************************************************************/ -""" -Use redis cache to install dependencies, compile dependencies -be dependent, and self-compile dependencies -""" -import threading -import hashlib -import json -from importlib import import_module -from functools import partial -from redis import Redis, ConnectionPool -from packageship.libs.conf import configuration - - -REDIS_CONN = Redis(connection_pool=ConnectionPool( - host=configuration.REDIS_HOST, - port=configuration.REDIS_PORT, - max_connections=configuration.REDIS_MAX_CONNECTIONS, - decode_responses=True)) -lock = threading.Lock() - - -def hash_key(encrypt_obj): - """ - After sorting the values of the dictionary type, calculate the md5 encrypted hash value - - Args: - encrypt_obj:Dictionaries that need to be computed by hash values - """ - if isinstance(encrypt_obj, dict): - encrypt_obj = {key: encrypt_obj[key] for key in sorted(encrypt_obj)} - md5 = hashlib.md5() - md5.update(str(encrypt_obj).encode('utf-8')) - return md5.hexdigest() - - -def get_module(module_path): - """ - Import a dotted module path and return the attribute/class designated by the - last name in the path. Raise ImportError if the import failed. 
-
-    Args:
-        module_path:Module path
-    """
-    try:
-        module_path, class_name = module_path.rsplit('.', 1)
-    except ValueError as err:
-        raise ImportError("%s doesn't look like a module path" %
-                          module_path) from err
-
-    module = import_module(module_path)
-
-    try:
-        return getattr(module, class_name)
-    except AttributeError as err:
-        raise ImportError('Module "%s" does not define a "%s" attribute/class' % (
-            module_path, class_name)
-        ) from err
-
-
-def _query_depend(query_parameter, depend_relation_str):
-    """
-    Retrieves dependency data from the Redis cache, or queries
-    dependencies from the database and saves them in the Redis cache
-
-    Args:
-        query_parameter:The condition of the query is a dictionary-type parameter
-        depend_relation_str:A module string of dependencies
-    Returns:
-        A dictionary form of dependency representation
-    """
-    depend_relation_key = hash_key(query_parameter)
-
-    def _get_redis_value():
-        depend_relation = REDIS_CONN.get(depend_relation_key)
-        if depend_relation:
-            # json.loads() no longer accepts an "encoding" argument in Python 3.9+;
-            # the connection pool already decodes responses to str
-            depend_relation = json.loads(depend_relation)
-        return depend_relation
-
-    if REDIS_CONN.exists(depend_relation_key):
-        return _get_redis_value()
-
-    lock.acquire()
-    if not REDIS_CONN.exists(depend_relation_key + '_flag'):
-        REDIS_CONN.set(depend_relation_key + '_flag', 'True')
-    else:
-        REDIS_CONN.set(depend_relation_key + '_flag', 'False')
-        REDIS_CONN.expire(depend_relation_key + '_flag', 600)
-    lock.release()
-    while not REDIS_CONN.exists(depend_relation_key) and \
-            REDIS_CONN.get(depend_relation_key + '_flag') == 'False':
-        pass
-    if REDIS_CONN.exists(depend_relation_key):
-        return _get_redis_value()
-    # query depend relation
-    try:
-        depend_relation = get_module(depend_relation_str)
-    except ImportError as err:
-        raise ImportError(err)
-    else:
-        _depend_result = depend_relation.query_depend_relation(query_parameter)
-        REDIS_CONN.set(depend_relation_key, json.dumps(_depend_result))
-        return _depend_result
-    finally:
REDIS_CONN.delete(depend_relation_key + '_flag') - - -self_build = partial( - _query_depend, - depend_relation_str="packageship.application.apps.dependinfo.function.singlegraph.SelfBuildDep") -install_depend = partial( - _query_depend, - depend_relation_str="packageship.application.apps.dependinfo.function.singlegraph.InstallDep") -build_depend = partial( - _query_depend, - depend_relation_str="packageship.application.apps.dependinfo.function.singlegraph.BuildDep") -bedepend = partial( - _query_depend, - depend_relation_str="packageship.application.apps.dependinfo.function.singlegraph.BeDependOn") diff --git a/packageship/packageship/application/apps/dependinfo/function/singlegraph.py b/packageship/packageship/application/apps/dependinfo/function/singlegraph.py deleted file mode 100644 index c1ab8226a473f123a0a2f375cdfd0304d2fc6722..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/dependinfo/function/singlegraph.py +++ /dev/null @@ -1,489 +0,0 @@ -#!/usr/bin/python3 -# ****************************************************************************** -# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved. -# licensed under the Mulan PSL v2. -# You can use this software according to the terms and conditions of the Mulan PSL v2. -# You may obtain a copy of Mulan PSL v2 at: -# http://license.coscl.org.cn/MulanPSL2 -# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR -# PURPOSE. -# See the Mulan PSL v2 for more details. 
-# ******************************************************************************/ -""" -Data analysis of dependency graph -""" -import random -from packageship.application.apps.package.function.searchdb import db_priority -from packageship.application.apps.package.function.constants import ResponseCode -from packageship.application.apps.package.serialize import BeDependSchema -from packageship.application.apps.package.serialize import BuildDependSchema -from packageship.application.apps.package.serialize import InstallDependSchema -from packageship.application.apps.package.serialize import SelfDependSchema -from packageship.application.apps.package.function.self_depend import SelfDepend -from packageship.application.apps.package.function.install_depend import InstallDepend -from packageship.application.apps.package.function.build_depend import BuildDepend -from packageship.application.apps.package.function.be_depend import BeDepend -from packageship.libs.log import Log -from .graphcache import self_build, bedepend, build_depend, install_depend - - -LEVEL_RADIUS = 30 -NODE_SIZE = 25 -PACKAGE_NAME = 0 -LOGGER = Log(__name__) - - -class SelfBuildDep: - """ - Description: Self-compilation dependent data query analysis - - Attributes: - graph:Diagram of an underlying operation instance - query_parameter:Parameters for a dependency query - """ - - def __init__(self, graph): - self.graph = graph - self.query_parameter = { - 'packagename': self.graph.packagename, - 'db_list': self.graph.dbname, - 'packtype': self.graph.packagetype, - 'selfbuild': self.graph.selfbuild, - 'withsubpack': self.graph.withsubpack - } - - def validate(self): - """Verify the validity of the data""" - depend = SelfDependSchema().validate(self.query_parameter) - if depend: - return False - return True - - @staticmethod - def query_depend_relation(query_parameter): - """ - Self-compile dependent relational queries - - Args: - query_parameter:Parameters for a dependency query - """ - db_list = 
query_parameter['db_list'] - packagename = query_parameter['packagename'] - selfbuild = int(query_parameter['selfbuild']) - withsubpack = int(query_parameter['withsubpack']) - packtype = query_parameter['packtype'] - _response_code, binary_dicts, source_dicts, not_fd_components = \ - SelfDepend(db_list).query_depend(packagename, - selfbuild, - withsubpack, - packtype) - return { - "code": _response_code, - "binary_dicts": binary_dicts, - "source_dicts": source_dicts, - "not_found_components": list(not_fd_components) - } - - def __call__(self): - def _query_depend(): - query_result = self_build(self.query_parameter) - return query_result['binary_dicts'] - return self.graph.get_depend_relation_data(self, _query_depend) - - -class InstallDep: - """ - Installation dependent data query analysis - - Attributes: - graph:Diagram of an underlying operation instance - query_parameter:Parameters for a dependency query - """ - - def __init__(self, graph): - self.graph = graph - self.query_parameter = { - 'binaryName': self.graph.packagename, - 'db_list': self.graph.dbname - } - - def validate(self): - """Verify the validity of the data""" - depend = InstallDependSchema().validate(self.query_parameter) - if depend: - return False - return True - - @staticmethod - def query_depend_relation(query_parameter): - """ - Install dependent relational queries - - Args: - query_parameter:Parameters for a dependency query - """ - db_list = query_parameter['db_list'] - binary_name = query_parameter['binaryName'] - _response_code, install_dict, not_found_components = \ - InstallDepend(db_list).query_install_depend([binary_name]) - return { - "code": _response_code, - "install_dict": install_dict, - 'not_found_components': list(not_found_components) - } - - def __call__(self): - def _query_depend(): - query_result = install_depend(self.query_parameter) - return query_result['install_dict'] - return self.graph.get_depend_relation_data(self, _query_depend) - - -class BuildDep: - """ - Compile 
dependent data query analysis - - Attributes: - graph:Diagram of an underlying operation instance - query_parameter:Parameters for a dependency query - """ - - def __init__(self, graph): - self.graph = graph - self.query_parameter = { - 'sourceName': self.graph.packagename, - 'db_list': self.graph.dbname - } - - def validate(self): - """Verify the validity of the data""" - depend = BuildDependSchema().validate(self.query_parameter) - if depend: - return False - return True - - @staticmethod - def query_depend_relation(query_parameter): - """ - Compile dependent relational queries - - Args: - query_parameter:Parameters for a dependency query - """ - source_name = query_parameter['sourceName'] - db_list = query_parameter['db_list'] - build_ins = BuildDepend([source_name], db_list) - _res_code, builddep_dict, _, not_found_components = build_ins.build_depend_main() - return { - "code": _res_code, - 'build_dict': builddep_dict, - 'not_found_components': list(not_found_components) - } - - def __call__(self): - def _query_depend(): - query_result = build_depend(self.query_parameter) - return query_result['build_dict'] - return self.graph.get_depend_relation_data(self, _query_depend) - - -class BeDependOn: - """ - Dependent relational queries - - Attributes: - graph:Diagram of an underlying operation instance - query_parameter:Parameters for a dependency query - """ - - def __init__(self, graph): - self.graph = graph - dbname = None - if self.graph.dbname and isinstance(self.graph.dbname, (list, tuple)): - dbname = self.graph.dbname[0] - self.query_parameter = { - 'packagename': self.graph.packagename, - 'dbname': dbname, - 'withsubpack': self.graph.withsubpack - } - - def validate(self): - """Verify the validity of the data""" - bedependon = BeDependSchema().validate(self.query_parameter) - if bedependon: - return False - return True - - @staticmethod - def query_depend_relation(query_parameter): - """ - Dependent relational queries - - Args: - query_parameter:Parameters 
for a dependency query - """ - packagename = query_parameter['packagename'] - db_name = query_parameter['dbname'] - withsubpack = query_parameter['withsubpack'] - bedepnd_ins = BeDepend(packagename, db_name, withsubpack) - be_depend_dict = bedepnd_ins.main() - _code = ResponseCode.PACK_NAME_NOT_FOUND - if be_depend_dict: - _code = ResponseCode.SUCCESS - return { - "code": _code, - "bedepend": be_depend_dict - } - - def __call__(self): - - def _query_depend(): - query_result = bedepend(self.query_parameter) - return query_result['bedepend'] - return self.graph.get_depend_relation_data(self, _query_depend) - - -class BaseGraph: - """ - Basic operation of dependency graph - """ - depend = { - 'selfbuild': SelfBuildDep, - 'installdep': InstallDep, - 'builddep': BuildDep, - 'bedepend': BeDependOn - } - - def __init__(self, query_type, **kwargs): - self.query_type = query_type - self.dbname = None - self.__dict__.update(**kwargs) - depend_graph = self.depend.get(self.query_type) - if depend_graph is None: - raise RuntimeError( - 'The query parameter type is wrong, and normal', - ' dependent data analysis cannot be completed') - self.graph = depend_graph(self) - self._color = ['#E02020', '#FA6400', '#F78500', '#6DD400', '#44D7B6', - '#32C5FF', '#0091FF', '#6236FF', '#B620E0', '#6D7278'] - self.nodes = dict() - self.edges = list() - self.depend_package = dict() - self.package_datas = { - 'uplevel': dict(), - 'downlevel': dict() - } - self.up_depend_node = list() - self.down_depend_nodes = list() - self._quadrant = [1, -1] - - def __getattr__(self, value): - if value not in self.__dict__: - return None - return self.__dict__[value] - - @property - def color(self): - """rgb random color value acquisition""" - return self._color[random.randint(0, 9)] - - @property - def quadrant(self): - """Get the coordinate quadrant at random""" - return self._quadrant[random.randint(0, 1)] - - @property - def coordinate(self): - """ - Dynamically calculate the random coordinates of each 
package in the current level - - Returns: - The coordinate value of the dynamically calculated dependent package - example : (x,y) - """ - _x, _y = random.uniform(0, LEVEL_RADIUS) * self.quadrant, random.uniform( - 0, LEVEL_RADIUS) * self.quadrant - return _x, _y - - @property - def node_size(self): - """Dynamically calculate the size of each node """ - node_size = random.uniform(1, NODE_SIZE) - return node_size - - def _database_priority(self): - """Verify the validity of the query database""" - - databases = db_priority() - if not databases: - return ResponseCode.FILE_NOT_FIND_ERROR - self.dbname = self.dbname if self.dbname else databases - - if any(filter(lambda db_name: db_name not in databases, self.dbname)): - return ResponseCode.DB_NAME_ERROR - return None - - @staticmethod - def create_dict(**kwargs): - """ - Create dictionary data - - Args: - kwargs: Create each key-Val key-value pair for the dictionary - """ - if isinstance(kwargs, dict): - return kwargs - return dict() - - def _combination_nodes(self, package_name, root=True): - """ - Regroup node values - Args: - package_name:Dependent package name - root:he coordinate value of the root node - """ - _size = self.node_size - if root: - _x, _y = 0, 0 - _size = 30 - else: - _x, _y = self.coordinate - self.nodes[package_name] = BaseGraph.create_dict( - color=self.color, - label=package_name, - y=_y, - x=_x, - id=package_name, - size=_size) - - def _combination_edges(self, source_package_name, target_package_name): - """ - Depend on the data combination of the edges node in the graph - Args: - source_package_name:Source node - target_package_name:Target node - """ - self.edges.append(BaseGraph.create_dict( - sourceID=source_package_name, - targetID=target_package_name, - )) - - def _up_level_depend(self): - """ - Data analysis of the previous layer - """ - _up_depend_nodes = [] - for node_name in self.up_depend_node: - if node_name not in self.package_datas['uplevel'].keys(): - continue - depend_data = 
self.package_datas['uplevel'][node_name] - for depend_item in depend_data: - _up_depend_nodes.append(depend_item) - self._combination_nodes( - depend_item, root=False) - self._combination_edges( - node_name, depend_item) - - self.up_depend_node = list(set(_up_depend_nodes)) - - def _down_level_depend(self): - """ - Specify the next level of dependencies of dependent nodes - """ - _down_depend_nodes = [] - for node_name in self.down_depend_nodes: - if node_name not in self.package_datas['downlevel'].keys(): - continue - depend_data = self.package_datas['downlevel'][node_name] - for depend_item in depend_data: - _down_depend_nodes.append(depend_item) - self._combination_nodes( - depend_item, root=False) - self._combination_edges( - depend_item, node_name) - - self.down_depend_nodes = list(set(_down_depend_nodes)) - - def _graph_data(self): - """ - Resolve the data in the dependency graph - """ - def depend_package(): - if self.packagetype == "binary": - self.up_depend_node.append(self.node_name) - self.down_depend_nodes.append(self.node_name) - self._combination_nodes(self.node_name) - for _level in range(1, 3): - self._up_level_depend() - self._down_level_depend() - depend_package() - - self.depend_package = { - 'nodes': [node for key, node in self.nodes.items()], - 'edges': self.edges - } - - def _relation_recombine(self, package_datas): - """ - The data in the dependency query is recombined - into representations of the upper and lower dependencies - of the current node - - Args: - package_datas:Package dependency data - - """ - for package_name, package_depend in package_datas.items(): - if not package_depend or not isinstance(package_depend, list): - continue - if self.packagetype == 'source' and package_depend[PACKAGE_NAME] == self.node_name: - self.up_depend_node.append(package_name) - self.down_depend_nodes.append(package_name) - - for depend_item in package_depend[-1]: - if depend_item[PACKAGE_NAME] == 'root': - continue - if not 
self.package_datas['uplevel'].__contains__(package_name): - self.package_datas['uplevel'][package_name] = list() - if not self.package_datas['downlevel'].__contains__(depend_item[PACKAGE_NAME]): - self.package_datas['downlevel'][depend_item[PACKAGE_NAME]] = list( - ) - self.package_datas['uplevel'][package_name].append( - depend_item[PACKAGE_NAME]) - self.package_datas['downlevel'][depend_item[PACKAGE_NAME]].append( - package_name) - - def get_depend_relation_data(self, depend, func): - """ - Get data for different dependencies - - Args: - depend:Each of the dependent instance objects - SelfBuildDep()、BuildDep()、InstallDep()、BeDependOn() - func:Methods to query dependencies - """ - - if not depend.validate(): - return (ResponseCode.PARAM_ERROR, ResponseCode.CODE_MSG_MAP[ResponseCode.PARAM_ERROR]) - database_error = self._database_priority() - if database_error: - return database_error - _package_datas = func() - - if _package_datas: - self._relation_recombine(_package_datas) - try: - self._graph_data() - except KeyError as error: - LOGGER.logger.error(error) - return (ResponseCode.SERVICE_ERROR, ResponseCode.CODE_MSG_MAP[ResponseCode.SERVICE_ERROR]) - return (ResponseCode.SUCCESS, ResponseCode.CODE_MSG_MAP[ResponseCode.SUCCESS]) - - def parse_depend_graph(self): - """Analyze the data that the graph depends on""" - response_status, _msg = self.graph() - if response_status != ResponseCode.SUCCESS: - return (response_status, _msg, None) - - return (response_status, _msg, self.depend_package) diff --git a/packageship/packageship/application/apps/dependinfo/url.py b/packageship/packageship/application/apps/dependinfo/url.py deleted file mode 100644 index cffc61f0813a42241dc68d96308c3c43ebe21573..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/dependinfo/url.py +++ /dev/null @@ -1,23 +0,0 @@ -#!/usr/bin/python3 -# ****************************************************************************** -# Copyright (c) Huawei Technologies 
Co., Ltd. 2020-2020. All rights reserved.
-# licensed under the Mulan PSL v2.
-# You can use this software according to the terms and conditions of the Mulan PSL v2.
-# You may obtain a copy of Mulan PSL v2 at:
-# http://license.coscl.org.cn/MulanPSL2
-# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR
-# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR
-# PURPOSE.
-# See the Mulan PSL v2 for more details.
-# ******************************************************************************/
-
-from . import view
-
-urls = [
-    # get SelfDepend
-    (view.InfoSelfDepend, '/dependInfo/selfDepend', {'query': ('POST',)}),
-    # get BeDepend
-    (view.InfoBeDepend, '/dependInfo/beDepend', {'query': ('POST',)}),
-    # get all database names
-    (view.DataBaseInfo, '/dependInfo/databases', {'query': ('GET',)})
-]
diff --git a/packageship/packageship/application/apps/dependinfo/view.py b/packageship/packageship/application/apps/dependinfo/view.py
deleted file mode 100644
index 7b06362c6fe8ae19d58cd81071a2fba28aaf054b..0000000000000000000000000000000000000000
--- a/packageship/packageship/application/apps/dependinfo/view.py
+++ /dev/null
@@ -1,330 +0,0 @@
-#!/usr/bin/python3
-# ******************************************************************************
-# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved.
-# licensed under the Mulan PSL v2.
-# You can use this software according to the terms and conditions of the Mulan PSL v2.
-# You may obtain a copy of Mulan PSL v2 at:
-# http://license.coscl.org.cn/MulanPSL2
-# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR
-# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR
-# PURPOSE.
-# See the Mulan PSL v2 for more details.
-# ******************************************************************************/
-"""
-Description: Interface processing
-class: BeDepend, BuildDepend, InitSystem, InstallDepend, Packages,
-Repodatas, SelfDepend, SinglePack
-"""
-from flask import request
-from flask import jsonify
-from flask_restful import Resource
-from packageship.libs.log import Log
-
-from packageship.application.apps.package.serialize import SelfDependSchema
-from packageship.application.apps.package.serialize import BeDependSchema
-from packageship.application.apps.package.serialize import have_err_db_name
-from packageship.application.apps.package.function.constants import ResponseCode
-from packageship.application.apps.package.function.searchdb import db_priority
-from packageship.application.apps.package.function.constants import ListNode
-from .function.graphcache import bedepend
-from .function.graphcache import self_build as selfbuild
-
-DB_NAME = 0
-LOGGER = Log(__name__)
-
-
-# pylint: disable = no-self-use
-# pylint: disable = too-many-locals
-
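As context for the view classes that follow: `parse_depend_package` below deduplicates the accumulated source-package rows with the expression `[dict(t) for t in set([tuple(d.items()) for d in depend_src_data])]`, which freezes each (unhashable) row dict into a hashable tuple of items, round-trips through a set, and rebuilds the dicts. A minimal standalone sketch of that idiom; the helper name and sample rows are illustrative only, not part of this module:

```python
def dedup_rows(rows):
    """Drop duplicate dicts from a list: each row is frozen into a sorted
    tuple of (key, value) pairs so it becomes hashable, deduplicated
    through a set, then rebuilt as a dict."""
    return [dict(items) for items in {tuple(sorted(row.items())) for row in rows}]

depend_src_data = [
    {"source_name": "glibc", "version": "2.28", "database": "openEuler-20.09"},
    {"source_name": "glibc", "version": "2.28", "database": "openEuler-20.09"},
    {"source_name": "gcc", "version": "7.3.0", "database": "openEuler-20.09"},
]
src_data = dedup_rows(depend_src_data)  # two unique rows remain
```

Sorting the items before freezing makes the result independent of key insertion order; the original expression relies on the row dicts being built with consistent key order, which holds here but is fragile.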
-class ParseDependPackageMethod(Resource): - """ - Description: Common Method - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - self.statistics = dict() - - def _parse_database(self, db_list): - data_name = [] - for _, v in enumerate(db_list): - data_name.append({"database": v, "binary_num": 0, "source_num": 0}) - return data_name - - # pylint: disable = too-many-nested-blocks - def parse_depend_package(self, package_all, db_list=None): - """ - Description: Parsing package data with dependencies - Args: - package_all: http request response content - Returns: - Summarized data table - Raises: - """ - depend_bin_data = [] - depend_src_data = [] - statistics = self._parse_database(db_list) - if isinstance(package_all, dict): - for bin_package, package_depend in package_all.items(): - if isinstance(package_depend, list) and \ - package_depend[ListNode.SOURCE_NAME] != 'source': - row_data = {"binary_name": bin_package, - "source_name": package_depend[ListNode.SOURCE_NAME], - "version": package_depend[ListNode.VERSION], - "database": package_depend[ListNode.DBNAME]} - row_src_data = {"source_name": package_depend[ListNode.SOURCE_NAME], - "version": package_depend[ListNode.VERSION], - "database": package_depend[ListNode.DBNAME]} - if package_depend[ListNode.DBNAME] not in self.statistics: - self.statistics[package_depend[ListNode.DBNAME]] = { - 'binary': [], - 'source': [] - } - if bin_package not in \ - self.statistics[package_depend[ListNode.DBNAME]]['binary']: - self.statistics[package_depend[ListNode.DBNAME]]['binary'].append( - bin_package) - for con in statistics: - if package_depend[ListNode.DBNAME] == con["database"]: - con["binary_num"] += 1 - if package_depend[ListNode.SOURCE_NAME] not in \ - self.statistics[package_depend[ListNode.DBNAME]]['source']: - self.statistics[package_depend[ListNode.DBNAME]]['source'].append( - package_depend[ListNode.SOURCE_NAME]) - for con in statistics: - if package_depend[ListNode.DBNAME] == 
con["database"]: - con["source_num"] += 1 - depend_bin_data.append(row_data) - depend_src_data.append(row_src_data) - src_data = [dict(t) for t in set([tuple(d.items()) for d in depend_src_data])] - statistics_table = { - "binary_dicts": depend_bin_data, - "source_dicts": src_data, - "statistics": statistics} - return statistics_table - - -class InfoSelfDepend(ParseDependPackageMethod): - """ - Description: querying install and build depend for a package - and others which has the same src name - Restful API: post - changeLog: - """ - - def _parse_bin_package(self, bin_packages, db_list): - """ - Description: Parsing binary result data - Args: - bin_packages: Binary package data - Returns: - Raises: - """ - bin_package_data = [] - statistics = self._parse_database(db_list) - if bin_packages: - for bin_package, package_depend in bin_packages.items(): - # distinguish whether the current data is the data of the root node - if isinstance(package_depend, list) and package_depend[ListNode.PARENT_LIST][DB_NAME][ - DB_NAME] != 'root': - row_data = {"binary_name": bin_package, - "source_name": package_depend[ListNode.SOURCE_NAME], - "version": package_depend[ListNode.VERSION], - "database": package_depend[ListNode.DBNAME]} - for con in statistics: - if package_depend[ListNode.DBNAME] == con["database"]: - con["binary_num"] += 1 - if package_depend[ListNode.DBNAME] not in self.statistics: - self.statistics[package_depend[ListNode.DBNAME]] = { - 'binary': [], - 'source': [] - } - if bin_package not in \ - self.statistics[package_depend[ListNode.DBNAME]]['binary']: - self.statistics[package_depend[ListNode.DBNAME]]['binary'].append( - bin_package) - bin_package_data.append(row_data) - return bin_package_data, statistics - - def _parse_src_package(self, src_packages, db_list): - """ - Description: Source package data analysis - Args: - src_packages: Source package - - Returns: - Source package data - Raises: - - """ - src_package_data = [] - statistics = db_list - if 
src_packages: - for src_package, package_depend in src_packages.items(): - if isinstance(package_depend, list): - row_data = {"source_name": src_package, - "version": package_depend[ListNode.VERSION], - "database": package_depend[DB_NAME]} - for con in statistics: - if package_depend[DB_NAME] == con["database"]: - con["source_num"] += 1 - if package_depend[DB_NAME] not in self.statistics: - self.statistics[package_depend[DB_NAME]] = { - 'binary': [], - 'source': [] - } - if src_package not in self.statistics[package_depend[DB_NAME]]['source']: - self.statistics[package_depend[DB_NAME]]['source'].append(src_package) - src_package_data.append(row_data) - return src_package_data, statistics - - def post(self): - """ - Query a package's all dependencies including install and build depend - (support quering a binary or source package in one or more databases) - Args: - packageName:package name - packageType: source/binary - selfBuild :0/1 - withSubpack: 0/1 - dbPreority:the array for database preority - Returns: - for - example:: - { - "code": "", - "data": "", - "msg": "" - } - """ - schema = SelfDependSchema() - data = request.get_json() - validate_err = schema.validate(data) - - if validate_err: - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - pkg_name = data.get("packagename") - db_pri = db_priority() - if not db_pri: - return jsonify( - ResponseCode.response_json( - ResponseCode.FILE_NOT_FIND_ERROR - ) - ) - db_list = data.get("db_list") if data.get("db_list") \ - else db_pri - self_build = data.get("selfbuild", 0) - with_sub_pack = data.get("withsubpack", 0) - pack_type = data.get("packtype", "binary") - if have_err_db_name(db_list, db_pri): - return jsonify( - ResponseCode.response_json(ResponseCode.DB_NAME_ERROR) - ) - query_parameter = {"db_list": db_list, - "packagename": pkg_name, - "selfbuild": self_build, - "withsubpack": with_sub_pack, - "packtype": pack_type} - - result_data = selfbuild(query_parameter) - response_code = 
result_data["code"] - binary_dicts = result_data["binary_dicts"] - source_dicts = result_data["source_dicts"] - if not all([binary_dicts, source_dicts]): - return jsonify( - ResponseCode.response_json(response_code) - ) - bin_package, package_count = self._parse_bin_package(binary_dicts, db_list) - src_package, statistics = self._parse_src_package(source_dicts, package_count) - return jsonify( - ResponseCode.response_json(ResponseCode.SUCCESS, data={ - "binary_dicts": bin_package, - "source_dicts": src_package, - "statistics": statistics - }) - ) - - -class InfoBeDepend(ParseDependPackageMethod): - """ - Description: querying be installed and built depend for a package - and others which has the same src name - Restful API: post - changeLog: - """ - - def post(self): - """ - Query a package's all dependencies including - be installed and built depend - Args: - packageName:package name - withSubpack: 0/1 - dbname:database name - Returns: - for - example:: - resultList[ - restult[ - binaryName: - srcName: - dbName: - type: beinstall or bebuild, which depend on the function - childNode: the binary package name which is the be built/installed - depend for binaryName - ] - ] - """ - schema = BeDependSchema() - data = request.get_json() - validate_err = schema.validate(data) - if validate_err: - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - package_name = data.get("packagename") - with_sub_pack = data.get("withsubpack") - db_name = data.get("dbname") - if db_name not in db_priority(): - return jsonify( - ResponseCode.response_json(ResponseCode.DB_NAME_ERROR) - ) - query_parameter = {"packagename": package_name, - "withsubpack": with_sub_pack, - "dbname": db_name} - - result_data = bedepend(query_parameter) - res_code = result_data["code"] - res_dict = result_data["bedepend"] - if not res_dict: - return jsonify( - ResponseCode.response_json(res_code) - ) - result_dict = self.parse_depend_package(res_dict, [db_name]) - return jsonify( - 
ResponseCode.response_json(ResponseCode.SUCCESS, data=result_dict) - ) - - -class DataBaseInfo(Resource): - """ - Get the database name of all packages - """ - - def get(self): - """ - Returns: name_list: database name list - """ - name_list = db_priority() - if not name_list: - return jsonify( - ResponseCode.response_json(ResponseCode.NOT_FOUND_DATABASE_INFO) - ) - return jsonify( - ResponseCode.response_json(ResponseCode.SUCCESS, data=name_list) - ) diff --git a/packageship/packageship/application/apps/lifecycle/__init__.py b/packageship/packageship/application/apps/lifecycle/__init__.py deleted file mode 100644 index d17a06a54ae6322f651c9e9f125b645e31696989..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/lifecycle/__init__.py +++ /dev/null @@ -1,20 +0,0 @@ -#!/usr/bin/python3 -""" - Blueprint registration for life cycle -""" -from flask.blueprints import Blueprint -from flask_restful import Api -from packageship.application.apps.lifecycle.url import urls -from packageship import application - -lifecycle = Blueprint('lifecycle', __name__) - -# init restapi -api = Api() - -for view, url, operation in urls: - if application.OPERATION and application.OPERATION in operation.keys(): - api.add_resource(view, url) - - -__all__ = ['lifecycle', 'api'] diff --git a/packageship/packageship/application/apps/lifecycle/function/__init__.py b/packageship/packageship/application/apps/lifecycle/function/__init__.py deleted file mode 100644 index 1d9852d5d5ea7bd61eb19baf148cb26b4ea326a0..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/lifecycle/function/__init__.py +++ /dev/null @@ -1,4 +0,0 @@ -#!/usr/bin/python3 -""" - Business approach to package life cycle -""" diff --git a/packageship/packageship/application/apps/lifecycle/function/concurrent.py b/packageship/packageship/application/apps/lifecycle/function/concurrent.py deleted file mode 100644 index 
9abd26a6071acba5f1ab9fa800e873b072ca1eda..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/lifecycle/function/concurrent.py +++ /dev/null @@ -1,74 +0,0 @@ -#!/usr/bin/python3 -""" -Use queues to implement the producer and consumer model -to solve the database lock introduced by high concurrency issues -""" -import threading -import time -from queue import Queue -from sqlalchemy.exc import SQLAlchemyError -from sqlalchemy.exc import OperationalError -from packageship.libs.exception import Error, ContentNoneException -from packageship.libs.log import LOGGER -from packageship.libs.conf import configuration - - -class ProducerConsumer(): - """ - The data written in the database is added to the high - concurrency queue, and the high concurrency is solved - by the form of the queue - """ - _queue = Queue(maxsize=configuration.QUEUE_MAXSIZE) - _instance_lock = threading.Lock() - - def __init__(self): - self.thread_queue = threading.Thread(target=self.__queue_process) - self._instance_lock.acquire() - if not self.thread_queue.isAlive(): - self.thread_queue = threading.Thread(target=self.__queue_process) - self.thread_queue.start() - self._instance_lock.release() - - def start_thread(self): - """ - Judge a thread, if the thread is terminated, restart - """ - self._instance_lock.acquire() - if not self.thread_queue.isAlive(): - self.thread_queue = threading.Thread(target=self.__queue_process) - self.thread_queue.start() - self._instance_lock.release() - - def __new__(cls, *args, **kwargs): # pylint: disable=unused-argument - """ - Use the singleton pattern to create a thread-safe producer pattern - """ - if not hasattr(cls, "_instance"): - with cls._instance_lock: - if not hasattr(cls, "_instance"): - cls._instance = object.__new__(cls) - return cls._instance - - def __queue_process(self): - """ - Read the content in the queue and save and update - """ - while not self._queue.empty(): - _queue_value, method = self._queue.get() - try: - 
method(_queue_value) - except OperationalError as error: - LOGGER.warning(error) - time.sleep(0.2) - self._queue.put((_queue_value, method)) - except (Error, ContentNoneException, SQLAlchemyError) as error: - LOGGER.error(error) - - def put(self, pending_content): - """ - The content of the operation is added to the queue - """ - if pending_content: - self._queue.put(pending_content) - self.start_thread() diff --git a/packageship/packageship/application/apps/lifecycle/function/download_yaml.py b/packageship/packageship/application/apps/lifecycle/function/download_yaml.py deleted file mode 100644 index fa3bfbab4ae57a38971fd9afdc6f0b7a60a3da3a..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/lifecycle/function/download_yaml.py +++ /dev/null @@ -1,196 +0,0 @@ -#!/usr/bin/python3 -""" -Dynamically obtain the content of the yaml file \ -that saves the package information, periodically \ -obtain the content and save it in the database -""" -import copy -from concurrent.futures import ThreadPoolExecutor -import datetime as date -import requests -import yaml - -from requests.exceptions import HTTPError -from packageship.application.models.package import Packages -from packageship.application.models.package import PackagesMaintainer -from packageship.libs.dbutils import DBHelper -from packageship.libs.exception import Error -from packageship.libs.conf import configuration -from packageship.libs.log import LOGGER -from .gitee import Gitee -from .concurrent import ProducerConsumer - - -class ParseYaml(): - """ - Description: Analyze the downloaded remote yaml file, obtain the tags - and maintainer information in the yaml file, and save the obtained - relevant information into the database - - Attributes: - base: base class instance - pkg: Specific package data - _table_name: The name of the data table to be operated - openeuler_advisor_url: Get the warehouse address of the yaml file - _yaml_content: The content of the yaml file - """ - 
- def __init__(self, pkg_info, table_name): - self.headers = { - 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; WOW 64; rv:50.0) Gecko/20100101 \ - Firefox / 50.0 '} - self.pkg = pkg_info - self._table_name = table_name - self.openeuler_advisor_url = configuration.WAREHOUSE_REMOTE + \ - '{pkg_name}.yaml'.format(pkg_name=pkg_info.name) - self._yaml_content = None - self.timed_task_open = configuration.OPEN - self.producer_consumer = ProducerConsumer() - - def update_database(self): - """ - For the current package, determine whether the specific yaml file exists, parse - the data in it and save it in the database if it exists, and record the relevant - log if it does not exist - - """ - if self._openeuler_advisor_exists_yaml(): - self._save_to_database() - else: - msg = "The yaml information of the [%s] package has not been" \ - "obtained yet" % self.pkg.name - LOGGER.warning(msg) - - def _get_yaml_content(self, url): - """ - Gets the contents of the YAML file - Args: - url:The network address where the YAML file exists - """ - try: - response = requests.get( - url, headers=self.headers) - if response.status_code == 200: - self._yaml_content = yaml.safe_load(response.content) - except HTTPError as error: - LOGGER.error(error) - - def _openeuler_advisor_exists_yaml(self): - """ - Determine whether there is a yaml file with the current \ - package name under the openeuler-advisor project - - """ - self._get_yaml_content(self.openeuler_advisor_url) - if self._yaml_content: - return True - return False - - def _save_to_database(self): - """ - Save the acquired yaml file information to the database - - Raises: - Error: An error occurred during data addition - """ - - def _save_package(package_module): - with DBHelper(db_name="lifecycle") as database: - database.add(package_module) - - def _save_maintainer_info(maintainer_module): - with DBHelper(db_name="lifecycle") as database: - _packages_maintainer = database.session.query( - 
PackagesMaintainer).filter(PackagesMaintainer.name == - maintainer_module['name']).first() - if _packages_maintainer: - for key, val in maintainer_module.items(): - setattr(_packages_maintainer, key, val) - else: - _packages_maintainer = PackagesMaintainer( - **maintainer_module) - database.add(_packages_maintainer) - - self._parse_warehouse_info() - tags = self._yaml_content.get('git_tag', None) - if tags: - self._parse_tags_content(tags) - self.producer_consumer.put( - (copy.deepcopy(self.pkg), _save_package)) - if self.timed_task_open: - maintainer = {'name': self.pkg.name} - _maintainer = self._yaml_content.get('maintainers') - if _maintainer and isinstance(_maintainer, list): - maintainer['maintainer'] = _maintainer[0] - maintainer['maintainlevel'] = self._yaml_content.get( - 'maintainlevel') - - self.producer_consumer.put((maintainer, _save_maintainer_info)) - - def _parse_warehouse_info(self): - """ - Parse the warehouse information in the yaml file - - """ - if self._yaml_content: - self.pkg.version_control = self._yaml_content.get( - 'version_control') - self.pkg.src_repo = self._yaml_content.get('src_repo') - self.pkg.tag_prefix = self._yaml_content.get('tag_prefix') - - def _parse_tags_content(self, tags): - """ - Parse the obtained tags content - - """ - try: - # Integrate tags information into key-value pairs - _tags = [(tag.split()[0], tag.split()[1]) for tag in tags] - _tags = sorted(_tags, key=lambda x: x[0], reverse=True) - self.pkg.latest_version = _tags[0][1] - self.pkg.latest_version_time = _tags[0][0] - _end_time = date.datetime.strptime( - self.pkg.latest_version_time, '%Y-%m-%d') - if self.pkg.latest_version != self.pkg.version: - for _version in _tags: - if _version[1] == self.pkg.version: - _end_time = date.datetime.strptime( - _version[0], '%Y-%m-%d') - self.pkg.used_time = (date.datetime.now() - _end_time).days - - except (IndexError, Error) as index_error: - LOGGER.error(index_error) - - -def update_pkg_info(pkg_info_update=True): - """ 
- Update the information of the upstream warehouse in the source package - - """ - try: - # Open thread pool - pool = ThreadPoolExecutor(max_workers=configuration.POOL_WORKERS) - with DBHelper(db_name="lifecycle") as database: - for table_name in filter(lambda x: x not in ['packages_issue', - 'packages_maintainer', - 'databases_info'], - database.engine.table_names()): - cls_model = Packages.package_meta(table_name) - # Query a specific table - for package_item in database.session.query(cls_model).all(): - if pkg_info_update: - parse_yaml = ParseYaml( - pkg_info=copy.deepcopy(package_item), - table_name=table_name) - pool.submit(parse_yaml.update_database) - else: - # Get the issue of each warehouse and save it - gitee_issue = Gitee( - copy.deepcopy(package_item), - configuration.WAREHOUSE, - package_item.name, - table_name) - pool.submit(gitee_issue.query_issues_info) - pool.shutdown() - except(RuntimeError, Exception) as error: - LOGGER.error(error) diff --git a/packageship/packageship/application/apps/lifecycle/function/gitee.py b/packageship/packageship/application/apps/lifecycle/function/gitee.py deleted file mode 100644 index 7d9d51a7c057d022bb51c0803a129c912af6b8ac..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/lifecycle/function/gitee.py +++ /dev/null @@ -1,220 +0,0 @@ -#!/usr/bin/python3 -""" -Description:Get issue info from gitee -Class: Gitee -""" -import copy -from json import JSONDecodeError -from retrying import retry -import requests -from requests.exceptions import HTTPError -from requests.exceptions import RequestException -from sqlalchemy.exc import SQLAlchemyError -from packageship.libs.dbutils import DBHelper -from packageship.libs.exception import Error, ContentNoneException -from packageship.application.models.package import PackagesIssue -from packageship.libs.log import LOGGER -from .concurrent import ProducerConsumer - - -class Gitee(): - """ - gitee version management tool related information 
acquisition - - """ - - def __init__(self, pkg_info, owner, repo, table_name): - self.pkg_info = pkg_info - self.owner = owner - self.repo = repo - self.url = "https://gitee.com/" - self.api_url = "https://gitee.com/api/v5/repos" - self.pool = None - self.issue_id = None - self.defect = 0 - self.feature = 0 - self.cve = 0 - self.table_name = table_name - self.producer_consumer = ProducerConsumer() - self._issue_url = None - self.total_page = 0 - - def query_issues_info(self, issue_id=""): - """ - Description: View the issue details of the specified package - Args: - issue_id: Issue id - Returns: - issue_content_list: The issue details of the specified package list - Raises: - - """ - self._issue_url = self.api_url + \ - "/{}/{}/issues/{}".format(self.owner, self.repo, issue_id) - try: - response = self._request_issue(0) - except (HTTPError, RequestException) as error: - LOGGER.logger.error(error) - return None - - self.total_page = 1 if issue_id else int( - response.headers['total_page']) - total_count = int(response.headers['total_count']) - - if total_count > 0: - issue_list = self._query_per_page_issue_info() - if not issue_list: - LOGGER.logger.error( - "An error occurred while querying {}".format(self.repo)) - return None - self._save_issues(issue_list) - - @retry(stop_max_attempt_number=3, stop_max_delay=1000) - def _request_issue(self, page): - try: - response = requests.get(self._issue_url, - params={"state": "all", "per_page": 100, "page": page}) - except RequestException as error: - raise RequestException(error) - if response.status_code != 200: - _msg = "There is an exception with the remote service [%s]," \ - "Please try again later.The HTTP error code is:%s" % (self._issue_url, str( - response.status_code)) - raise HTTPError(_msg) - return response - - def _query_per_page_issue_info(self): - """ - Description: View the issue details - Args: - total_page: total page - - Returns: - - """ - issue_content_list = [] - for i in range(1, self.total_page + 1): 
- try: - response = self._request_issue(i) - issue_content_list.extend( - self.parse_issues_content(response.json())) - except (HTTPError, RequestException) as error: - LOGGER.logger.error(error) - continue - except (JSONDecodeError, Error) as error: - LOGGER.logger.error(error) - return issue_content_list - - def _save_issues(self, issue_list): - """ - Save the obtained issue information - - """ - try: - def _save(issue_module): - with DBHelper(db_name='lifecycle') as database: - exist_issues = database.session.query(PackagesIssue).filter( - PackagesIssue.issue_id == issue_module['issue_id']).first() - if exist_issues: - for key, val in issue_module.items(): - setattr(exist_issues, key, val) - else: - exist_issues = PackagesIssue(**issue_module) - database.add(exist_issues) - - def _save_package(package_module): - with DBHelper(db_name='lifecycle') as database: - database.add(package_module) - - # Save the issue - for issue_item in issue_list: - self.producer_consumer.put((copy.deepcopy(issue_item), _save)) - - # The number of various issues in the update package - self.pkg_info.defect = self.defect - self.pkg_info.feature = self.feature - self.pkg_info.cve = self.cve - self.producer_consumer.put( - (copy.deepcopy(self.pkg_info), _save_package)) - - except (Error, ContentNoneException, SQLAlchemyError) as error: - LOGGER.logger.error( - 'An abnormal error occurred while saving related issues:%s' % error if error else '') - - def parse_issues_content(self, sources): - """ - Description: Parse the response content and get issue content - Args:Issue list - - Returns:list:issue_id, issue_url, issue_content, issue_status, issue_download - Raises: - """ - result_list = [] - if isinstance(sources, list): - for source in sources: - issue_content = self.parse_issue_content(source) - if issue_content: - result_list.append(issue_content) - else: - issue_content = self.parse_issue_content(sources) - if issue_content: - result_list.append(issue_content) - return result_list - 
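The pagination-plus-retry flow above (`_request_issue` is retried via the `retrying` decorator, and `_query_per_page_issue_info` loops it over `total_page` pages, logging and moving on when a page ultimately fails) can be sketched with the standard library alone. This is a hypothetical helper under assumed names, not the module's actual code, which fetches over HTTP with `requests`:

```python
import time

def fetch_all_pages(fetch_page, total_pages, attempts=3, delay=0.1):
    """Collect results across all pages, retrying each page a few times
    before skipping it -- the same shape as the retried _request_issue
    call driven page by page from _query_per_page_issue_info."""
    results = []
    for page in range(1, total_pages + 1):
        for attempt in range(attempts):
            try:
                results.extend(fetch_page(page))
                break
            except OSError:
                # Transient failure: after the last attempt the page is
                # skipped (the original logs the error and continues);
                # otherwise wait briefly and retry.
                if attempt == attempts - 1:
                    break
                time.sleep(delay)
    return results
```

Retrying per page rather than restarting the whole scan means one flaky page costs at most `attempts` requests instead of invalidating pages already fetched.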
-    def parse_issue_content(self, source):
-        """
-        Description: Parse the response content and get issue content
-        Args:
-            source: Source of issue content
-        Returns:
-            dict: issue_id, issue_url, issue_title, issue_content, issue_status,
-            issue_download, issue_type, pkg_name, related_release
-        Raises:
-            KeyError
-        """
-        try:
-            result_dict = {"issue_id": source['number'], "issue_url": source['html_url'],
-                           "issue_title": source['title'].strip(),
-                           "issue_content": source['body'].strip(),
-                           "issue_status": source['state'], "issue_download": "",
-                           "issue_type": source["issue_type"],
-                           "pkg_name": self.repo,
-                           "related_release": source["labels"][0]['name'] if source["labels"] else ''}
-            # Gitee issue types are Chinese labels:
-            # "缺陷" (defect), "需求" (feature request), "CVE和安全问题" (CVE and security issues)
-            if source["issue_type"] == "缺陷":
-                self.defect += 1
-            elif source["issue_type"] == "需求":
-                self.feature += 1
-            elif source["issue_type"] == "CVE和安全问题":
-                self.cve += 1
-        except KeyError as error:
-            LOGGER.logger.error(error)
-            return None
-        return result_dict
-
-    def issue_hooks(self, issue_hook_info):
-        """
-        Description: Hook data triggered by a new issue operation
-        Args:
-            issue_hook_info: Issue info
-        Returns:
-
-        Raises:
-            ContentNoneException
-        """
-        if issue_hook_info is None:
-            raise ContentNoneException(
-                'The content cannot be empty')
-        issue_info_list = []
-        issue_info = issue_hook_info["issue"]
-        issue_content = self.parse_issue_content(issue_info)
-        if issue_content:
-            issue_info_list.append(issue_content)
-        if self.feature != 0:
-            self.defect, self.feature, self.cve = self.pkg_info.defect, self.pkg_info.feature + \
-                1, self.pkg_info.cve
-        if self.defect != 0:
-            self.defect, self.feature, self.cve = self.pkg_info.defect + \
-                1, self.pkg_info.feature, self.pkg_info.cve
-        if self.cve != 0:
-            self.defect, self.feature, self.cve = self.pkg_info.defect, self.pkg_info.feature, \
-                self.pkg_info.cve + 1
-        self._save_issues(issue_info_list)
diff --git a/packageship/packageship/application/apps/lifecycle/serialize.py b/packageship/packageship/application/apps/lifecycle/serialize.py
deleted file mode 100644 index
5d424448d68aca69dbea52674136fe7b35b2ce4b..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/lifecycle/serialize.py +++ /dev/null @@ -1,112 +0,0 @@ -#!/usr/bin/python3 -""" -Description: marshmallow serialize -""" -from marshmallow import Schema -from marshmallow import fields -from marshmallow import validate -from marshmallow import ValidationError -from packageship.application.models.package import PackagesIssue, Packages -from packageship.libs.log import Log - -LOGGER = Log(__name__) - - -def validate_pagenum(pagenum): - """ - Description: Validate that the page number is in the range (0, 65535] - Args - pagenum: page number - Returns: - None on success - Raises: - ValidationError: pagenum is out of range - """ - if pagenum <= 0 or pagenum > 65535: - LOGGER.logger.error("[pagenum:{}] is illegal data ".format(pagenum)) - raise ValidationError("pagenum is illegal data ") - - -def validate_pagesize(pagesize): - """ - Description: Validate that the page size is in the range (0, 65535] - Args - pagesize: page size - Returns: - None on success - Raises: - ValidationError: pagesize is out of range - """ - if pagesize <= 0 or pagesize > 65535: - LOGGER.logger.error("[pagesize:{}] is illegal data ".format(pagesize)) - raise ValidationError("pagesize is illegal data ") - - -class IssueSchema(Schema): - """ - Description: IssueSchema serialize - """ - page_num = fields.Integer(required=True, validate=validate_pagenum) - page_size = fields.Integer(required=True, validate=validate_pagesize) - pkg_name = fields.Str(validate=validate.Length( - max=200), required=False, allow_none=True) - maintainer = fields.Str(validate=validate.Length( - max=200), required=False, allow_none=True) - issue_type = fields.Str(validate=validate.Length( - max=200), required=False, allow_none=True) - issue_status = fields.Str(validate=validate.Length( - max=200), required=False, allow_none=True) - - -class IssueDownloadSchema(Schema): - """ - Field serialization for issue file download - """ - - class Meta: - """Model mapping serialized fields""" - model = PackagesIssue - 
fields = ('issue_id', 'issue_url', 'issue_content', - 'issue_title', 'issue_status', 'pkg_name', 'issue_type', 'related_release') - - -class PackagesDownloadSchema(Schema): - """ - Field serialization for package file download - """ - - class Meta: - """Model mapping serialized fields""" - model = Packages - fields = ('name', 'url', 'rpm_license', 'version', 'release', 'release_time', - 'used_time', 'latest_version', 'latest_version_time', - 'feature', 'cve', 'defect', 'maintainer', 'maintainlevel') - - -class IssuePageSchema(Schema): - """ - Description: IssuePageSchema serialize - """ - maintainer = fields.Str() - - class Meta: - """Model mapping serialized fields""" - model = PackagesIssue - fields = ('issue_id', 'issue_url', - 'issue_title', 'issue_status', 'pkg_name', 'issue_type', 'maintainer') - - -class UpdateMoreSchema(Schema): - """ - Description: UpdateMoreSchema serialize - """ - tablename = fields.Str( - required=True, - validate=validate.Length( - min=1, - max=200)) - dbpath = fields.Str( - required=True, - validate=validate.Length( - min=1, - max=200)) diff --git a/packageship/packageship/application/apps/lifecycle/url.py b/packageship/packageship/application/apps/lifecycle/url.py deleted file mode 100644 index 71750d89b2a40fcd4104da0c44b0be1fcd1ae79c..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/lifecycle/url.py +++ /dev/null @@ -1,22 +0,0 @@ -#!/usr/bin/python3 -""" -URL collection for the life cycle module -""" -from . import view - -urls = [ # pylint: disable=invalid-name - # Download package data or issue data - (view.DownloadFile, '/lifeCycle/download/', {'query': ('GET')}), - # Get a collection of maintainers list - (view.MaintainerView, '/lifeCycle/maintainer', {'query': ('GET')}), - # Get the columns that need to be displayed by default in the package - (view.TableColView, '/packages/tablecol', {'query': ('GET')}), - # View all table names in the package-info database - (view.LifeTables, '/lifeCycle/tables', {'query': ('GET')}), - (view.IssueView, '/lifeCycle/issuetrace', {'query': ('GET')}), - (view.IssueType, '/lifeCycle/issuetype', {'query': ('GET')}), - (view.IssueStatus, '/lifeCycle/issuestatus', {'query': ('GET')}), - (view.IssueCatch, '/lifeCycle/issuecatch', {'write': ('POST')}), - # Update a package info - (view.UpdatePackages, '/lifeCycle/updatePkgInfo', {'write': ('PUT')}) -] diff --git a/packageship/packageship/application/apps/lifecycle/view.py b/packageship/packageship/application/apps/lifecycle/view.py deleted file mode 100644 index 61231f0f3491fa80c18fff1211de570a65de265a..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/lifecycle/view.py +++ /dev/null @@ -1,752 +0,0 @@ -#!/usr/bin/python3 -""" -Life cycle related API interfaces -""" -import io -import json -import math -import os -from concurrent.futures import ThreadPoolExecutor - -import pandas as pd -import yaml - -from flask import request -from flask import jsonify, make_response -from flask import current_app -from flask_restful import Resource -from marshmallow import ValidationError -from sqlalchemy.exc import DisconnectionError, SQLAlchemyError - -from packageship.libs.exception import Error -from packageship.application.apps.package.function.constants import ResponseCode -from packageship.libs.dbutils.sqlalchemy_helper import DBHelper -from packageship.application.models.package import PackagesIssue -from packageship.application.models.package import Packages
-from packageship.application.models.package import PackagesMaintainer -from packageship.libs.log import LOGGER -from packageship.libs.conf import configuration -from .serialize import IssueDownloadSchema, PackagesDownloadSchema, IssuePageSchema, IssueSchema -from ..package.serialize import DataFormatVerfi, UpdatePackagesSchema -from .function.gitee import Gitee as gitee - - -class DownloadFile(Resource): - """ - Download the content of the issue or the excel file of the package content - """ - - def _download_excel(self, file_type, table_name=None): - """ - Download excel file - """ - file_name = 'packages.xlsx' - if file_type == 'packages': - download_content = self.__get_packages_content(table_name) - else: - file_name = 'issues.xlsx' - download_content = self.__get_issues_content() - if download_content is None: - return jsonify( - ResponseCode.response_json( - ResponseCode.SERVICE_ERROR)) - pd_dataframe = self.__to_dataframe(download_content) - - _response = self.__bytes_save(pd_dataframe) - return self.__set_response_header(_response, file_name) - - def __bytes_save(self, data_frame): - """ - Save the file content in the form of a binary file stream - """ - try: - bytes_io = io.BytesIO() - writer = pd.ExcelWriter( # pylint: disable=abstract-class-instantiated - bytes_io, engine='xlsxwriter') - data_frame.to_excel(writer, sheet_name='Summary', index=False) - writer.save() - writer.close() - bytes_io.seek(0) - _response = make_response(bytes_io.getvalue()) - bytes_io.close() - return _response - except (IOError, Error) as io_error: - current_app.logger.error(io_error) - return make_response() - - def __set_response_header(self, response, file_name): - """ - Set http response header information - """ - response.headers['Content-Type'] = \ - "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet" - response.headers["Cache-Control"] = "no-cache" - response.headers['Content-Disposition'] = 'attachment; filename={file_name}'.format( - 
file_name=file_name) - return response - - def __get_packages_content(self, table_name): - """ - Get package list information - """ - try: - with DBHelper(db_name='lifecycle') as database: - # Query all package data in the specified table - _model = Packages.package_meta(table_name) - _packageinfos = database.session.query(_model).all() - packages_dicts = PackagesDownloadSchema( - many=True).dump(_packageinfos) - return packages_dicts - - except (SQLAlchemyError, DisconnectionError) as error: - current_app.logger.error(error) - return None - - def __get_issues_content(self): - """ - Get the list of issues - """ - try: - with DBHelper(db_name='lifecycle') as database: - _issues = database.session.query(PackagesIssue).all() - issues_dicts = IssueDownloadSchema(many=True).dump(_issues) - return issues_dicts - except (SQLAlchemyError, DisconnectionError) as error: - current_app.logger.error(error) - return None - - def __to_dataframe(self, datas): - """ - Convert the obtained information into a pandas DataFrame - """ - - data_frame = pd.DataFrame(datas) - return data_frame - - def get(self, file_type): - """ - Download package collection information and issue list information - - """ - if file_type not in ['packages', 'issues']: - return jsonify( - ResponseCode.response_json( - ResponseCode.PARAM_ERROR)) - - table_name = request.args.get('table_name', None) - response = self._download_excel(file_type, table_name) - return response - - -class MaintainerView(Resource): - """ - Maintainer name collection - """ - - def __query_maintainers(self): - """ - Query the names of all maintainers in the specified table - """ - try: - with DBHelper(db_name='lifecycle') as database: - maintainers = database.session.query( - PackagesMaintainer.maintainer).group_by(PackagesMaintainer.maintainer).all() - return [maintainer_item[0] for maintainer_item in maintainers - if maintainer_item[0]] - except (SQLAlchemyError, DisconnectionError) as error: - current_app.logger.error(error) - 
return [] - - def get(self): - """ - Get the list of maintainers - """ - # Group query of the names of all maintainers in the current table - maintainers = self.__query_maintainers() - return jsonify(ResponseCode.response_json( - ResponseCode.SUCCESS, - maintainers)) - - -class TableColView(Resource): - """ - The default column of the package shows the interface - """ - - def __columns_names(self): - """ - Mapping of column name and title - """ - columns = [ - ('name', 'Name', True), - ('version', 'Version', True), - ('release', 'Release', True), - ('url', 'Url', True), - ('rpm_license', 'License', False), - ('feature', 'Feature', False), - ('maintainer', 'Maintainer', True), - ('maintainlevel', 'Maintenance Level', True), - ('release_time', 'Release Time', False), - ('used_time', 'Used Time', True), - ('maintainer_status', 'Maintain Status', True), - ('latest_version', 'Latest Version', False), - ('latest_version_time', 'Latest Version Release Time', False), - ('issue', 'Issue', True)] - return columns - - def __columns_mapping(self): - """ - - """ - columns = list() - for column in self.__columns_names(): - columns.append({ - 'column_name': column[0], - 'label': column[1], - 'default_selected': column[2] - }) - return columns - - def get(self): - """ - Get the default display column of the package - - """ - table_mapping_columns = self.__columns_mapping() - return jsonify( - ResponseCode.response_json( - ResponseCode.SUCCESS, - table_mapping_columns)) - - -class LifeTables(Resource): - """ - description: LifeTables - Restful API: get - ChangeLog: - """ - - def get(self): - """ - return all table names in the database - - Returns: - Return the table names in the database as a list - """ - try: - with DBHelper(db_name="lifecycle") as database_name: - # View all table names in the package-info database - all_table_names = database_name.engine.table_names() - all_table_names.remove("packages_issue") - all_table_names.remove("packages_maintainer") - 
all_table_names.remove("databases_info") - return jsonify( - ResponseCode.response_json( - ResponseCode.SUCCESS, data=all_table_names) - ) - except (SQLAlchemyError, DisconnectionError, Error, ValueError) as sql_error: - LOGGER.logger.error(sql_error) - return jsonify( - ResponseCode.response_json(ResponseCode.DATABASE_NOT_FOUND) - ) - - -class IssueView(Resource): - """ - Issue content collection - """ - - def _query_issues(self, request_data): - """ - Args: - request_data: - Returns: - """ - try: - with DBHelper(db_name='lifecycle') as database: - issues_query = database.session.query(PackagesIssue.issue_id, - PackagesIssue.issue_url, - PackagesIssue.issue_title, - PackagesIssue.issue_status, - PackagesIssue.pkg_name, - PackagesIssue.issue_type, - PackagesMaintainer.maintainer). \ - outerjoin(PackagesMaintainer, - PackagesMaintainer.name == PackagesIssue.pkg_name) - if request_data.get("pkg_name"): - issues_query = issues_query.filter( - PackagesIssue.pkg_name == request_data.get("pkg_name")) - if request_data.get("issue_type"): - issues_query = issues_query.filter( - PackagesIssue.issue_type == request_data.get("issue_type")) - if request_data.get("issue_status"): - issues_query = issues_query.filter( - PackagesIssue.issue_status == request_data.get("issue_status")) - if request_data.get("maintainer"): - issues_query = issues_query.filter( - PackagesMaintainer.maintainer == request_data.get("maintainer")) - total_count = issues_query.count() - total_page = math.ceil( - total_count / int(request_data.get("page_size"))) - issues_query = issues_query.limit(request_data.get("page_size")).offset( - (int(request_data.get("page_num")) - 1) * int(request_data.get("page_size"))) - issue_dicts = IssuePageSchema( - many=True).dump(issues_query.all()) - issue_data = ResponseCode.response_json( - ResponseCode.SUCCESS, issue_dicts) - issue_data['total_count'] = total_count - issue_data['total_page'] = total_page - return issue_data - except (SQLAlchemyError, 
DisconnectionError) as error: - current_app.logger.error(error) - return ResponseCode.response_json(ResponseCode.DATABASE_NOT_FOUND) - - def get(self): - """ - Description: Get all issues info or one specific issue - Args: - Returns: - [ - { - "issue_id": "", - "issue_url": "", - "issue_title": "", - "issue_content": "", - "issue_status": "", - "issue_type": "" - }, - ] - Raises: - DisconnectionError: Unable to connect to database exception - AttributeError: Object does not have this property - TypeError: Exception of type - Error: Abnormal error - """ - schema = IssueSchema() - if schema.validate(request.args): - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - issue_dict = self._query_issues(request.args) - return issue_dict - - -class IssueType(Resource): - """ - Issue type collection - """ - - def _get_issue_type(self): - """ - Description: Query issue type - Returns: - """ - try: - with DBHelper(db_name='lifecycle') as database: - issues_query = database.session.query(PackagesIssue.issue_type).group_by( - PackagesIssue.issue_type).all() - return jsonify(ResponseCode.response_json( - ResponseCode.SUCCESS, [issue_query[0] for issue_query in issues_query])) - except (SQLAlchemyError, DisconnectionError) as error: - current_app.logger.error(error) - return jsonify(ResponseCode.response_json( - ResponseCode.PARAM_ERROR)) - - def get(self): - """ - Description: Get all issues info or one specific issue - Args: - Returns: - [ - "issue_type", - "issue_type" - ] - Raises: - DisconnectionError: Unable to connect to database exception - AttributeError: Object does not have this property - TypeError: Exception of type - Error: Abnormal error - """ - return self._get_issue_type() - - -class IssueStatus(Resource): - """ - Issue status collection - """ - - def _get_issue_status(self): - """ - Description: Query issue status - Returns: - """ - try: - with DBHelper(db_name='lifecycle') as database: - issues_query = 
database.session.query(PackagesIssue.issue_status).group_by( - PackagesIssue.issue_status).all() - return jsonify(ResponseCode.response_json( - ResponseCode.SUCCESS, [issue_query[0] for issue_query in issues_query])) - except (SQLAlchemyError, DisconnectionError) as error: - current_app.logger.error(error) - return jsonify(ResponseCode.response_json( - ResponseCode.PARAM_ERROR)) - - def get(self): - """ - Description: Get all issue statuses - Args: - Returns: - [ - "issue_status", - "issue_status" - ] - Raises: - DisconnectionError: Unable to connect to database exception - AttributeError: Object does not have this property - TypeError: Exception of type - Error: Abnormal error - """ - return self._get_issue_status() - - -class IssueCatch(Resource): - """ - description: Catch issue content - Restful API: post - ChangeLog: - """ - - def post(self): - """ - Process issue data pushed by the code-hosting platform's webhook - Args: - Returns: - for examples: - [ - { - "issue_id": "", - "issue_url": "", - "issue_title": "", - "issue_content": "", - "issue_status": "", - "issue_type": "" - }, - ] - Raises: - DisconnectionError: Unable to connect to database exception - AttributeError: Object does not have this property - TypeError: Exception of type - Error: Abnormal error - """ - data = json.loads(request.get_data()) - if not isinstance(data, dict): - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR)) - pkg_name = data["repository"]["path"] - try: - pool_workers = configuration.POOL_WORKERS - _warehouse = configuration.WAREHOUSE - if not isinstance(pool_workers, int): - pool_workers = 10 - pool = ThreadPoolExecutor(max_workers=pool_workers) - with DBHelper(db_name="lifecycle") as database: - for table_name in filter(lambda x: x not in ['packages_issue', 'packages_maintainer', - 'databases_info'], - database.engine.table_names()): - cls_model = Packages.package_meta(table_name) - for package_item in database.session.query(cls_model).filter( - cls_model.name == 
pkg_name).all(): - gitee_issue = gitee( - package_item, _warehouse, package_item.name, table_name) - pool.submit(gitee_issue.issue_hooks, data) - pool.shutdown() - return jsonify(ResponseCode.response_json(ResponseCode.SUCCESS)) - except SQLAlchemyError as error_msg: - current_app.logger.error(error_msg) - - -class UpdatePackages(Resource): - """ - description:Life cycle update information of a single package - Restful API: post - ChangeLog: - """ - - def _get_all_yaml_name(self, filepath): - """ - List of all yaml file names in the folder - - Args: - filepath: file path - - Returns: - yaml_file_list:List of all yaml file names in the folder - - Attributes: - Error:Error - NotADirectoryError:Invalid directory name - FileNotFoundError:File not found error - - """ - try: - yaml_file_list = os.listdir(filepath) - return yaml_file_list - except (Error, NotADirectoryError, FileNotFoundError) as error: - current_app.logger.error(error) - return None - - def _get_yaml_content(self, yaml_file, filepath): - """ - Read the content of the yaml file - - Args: - yaml_file: yaml file - filepath: file path - - Returns: - Return a dictionary containing name, maintainer and maintainlevel - """ - yaml_data_dict = dict() - if not yaml_file.endswith(".yaml"): - return None - pkg_name = yaml_file.rsplit('.yaml')[0] - single_yaml_path = os.path.join(filepath, yaml_file) - with open(single_yaml_path, 'r', encoding='utf-8') as file_context: - yaml_flie_data = yaml.load( - file_context.read(), Loader=yaml.FullLoader) - if yaml_flie_data is None or not isinstance(yaml_flie_data, dict): - return None - maintainer = yaml_flie_data.get("maintainer") - maintainlevel = yaml_flie_data.get("maintainlevel") - yaml_data_dict['name'] = pkg_name - if maintainer: - yaml_data_dict['maintainer'] = maintainer - if maintainlevel: - yaml_data_dict['maintainlevel'] = maintainlevel - return yaml_data_dict - - def _read_yaml_file(self, filepath): - """ - Read the yaml file and combine the data of the nested 
dictionary of the list - - Args: - filepath: file path - - Returns: - yaml.YAMLError:yaml file error - SQLAlchemyError:SQLAlchemy Error - DisconnectionError:Connect to database error - Error:Error - """ - yaml_file_list = self._get_all_yaml_name(filepath) - if not yaml_file_list: - return None - try: - yaml_data_list = list() - pool_workers = configuration.POOL_WORKERS - if not isinstance(pool_workers, int): - pool_workers = 10 - with ThreadPoolExecutor(max_workers=pool_workers) as pool: - for yaml_file in yaml_file_list: - pool_result = pool.submit( - self._get_yaml_content, yaml_file, filepath) - yaml_data_dict = pool_result.result() - yaml_data_list.append(yaml_data_dict) - return yaml_data_list - except (yaml.YAMLError, SQLAlchemyError, DisconnectionError, Error) as error: - current_app.logger.error(error) - return None - - def _verification_yaml_data_list(self, yaml_data_list): - """ - Verify the data obtained in the yaml file - - Args: - yaml_data_list: yaml data list - - Returns: - yaml_data_list: After verification yaml data list - - Attributes: - ValidationError: Validation error - - """ - try: - DataFormatVerfi(many=True).load(yaml_data_list) - return yaml_data_list - except ValidationError as error: - current_app.logger.error(error.messages) - return None - - def _save_in_database(self, yaml_data_list): - """ - Save the data to the database - - Args: - tbname: Table Name - name_separate_list: Split name list - _update_pack_data: Split new list of combined data - - Returns: - SUCCESS or UPDATA_DATA_FAILED - - Attributes - DisconnectionError: Connect to database error - SQLAlchemyError: SQLAlchemy Error - Error: Error - - """ - try: - with DBHelper(db_name="lifecycle") as database_name: - if 'packages_maintainer' not in database_name.engine.table_names(): - return jsonify(ResponseCode.response_json( - ResponseCode.TABLE_NAME_NOT_EXIST)) - database_name.session.begin(subtransactions=True) - for yaml_data in yaml_data_list: - name = yaml_data.get("name") - 
maintainer = yaml_data.get("maintainer") - maintainlevel = yaml_data.get("maintainlevel") - packages_maintainer_obj = database_name.session.query( - PackagesMaintainer).filter_by(name=name).first() - if packages_maintainer_obj: - if maintainer: - packages_maintainer_obj.maintainer = maintainer - if maintainlevel: - packages_maintainer_obj.maintainlevel = maintainlevel - else: - database_name.add(PackagesMaintainer( - name=name, maintainer=maintainer, maintainlevel=maintainlevel - )) - database_name.session.commit() - return jsonify(ResponseCode.response_json( - ResponseCode.SUCCESS)) - except (DisconnectionError, SQLAlchemyError, Error, AttributeError) as error: - current_app.logger.error(error) - return jsonify(ResponseCode.response_json( - ResponseCode.UPDATA_DATA_FAILED)) - - def _overall_process( - self, - filepath): - """ - Call each method to complete the entire function - - Args: - filepath: file path - tbname: table name - - Returns: - SUCCESS or UPDATA_DATA_FAILED - - Attributes - DisconnectionError: Connect to database error - SQLAlchemyError: SQLAlchemy Error - Error: Error - """ - try: - if filepath is None or not os.path.exists(filepath): - return jsonify(ResponseCode.response_json( - ResponseCode.SPECIFIED_FILE_NOT_EXIST)) - yaml_file_list = self._get_all_yaml_name(filepath) - if not yaml_file_list: - return jsonify(ResponseCode.response_json( - ResponseCode.EMPTY_FOLDER)) - yaml_data_list_result = self._read_yaml_file(filepath) - yaml_data_list = self._verification_yaml_data_list( - yaml_data_list_result) - if yaml_data_list is None: - return jsonify(ResponseCode.response_json( - ResponseCode.YAML_FILE_ERROR)) - result = self._save_in_database( - yaml_data_list) - return result - except (DisconnectionError, SQLAlchemyError, Error) as error: - current_app.logger.error(error) - return jsonify(ResponseCode.response_json( - ResponseCode.UPDATA_DATA_FAILED)) - - def _update_single_package_info( - self, srcname, maintainer, maintainlevel): - """ - Update 
the maintainer field and maintainlevel - field of a single package - - Args: - srcname: The name of the source package - maintainer: Package maintainer - maintainlevel: Package maintenance level - - Returns: - success or failed - - Attributes - SQLAlchemyError: sqlalchemy error - DisconnectionError: Cannot connect to database error - Error: Error - """ - if not srcname: - return jsonify( - ResponseCode.response_json(ResponseCode.PACK_NAME_NOT_FOUND) - ) - if not maintainer and not maintainlevel: - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - try: - with DBHelper(db_name='lifecycle') as database_name: - if 'packages_maintainer' not in database_name.engine.table_names(): - return jsonify(ResponseCode.response_json( - ResponseCode.TABLE_NAME_NOT_EXIST)) - update_obj = database_name.session.query( - PackagesMaintainer).filter_by(name=srcname).first() - if update_obj: - if maintainer: - update_obj.maintainer = maintainer - if maintainlevel: - update_obj.maintainlevel = maintainlevel - else: - database_name.add(PackagesMaintainer( - name=srcname, maintainer=maintainer, maintainlevel=maintainlevel - )) - database_name.session.commit() - return jsonify( - ResponseCode.response_json( - ResponseCode.SUCCESS)) - except (SQLAlchemyError, DisconnectionError, Error) as sql_error: - current_app.logger.error(sql_error) - database_name.session.rollback() - return jsonify(ResponseCode.response_json( - ResponseCode.UPDATA_DATA_FAILED - )) - - def put(self): - """ - Life cycle update information of a single package or - All packages - - Returns: - for example:: - { - "code": "", - "data": "", - "msg": "" - } - """ - schema = UpdatePackagesSchema() - data = request.get_json() - if schema.validate(data): - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - srcname = data.get('pkg_name', None) - maintainer = data.get('maintainer', None) - maintainlevel = data.get('maintainlevel', None) - batch = data.get('batch') - filepath = 
data.get('filepath', None) - - if batch: - result = self._overall_process(filepath) - else: - result = self._update_single_package_info( - srcname, maintainer, maintainlevel) - return result diff --git a/packageship/packageship/application/apps/package/__init__.py b/packageship/packageship/application/apps/package/__init__.py deleted file mode 100644 index 987ad6145195cd94d044464c03e277aa73bf270a..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/package/__init__.py +++ /dev/null @@ -1,16 +0,0 @@ -from flask.blueprints import Blueprint -from flask_restful import Api -from packageship.application.apps.package.url import urls -from packageship import application - -package = Blueprint('package', __name__) - -# init restapi -api = Api() - -for view, url, operation in urls: - if application.OPERATION and application.OPERATION in operation.keys(): - api.add_resource(view, url) - - -__all__ = ['package', 'api'] diff --git a/packageship/packageship/application/apps/package/function/__init__.py b/packageship/packageship/application/apps/package/function/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/packageship/packageship/application/apps/package/function/be_depend.py b/packageship/packageship/application/apps/package/function/be_depend.py deleted file mode 100644 index 91cb5eebcca838791cf0c1511fe4c5bd63f3f9cd..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/package/function/be_depend.py +++ /dev/null @@ -1,339 +0,0 @@ -#!/usr/bin/python3 -""" -Description:The dependencies of the query package - Used for package deletion and upgrade scenarios - This includes both install and build dependencies -Class: BeDepend -""" -import copy -from collections import namedtuple, defaultdict -from flask import current_app -from sqlalchemy import text -from sqlalchemy.exc import SQLAlchemyError -from sqlalchemy.sql import 
literal_column -from packageship.application.models.package import SrcPack -from packageship.libs.dbutils import DBHelper - - -class BeDepend(): - """ - Description: Find the dependencies of the source package - Attributes: - source_name: source name - db_name: database name - with_sub_pack: with_sub_pack - source_name_set:Source package lookup set - bin_name_set:Bin package lookup set - result_dict:return json - """ - - def __init__(self, source_name, db_name, with_sub_pack): - """ - init class - """ - self.source_name = source_name - self.db_name = db_name - self.with_sub_pack = with_sub_pack - self.source_name_set = set() - self.bin_name_set = set() - self.result_dict = dict() - self.comm_install_builds = defaultdict(set) - self.provides_name = set() - - def main(self): - """ - Description: Query the database; if the source - package being queried is not in the database, - the empty result is returned directly - Args: - Returns: - "source name": [ - "source", - "version", - "dbname", - [ - [ - "root", - null - ] - ] - ] - Raises: - """ - with DBHelper(db_name=self.db_name) as data_base: - src_obj = data_base.session.query( - SrcPack).filter_by(name=self.source_name).first() - if src_obj: - # Assemble the result dictionary entry - self.result_dict[self.source_name + "_src"] = [ - "source", - src_obj.version, - self.db_name, - [["root", None]] - ] - self.source_name_set.add(self.source_name) - self._provides_bedepend( - [self.source_name], data_base, package_type='src') - - for _, value in self.result_dict.items(): - value[-1] = list(value[-1]) - return self.result_dict - - def _get_provides(self, pkg_name_list, data_base, package_type): - """ - Description: Query the components provided by the required package - Args: - pkg_name_list:source or binary packages name - data_base: database - package_type: package type - Returns: - Raises: - SQLAlchemyError: Database connection exception - """ - res = namedtuple( - 'restuple', [ - 'search_bin_name', 'search_bin_version', 'source_name']) - 
sql_com = """ - SELECT DISTINCT b1.name AS search_bin_name, - b1.version AS search_bin_version, - b1.src_name AS source_name, - bin_provides.name As pro_name - FROM ( SELECT pkgKey,src_name,name,version FROM bin_pack WHERE {} ) b1 - LEFT JOIN bin_provides ON bin_provides.pkgKey = b1.pkgKey;""" - - # package_type - if package_type == 'src': - literal_name = 'src_name' - elif package_type == 'bin': - literal_name = 'name' - - # Query database - # The lower version of SQLite can look up up to 999 parameters - # simultaneously, so use 900 sharding queries - try: - result = [] - for input_name in (pkg_name_list[i:i + 900] - for i in range(0, len(pkg_name_list), 900)): - name_in = literal_column(literal_name).in_(input_name) - sql_str = text(sql_com.format(name_in)) - result.extend(data_base.session.execute( - sql_str, - { - literal_name + '_{}'.format(i): v - for i, v in enumerate(input_name, 1) - } - ).fetchall()) - except SQLAlchemyError as sql_err: - current_app.logger.error(sql_err) - return - - if not result: - return - - # Process the result of the component - pro_name_dict = dict() - - _components = set() - for obj in result: - if not obj.pro_name: - continue - # De-weight components - if obj.pro_name not in self.comm_install_builds: - pro_name_dict[obj.pro_name] = res( - obj.search_bin_name, obj.search_bin_version, obj.source_name) - - if obj.search_bin_name not in self.result_dict: - self.result_dict[obj.search_bin_name] = [ - obj.source_name, - obj.search_bin_version, - self.db_name, - self.comm_install_builds[obj.pro_name] - if self.comm_install_builds[obj.pro_name] else {(None, None)} - ] - tmp_ = copy.deepcopy(self.comm_install_builds[obj.pro_name]) - - tmp_.discard((obj.search_bin_name, 'install')) - tmp_.discard((obj.search_bin_name, 'build')) - - if (None, None) in self.result_dict[obj.search_bin_name][-1] \ - and self.comm_install_builds[obj.pro_name]: - self.result_dict[obj.search_bin_name][-1] = tmp_ - else: - 
self.result_dict[obj.search_bin_name][-1].update(tmp_) - return pro_name_dict - - def _provides_bedepend(self, pkg_name_list, data_base, package_type): - """ - Description: Query the dependent function - Args: - pkg_name_list:source or binary packages name - data_base: database - package_type: package type - Returns: - Raises: - SQLAlchemyError: Database connection exception - """ - # Query component - pro_names = self._get_provides(pkg_name_list, data_base, package_type) - - if not pro_names: - return - - sql_2_bin = """ - SELECT DISTINCT - b2.name AS bin_name, - b2.src_name AS install_depend_src_name, - br.name AS pro_name - FROM - ( SELECT name, pkgKey FROM bin_requires WHERE {}) br - LEFT JOIN bin_pack b2 ON b2.pkgKey = br.pkgKey; - """ - - sql_2_src = """ - SELECT DISTINCT - s1.name AS bebuild_src_name, - sr.name AS pro_name - FROM - ( SELECT name, pkgKey FROM src_requires WHERE {} ) sr - LEFT JOIN src_pack s1 ON s1.pkgKey = sr.pkgKey; - """ - - provides_name_list = [pro for pro, _ in pro_names.items()] - - result_2_bin = [] - result_2_src = [] - # Query database - try: - for input_name in ( - provides_name_list[i:i + 900] for i in range(0, len(provides_name_list), 900)): - name_in = literal_column('name').in_(input_name) - sql_str_2_bin = text(sql_2_bin.format(name_in)) - result_2_bin.extend(data_base.session.execute( - sql_str_2_bin, - { - 'name_{}'.format(i): v - for i, v in enumerate(input_name, 1) - } - ).fetchall()) - sql_str_2src = text(sql_2_src.format(name_in)) - result_2_src.extend(data_base.session.execute( - sql_str_2src, - { - 'name_{}'.format(i): v - for i, v in enumerate(input_name, 1) - } - ).fetchall()) - except SQLAlchemyError as sql_err: - current_app.logger.error(sql_err) - return - - source_name_list = [] - bin_name_list = [] - - # Process the data that the installation depends on - for bin_info in result_2_bin: - temp_bin_pkg = bin_info.bin_name - temp_sub_src_pkg = bin_info.install_depend_src_name - - #withsubpick ==1 - if 
self.with_sub_pack == '1' and temp_sub_src_pkg not in self.source_name_set: - self.source_name_set.add(temp_sub_src_pkg) - source_name_list.append(temp_sub_src_pkg) - - if temp_bin_pkg not in self.bin_name_set: - self.bin_name_set.add(temp_bin_pkg) - bin_name_list.append(temp_bin_pkg) - - if bin_info.pro_name not in self.comm_install_builds: - self.comm_install_builds[bin_info.pro_name] = { - (bin_info.bin_name, 'install') - } - - elif (bin_info.bin_name, 'install') not in \ - self.comm_install_builds[bin_info.pro_name]: - - self.comm_install_builds[bin_info.pro_name].add( - (bin_info.bin_name, 'install') - ) - - self.make_dicts( - pro_names.get(bin_info.pro_name).search_bin_name, - pro_names.get(bin_info.pro_name).source_name, - pro_names.get(bin_info.pro_name).search_bin_version, - bin_info.bin_name, - 'install' - ) - # Process data that is compile-dependent - for src_info in result_2_src: - if src_info.bebuild_src_name not in self.source_name_set: - self.source_name_set.add(src_info.bebuild_src_name) - source_name_list.append(src_info.bebuild_src_name) - - if src_info.pro_name not in self.comm_install_builds: - self.comm_install_builds[src_info.pro_name] = { - (src_info.bebuild_src_name, 'build') - } - elif (src_info.bebuild_src_name, 'build') not in \ - self.comm_install_builds[src_info.pro_name]: - - self.comm_install_builds[src_info.pro_name].add( - (src_info.bebuild_src_name, 'build') - ) - - self.make_dicts( - pro_names.get(src_info.pro_name).search_bin_name, - pro_names.get(src_info.pro_name).source_name, - pro_names.get(src_info.pro_name).search_bin_version, - src_info.bebuild_src_name, - 'build' - ) - # Recursively query all source packages that need to be looked up - if source_name_list: - self._provides_bedepend( - source_name_list, data_base, package_type="src") - # Recursively query all binary packages that need to be looked up - if bin_name_list: - self._provides_bedepend( - bin_name_list, data_base, package_type="bin") - - def make_dicts(self, key, 
source_name, version, parent_node, be_type): - """ - Description: Assemble the result dictionary for one dependency edge - Args: - key: dependent bin name - source_name: source name - version: version - parent_node: name of the package that depends on this one - be_type: dependency type ('install' or 'build') - Returns: - Raises: - """ - if key not in self.result_dict: - self.result_dict[key] = [ - source_name, - version, - self.db_name, - { - (parent_node, - be_type - ) - } - - ] - else: - if be_type and parent_node: - if (None, None) in self.result_dict[key][-1]: - self.result_dict[key][-1] = { - ( - parent_node, - be_type - ) - } - - elif (parent_node, be_type) not in self.result_dict[key][-1]: - self.result_dict[key][-1].add( - ( - parent_node, - be_type - ) - ) diff --git a/packageship/packageship/application/apps/package/function/build_depend.py b/packageship/packageship/application/apps/package/function/build_depend.py deleted file mode 100644 index bb087108cc9947651ba84c2045f6954f979e8a74..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/package/function/build_depend.py +++ /dev/null @@ -1,253 +0,0 @@ -#!/usr/bin/python3 -""" -Description: Find compilation dependency of source package -class: BuildDepend -""" -from packageship.application.apps.package.function.searchdb import SearchDB -from packageship.application.apps.package.function.install_depend import InstallDepend -from packageship.application.apps.package.function.constants import ResponseCode, ListNode - - -class BuildDepend(): - """ - Description: Find compilation dependency of source package - Attributes: - pkg_name_list: List of package names - db_list: List of database names - self_build: Compile dependency conditions - history_dict: Query history dict - search_db: An instance of the database query class - result_dict: A dictionary storing the data that needs to be echoed - source_dict: A dictionary storing the searched source package names - not_found_components: Contain the package not found components - __already_pk_val: List 
of pkgKey found - """ - - # pylint: disable = R0902 - def __init__(self, pkg_name_list, db_list, self_build=0, history_dict=None): - """ - init class - """ - self.pkg_name_list = pkg_name_list - self._self_build = self_build - - self.db_list = db_list - self.search_db = SearchDB(db_list) - - self.result_dict = dict() - self.source_dict = dict() - - self.history_dicts = history_dict if history_dict else {} - self.not_found_components = set() - - self.__already_pk_val = [] - - def build_depend_main(self): - """ - Description: Entry function - Args: - Returns: - ResponseCode: response code - result_dict: Dictionary of query results - source_dict: Dictionary of source code package - not_found_components: Set of package not found components - Raises: - """ - if not self.search_db.db_object_dict: - return ResponseCode.DIS_CONNECTION_DB, None, None, set() - - if self._self_build == 0: - code = self.build_depend(self.pkg_name_list) - if None in self.result_dict: - del self.result_dict[None] - return code, self.result_dict, None, self.not_found_components - - if self._self_build == 1: - self.self_build(self.pkg_name_list) - if None in self.result_dict: - del self.result_dict[None] - # There are two reasons for the current status code to return SUCCESS - # 1, Other branches return three return values. - # Here, a place holder is needed to prevent unpacking errors during call - # 2, This function is an auxiliary function of other modules. 
- # The status code is not the final display status code - return (ResponseCode.SUCCESS, self.result_dict, - self.source_dict, self.not_found_components) - - return ResponseCode.PARAM_ERROR, None, None, set() - - def build_depend(self, pkg_list): - """ - Description: Compile dependency query - Args: - pkg_list:You need to find the dependent source package name - Returns: - ResponseCode: response code - Raises: - """ - (res_status, - build_list, - not_fd_com_build, - pk_v - ) = self.search_db.get_build_depend(pkg_list, pk_value=self.__already_pk_val) - - self.__already_pk_val = pk_v - self.not_found_components.update(not_fd_com_build) - if not build_list: - return res_status if res_status == ResponseCode.DIS_CONNECTION_DB else \ - ResponseCode.PACK_NAME_NOT_FOUND - - # create root node and get next search list - search_list = self._create_node_and_get_search_list(build_list, pkg_list) - - code, res_dict, not_fd_com_install = \ - InstallDepend(self.db_list).query_install_depend(search_list, - history_pk_val=self.__already_pk_val, - history_dicts=self.history_dicts) - self.not_found_components.update(not_fd_com_install) - if not res_dict: - return code - - for k, values in res_dict.items(): - if k in self.result_dict: - if ['root', None] in values[ListNode.PARENT_LIST]: - index = values[ListNode.PARENT_LIST].index(['root', None]) - del values[ListNode.PARENT_LIST][index] - - self.result_dict[k][ListNode.PARENT_LIST].extend(values[ListNode.PARENT_LIST]) - else: - self.result_dict[k] = values - - return ResponseCode.SUCCESS - - def _create_node_and_get_search_list(self, build_list, pkg_list): - """ - Description: To create root node in self.result_dict and - return the name of the source package to be found next time - Args: - build_list:List of binary package names - pkg_list: List of binary package names - Returns: - the name of the source package to be found next time - Raises: - """ - search_set = set() - search_list = [] - for obj in build_list: - if not 
obj.search_name: - continue - - if obj.search_name + "_src" not in self.result_dict: - self.result_dict[obj.search_name + "_src"] = [ - 'source', - obj.search_version, - obj.db_name, - [ - ['root', None] - ] - ] - search_set.add(obj.search_name) - - if not obj.bin_name: - continue - - if obj.bin_name in self.history_dicts: - self.result_dict[obj.bin_name] = [ - self.history_dicts[obj.bin_name][ListNode.SOURCE_NAME], - self.history_dicts[obj.bin_name][ListNode.VERSION], - self.history_dicts[obj.bin_name][ListNode.DBNAME], - [ - [obj.search_name, 'build'] - ] - ] - else: - if obj.bin_name in search_list: - self.result_dict[obj.bin_name][ListNode.PARENT_LIST].append([ - obj.search_name, 'build' - ]) - else: - self.result_dict[obj.bin_name] = [ - obj.source_name, - obj.version, - self.search_db.binary_search_database_for_first_time(obj.bin_name), - [ - [obj.search_name, 'build'] - ] - ] - search_list.append(obj.bin_name) - - if search_set and len(search_set) != len(pkg_list): - temp_set = set(pkg_list) - search_set - for name in temp_set: - self.result_dict[name + "_src"] = [ - None, - None, - 'NOT_FOUND', - [ - ['root', None] - ] - ] - return search_list - - def self_build(self, pkg_name_li): - """ - Description: Using recursion to find compilation dependencies - Args: - pkg_name_li: Source package name list - Returns: - Raises: - """ - if not pkg_name_li: - return - - next_src_set = set() - (_, - bin_info_lis, - not_fd_com, - pk_v - ) = self.search_db.get_build_depend(pkg_name_li, - pk_value=self.__already_pk_val) - self.__already_pk_val = pk_v - self.not_found_components.update(not_fd_com) - if not bin_info_lis: - return - - # generate data content - search_name_set = set() - for obj in bin_info_lis: - - search_name_set.add(obj.search_name) - if obj.search_name not in self.source_dict: - self.source_dict[obj.search_name] = [obj.db_name, obj.search_version] - - if not obj.bin_name: - continue - - if obj.bin_name not in self.result_dict: - 
self.result_dict[obj.bin_name] = [ - obj.source_name if obj.source_name else None, - obj.version if obj.version else None, - self.search_db.binary_search_database_for_first_time(obj.bin_name), - [ - [obj.search_name, "build"] - ] - ] - else: - node = [obj.search_name, "build"] - node_list = self.result_dict[obj.bin_name][-1] - if node not in node_list: - node_list.append(node) - - if obj.source_name and \ - obj.source_name not in self.source_dict and \ - obj.source_name not in self.history_dicts: - next_src_set.add(obj.source_name) - - not_found_pkg = set(pkg_name_li) - search_name_set - for pkg_name in not_found_pkg: - if pkg_name not in self.source_dict: - self.source_dict[pkg_name] = ['NOT FOUND', 'NOT FOUND'] - not_found_pkg.clear() - self.self_build(next_src_set) - - return diff --git a/packageship/packageship/application/apps/package/function/constants.py b/packageship/packageship/application/apps/package/function/constants.py deleted file mode 100644 index 872632b94e54a47ea98de9ff53548475e188569e..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/package/function/constants.py +++ /dev/null @@ -1,108 +0,0 @@ -#!/usr/bin/python3 -""" -Description: Response content and code ID -class: ListNode, ResponseCode -""" - - -class ListNode(): - """ - Description: Describes the structure of the result dict: - {package_name: [source_name, - version, - dbname, - [[parent_node_1, depend_type], [parent_node_2, depend_type], ...]] - } - changeLog: - """ - - SOURCE_NAME = 0 - VERSION = 1 - DBNAME = 2 - PARENT_LIST = 3 - # FOR PARENT LIST: - PARENT_NODE = 0 - DEPEND_TYPE = 1 - -# response code - - -class ResponseCode(): - """ - Description: response code to web - changeLog: - """ - # Four digits are common status codes - SUCCESS = "2001" - PARAM_ERROR = "4001" - DB_NAME_ERROR = "4002" - PACK_NAME_NOT_FOUND = "4003" - CONNECT_DB_ERROR = "4004" - INPUT_NONE = "4005" - FILE_NOT_FOUND = "4006" - SPECIFIED_FILE_NOT_EXIST = "4007" - UPDATA_OR_ADD_DATA_FAILED = 
"4008" - TABLE_NAME_NOT_EXIST = "4009" - UPDATA_DATA_FAILED = "4010" - NOT_FOUND_DATABASE_INFO = "4011" - # Database operation module error status code - DELETE_DB_ERROR = "40051" - SERVICE_ERROR = "50000" - CONFIGFILE_PATH_EMPTY = "50001" - FAILED_CREATE_DATABASE_TABLE = "50002" - TYPE_ERROR = "50003" - DATA_MERGE_ERROR = "50004" - FILE_NOT_FIND_ERROR = "50005" - DIS_CONNECTION_DB = "50006" - NO_PACKAGES_TABLE = "60001" - DATABASE_NOT_FOUND = "60002" - TABLE_NAME_NOT_EXIST_IN_DATABASE = "60003" - YAML_FILE_ERROR = "70001" - EMPTY_FOLDER = "70002" - - CODE_MSG_MAP = { - SUCCESS: "Successful Operation!", - PARAM_ERROR: "Parameter error, please check the parameter and query again.", - DB_NAME_ERROR: "Database does not exist! Please check the database name", - PACK_NAME_NOT_FOUND: "Sorry! The queried package does not exist in the databases", - CONNECT_DB_ERROR: "Failed to connect to the database! " - "Please check the database connection", - INPUT_NONE: "The input is None, please check the input value.", - FILE_NOT_FOUND: "Database import success file does not exist", - DELETE_DB_ERROR: "Failed to delete database", - CONFIGFILE_PATH_EMPTY: "Initialization profile does not exist or cannot be found", - FAILED_CREATE_DATABASE_TABLE: "Failed to create database or table", - TYPE_ERROR: "The source code and binary path types in the initialization file are abnormal", - DATA_MERGE_ERROR: "Abnormal multi-file database integration", - FILE_NOT_FIND_ERROR: "System initialization configuration file does not exist", - DIS_CONNECTION_DB: "Unable to connect to the database, check the database configuration", - SERVICE_ERROR: "An exception occurred in the system, please try again later", - SPECIFIED_FILE_NOT_EXIST: "The specified file does not exist", - NO_PACKAGES_TABLE: "There is no packages table in the database", - UPDATA_OR_ADD_DATA_FAILED: "Failed to update or add data", - DATABASE_NOT_FOUND: "There is no such database in the system", - TABLE_NAME_NOT_EXIST: "There is no such 
table in the database", - UPDATA_DATA_FAILED: "Failed to update data", - TABLE_NAME_NOT_EXIST_IN_DATABASE: "The table name does not match the existing database", - NOT_FOUND_DATABASE_INFO: "Unable to get the generated database information", - YAML_FILE_ERROR: "Data error in yaml file", - EMPTY_FOLDER: "This is an empty folder, no yaml file" - } - - @classmethod - def response_json(cls, code, data=None, msg=None): - """ - Description: Build the standard JSON response body for a status code - """ - try: - _msg = cls.CODE_MSG_MAP[code] - except KeyError: - _msg = msg - return { - "code": code, - "msg": _msg, - "data": data - } - - def __str__(self): - return 'ResponseCode' diff --git a/packageship/packageship/application/apps/package/function/install_depend.py b/packageship/packageship/application/apps/package/function/install_depend.py deleted file mode 100644 index 70967693c14b66ecdded3013e8be199f53376c89..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/package/function/install_depend.py +++ /dev/null @@ -1,177 +0,0 @@ -#!/usr/bin/python3 -""" -Description: Querying for install dependencies - Query the install dependencies a package needs so that it can be installed -class: InstallDepend, DictionaryOperations -""" -from packageship.libs.log import Log -from packageship.application.apps.package.function.searchdb import SearchDB -from packageship.application.apps.package.function.constants import ResponseCode, ListNode - -LOGGER = Log(__name__) - - -class InstallDepend(): - """ - Description: query install depend of package - Attributes: - db_list: A list of database names, ordered by priority - __search_list: Contain the binary packages searched in the next loop - binary_dict: Contain all the binary packages info and operation - __search_db: An object of the database which would be connected - not_found_components: Contain the package not found components - __already_pk_value: List of pkgKey found - changeLog: - """ - - # pylint: disable = too-few-public-methods - def __init__(self, 
db_list): - """ - Initialization class - """ - self.binary_dict = DictionaryOperations() - self.__search_list = [] - - self.db_list = db_list - self.__search_db = SearchDB(db_list) - self.not_found_components = set() - self.__already_pk_value = [] - - def query_install_depend(self, binary_list, history_pk_val=None, history_dicts=None): - """ - Description: init result dict and determine the loop end point - Args: - binary_list: A list of binary rpm package names - history_dicts: record of the install-depend search history, - default is None - history_pk_val: List of pkgKey found - Returns: - binary_dict.dictionary: - {binary_name: [ - src, - version, - dbname, - [ - parent_node_package_name - 'install' - ] - ]} - not_found_components: Set of package not found components - Raises: - """ - if not self.__search_db.db_object_dict: - return ResponseCode.DIS_CONNECTION_DB, None, set() - if not binary_list: - return ResponseCode.INPUT_NONE, None, set() - for binary in binary_list: - if binary: - self.__search_list.append(binary) - else: - LOGGER.logger.warning("There is a NONE in input value: %s", str(binary_list)) - self.__already_pk_value = history_pk_val if history_pk_val else [] - while self.__search_list: - self.__query_single_install_dep(history_dicts) - return ResponseCode.SUCCESS, self.binary_dict.dictionary, self.not_found_components - - def __query_single_install_dep(self, history_dicts): - """ - Description: query a package's install depends and append them to the result - Args: - history_dicts: record of the install-depend search history - Returns: - response_code: response code - Raises: - """ - res_list, not_found_components, pk_val = self.__search_db.get_install_depend(self.__search_list, - pk_value=self.__already_pk_value) - result_list = set(res_list) - self.not_found_components.update(not_found_components) - self.__already_pk_value = pk_val - for search in self.__search_list: - if search not in self.binary_dict.dictionary: - self.binary_dict.init_key(key=search, parent_node=[]) 
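These batched lookups rely on the chunked IN-clause pattern noted in the query code above: older SQLite builds cap a statement at 999 bound parameters, so name lists are split into groups of 900. A minimal standalone sketch of that pattern follows; the `bin_pack(name, version)` table and function name are illustrative, not part of the original module:

```python
import sqlite3

MAX_PARAMS = 900  # stay safely under SQLite's historical 999-variable cap


def query_in_chunks(conn, names):
    """Run `... WHERE name IN (...)` over `names` in groups of MAX_PARAMS."""
    rows = []
    for i in range(0, len(names), MAX_PARAMS):
        chunk = names[i:i + MAX_PARAMS]
        # one "?" placeholder per name in this chunk
        placeholders = ", ".join("?" for _ in chunk)
        sql = "SELECT name, version FROM bin_pack WHERE name IN ({})".format(placeholders)
        rows.extend(conn.execute(sql, chunk).fetchall())
    return rows
```

Each chunk is a separate statement, so no single execution ever binds more than 900 parameters, regardless of how long the input list grows.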
- self.__search_list.clear() - if result_list: - for result, dbname in result_list: - if not self.binary_dict.dictionary[result.search_name][ListNode.PARENT_LIST]: - self.binary_dict.init_key(key=result.search_name, - src=result.search_src_name, - version=result.search_version, - dbname=dbname) - else: - self.binary_dict.update_value(key=result.search_name, - src=result.search_src_name, - version=result.search_version, - dbname=dbname) - - if result.depend_name: - if result.depend_name in self.binary_dict.dictionary: - self.binary_dict.update_value(key=result.depend_name, - parent_node=[result.search_name, 'install']) - elif history_dicts is not None and result.depend_name in history_dicts: - self.binary_dict.init_key( - key=result.depend_name, - src=history_dicts[result.depend_name][ListNode.SOURCE_NAME], - version=history_dicts[result.depend_name][ListNode.VERSION], - dbname=None, - parent_node=[[result.search_name, 'install']] - ) - else: - self.binary_dict.init_key(key=result.depend_name, - parent_node=[[result.search_name, 'install']]) - self.__search_list.append(result.depend_name) - - -class DictionaryOperations(): - """ - Description: Related to dictionary operations, creating dictionary, append dictionary - Attributes: - dictionary: Contain all the binary packages info after searching - changeLog: - """ - - def __init__(self): - """ - init class - """ - self.dictionary = dict() - - # pylint: disable=R0913 - def init_key(self, key, src=None, version=None, dbname=None, parent_node=None): - """ - Description: Creating dictionary - Args: - key: binary_name - src: source_name - version: version - dbname: databases name - parent_node: parent_node - Returns: - dictionary[key]: [src, version, dbname, parent_node] - """ - if dbname: - self.dictionary[key] = [src, version, dbname, [['root', None]]] - else: - self.dictionary[key] = [src, version, dbname, parent_node] - - # pylint: disable=R0913 - def update_value(self, key, src=None, version=None, dbname=None, 
parent_node=None): - """ - Description: append dictionary - Args: - key: binary_name - src: source_name - version: version - dbname: database name - parent_node: parent_node - Returns: - Raises: - """ - if src: - self.dictionary[key][ListNode.SOURCE_NAME] = src - if version: - self.dictionary[key][ListNode.VERSION] = version - if dbname: - self.dictionary[key][ListNode.DBNAME] = dbname - if parent_node: - self.dictionary[key][ListNode.PARENT_LIST].append(parent_node) diff --git a/packageship/packageship/application/apps/package/function/packages.py b/packageship/packageship/application/apps/package/function/packages.py deleted file mode 100644 index 3260d5f4dce945fa92d121e04cf940d0cb87bbc5..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/package/function/packages.py +++ /dev/null @@ -1,379 +0,0 @@ -#!/usr/bin/python3 -""" -Description: Get package information and modify package information -functions: get_packages, buildep_packages, sub_packages, get_single_package, - update_single_package, update_maintaniner_info -""" -import math - - -from flask import current_app -from sqlalchemy import text -from sqlalchemy.exc import SQLAlchemyError, DisconnectionError - - -from packageship.application.apps.package.function.constants import ResponseCode -from packageship.application.apps.package.serialize import AllPackInfoSchema -from packageship.application.apps.package.serialize import SinglePackInfoSchema -from packageship.libs.dbutils import DBHelper -from packageship.application.models.package import SrcPack -from packageship.application.models.package import PackagesMaintainer -from packageship.application.models.package import PackagesIssue -from packageship.application.models.package import SrcRequires -from packageship.application.models.package import BinPack -from packageship.application.models.package import BinProvides -from packageship.libs.exception import Error -from packageship.application.models.package import Packages 
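The `DictionaryOperations` record layout above — `[source, version, dbname, parent_list]` addressed via the `ListNode` index constants — can be exercised in isolation. This sketch re-implements the core behavior of `init_key` and `update_value` as free functions under that assumption (package names in the usage are made up):

```python
# Index constants mirroring ListNode in constants.py
SOURCE_NAME, VERSION, DBNAME, PARENT_LIST = 0, 1, 2, 3


def init_key(dictionary, key, src=None, version=None, dbname=None, parent_node=None):
    """Create an entry; a known dbname marks a root package, as in the original."""
    if dbname:
        dictionary[key] = [src, version, dbname, [['root', None]]]
    else:
        dictionary[key] = [src, version, dbname, parent_node or []]


def update_value(dictionary, key, src=None, version=None, dbname=None, parent_node=None):
    """Back-fill only the fields that are provided; append new parent nodes."""
    entry = dictionary[key]
    if src:
        entry[SOURCE_NAME] = src
    if version:
        entry[VERSION] = version
    if dbname:
        entry[DBNAME] = dbname
    if parent_node:
        entry[PARENT_LIST].append(parent_node)
```

The split mirrors how the module uses it: a dependency discovered before its metadata is known starts with only a parent list, and `update_value` fills in source, version, and database once the package itself is resolved.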
- - -def get_all_package_info(tablename, pagenum, pagesize, - srcname, maintainner, maintainlevel): - """ - Args: - tablename: Table Name - pagenum: Page number - pagesize: Current page display quantity - - Returns: - package info - - Attributes: - SQLAlchemyError: sqlalchemy error - DisconnectionError: Cannot connect to database error - Error: Error - """ - try: - with DBHelper(db_name="lifecycle") as database_name: - if tablename not in database_name.engine.table_names(): - response = ResponseCode.response_json( - ResponseCode.TABLE_NAME_NOT_EXIST) - response['total_count'] = None - response['total_page'] = None - return response - cls_model = Packages.package_meta(tablename) - # If srcname is empty, it will query all the information in the - # library - package_info_set_one = database_name.session.query(cls_model).outerjoin( - PackagesMaintainer, cls_model.name == PackagesMaintainer.name) - if srcname: - package_info_set_one = package_info_set_one.filter( - cls_model.name.like('%{srcname}%'.format(srcname=srcname))) - if maintainner: - package_info_set_one = package_info_set_one.filter( - PackagesMaintainer.maintainer == maintainner) - if maintainlevel: - package_info_set_one = package_info_set_one.filter( - PackagesMaintainer.maintainlevel == maintainlevel) - package_info_set = package_info_set_one.limit( - int(pagesize)).offset((int(pagenum) - 1) * int(pagesize)).all() - packageinfo_dicts = AllPackInfoSchema( - many=True).dump(package_info_set) - total_count = package_info_set_one.count() - total_page = math.ceil(total_count / int(pagesize)) - packageinfo_dicts = parsing_dictionary_issuse(packageinfo_dicts) - packageinfo_dicts = parsing_dictionary_maintainner( - packageinfo_dicts) - response = ResponseCode.response_json( - ResponseCode.SUCCESS, packageinfo_dicts) - response["total_count"] = total_count - response["total_page"] = total_page - return response - except (SQLAlchemyError, DisconnectionError, Error) as error: - current_app.logger.error(error) - 
response = ResponseCode.response_json( - ResponseCode.TABLE_NAME_NOT_EXIST) - response["total_count"] = None - response["total_page"] = None - return response - - -def parsing_dictionary_issuse(packageinfo_dicts): - """ - - Args: - packageinfo_dicts: package info dict - - Returns: - packageinfo_dicts - """ - with DBHelper(db_name="lifecycle") as database_name: - for packageinfo_dict in packageinfo_dicts: - issue_count = database_name.session.query(PackagesIssue).filter_by( - pkg_name=packageinfo_dict.get("name")).count() - packageinfo_dict["issue"] = issue_count - return packageinfo_dicts - - -def parsing_dictionary_maintainner(packageinfo_dicts): - """ - parsing dictionary maintainner - - Args: - packageinfo_dicts: - - Returns: - packageinfo_dicts - """ - with DBHelper(db_name="lifecycle") as database_name: - for packageinfo_dict in packageinfo_dicts: - maintainer_obj = database_name.session.query(PackagesMaintainer).filter_by( - name=packageinfo_dict.get("name")).first() - if maintainer_obj is None: - packageinfo_dict["maintainer"] = None - packageinfo_dict["maintainlevel"] = None - else: - packageinfo_dict["maintainer"] = maintainer_obj.maintainer - packageinfo_dict["maintainlevel"] = maintainer_obj.maintainlevel - return packageinfo_dicts - - -def sing_pack(srcname, tablename): - """ - Query information about a single source package, including a layer - of installation dependencies and compilation dependencies - Args: - srcname: The name of the source package - tablename: The name of the table in the database - - Returns: - single pack package info - - Attributes: - SQLAlchemyError: sqlalchemy error - DisconnectionError: Cannot connect to database error - Error: Error - """ - try: - with DBHelper(db_name="lifecycle") as database_name: - if tablename not in database_name.engine.table_names(): - return ResponseCode.response_json(ResponseCode.TABLE_NAME_NOT_EXIST) - cls_model = Packages.package_meta(tablename) - package_info_obj = database_name.session.query( - 
cls_model).filter_by(name=srcname).first() - if package_info_obj is None: - return ResponseCode.response_json(ResponseCode.PACK_NAME_NOT_FOUND) - pack_info_dict = SinglePackInfoSchema( - many=False).dump(package_info_obj) - pack_info_dict = parsing_maintainner(srcname, pack_info_dict) - issue_count = database_name.session.query(PackagesIssue).filter_by( - pkg_name=package_info_obj.name).count() - pack_info_dict["issue"] = issue_count - buildrequired = buildrequired_search(srcname, tablename) - pack_info_dict["buildrequired"] = buildrequired - subpack = _sub_pack(srcname, tablename) - pack_info_dict["gitee_url"] = "www.gitee.com/src-openeuler/" + \ - str(srcname) - pack_info_dict["subpack"] = subpack - pack_info_dict.update( - {"license": pack_info_dict.pop("rpm_license")}) - pack_info_dict.update({"pkg_name": pack_info_dict.pop("name")}) - return ResponseCode.response_json(ResponseCode.SUCCESS, pack_info_dict) - except (SQLAlchemyError, DisconnectionError, Error, AttributeError) as error: - current_app.logger.error(error) - return ResponseCode.response_json(ResponseCode.DIS_CONNECTION_DB) - - -def parsing_maintainner(srcname, pack_info_dict): - """ - Single package query maintainer and maintainlevel - Args: - srcname: Source package name - pack_info_dict: - Returns: Dictionary of package information - - """ - with DBHelper(db_name="lifecycle") as database_name: - maintainer_obj = database_name.session.query( - PackagesMaintainer).filter_by(name=srcname).first() - if maintainer_obj is None: - pack_info_dict["maintainer"] = None - pack_info_dict["maintainlevel"] = None - else: - pack_info_dict["maintainer"] = maintainer_obj.maintainer - pack_info_dict["maintainlevel"] = maintainer_obj.maintainlevel - return pack_info_dict - - -def buildrequired_search(srcname, tablename): - """ - Source code package one-level compilation dependency - Args: - srcname: The name of the source package - tablename: The name of the table in the database - - Returns: - Source code package 
one-level compilation dependency - """ - with DBHelper(db_name=tablename) as data_name: - - src_pack_obj = data_name.session.query( - SrcPack).filter_by(name=srcname).first() - if src_pack_obj is None: - return None - - src_pack_pkgkey = src_pack_obj.pkgKey - s_pack_requires_set = data_name.session.query( - SrcRequires).filter_by(pkgKey=src_pack_pkgkey).all() - # src_requires pkykey to find the name of the dependent component - s_pack_requires_names = [ - s_pack_requires_obj.name for s_pack_requires_obj in s_pack_requires_set] - - # Find pkgkey in BinProvides by the name of the dependent component - b_pack_provides_set = data_name.session.query(BinProvides).filter( - BinProvides.name.in_(s_pack_requires_names)).all() - b_pack_provides_pkg_list = [ - b_pack_provides_obj.pkgKey for b_pack_provides_obj in b_pack_provides_set] - - # Go to bin_pack to find the name by pkgkey of BinProvides - b_bin_pack_set = data_name.session.query(BinPack).filter( - BinPack.pkgKey.in_(b_pack_provides_pkg_list)).all() - builddep = [b_bin_pack_obj.name for b_bin_pack_obj in b_bin_pack_set] - return builddep - - -def helper(cls): - """ - Auxiliary function - The returned data format is converted, - the main function is to convert a dictionary to a list - - Args: - cls: Data before conversion - Returns: - Converted data - """ - for obj in cls: - if "provides" in obj: - obj["provides"] = list(obj["provides"].values()) - for values_p in obj["provides"]: - if 'requiredby' in values_p: - values_p['requiredby'] = list( - values_p['requiredby'].values()) - if "requires" in obj: - obj["requires"] = list(obj["requires"].values()) - for values_r in obj["requires"]: - if "providedby" in values_r: - values_r['providedby'] = list( - values_r['providedby'].values()) - - -def _sub_pack(src_name, table_name): - """ - One-level installation dependency of the source package - to generate the binary package - Args: - srcname: The name of the source package - tablename: The name of the table in the database 
- Returns: - One-level installation dependency of the source package to - generate the binary package - """ - with DBHelper(db_name=table_name) as database: - sql_str = """ - SELECT DISTINCT - b2.pkgKey AS sub_id, - b2.name AS sub_name, - pro.id AS sub_pro_id, - pro.name AS sub_pro_name, - b1.name AS sub_reqby_name - FROM - ( select pkgKey,name,src_name from bin_pack where src_name=:src_name) b2 - left join bin_provides pro on b2.pkgKey=pro.pkgKey - LEFT JOIN bin_requires req ON req.name = pro.name - LEFT JOIN bin_pack b1 ON req.pkgKey = b1.pkgKey; - """ - res = {} - res_pro = database.session.execute( - text(sql_str), {"src_name": src_name}).fetchall() - - for pro_obj in res_pro: - if pro_obj.sub_name not in res: - res[pro_obj.sub_name] = { - "id": pro_obj.sub_id, - "name": pro_obj.sub_name, - "provides": { - pro_obj.sub_pro_name: { - "id": pro_obj.sub_pro_id, - "name": pro_obj.sub_pro_name, - "requiredby": { - pro_obj.sub_reqby_name: pro_obj.sub_reqby_name - } if pro_obj.sub_reqby_name else {} - } - } if pro_obj.sub_pro_name else {} - } - else: - pro_info = res[pro_obj.sub_name]["provides"] - if pro_obj.sub_pro_name in pro_info: - pro_info[pro_obj.sub_pro_name]["requiredby"].update( - {pro_obj.sub_reqby_name: pro_obj.sub_reqby_name} - if pro_obj.sub_reqby_name else {}) - else: - pro_info.update( - { - pro_obj.sub_pro_name: { - "id": pro_obj.sub_pro_id, - "name": pro_obj.sub_pro_name, - "requiredby": { - pro_obj.sub_reqby_name: pro_obj.sub_reqby_name - } if pro_obj.sub_reqby_name else {} - } if pro_obj.sub_pro_name else {} - } - ) - - sql_re = """ - SELECT DISTINCT - b2.pkgKey AS sub_id, - b2.name AS sub_name, - req.id AS sub_req_id, - req.name AS sub_req_name, - b1.name AS sub_proby_name - FROM - ( SELECT pkgKey, name, src_name FROM bin_pack WHERE src_name = :src_name ) b2 - LEFT JOIN bin_requires req ON b2.pkgKey = req.pkgKey - LEFT JOIN bin_provides pro ON pro.name = req.name - LEFT JOIN bin_pack b1 ON pro.pkgKey = b1.pkgKey; - """ - res_req = 
database.session.execute( - text(sql_re), {"src_name": src_name}).fetchall() - - for req_obj in res_req: - sub_pkg_info = res[req_obj.sub_name] - # if req_obj.sub_name not in sub_pkg_info: - - if "requires" not in sub_pkg_info: - if not req_obj.sub_req_name: - sub_pkg_info['requires'] = {} - else: - sub_pkg_info.update( - { - "requires": { - req_obj.sub_req_name: { - "id": req_obj.sub_req_id, - "name": req_obj.sub_req_name, - "providedby": { - req_obj.sub_proby_name: req_obj.sub_proby_name - } if req_obj.sub_proby_name else {} - } - } - } - ) - else: - req_info = sub_pkg_info["requires"] - if req_obj.sub_req_name in req_info: - req_info[req_obj.sub_req_name]["providedby"].update( - {req_obj.sub_proby_name: req_obj.sub_proby_name} - if req_obj.sub_proby_name else {}) - else: - req_info.update( - { - req_obj.sub_req_name: { - "id": req_obj.sub_req_id, - "name": req_obj.sub_req_name, - "providedby": { - req_obj.sub_proby_name: req_obj.sub_proby_name - } if req_obj.sub_proby_name else {} - } - } - ) - helper([values for k, values in res.items()]) - return list(res.values()) diff --git a/packageship/packageship/application/apps/package/function/searchdb.py b/packageship/packageship/application/apps/package/function/searchdb.py deleted file mode 100644 index 616fb754ad4654b57f7468a78fbe511eea87b68a..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/package/function/searchdb.py +++ /dev/null @@ -1,939 +0,0 @@ -#!/usr/bin/python3 -""" -Description: A set for all query databases function -class: SearchDB -functions: db_priority -""" -from collections import namedtuple, Counter - -import yaml -from flask import current_app -from sqlalchemy import text -from sqlalchemy.exc import SQLAlchemyError, DisconnectionError -from sqlalchemy.sql import literal_column -from sqlalchemy import exists - -from packageship.libs.dbutils import DBHelper -from packageship.libs.log import Log -from packageship.application.models.package import BinPack -from 
packageship.application.models.package import SrcPack -from packageship.application.models.package import DatabaseInfo -from packageship.application.apps.package.function.constants import ResponseCode - -LOGGER = Log(__name__) - - -class SearchDB(): - """ - Description: query in database - Attributes: - db_list: Database list - db_object_dict:A dictionary for storing database connection objects - changeLog: - """ - - def __new__(cls, *args, **kwargs): - # pylint: disable=w0613 - if not hasattr(cls, "_instance"): - cls._instance = super(SearchDB, cls).__new__(cls) - return cls._instance - - def __init__(self, db_list): - """ - init class - """ - self.db_object_dict = dict() - for db_name in db_list: - try: - with DBHelper(db_name=db_name) as data_base: - self.db_object_dict[db_name] = data_base - except DisconnectionError as connection_error: - current_app.logger.error(connection_error) - - # Related methods of install - # pylint: disable=R0914 - def get_install_depend(self, binary_list, pk_value=None): - """ - Description: get a package install depend from database: - binary_name -> binary_id -> requires_set -> requires_id_set -> provides_set - -> install_depend_binary_id_key_list -> install_depend_binary_name_list - Args: - binary_list: a list of binary package name - pk_value:List of pkgKey found - Returns: - list:install depend list - set:package not found components, - pk_val:The pkgkey corresponding to the required components - Raises: - """ - pk_val = pk_value if pk_value else [] - result_list = [] - provides_not_found = dict() - - if not self.db_object_dict: - LOGGER.logger.warning("Unable to connect to the database," - "check the database configuration") - return result_list, set(), pk_val - - if None in binary_list: - binary_list.remove(None) - search_set = set(binary_list) - - if not search_set: - LOGGER.logger.warning("The input is None, please check the input value.") - return result_list, set(), pk_val - - return_tuple = namedtuple('return_tuple', [ - 
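The `__new__` override in `SearchDB` above caches the first instance on the class and hands it back for every later construction. A minimal, self-contained sketch of that singleton pattern (class and attribute names here are illustrative, not pkgship code):

```python
class Singleton:
    """Cache the first instance and return it for every later construction."""
    _instance = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __init__(self, db_list):
        # note: __init__ still runs on every construction, so later
        # calls overwrite the state set by earlier ones
        self.db_list = db_list


a = Singleton(["db1"])
b = Singleton(["db2"])
# a and b are the same object, and db_list now holds the second value
```

The same caveat applies to `SearchDB` itself: because `__init__` re-runs on each call, constructing it again with a different `db_list` rebuilds the connection dictionary on the shared instance.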
'depend_name', - 'depend_version', - 'depend_src_name', - 'search_name', - 'search_src_name', - 'search_version' - ]) - - for db_name, data_base in self.db_object_dict.items(): - try: - req_set = self._get_requires(search_set, data_base, search_type='install') - - if not req_set: - continue - - (depend_set, - req_pk_dict, - pk_v, - not_fd_com) = self._get_provides_req_info(req_set, - data_base, - pk_value=pk_val) - pk_val += pk_v - res_list, get_list = self._comb_install_list(depend_set, - req_pk_dict, - not_fd_com, - return_tuple, - db_name, - provides_not_found, - req_set) - - result_list += res_list - - search_set.symmetric_difference_update(set(get_list)) - - if not search_set: - result_list.extend( - self._get_install_pro_in_other_database(provides_not_found, - database_name=db_name) - ) - return result_list, set(provides_not_found.keys()), pk_val - - except AttributeError as error_msg: - LOGGER.logger.error(error_msg) - except SQLAlchemyError as error_msg: - LOGGER.logger.error(error_msg) - if search_set: - result_list.extend( - self._get_install_pro_in_other_database(provides_not_found) - ) - - for binary_name in search_set: - result_list.append((return_tuple(None, None, None, - binary_name, None, None), 'NOT FOUND')) - return result_list, set(provides_not_found.keys()), pk_val - - # pylint: disable=R0913 - @staticmethod - def _comb_install_list(depend_set, - req_pk_dict, - not_fd_com, - return_tuple, - db_name, - provides_not_found, - req_set): - """ - Description: Query the corresponding installation dependency list - through the components of the requirements - Args: - depend_set: List binary package information corresponding to the components - req_pk_dict:Mapping of components and binary pkgKey - not_fd_com: List of pkgKey found, - return_tuple: Named tuple format for saving information - db_name:current database name - provides_not_found:Component mapping not found in the current database - req_set:Package information and corresponding component 
information - Returns: - ret_list:install depend list - get_list:Packages that have found results - Raises: - """ - get_list = [] - ret_list = [] - depend_info_tuple = namedtuple('depend_info', [ - 'depend_name', - 'depend_version', - 'depend_src_name' - ]) - depend_info_dict = { - info.pk: depend_info_tuple(info.depend_name, - info.depend_version, - info.depend_src_name) - for info in depend_set - } - - for req_name, search_name, search_src_name, search_version in req_set: - get_list.append(search_name) - - if not req_name: - obj = return_tuple( - None, - None, - None, - search_name, - search_src_name, - search_version, - ) - ret_list.append((obj, db_name)) - - elif req_name in req_pk_dict: - depend_info_t = depend_info_dict.get(req_pk_dict[req_name]) - obj = return_tuple( - depend_info_t.depend_name, - depend_info_t.depend_version, - depend_info_t.depend_src_name, - search_name, - search_src_name, - search_version, - ) - ret_list.append((obj, db_name)) - - else: - if req_name in not_fd_com: - if req_name not in provides_not_found: - provides_not_found[req_name] = [[search_name, search_src_name, - search_version, db_name]] - else: - provides_not_found[req_name].append([search_name, search_src_name, - search_version, db_name]) - - return ret_list, get_list - - def _get_install_pro_in_other_database(self, not_found_binary, database_name=None): - """ - Description: Binary package name data not found in - the current database, go to other databases to try - Args: - not_found_binary: not_found_build These data cannot be found in the current database - database_name:current database name - Returns: - result_list :[return_tuple1,return_tuple2] package information - Raises: - """ - if not not_found_binary: - return [] - - return_tuple = namedtuple('return_tuple', [ - 'depend_name', - 'depend_version', - 'depend_src_name', - 'search_name', - 'search_src_name', - 'search_version' - ]) - - result_list = [] - search_set = {k for k, _ in not_found_binary.items()} - - for 
db_name, data_base in self.db_object_dict.items(): - if db_name == database_name: - continue - - parm_tuple = namedtuple("in_tuple", 'req_name') - in_tuple_list = [parm_tuple(k) for k, _ in not_found_binary.items()] - - depend_set, req_pk_dict, *_ = self._get_provides_req_info( - in_tuple_list, - data_base - ) - - depend_info_tuple = namedtuple('depend_info', [ - 'depend_name', - 'depend_version', - 'depend_src_name' - ]) - depend_info_dict = { - info.pk: depend_info_tuple(info.depend_name, - info.depend_version, - info.depend_src_name) - for info in depend_set - } - result_list += self._comb_install_info(search_set, - req_pk_dict, - depend_info_dict, - not_found_binary, - return_tuple, - db_name) - if not not_found_binary: - return result_list - - if not_found_binary: - for _, values in not_found_binary.items(): - for info in values: - obj = return_tuple( - None, - None, - None, - info[0], - info[1], - info[2] - ) - result_list.append((obj, info[3])) - return result_list - - @staticmethod - def _comb_install_info(search_set, - req_pk_dict, - depend_info_dict, - not_found_binary, - return_tuple, - db_name): - """ - Description: Binary package name data not found in - the current database, go to other databases to try - Args: - search_set: The name of the component to be queried - req_pk_dict:Mapping of components and binary pkgKey - depend_info_dict:The mapping of binary pkgKey and binary information - not_found_binary:not_found_build These data cannot be found in the current database - return_tuple:Named tuple format for saving information - db_name:current database name - Returns: - ret_list :[return_tuple1,return_tuple2] package information - Raises: - """ - ret_list = [] - for req_name in search_set: - if req_name in req_pk_dict: - pk_ = req_pk_dict[req_name] - if pk_ in depend_info_dict: - for binary_info in not_found_binary[req_name]: - obj = return_tuple( - depend_info_dict[pk_].depend_name, - depend_info_dict[pk_].depend_version, - 
depend_info_dict[pk_].depend_src_name, - binary_info[0], - binary_info[1], - binary_info[2] - ) - ret_list.append((obj, db_name)) - del not_found_binary[req_name] - return ret_list - - # Related methods of build - def get_build_depend(self, source_name_li, pk_value=None): - """ - Description: get a package build depend from database - Args: - source_name_li: search package's name list - pk_value:List of pkgKey found - Returns: - all source pkg build depend list - structure :[(search_name,source_name,bin_name,bin_version,db_name,search_version), - (search_name,source_name,bin_name,bin_version,db_name,search_version),] - set: package not found components name set - Raises: - AttributeError: The object does not have this property - SQLAlchemyError: sqlalchemy error - """ - # pylint: disable=R0914 - return_tuple = namedtuple("return_tuple", [ - "search_name", - "source_name", - "bin_name", - "version", - "db_name", - "search_version" - ]) - pk_val = pk_value if pk_value else [] - s_name_set = set(source_name_li) - if not s_name_set: - return ResponseCode.PARAM_ERROR, list(), set(), pk_val - - provides_not_found = dict() - build_list = [] - - for db_name, data_base in self.db_object_dict.items(): - - try: - req_set = self._get_requires(s_name_set, data_base, search_type='build') - - if not req_set: - continue - - (depend_set, - req_pk_dict, - pk_v, - not_fd_req) = self._get_provides_req_info(req_set, data_base) - - pk_val += pk_v - ret_list, get_list = self._comb_build_list(depend_set, - req_pk_dict, - not_fd_req, - return_tuple, - db_name, - provides_not_found, - req_set) - build_list += ret_list - s_name_set.symmetric_difference_update(set(get_list)) - if not s_name_set: - build_list.extend( - self._get_binary_in_other_database(provides_not_found, database_name=db_name) - ) - return ResponseCode.SUCCESS, build_list, set(provides_not_found.keys()), pk_val - - except AttributeError as attr_err: - current_app.logger.error(attr_err) - except SQLAlchemyError as sql_err: - 
current_app.logger.error(sql_err) - - if s_name_set: - build_list.extend( - self._get_binary_in_other_database(provides_not_found) - ) - for source in s_name_set: - LOGGER.logger.warning( - "CANNOT FOUND THE SOURCE %s in all database", source) - - return ResponseCode.SUCCESS, build_list, set(provides_not_found.keys()), pk_val - - @staticmethod - def _comb_build_list(depend_set, - req_pk_dict, - not_fd_com, - return_tuple, - db_name, - provides_not_found, - req_set): - """ - Description: Query the corresponding build dependency list - through the components of the requirements - Args: - depend_set: List binary package information corresponding to the components - req_pk_dict:Mapping of components and binary pkgKey - not_fd_com: List of pkgKey found, - return_tuple: Named tuple format for saving information - db_name:current database name - provides_not_found:Component mapping not found in the current database - req_set:Package information and corresponding component information - Returns: - ret_list:install depend list - get_list:Packages that have found results - Raises: - """ - get_list = [] - ret_list = [] - depend_info_tuple = namedtuple('depend_info', [ - 'depend_name', - 'depend_version', - 'depend_src_name' - ]) - depend_info_dict = { - info.pk: depend_info_tuple(info.depend_name, - info.depend_version, - info.depend_src_name) - for info in depend_set - } - - for req_name, search_name, search_version in req_set: - - get_list.append(search_name) - - if not req_name: - obj = return_tuple( - search_name, - None, - None, - None, - db_name, - search_version, - ) - ret_list.append(obj) - - elif req_name in req_pk_dict: - depend_info_t = depend_info_dict.get(req_pk_dict[req_name]) - obj = return_tuple( - search_name, - depend_info_t.depend_src_name, - depend_info_t.depend_name, - depend_info_t.depend_version, - db_name, - search_version - ) - ret_list.append(obj) - - else: - if req_name in not_fd_com: - if req_name not in provides_not_found: - 
provides_not_found[req_name] = [ - [search_name, - search_version, - db_name] - ] - else: - provides_not_found[req_name].append([search_name, - search_version, - db_name]) - - return ret_list, get_list - - def _get_binary_in_other_database(self, not_found_binary, database_name=None): - """ - Description: Binary package name data not found in - the current database, go to other databases to try - Args: - not_found_binary: not_found_build These data cannot be found in the current database - database_name:current database name - Returns: - result_list :[return_tuple1,return_tuple2] package information - Raises: - AttributeError: The object does not have this property - SQLAlchemyError: sqlalchemy error - """ - if not not_found_binary: - return [] - - return_tuple = namedtuple("return_tuple", [ - "search_name", - "source_name", - "bin_name", - "version", - "db_name", - "search_version", - ]) - - result_list = [] - search_set = {k for k, _ in not_found_binary.items()} - - for db_name, data_base in self.db_object_dict.items(): - - if db_name == database_name: - continue - - in_tuple = namedtuple("in_tuple", 'req_name') - in_tuple_list = [in_tuple(k) for k, _ in not_found_binary.items()] - - depend_set, req_pk_dict, *_ = self._get_provides_req_info( - in_tuple_list, - data_base - ) - - depend_info_tuple = namedtuple('depend_info', [ - 'depend_name', - 'depend_version', - 'depend_src_name' - ]) - depend_info_dict = { - info.pk: depend_info_tuple(info.depend_name, - info.depend_version, - info.depend_src_name) - for info in depend_set - } - - result_list += self._comb_build_info(search_set, - req_pk_dict, - depend_info_dict, - not_found_binary, - return_tuple, - db_name) - if not not_found_binary: - return result_list - - if not_found_binary: - for _, values in not_found_binary.items(): - for info in values: - obj = return_tuple( - info[0], - None, - None, - None, - 'NOT FOUND', - info[2] - ) - result_list.append(obj) - return result_list - - @staticmethod - def 
_comb_build_info(search_set, - req_pk_dict, - depend_info_dict, - not_found_binary, - return_tuple, - db_name): - """ - Description: Binary package name data not found in - the current database, go to other databases to try - Args: - search_set: The name of the component to be queried - req_pk_dict:Mapping of components and binary pkgKey - depend_info_dict:The mapping of binary pkgKey and binary information - not_found_binary:not_found_build These data cannot be found in the current database - return_tuple:Named tuple format for saving information, - db_name:current data base name - Returns: - ret_list :[return_tuple1,return_tuple2] package information - Raises: - """ - ret_list = [] - for req_name in search_set: - if req_name in req_pk_dict: - pk_ = req_pk_dict[req_name] - if pk_ in depend_info_dict: - for binary_info in not_found_binary[req_name]: - obj = return_tuple( - binary_info[0], - depend_info_dict[pk_].depend_src_name, - depend_info_dict[pk_].depend_name, - depend_info_dict[pk_].depend_version, - db_name, - binary_info[1] - ) - ret_list.append(obj) - del not_found_binary[req_name] - return ret_list - - # Common methods for install and build - @staticmethod - def _get_requires(search_set, data_base, search_type=None): - """ - Description: Query the dependent components of the current package - Args: - search_set: The package name to be queried - data_base:current database object - search_type: type options build or install - Returns: - req_set:List Package information and corresponding component information - Raises: - AttributeError: The object does not have this property - SQLAlchemyError: sqlalchemy error - """ - if search_type == 'build': - sql_com = text(""" - SELECT DISTINCT - src_requires.NAME AS req_name, - src.NAME AS search_name, - src.version AS search_version - FROM - ( SELECT pkgKey, NAME, version, src_name FROM src_pack WHERE {} ) src - LEFT JOIN src_requires ON src.pkgKey = src_requires.pkgKey; - 
""".format(literal_column('name').in_(search_set))) - elif search_type == 'install': - sql_com = text(""" - SELECT DISTINCT - bin_requires.NAME AS req_name, - bin.NAME AS search_name, - s1.name as search_src_name, - bin.version AS search_version - FROM - ( SELECT pkgKey, NAME, version, rpm_sourcerpm FROM bin_pack WHERE {} ) bin - LEFT JOIN src_pack s1 ON bin.rpm_sourcerpm = s1.src_name - LEFT JOIN bin_requires ON bin.pkgKey = bin_requires.pkgKey; - """.format(literal_column('name').in_(search_set))) - else: - return [] - - req_set = [] - try: - req_set = data_base.session. \ - execute(sql_com, {'name_{}'.format(i): v - for i, v in enumerate(search_set, 1)}).fetchall() - except AttributeError as error_msg: - LOGGER.logger.error(error_msg) - except SQLAlchemyError as error_msg: - LOGGER.logger.error(error_msg) - return req_set - - def _get_provides_req_info(self, req_info, data_base, pk_value=None): - """ - Description: Get the name of the binary package - that provides the dependent component, - Filter redundant queries - when the same binary package is provided to multiple components - Args: - req_info: List of sqlalchemy objects with component names. 
- data_base: The database currently being queried - pk_value:Binary pkgKey that has been found - Returns: - depend_set: List of related dependent sqlalchemy objects - req_pk_dict: Mapping dictionary of component name and pkgKey - pk_val:update Binary pkgKey that has been found - not_fd_req: Components not found - Raises: - AttributeError: The object does not have this property - SQLAlchemyError: sqlalchemy error - """ - pk_val = pk_value if pk_value else [] - depend_set = [] - req_pk_dict = {} - not_fd_req = set() - try: - req_names = {req_.req_name - for req_ in req_info - if req_.req_name is not None} - req_name_in = literal_column('name').in_(req_names) - - sql_com_pro = text(""" - SELECT DISTINCT - NAME as req_name, - pkgKey - FROM - ( SELECT name, pkgKey FROM bin_provides - UNION ALL - SELECT name, pkgKey FROM bin_files ) - WHERE - {}; - """.format(req_name_in)) - - pkg_key_set = data_base.session.execute( - sql_com_pro, { - 'name_{}'.format(i): v - for i, v in enumerate(req_names, 1) - } - ).fetchall() - - req_pk_dict = dict() - pk_v = list() - - for req_name, pk_ in pkg_key_set: - if not req_name: - continue - pk_v.append(pk_) - if req_name not in req_pk_dict: - req_pk_dict[req_name] = [pk_] - else: - req_pk_dict[req_name].append(pk_) - - pk_val += pk_v - - pk_count_dic = Counter(pk_val) - - for key, values in req_pk_dict.items(): - count_values = list(map( - lambda x: pk_count_dic[x] if x in pk_count_dic else 0, values - )) - max_index = count_values.index(max(count_values)) - req_pk_dict[key] = values[max_index] - - not_fd_req = req_names - set(req_pk_dict.keys()) - depend_set = self._get_depend_info(req_pk_dict, data_base) - - except SQLAlchemyError as sql_err: - LOGGER.logger.error(sql_err) - except AttributeError as error_msg: - LOGGER.logger.error(error_msg) - - return depend_set, req_pk_dict, pk_val, not_fd_req - - @staticmethod - def _get_depend_info(req_pk_dict, data_base): - """ - Description: Obtain binary related information through binary pkgKey 
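When several binary packages provide the same component, `_get_provides_req_info` counts all pkgKeys seen so far with `collections.Counter` and, for each component, keeps the candidate that already occurs most often — so one package tends to satisfy many components. A self-contained sketch of that selection step (component names and keys are made up):

```python
from collections import Counter


def pick_providers(req_pk_dict, pk_val):
    """For each component, keep the candidate pkgKey that is most common
    across all matches so far, preferring packages that already provide
    other components."""
    counts = Counter(pk_val)
    chosen = {}
    for name, candidates in req_pk_dict.items():
        scores = [counts.get(pk, 0) for pk in candidates]
        # index of the highest-scoring candidate wins ties by position,
        # matching the list(map(...)) / index(max(...)) logic above
        chosen[name] = candidates[scores.index(max(scores))]
    return chosen


providers = pick_providers(
    {"libfoo.so": [11, 42], "libbar.so": [42]},
    pk_val=[11, 42, 42],
)
# pkgKey 42 is chosen for both components, since it already appears twice
```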
- Args: - req_pk_dict: Mapping dictionary of component name and pkgKey - data_base: The database currently being queried - Returns: - depend_set: List of related dependent sqlalchemy objects - Raises: - AttributeError: The object does not have this property - SQLAlchemyError: sqlalchemy error - """ - depend_set = [] - try: - bin_src_pkg_key = req_pk_dict.values() - pk_in = literal_column('pkgKey').in_(bin_src_pkg_key) - sql_bin_src = text(""" - SELECT DISTINCT - bin.pkgKey as pk, - bin.name AS depend_name, - bin.version AS depend_version, - src_pack.name AS depend_src_name - FROM - ( SELECT name, pkgKey,version, rpm_sourcerpm FROM bin_pack WHERE {} ) bin - LEFT JOIN src_pack ON src_pack.src_name = bin.rpm_sourcerpm; - """.format(pk_in)) - - depend_set = data_base.session.execute( - sql_bin_src, { - 'pkgKey_{}'.format(i): v - for i, v in enumerate(bin_src_pkg_key, 1) - } - ).fetchall() - - except SQLAlchemyError as sql_err: - LOGGER.logger.error(sql_err) - except AttributeError as error_msg: - LOGGER.logger.error(error_msg) - - return depend_set - - # Other methods - def binary_search_database_for_first_time(self, binary_name): - """ - Args: - binary_name: a binary package name - - Returns: - The name of the first database - in which the binary package appears according to priority - If it does not exist or exception occurred , return 'NOT FOUND' - - """ - try: - for db_name, data_base in self.db_object_dict.items(): - if data_base.session.query( - exists().where(BinPack.name == binary_name) - ).scalar(): - return db_name - except AttributeError as attr_err: - current_app.logger.error(attr_err) - except SQLAlchemyError as sql_err: - current_app.logger.error(sql_err) - - return 'NOT FOUND' - - def get_version_and_db(self, src_name): - """ - - Args: - src_name:the source package name - Returns: - this source package version and db_name - """ - try: - for db_name, data_base in self.db_object_dict.items(): - res = 
data_base.session.query(SrcPack.version).filter_by(name=src_name).first() - if res: - return db_name, res.version - except AttributeError as attr_err: - current_app.logger.error(attr_err) - except SQLAlchemyError as sql_err: - current_app.logger.error(sql_err) - - return None, None - - def get_src_name(self, binary_name): - """ - Description: get a package source name from database: - binary_name -> binary_source_name -> source_name - Args: - binary_name: search package's name, database priority list - Returns: - db_name: database name - source_name: source name - source_version: source version - Raises: - AttributeError: The object does not have this property - SQLAlchemyError: sqlalchemy error - """ - for db_name, data_base in self.db_object_dict.items(): - sql_str = """ - SELECT DISTINCT - src_pack.name AS source_name, - src_pack.version AS source_version - FROM - bin_pack, - src_pack - WHERE - src_pack.src_name = bin_pack.rpm_sourcerpm - AND bin_pack.name = :binary_name; - """ - try: - bin_obj = data_base.session.execute(text(sql_str), - {"binary_name": binary_name} - ).fetchone() - source_name = bin_obj.source_name - source_version = bin_obj.source_version - if source_name is not None: - return ResponseCode.SUCCESS, db_name, \ - source_name, source_version - except AttributeError as error_msg: - LOGGER.logger.error(error_msg) - except SQLAlchemyError as error_msg: - LOGGER.logger.error(error_msg) - return ResponseCode.DIS_CONNECTION_DB, None, None, None - return ResponseCode.PACK_NAME_NOT_FOUND, None, None, None - - def get_sub_pack(self, source_name_list): - """ - Description: get a subpack list based on source name list: - source_name -> source_name_id -> binary_name - Args: - source_name_list: search package's name, database priority list - Returns: - response code - result_list: subpack tuple - Raises: - AttributeError: The object does not have this property - SQLAlchemyError: sqlalchemy error - """ - if not self.db_object_dict: - return 

ResponseCode.DIS_CONNECTION_DB, None - search_set = {source_name for source_name in source_name_list if source_name} - result_list = [] - get_list = [] - if not search_set: - return ResponseCode.INPUT_NONE, None - for db_name, data_base in self.db_object_dict.items(): - try: - name_in = literal_column('name').in_(search_set) - sql_com = text(''' - SELECT - bin_pack.name AS subpack_name, - bin_pack.version AS sub_pack_version, - src.name AS search_name, - src.version AS search_version - FROM - (SELECT name,version,src_name FROM src_pack WHERE {}) src - LEFT JOIN bin_pack on src.src_name = bin_pack.rpm_sourcerpm - '''.format(name_in)) - subpack_tuple = data_base.session. \ - execute(sql_com, {'name_{}'.format(i): v - for i, v in enumerate(search_set, 1)}).fetchall() - if subpack_tuple: - for result in subpack_tuple: - result_list.append((result, db_name)) - get_list.append(result.search_name) - search_set.symmetric_difference_update(set(get_list)) - get_list.clear() - if not search_set: - return ResponseCode.SUCCESS, result_list - else: - continue - except AttributeError as attr_error: - current_app.logger.error(attr_error) - except SQLAlchemyError as sql_error: - current_app.logger.error(sql_error) - if not result_list: - return ResponseCode.PACK_NAME_NOT_FOUND, result_list - return_tuple = namedtuple( - 'return_tuple', 'subpack_name sub_pack_version search_version search_name') - for search_name in search_set: - result_list.append( - (return_tuple(None, None, None, search_name), 'NOT FOUND')) - return ResponseCode.SUCCESS, result_list - - -def db_priority(): - """ - Description: Read yaml file, return database name, according to priority - Args: - Returns: - db_list: database name list - Raises: - FileNotFoundError: file cannot be found - Error: abnormal error - """ - try: - with DBHelper(db_name='lifecycle') as data_name: - name_list = data_name.session.query( - DatabaseInfo.name).order_by(DatabaseInfo.priority).all() - return [name[0] for name in name_list] - 
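The queries in this module build their `IN (...)` clauses with one bound parameter per name (`name_1 … name_N`) rather than interpolating values into the SQL string. The same pattern, sketched with the standard-library `sqlite3` module instead of SQLAlchemy and a table layout trimmed to two columns for illustration:

```python
import sqlite3


def query_by_names(conn, names):
    """Expand a parameterized IN (...) clause with one placeholder per
    name, mirroring how searchdb.py binds name_1..name_N parameters."""
    placeholders = ", ".join("?" for _ in names)
    sql = f"SELECT name, version FROM src_pack WHERE name IN ({placeholders})"
    return conn.execute(sql, list(names)).fetchall()


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_pack (name TEXT, version TEXT)")
conn.executemany("INSERT INTO src_pack VALUES (?, ?)",
                 [("gcc", "9.3"), ("glibc", "2.31"), ("vim", "8.2")])
rows = query_by_names(conn, ["gcc", "vim"])
# only the requested packages come back, values bound safely as parameters
```

Binding each value separately keeps the query safe against injection and lets the database cache the statement plan; the production code gets the same effect from `literal_column('name').in_(...)` plus an enumerated parameter dict.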
except SQLAlchemyError as error: - current_app.logger.error(error) - return None diff --git a/packageship/packageship/application/apps/package/function/self_depend.py b/packageship/packageship/application/apps/package/function/self_depend.py deleted file mode 100644 index f6f42e08cdfbf7e7de8bd67f9b0e67309c07ea79..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/package/function/self_depend.py +++ /dev/null @@ -1,335 +0,0 @@ -#!/usr/bin/python3 -""" -Description: Querying for self dependencies - Querying packages install and build depend for those package can be - build and install -class: SelfDepend, DictionaryOperations -""" - -import copy -from packageship.libs.log import Log -from packageship.application.apps.package.function.searchdb import SearchDB -from packageship.application.apps.package.function.constants import ResponseCode, ListNode -from packageship.application.apps.package.function.install_depend import InstallDepend \ - as install_depend -from packageship.application.apps.package.function.build_depend import BuildDepend as build_depend - -LOGGER = Log(__name__) - - -class SelfDepend(): - """ - Description: - Querying for self dependencies - Querying packages install and build depend for those package can be - build and install - Attributes: - db_list: list of database names - binary_dict: Contain all the binary packages info and operation - source_dicts: Contain all the source packages info and operation - result_tmp: restore the return result dict - search_install_list: Contain the binary packages searched install dep in the next loop - search_build_list: Contain the source packages searched build dep in the next loop - search_subpack_list: Contain the source packages searched subpack in the next loop - withsubpack: withsubpack - search_db: A object of database which would be connected - not_found_components: Contain the package not found components - """ - - # pylint: disable = R0902 - def __init__(self, 
db_list): - """ - init class - """ - self.binary_dict = DictionaryOperations() - self.source_dicts = DictionaryOperations() - self.result_tmp = dict() - self.search_install_list = [] - self.search_build_list = [] - self.search_subpack_list = [] - self.withsubpack = 0 - self.db_list = db_list - self.search_db = SearchDB(db_list) - self.not_found_components = set() - - def query_depend(self, packname, selfbuild, withsubpack, packtype='binary'): - """ - Description: init result dict and determine the loop end point - Args: - packname: Package name - selfbuild: selfbuild - withsubpack: withsubpack - packtype: package type - Returns: - binary_dict.dictionary: Contain all the binary packages info after searching - source_dicts.dictionary: Contain all the source packages info after searching - not_found_components: Set of package not found components - Raises: - """ - if not self.search_db.db_object_dict: - return ResponseCode.DIS_CONNECTION_DB, None, None, set() - if not packname: - return ResponseCode.INPUT_NONE, None, None, set() - - self.withsubpack = withsubpack - response_code = self.init_dict(packname, packtype) - if response_code != ResponseCode.SUCCESS: - return (response_code, self.binary_dict.dictionary, - self.source_dicts.dictionary, self.not_found_components) - - for key, _ in self.binary_dict.dictionary.items(): - self.search_install_list.append(key) - for key, _ in self.source_dicts.dictionary.items(): - self.search_build_list.append(key) - if self.withsubpack == 1: - self.search_subpack_list.append(key) - - while self.search_build_list or self.search_install_list or self.search_subpack_list: - if self.search_install_list: - self.query_install() - if self.withsubpack == 1 and self.search_subpack_list: - self.with_subpack() - if self.search_build_list: - self.query_build(selfbuild) - return (response_code, self.binary_dict.dictionary, - self.source_dicts.dictionary, self.not_found_components) - - def init_dict(self, packname, packtype): - """ - Description: init result dict - 
Args:
-            packname: package name
-            packtype: package type
-        Returns:
-            response_code
-        Raises:
-        """
-        if packtype == 'source':
-            response_code, subpack_list = self.search_db.get_sub_pack([packname])
-            if not subpack_list:
-                return ResponseCode.PACK_NAME_NOT_FOUND
-
-            for subpack_tuple, dbname in subpack_list:
-                self.source_dicts.append_src(packname, dbname, subpack_tuple.search_version)
-                if dbname == 'NOT FOUND':
-                    continue
-
-                if subpack_tuple.subpack_name and subpack_tuple.subpack_name \
-                        not in self.binary_dict.dictionary:
-                    self.binary_dict.append_bin(key=subpack_tuple.subpack_name,
-                                                src=packname,
-                                                version=subpack_tuple.search_version,
-                                                dbname=dbname)
-
-        else:
-            response_code, dbname, source_name, version = \
-                self.search_db.get_src_name(packname)
-            if response_code != ResponseCode.SUCCESS:
-                return response_code
-            self.source_dicts.append_src(source_name, dbname, version)
-            self.binary_dict.append_bin(key=packname,
-                                        src=source_name,
-                                        version=version,
-                                        dbname=dbname)
-        return response_code
-
-    def query_install(self):
-        """
-        Description: query install depend
-        Args:
-        Returns:
-        Raises:
-        """
-        self.result_tmp.clear()
-        _, self.result_tmp, not_fd_com = \
-            install_depend(self.db_list).query_install_depend(self.search_install_list,
-                                                              history_dicts=self.binary_dict.dictionary)
-        self.not_found_components.update(not_fd_com)
-        self.search_install_list.clear()
-        for key, values in self.result_tmp.items():
-            if key in self.binary_dict.dictionary:
-                if ['root', None] in values[ListNode.PARENT_LIST]:
-                    index = values[ListNode.PARENT_LIST].index(['root', None])
-                    del values[ListNode.PARENT_LIST][index]
-                self.binary_dict.update_value(key=key, parent_list=values[ListNode.PARENT_LIST])
-            else:
-                if not key:
-                    continue
-                self.binary_dict.dictionary[key] = copy.deepcopy(values)
-                source_name = values[ListNode.SOURCE_NAME]
-                if not source_name:
-                    LOGGER.logger.warning("source name is None")
-                if source_name and source_name not in self.source_dicts.dictionary:
-                    db_, src_version_ = self.search_db.get_version_and_db(source_name)
-                    self.source_dicts.append_src(key=source_name,
-                                                 dbname=db_ if db_ else values[ListNode.DBNAME],
-                                                 version=src_version_
-                                                 if src_version_ else values[ListNode.VERSION])
-                    self.search_build_list.append(source_name)
-                    if self.withsubpack == 1:
-                        self.search_subpack_list.append(source_name)
-
-    def with_subpack(self):
-        """
-        Description: query subpackage
-        Args:
-        Returns:
-        Raises:
-        """
-        if None in self.search_subpack_list:
-            LOGGER.logger.warning("There is a NONE in input value: %s",
-                                  str(self.search_subpack_list))
-            self.search_subpack_list.remove(None)
-        _, result_list = self.search_db.get_sub_pack(self.search_subpack_list)
-        for subpack_tuple, dbname in result_list:
-            if dbname == 'NOT FOUND':
-                continue
-
-            if subpack_tuple.subpack_name and subpack_tuple.subpack_name \
-                    not in self.binary_dict.dictionary:
-                self.binary_dict.append_bin(key=subpack_tuple.subpack_name,
-                                            src=subpack_tuple.search_name,
-                                            version=subpack_tuple.sub_pack_version,
-                                            dbname=dbname,
-                                            parent_node=[subpack_tuple.search_name, 'Subpack'])
-                self.search_install_list.append(subpack_tuple.subpack_name)
-        self.search_subpack_list.clear()
-
-    def query_build(self, selfbuild):
-        """
-        Description: query build depend
-        Args:
-            selfbuild: selfbuild
-        Returns:
-        Raises:
-        """
-        self.result_tmp.clear()
-        if selfbuild == 0:
-            self.query_builddep()
-        else:
-            self.query_selfbuild()
-
-    def query_builddep(self):
-        """
-        Description: for selfbuild == 0, query build depend
-        Args:
-        Returns:
-        Raises:
-        """
-        _, self.result_tmp, _, not_fd_com = build_depend(
-            self.search_build_list,
-            self.db_list,
-            self_build=0,
-            history_dict=self.binary_dict.dictionary
-        ).build_depend_main()
-        self.not_found_components.update(not_fd_com)
-        self.search_build_list.clear()
-        for key, values in self.result_tmp.items():
-            if not key:
-                LOGGER.logger.warning("key is NONE for value = %s", str(values))
-                continue
-            if key not in self.binary_dict.dictionary and values[0] != 'source':
-                self.binary_dict.dictionary[key] = copy.deepcopy(values)
-                source_name = values[ListNode.SOURCE_NAME]
-                if not source_name:
-                    LOGGER.logger.warning("source name is None")
-                if source_name and source_name not in self.source_dicts.dictionary:
-                    db_, src_version_ = self.search_db.get_version_and_db(source_name)
-                    self.source_dicts.append_src(key=source_name,
-                                                 dbname=db_ if db_ else values[ListNode.DBNAME],
-                                                 version=src_version_
-                                                 if src_version_ else values[ListNode.VERSION])
-                    if self.withsubpack == 1:
-                        self.search_subpack_list.append(source_name)
-            elif key in self.binary_dict.dictionary:
-                self.binary_dict.update_value(key=key,
-                                              parent_list=values[ListNode.PARENT_LIST])
-
-    def query_selfbuild(self):
-        """
-        Description: for selfbuild == 1, query selfbuild depend
-        Args:
-        Returns:
-        """
-        _, self.result_tmp, source_dicts_tmp, not_fd_com = build_depend(
-            self.search_build_list,
-            self.db_list,
-            self_build=1,
-            history_dict=self.source_dicts.dictionary
-        ).build_depend_main()
-        self.not_found_components.update(not_fd_com)
-        for key, values in self.result_tmp.items():
-            if not key:
-                LOGGER.logger.warning("key is NONE for value = %s", str(values))
-                continue
-            if key in self.binary_dict.dictionary:
-                self.binary_dict.update_value(key=key, parent_list=values[ListNode.PARENT_LIST])
-            else:
-                self.binary_dict.dictionary[key] = copy.deepcopy(values)
-                self.search_install_list.append(key)
-        for key, values in source_dicts_tmp.items():
-            if not key:
-                LOGGER.logger.warning("key is NONE for value = %s", str(values))
-                continue
-            if key not in self.source_dicts.dictionary:
-                self.source_dicts.dictionary[key] = copy.deepcopy(values)
-                if self.withsubpack == 1:
-                    self.search_subpack_list.append(key)
-        self.search_build_list.clear()
-
-
-class DictionaryOperations():
-    """
-    Description: Related to dictionary operations, creating dictionary, append dictionary
-    Attributes:
-        dictionary: dict
-    """
-
-    def __init__(self):
-        """
-        init class
-        """
-        self.dictionary = dict()
-
-    def append_src(self, key, dbname, version):
-        """
-        Description: Appending source dictionary
-        Args:
-            key: source name
-            dbname: database name
-            version: version
-        Returns:
-        Raises:
-        """
-        self.dictionary[key] = [dbname, version]
-
-    # pylint: disable=R0913
-    def append_bin(self, key, src=None, version=None, dbname=None, parent_node=None):
-        """
-        Description: Appending binary dictionary
-        Args:
-            key: binary name
-            src: source name
-            version: version
-            dbname: database name
-            parent_node: parent node
-        Returns:
-        Raises:
-        """
-        if not parent_node:
-            self.dictionary[key] = [src, version, dbname, [['root', None]]]
-        else:
-            self.dictionary[key] = [src, version, dbname, [parent_node]]
-
-    def update_value(self, key, parent_list=None):
-        """
-        Args:
-            key: binary name
-            parent_list: parent list
-        Returns:
-        Raises:
-        """
-        if parent_list:
-            for parent in parent_list:
-                if parent not in self.dictionary[key][ListNode.PARENT_LIST]:
-                    self.dictionary[key][ListNode.PARENT_LIST].append(parent)
diff --git a/packageship/packageship/application/apps/package/serialize.py b/packageship/packageship/application/apps/package/serialize.py
deleted file mode 100644
index 4949040908e9fb42bdf2037c15638e97924a5547..0000000000000000000000000000000000000000
--- a/packageship/packageship/application/apps/package/serialize.py
+++ /dev/null
@@ -1,307 +0,0 @@
-#!/usr/bin/python3
-"""
-Description: marshmallow serialize
-"""
-from marshmallow import Schema
-from marshmallow import fields
-from marshmallow import ValidationError
-from marshmallow import validate
-
-from packageship.application.models.package import Packages
-
-
-def validate_pagenum(pagenum):
-    """
-    Description: Method test
-    Args:
-        pagenum: pagenum
-    Returns:
-        True or failure
-    Raises:
-        ValidationError: Test failed
-    """
-    if pagenum <= 0 or pagenum > 65535:
-        raise ValidationError("pagenum is illegal data ")
-
-
-def validate_pagesize(pagesize):
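Aside: the `DictionaryOperations` class deleted above keeps every binary entry as a four-item list `[src, version, dbname, parent_list]`, with a `['root', None]` marker for the query's entry point. The sketch below reproduces that behaviour as plain functions so the traversal code is easier to follow; the `PARENT_LIST` index and all package names are illustrative assumptions, not project imports.

```python
# Standalone sketch of the DictionaryOperations node layout (assumed here:
# PARENT_LIST mirrors ListNode.PARENT_LIST = 3; package names are made up).
PARENT_LIST = 3


def append_bin(dictionary, key, src=None, version=None, dbname=None, parent_node=None):
    """Add a binary node; without an explicit parent it hangs off 'root'."""
    if not parent_node:
        dictionary[key] = [src, version, dbname, [['root', None]]]
    else:
        dictionary[key] = [src, version, dbname, [parent_node]]


def update_value(dictionary, key, parent_list=None):
    """Merge new parents into an existing node, skipping duplicates."""
    if parent_list:
        for parent in parent_list:
            if parent not in dictionary[key][PARENT_LIST]:
                dictionary[key][PARENT_LIST].append(parent)


binary_dict = {}
append_bin(binary_dict, 'glibc-common', src='glibc', version='2.28', dbname='os-base')
# 'root' is already recorded, so only ['bash', 'install'] gets appended.
update_value(binary_dict, 'glibc-common',
             parent_list=[['bash', 'install'], ['root', None]])
```

The duplicate check is why repeated queries do not inflate a node's parent list.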
""" - Description: Method test - Args: - pagesize: pagesize - Returns: - True or failure - Raises: - ValidationError: Test failed - """ - if pagesize <= 0 or pagesize > 65535: - raise ValidationError("pagesize is illegal data ") - - -def validate_maintainlevel(maintainlevel): - """ - Description: Method test - Args: - maintainlevel: maintainlevel - Returns: - True or failure - Raises: - ValidationError: Test failed - """ - if maintainlevel not in [1, 2, 3, 4, '']: - raise ValidationError("maintainLevel is illegal data ") - - -def validate_maintainlevels(maintainlevel): - """ - Description: Method test - Args: - maintainlevel: maintainlevel - Returns: - True or failure - Raises: - ValidationError: Test failed - """ - if maintainlevel not in ['1', '2', '3', '4', '']: - raise ValidationError("maintainLevel is illegal data ") - - -class AllPackagesSchema(Schema): - """ - Description: AllPackagesSchema serialize - """ - table_name = fields.Str( - required=True, - validate=validate.Length(min=1, - max=200)) - page_num = fields.Integer( - required=True, - validate=validate_pagenum - ) - page_size = fields.Integer( - required=True, - validate=validate_pagesize - ) - query_pkg_name = fields.Str(validate=validate.Length( - max=200), required=False, allow_none=True) - maintainner = fields.Str(validate=validate.Length( - max=200), required=False, allow_none=True) - maintainlevel = fields.Str( - validate=validate_maintainlevels, - required=False, - allow_none=True) - - -class SinglepackSchema(Schema): - """ - Description: GetpackSchema serialize - """ - pkg_name = fields.Str( - required=True, - validate=validate.Length(min=1, - max=200)) - - table_name = fields.Str(required=True, - validate=validate.Length(min=1, - max=200)) - - -class UpdatePackagesSchema(Schema): - """ - Description: UpdatePackagesSchema serialize - """ - pkg_name = fields.Str( - required=False, - validate=validate.Length( - max=200)) - maintainer = fields.Str(validate=validate.Length( - max=50), 
required=False, allow_none=True) - maintainlevel = fields.Integer( - validate=validate_maintainlevel, - required=False, - allow_none=True) - batch = fields.Boolean( - required=True) - filepath = fields.Str(validate=validate.Length( - max=200), required=False, allow_none=True) - - -class InstallDependSchema(Schema): - """ - Description: InstallDependSchema - """ - binaryName = fields.Str( - required=True, - validate=validate.Length( - min=1, max=500)) - db_list = fields.List(fields.String(), required=False, allow_none=True) - - -class BuildDependSchema(Schema): - """ - Description: BuildDependSchema serialize - """ - sourceName = fields.Str( - required=True, - validate=validate.Length( - min=1, max=200)) - db_list = fields.List(fields.String(), required=False, allow_none=True) - - -def validate_withsubpack(withsubpack): - """ - Description: Method test - Args: - withsubpack: withsubpack - Returns: - True or failure - Raises: - ValidationError: Test failed - """ - if withsubpack not in ['0', '1']: - raise ValidationError("withSubpack is illegal data ") - - -class BeDependSchema(Schema): - """ - Description: BeDependSchema serialize - """ - packagename = fields.Str( - required=True, - validate=validate.Length( - min=1, - max=200)) - withsubpack = fields.Str( - validate=validate_withsubpack, - required=False, allow_none=True) - dbname = fields.Str( - required=True, - validate=validate.Length( - min=1, - max=50)) - - -def validate_selfbuild(selfbuild): - """ - Description: Method test - """ - if selfbuild not in ['0', '1']: - raise ValidationError("selfbuild is illegal data ") - - -def validate_packtype(packtype): - """ - Description: Method test - """ - if packtype not in ['source', 'binary']: - raise ValidationError("packtype is illegal data ") - - -class SelfDependSchema(Schema): - """ - Description: SelfDependSchema serialize - """ - packagename = fields.Str( - required=True, - validate=validate.Length( - min=1, - max=200)) - db_list = fields.List(fields.String(), 
required=False, allow_none=True) - selfbuild = fields.Str(validate=validate_selfbuild, - required=False, allow_none=True) - withsubpack = fields.Str( - validate=validate_withsubpack, required=False, allow_none=True) - packtype = fields.Str(validate=validate_packtype, - required=False, allow_none=True) - - -class DeletedbSchema(Schema): - """ - Description: DeletedbSchema serialize - """ - dbName = fields.Str( - required=True, - validate=validate.Length( - min=1, - max=200)) - - -def have_err_db_name(db_list, db_priority): - """ - Description: have error database name method - Args: - db_list: db_list db list of inputs - db_priority: db_priority default list - Returns: - If any element in db_list is no longer in db_priority, return false - Raises: - """ - return any(filter(lambda db_name: db_name not in db_priority, db_list)) - - -class InitSystemSchema(Schema): - """ - Description: InitSystemSchema serialize - """ - configfile = fields.Str( - validate=validate.Length( - max=500), required=False, allow_none=True) - - -class AllPackInfoSchema(Schema): # pylint: disable= too-few-public-methods - """ - Field serialization for package file download - """ - class Meta: - """Model mapping serialized fields""" - model = Packages - fields = ( - 'id', - 'name', - 'url', - 'version', - 'release', - 'release_time', - 'rpm_license', - 'maintainer', - 'maintainlevel', - 'feature', - 'release_time', - 'used_time', - 'maintainer_status', - 'latest_version', - 'latest_version_time') - - -class SinglePackInfoSchema(Schema): - """ - Field serialization for package file download - """ - - class Meta: - """Model mapping serialized fields""" - model = Packages - fields = ( - 'name', - 'version', - 'release', - 'url', - 'feature', - 'rpm_license', - 'summary', - 'description') - - -class DataFormatVerfi(Schema): - """ - Verify the data in yaml - """ - - maintainer = fields.Str(validate=validate.Length( - max=50), required=False, allow_none=True) - maintainlevel = fields.Int( - 
validate=validate_maintainlevel, - required=False, - allow_none=True) - name = fields.Str(validate=validate.Length(min=1, - max=50), required=True) diff --git a/packageship/packageship/application/apps/package/url.py b/packageship/packageship/application/apps/package/url.py deleted file mode 100644 index 873cd7931ac4d53c64267c9f24b1db7763ae3cea..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/apps/package/url.py +++ /dev/null @@ -1,36 +0,0 @@ -#!/usr/bin/python3 -""" -Description: url set -""" -from . import view - -urls = [ - # Get all packages' info - (view.Packages, '/packages', {'query': ('GET')}), - - - # Query and update a package info - (view.SinglePack, '/packages/packageInfo', - {'query': ('GET'), 'write': ('PUT')}), - - # Query a package's install depend(support querying in one or more databases) - (view.InstallDepend, '/packages/findInstallDepend', {'query': ('POST')}), - - # Query a package's build depend(support querying in one or more databases) - - (view.BuildDepend, '/packages/findBuildDepend', {'query': ('POST')}), - - # Query a package's all dependencies including install and build depend - # (support quering a binary or source package in one or more databases) - (view.SelfDepend, '/packages/findSelfDepend', {'query': ('POST')}), - - # Query a package's all be dependencies including be installed and built depend - (view.BeDepend, '/packages/findBeDepend', {'query': ('POST')}), - - # Get all imported databases, import new databases and update existed databases - - (view.Repodatas, '/repodatas', {'query': ('GET'), 'write': ('DELETE')}), - - # Reload database - (view.InitSystem, '/initsystem', {'write': ('POST')}) -] diff --git a/packageship/packageship/application/apps/package/view.py b/packageship/packageship/application/apps/package/view.py deleted file mode 100644 index 7df9f978be1effe28de848f88021662f87e50669..0000000000000000000000000000000000000000 --- 
a/packageship/packageship/application/apps/package/view.py +++ /dev/null @@ -1,621 +0,0 @@ -#!/usr/bin/python3 -""" -description: Interface processing -class: BeDepend, BuildDepend, InitSystem, InstallDepend, Packages, -Repodatas, SelfDepend, SinglePack -""" -from flask import request -from flask import jsonify -from flask import current_app -from flask_restful import Resource -from sqlalchemy.exc import DisconnectionError -from sqlalchemy.exc import SQLAlchemyError - -from packageship.application.initsystem.data_import import InitDataBase -from packageship.libs.dbutils import DBHelper -from packageship.libs.exception import Error -from packageship.libs.exception import ContentNoneException -from packageship.libs.exception import DataMergeException -from packageship.libs.exception import ConfigurationException -from packageship.libs.log import Log -from packageship.libs.conf import configuration -from .function.constants import ResponseCode -from .function.packages import get_all_package_info -from .function.packages import sing_pack -from .function.searchdb import db_priority -from .serialize import AllPackagesSchema -from .serialize import SinglepackSchema - -from .serialize import DeletedbSchema -from .serialize import InitSystemSchema -from .serialize import BeDependSchema -from .function.be_depend import BeDepend as be_depend -from .function.install_depend import InstallDepend as installdepend -from .function.build_depend import BuildDepend as builddepend -from .function.self_depend import SelfDepend as self_depend -from .serialize import InstallDependSchema -from .serialize import BuildDependSchema -from .serialize import SelfDependSchema -from .serialize import have_err_db_name -from ...models.package import DatabaseInfo - -LOGGER = Log(__name__) - - -# pylint: disable = no-self-use - - -class Packages(Resource): - """ - Description: interface for package info management - Restful API: get - changeLog: - """ - - def get(self): - """ - Get all package info 
from a database - - Args: - dbName: Data path name, not required parameter - Returns: - for - example:: - { - "code": "", - "data": [{ - "dbname": "", - "license": "", - "maintainlevel":, - "maintaniner": , - "rpm_packager": "", - "sourceName": "", - "sourceURL": "", - "version": "" - }], - "msg": "" - } - Raises: - DisconnectionError: Unable to connect to database exception - AttributeError: Object does not have this property - Error: Abnormal error - """ - # Get verification parameters - schema = AllPackagesSchema() - data = request.args - if schema.validate(data): - response = ResponseCode.response_json(ResponseCode.PARAM_ERROR) - response['total_count'] = None - response['total_page'] = None - return jsonify(response) - table_name = data.get("table_name") - page_num = data.get("page_num") - page_size = data.get("page_size") - src_name = data.get("query_pkg_name", None) - maintainner = data.get("maintainner", None) - maintainlevel = data.get("maintainlevel", None) - result = get_all_package_info( - table_name, - page_num, - page_size, - src_name, - maintainner, - maintainlevel) - return jsonify(result) - - -class SinglePack(Resource): - """ - description: single package management - Restful API: get - ChangeLog: - """ - - def get(self): - """ - Searching a package info - - Args: - dbName: Database name, not required parameter - sourceName: Source code package name, must pass - Returns: - for - examples:: - { - "code": "", - "data": [{ - "buildDep": [], - "dbname": "", - "license": "", - "maintainlevel": "", - "maintaniner": "", - "rpm_packager": "", - "sourceName": "", - "sourceURL": "", - "subpack": { }, - "version": ""}], - "msg": "" - } - Raises: - DisconnectionError: Unable to connect to database exception - AttributeError: Object does not have this property - TypeError: Exception of type - Error: Abnormal error - """ - # Get verification parameters - schema = SinglepackSchema() - data = request.args - if schema.validate(data): - return jsonify( - 
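Aside: `Packages.get` above treats `schema.validate(data)` as returning an error dict that is empty when the request is valid. The sketch below reproduces that contract without the marshmallow dependency (deliberately not imported here), reusing the `1..65535` bounds that `validate_pagenum` / `validate_pagesize` enforce in serialize.py; `check_page_args` is a hypothetical helper name.

```python
# Dependency-free sketch of the "validate returns an error dict" contract.
class ValidationError(Exception):
    pass


def validate_pagenum(pagenum):
    # Same bounds as serialize.py: values outside 1..65535 are rejected.
    if pagenum <= 0 or pagenum > 65535:
        raise ValidationError("pagenum is illegal data")


def check_page_args(args):
    """Return an error dict like Schema.validate: empty means the request is OK."""
    errors = {}
    for field in ('page_num', 'page_size'):
        try:
            validate_pagenum(int(args.get(field, 0)))
        except (ValueError, ValidationError) as err:
            errors[field] = str(err)
    return errors
```

A view built on this would answer `PARAM_ERROR` whenever the returned dict is non-empty, exactly as the deleted handler does.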
ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - srcname = data.get("pkg_name") - tablename = data.get("table_name") - result = sing_pack(srcname, tablename) - return jsonify(result) - - -class InstallDepend(Resource): - """ - Description: install depend of binary package - Restful API: post - changeLog: - """ - - def post(self): - """ - Query a package's install depend(support - querying in one or more databases) - - Args: - binaryName - dbPreority: the array for database preority - Returns: - resultDict{ - binary_name: //binary package name - [ - src, //the source package name for - that binary packge - dbname, - version, - [ - parent_node, //the binary package name which is - the install depend for binaryName - type //install install or build, which - depend on the function - ] - ] - } - Raises: - """ - schema = InstallDependSchema() - - data = request.get_json() - validate_err = schema.validate(data) - if validate_err: - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - pkg_name = data.get("binaryName") - - db_pri = db_priority() - if not db_pri: - return jsonify( - ResponseCode.response_json( - ResponseCode.NOT_FOUND_DATABASE_INFO - ) - ) - - db_list = data.get("db_list") if data.get("db_list") \ - else db_pri - - if not all([pkg_name, db_list]): - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - - if have_err_db_name(db_list, db_pri): - return jsonify( - ResponseCode.response_json(ResponseCode.DB_NAME_ERROR) - ) - - response_code, install_dict, not_found_components = \ - installdepend(db_list).query_install_depend([pkg_name]) - - if not install_dict: - return jsonify( - ResponseCode.response_json(response_code) - ) - elif len(install_dict) == 1 and install_dict.get(pkg_name)[2] == 'NOT FOUND': - return jsonify( - ResponseCode.response_json(ResponseCode.PACK_NAME_NOT_FOUND) - ) - return jsonify( - ResponseCode.response_json(ResponseCode.SUCCESS, data={ - "install_dict": install_dict, - 
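Aside: the `InstallDepend` docstring above describes `resultDict` as mapping each binary name to its source name, database, version and a parent list. The snippet below builds a hypothetical dict in that shape (package names invented, field order taken from the docstring) and shows a small helper that extracts a node's direct parents, minus the `root` marker.

```python
# Hypothetical install result in the documented shape:
# binary_name -> [src, dbname, version, [[parent_name, depend_type], ...]]
install_dict = {
    'CUnit': ['CUnit', 'os-base', '2.1.3', [['root', None]]],
    'glibc': ['glibc', 'os-base', '2.28', [['CUnit', 'install']]],
}


def direct_parents(result, name):
    """List the parent package names recorded for one binary, skipping 'root'."""
    parents = result[name][3]
    return [parent[0] for parent in parents if parent[0] != 'root']
```

Walking the parent lists this way is how a client can reconstruct the dependency chain from a single response.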
'not_found_components': list(not_found_components) - }) - ) - - -class BuildDepend(Resource): - """ - Description: build depend of binary package - Restful API: post - changeLog: - """ - - def post(self): - """ - Query a package's build depend and - build depend package's install depend - (support querying in one or more databases) - - Args: - sourceName :name of the source package - dbPreority:the array for database preority - Returns: - for - example:: - { - "code": "", - "data": "", - "msg": "" - } - Raises: - """ - schema = BuildDependSchema() - - data = request.get_json() - if schema.validate(data): - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - pkg_name = data.get("sourceName") - - db_pri = db_priority() - - if not db_pri: - return jsonify( - ResponseCode.response_json( - ResponseCode.NOT_FOUND_DATABASE_INFO - ) - ) - - db_list = data.get("db_list") if data.get("db_list") \ - else db_pri - - if have_err_db_name(db_list, db_pri): - return jsonify( - ResponseCode.response_json(ResponseCode.DB_NAME_ERROR) - ) - - build_ins = builddepend([pkg_name], db_list) - - res_code, res_dict, _, not_found_com = build_ins.build_depend_main() - if res_dict: - res_code = ResponseCode.SUCCESS - else: - return jsonify( - ResponseCode.response_json( - res_code - ) - ) - - return jsonify( - ResponseCode.response_json( - res_code, - data={ - 'build_dict': res_dict, - 'not_found_components': list(not_found_com) - } - ) - ) - - -class SelfDepend(Resource): - """ - Description: querying install and build depend for a package - and others which has the same src name - Restful API: post - changeLog: - """ - - def post(self): - """ - Query a package's all dependencies including install and build depend - (support quering a binary or source package in one or more databases) - - Args: - packageName:package name - packageType: source/binary - selfBuild :0/1 - withSubpack: 0/1 - dbPreority:the array for database preority - Returns: - for - example:: - { - 
"code": "", - "data": "", - "msg": "" - } - """ - schema = SelfDependSchema() - - data = request.get_json() - validate_err = schema.validate(data) - - if validate_err: - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - - pkg_name = data.get("packagename") - db_pri = db_priority() - - if not db_pri: - return jsonify( - ResponseCode.response_json( - ResponseCode.NOT_FOUND_DATABASE_INFO - ) - ) - db_list = data.get("db_list") if data.get("db_list") \ - else db_pri - - self_build = data.get("selfbuild", 0) - with_sub_pack = data.get("withsubpack", 0) - pack_type = data.get("packtype", "binary") - - if have_err_db_name(db_list, db_pri): - return jsonify( - ResponseCode.response_json(ResponseCode.DB_NAME_ERROR) - ) - response_code, binary_dicts, source_dicts, not_fd_components = \ - self_depend(db_list).query_depend(pkg_name, int(self_build), - int(with_sub_pack), pack_type) - - if not all([binary_dicts, source_dicts]): - return jsonify( - ResponseCode.response_json(response_code) - ) - - return jsonify( - ResponseCode.response_json(ResponseCode.SUCCESS, data={ - "binary_dicts": binary_dicts, - "source_dicts": source_dicts, - "not_found_components": list(not_fd_components) - }) - ) - - -class BeDepend(Resource): - """ - Description: querying be installed and built depend for a package - and others which has the same src name - Restful API: post - changeLog: - """ - - def post(self): - """ - Query a package's all dependencies including - be installed and built depend - - Args: - packageName:package name - withSubpack: 0/1 - dbname:database name - Returns: - for - example:: - resultList[ - restult[ - binaryName: - srcName: - dbName: - type: beinstall or bebuild, which depend on the function - childNode: the binary package name which is the be built/installed - depend for binaryName - ] - ] - """ - schema = BeDependSchema() - data = request.get_json() - validate_err = schema.validate(data) - - if validate_err: - return jsonify( - 
ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - - package_name = data.get("packagename") - with_sub_pack = data.get("withsubpack") - db_name = data.get("dbname") - - if db_name not in db_priority(): - return jsonify( - ResponseCode.response_json(ResponseCode.DB_NAME_ERROR) - ) - - bedepnd_ins = be_depend(package_name, db_name, with_sub_pack) - - res_dict = bedepnd_ins.main() - - if not res_dict: - return jsonify( - ResponseCode.response_json(ResponseCode.PACK_NAME_NOT_FOUND) - ) - return jsonify( - ResponseCode.response_json(ResponseCode.SUCCESS, data=res_dict) - ) - - -class Repodatas(Resource): - """ - description: Get database information and delete database - Restful API: get, delete - ChangeLog: - """ - - def get(self): - """ - get all database - - Returns: - for - example:: - { - "code": "", - "data": [ - { - "database_name": "", - "priority": "", - "status": "" - } - ], - "msg": "" - } - Raises: - FileNotFoundError: File not found exception - TypeError: Exception of wrong type - Error: abnormal Error - """ - try: - with DBHelper(db_name='lifecycle') as data_name: - name_list = data_name.session.query( - DatabaseInfo.name, DatabaseInfo.priority).order_by(DatabaseInfo.priority).all() - data_list = [dict(zip(ven.keys(), ven)) for ven in name_list] - return jsonify( - ResponseCode.response_json( - ResponseCode.SUCCESS, - data=data_list)) - except (SQLAlchemyError, Error) as data_info_error: - current_app.logger.error(data_info_error) - return jsonify( - ResponseCode.response_json( - ResponseCode.NOT_FOUND_DATABASE_INFO) - ) - - def delete(self): - """ - get all database - - Returns: - for - example:: - { - "code": "", - "data": "", - "msg": "" - } - Raises: - FileNotFoundError: File not found exception, - TypeError: Exception of wrong type - Error: Abnormal error - """ - schema = DeletedbSchema() - data = request.args - if schema.validate(data): - return jsonify( - ResponseCode.response_json(ResponseCode.PARAM_ERROR) - ) - db_name = data.get("dbName") 
- db_list = db_priority() - if not db_list: - return jsonify( - ResponseCode.response_json(ResponseCode.FILE_NOT_FOUND) - ) - if db_name not in db_priority(): - return jsonify( - ResponseCode.response_json(ResponseCode.DB_NAME_ERROR) - ) - try: - drop_db = InitDataBase() - del_result = drop_db.delete_db(db_name) - if not del_result: - return jsonify( - ResponseCode.response_json(ResponseCode.DELETE_DB_ERROR)) - return jsonify( - ResponseCode.response_json(ResponseCode.SUCCESS) - ) - except (SQLAlchemyError, TypeError, Error) as error: - current_app.logger.error(error) - return jsonify( - ResponseCode.response_json(ResponseCode.DELETE_DB_ERROR) - ) - - -class InitSystem(Resource): - """ - description: Initialize database - Restful API: post - ChangeLog: - """ - - def post(self): - """ - InitSystem - - Returns: - for - example:: - { - "code": "", - "data": "", - "msg": "" - } - Raises: - ContentNoneException: Unable to connect to the exception of the database - DisconnectionError:Exception connecting to database - TypeError:Exception of wrong type - DataMergeException:Exception of merging data - FileNotFoundError:File not found exception - Error: abnormal Error - """ - schema = InitSystemSchema() - - data = request.get_json() - validate_err = schema.validate(data) - if validate_err: - return jsonify( - ResponseCode.response_json( - ResponseCode.PARAM_ERROR)) - configfile = data.get("configfile", None) - try: - abnormal = None - if not configfile: - InitDataBase( - config_file_path=configuration.INIT_CONF_PATH).init_data() - else: - InitDataBase(config_file_path=configfile).init_data() - except ContentNoneException as content_none_error: - LOGGER.logger.error(content_none_error) - abnormal = ResponseCode.CONFIGFILE_PATH_EMPTY - except DisconnectionError as dis_connection_error: - LOGGER.logger.error(dis_connection_error) - abnormal = ResponseCode.DIS_CONNECTION_DB - except TypeError as type_error: - LOGGER.logger.error(type_error) - abnormal = ResponseCode.TYPE_ERROR 
- except ConfigurationException as error: - LOGGER.logger.error(error) - abnormal = error - return jsonify(ResponseCode.response_json('5000', msg=abnormal.message)) - except DataMergeException as data_merge_error: - LOGGER.logger.error(data_merge_error) - abnormal = ResponseCode.DATA_MERGE_ERROR - except FileNotFoundError as file_not_found_error: - LOGGER.logger.error(file_not_found_error) - abnormal = ResponseCode.FILE_NOT_FIND_ERROR - except (Error, SQLAlchemyError) as error: - LOGGER.logger.error(error) - abnormal = ResponseCode.FAILED_CREATE_DATABASE_TABLE - if abnormal is not None: - return jsonify(ResponseCode.response_json(abnormal)) - db_list = db_priority() - if not db_list: - return jsonify( - ResponseCode.response_json( - ResponseCode.FAILED_CREATE_DATABASE_TABLE)) - return jsonify(ResponseCode.response_json(ResponseCode.SUCCESS)) diff --git a/packageship/packageship/application/initsystem/__init__.py b/packageship/packageship/application/initsystem/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/packageship/packageship/application/initsystem/data_import.py b/packageship/packageship/application/initsystem/data_import.py deleted file mode 100644 index 99930cbdc665bfba08f50b0a50a3251a69ae1011..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/initsystem/data_import.py +++ /dev/null @@ -1,800 +0,0 @@ -#!/usr/bin/python3 -""" -Description: Initialization of data import - Import the data in the sqlite database into the mysql database -Class: InitDataBase,MysqlDatabaseOperations,SqliteDatabaseOperations -""" -import os -import yaml -from sqlalchemy import text -from sqlalchemy.exc import SQLAlchemyError, InternalError -from packageship.libs.dbutils.sqlalchemy_helper import DBHelper -from packageship.libs.exception import ContentNoneException -from packageship.libs.exception import DatabaseRepeatException -from packageship.libs.exception 
import Error -from packageship.libs.exception import ConfigurationException -from packageship.libs.log import LOGGER -from packageship.libs.conf import configuration -from packageship.application.models.package import SrcPack -from packageship.application.models.package import DatabaseInfo -from packageship.application.models.package import BinPack -from packageship.application.models.package import BinRequires -from packageship.application.models.package import SrcRequires -from packageship.application.models.package import BinProvides -from packageship.application.models.package import BinFiles -from packageship.application.models.package import Packages - - -class InitDataBase(): - """ - Description: Database initialization, generate multiple databases and data - based on configuration files - - Attributes: - config_file_path: configuration file path - config_file_datas: initialize the configuration content of the database - db_type: type of database - """ - - def __init__(self, config_file_path=None): - """ - Description: Class instance initialization - - Args: - config_file_path: Configuration file path - """ - self.config_file_path = config_file_path - if self.config_file_path: - # yaml configuration file content - self.config_file_datas = self.__read_config_file() - self.db_type = configuration.DATABASE_ENGINE_TYPE - self.sql = None - self._database = None - self._sqlite_db = None - self._database_engine = { - 'sqlite': SqliteDatabaseOperations, - 'mysql': MysqlDatabaseOperations - } - self.database_name = None - self._tables = ['src_pack', 'bin_pack', - 'bin_requires', 'src_requires', 'bin_provides', 'bin_files'] - # Create life cycle related databases and tables - if not self.create_database(db_name='lifecycle', - tables=['packages_issue', - 'packages_maintainer', - 'databases_info'], - storage=True): - raise SQLAlchemyError( - 'Failed to create the specified database and table:lifecycle') - - def __read_config_file(self): - """ - Read the contents of the 
configuration file load each - node data in the yaml configuration file as a list to return - - Returns: - Initialize the contents of the database configuration file - Raises: - FileNotFoundError: The specified file does not exist - TypeError: Wrong type of data - """ - - if not os.path.exists(self.config_file_path): - raise FileNotFoundError( - "system initialization configuration file" - "does not exist: %s" % self.config_file_path) - # load yaml configuration file - with open(self.config_file_path, 'r', encoding='utf-8') as file_context: - try: - init_database_config = yaml.load( - file_context.read(), Loader=yaml.FullLoader) - except yaml.YAMLError as yaml_error: - - raise ConfigurationException( - "The format of the yaml configuration" - "file is wrong please check and try again:{0}".format(yaml_error)) - - if init_database_config is None: - raise ConfigurationException( - 'The content of the database initialization configuration file cannot be empty') - if not isinstance(init_database_config, list): - raise ConfigurationException( - "The format of the initial database configuration file" - "is incorrect.When multiple databases need to be initialized," - "it needs to be configured in the form of multiple" - "nodes:{}".format(self.config_file_path)) - for config_item in init_database_config: - if not isinstance(config_item, dict): - raise ConfigurationException( - "The format of the initial database" - "configuration file is incorrect, and the value in a single node should" - "be presented in the form of key - val pairs:{}".format(self.config_file_path)) - return init_database_config - - def init_data(self): - """ - Initialization of the database - - Raises: - IOError: An error occurred while deleting the database information file - """ - if getattr(self, 'config_file_datas', None) is None or \ - self.config_file_datas is None: - raise ContentNoneException("The content of the database initialization" - "configuration file is empty") - - if 
self.__exists_repeat_database(): - raise DatabaseRepeatException( - 'There is a duplicate database configuration') - - if not self.__clear_database(): - raise SQLAlchemyError( - 'Failed to delete the database, throw an exception') - - if not InitDataBase.__clear_database_info(): - raise SQLAlchemyError( - 'Failed to clear data in database_info or lifecycle database') - - for database_config in self.config_file_datas: - if not database_config.get('dbname'): - LOGGER.logger.error( - 'The database name in the database initialization configuration file is empty') - continue - priority = database_config.get('priority') - if not isinstance(priority, int) or priority < 0 or priority > 100: - LOGGER.logger.error("The priority value type in the database initialization" - "configuration file is incorrect") - continue - lifecycle_status_val = database_config.get('lifecycle') - if lifecycle_status_val not in ('enable', 'disable'): - LOGGER.logger.error("The value of the life cycle in the initialization" - "configuration file can only be enable or disable") - continue - # Initialization data - self._init_data(database_config) - - def _create_database(self, db_name, tables, storage=False): - """ - create related databases - - Args: - database_config: Initialize the configuration content of the database_config - Returns: - The generated mysql database or sqlite database - Raises: - SQLAlchemyError: Abnormal database operation - """ - _database_engine = self._database_engine.get(self.db_type) - if not _database_engine: - raise Error("The database engine is set incorrectly," - "currently only the following engines are supported: %s " - % '、'.join(self._database_engine.keys())) - _create_table_result = _database_engine( - db_name=db_name, tables=tables, storage=storage).create_database(self) - return _create_table_result - - def _init_data(self, database_config): - """ - data initialization operation - - Args: - database: Initialize the configuration content of the database - 
Returns: - - Raises: - ContentNoneException: Exception with empty content - TypeError: Data type error - SQLAlchemyError: Abnormal database operation - IOError: An error occurred while deleting the database information file - """ - - try: - # 1. create a database and related tables in the database - _db_name = database_config.get('dbname') - _create_database_result = self._create_database( - _db_name, self._tables) - if not _create_database_result: - raise SQLAlchemyError( - 'Failed to create the specified database and table:%s' - % database_config['dbname']) - # 2. get the data of binary packages and source packages - src_db_file = database_config.get('src_db_file') - bin_db_file = database_config.get('bin_db_file') - - if src_db_file is None or bin_db_file is None: - raise ContentNoneException( - "The path to the sqlite file in the database initialization" - "configuration is incorrect ") - if not os.path.exists(src_db_file) or not os.path.exists(bin_db_file): - raise FileNotFoundError( - "sqlite file {src} or {bin} does not exist, please" - "check and try again".format(src=src_db_file, bin=bin_db_file)) - # 3. 
Obtain temporary source package files and binary package files - if self.__save_data(database_config, - self.database_name): - # Update the configuration file of the database - database_content = { - 'database_name': _db_name, - 'priority': database_config.get('priority'), - } - InitDataBase.__update_database_info(database_content) - - except (SQLAlchemyError, ContentNoneException, TypeError, - Error, FileNotFoundError) as error_msg: - LOGGER.logger.error(error_msg) - # Delete the specified database - self.__del_database(_db_name) - - def __del_database(self, db_name): - try: - _database_engine = self._database_engine.get(self.db_type) - del_result = _database_engine(db_name=db_name).drop_database() - return del_result - except (IOError, Error) as exception_msg: - LOGGER.logger.error(exception_msg) - return False - - @staticmethod - def __columns(cursor): - """ - functional description:Returns all the column names - queried by the current cursor - - Args: - cursor: Cursor - Returns: - The first columns - Raises: - - """ - return [col[0] for col in cursor.description] - - def __get_data(self): - """ - According to different sql statements, query related table data - - Args: - - Returns: - - Raises: - - """ - if self.sql is None: - return None - try: - src_packages_data = self._database.session.execute(self.sql) - columns = InitDataBase.__columns( - src_packages_data.cursor) - return [dict(zip(columns, row)) for row in src_packages_data.fetchall()] - except SQLAlchemyError as sql_error: - LOGGER.logger.error(sql_error) - return None - - def __save_data(self, database_config, db_name): - """ - integration of multiple data files - - Args: - src_package_paths: Source package database file - bin_package_paths: Binary package database file - Returns: - Path of the generated temporary database file - Raises: - - """ - src_db_file = database_config.get('src_db_file') - bin_db_file = database_config.get('bin_db_file') - table_name = database_config.get('dbname') - 
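`__get_data` together with `__columns` turns every cursor row into a dict keyed by column name before `batch_add` writes it out. A minimal standalone version of that row-to-dict pattern, using only the stdlib `sqlite3` module in place of the project's `DBHelper`/SQLAlchemy session (file and table names here are for illustration only):

```python
import sqlite3


def rows_as_dicts(db_file, sql):
    """Run a query against a sqlite file and return each row as a
    dict keyed by column name, zipping cursor.description with the
    row tuple the same way __columns/__get_data do."""
    with sqlite3.connect(db_file) as conn:
        cursor = conn.execute(sql)
        columns = [col[0] for col in cursor.description]
        return [dict(zip(columns, row)) for row in cursor.fetchall()]
```

Returning plain dicts is what lets the import reuse one code path for `packages`, `requires`, `provides`, and `files`: each dict maps directly onto the matching ORM model in `batch_add`.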
lifecycle_status_val = database_config.get('lifecycle') - try: - with DBHelper(db_name=src_db_file, db_type='sqlite:///', complete_route_db=True) \ - as database: - self._database = database - # Save data related to source package - self._save_src_packages( - db_name, table_name, lifecycle_status_val) - self._save_src_requires(db_name) - - with DBHelper(db_name=bin_db_file, db_type='sqlite:///', complete_route_db=True)\ - as database: - self._database = database - # Save binary package related data - self._save_bin_packages(db_name) - self._save_bin_requires(db_name) - self._save_bin_provides(db_name) - self._save_bin_files(db_name) - except (SQLAlchemyError, ContentNoneException) as sql_error: - LOGGER.logger.error(sql_error) - self.__del_database(db_name) - return False - else: - return True - - def _save_src_packages(self, db_name, table_name, lifecycle_status_val): - """ - Save the source package data - - Args: - db_name: Saved database name - """ - # Query all source packages - self.sql = " select * from packages " - packages_datas = self.__get_data() - if packages_datas is None: - raise ContentNoneException( - "{db_name}:There is no relevant data in the source " - "package provided ".format(db_name=db_name)) - for index, src_package_item in enumerate(packages_datas): - try: - src_package_name = '-'.join([src_package_item.get('name'), - src_package_item.get('version'), - src_package_item.get( - 'release') + '.src.rpm' - ]) - except AttributeError as exception_msg: - src_package_name = None - LOGGER.logger.warning(exception_msg) - finally: - packages_datas[index]['src_name'] = src_package_name - with DBHelper(db_name=db_name) as database: - database.batch_add(packages_datas, SrcPack) - if lifecycle_status_val == 'enable': - InitDataBase._storage_packages(table_name, packages_datas) - - @staticmethod - def _storage_packages(table_name, package_data): - """ - Bulk storage of source code package data - """ - add_packages = [] - cls_model = 
Packages.package_meta(table_name) - pkg_keys = ('name', 'url', 'rpm_license', 'version', - 'release', 'summary', 'description') - with DBHelper(db_name="lifecycle") as database: - if table_name not in database.engine.table_names(): - database.create_table([table_name]) - # Query data that already exists in the database - exist_packages_dict = dict() - for pkg in database.session.query(cls_model).all(): - exist_packages_dict[pkg.name] = pkg - _packages = [] - for pkg in package_data: - _package_dict = {key: pkg[key] for key in pkg_keys} - _packages.append(_package_dict) - - # Combine all package data, save or update - for package_item in _packages: - package_model = exist_packages_dict.get(package_item['name']) - if package_model: - for key, val in package_item.items(): - setattr(package_model, key, val) - else: - add_packages.append(package_item) - - if add_packages: - database.batch_add(add_packages, cls_model) - database.session.commit() - - def _save_src_requires(self, db_name): - """ - Save the dependent package data of the source package - - Args: - db_name:Name database - Returns: - - Raises: - - """ - # Query all source packages - self.sql = " select * from requires " - requires_datas = self.__get_data() - if requires_datas is None: - raise ContentNoneException( - "{db_name}: The package data that the source package " - "depends on is empty".format(db_name=db_name)) - with DBHelper(db_name=db_name) as database: - database.batch_add(requires_datas, SrcRequires) - - def _save_bin_packages(self, db_name): - """ - Save binary package data - - Args: - db_name:Name database - Returns: - - Raises: - - """ - self.sql = " select * from packages " - bin_packaegs = self.__get_data() - if bin_packaegs is None: - raise ContentNoneException( - "{db_name}:There is no relevant data in the provided " - "binary package ".format(db_name=db_name)) - for index, bin_package_item in enumerate(bin_packaegs): - try: - src_package_name = bin_package_item.get('rpm_sourcerpm').split( - 
'-' + bin_package_item.get('version'))[0] - except AttributeError as exception_msg: - src_package_name = None - LOGGER.logger.warning(exception_msg) - finally: - bin_packaegs[index]['src_name'] = src_package_name - - with DBHelper(db_name=db_name) as database: - database.batch_add(bin_packaegs, BinPack) - - def _save_bin_requires(self, db_name): - """ - Save the dependent package data of the binary package - - Args: - db_name:Name database - Returns: - - Raises: - - """ - self.sql = " select * from requires " - requires_datas = self.__get_data() - if requires_datas is None: - raise ContentNoneException( - "{db_name}:There is no relevant data in the provided binary " - "dependency package".format(db_name=db_name)) - - with DBHelper(db_name=db_name) as database: - database.batch_add(requires_datas, BinRequires) - - def _save_bin_provides(self, db_name): - """ - Save the component data provided by the binary package - - Args: - db_name:Name database - Returns: - - Raises: - - """ - self.sql = " select * from provides " - provides_datas = self.__get_data() - if provides_datas is None: - raise ContentNoneException( - "{db_name}:There is no relevant data in the provided " - "binary component ".format(db_name=db_name)) - - with DBHelper(db_name=db_name) as database: - database.batch_add(provides_datas, BinProvides) - - def _save_bin_files(self, db_name): - - self.sql = " select * from files " - files_datas = self.__get_data() - if files_datas is None: - raise ContentNoneException( - "{db_name}:There is no relevant binary file installation " - "path data in the provided database ".format(db_name=db_name)) - - with DBHelper(db_name=db_name) as database: - database.batch_add(files_datas, BinFiles) - - def __exists_repeat_database(self): - """ - Determine if the same database name exists - - Returns: - True if there are duplicate databases, false otherwise - Raises: - - """ - db_names = [name.get('dbname') - for name in self.config_file_datas] - - if len(set(db_names)) != 
len(self.config_file_datas): - return True - - return False - - @staticmethod - def __update_database_info(database_content): - """ - Update the database_name table - - Args: - database_content: - Dictionary of database names and priorities - Returns: - - """ - try: - with DBHelper(db_name="lifecycle") as database_name: - name = database_content.get("database_name") - priority = database_content.get("priority") - database_name.add(DatabaseInfo( - name=name, priority=priority - )) - database_name.session.commit() - except (SQLAlchemyError, Error, AttributeError) as error: - LOGGER.logger.error(error) - - @staticmethod - def __clear_database_info(): - """ - Delete the tables in the lifecycle except for the specific three tables - Returns: - - """ - try: - with DBHelper(db_name="lifecycle") as database_name: - clear_sql = """delete from databases_info;""" - database_name.session.execute(text(clear_sql)) - table_list = database_name.engine.table_names() - for item in table_list: - if item not in ['packages_maintainer', 'databases_info', 'packages_issue']: - drop_sql = '''DROP TABLE if exists `{table_name}`'''.format( - table_name=item) - database_name.session.execute(text(drop_sql)) - database_name.session.commit() - except SQLAlchemyError as sql_error: - LOGGER.logger.error(sql_error) - return False - else: - return True - - def __clear_database(self): - """ - Delete database - Returns: - - """ - try: - with DBHelper(db_name='lifecycle') as data_name: - name_data_list = data_name.session.query( - DatabaseInfo.name).order_by(DatabaseInfo.priority).all() - name_list = [name[0] for name in name_data_list if name[0]] - for item in name_list: - self.__del_database(item) - except (SQLAlchemyError, Error, IOError) as error: - LOGGER.logger.error(error) - return False - else: - return True - - def delete_db(self, db_name): - """ - delete the database - - Args: - db_name: The name of the database - Returns: - - Raises: - IOError: File or network operation io abnormal - """ - 
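The table pruning performed by `__clear_database_info` — drop every table in the lifecycle database except a small keep-list — can be sketched with plain `sqlite3`. The helper name is illustrative and `DBHelper` is project-specific; the keep-list is the one hard-coded above:

```python
import sqlite3

# The three tables __clear_database_info preserves in the lifecycle database
KEEP_TABLES = {'packages_maintainer', 'databases_info', 'packages_issue'}


def drop_tables_except(db_file, keep=KEEP_TABLES):
    """Drop every table in the sqlite file except those in `keep`,
    mirroring the pruning done by __clear_database_info."""
    with sqlite3.connect(db_file) as conn:
        # Materialize the name list first so the DROPs do not run
        # against a cursor that is still being iterated
        names = [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        for name in names:
            if name not in keep:
                conn.execute('DROP TABLE IF EXISTS "{0}"'.format(name))
```

Keeping `packages_maintainer` and `packages_issue` across re-initialization is what preserves manually maintained lifecycle data while the per-release package tables are rebuilt.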
try: - del_result = True - with DBHelper(db_name='lifecycle') as database_name: - database_name.session.query(DatabaseInfo).filter( - DatabaseInfo.name == db_name).delete() - drop_sql = '''DROP TABLE if exists `{table_name}`''' \ - .format(table_name=db_name) - database_name.session.execute(text(drop_sql)) - database_name.session.commit() - except SQLAlchemyError as sql_error: - LOGGER.logger.error(sql_error) - del_result = False - if del_result: - del_result = self.__del_database(db_name) - return del_result - - def create_database(self, db_name, tables=None, storage=True): - """ - Create databases and tables related to the package life cycle - - Args: - db_name: The name of the database - tables: Table to be created - """ - _create_database_result = self._create_database( - db_name, tables, storage) - return _create_database_result - - -class MysqlDatabaseOperations(): - """ - Related to database operations, creating databases, creating tables - - Attributes: - db_name: The name of the database - create_database_sql: SQL statement to create a database - drop_database_sql: Delete the SQL statement of the database - """ - - def __init__(self, db_name, tables=None, storage=False): - """ - Class instance initialization - - Args: - db_name: Database name - """ - self.db_name = db_name - self.create_database_sql = ''' CREATE DATABASE if not exists `{db_name}` \ - DEFAULT CHARACTER SET utf8mb4; '''.format(db_name=self.db_name) - self.drop_database_sql = '''drop DATABASE if exists `{db_name}` '''.format( - db_name=self.db_name) - self.tables = tables - self.storage = storage - - def create_database(self, init_db): - """ - create a mysql database - - Returns: - True if successful, otherwise false - Raises: - SQLAlchemyError: An exception occurred while creating the database - """ - _create_success = True - if isinstance(init_db, InitDataBase): - init_db.database_name = self.db_name - with DBHelper(db_name='mysql') as data_base: - - try: - # create database - if not 
self.storage: - data_base.session.execute(self.drop_database_sql) - data_base.session.execute(self.create_database_sql) - except InternalError as internal_error: - LOGGER.logger.info(internal_error) - except SQLAlchemyError as exception_msg: - LOGGER.logger.error(exception_msg) - return False - if self.tables: - _create_success = self.__create_tables() - return _create_success - - def drop_database(self): - """ - Delete the database according to the specified name - - Args: - db_name: The name of the database to be deleted - Returns: - True if successful, otherwise false - Raises: - SQLAlchemyError: An exception occurred while creating the database - """ - if self.db_name is None: - raise IOError( - "The name of the database to be deleted cannot be empty") - with DBHelper(db_name='mysql') as data_base: - drop_database = ''' drop DATABASE if exists `{db_name}` '''.format( - db_name=self.db_name) - try: - data_base.session.execute(drop_database) - except SQLAlchemyError as exception_msg: - LOGGER.logger.error(exception_msg) - return False - else: - return True - - def __create_tables(self): - """ - Create the specified data table - - Returns: - True if successful, otherwise false - Raises: - SQLAlchemyError: An exception occurred while creating the database - """ - try: - with DBHelper(db_name=self.db_name) as database: - if self.tables: - _tables = list(set(self.tables).difference( - set(database.engine.table_names()))) - database.create_table(_tables) - - except SQLAlchemyError as exception_msg: - LOGGER.logger.error(exception_msg) - return False - else: - return True - - -class SqliteDatabaseOperations(): - """ - sqlite database related operations - - Attributes: - db_name: Name database - database_file_folder: Database folder path - """ - - def __init__(self, db_name, tables=None, storage=False, ** kwargs): - """ - Class instance initialization - - Args: - db_name: Database name - kwargs: data related to configuration file nodes - """ - self.db_name = db_name - 
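`SqliteDatabaseOperations` resolves the storage folder from `configuration.DATABASE_FOLDER_PATH` (per the config, `/var/run/pkgship_dbs`), creates it on demand in `_make_folder`, and appends `.db` to the database name when building the file path. A minimal sketch of that path handling under those assumptions (the function name is illustrative):

```python
import os


def ensure_db_path(folder, db_name):
    """Create the storage folder if it is missing and return the
    full sqlite file path <folder>/<db_name>.db, mirroring
    _make_folder plus the path join in create_database."""
    os.makedirs(folder, exist_ok=True)
    return os.path.join(folder, db_name + '.db')
```

Note the class itself swallows the `IOError` from `makedirs` and sets the folder to `None`, which `create_database` later reports as `FileNotFoundError`; the sketch simply lets the exception propagate.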
self.database_file_folder = configuration.DATABASE_FOLDER_PATH - if hasattr(kwargs, 'database_path'): - self.database_file_folder = kwargs.get('database_path') - self._make_folder() - self.tables = tables - self.storage = storage - - def _make_folder(self): - """ - Create a folder to hold the database - - Raises: - IOError: File or network operation io abnormal - """ - if not os.path.exists(self.database_file_folder): - try: - os.makedirs(self.database_file_folder) - except IOError as makedirs_error: - LOGGER.logger.error(makedirs_error) - self.database_file_folder = None - - def create_database(self, init_db): - """ - create sqlite database and table - - Returns: - After successful generation, return the database file address, - otherwise return none - Raises: - FileNotFoundError: The specified folder path does not exist - SQLAlchemyError: An error occurred while generating the database - """ - _create_success = False - if self.database_file_folder is None: - raise FileNotFoundError('Database folder does not exist') - - _db_file = os.path.join( - self.database_file_folder, self.db_name) - - if not self.storage and os.path.exists(_db_file + '.db'): - os.remove(_db_file + '.db') - - # create a sqlite database - with DBHelper(db_name=_db_file) as database: - try: - if self.tables: - _tables = list(set(self.tables).difference( - set(database.engine.table_names()))) - database.create_table(_tables) - except (SQLAlchemyError, InternalError) as create_table_err: - LOGGER.logger.error(create_table_err) - return _create_success - if isinstance(init_db, InitDataBase): - init_db.database_name = _db_file - _create_success = True - return _create_success - - def drop_database(self): - """ - Delete the specified sqlite database - - Returns: - Return true after successful deletion, otherwise return false - Raises: - IOError: An io exception occurred while deleting the specified database file - """ - try: - db_path = os.path.join( - self.database_file_folder, self.db_name + 
'.db') - if os.path.exists(db_path): - os.remove(db_path) - except IOError as exception_msg: - LOGGER.logger.error(exception_msg) - return False - else: - return True diff --git a/packageship/packageship/application/models/__init__.py b/packageship/packageship/application/models/__init__.py deleted file mode 100644 index 79752094b1769eedae0b2069dc8203a17e61c7cb..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/models/__init__.py +++ /dev/null @@ -1,4 +0,0 @@ -#!/usr/bin/python3 -""" -Entity mapping model of database -""" diff --git a/packageship/packageship/application/models/package.py b/packageship/packageship/application/models/package.py deleted file mode 100644 index 2d9d20470db112e7d063dd1660b76406bdb3b6bc..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/models/package.py +++ /dev/null @@ -1,214 +0,0 @@ -#!/usr/bin/python3 -""" -Description: Database entity model mapping -""" -import uuid -from sqlalchemy import Column, Integer, String, Text -from packageship.libs.dbutils.sqlalchemy_helper import DBHelper - - -class SrcPack(DBHelper.BASE): - """ - Source package model - """ - - __tablename__ = 'src_pack' - - pkgKey = Column(Integer, primary_key=True) - pkgId = Column(String(500), nullable=True) - name = Column(String(200), nullable=True) - arch = Column(String(200), nullable=True) - version = Column(String(500), nullable=True) - epoch = Column(String(200), nullable=True) - release = Column(String(500), nullable=True) - summary = Column(String(500), nullable=True) - description = Column(String(500), nullable=True) - url = Column(String(500), nullable=True) - time_file = Column(Integer, nullable=True) - time_build = Column(Integer, nullable=True) - rpm_license = Column(String(500), nullable=True) - rpm_vendor = Column(String(500), nullable=True) - rpm_group = Column(String(500), nullable=True) - rpm_buildhost = Column(String(500), nullable=True) - rpm_sourcerpm = Column(String(500), 
nullable=True) - rpm_header_start = Column(Integer, nullable=True) - rpm_header_end = Column(Integer, nullable=True) - rpm_packager = Column(String(500), nullable=True) - size_package = Column(Integer, nullable=True) - size_installed = Column(Integer, nullable=True) - size_archive = Column(Integer, nullable=True) - location_href = Column(String(500), nullable=True) - location_base = Column(String(500), nullable=True) - checksum_type = Column(String(500), nullable=True) - maintaniner = Column(String(100), nullable=True) - maintainlevel = Column(String(100), nullable=True) - src_name = Column(String(100), nullable=True) - - -class BinPack(DBHelper.BASE): - """ - Description: functional description:Binary package data - """ - __tablename__ = 'bin_pack' - - pkgKey = Column(Integer, primary_key=True) - pkgId = Column(String(500), nullable=True) - name = Column(String(500), nullable=True) - arch = Column(String(500), nullable=True) - version = Column(String(500), nullable=True) - epoch = Column(String(500), nullable=True) - release = Column(String(500), nullable=True) - summary = Column(String(500), nullable=True) - description = Column(String(500), nullable=True) - url = Column(String(500), nullable=True) - time_file = Column(Integer, nullable=True) - time_build = Column(Integer, nullable=True) - rpm_license = Column(String(500), nullable=True) - rpm_vendor = Column(String(500), nullable=True) - rpm_group = Column(String(500), nullable=True) - rpm_buildhost = Column(String(500), nullable=True) - rpm_sourcerpm = Column(String(500), nullable=True) - rpm_header_start = Column(Integer, nullable=True) - rpm_header_end = Column(Integer, nullable=True) - rpm_packager = Column(String(500), nullable=True) - size_package = Column(Integer, nullable=True) - size_installed = Column(Integer, nullable=True) - size_archive = Column(Integer, nullable=True) - location_href = Column(String(500), nullable=True) - location_base = Column(String(500), nullable=True) - checksum_type = 
Column(String(500), nullable=True) - src_name = Column(String(500), nullable=True) - - -class BinRequires(DBHelper.BASE): - """ - Binary package dependent package entity model - """ - - __tablename__ = 'bin_requires' - - id = Column(Integer, primary_key=True) - name = Column(String(200), nullable=True) - flags = Column(String(200), nullable=True) - epoch = Column(String(200), nullable=True) - version = Column(String(500), nullable=True) - release = Column(String(200), nullable=True) - pkgKey = Column(Integer, nullable=True) - pre = Column(String(20), nullable=True) - - -class SrcRequires(DBHelper.BASE): - """ - Source entity package dependent package entity model - """ - __tablename__ = 'src_requires' - - id = Column(Integer, primary_key=True) - name = Column(String(200), nullable=True) - flags = Column(String(200), nullable=True) - epoch = Column(String(200), nullable=True) - version = Column(String(500), nullable=True) - release = Column(String(200), nullable=True) - pkgKey = Column(Integer, nullable=True) - pre = Column(String(20), nullable=True) - - -class BinProvides(DBHelper.BASE): - """ - Component entity model provided by binary package - """ - __tablename__ = 'bin_provides' - - id = Column(Integer, primary_key=True) - name = Column(String(200), nullable=True) - flags = Column(String(200), nullable=True) - epoch = Column(String(200), nullable=True) - version = Column(String(500), nullable=True) - release = Column(String(200), nullable=True) - pkgKey = Column(Integer, nullable=True) - - -class BinFiles(DBHelper.BASE): - """ - Installation path of the binary package - """ - __tablename__ = 'bin_files' - id = Column(Integer, primary_key=True) - name = Column(String(500), nullable=True) - type = Column(String(50), nullable=True) - pkgKey = Column(Integer) - - -class Packages(): - """ - Source code package version, issuer and other information - """ - __table_args__ = {'extend_existing': True} - id = Column(Integer, primary_key=True) - name = Column(String(500), 
nullable=True) - url = Column(String(500), nullable=True) - rpm_license = Column(String(500), nullable=True) - version = Column(String(200), nullable=True) - release = Column(String(200), nullable=True) - release_time = Column(String(50), nullable=True) - used_time = Column(Integer, default=0) - latest_version = Column(String(200), nullable=True) - latest_version_time = Column(String(50), nullable=True) - feature = Column(Integer, default=0) - cve = Column(Integer, default=0) - defect = Column(Integer, default=0) - maintainer = Column(String(200), nullable=True) - maintainlevel = Column(Integer, nullable=True) - version_control = Column(String(50), nullable=True) - src_repo = Column(String(500), nullable=True) - tag_prefix = Column(String(20), nullable=True) - summary = Column(String(500), nullable=True) - description = Column(String(500), nullable=True) - - @classmethod - def package_meta(cls, table_name): - """ - Dynamically generate different classes through metaclasses - """ - _uuid = str(uuid.uuid1()) - model = type(_uuid, (cls, DBHelper.BASE), { - '__tablename__': table_name}) - return model - - -class PackagesIssue(DBHelper.BASE): - """ - Source package issue - """ - __tablename__ = "packages_issue" - id = Column(Integer, primary_key=True) - issue_id = Column(String(50), nullable=True) - issue_url = Column(String(500), nullable=True) - issue_content = Column(Text, nullable=True) - issue_title = Column(String(1000), nullable=True) - issue_status = Column(String(20), nullable=True) - pkg_name = Column(String(500), nullable=False) - issue_download = Column(String(500), nullable=False) - issue_type = Column(String(50), nullable=True) - related_release = Column(String(500), nullable=True) - - -class PackagesMaintainer(DBHelper.BASE): - """ - Correspondence between source code package and maintainer - """ - __tablename__ = 'packages_maintainer' - id = Column(Integer, primary_key=True) - name = Column(String(200), nullable=True) - maintainer = Column(String(200), 
nullable=True) - maintainlevel = Column(Integer, nullable=True) - - -class DatabaseInfo(DBHelper.BASE): - """ - Save the name and priority of the database - """ - __tablename__ = 'databases_info' - id = Column(Integer, primary_key=True) - name = Column(String(200), nullable=True) - priority = Column(Integer, nullable=True) diff --git a/packageship/packageship/application/settings.py b/packageship/packageship/application/settings.py deleted file mode 100644 index c10ed09805bcff1967e56a3bb741bb1caf76172e..0000000000000000000000000000000000000000 --- a/packageship/packageship/application/settings.py +++ /dev/null @@ -1,51 +0,0 @@ -#!/usr/bin/python3 -""" -Description: Basic configuration of flask framework -""" -import random -from packageship.libs.conf import configuration - - -class Config(): - """ - Description: Configuration items in a formal environment - Attributes: - _read_config: read config - _set_config_val: Set the value of the configuration item - """ - SECRET_KEY = None - - DEBUG = False - - LOG_LEVEL = 'INFO' - - SCHEDULER_API_ENABLED = True - - def __init__(self): - - self.set_config_val() - - @classmethod - def _random_secret_key(cls, random_len=32): - """ - Description: Generate random strings - """ - cls.SECRET_KEY = ''.join( - [random.choice('abcdefghijklmnopqrstuvwxyz!@#$%^&*()') for index in range(random_len)]) - - @classmethod - def _set_log_level(cls, log_level): - """ - Description: Set the log level - """ - cls.LOG_LEVEL = log_level - - def set_config_val(self): - """ - Description: Set the value of the configuration item - Args: - Returns: - Raises: - """ - Config._random_secret_key() - Config._set_log_level(configuration.LOG_LEVEL) diff --git a/packageship/packageship/libs/__init__.py b/packageship/packageship/libs/__init__.py deleted file mode 100644 index f4f5866b119aefc85e00b88ef42f9bb9b5d5103c..0000000000000000000000000000000000000000 --- a/packageship/packageship/libs/__init__.py +++ /dev/null @@ -1,4 +0,0 @@ -#!/usr/bin/python3 -""" 
-Encapsulation of public class methods -""" diff --git a/packageship/packageship/libs/conf/__init__.py b/packageship/packageship/libs/conf/__init__.py deleted file mode 100755 index fbf34b133712ef90694cd7e807ffecff9529b678..0000000000000000000000000000000000000000 --- a/packageship/packageship/libs/conf/__init__.py +++ /dev/null @@ -1,124 +0,0 @@ -#!/usr/bin/python3 -# ****************************************************************************** -# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved. -# licensed under the Mulan PSL v2. -# You can use this software according to the terms and conditions of the Mulan PSL v2. -# You may obtain a copy of Mulan PSL v2 at: -# http://license.coscl.org.cn/MulanPSL2 -# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR -# PURPOSE. -# See the Mulan PSL v2 for more details. -# ******************************************************************************/ -""" -System configuration file and default configuration file integration -""" -import os -import configparser - -from packageship.libs.exception import Error -from . import global_config - - -USER_SETTINGS_FILE_PATH = 'SETTINGS_FILE_PATH' - - -class PreloadingSettings(): - """ - The system default configuration file and the configuration - file changed by the user are lazily loaded. 
- """ - _setting_container = None - - def _preloading(self): - """ - Load the default configuration in the system and the related configuration - of the user, and overwrite the default configuration items of the system - with the user's configuration data - """ - settings_file = os.environ.get(USER_SETTINGS_FILE_PATH) - if not settings_file: - raise Error( - "The system does not specify the user configuration" - "that needs to be loaded:" % USER_SETTINGS_FILE_PATH) - - self._setting_container = Configs(settings_file) - - def __getattr__(self, name): - """ - Return the value of a setting and cache it in self.__dict__ - """ - if self._setting_container is None: - self._preloading() - value = getattr(self._setting_container, name, None) - self.__dict__[name] = value - return value - - def __setattr__(self, name, value): - """ - Set the configured value and re-copy the value cached in __dict__ - """ - if name is None: - raise KeyError("The set configuration key value cannot be empty") - if name == '_setting_container': - self.__dict__.clear() - self.__dict__["_setting_container"] = value - else: - self.__dict__.pop(name, None) - if self._setting_container is None: - self._preloading() - setattr(self._setting_container, name, value) - - def __delattr__(self, name): - """ - Delete a setting and clear it from cache if needed - """ - if name is None: - raise KeyError("The set configuration key value cannot be empty") - - if self._setting_container is None: - self._preloading() - delattr(self._setting_container, name) - self.__dict__.pop(name, None) - - @property - def config_ready(self): - """ - Return True if the settings have already been configured - """ - return self._setting_container is not None - - -class Configs(): - """ - The system's default configuration items and the user's - configuration items are integrated - """ - - def __init__(self, settings_file): - for config in dir(global_config): - if not config.startswith('_'): - setattr(self, config, 
getattr(global_config, config)) - - # Load user's configuration - self._conf_parser = configparser.ConfigParser() - self._conf_parser.read(settings_file) - - for section in self._conf_parser.sections(): - for option in self._conf_parser.items(section): - try: - _config_value = option[1] - _key = option[0] - except IndexError: - pass - else: - if not _config_value: - continue - if _config_value.isdigit(): - _config_value = int(_config_value) - elif _config_value.lower() in ('true', 'false'): - _config_value = bool(_config_value) - setattr(self, _key.upper(), _config_value) - - -configuration = PreloadingSettings() diff --git a/packageship/packageship/libs/conf/global_config.py b/packageship/packageship/libs/conf/global_config.py deleted file mode 100755 index 0b433a4f73b3b413c3f69f32157708b58fe87a09..0000000000000000000000000000000000000000 --- a/packageship/packageship/libs/conf/global_config.py +++ /dev/null @@ -1,110 +0,0 @@ -#!/usr/bin/python3 -# ****************************************************************************** -# Copyright (c) Huawei Technologies Co., Ltd. 2020-2020. All rights reserved. -# licensed under the Mulan PSL v2. -# You can use this software according to the terms and conditions of the Mulan PSL v2. -# You may obtain a copy of Mulan PSL v2 at: -# http://license.coscl.org.cn/MulanPSL2 -# THIS SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO NON-INFRINGEMENT, MERCHANTABILITY OR FIT FOR A PARTICULAR -# PURPOSE. -# See the Mulan PSL v2 for more details. 
-# ******************************************************************************/
-"""
-Global environment variable values when the system is running
-"""
-
-import os
-
-# Configuration file path for data initialization
-INIT_CONF_PATH = os.path.join('/', 'etc', 'pkgship', 'conf.yaml')
-
-# If the path of the imported database is not specified in the
-# configuration file, this system default is used
-DATABASE_FOLDER_PATH = os.path.join('/', 'var', 'run', 'pkgship_dbs')
-
-# The database engine used by the system; sqlite by default
-DATABASE_ENGINE_TYPE = 'sqlite'
-
-# Port managed by the administrator, with write permission
-WRITE_PORT = 8080
-
-# Ordinary user query port, with permission only to query data, not to write it
-QUERY_PORT = 8090
-
-# IP address with write permission
-WRITE_IP_ADDR = '127.0.0.1'
-
-# IP address with permission to query data
-QUERY_IP_ADDR = '127.0.0.1'
-
-# The address of the remote service; the command line can directly
-# call the remote service to complete a data request
-REMOTE_HOST = 'https://api.openeuler.org/pkgmanage'
-
-# If no log directory is configured, logs are stored in the
-# following directory specified by the system by default
-LOG_PATH = os.path.join('/', 'var', 'log', 'pkgship')
-
-# Logging level
-# The log level option can only take one of the following values:
-# INFO DEBUG WARNING ERROR CRITICAL
-LOG_LEVEL = 'INFO'
-
-# Logging file name
-LOG_NAME = 'log_info.log'
-
-# The number of rotated log files kept after a log file reaches
-# its size limit; 10 by default
-BACKUP_COUNT = 10
-
-# The maximum size of each log file, in bytes; a single log file is 300 MB by default
-MAX_BYTES = 314572800
-
-# Execution frequency and switch of the timed task.
-# When set to True, maintainer and related information is updated in batch
-# when the scheduled task runs; when set to False, the scheduled task does
-# not update information such as maintainers and maintenance levels
-
-OPEN = True
-
-# The hour of the day at which the timed task is executed.
-# Every day is one cycle, and the value can only be an integer between 0 and 23
-HOUR = 3
-
-# The minute at which the recurring timed task starts.
-# The value of this configuration item is an integer between 0 and 59
-MINUTE = 0
-
-# Configuration used during the life cycle for acquiring tag, issue and other information
-# The yaml file of each package is stored at the remote address, which can be
-# a remote repository address or the address of a static resource service
-WAREHOUSE_REMOTE = 'https://gitee.com/openeuler/openEuler-Advisor/raw/master/upstream-info/'
-
-# Timed tasks can be executed with multiple threads; the number of threads
-# in the thread pool can be set according to the configuration of the server
-
-POOL_WORKERS = 10
-
-# The owner of the repository.
-# When this value is not set, the system defaults to src-openeuler
-WAREHOUSE = 'src-openeuler'
-
-# The address of the Redis cache server; either a published
-# domain or an IP address that can be accessed normally
-# The address defaults to 127.0.0.1
-# redis_host = 127.0.0.1
-
-REDIS_HOST = '127.0.0.1'
-
-# Redis cache server port number; 6379 by default
-REDIS_PORT = 6379
-
-# Maximum number of simultaneous connections allowed by the Redis server
-
-REDIS_MAX_CONNECTIONS = 10
-
-# Maximum queue length
-QUEUE_MAXSIZE = 1000
diff --git a/packageship/packageship/libs/configutils/__init__.py b/packageship/packageship/libs/configutils/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/packageship/packageship/libs/configutils/readconfig.py b/packageship/packageship/libs/configutils/readconfig.py
deleted file mode 100644
index 6a3a2aa91a41367049e417dc7431aede3e3a1251..0000000000000000000000000000000000000000
--- a/packageship/packageship/libs/configutils/readconfig.py
+++ /dev/null
@@ -1,78 +0,0 @@
-#!/usr/bin/python3
-"""
-    Description: Base class for reading the configuration file in the system,
-        which mainly supports obtaining the value of a specific node
-        as well as the value of an arbitrary node
-    Class: ReadConfig
-"""
-import configparser
-from configparser import NoSectionError
-from configparser import NoOptionError, Error
-
-
-class ReadConfig():
-    """
-    Description: Base class for reading the configuration file in the system
-    Attributes:
-        conf: Configuration parser for the system
-        conf.read: Read the system configuration file
-    """
-
-    def __init__(self, conf_path):
-        self.conf = configparser.ConfigParser()
-        if conf_path is None:
-            raise Error('The path of the configuration file does not exist: %s'
-                        % conf_path)
-        self.conf.read(conf_path)
-
-    def get_system(self, param):
-        """
-        Description: Get any data value under the SYSTEM configuration node
-        Args:
-            param: The node parameter that needs to be obtained
-        Returns:
-            The configured value, or None if the section or option is missing
-        Raises:
-        """
-        if param:
-            try:
-                return self.conf.get("SYSTEM", param)
-            except (NoSectionError, NoOptionError):
-                return None
-        return None
-
-    def get_database(self, param):
-        """
-        Description: Get any data value under the DATABASE configuration node
-        Args:
-            param: The node parameter that needs to be obtained
-        Returns:
-            The configured value, or None if the section or option is missing
-        Raises:
-        """
-        if param:
-            try:
-                return self.conf.get("DATABASE", param)
-            except (NoSectionError, NoOptionError):
-                return None
-        return None
-
-    def get_config(self, node, param):
-        """
-        Description: Get configuration data under any node
-        Args:
-            node: The node
-            param: The node parameter that needs to be obtained
-        Returns:
-            The configured value, or None if the section or option is missing
-        Raises:
-        """
-        if all([node, param]):
-            try:
-                return self.conf.get(node, param)
-            except (NoSectionError, NoOptionError):
-                return None
-        return None
diff --git a/packageship/packageship/libs/dbutils/__init__.py b/packageship/packageship/libs/dbutils/__init__.py
deleted file mode 100644
index 78ac1617112465680f720d650e779adc48937b22..0000000000000000000000000000000000000000
--- a/packageship/packageship/libs/dbutils/__init__.py
+++ /dev/null
@@ -1,7 +0,0 @@
-#!/usr/bin/python3
-"""
-Public classes and methods for database access
-"""
-from .sqlalchemy_helper import DBHelper
-
-__all__ = ['DBHelper']
diff --git a/packageship/packageship/libs/dbutils/sqlalchemy_helper.py b/packageship/packageship/libs/dbutils/sqlalchemy_helper.py
deleted file mode 100644
index 26c582263f34103b17ac38256a4521cc0d4699f5..0000000000000000000000000000000000000000
--- a/packageship/packageship/libs/dbutils/sqlalchemy_helper.py
+++ /dev/null
@@ -1,308 +0,0 @@
-#!/usr/bin/python3
-"""
-Description: Simple encapsulation of the SQLAlchemy ORM framework for database operations
-Class: DBHelper
-"""
-import os
-from sqlalchemy import create_engine
-from sqlalchemy import MetaData
-from sqlalchemy.orm import sessionmaker
-from sqlalchemy.exc import SQLAlchemyError
-from sqlalchemy.exc import DisconnectionError
-from sqlalchemy.exc import OperationalError
-from sqlalchemy.ext.declarative import declarative_base
-from sqlalchemy.engine.url import URL
-from packageship.libs.exception.ext import Error
-from packageship.libs.exception.ext import DbnameNoneException
-from packageship.libs.exception.ext import ContentNoneException
-from packageship.libs.conf import configuration
-
-
-class BaseHelper():
-    """
-    Description: Base class for data manipulation
-    """
-
-    def __init__(self):
-        self.engine = None
-
-
-class MysqlHelper(BaseHelper):
-    """
-    Description: MySQL database connection related operations
-    Attributes:
-        user_name: Database connection username
-        password: Database connection password
-        host: Remote server address
-        port: Port
-        database: Name of the database operated on
-        connection_type: Database connection type
-    """
-
-    def __init__(self, user_name=None, password=None, host=None,  # pylint: disable=unused-argument
-                 port=None, database=None, **kwargs):
-        super(MysqlHelper, self).__init__()
-        self.user_name = user_name or configuration.USER_NAME
-        self.password = password or configuration.PASSWORD
-        self.host = host or configuration.HOST
-        self.port = port or configuration.PORT
-        self.database = database or configuration.DATABASE
-        self.connection_type = 'mysql+pymysql'
-
-    def create_database_engine(self):
-        """
-        Description: Create a database connection object
-        Args:
-
-        Returns:
-        Raises:
-            DisconnectionError: The connection configuration is incomplete,
-                so no DB-API connection can be established
-        """
-        if not all([self.user_name, self.password, self.host, self.port, self.database]):
-            raise DisconnectionError(
-                'Incomplete MySQL connection configuration; '
-                'a DB-API connection cannot be established')
-        # create the connection object
-        self.engine = create_engine(URL(**{'database': self.database,
-                                           'username': self.user_name,
-                                           'password': self.password,
-                                           'host': self.host,
-                                           'port': self.port,
-                                           'drivername': self.connection_type}),
-                                    encoding='utf-8',
-                                    convert_unicode=True)
-
-
-class SqliteHlper(BaseHelper):
-    """
-    Description: SQLite database connection related operations
-    Attributes:
-        connection_type: Database connection type
-        database: Name of the database operated on
-    """
-
-    def __init__(self, database, **kwargs):
-        super(SqliteHlper, self).__init__()
-        self.connection_type = 'sqlite:///'
-        if 'complete_route_db' in kwargs.keys():
-            self.database = database
-        else:
-            self.database = self._database_file_path(database)
-
-    def _database_file_path(self, database):
-        """
-        Description: Resolve the path where the sqlite database file is stored
-        Args:
-
-        Returns:
-        Raises:
-
-        """
-        _database_folder_path = configuration.DATABASE_FOLDER_PATH
-        try:
-            if not os.path.exists(_database_folder_path):
-                os.makedirs(_database_folder_path)
-        except IOError:
-            pass
-        return os.path.join(_database_folder_path, database + '.db')
-
-    def create_database_engine(self):
-        """
-        Description: Create a database connection object
-        Args:
-
-        Returns:
-        Raises:
-            DbnameNoneException: The name of the connected database is empty
-
-        """
-        if not self.database:
-            raise DbnameNoneException(
-                'The connected database name is empty')
-        self.engine = create_engine(
-            self.connection_type + self.database, encoding='utf-8', convert_unicode=True,
-            connect_args={'check_same_thread': False})
-
-
-class DBHelper(BaseHelper):
-    """
-    Description: Public class for database connection and operation
-    Attributes:
-        user_name: Username
-        password: Password
-        host: Remote server address
-        port: Port
-        db_name: Database name
-        connection_type: Database type
-        session: Session
-    """
-    # The base class inherited by the data models
-    BASE = declarative_base()
-    ENGINE_CONTAINER = dict()
-
-    def __init__(self, user_name=None, password=None, host=None,  # pylint: disable=R0913
-                 port=None, db_name=None, connection_type=None, **kwargs):
-        """
-        Description: Class instance initialization
-
-        """
-        super(DBHelper, self).__init__()
-        self._database_engine = {
-            'mysql': MysqlHelper,
-            'sqlite': SqliteHlper
-        }
-        if connection_type is None:
-            connection_type = configuration.DATABASE_ENGINE_TYPE
-        self._engine_pool = connection_type + '_' + db_name
-        _database_engine = self._database_engine.get(connection_type)
-        if 'complete_route_db' in kwargs:
-            _database_engine = SqliteHlper
-        if _database_engine is None:
-            raise DisconnectionError(
-                'Database engine connection failed: '
-                'the %s database is not currently supported' % connection_type)
-        _engine = self.ENGINE_CONTAINER.get(self._engine_pool)
-        if _engine:
-            self.engine = _engine
-        else:
-            _engine = _database_engine(user_name=user_name, password=password,
-                                       host=host, port=port, database=db_name, **kwargs)
-            _engine.create_database_engine()
-            self.engine = _engine.engine
-            self.ENGINE_CONTAINER[self._engine_pool] = self.engine
-        self.session = None
-
-    def create_engine(self):
-        """
-        Create the related database engine connection
-        """
-        session = sessionmaker()
-        try:
-            session.configure(bind=self.engine)
-        except DisconnectionError:
-            self.ENGINE_CONTAINER.pop(self._engine_pool)
-        else:
-            self.session = session()
-        return self
-
-    def __enter__(self):
-        """
-        Description: Create a context manager for the database connection
-        Args:
-
-        Returns:
-            Class instance
-        Raises:
-
-        """
-
-        database_engine = self.create_engine()
-        return database_engine
-
-    def __exit__(self, exc_type, exc_val, exc_tb):
-        """
-        Description: Release the database connection pool
-            and close the connection
-        Args:
-            exc_type: Exception type
-            exc_val: Exception value
-            exc_tb: Exception traceback
-        Returns:
-
-        Raises:
-
-        """
-        if exc_type is not None and issubclass(exc_type, AttributeError):
-            raise SQLAlchemyError(exc_val)
-        self.session.close()
-
-    @classmethod
-    def create_all(cls, db_name=None):
-        """
-        Description: Create all database tables
-        Args:
-            db_name: Database name
-        Returns:
-
-        Raises:
-
-        """
-
-        cls.BASE.metadata.create_all(bind=cls(db_name=db_name).engine)
-
-    def create_table(self, tables):
-        """
-        Description: Create individual tables
-        Args:
-            tables: Tables to create
-        Returns:
-
-        Raises:
-        """
-        meta = MetaData(self.engine)
-        for table_name in DBHelper.BASE.metadata.tables.keys():
-            if table_name in tables:
-                table = DBHelper.BASE.metadata.tables[table_name]
-                table.metadata = meta
-                table.create()
-
-    def add(self, entity):
-        """
-        Description: Insert a single data entity
-        Args:
-            entity: Data entity
-        Return:
-            If the addition succeeds, return the corresponding entity, otherwise return None
-        Raises:
-            ContentNoneException: An exception raised when the content is None
-            SQLAlchemyError: An exception occurred while operating on the database
-        """
-
-        if entity is None:
-            raise ContentNoneException(
-                'The added entity content cannot be empty')
-        try:
-            self.session.add(entity)
-        except SQLAlchemyError as sql_error:
-            self.session.rollback()
-            if isinstance(sql_error, OperationalError):
-                raise sql_error
-            raise Error(sql_error)
-        else:
-            self.session.commit()
-        return entity
-
-    def batch_add(self, dicts, model):
-        """
-        Description: Add rows to database tables in bulk
-        Args:
-            dicts: Entity dictionary data to be added
-            model: Entity model class
-        Returns:
-
-        Raises:
-            TypeError: An exception raised when the incoming type does not meet expectations
-            SQLAlchemyError: An exception occurred while operating on the database
-        """
-
-        if model is None:
-            raise ContentNoneException('An entity model must be specified')
-
-        if not dicts:
-            raise ContentNoneException(
-                'The inserted data content cannot be empty')
-
-        if not isinstance(dicts, list):
-            raise TypeError(
-                "The input for bulk insertion must be a dictionary "
-                "list with the same fields as the current entity")
-        try:
-            self.session.execute(
-                model.__table__.insert(),
-                dicts
-            )
-        except SQLAlchemyError as sql_error:
-            self.session.rollback()
-            raise Error(sql_error)
-        else:
-            self.session.commit()
diff --git a/packageship/packageship/libs/exception/__init__.py b/packageship/packageship/libs/exception/__init__.py
deleted file mode 100644
index 1ad97626124834f8183edaca0a000a3009bf40a8..0000000000000000000000000000000000000000
--- a/packageship/packageship/libs/exception/__init__.py
+++ /dev/null
@@ -1,17 +0,0 @@
-#!/usr/bin/python3
-"""
-Customized exception information classes
-"""
-from packageship.libs.exception.ext import ContentNoneException
-from packageship.libs.exception.ext import DatabaseRepeatException
-from packageship.libs.exception.ext import DataMergeException
-from packageship.libs.exception.ext import Error
-from packageship.libs.exception.ext import DbnameNoneException
-from packageship.libs.exception.ext import ConfigurationException
-
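All of the custom exceptions re-exported above derive from `Error`, so a caller can catch the whole family with a single `except Error`. A minimal, self-contained sketch of that design (editorial; the class bodies below are trimmed stand-ins mirroring the hierarchy, not the original module):

```python
# Illustrative stand-ins for the exception hierarchy: Error is the common root.
class Error(Exception):
    """Root of the custom exception hierarchy."""
    def __init__(self, msg=''):
        self.message = msg
        Exception.__init__(self, msg)


class ContentNoneException(Error):
    """Raised when required content is empty."""
    def __init__(self, message):
        Error.__init__(self, 'No content: %r' % (message,))


try:
    raise ContentNoneException('entity')
except Error as err:  # the base class catches every custom exception
    caught = str(err)  # -> "No content: 'entity'"
```

Catching the root class is how the command-line entry points in this package report any internal failure uniformly.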
-__all__ = ['ContentNoneException',
-           'DatabaseRepeatException',
-           'DataMergeException',
-           'Error',
-           'DbnameNoneException',
-           'ConfigurationException']
diff --git a/packageship/packageship/libs/exception/ext.py b/packageship/packageship/libs/exception/ext.py
deleted file mode 100644
index dcea9ff7a71cc2e45242deee97b78517539f74b1..0000000000000000000000000000000000000000
--- a/packageship/packageship/libs/exception/ext.py
+++ /dev/null
@@ -1,73 +0,0 @@
-#!/usr/bin/python3
-"""
-Description: System exception information
-Class: Error, ContentNoneException, DbnameNoneException,
-    DatabaseRepeatException, DataMergeException
-"""
-
-
-class Error(Exception):
-
-    """
-    Description: Base class for custom exceptions in the system
-    Attributes:
-        message: Exception information
-    """
-
-    def __init__(self, msg=''):
-        self.message = msg
-        Exception.__init__(self, msg)
-
-    def __repr__(self):
-        return self.message
-
-    __str__ = __repr__
-
-
-class ContentNoneException(Error):
-    """
-    Description: Exception raised when content is empty
-    Attributes:
-    """
-
-    def __init__(self, message):
-        Error.__init__(self, 'No content: %r' % (message,))
-
-
-class DbnameNoneException(ContentNoneException):
-    """
-    Description: Exception raised when the database name is empty
-    Attributes:
-    """
-
-    def __init__(self, message):
-        ContentNoneException.__init__(self, '%r' % (message,))
-
-
-class DatabaseRepeatException(Error):
-    """
-    Description: Exception raised when there are duplicates in the database
-    Attributes:
-    """
-
-    def __init__(self, message):
-        Error.__init__(self, 'Database repeat: %r' % (message,))
-
-
-class DataMergeException(Error):
-    """
-    Description: Exception raised when integrating data fails
-    Attributes:
-    """
-
-    def __init__(self, message):
-        Error.__init__(self, 'DataMerge exception: %r' % (message,))
-
-
-class ConfigurationException(Error):
-    """
-    Description: Configuration file exception information
-    """
-
-    def __init__(self, message):
-        Error.__init__(self, 'Configuration exception: %r' % (message,))
diff --git a/packageship/packageship/libs/log/__init__.py b/packageship/packageship/libs/log/__init__.py
deleted file mode 100644
index 33cd95f042ffda33c5415a8a05c5e29689c2c629..0000000000000000000000000000000000000000
--- a/packageship/packageship/libs/log/__init__.py
+++ /dev/null
@@ -1,8 +0,0 @@
-#!/usr/bin/python3
-"""
-Common methods for logging
-"""
-from packageship.libs.log.loghelper import setup_log
-from packageship.libs.log.loghelper import Log, LOGGER
-
-__all__ = ['setup_log', 'Log', 'LOGGER']
diff --git a/packageship/packageship/libs/log/loghelper.py b/packageship/packageship/libs/log/loghelper.py
deleted file mode 100644
index 1d38b02f1b6f31d6efd8201e498b7fb414d21c08..0000000000000000000000000000000000000000
--- a/packageship/packageship/libs/log/loghelper.py
+++ /dev/null
@@ -1,132 +0,0 @@
-#!/usr/bin/python3
-"""
-Logging related helpers
-"""
-import os
-import threading
-import pathlib
-import logging
-from concurrent_log_handler import ConcurrentRotatingFileHandler
-from packageship.libs.conf import configuration
-
-
-def setup_log(config=None):
-    """
-    Set up logging in the context of flask
-    """
-    _level = configuration.LOG_LEVEL
-    if config:
-        _level = config.LOG_LEVEL
-    logging.basicConfig(level=_level)
-    backup_count = configuration.BACKUP_COUNT
-    if not backup_count or not isinstance(backup_count, int):
-        backup_count = 10
-    max_bytes = configuration.MAX_BYTES
-    if not max_bytes or not isinstance(max_bytes, int):
-        max_bytes = 314572800
-
-    path = os.path.join(configuration.LOG_PATH, configuration.LOG_NAME)
-    if not os.path.exists(path):
-        try:
-            os.makedirs(os.path.split(path)[0])
-        except FileExistsError:
-            pathlib.Path(path).touch()
-
-    file_log_handler = ConcurrentRotatingFileHandler(
-        path, maxBytes=max_bytes, backupCount=backup_count)
-
-    formatter = logging.Formatter(
-        '%(asctime)s-%(name)s-%(filename)s-[line:%(lineno)d]'
-        '-%(levelname)s-[ log details ]: %(message)s',
-        datefmt='%a, %d %b %Y %H:%M:%S')
-
-    file_log_handler.setFormatter(formatter)
-
-    return file_log_handler
-
-
-class Log():
-    """
-    General log operations
-    """
-    _instance_lock = threading.Lock()
-
-    def __init__(self, name=__name__, path=None):
-        self.__name = name
-
-        self.__file_handler = None
-
-        self.__path = os.path.join(
-            configuration.LOG_PATH, configuration.LOG_NAME)
-        if path:
-            self.__path = path
-
-        if not os.path.exists(self.__path):
-            try:
-                os.makedirs(os.path.split(self.__path)[0])
-            except FileExistsError:
-                pathlib.Path(self.__path).touch()
-
-        self.__level = configuration.LOG_LEVEL
-        self.__logger = logging.getLogger(self.__name)
-        self.__logger.setLevel(self.__level)
-        self.backup_count = configuration.BACKUP_COUNT
-        if not self.backup_count or not isinstance(self.backup_count, int):
-            self.backup_count = 10
-        self.max_bytes = configuration.MAX_BYTES
-        if not self.max_bytes or not isinstance(self.max_bytes, int):
-            self.max_bytes = 314572800
-
-        self.__init_handler()
-        self.__set_handler()
-        self.__set_formatter()
-
-    def __new__(cls, *args, **kwargs):  # pylint: disable=unused-argument
-        """
-        Use double-checked locking to create a thread-safe singleton
-        """
-        if not hasattr(cls, "_instance"):
-            with cls._instance_lock:
-                if not hasattr(cls, "_instance"):
-                    cls._instance = object.__new__(cls)
-        return cls._instance
-
-    def __init_handler(self):
-        self.__file_handler = ConcurrentRotatingFileHandler(
-            self.__path, maxBytes=self.max_bytes, backupCount=self.backup_count, encoding="utf-8")
-
-    def __set_handler(self):
-        self.__file_handler.setLevel(self.__level)
-        self.__logger.addHandler(self.__file_handler)
-
-    def __set_formatter(self):
-        formatter = logging.Formatter('%(asctime)s-%(name)s-%(filename)s-[line:%(lineno)d]'
-                                      '-%(levelname)s-[ log details ]: %(message)s',
-                                      datefmt='%a, %d %b %Y %H:%M:%S')
-        self.__file_handler.setFormatter(formatter)
-
-    def info(self, message):
-        """Print general log information"""
-        self.__logger.info(message)
-
-    def debug(self, message):
-        """Print log debugging information"""
-        self.__logger.debug(message)
-
-    def warning(self, message):
-        """Print log warning messages"""
-        self.__logger.warning(message)
-
-    def error(self, message):
-        """Print log error messages"""
-        self.__logger.error(message)
-
-    @property
-    def logger(self):
-        """
-        Return the underlying logger instance
-        """
-        return self.__logger
-
-
-LOGGER = Log(__name__)
diff --git a/packageship/packageship/manage.py b/packageship/packageship/manage.py
deleted file mode 100644
index ccc94531046eeabc6619f65f14b0e763e0213a56..0000000000000000000000000000000000000000
--- a/packageship/packageship/manage.py
+++ /dev/null
@@ -1,36 +0,0 @@
-#!/usr/bin/python3
-"""
-Description: Entry point for project initialization and service startup
-"""
-import os
-
-try:
-    from packageship.application import init_app
-    settings_file = os.environ.get('SETTINGS_FILE_PATH')
-    if not settings_file or not os.path.exists(settings_file):
-        raise RuntimeError(
-            'System configuration file: %s does not exist, '
-            'the software service cannot be started' % settings_file)
-    app = init_app("write")
-except ImportError as error:
-    raise RuntimeError(
-        "The package management software service failed to start: %s" % error)
-else:
-    from packageship.application.app_global import identity_verification
-    from packageship.libs.conf import configuration
-
-
-@app.before_request
-def before_request():
-    """
-    Description: Global request interception
-    """
-    if not identity_verification():
-        return 'No right to perform the operation'
-
-
-if __name__ == "__main__":
-
-    port = configuration.WRITE_PORT
-    addr = configuration.WRITE_IP_ADDR
-    app.run(port=port, host=addr)
diff --git a/packageship/packageship/package.ini b/packageship/packageship/package.ini
deleted file mode 100644
index cc83e4e4bb29025f7f3cb19d4c3e6ce3aee8a2c9..0000000000000000000000000000000000000000
--- a/packageship/packageship/package.ini
+++ /dev/null
@@ -1,113 +0,0 @@
-[SYSTEM]
-
-; Configuration file path for data initialization
-init_conf_path=/etc/pkgship/conf.yaml
-
-; Where to store data files when using the sqlite database
-; database_folder_path=/var/run/pkgship_dbs
-
-; Port managed by the administrator, with write permission
-write_port=8080
-
-; Ordinary user query port, with permission only to query data, not to write it
-query_port=8090
-
-; IP address with write permission
-write_ip_addr=127.0.0.1
-
-; IP address with permission to query data
-query_ip_addr=127.0.0.1
-
-; The address of the remote service; the command line can directly
-; call the remote service to complete a data request
-remote_host=https://api.openeuler.org/pkgmanage
-
-[LOG]
-
-; Custom log storage path
-; log_path=/var/log/pkgship/
-
-; Logging level
-; The log level option can only take one of the following values:
-; INFO DEBUG WARNING ERROR CRITICAL
-log_level=INFO
-
-; Logging file name
-log_name=log_info.log
-
-; The number of rotated log files kept after a log file reaches
-; its size limit; 10 by default
-backup_count=10
-
-; The maximum size of each log file, in bytes; a single log file is 300 MB by default
-max_bytes=314572800
-
-[UWSGI]
-; uwsgi log file path
-daemonize=/var/log/uwsgi.log
-; Size of the buffer used for data transferred between front end and back end
-buffer-size=65536
-; HTTP connection timeout
-http-timeout=600
-; Server response timeout
-harakiri=600
-
-
-
-[TIMEDTASK]
-; Execution frequency and switch of the timed task.
-; When set to True, maintainer and related information is updated in batch
-; when the scheduled task runs; when set to False, the scheduled task does
-; not update information such as maintainers and maintenance levels
-open=True
-
-; The hour of the day at which the timed task is executed.
-; Every day is one cycle, and the time value can only be an integer between 0 and 23
-hour=3
-
-; The minute at which the recurring timed task starts.
-; The value of this configuration item is an integer between 0 and 59
-minute=0
-
-[LIFECYCLE]
-; Configuration used during the life cycle for acquiring tag, issue and other information
-
-; The yaml file of each package is stored at the remote address, which can be
-; a remote repository address or the address of a static resource service
-warehouse_remote=https://gitee.com/openeuler/openEuler-Advisor/raw/master/upstream-info/
-
-; Timed tasks can be executed with multiple threads; the number of threads
-; in the thread pool can be set according to the configuration of the server
-pool_workers=10
-
-
-; The owner of the repository.
-; When this value is not set, the system defaults to src-openeuler
-warehouse=src-openeuler
-
-; Maximum queue length
-queue_maxsize = 1000
-
-[REDIS]
-
-; The address of the Redis cache server; either a published
-; domain or an IP address that can be accessed normally
-; The address defaults to 127.0.0.1
-; redis_host = 127.0.0.1
-redis_host= 127.0.0.1
-
-; Redis cache server port number; 6379 by default
-redis_port= 6379
-
-; Maximum number of simultaneous connections allowed by the Redis server
-redis_max_connections= 10
\ No newline at end of file
diff --git a/packageship/packageship/pkgship b/packageship/packageship/pkgship
deleted file mode 100644
index 9210bd2d9c511f9973ab496aadb17aa104d598f3..0000000000000000000000000000000000000000
--- a/packageship/packageship/pkgship
+++ /dev/null
@@ -1,23 +0,0 @@
-#!/usr/bin/python3
-import sys
-import signal
-from signal import SIG_DFL
-try:
-    def sig_handler(signum, frame):
-        print('Exit command mode')
-        sys.exit(0)
-
-    signal.signal(signal.SIGINT, sig_handler)
-    signal.signal(signal.SIGPIPE, SIG_DFL)
-except (AttributeError, ValueError):
-    # signal.SIGPIPE does not exist on every platform, and handlers
-    # can only be installed in the main thread
-    pass
-
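The signal setup in the launcher above deserves a note: restoring SIGPIPE to `SIG_DFL` lets the CLI be piped (for example `pkgship <subcommand> | head`) without a `BrokenPipeError` traceback when the reader closes early. A small sketch of the effect (editorial, not part of the original script):

```python
import signal
from signal import SIG_DFL

# CPython ignores SIGPIPE at startup, so writing to a closed pipe raises
# BrokenPipeError. Restoring the platform default makes the process exit
# quietly instead. SIGPIPE only exists on POSIX systems, hence the guard.
if hasattr(signal, 'SIGPIPE'):
    signal.signal(signal.SIGPIPE, SIG_DFL)
    handler = signal.getsignal(signal.SIGPIPE)
else:
    handler = SIG_DFL  # no SIGPIPE on this platform; nothing to restore
```

After the call, `signal.getsignal(signal.SIGPIPE)` reports `SIG_DFL`, so a broken pipe terminates the process with the default disposition rather than an unhandled exception.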
-from packageship.pkgship import main
-
-
-if __name__ == '__main__':
-    try:
-        main()
-    except Exception as error:
-        print('Command execution error, please try again')
-        print(error)
diff --git a/packageship/packageship/pkgship.py b/packageship/packageship/pkgship.py
deleted file mode 100644
index 010ebcb54a887dae9c7159fcdb733b1228fd521d..0000000000000000000000000000000000000000
--- a/packageship/packageship/pkgship.py
+++ /dev/null
@@ -1,1586 +0,0 @@
-#!/usr/bin/python3  # pylint: disable=too-many-lines
-
-"""
-Description: Entry methods for custom commands
-Class: BaseCommand, PkgshipCommand, RemoveCommand, InitDatabaseCommand,
-    AllPackageCommand, UpdatePackageCommand, BuildDepCommand, InstallDepCommand,
-    SelfBuildCommand, BeDependCommand, SingleCommand
-"""
-import os
-import json
-import threading
-from json.decoder import JSONDecodeError
-
-try:
-    import argparse
-    import requests
-    from requests.exceptions import ConnectionError as ConnErr
-    from requests.exceptions import HTTPError
-    import prettytable
-    from prettytable import PrettyTable
-    from packageship.libs.conf import configuration
-    from packageship.libs.log import LOGGER
-    from packageship.libs.exception import Error
-except ImportError as import_error:
-    print("Error importing related dependencies, "
-          "please check whether the related dependencies are installed")
-else:
-    from packageship.application.apps.package.function.constants import ResponseCode
-    from packageship.application.apps.package.function.constants import ListNode
-    from packageship.application.apps.lifecycle.function.download_yaml import update_pkg_info
-
-DB_NAME = 0
-
-
-def main():
-    """
-    Description: Command line tool entry; registers the related commands
-
-    Raises:
-        Error: An error occurred while executing the command
-    """
-    try:
-        packship_cmd = PkgshipCommand()
-        packship_cmd.parser_args()
-    except Error as error:
-        LOGGER.logger.error(error)
-        print('Command execution error, please try again')
-
-
-class BaseCommand():
-    """
Description: Basic attributes used for command invocation - Attributes: - write_host: Can write operation single host address - read_host: Can read the host address of the operation - headers: Send HTTP request header information - """ - - def __init__(self): - """ - Description: Class instance initialization - - """ - self.write_host = None - self.read_host = None - self.__http = 'http://' - self.headers = {"Content-Type": "application/json", - "Accept-Language": "zh-CN,zh;q=0.9"} - - self.load_read_host() - self.load_write_host() - - def load_write_host(self): - """ - Description: Address to load write permission - Args: - - Returns: - Raises: - - """ - wirte_port = configuration.WRITE_PORT - - write_ip = configuration.WRITE_IP_ADDR - if not all([write_ip, wirte_port]): - raise Error( - "The system does not configure the relevant port and ip correctly") - _write_host = self.__http + write_ip + ":" + str(wirte_port) - setattr(self, 'write_host', _write_host) - - def load_read_host(self): - """ - Returns:Address to load read permission - Args: - - Returns: - Raises: - - """ - read_port = configuration.QUERY_PORT - - read_ip = configuration.QUERY_IP_ADDR - if all([read_ip, read_port]): - _read_host = self.__http + read_ip + ":" + str(read_port) - - setattr(self, 'read_host', _read_host) - - def _set_read_host(self, remote=False): - """ - Set read domain name - """ - if remote: - self.read_host = configuration.REMOTE_HOST - if self.read_host is None: - raise Error( - "The system does not configure the relevant port and ip correctly") - - -class PkgshipCommand(BaseCommand): - """ - Description: PKG package command line - Attributes: - statistics: Summarized data table - table: Output table - columns: Calculate the width of the terminal dynamically - params: Command parameters - """ - parser = argparse.ArgumentParser( - description='package related dependency management') - subparsers = parser.add_subparsers( - help='package related dependency management') - - def 
__init__(self): - """ - Description: Class instance initialization - """ - super(PkgshipCommand, self).__init__() - self.statistics = dict() - self.table = PkgshipCommand.create_table() - # Calculate the total width of the current terminal - self.columns = 100 - self.params = [] - - @staticmethod - def register_command(command): - """ - Description: Registration of related commands - - Args: - command: Related commands - - Returns: - Raises: - - """ - command.register() - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ - for command_params in self.params: - self.parse.add_argument( # pylint: disable=E1101 - command_params[0], - # type=eval(command_params[1]), # pylint: disable=W0123 - help=command_params[2], - default=command_params[3], - action=command_params[4]) - - @classmethod - def parser_args(cls): - """ - Description: Register the command line and parse related commands - Args: - - Returns: - - Raises: - Error: An error occurred during command parsing - """ - cls.register_command(RemoveCommand()) - cls.register_command(InitDatabaseCommand()) - cls.register_command(AllPackageCommand()) - cls.register_command(UpdatePackageCommand()) - cls.register_command(BuildDepCommand()) - cls.register_command(InstallDepCommand()) - cls.register_command(SelfBuildCommand()) - cls.register_command(BeDependCommand()) - cls.register_command(SingleCommand()) - cls.register_command(IssueCommand()) - cls.register_command(AllTablesCommand()) - cls.register_command(BatchTaskCommand()) - try: - args = cls.parser.parse_args() - args.func(args) - except Error: - print('command error') - - def parse_depend_package(self, response_data, params=None): - """ - Description: Parsing package data with dependencies - Args: - response_data: http request response content - params: Parameters passed in on the command line - Returns: - Summarized data table - Raises: - - """ - bin_package_count = 0 - src_package_count = 0 - if 
response_data.get('code') == ResponseCode.SUCCESS: - package_all = response_data.get('data') - if isinstance(package_all, dict): - if params: - if package_all.get("not_found_components"): - print("Problem: Not Found Components") - for not_found_com in package_all.get("not_found_components"): - print( - " - nothing provides {} needed by {} ". - format(not_found_com, params.packagename)) - package_all = package_all.get("build_dict") - - for bin_package, package_depend in package_all.items(): - # distinguish whether the current data is the data of the root node - if isinstance(package_depend, list) and \ - package_depend[ListNode.SOURCE_NAME] != 'source': - - row_data = [bin_package, - package_depend[ListNode.SOURCE_NAME], - package_depend[ListNode.VERSION], - package_depend[ListNode.DBNAME]] - # Whether the database exists - if package_depend[ListNode.DBNAME] not in self.statistics: - self.statistics[package_depend[ListNode.DBNAME]] = { - 'binary': [], - 'source': [] - } - # Determine whether the current binary package exists - if bin_package not in \ - self.statistics[package_depend[ListNode.DBNAME]]['binary']: - self.statistics[package_depend[ListNode.DBNAME] - ]['binary'].append(bin_package) - bin_package_count += 1 - # Determine whether the source package exists - if package_depend[ListNode.SOURCE_NAME] not in \ - self.statistics[package_depend[ListNode.DBNAME]]['source']: - self.statistics[package_depend[ListNode.DBNAME]]['source'].append( - package_depend[ListNode.SOURCE_NAME]) - src_package_count += 1 - - if hasattr(self, 'table') and self.table: - self.table.add_row(row_data) - else: - LOGGER.logger.error(response_data.get('msg')) - print(response_data.get('msg')) - statistics_table = self.statistics_table( - bin_package_count, src_package_count) - return statistics_table - - def print_(self, content=None, character='=', dividing_line=False): - """ - Description: Output formatted characters - Args: - content: Output content - character: Output separator 
content - dividing_line: Whether to show the separator - Returns: - - Raises: - - """ - # Get the current width of the console - - if dividing_line: - print(character * self.columns) - if content: - print(content) - if dividing_line: - print(character * self.columns) - - @staticmethod - def create_table(title=None): - """ - Description: Create printed forms - Args: - title: Table title - Returns: - ASCII format table - Raises: - - """ - table = PrettyTable(title) - # table.set_style(prettytable.PLAIN_COLUMNS) - table.align = 'l' - table.horizontal_char = '=' - table.junction_char = '=' - table.vrules = prettytable.NONE - table.hrules = prettytable.FRAME - return table - - def statistics_table(self, bin_package_count, src_package_count): - """ - Description: Generate data for total statistical tables - Args: - bin_package_count: Number of binary packages - src_package_count: Number of source packages - Returns: - Summarized data table - Raises: - - """ - statistics_table = self.create_table(['', 'binary', 'source']) - statistics_table.add_row( - ['self depend sum', bin_package_count, src_package_count]) - - # cyclically count the number of source packages and binary packages in each database - for database, statistics_item in self.statistics.items(): - statistics_table.add_row([database, len(statistics_item.get( - 'binary')), len(statistics_item.get('source'))]) - return statistics_table - - @staticmethod - def http_error(response): - """ - Description: Log error messages for http - Args: - response: Response content of http request - Returns: - - Raises: - HTTPError: http request error - """ - try: - print(response.raise_for_status()) - except HTTPError as http_error: - LOGGER.logger.error(http_error) - print('Request failed') - print(http_error) - - -class RemoveCommand(PkgshipCommand): - """ - Description: Delete database command - Attributes: - parse: Command line parsing example - params: Command line parameters - """ - - def __init__(self): - """ - 
Description: Class instance initialization - """ - super(RemoveCommand, self).__init__() - self.parse = PkgshipCommand.subparsers.add_parser( - 'rm', help='delete database operation') - self.params = [ - ('db', 'str', 'name of the database operated', '', 'store')] - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ - super(RemoveCommand, self).register() - self.parse.set_defaults(func=self.do_command) - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: Command line parameters - Returns: - - Raises: - ConnErr: Request connection error - - """ - if params.db is None: - print('No database specified for deletion') - else: - _url = self.write_host + '/repodatas?dbName={}'.format(params.db) - try: - response = requests.delete(_url) - except ConnErr as conn_err: - LOGGER.logger.error(conn_err) - print(str(conn_err)) - else: - # Determine whether to delete the mysql database or sqlite database - if response.status_code == 200: - try: - data = json.loads(response.text) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - else: - if data.get('code') == ResponseCode.SUCCESS: - print('delete success') - else: - LOGGER.logger.error(data.get('msg')) - print(data.get('msg')) - else: - self.http_error(response) - - -class InitDatabaseCommand(PkgshipCommand): - """ - Description: Initialize database command - Attributes: - parse: Command line parsing example - params: Command line parameters - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(InitDatabaseCommand, self).__init__() - self.parse = PkgshipCommand.subparsers.add_parser( - 'init', help='initialization of the database') - self.params = [ - ('-filepath', 'str', 'name of the database operated', '', 'store')] - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ 
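
The `rm` command above and the `init` command below share the same response-handling flow: check `response.status_code`, `json.loads` the body, then compare `data.get('code')` against `ResponseCode.SUCCESS`. A minimal standalone sketch of that flow — the helper name and the `SUCCESS` value are illustrative placeholders (the real constant lives in `ResponseCode`, not shown here), and the real code prints rather than returns:

```python
import json

# Assumed placeholder: the real value comes from ResponseCode.SUCCESS,
# which is defined outside this module.
SUCCESS = "2001"

def handle_response(status_code, text, success_msg):
    """Shared response-handling pattern used by the rm/init commands:
    check the HTTP status, decode the JSON body, then check the
    business-level response code before reporting a result."""
    if status_code != 200:
        return "request failed with status %d" % status_code
    try:
        data = json.loads(text)
    except json.JSONDecodeError:
        # Body was not valid JSON; fall back to the raw text
        return text
    if data.get("code") == SUCCESS:
        return success_msg
    return data.get("msg")
```
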
- super(InitDatabaseCommand, self).register() - self.parse.set_defaults(func=self.do_command) - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: Command line parameters - Returns: - - Raises: - - """ - file_path = params.filepath - try: - if file_path: - file_path = os.path.abspath(file_path) - response = requests.post(self.write_host + - '/initsystem', data=json.dumps({'configfile': file_path}), - headers=self.headers) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - response_data = json.loads(response.text) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - else: - if response_data.get('code') == ResponseCode.SUCCESS: - print('Database initialization success ') - else: - LOGGER.logger.error(response_data.get('msg')) - print(response_data.get('msg')) - else: - self.http_error(response) - - -class AllPackageCommand(PkgshipCommand): - """ - Description: get all package commands - Attributes: - parse: Command line parsing example - params: Command line parameters - table: Output table - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(AllPackageCommand, self).__init__() - - self.parse = PkgshipCommand.subparsers.add_parser( - 'list', help='get all package data') - self.table = self.create_table( - ['packagenames', 'database', 'version', 'license', 'maintainer', - 'release date', 'used time']) - self.params = [('tablename', 'str', 'name of the database operated', '', 'store'), - ('-remote', 'str', 'The address of the remote service', - False, 'store_true'), - ('-packagename', 'str', - 'Package name that needs fuzzy matching', '', 'store'), - ('-maintainer', 'str', 'Maintainer\'s name', '', 'store') - ] - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ - 
super(AllPackageCommand, self).register() - self.parse.set_defaults(func=self.do_command) - - def __parse_package(self, response_data, table_name): - """ - Description: Parse the corresponding data of the package - Args: - response_data: http request response content - Returns: - - Raises: - - """ - if response_data.get('code') == ResponseCode.SUCCESS: - package_all = response_data.get('data') - if isinstance(package_all, list): - for package_item in package_all: - row_data = [package_item.get('name'), - table_name, - package_item.get('version') if package_item.get( - 'version') else '', - package_item.get('rpm_license') if package_item.get( - 'rpm_license') else '', - package_item.get('maintainer') if package_item.get( - 'maintainer') else '', - package_item.get('release_time') if package_item.get( - 'release_time') else '', - package_item.get('used_time')] - self.table.add_row(row_data) - else: - print(response_data.get('msg')) - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: Command line parameters - Returns: - - Raises: - ConnectionError: Request connection error - """ - self._set_read_host(params.remote) - _url = self.read_host + \ - '/packages?table_name={table_name}&query_pkg_name={pkg_name}&\ - maintainner={maintainer}&maintainlevel={maintainlevel}&\ - page_num={page}&page_size={pagesize}'.format( - table_name=params.tablename, - pkg_name=params.packagename, - maintainer=params.maintainer, - maintainlevel='', - page=1, - pagesize=65535).replace(' ', '') - try: - response = requests.get(_url) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - response_data = json.loads(response.text) - self.__parse_package(response_data, params.tablename) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - - if getattr(self.table, 'rowcount'): - print(self.table) - else: - print('Sorry, no 
relevant information has been found yet') - else: - self.http_error(response) - - -class UpdatePackageCommand(PkgshipCommand): - """ - Description: update package data - Attributes: - parse: Command line parsing example - params: Command line parameters - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(UpdatePackageCommand, self).__init__() - - self.parse = PkgshipCommand.subparsers.add_parser( - 'updatepkg', help='update package data') - self.params = [ - ('-packagename', 'str', 'Source package name', '', 'store'), - ('-maintainer', 'str', 'Maintainers name', '', 'store'), - ('-maintainlevel', 'int', 'database priority', 1, 'store'), - ('-filefolder', 'str', 'Path of yaml file for batch update', '', 'store'), - ('--batch', 'str', 'The address of the remote service', - False, 'store_true'), - ] - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ - super(UpdatePackageCommand, self).register() - self.parse.set_defaults(func=self.do_command) - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: Command line parameters - Returns: - - Raises: - ConnectionError: Request connection error - """ - _url = self.write_host + '/lifeCycle/updatePkgInfo' - try: - _folder = params.filefolder - if _folder: - _folder = os.path.abspath(_folder) - response = requests.put( - _url, data=json.dumps({'pkg_name': params.packagename, - 'maintainer': params.maintainer, - 'maintainlevel': params.maintainlevel, - 'batch': params.batch, - 'filepath': _folder}), - headers=self.headers) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - data = json.loads(response.text) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - else: - if data.get('code') == ResponseCode.SUCCESS: - print('update completed') - else: 
- LOGGER.logger.error(data.get('msg')) - print(data.get('msg')) - else: - self.http_error(response) - - -class BuildDepCommand(PkgshipCommand): - """ - Description: query the compilation dependencies of the specified package - Attributes: - parse: Command line parsing example - params: Command line parameters - collection: Is there a collection parameter - collection_params: Command line collection parameters - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(BuildDepCommand, self).__init__() - self.table = PkgshipCommand.create_table( - ['Binary name', 'Source name', 'Version', 'Database name']) - self.parse = PkgshipCommand.subparsers.add_parser( - 'builddep', help='query the compilation dependencies of the specified package') - self.collection = True - self.params = [ - ('packagename', 'str', 'source package name', '', 'store'), - ('-remote', 'str', 'The address of the remote service', False, 'store_true') - ] - self.collection_params = [ - ('-dbs', 'Operational database collection') - ] - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ - super(BuildDepCommand, self).register() - # collection parameters - - for cmd_params in self.collection_params: - self.parse.add_argument( - cmd_params[0], nargs='*', default=None, help=cmd_params[1]) - self.parse.set_defaults(func=self.do_command) - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: Command line parameters - Returns: - - Raises: - ConnectionError: Request connection error - """ - self._set_read_host(params.remote) - - _url = self.read_host + '/packages/findBuildDepend' - try: - response = requests.post( - _url, data=json.dumps({'sourceName': params.packagename, - 'db_list': params.dbs}), - headers=self.headers) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - 
statistics_table = self.parse_depend_package( - json.loads(response.text), params) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - else: - if getattr(self.table, 'rowcount'): - self.print_('query {} buildDepend result display:'.format( - params.packagename)) - print(self.table) - self.print_('statistics') - print(statistics_table) - else: - self.http_error(response) - - -class InstallDepCommand(PkgshipCommand): - """ - Description: query the installation dependencies of the specified package - Attributes: - parse: Command line parsing example - params: Command line parameters - collection: Is there a collection parameter - collection_params: Command line collection parameters - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(InstallDepCommand, self).__init__() - - self.parse = PkgshipCommand.subparsers.add_parser( - 'installdep', help='query the installation dependencies of the specified package') - self.collection = True - self.params = [ - ('packagename', 'str', 'source package name', '', 'store'), - ('-remote', 'str', 'The address of the remote service', False, 'store_true') - ] - self.collection_params = [ - ('-dbs', 'Operational database collection') - ] - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ - super(InstallDepCommand, self).register() - # collection parameters - - for cmd_params in self.collection_params: - self.parse.add_argument( - cmd_params[0], nargs='*', default=None, help=cmd_params[1]) - self.parse.set_defaults(func=self.do_command) - - def __parse_package(self, response_data, params): - """ - Description: Parse the corresponding data of the package - Args: - response_data: http response data - params: Parameters passed in on the command line - Returns: - - Raises: - - """ - self.table = PkgshipCommand.create_table( - ['Binary name', 'Source name', 'Version', 'Database name']) - 
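
The dependency parsers in this module (`parse_depend_package` above, and the install/self-build parsers below) repeat the same bookkeeping: a per-database dict with `'binary'` and `'source'` lists, where each package is counted only once per database. A standalone sketch of that dedup step, under an assumed helper name:

```python
def count_unique(statistics, db_name, bin_package, src_package):
    """Record a binary/source pair under its database, counting each
    name only once per database (the dedup bookkeeping repeated by the
    dependency-parsing methods in this module)."""
    stats = statistics.setdefault(db_name, {'binary': [], 'source': []})
    bin_added = src_added = 0
    if bin_package not in stats['binary']:
        stats['binary'].append(bin_package)
        bin_added = 1
    if src_package not in stats['source']:
        stats['source'].append(src_package)
        src_added = 1
    return bin_added, src_added
```

The returned pair feeds the running totals that `statistics_table` later prints alongside the per-database rows.
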
if getattr(self, 'statistics'): - setattr(self, 'statistics', dict()) - bin_package_count = 0 - src_package_count = 0 - if response_data.get('code') == ResponseCode.SUCCESS: - package_all = response_data.get('data') - if isinstance(package_all, dict): - if package_all.get("not_found_components"): - print("Problem: Not Found Components") - for not_found_com in package_all.get("not_found_components"): - print( - " - nothing provides {} needed by {} ". - format(not_found_com, params.packagename)) - for bin_package, package_depend in package_all.get("install_dict").items(): - # distinguish whether the current data is the data of the root node - if isinstance(package_depend, list) and package_depend[-1][0][0] != 'root': - - row_data = [bin_package, - package_depend[ListNode.SOURCE_NAME], - package_depend[ListNode.VERSION], - package_depend[ListNode.DBNAME]] - # Whether the database exists - if package_depend[ListNode.DBNAME] not in self.statistics: - self.statistics[package_depend[ListNode.DBNAME]] = { - 'binary': [], - 'source': [] - } - # Determine whether the current binary package exists - if bin_package not in \ - self.statistics[package_depend[ListNode.DBNAME]]['binary']: - self.statistics[package_depend[ListNode.DBNAME] - ]['binary'].append(bin_package) - bin_package_count += 1 - # Determine whether the source package exists - if package_depend[ListNode.SOURCE_NAME] not in \ - self.statistics[package_depend[ListNode.DBNAME]]['source']: - self.statistics[package_depend[ListNode.DBNAME]]['source'].append( - package_depend[ListNode.SOURCE_NAME]) - src_package_count += 1 - - self.table.add_row(row_data) - else: - LOGGER.logger.error(response_data.get('msg')) - print(response_data.get('msg')) - # Display of aggregated data - statistics_table = self.statistics_table( - bin_package_count, src_package_count) - - return statistics_table - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: Command line parameters - Returns: - - 
Raises: - ConnectionError: requests connection error - """ - self._set_read_host(params.remote) - - _url = self.read_host + '/packages/findInstallDepend' - try: - response = requests.post(_url, data=json.dumps( - { - 'binaryName': params.packagename, - 'db_list': params.dbs - }, ensure_ascii=True), headers=self.headers) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - statistics_table = self.__parse_package( - json.loads(response.text), params) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - else: - if getattr(self.table, 'rowcount'): - self.print_('query {} InstallDepend result display:'.format( - params.packagename)) - print(self.table) - self.print_('statistics') - print(statistics_table) - else: - self.http_error(response) - - -class SelfBuildCommand(PkgshipCommand): - """ - Description: self-compiled dependency query - Attributes: - parse: Command line parsing example - params: Command line parameters - collection: Is there a collection parameter - collection_params: Command line collection parameters - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(SelfBuildCommand, self).__init__() - - self.parse = PkgshipCommand.subparsers.add_parser( - 'selfbuild', help='query the self-compiled dependencies of the specified package') - self.collection = True - self.bin_package_table = self.create_table( - ['package name', 'src name', 'version', 'database']) - self.src_package_table = self.create_table([ - 'src name', 'version', 'database']) - self.params = [ - ('packagename', 'str', 'source package name', '', 'store'), - ('-t', 'str', 'Source of data query', 'binary', 'store'), - ('-w', 'str', 'whether to include other subpackages of binary', 0, 'store'), - ('-s', 'str', 'whether it is self-compiled', 0, 'store'), - ('-remote', 'str', 'The address of the remote service', False, 
'store_true') - ] - - self.collection_params = [ - ('-dbs', 'Operational database collection') - ] - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ - super(SelfBuildCommand, self).register() - # collection parameters - - for cmd_params in self.collection_params: - self.parse.add_argument( - cmd_params[0], nargs='*', default=None, help=cmd_params[1]) - self.parse.set_defaults(func=self.do_command) - - def _parse_bin_package(self, bin_packages): - """ - Description: Parsing binary result data - Args: - bin_packages: Binary package data - - Returns: - - Raises: - - """ - bin_package_count = 0 - if bin_packages: - for bin_package, package_depend in bin_packages.items(): - # distinguish whether the current data is the data of the root node - if isinstance(package_depend, list) and package_depend[-1][0][0] != 'root': - - row_data = [bin_package, package_depend[ListNode.SOURCE_NAME], - package_depend[ListNode.VERSION], package_depend[ListNode.DBNAME]] - - # Whether the database exists - if package_depend[ListNode.DBNAME] not in self.statistics: - self.statistics[package_depend[ListNode.DBNAME]] = { - 'binary': [], - 'source': [] - } - # Determine whether the current binary package exists - if bin_package not in \ - self.statistics[package_depend[ListNode.DBNAME]]['binary']: - self.statistics[package_depend[ListNode.DBNAME] - ]['binary'].append(bin_package) - bin_package_count += 1 - self.bin_package_table.add_row(row_data) - - return bin_package_count - - def _parse_src_package(self, src_packages): - """ - Description: Source package data analysis - Args: - src_packages: Source package - - Returns: - Source package data - Raises: - - """ - src_package_count = 0 - if src_packages: - for src_package, package_depend in src_packages.items(): - # distinguish whether the current data is the data of the root node - if isinstance(package_depend, list): - - row_data = [src_package, 
package_depend[ListNode.VERSION], - package_depend[DB_NAME]] - # Whether the database exists - if package_depend[DB_NAME] not in self.statistics: - self.statistics[package_depend[DB_NAME]] = { - 'binary': [], - 'source': [] - } - # Determine whether the current binary package exists - if src_package not in self.statistics[package_depend[DB_NAME]]['source']: - self.statistics[package_depend[DB_NAME] - ]['source'].append(src_package) - src_package_count += 1 - - self.src_package_table.add_row(row_data) - - return src_package_count - - def __parse_package(self, response_data, params): - """ - Description: Parse the corresponding data of the package - Args: - response_data: http response data - params: Parameters passed in on the command line - Returns: - Summarized data table - Raises: - - """ - if getattr(self, 'statistics'): - setattr(self, 'statistics', dict()) - bin_package_count = 0 - src_package_count = 0 - - if response_data.get('code') == ResponseCode.SUCCESS: - package_all = response_data.get('data') - if isinstance(package_all, dict): - # Parsing binary result data - if package_all.get("not_found_components"): - print("Problem: Not Found Components") - for not_found_com in package_all.get("not_found_components"): - print( - " - nothing provides {} needed by {} ". 
- format(not_found_com, params.packagename)) - bin_package_count = self._parse_bin_package( - package_all.get('binary_dicts')) - - # Source package data analysis - src_package_count = self._parse_src_package( - package_all.get('source_dicts')) - else: - LOGGER.logger.error(response_data.get('msg')) - print(response_data.get('msg')) - # Display of aggregated data - statistics_table = self.statistics_table( - bin_package_count, src_package_count) - # return (bin_package_table, src_package_table, statistics_table) - return statistics_table - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: commands lines params - Returns: - - Raises: - ConnectionError: requests connection error - """ - self._set_read_host(params.remote) - _url = self.read_host + '/packages/findSelfDepend' - try: - response = requests.post(_url, - data=json.dumps({ - 'packagename': params.packagename, - 'db_list': params.dbs, - 'packtype': params.t, - 'selfbuild': str(params.s), - 'withsubpack': str(params.w)}), - headers=self.headers) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - statistics_table = self.__parse_package( - json.loads(response.text), params) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - else: - if getattr(self.bin_package_table, 'rowcount') \ - and getattr(self.src_package_table, 'rowcount'): - self.print_('query {} selfDepend result display :'.format( - params.packagename)) - print(self.bin_package_table) - self.print_(character='=') - print(self.src_package_table) - self.print_('statistics') - print(statistics_table) - else: - self.http_error(response) - - -class BeDependCommand(PkgshipCommand): - """ - Description: dependent query - Attributes: - parse: Command line parsing example - params: Command line parameters - """ - - def __init__(self): - """ - Description: Class instance 
initialization - """ - super(BeDependCommand, self).__init__() - self.table = PkgshipCommand.create_table( - ['Binary name', 'Source name', 'Version', 'Database name']) - self.parse = PkgshipCommand.subparsers.add_parser( - 'bedepend', help='dependency query for the specified package') - self.params = [ - ('packagename', 'str', 'source package name', '', 'store'), - ('db', 'str', 'name of the database operated', '', 'store'), - ('-w', 'str', 'whether to include other subpackages of binary', 0, 'store'), - ('-remote', 'str', 'The address of the remote service', False, 'store_true') - ] - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ - super(BeDependCommand, self).register() - self.parse.set_defaults(func=self.do_command) - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: command lines params - Returns: - - Raises: - ConnectionError: requests connection error - """ - self._set_read_host(params.remote) - _url = self.read_host + '/packages/findBeDepend' - try: - response = requests.post(_url, data=json.dumps( - { - 'packagename': params.packagename, - 'dbname': params.db, - 'withsubpack': str(params.w) - } - ), headers=self.headers) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - statistics_table = self.parse_depend_package( - json.loads(response.text)) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - else: - if getattr(self.table, 'rowcount'): - self.print_('query {} beDepend result display :'.format( - params.packagename)) - print(self.table) - self.print_('statistics') - print(statistics_table) - else: - self.http_error(response) - - -class SingleCommand(PkgshipCommand): - """ - Description: query single package information - Attributes: - parse: Command line parsing example - params: Command line parameters 
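
The GET-style commands (`list`, `single`, `issue`) assemble their query URLs from wrapped `format()` strings and then strip the spaces introduced by line continuations with `.replace(' ', '')`. A sketch of the `single` command's URL built with `urllib.parse.urlencode` instead, which needs no whitespace cleanup and also escapes reserved characters (the function name is illustrative):

```python
from urllib.parse import urlencode

def package_info_url(read_host, table_name, pkg_name):
    # Builds the same query string as the single command's format()
    # call, letting urlencode insert separators and escape values.
    query = urlencode({'table_name': table_name, 'pkg_name': pkg_name})
    return '{}/packages/packageInfo?{}'.format(read_host, query)
```
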
- """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(SingleCommand, self).__init__() - - self.parse = PkgshipCommand.subparsers.add_parser( - 'single', help='query the information of a single package') - self.params = [ - ('packagename', 'str', 'source package name', '', 'store'), - ('tablename', 'str', 'name of the database operated', '', 'store'), - ('-remote', 'str', 'The address of the remote service', False, 'store_true') - ] - self.provides_table = self.create_table(['Symbol', 'Required by']) - self.requires_table = self.create_table(['Symbol', 'Provides by']) - - def register(self): - """ - Description: Command line parameter injection - Args: - - Returns: - - Raises: - - """ - super(SingleCommand, self).register() - self.parse.set_defaults(func=self.do_command) - - def __parse_package_detail(self, response_data): - """ - - """ - _show_field_name = ('pkg_name', 'version', 'release', 'url', 'license', 'feature', - 'maintainer', 'maintainlevel', 'gitee_url', 'issue', 'summary', - 'description', 'buildrequired') - _package_detail_info = response_data.get('data') - _line_content = [] - if _package_detail_info: - for key in _show_field_name: - value = _package_detail_info.get(key) - if value is None: - value = '' - if isinstance(value, list): - value = '、'.join(value) if value else '' - _line_content.append('%-15s:%s' % (key, value)) - for content in _line_content: - self.print_(content=content) - - def __parse_provides(self, provides): - """ - - """ - if provides and isinstance(provides, list): - for _provide in provides: - _required_by = '\n'.join( - _provide['requiredby']) if _provide['requiredby'] else '' - self.provides_table.add_row( - [_provide['name'], _required_by]) - self.print_('Provides') - if getattr(self.provides_table, 'rowcount'): - print(self.provides_table) - else: - print('No relevant dependent data') - self.provides_table.clear_rows() - - def __parse_requires(self, requires): - """ - - """ - if 
requires and isinstance(requires, list): - for _require in requires: - _provide_by = '\n'.join( - _require['providedby']) if _require['providedby'] else '' - self.requires_table.add_row( - [_require['name'], _provide_by]) - self.print_('Requires') - if getattr(self.requires_table, 'rowcount'): - print(self.requires_table) - else: - print('No related components') - self.requires_table.clear_rows() - - def __parse_subpack(self, subpacks): - """ - Data analysis of binary package - """ - for subpack_item in subpacks: - print('-' * 50) - self.print_(subpack_item['name']) - - self.__parse_provides(subpack_item['provides']) - self.__parse_requires(subpack_item['requires']) - - def __parse_package(self, response_data): - """ - Description: Parse the corresponding data of the package - Args: - response_data: http response data - Returns: - - Raises: - - """ - if response_data.get('code') == ResponseCode.SUCCESS: - - self.__parse_package_detail(response_data) - try: - _subpacks = response_data['data']['subpack'] - self.__parse_subpack(_subpacks) - except KeyError as key_error: - LOGGER.logger.error(key_error) - else: - print(response_data.get('msg')) - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: command lines params - Returns: - - Raises: - ConnectionError: requests connection error - """ - self._set_read_host(params.remote) - _url = self.read_host + \ - '/packages/packageInfo?table_name={db_name}&pkg_name={packagename}' \ - .format(db_name=params.tablename, packagename=params.packagename) - try: - response = requests.get(_url) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - self.__parse_package(json.loads(response.text)) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - - else: - self.http_error(response) - - -class IssueCommand(PkgshipCommand): - """ - Description: Get the issue 
list - Attributes: - parse: Command line parsing example - params: Command line parameters - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(IssueCommand, self).__init__() - - self.parse = PkgshipCommand.subparsers.add_parser( - 'issue', help='Query the issue list of the specified package') - self.params = [ - ('-packagename', 'str', 'Query source package name', '', 'store'), - - ('-issue_type', 'str', 'Type of issue', '', 'store'), - ('-issue_status', 'str', 'the status of the issue', '', 'store'), - ('-maintainer', 'str', 'Maintainer\'s name', '', 'store'), - ('-page', 'int', - 'Need to query the data on the first few pages', 1, 'store'), - ('-pagesize', 'int', - 'The size of the data displayed on each page', 65535, 'store'), - ('-remote', 'str', 'The address of the remote service', False, 'store_true') - ] - self.table = self.create_table( - ['issue_id', 'pkg_name', 'issue_title', - 'issue_status', 'issue_type', 'maintainer']) - - def register(self): - """ - Description: Command line parameter injection - - """ - super(IssueCommand, self).register() - self.parse.set_defaults(func=self.do_command) - - def __parse_package(self, response_data): - """ - Description: Parse the corresponding data of the package - - Args: - response_data: http response data - """ - if response_data.get('code') == ResponseCode.SUCCESS: - issue_all = response_data.get('data') - if isinstance(issue_all, list): - for issue_item in issue_all: - _row_data = [ - issue_item.get('issue_id'), - issue_item.get('pkg_name') if issue_item.get( - 'pkg_name') else '', - issue_item.get('issue_title')[:50]+'...' 
if issue_item.get( - 'issue_title') else '', - issue_item.get('issue_status') if issue_item.get( - 'issue_status') else '', - issue_item.get('issue_type') if issue_item.get( - 'issue_type') else '', - issue_item.get('maintainer') if issue_item.get('maintainer') else ''] - self.table.add_row(_row_data) - else: - print(response_data.get('msg')) - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: command lines params - Returns: - - Raises: - ConnectionError: requests connection error - """ - self._set_read_host(params.remote) - _url = self.read_host + \ - '/lifeCycle/issuetrace?page_num={page_num}&\ - page_size={page_size}&pkg_name={pkg_name}&issue_type={issue_type}\ - &issue_status={issue_status}&maintainer={maintainer}'\ - .format(page_num=params.page, - page_size=params.pagesize, - pkg_name=params.packagename, - issue_type=params.issue_type, - issue_status=params.issue_status, - maintainer=params.maintainer).replace(' ', '') - try: - response = requests.get(_url) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - response_data = json.loads(response.text) - self.__parse_package(response_data) - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - if getattr(self.table, "rowcount"): - print('total count : %d' % response_data['total_count']) - print('total page : %d' % response_data['total_page']) - print('current page : %s ' % params.page) - print(self.table) - else: - print("Sorry, no relevant information has been found yet") - else: - self.http_error(response) - - -class AllTablesCommand(PkgshipCommand): - """ - Description: Get all data tables in the current life cycle - Attributes: - parse: Command line parsing example - params: Command line parameters - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(AllTablesCommand, self).__init__() - - 
self.parse = PkgshipCommand.subparsers.add_parser( - 'tables', help='Get all data tables in the current life cycle') - self.params = [ - ('-remote', 'str', 'The address of the remote service', False, 'store_true') - ] - - def register(self): - """ - Description: Command line parameter injection - - """ - super(AllTablesCommand, self).register() - self.parse.set_defaults(func=self.do_command) - - def do_command(self, params): - """ - Description: Action to execute command - Args: - params: command lines params - Returns: - - Raises: - ConnectionError: requests connection error - """ - self._set_read_host(params.remote) - _url = self.read_host + '/lifeCycle/tables' - try: - response = requests.get(_url, headers=self.headers) - except ConnErr as conn_error: - LOGGER.logger.error(conn_error) - print(str(conn_error)) - else: - if response.status_code == 200: - try: - _response_content = json.loads(response.text) - if _response_content.get('code') == ResponseCode.SUCCESS: - print( - 'The version libraries that exist in the ', - 'current life cycle are as follows:') - for table in _response_content.get('data', []): - print(table) - else: - print('Failed to get the lifecycle repository') - except JSONDecodeError as json_error: - LOGGER.logger.error(json_error) - print(response.text) - else: - self.http_error(response) - - -class BatchTaskCommand(PkgshipCommand): - """ - Description: Issue and life cycle information involved in batch processing packages - Attributes: - parse: Command line parsing example - params: Command line parameters - """ - - def __init__(self): - """ - Description: Class instance initialization - """ - super(BatchTaskCommand, self).__init__() - - self.parse = PkgshipCommand.subparsers.add_parser( - 'update', - help='Issue and life cycle information involved in batch processing packages') - self.params = [ - ('--issue', 'str', 'Batch operation on issue', False, 'store_true'), - ('--package', 'str', 'Package life cycle information processing', - False, 
'store_true'),
-        ]
-
-    def register(self):
-        """
-        Description: Command line parameter injection
-
-        """
-        super(BatchTaskCommand, self).register()
-        self.parse.set_defaults(func=self.do_command)
-
-    def do_command(self, params):
-        """
-        Description: Action to execute command
-        Args:
-            params: command lines params
-        Returns:
-
-        Raises:
-            ConnectionError: requests connection error
-        """
-        if not params.issue and not params.package:
-            print('Please select the way to operate')
-        if params.issue:
-            issue_thread = threading.Thread(
-                target=update_pkg_info, args=(False,))
-            issue_thread.start()
-        if params.package:
-            update_pkg_thread = threading.Thread(
-                target=update_pkg_info)
-            update_pkg_thread.start()
-
-
-if __name__ == '__main__':
-    main()
diff --git a/packageship/packageship/pkgshipd b/packageship/packageship/pkgshipd
deleted file mode 100755
index a0db9840fb719264596c53a894d861a63ffafaee..0000000000000000000000000000000000000000
--- a/packageship/packageship/pkgshipd
+++ /dev/null
@@ -1,285 +0,0 @@
-#!/bin/bash
-SYS_PATH=/etc/pkgship
-OUT_PATH=/var/run/pkgship_uwsgi
-
-MEM_THRESHOLD='700'
-MEM_FREE=`free -m | grep "Mem" | awk '{print $7}'`
-
-if [ $1 = "start" ]
-then
-    if [ $MEM_FREE -lt $MEM_THRESHOLD ]; then
-        echo "[ERROR] pkgship tool does not support memory less than ${MEM_THRESHOLD} MB."
-        exit 0
-    fi
-fi
-
-if [ ! -d "$OUT_PATH" ]; then
-    mkdir $OUT_PATH
-fi
-
-if [ ! -f "$SYS_PATH/package.ini" ]; then
-    echo "[ERROR] $SYS_PATH/package.ini does not exist!!!"
-    exit 0
-fi
-
-user=$(id | awk '{print $2}' | cut -d = -f 2)
-if [ "$user" == "0(root)" ]; then
-    echo "[INFO] Current user is root."
-else
-    echo "[ERROR] Current user is not root."
-    exit 1
-fi
-
-function check_config_file(){
-    echo "[INFO] Check validation of config file."
-    check_null
-
-    echo "[INFO] Check validation of ip addresses."
- write_port=$(get_config "$service" "write_port") - query_port=$(get_config "$service" "query_port") - write_ip_addr=$(get_config "$service" "write_ip_addr") - query_ip_addr=$(get_config "$service" "query_ip_addr") - if [[ -z $write_ip_addr ]]; then - echo "[ERROR] The value of below config names is None in: $SYS_PATH/package.ini, Please check these parameters: write_ip_addr" - exit 1 - else - check_addr $write_ip_addr $write_port - fi - - if [[ -z $query_ip_addr ]]; then - echo "[ERROR] The value of below config names is None in: $SYS_PATH/package.ini, Please check these parameters: query_ip_addr" - exit 1 - else - check_addr $query_ip_addr $query_port - fi - - echo "[INFO] IP addresses are all valid." - - echo "[INFO] Check validation of numbers." - num_vars=(backup_count max_bytes buffer-size http-timeout harakiri hour minute pool_workers) - for var in ${num_vars[@]} - do - value=$(get_config "$service" $var) - if [[ -z "$value" ]]; then - echo "[ERROR] CAN NOT find config name $var in: $SYS_PATH/package.ini, Please check the file." - exit 1 - fi - check_num ${value} ${var} - done - echo "[INFO] All numbers are valid." - - echo "[INFO] Check validation of words." - log_level=$(get_config "$service" "log_level") - open=$(get_config "$service" "open") - check_word "log_level" "INFO|DEBUG|WARNING|ERROR|CRITICAL" $log_level - check_word "open" "True|False" $open - echo "[INFO] All words are valid." - - echo "[INFO] Config file checked valid." - -} - -function check_addr(){ - ip=$1 - ret=1 - if [[ $ip =~ ^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$ ]]; then - ip=(${ip//\./ }) - [[ ${ip[0]} -le 255 && ${ip[1]} -le 255 && ${ip[2]} -le 255 && ${ip[3]} -le 255 ]] - ret=$? 
- fi - if [ $ret -ne 0 ]; then - echo "[ERROR] Invalid ip of $1" - exit 1 - fi - check_num ${2-"port"} "port" - if [[ $2 -gt 65534 || $2 -lt 1025 ]]; then - echo "[ERROR] Invalid port of $2" - exit 1 - fi -} - -function check_null(){ - list=`cat $SYS_PATH/package.ini | grep -E ^[a-z_-]+= | awk -F '=' '{if($2 == "")print $1}'` - num=0 - for val in $list - do - num=$[$num+1] - done - if [ $num -gt 0 ]; then - echo "[ERROR] The value of below config names is None in: $SYS_PATH/package.ini, Please check these parameters:" - for val in $list - do - echo $val - done - exit 1 - fi -} - -function check_num(){ - result=`echo $1 | grep '^[[:digit:]]*$'` - if [ $? -ne 0 ]; then - echo "[ERROR] $2 should be a number." - exit 1 - fi -} - -function check_word(){ - if [ -z $3 ]; then - echo "[ERROR] The value of below config names is None in: $SYS_PATH/package.ini, Please check these parameters: $1" - exit 1 - fi - - result=`echo $3 | grep -wE "$2"` - if [ $? -ne 0 ]; then - echo "[ERROR] $1 should be $2." 
- exit 1 - fi -} - - -function get_config(){ - cat $SYS_PATH/package.ini | grep -E ^$2 | sed 's/[[:space:]]//g' | awk 'BEGIN{FS="="}{print $2}' -} - -function create_config_file(){ - echo "[INFO] config type is: $service" - daemonize=$(get_config "$service" "daemonize") - buffer_size=$(get_config "$service" "buffer-size") - http_timeout=$(get_config "$service" "http-timeout") - harakiri=$(get_config "$service" "harakiri") - uwsgi_file_path=$(find /usr/lib/ -name "packageship" | head -n 1) - echo "[INFO] run packageship under path: $uwsgi_file_path" - if [ $service = "manage" -o $service = "all" ]; then - write_port=$(get_config "$service" "write_port") - write_ip_addr=$(get_config "$service" "write_ip_addr") - if [[ -z "$daemonize" ]] || [[ -z "$buffer_size" ]] || [[ -z "$write_ip_addr" ]] || [[ -z "$http_timeout" ]] || [[ -z "$harakiri" ]] || [[ -z "$write_port" ]]; - then - echo "[ERROR] CAN NOT find all config name in: $SYS_PATH/package.ini, Please check the file" - echo "[ERROR] The following config name is needed: daemonize, buffer-size, write_port, write_ip_addr, harakiri and http-timeout" - exit 1 - fi - if [ -z "$uwsgi_file_path" ];then - echo "[ERROR] CAN NOT find the uwsgi file path under: /usr/lib/" - exit 1 - fi - echo "[INFO] manage.ini is saved to $OUT_PATH/manage.ini" - echo "[uwsgi] -http=$write_ip_addr:$write_port -module=packageship.manage -uwsgi-file=$uwsgi_file_path/manage.py -callable=app -buffer-size=$buffer_size -pidfile=$OUT_PATH/manage.pid -http-timeout=$http_timeout -harakiri=$harakiri -enable-threads=true -daemonize=$daemonize" > $OUT_PATH/manage.ini -chmod 666 $OUT_PATH/manage.ini - fi - - if [ $service = "selfpkg" -o $service = "all" ];then - query_port=$(get_config "$service" "query_port") - query_ip_addr=$(get_config "$service" "query_ip_addr") - - if [[ -z "$daemonize" ]] || [[ -z "$buffer_size" ]] || [[ -z "$query_ip_addr" ]] || [[ -z "$http_timeout" ]] || [[ -z "$harakiri" ]] || [[ -z "$query_port" ]];then - echo "[ERROR] CAN NOT 
find all config name in: $SYS_PATH/package.ini, Please check the file."
-            echo "[ERROR] The following config name is needed: daemonize, buffer_size, query_port, query_ip_addr, harakiri and http-timeout."
-            exit 1
-        fi
-        if [ -z "$uwsgi_file_path" ];then
-            echo "[ERROR] CAN NOT find the uwsgi file path under: /usr/lib/"
-            exit 1
-        fi
-
-        echo "[INFO] selfpkg.ini is saved to: $OUT_PATH/selfpkg.ini"
-        echo "[uwsgi]
-http=$query_ip_addr:$query_port
-module=packageship.selfpkg
-uwsgi-file=$uwsgi_file_path/selfpkg.py
-callable=app
-buffer-size=$buffer_size
-pidfile=$OUT_PATH/selfpkg.pid
-http-timeout=$http_timeout
-harakiri=$harakiri
-enable-threads=true
-daemonize=$daemonize" > $OUT_PATH/selfpkg.ini
-chmod 666 $OUT_PATH/selfpkg.ini
-    fi
-
-    rm -f config_file
-}
-
-function start_service(){
-    if [ "`ps aux | grep "uwsgi" | grep "$1.ini"`" != "" ];then
-        echo "[WARNING] $1 service is running, please STOP it first."
-    else
-        cd $uwsgi_file_path
-        uwsgi -d --ini $OUT_PATH/$1.ini
-        echo "[INFO] START uwsgi service: $1.ini"
-    fi
-}
-
-function stop_service(){
-    if [ ! -f "$OUT_PATH/$1.pid" ]; then
-        echo "[ERROR] STOP service FAILED, $OUT_PATH/$1.pid does not exist."
-        echo "[ERROR] Please stop it manually by using [ps -aux] and [uwsgi --stop #PID]"
-        exit 0
-    fi
-
-    pid=$(cat $OUT_PATH/$1.pid)
-    if [ "`ps aux | awk 'BEGIN{FS=" "}{if ($2=='$pid') print $0}' | grep "$1.ini"`" != "" ];then
-        uwsgi --$2 $OUT_PATH/$1.pid
-        echo "[INFO] STOP uwsgi service: $1.ini"
-    else
-        echo "[WARNING] STOP service [FAILED], Please START the service first."
-    fi
-}
-
-if [ ! -n "$1" ]
-then
-    echo "Usages: sh pkgshipd.sh start|stop [manage|selfpkg]"
-    exit 0
-fi
-
-if [ X$2 = X ];then
-    service="all"
-elif [ $2 = "manage" -o $2 = "selfpkg" ];then
-    service=$2
-else
-    echo "[ERROR] Cannot parse the input of $2!!!"
-    exit 0
-fi
-
-if [ $1 = "start" ]
-then
-    check_config_file
-fi
-
-create_config_file $service
-if [ $?
-ne 0 ];then - exit 0 -fi - - -if [ $1 = start ] -then - if [ $service = "all" ];then - start_service "manage" - start_service "selfpkg" - else - start_service $service - fi - echo "===The run log is saved into: $daemonize===" - -elif [ $1 = stop ];then - if [ $service = "all" ];then - stop_service "manage" "stop" - stop_service "selfpkg" "stop" - else - stop_service $service "stop" - fi - echo "===The run log is saved into: $daemonize===" - -else - echo "Usages: sh pkgshipd.sh start|stop [manage|selfpkg]" -fi diff --git a/packageship/packageship/pylint.conf b/packageship/packageship/pylint.conf deleted file mode 100644 index c2e270f53c4617ee15326e6be669bd97155141d6..0000000000000000000000000000000000000000 --- a/packageship/packageship/pylint.conf +++ /dev/null @@ -1,583 +0,0 @@ -[MASTER] - -# A comma-separated list of package or module names from where C extensions may -# be loaded. Extensions are loading into the active Python interpreter and may -# run arbitrary code. -extension-pkg-whitelist= - -# Add files or directories to the blacklist. They should be base names, not -# paths. -ignore=CVS - -# Add files or directories matching the regex patterns to the blacklist. The -# regex matches against base names, not paths. -ignore-patterns= - -# Python code to execute, usually for sys.path manipulation such as -# pygtk.require(). -#init-hook= - -# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the -# number of processors available to use. -jobs=1 - -# Control the amount of potential inferred values when inferring a single -# object. This can help the performance when dealing with large functions or -# complex, nested conditions. -limit-inference-results=100 - -# List of plugins (as comma separated values of python module names) to load, -# usually to register additional checkers. -load-plugins= - -# Pickle collected data for later comparisons. -persistent=yes - -# Specify a configuration file. 
-#rcfile= - -# When enabled, pylint would attempt to guess common misconfiguration and emit -# user-friendly hints instead of false-positive error messages. -suggestion-mode=yes - -# Allow loading of arbitrary C extensions. Extensions are imported into the -# active Python interpreter and may run arbitrary code. -unsafe-load-any-extension=no - - -[MESSAGES CONTROL] - -# Only show warnings with the listed confidence levels. Leave empty to show -# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED. -confidence= - -# Disable the message, report, category or checker with the given id(s). You -# can either give multiple identifiers separated by comma (,) or put this -# option multiple times (only on the command line, not in the configuration -# file where it should appear only once). You can also use "--disable=all" to -# disable everything first and then reenable specific checks. For example, if -# you want to run only the similarities checker, you can use "--disable=all -# --enable=similarities". If you want to run only the classes checker, but have -# no Warning level messages displayed, use "--disable=all --enable=classes -# --disable=W". 
-disable=print-statement, - parameter-unpacking, - unpacking-in-except, - old-raise-syntax, - backtick, - long-suffix, - old-ne-operator, - old-octal-literal, - import-star-module-level, - non-ascii-bytes-literal, - raw-checker-failed, - bad-inline-option, - locally-disabled, - file-ignored, - suppressed-message, - useless-suppression, - deprecated-pragma, - use-symbolic-message-instead, - apply-builtin, - basestring-builtin, - buffer-builtin, - cmp-builtin, - coerce-builtin, - execfile-builtin, - file-builtin, - long-builtin, - raw_input-builtin, - reduce-builtin, - standarderror-builtin, - unicode-builtin, - xrange-builtin, - coerce-method, - delslice-method, - getslice-method, - setslice-method, - no-absolute-import, - old-division, - dict-iter-method, - dict-view-method, - next-method-called, - metaclass-assignment, - indexing-exception, - raising-string, - reload-builtin, - oct-method, - hex-method, - nonzero-method, - cmp-method, - input-builtin, - round-builtin, - intern-builtin, - unichr-builtin, - map-builtin-not-iterating, - zip-builtin-not-iterating, - range-builtin-not-iterating, - filter-builtin-not-iterating, - using-cmp-argument, - eq-without-hash, - div-method, - idiv-method, - rdiv-method, - exception-message-attribute, - invalid-str-codec, - sys-max-int, - bad-python3-import, - deprecated-string-function, - deprecated-str-translate-call, - deprecated-itertools-function, - deprecated-types-field, - next-method-defined, - dict-items-not-iterating, - dict-keys-not-iterating, - dict-values-not-iterating, - deprecated-operator-function, - deprecated-urllib-function, - xreadlines-attribute, - deprecated-sys-function, - exception-escape, - comprehension-escape, - attribute-defined-outside-init - -# Enable the message, report, category or checker with the given id(s). 
You can -# either give multiple identifier separated by comma (,) or put this option -# multiple time (only on the command line, not in the configuration file where -# it should appear only once). See also the "--disable" option for examples. -enable=c-extension-no-member - - -[REPORTS] - -# Python expression which should return a score less than or equal to 10. You -# have access to the variables 'error', 'warning', 'refactor', and 'convention' -# which contain the number of messages in each category, as well as 'statement' -# which is the total number of statements analyzed. This score is used by the -# global evaluation report (RP0004). -evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10) - -# Template used to display messages. This is a python new-style format string -# used to format the message information. See doc for all details. -#msg-template= - -# Set the output format. Available formats are text, parseable, colorized, json -# and msvs (visual studio). You can also give a reporter class, e.g. -# mypackage.mymodule.MyReporterClass. -output-format=text - -# Tells whether to display a full report or only the messages. -reports=no - -# Activate the evaluation score. -score=yes - - -[REFACTORING] - -# Maximum number of nested blocks for function / method body -max-nested-blocks=5 - -# Complete name of functions that never returns. When checking for -# inconsistent-return-statements if a never returning function is called then -# it will be considered as an explicit return statement and no message will be -# printed. -never-returning-functions=sys.exit - - -[BASIC] - -# Naming style matching correct argument names. -argument-naming-style=snake_case - -# Regular expression matching correct argument names. Overrides argument- -# naming-style. -#argument-rgx= - -# Naming style matching correct attribute names. -attr-naming-style=snake_case - -# Regular expression matching correct attribute names. 
Overrides attr-naming- -# style. -#attr-rgx= - -# Bad variable names which should always be refused, separated by a comma. -bad-names=foo, - bar, - baz, - toto, - tutu, - tata - -# Naming style matching correct class attribute names. -class-attribute-naming-style=any - -# Regular expression matching correct class attribute names. Overrides class- -# attribute-naming-style. -#class-attribute-rgx= - -# Naming style matching correct class names. -class-naming-style=PascalCase - -# Regular expression matching correct class names. Overrides class-naming- -# style. -#class-rgx= - -# Naming style matching correct constant names. -const-naming-style=UPPER_CASE - -# Regular expression matching correct constant names. Overrides const-naming- -# style. -#const-rgx= - -# Minimum line length for functions/classes that require docstrings, shorter -# ones are exempt. -docstring-min-length=-1 - -# Naming style matching correct function names. -function-naming-style=snake_case - -# Regular expression matching correct function names. Overrides function- -# naming-style. -#function-rgx= - -# Good variable names which should always be accepted, separated by a comma. -good-names=i, - j, - k, - ex, - Run, - _ - -# Include a hint for the correct naming format with invalid-name. -include-naming-hint=no - -# Naming style matching correct inline iteration names. -inlinevar-naming-style=any - -# Regular expression matching correct inline iteration names. Overrides -# inlinevar-naming-style. -#inlinevar-rgx= - -# Naming style matching correct method names. -method-naming-style=snake_case - -# Regular expression matching correct method names. Overrides method-naming- -# style. -#method-rgx= - -# Naming style matching correct module names. -module-naming-style=snake_case - -# Regular expression matching correct module names. Overrides module-naming- -# style. -#module-rgx= - -# Colon-delimited sets of names that determine each other's naming style when -# the name regexes allow several styles. 
-name-group= - -# Regular expression which should only match function or class names that do -# not require a docstring. -no-docstring-rgx=^_ - -# List of decorators that produce properties, such as abc.abstractproperty. Add -# to this list to register other decorators that produce valid properties. -# These decorators are taken in consideration only for invalid-name. -property-classes=abc.abstractproperty - -# Naming style matching correct variable names. -variable-naming-style=snake_case - -# Regular expression matching correct variable names. Overrides variable- -# naming-style. -#variable-rgx= - - -[FORMAT] - -# Expected format of line ending, e.g. empty (any line ending), LF or CRLF. -expected-line-ending-format= - -# Regexp for a line that is allowed to be longer than the limit. -ignore-long-lines=^\s*(# )??$ - -# Number of spaces of indent required inside a hanging or continued line. -indent-after-paren=4 - -# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1 -# tab). -indent-string=' ' - -# Maximum number of characters on a single line. -max-line-length=100 - -# Maximum number of lines in a module. -max-module-lines=1000 - -# List of optional constructs for which whitespace checking is disabled. `dict- -# separator` is used to allow tabulation in dicts, etc.: {1 : 1,\n222: 2}. -# `trailing-comma` allows a space between comma and closing bracket: (a, ). -# `empty-line` allows space-only lines. -no-space-check=trailing-comma, - dict-separator - -# Allow the body of a class to be on the same line as the declaration if body -# contains single statement. -single-line-class-stmt=no - -# Allow the body of an if to be on the same line as the test if there is no -# else. -single-line-if-stmt=no - - -[LOGGING] - -# Format style used to check logging format string. `old` means using % -# formatting, `new` is for `{}` formatting,and `fstr` is for f-strings. 
-logging-format-style=old - -# Logging modules to check that the string format arguments are in logging -# function parameter format. -logging-modules=logging - - -[MISCELLANEOUS] - -# List of note tags to take in consideration, separated by a comma. -notes=FIXME, - XXX, - TODO - - -[SIMILARITIES] - -# Ignore comments when computing similarities. -ignore-comments=yes - -# Ignore docstrings when computing similarities. -ignore-docstrings=yes - -# Ignore imports when computing similarities. -ignore-imports=no - -# Minimum lines number of a similarity. -min-similarity-lines=4 - - -[SPELLING] - -# Limits count of emitted suggestions for spelling mistakes. -max-spelling-suggestions=4 - -# Spelling dictionary name. Available dictionaries: none. To make it work, -# install the python-enchant package. -spelling-dict= - -# List of comma separated words that should not be checked. -spelling-ignore-words= - -# A path to a file that contains the private dictionary; one word per line. -spelling-private-dict-file= - -# Tells whether to store unknown words to the private dictionary (see the -# --spelling-private-dict-file option) instead of raising a message. -spelling-store-unknown-words=no - - -[STRING] - -# This flag controls whether the implicit-str-concat-in-sequence should -# generate a warning on implicit string concatenation in sequences defined over -# several lines. -check-str-concat-over-line-jumps=no - - -[TYPECHECK] - -# List of decorators that produce context managers, such as -# contextlib.contextmanager. Add to this list to register other decorators that -# produce valid context managers. -contextmanager-decorators=contextlib.contextmanager - -# List of members which are set dynamically and missed by pylint inference -# system, and so shouldn't trigger E1101 when accessed. Python regular -# expressions are accepted. -generated-members= - -# Tells whether missing members accessed in mixin class should be ignored. 
A -# mixin class is detected if its name ends with "mixin" (case insensitive). -ignore-mixin-members=yes - -# Tells whether to warn about missing members when the owner of the attribute -# is inferred to be None. -ignore-none=yes - -# This flag controls whether pylint should warn about no-member and similar -# checks whenever an opaque object is returned when inferring. The inference -# can return multiple potential results while evaluating a Python object, but -# some branches might not be evaluated, which results in partial inference. In -# that case, it might be useful to still emit no-member and other checks for -# the rest of the inferred objects. -ignore-on-opaque-inference=yes - -# List of class names for which member attributes should not be checked (useful -# for classes with dynamically set attributes). This supports the use of -# qualified names. -ignored-classes=optparse.Values,thread._local,_thread._local - -# List of module names for which member attributes should not be checked -# (useful for modules/projects where namespaces are manipulated during runtime -# and thus existing member attributes cannot be deduced by static analysis). It -# supports qualified module names, as well as Unix pattern matching. -ignored-modules= - -# Show a hint with possible names when a member name was not found. The aspect -# of finding the hint is based on edit distance. -missing-member-hint=yes - -# The minimum edit distance a name should have in order to be considered a -# similar match for a missing member name. -missing-member-hint-distance=1 - -# The total number of similar names that should be taken in consideration when -# showing a hint for a missing member. -missing-member-max-choices=1 - -# List of decorators that change the signature of a decorated function. -signature-mutators= - - -[VARIABLES] - -# List of additional names supposed to be defined in builtins. Remember that -# you should avoid defining new builtins when possible. 
-additional-builtins= - -# Tells whether unused global variables should be treated as a violation. -allow-global-unused-variables=yes - -# List of strings which can identify a callback function by name. A callback -# name must start or end with one of those strings. -callbacks=cb_, - _cb - -# A regular expression matching the name of dummy variables (i.e. expected to -# not be used). -dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_ - -# Argument names that match this expression will be ignored. Default to name -# with leading underscore. -ignored-argument-names=_.*|^ignored_|^unused_ - -# Tells whether we should check for unused import in __init__ files. -init-import=no - -# List of qualified module names which can have objects that can redefine -# builtins. -redefining-builtins-modules=six.moves,past.builtins,future.builtins,builtins,io - - -[CLASSES] - -# List of method names used to declare (i.e. assign) instance attributes. -defining-attr-methods=__init__, - __new__, - setUp, - __post_init__ - -# List of member names, which should be excluded from the protected access -# warning. -exclude-protected=_asdict, - _fields, - _replace, - _source, - _make - _rows - -# List of valid names for the first argument in a class method. -valid-classmethod-first-arg=cls - -# List of valid names for the first argument in a metaclass class method. -valid-metaclass-classmethod-first-arg=cls - - -[DESIGN] - -# Maximum number of arguments for function / method. -max-args=6 - -# Maximum number of attributes for a class (see R0902). -max-attributes=15 - -# Maximum number of boolean expressions in an if statement (see R0916). -max-bool-expr=5 - -# Maximum number of branch for function / method body. -max-branches=12 - -# Maximum number of locals for function / method body. -max-locals=15 - -# Maximum number of parents for a class (see R0901). -max-parents=7 - -# Maximum number of public methods for a class (see R0904). 
-max-public-methods=20 - -# Maximum number of return / yield for function / method body. -max-returns=6 - -# Maximum number of statements in function / method body. -max-statements=50 - -# Minimum number of public methods for a class (see R0903). -min-public-methods=2 - - -[IMPORTS] - -# List of modules that can be imported at any level, not just the top level -# one. -allow-any-import-level= - -# Allow wildcard imports from modules that define __all__. -allow-wildcard-with-all=no - -# Analyse import fallback blocks. This can be used to support both Python 2 and -# 3 compatible code, which means that the block might have code that exists -# only in one or another interpreter, leading to false positives when analysed. -analyse-fallback-blocks=no - -# Deprecated modules which should not be used, separated by a comma. -deprecated-modules=optparse,tkinter.tix - -# Create a graph of external dependencies in the given file (report RP0402 must -# not be disabled). -ext-import-graph= - -# Create a graph of every (i.e. internal and external) dependencies in the -# given file (report RP0402 must not be disabled). -import-graph= - -# Create a graph of internal dependencies in the given file (report RP0402 must -# not be disabled). -int-import-graph= - -# Force import order to recognize a module as part of the standard -# compatibility libraries. -known-standard-library= - -# Force import order to recognize a module as part of a third party library. -known-third-party=enchant - -# Couples of modules and preferred modules, separated by a comma. -preferred-modules= - - -[EXCEPTIONS] - -# Exceptions that will emit a warning when being caught. Defaults to -# "BaseException, Exception". 
-overgeneral-exceptions=BaseException,
-                       Exception
diff --git a/packageship/packageship/selfpkg.py b/packageship/packageship/selfpkg.py
deleted file mode 100644
index 247627c2063584fa6d6a7987ec3c3470586ac96e..0000000000000000000000000000000000000000
--- a/packageship/packageship/selfpkg.py
+++ /dev/null
@@ -1,34 +0,0 @@
-#!/usr/bin/python3
-"""
-Description: Entry for project initialization and service startup
-"""
-import os
-try:
-    from packageship.application import init_app
-    if not os.path.exists(os.environ.get('SETTINGS_FILE_PATH')):
-        raise RuntimeError(
-            'System configuration file:%s' % os.environ.get(
-                'SETTINGS_FILE_PATH'),
-            'does not exist, software service cannot be started')
-    app = init_app("query")
-except ImportError as error:
-    raise RuntimeError(
-        "The package management software service failed to start: %s" % error)
-else:
-    from packageship.application.app_global import identity_verification
-    from packageship.libs.conf import configuration
-
-
-@app.before_request
-def before_request():
-    """
-    Description: Global request interception
-    """
-    if not identity_verification():
-        return 'No right to perform operation'
-
-
-if __name__ == "__main__":
-    port = configuration.QUERY_PORT
-    addr = configuration.QUERY_IP_ADDR
-    app.run(port=port, host=addr)
diff --git a/packageship/packageship/system_config.py b/packageship/packageship/system_config.py
deleted file mode 100644
index 14de44a96e3c3dad4c030e2d4a942a63dfb170c0..0000000000000000000000000000000000000000
--- a/packageship/packageship/system_config.py
+++ /dev/null
@@ -1,33 +0,0 @@
-#!/usr/bin/python3
-"""
-Description: System-level file configuration, mainly configures
-the address of the operating environment, commonly used variables, etc.
-""" - -import os -import sys - - -# The root directory where the system is running -if getattr(sys, 'frozen', False): - BASE_PATH = os.path.dirname(os.path.realpath(sys.argv[0])) -else: - BASE_PATH = os.path.abspath(os.path.dirname(__file__)) - -# system configuration file path - -SYS_CONFIG_PATH = os.path.join('/', 'etc', 'pkgship', 'package.ini') - -# data file after successful data import - -DATABASE_FILE_INFO = os.path.join( - '/', 'var', 'run', 'database_file_info.yaml') - -# If the path of the imported database is not specified in the configuration file, the -# configuration in the system is used by default -DATABASE_FOLDER_PATH = os.path.join('/', 'var', 'run', 'pkgship_dbs') - - -# If the directory of log storage is not configured, -# it will be stored in the following directory specified by the system by default -LOG_FOLDER_PATH = os.path.join('/', 'var', 'log', 'pkgship') diff --git a/packageship/pkgship.spec b/packageship/pkgship.spec deleted file mode 100644 index 606ce5c1a0cad5934bf6f074a26e9937af484a35..0000000000000000000000000000000000000000 --- a/packageship/pkgship.spec +++ /dev/null @@ -1,97 +0,0 @@ -Name: pkgship -Version: 1.1.0 -Release: 3 -Summary: Pkgship implements rpm package dependence ,maintainer, patch query and so no. 
-License: Mulan 2.0
-URL: https://gitee.com/openeuler/openEuler-Advisor
-Source0: https://gitee.com/openeuler/openEuler-Advisor/pkgship-%{version}.tar.gz
-
-BuildArch: noarch
-
-BuildRequires: python3-flask-restful python3-flask python3 python3-pyyaml python3-sqlalchemy
-BuildRequires: python3-prettytable python3-requests python3-flask-session python3-flask-script python3-marshmallow
-BuildRequires: python3-Flask-APScheduler python3-pandas python3-retrying python3-xlrd python3-XlsxWriter
-BuildRequires: python3-concurrent-log-handler
-Requires: python3-pip python3-flask-restful python3-flask python3 python3-pyyaml
-Requires: python3-sqlalchemy python3-prettytable python3-requests python3-concurrent-log-handler
-Requires: python3-flask-session python3-flask-script python3-marshmallow python3-uWSGI
-Requires: python3-pandas python3-dateutil python3-XlsxWriter python3-xlrd python3-Flask-APScheduler python3-retrying
-
-%description
-Pkgship implements RPM package dependency, maintainer, and patch queries, among other features.
-
-%prep
-%autosetup -n pkgship-%{version}
-
-%build
-%py3_build
-
-%install
-%py3_install
-
-
-%check
-# APScheduler cannot detect the local time zone, so one must be assigned before running the test cases.
-export TZ=Asia/Shanghai
-# change log_path to solve default log_path permission denied problem
-log_path=`pwd`/tmp/
-sed -i "/\[LOG\]/a\log_path=$log_path" test/common_files/package.ini
-%{__python3} -m unittest test/init_test.py
-%{__python3} -m unittest test/read_test.py
-%{__python3} -m unittest test/write_test.py
-rm -rf $log_path
-
-%post
-
-%postun
-
-
-%files
-%doc README.md
-%{python3_sitelib}/*
-%attr(0755,root,root) %config %{_sysconfdir}/pkgship/*
-%attr(0755,root,root) %{_bindir}/pkgshipd
-%attr(0755,root,root) %{_bindir}/pkgship
-
-%changelog
-* Fri Sep 11 2020 Yiru Wang - 1.1.0-3
-- #I1UCM8, #I1UC8G: Fix some config files' permission issues;
-- #I1TIYQ: Add concurrent-log-handler module to fix log resource conflict issue
-- #I1TML0: Fix the matching relationship between source_rpm and src_name
-
-* Tue Sep 1 2020 Zhengtang Gong - 1.1.0-2
-- Delete the packaged form of pyinstaller and change the execution
-  of the command in the form of a single file as the input
-
-* Sat Aug 29 2020 Yiru Wang - 1.1.0-1
-- Add package management features:
-  RPM packages statically displayed in the version repository
-  RPM packages used time displayed for current version in the version repository
-  Issue management of packages in a version-management repository
-
-* Fri Aug 21 2020 Chengqiang Bao <baochengqiang1@huawei.com> - 1.0.0-7
-- Fixed a problem with command line initialization of the Filepath parameter where relative paths are not supported and paths are too long
-
-* Wed Aug 12 2020 Zhang Tao - 1.0.0-6
-- Fix the test content to adapt to the new data structure, add BuildRequires for running %check
-
-* Mon Aug 10 2020 Zhengtang Gong - 1.0-5
-- Command line supports calling remote services
-
-* Wed Aug 5 2020 Yiru Wang - 1.0-4
-- change Requires rpm packages' names to the latest ones
-
-* Mon Jul 13 2020 Yiru Wang - 1.0-3
-- run test cases while building
-
-* Sat Jul 4 2020 Yiru Wang - 1.0-2
-- change requires python3.7 to python3, add check pyinstaller file.
- -* Tue Jun 30 2020 Yiru Wang - 1.0-1 -- add pkgshipd file - -* Thu Jun 11 2020 Feng Hu - 1.0-0 -- add macro to build cli bin when rpm install - -* Sat Jun 6 2020 Feng Hu - 1.0-0 -- init package diff --git a/packageship/setup.py b/packageship/setup.py deleted file mode 100644 index 4e74105ed51f83d49a3c74d20438e699d2f75ce0..0000000000000000000000000000000000000000 --- a/packageship/setup.py +++ /dev/null @@ -1,68 +0,0 @@ -#!/usr/bin/python3 -""" -Package management program installation configuration -file for software packaging -""" -from distutils.core import setup - -_CONFIG_PATH = "/etc/pkgship/" - -setup( - name='packageship', - version='1.0', - py_modules=[ - 'packageship.application.__init__', - 'packageship.application.app_global', - 'packageship.application.apps.__init__', - 'packageship.application.apps.package.serialize', - 'packageship.application.apps.package.url', - 'packageship.application.apps.package.view', - 'packageship.application.apps.package.function.be_depend', - 'packageship.application.apps.package.function.build_depend', - 'packageship.application.apps.package.function.constants', - 'packageship.application.apps.package.function.install_depend', - 'packageship.application.apps.package.function.packages', - 'packageship.application.apps.package.function.searchdb', - 'packageship.application.apps.package.function.self_depend', - 'packageship.application.apps.lifecycle.function.download_yaml', - 'packageship.application.apps.lifecycle.function.gitee', - 'packageship.application.apps.lifecycle.function.concurrent', - 'packageship.application.apps.lifecycle.serialize', - 'packageship.application.apps.lifecycle.url', - 'packageship.application.apps.lifecycle.view', - 'packageship.application.apps.dependinfo.function.singlegraph', - 'packageship.application.apps.dependinfo.function.graphcache', - 'packageship.application.apps.dependinfo.serialize', - 'packageship.application.apps.dependinfo.url', - 'packageship.application.apps.dependinfo.view', - 
'packageship.application.initsystem.data_import', - 'packageship.application.models.package', - 'packageship.application.settings', - 'packageship.libs.__init__', - 'packageship.libs.configutils.readconfig', - 'packageship.libs.dbutils.sqlalchemy_helper', - 'packageship.libs.exception.ext', - 'packageship.libs.log.loghelper', - 'packageship.libs.conf.global_config', - 'packageship.manage', - 'packageship.pkgship', - 'packageship.selfpkg', - 'packageship.system_config'], - requires=['prettytable (==0.7.2)', - 'Flask_RESTful (==0.3.8)', - 'Flask_Session (==0.3.1)', - 'Flask_Script (==2.0.6)', - 'Flask (==1.1.2)', - 'marshmallow (==3.5.1)', - 'SQLAlchemy (==1.3.16)', - 'PyYAML (==5.3.1)', - 'requests (==2.21.0)', - 'pyinstall (==0.1.4)', - 'uwsgi (==2.0.18)'], - license='Dependency package management', - long_description=open('README.md', encoding='utf-8').read(), - author='gongzt', - data_files=[ - (_CONFIG_PATH, ['packageship/package.ini', 'conf.yaml']), - ('/usr/bin', ['packageship/pkgshipd', 'packageship/pkgship'])] -) diff --git a/packageship/test/__init__.py b/packageship/test/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/packageship/test/base_code/__init__.py b/packageship/test/base_code/__init__.py deleted file mode 100644 index 8b137891791fe96927ad78e64b0aad7bded08bdc..0000000000000000000000000000000000000000 --- a/packageship/test/base_code/__init__.py +++ /dev/null @@ -1 +0,0 @@ - diff --git a/packageship/test/base_code/basetest.py b/packageship/test/base_code/basetest.py deleted file mode 100644 index efc030c4b29efd3669d091ccaff0596a3c3fe289..0000000000000000000000000000000000000000 --- a/packageship/test/base_code/basetest.py +++ /dev/null @@ -1,16 +0,0 @@ -#!/usr/bin/python3 -import unittest - - -class TestBase(unittest.TestCase): - """ - unittest unit test basic class - """ - - def response_json_format(self, response): - """ - Json data judgment of corresponding 
content
-        """
-        self.assertIn("code", response, msg="Error in data format return")
-        self.assertIn("msg", response, msg="Error in data format return")
-        self.assertIn("data", response, msg="Error in data format return")
diff --git a/packageship/test/base_code/common_test_code.py b/packageship/test/base_code/common_test_code.py
deleted file mode 100644
index ba8950ccdbc93b4e3844145d17236b54dc1d14f2..0000000000000000000000000000000000000000
--- a/packageship/test/base_code/common_test_code.py
+++ /dev/null
@@ -1,49 +0,0 @@
-#!/usr/bin/python3
-# -*- coding:utf-8 -*-
-"""
-Compare the values in two Python data types for equality, ignoring the order of values
-"""
-
-import os
-import json
-from packageship.system_config import BASE_PATH
-
-
-def compare_two_values(obj1, obj2):
-    """
-
-    Args:
-        obj1: object1, any Python value that can be
-        converted by using the str() method
-        obj2: object2, same as obj1
-
-    Returns: True or False
-
-    """
-    # Compare directly first; otherwise compare the sorted str() forms to ignore ordering
-
-    return obj1 == obj2 or (isinstance(obj1, type(obj2)) and
-                            "".join(sorted(str(obj1))) == "".join(sorted(str(obj2))))
-
-
-def get_correct_json_by_filename(filename):
-    """
-
-    Args:
-        filename: Correct JSON file name without suffix
-
-    Returns: list of this JSON file's content
-
-    """
-    json_path = os.path.join(os.path.dirname(BASE_PATH),
-                             "test",
-                             "common_files",
-                             "correct_test_result_json",
-                             "{}.json".format(filename))
-    try:
-        with open(json_path, "r", encoding='utf-8') as json_fp:
-            correct_list = json.loads(json_fp.read())
-    except FileNotFoundError:
-        return []
-
-    return correct_list
diff --git a/packageship/test/base_code/init_config_path.py b/packageship/test/base_code/init_config_path.py
deleted file mode 100644
index 7f04303df66f912f93620f7c370052b6e5ecae7a..0000000000000000000000000000000000000000
--- a/packageship/test/base_code/init_config_path.py
+++ /dev/null
@@ -1,46 +0,0 @@
-#!/usr/bin/python3
-# -*- coding:utf-8 -*-
-"""
-InitConf
-"""
-import os
-from configparser import ConfigParser
-from packageship import system_config
-import yaml
-
-
-class InitConf:
-    """
-    InitConf
-    """
-
-    def __init__(self):
-        base_path = os.path.join(os.path.dirname(system_config.BASE_PATH),
-                                 "test",
-                                 "common_files")
-        config = ConfigParser()
-        config.read(system_config.SYS_CONFIG_PATH)
-
-        conf_path = os.path.join(base_path, "conf.yaml")
-
-        config.set("SYSTEM", "init_conf_path", conf_path)
-        config.write(open(system_config.SYS_CONFIG_PATH, "w"))
-
-        with open(conf_path, 'r', encoding='utf-8') as f:
-            origin_yaml = yaml.load(f.read(), Loader=yaml.FullLoader)
-
-        for index, obj in enumerate(origin_yaml, 1):
-            src_path = os.path.join(base_path, "db_origin",
-                                    "data_{}_src.sqlite".format(str(index)))
-            bin_path = os.path.join(base_path, "db_origin",
-                                    "data_{}_bin.sqlite".format(str(index)))
-            obj["src_db_file"] = src_path
-            obj["bin_db_file"] = bin_path
-        with open(conf_path, 'w', encoding='utf-8') as w_f:
-            yaml.dump(origin_yaml, w_f)
-
-
-# Simple singleton: instantiate once at import time to
-# prevent the config files from being modified repeatedly
-
-init_config = InitConf()
diff --git a/packageship/test/base_code/my_test_runner.py b/packageship/test/base_code/my_test_runner.py
deleted file mode 100644
index 4f506ba686c46f7c6d942ef0427e36e9b737fc0f..0000000000000000000000000000000000000000
--- a/packageship/test/base_code/my_test_runner.py
+++ /dev/null
@@ -1,61 +0,0 @@
-#!/usr/bin/python3
-"""
-Inherited from unittest.TestResult;
-implements simple pass/fail statistics.
-"""
-import sys
-import unittest
-
-
-class MyTestResult(unittest.TestResult):
-    """
-    Inherited from unittest.TestResult;
-    implements simple pass/fail statistics.
- """ - - def __init__(self, verbosity=0): - super(MyTestResult, self).__init__() - self.success_case_count = 0 - self.err_case_count = 0 - self.failure_case_count = 0 - self.verbosity = verbosity - - def addSuccess(self, test): - """When the use case is executed successfully""" - self.success_case_count += 1 - super(MyTestResult, self).addSuccess(test) - sys.stderr.write('Success ') - sys.stderr.write(str(test)) - sys.stderr.write('\n') - - def addError(self, test, err): - """When a code error causes a use case to fail""" - self.err_case_count += 1 - super(MyTestResult, self).addError(test, err) - sys.stderr.write('Error ') - sys.stderr.write(str(test)+'\n') - _,err_info = self.errors[-1] - sys.stderr.write(err_info) - sys.stderr.write('\n') - - def addFailure(self, test, err): - """When the assertion is false""" - self.failure_case_count += 1 - super(MyTestResult, self).addFailure(test, err) - sys.stderr.write('Failure ') - sys.stderr.write(str(test) + '\n') - _, err_info = self.failures[-1] - sys.stderr.write(err_info) - sys.stderr.write('\n') - - -class MyTestRunner(): - """ - Run All TestCases - """ - - def run(self, test): - """run MyTestResult and return result""" - result = MyTestResult() - test(result) - return result diff --git a/packageship/test/base_code/operate_data_base.py b/packageship/test/base_code/operate_data_base.py deleted file mode 100644 index bbdee42123e3ad2d9b8a781ac2820d531cbd12c7..0000000000000000000000000000000000000000 --- a/packageship/test/base_code/operate_data_base.py +++ /dev/null @@ -1,40 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- -""" -OperateTestBase -""" -import os -import unittest - -from packageship.libs.exception import Error - -try: - from packageship import system_config - - system_config.SYS_CONFIG_PATH = os.path.join(os.path.dirname(system_config.BASE_PATH), - 'test', - 'common_files', - 'package.ini') - - system_config.DATABASE_FILE_INFO = os.path.join(os.path.dirname(system_config.BASE_PATH), - 'test', - 
'common_files', - 'database_file_info.yaml') - system_config.DATABASE_FOLDER_PATH = os.path.join(os.path.dirname(system_config.BASE_PATH), - 'test', - 'common_files', - 'operate_dbs') - - from test.base_code.init_config_path import init_config - from packageship.manage import app -except Error: - raise - - -class OperateTestBase(unittest.TestCase): - """ - OperateTestBase - """ - - def setUp(self): - self.client = app.test_client() diff --git a/packageship/test/base_code/read_data_base.py b/packageship/test/base_code/read_data_base.py deleted file mode 100644 index 8e6a4c0fa996a961c49ffdfa941621f95b49597f..0000000000000000000000000000000000000000 --- a/packageship/test/base_code/read_data_base.py +++ /dev/null @@ -1,44 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- -import os -import unittest -import json -from .basetest import TestBase - -from packageship.libs.exception import Error -try: - from packageship import system_config - - system_config.SYS_CONFIG_PATH = os.path.join(os.path.dirname(system_config.BASE_PATH), - 'test', - 'common_files', - 'package.ini') - - system_config.DATABASE_FILE_INFO = os.path.join(os.path.dirname(system_config.BASE_PATH), - 'test', - 'common_files', - 'database_file_info.yaml') - system_config.DATABASE_FOLDER_PATH = os.path.join(os.path.dirname(system_config.BASE_PATH), - 'test', - 'common_files', - 'dbs') - - from test.base_code.init_config_path import init_config - from packageship.selfpkg import app - -except Error: - raise - - -class ReadTestBase(TestBase): - - def client_request(self, url): - """ - Simulate client sending request - """ - response = self.client.get(url) - response_content = json.loads(response.data) - return response_content - - def setUp(self): - self.client = app.test_client() diff --git a/packageship/test/common_files/conf.yaml b/packageship/test/common_files/conf.yaml deleted file mode 100644 index 04dffd345879fcee9778f6097c607b10278326c3..0000000000000000000000000000000000000000 --- 
a/packageship/test/common_files/conf.yaml +++ /dev/null @@ -1,10 +0,0 @@ -- bin_db_file: - dbname: mainline - priority: 1 - src_db_file: - lifecycle: enable -- bin_db_file: - dbname: fedora30 - priority: 2 - src_db_file: - lifecycle: enable diff --git a/packageship/test/common_files/correct_test_result_json/be_depend.json b/packageship/test/common_files/correct_test_result_json/be_depend.json deleted file mode 100644 index 33423547638d23f53feaf5394298e45f3c08c2f4..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/be_depend.json +++ /dev/null @@ -1,153 +0,0 @@ -[ - { - "input": { - "packagename": "A", - "dbname": "mainline" - }, - "output": { - "code": "2001", - "data": { - "A1": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "B", - "build" - ], - [ - "B1", - "install" - ], - [ - "D", - "build" - ], - [ - "D1", - "install" - ] - ] - ], - "A2": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "A1", - "install" - ], - [ - "C", - "build" - ], - [ - "C1", - "install" - ] - ] - ], - "A_src": [ - "source", - "0.0.23b", - "mainline", - [ - [ - "root", - null - ] - ] - ], - "B1": [ - "B", - "0.0.2", - "mainline", - [ - [ - "A", - "build" - ], - [ - "D", - "build" - ] - ] - ], - "B2": [ - "B", - "0.0.2", - "mainline", - [ - [ - "C", - "build" - ] - ] - ], - "C1": [ - "C", - "0.1", - "mainline", - [ - [ - "A", - "build" - ], - [ - "A2", - "install" - ], - [ - "B", - "build" - ], - [ - "B2", - "install" - ] - ] - ], - "C2": [ - "C", - "0.1", - "mainline", - [ - [ - null, - null - ] - ] - ], - "D1": [ - "D", - "0.11", - "mainline", - [ - [ - "A2", - "install" - ], - [ - "D2", - "install" - ] - ] - ], - "D2": [ - "D", - "0.11", - "mainline", - [ - [ - null, - null - ] - ] - ] - }, - "msg": "Successful Operation!" 
- } - } -] \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/build_depend.json b/packageship/test/common_files/correct_test_result_json/build_depend.json deleted file mode 100644 index 945d58fa1f8b3f0e117506c5a6de4976a6ac939c..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/build_depend.json +++ /dev/null @@ -1,94 +0,0 @@ -[ - { - "input": { - "sourceName": "A" - }, - "output": { - "code": "2001", - "data": { - "build_dict": { - "A_src": [ - "source", - "0.0.23b", - "mainline", - [ - [ - "root", - null - ] - ] - ], - "B1": [ - "B", - "0.0.2", - "mainline", - [ - [ - "A", - "build" - ] - ] - ], - "C1": [ - "C", - "0.1", - "mainline", - [ - [ - "A", - "build" - ], - [ - "A2", - "install" - ] - ] - ], - "A1": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "B1", - "install" - ], - [ - "D1", - "install" - ] - ] - ], - "A2": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "A1", - "install" - ], - [ - "C1", - "install" - ] - ] - ], - "D1": [ - "D", - "0.11", - "mainline", - [ - [ - "A2", - "install" - ] - ] - ] - }, - "not_found_components": [] - }, - "msg": "Successful Operation!" 
- } - } -] \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/dependinfo_be_depend.json b/packageship/test/common_files/correct_test_result_json/dependinfo_be_depend.json deleted file mode 100644 index f0416fe6553237f2ba528b0f1af7e89d8c4c4d3f..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/dependinfo_be_depend.json +++ /dev/null @@ -1,81 +0,0 @@ -[ - { - "input": { - "packagename": "A", - "dbname": "mainline" - }, - "output": { - "code": "2001", - "data": { - "binary_dicts": [ - { - "binary_name": "A1", - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "binary_name": "A2", - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "binary_name": "B1", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - }, - { - "binary_name": "B2", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - }, - { - "binary_name": "C1", - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "binary_name": "D1", - "database": "mainline", - "source_name": "D", - "version": "0.11" - } - ], - "source_dicts": [ - { - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - }, - { - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "database": "mainline", - "source_name": "D", - "version": "0.11" - } - ], - "statistics": [ - { - "binary_num": 6, - "database": "mainline", - "source_num": 4 - } - ] - }, - "msg": "Successful Operation!" 
- } - } -] \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/dependinfo_self_depend.json b/packageship/test/common_files/correct_test_result_json/dependinfo_self_depend.json deleted file mode 100644 index d85050b2ecf611209ba32385260756304fb5040d..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/dependinfo_self_depend.json +++ /dev/null @@ -1,334 +0,0 @@ -[ - { - "input": { - "packagename": "A1" - }, - "output": { - "code": "2001", - "data": { - "binary_dicts": [ - { - "binary_name": "A2", - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "binary_name": "C1", - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "binary_name": "D1", - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "binary_name": "B1", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - }, - { - "binary_name": "B2", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - } - ], - "source_dicts": [ - { - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - } - ], - "statistics": [ - { - "binary_num": 5, - "database": "mainline", - "source_num": 4 - }, - { - "binary_num": 0, - "database": "fedora30", - "source_num": 0 - } - ] - }, - "msg": "Successful Operation!" 
- } - }, - { - "input": { - "packagename": "C", - "packtype": "source" - }, - "output": { - "code": "2001", - "data": { - "binary_dicts": [ - { - "binary_name": "A2", - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "binary_name": "D1", - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "binary_name": "A1", - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "binary_name": "B1", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - }, - { - "binary_name": "B2", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - } - ], - "source_dicts": [ - { - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - } - ], - "statistics": [ - { - "binary_num": 5, - "database": "mainline", - "source_num": 4 - }, - { - "binary_num": 0, - "database": "fedora30", - "source_num": 0 - } - ] - }, - "msg": "Successful Operation!" 
- } - }, - { - "input": { - "packagename": "A2", - "selfbuild": "0", - "withsubpack": "1" - }, - "output": { - "code": "2001", - "data": { - "binary_dicts": [ - { - "binary_name": "C1", - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "binary_name": "D1", - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "binary_name": "A1", - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "binary_name": "C2", - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "binary_name": "D2", - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "binary_name": "B1", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - }, - { - "binary_name": "B2", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - } - ], - "source_dicts": [ - { - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - } - ], - "statistics": [ - { - "binary_num": 7, - "database": "mainline", - "source_num": 4 - }, - { - "binary_num": 0, - "database": "fedora30", - "source_num": 0 - } - ] - }, - "msg": "Successful Operation!" 
- } - }, - { - "input": { - "packagename": "A", - "selfbuild": "0", - "withsubpack": "1", - "packtype": "source" - }, - "output": { - "code": "2001", - "data": { - "binary_dicts": [ - { - "binary_name": "D1", - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "binary_name": "C1", - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "binary_name": "C2", - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "binary_name": "D2", - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "binary_name": "B1", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - }, - { - "binary_name": "B2", - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - } - ], - "source_dicts": [ - { - "database": "mainline", - "source_name": "A", - "version": "0.0.23b" - }, - { - "database": "mainline", - "source_name": "D", - "version": "0.11" - }, - { - "database": "mainline", - "source_name": "C", - "version": "0.1" - }, - { - "database": "mainline", - "source_name": "B", - "version": "0.0.2" - } - ], - "statistics": [ - { - "binary_num": 6, - "database": "mainline", - "source_num": 4 - }, - { - "binary_num": 0, - "database": "fedora30", - "source_num": 0 - } - ] - }, - "msg": "Successful Operation!" 
- } - } -] \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/get_repodatas.json b/packageship/test/common_files/correct_test_result_json/get_repodatas.json deleted file mode 100644 index c977baa482b87e8d8d400469974ff312ec65f5bb..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/get_repodatas.json +++ /dev/null @@ -1,10 +0,0 @@ -[ - { - "name": "mainline", - "priority": 1 - }, - { - "name": "fedora30", - "priority": 2 - } -] \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/get_single_package.json b/packageship/test/common_files/correct_test_result_json/get_single_package.json deleted file mode 100644 index 32656ba744299405c9104e4ce79ff96d72a6e731..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/get_single_package.json +++ /dev/null @@ -1,77 +0,0 @@ -{ - "buildrequired": [ - "B1", - "C1", - "C1" - ], - "description": "0 A.D. (pronounced \"zero ey-dee\") is a free, open-source, cross-platform\nreal-time strategy (RTS) game of ancient warfare. In short, it is a\nhistorically-based war/economy game that allows players to relive or rewrite\nthe history of Western civilizations, focusing on the years between 500 B.C.\nand 500 A.D. 
The project is highly ambitious, involving state-of-the-art 3D\ngraphics, detailed artwork, sound, and a flexible and powerful custom-built\ngame engine.\n\nThe game has been in development by Wildfire Games (WFG), a group of volunteer,\nhobbyist game developers, since 2001.", - "feature": 0, - "gitee_url": "www.gitee.com/src-openeuler/A", - "issue": 0, - "license": "GPLv2+ and BSD and MIT and IBM", - "maintainer": null, - "maintainlevel": null, - "pkg_name": "A", - "release": "1.fc29", - "subpack": [ - { - "id": 1, - "name": "A1", - "provides": [ - { - "id": 4, - "name": "Ba", - "requiredby": [ - "B1" - ] - }, - { - "id": 7, - "name": "Da", - "requiredby": [ - "D1" - ] - } - ], - "requires": [ - { - "id": 1, - "name": "Ac", - "providedby": [ - "C1" - ] - } - ] - }, - { - "id": 2, - "name": "A2", - "provides": [ - { - "id": 6, - "name": "Ca", - "requiredby": [] - } - ], - "requires": [ - { - "id": 2, - "name": "Ac", - "providedby": [ - "C1" - ] - }, - { - "id": 3, - "name": "Bc", - "providedby": [ - "C1" - ] - } - ] - } - ], - "summary": "Cross-Platform RTS Game of Ancient Warfare", - "url": "http://play0ad.com", - "version": "0.0.23b" - } \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/install_depend.json b/packageship/test/common_files/correct_test_result_json/install_depend.json deleted file mode 100644 index 34d0982156ae0ff0cb22010998d4de46163de018..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/install_depend.json +++ /dev/null @@ -1,170 +0,0 @@ -[ - { - "input": { - "binaryName": "A1" - }, - "output": { - "code": "2001", - "data": { - "install_dict": { - "A1": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "root", - null - ], - [ - "D1", - "install" - ] - ] - ], - "A2": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "A1", - "install" - ], - [ - "C1", - "install" - ] - ] - ], - "C1": [ - "C", - "0.1", - "mainline", - [ - [ - "A2", - "install" - ] - ] - ], - 
"D1": [ - "D", - "0.11", - "mainline", - [ - [ - "A2", - "install" - ] - ] - ] - }, - "not_found_components": [] - }, - "msg": "Successful Operation!" - } - }, - { - "input": { - "binaryName": "D2" - }, - "output": { - "code": "2001", - "data": { - "install_dict": { - "A1": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "D1", - "install" - ] - ] - ], - "A2": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "A1", - "install" - ], - [ - "C1", - "install" - ] - ] - ], - "C1": [ - "C", - "0.1", - "mainline", - [ - [ - "A2", - "install" - ] - ] - ], - "D1": [ - "D", - "0.11", - "mainline", - [ - [ - "D2", - "install" - ], - [ - "A2", - "install" - ] - ] - ], - "D2": [ - "D", - "0.11", - "mainline", - [ - [ - "root", - null - ] - ] - ] - }, - "not_found_components": [] - }, - "msg": "Successful Operation!" - } - }, - { - "input": { - "binaryName": "C2" - }, - "output": { - "code": "2001", - "data": { - "install_dict": { - "C2": [ - "C", - "0.1", - "mainline", - [ - [ - "root", - null - ] - ] - ] - }, - "not_found_components": [] - }, - "msg": "Successful Operation!" 
- } - } -] \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/issues.json b/packageship/test/common_files/correct_test_result_json/issues.json deleted file mode 100644 index c276341f7a87932440002680aff912a41627549d..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/issues.json +++ /dev/null @@ -1,152 +0,0 @@ -[ - { - "issue_type": [ - "需求", - "CVE和安全问题", - "缺陷" - ] - }, - { - "issue_status": [ - "open", - "progressing", - "closed", - "rejected" - ] - }, - { - "input": { - "hook_name": "issue_hooks", - "password": "pwd", - "hook_id": 1, - "hook_url": "http://gitee.com/liwen/gitos/hooks/1/edit", - "timestamp": "1576754827988", - "sign": "rLEHLuZRIQHuTPeXMib9Czoq9dVXO4TsQcmQQHtjXHA=", - "issue": { - "html_url": "https://gitee.com/oschina/gitee/issues/IG6E9", - "id": 295024870, - "number": "IG8E1", - "title": "IE fault", - "body": "js fault", - "issue_type": "缺陷", - "state": "open", - "comments": 0, - "created_at": "2018-02-07T23:46:46+08:00", - "updated_at": "2018-02-07T23:46:46+08:00", - "user": { - "id": 1, - "login": "robot", - "avatar_url": "https://gitee.com/assets/favicon.ico", - "html_url": "https://gitee.com/robot", - "type": "User", - "site_admin": false, - "name": "robot", - "email": "robot@gitee.com", - "username": "robot", - "user_name": "robot", - "url": "https://gitee.com/robot" - }, - "labels": [ - { - "id": 827033694, - "name": "bug", - "color": "d73a4a" - } - ], - "assignee": { - "id": 1, - "login": "robot", - "avatar_url": "https://gitee.com/assets/favicon.ico", - "html_url": "https://gitee.com/robot", - "type": "User", - "site_admin": false, - "name": "robot", - "email": "robot@gitee.com", - "username": "robot", - "user_name": "robot", - "url": "https://gitee.com/robot" - }, - "milestone": { - "html_url": "https://gitee.com/oschina/gitee/milestones/1", - "id": 3096855, - "number": 1, - "title": "problem", - "description": null, - "open_issues": 13, - 
"started_issues": 6, - "closed_issues": 31, - "approved_issues": 42, - "state": "open", - "created_at": "2018-02-01T23:46:46+08:00", - "updated_at": "2018-02-02T23:46:46+08:00", - "due_on": null - } - }, - "repository": { - "id": 120249025, - "name": "Gitee", - "path": "Cython", - "full_name": "China/Gitee", - "owner": { - "id": 1, - "login": "robot", - "avatar_url": "https://gitee.com/assets/favicon.ico", - "html_url": "https://gitee.com/robot", - "type": "User", - "site_admin": false, - "name": "robot", - "email": "robot@gitee.com", - "username": "robot", - "user_name": "robot", - "url": "https://gitee.com/robot" - }, - "private": false, - "html_url": "https://gitee.com/oschina/gitee", - "url": "https://gitee.com/oschina/gitee", - "description": "", - "fork": false, - "created_at": "2018-02-05T23:46:46+08:00", - "updated_at": "2018-02-05T23:46:46+08:00", - "pushed_at": "2018-02-05T23:46:46+08:00", - "git_url": "git://gitee.com:oschina/gitee.git", - "ssh_url": "git@gitee.com:oschina/gitee.git", - "clone_url": "https://gitee.com/oschina/gitee.git", - "svn_url": "svn://gitee.com/oschina/gitee", - "git_http_url": "https://gitee.com/oschina/gitee.git", - "git_ssh_url": "git@gitee.com:oschina/gitee.git", - "git_svn_url": "svn://gitee.com/oschina/gitee", - "homepage": null, - "stargazers_count": 11, - "watchers_count": 12, - "forks_count": 0, - "language": "ruby", - "has_issues": true, - "has_wiki": true, - "has_pages": false, - "license": null, - "open_issues_count": 0, - "default_branch": "master", - "namespace": "oschina", - "name_with_namespace": "China/Gitee", - "path_with_namespace": "oschina/gitee" - }, - "sender": { - "id": 1, - "login": "robot", - "avatar_url": "https://gitee.com/assets/favicon.ico", - "html_url": "https://gitee.com/robot", - "type": "User", - "site_admin": false, - "name": "robot", - "email": "robot@gitee.com", - "username": "robot", - "user_name": "robot", - "url": "https://gitee.com/robot" - }, - "enterprise": { - "name": "oschina", - "url": 
"https://gitee.com/oschina" - } -} - } -] \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/packages.json b/packageship/test/common_files/correct_test_result_json/packages.json deleted file mode 100644 index dbcd63459aac26aa82f549b3c284acf7d315735e..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/packages.json +++ /dev/null @@ -1,18 +0,0 @@ -[ - { - "release": "1.fc29", - "used_time": 0, - "maintainer": null, - "feature": 0, - "latest_version": null, - "latest_version_time": null, - "maintainlevel": null, - "url": "http://play0ad.com", - "release_time": null, - "id": 1, - "version": "0.0.23b", - "name": "A", - "rpm_license": "GPLv2+ and BSD and MIT and IBM", - "issue": 0 - } -] \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/packages_like_src.json b/packageship/test/common_files/correct_test_result_json/packages_like_src.json deleted file mode 100644 index 54c3fff8fea415f4a56b489585c3795688016edd..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/packages_like_src.json +++ /dev/null @@ -1,18 +0,0 @@ -[ - { - "id": 1, - "name": "A", - "maintainer": null, - "maintainlevel": null, - "release": "1.fc29", - "rpm_version": "0.0.23b", - "used_time": 0, - "release_time": null, - "latest_version_time": null, - "latest_version": null, - "feature": 0, - "url": "http://play0ad.com", - "license": "GPLv2+ and BSD and MIT and IBM", - "issue": 0 - } - ] \ No newline at end of file diff --git a/packageship/test/common_files/correct_test_result_json/self_depend.json b/packageship/test/common_files/correct_test_result_json/self_depend.json deleted file mode 100644 index 260aedb47209ab4216c6d563d20f985ffe45ced5..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/correct_test_result_json/self_depend.json +++ /dev/null @@ -1,515 +0,0 @@ -[ - { - "input": { - 
"packagename": "A1" - }, - "output": { - "code": "2001", - "data": { - "binary_dicts": { - "A1": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "root", - null - ], - [ - "D1", - "install" - ] - ] - ], - "A2": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "A1", - "install" - ], - [ - "C1", - "install" - ] - ] - ], - "B1": [ - "B", - "0.0.2", - "mainline", - [ - [ - "A", - "build" - ], - [ - "D", - "build" - ] - ] - ], - "B2": [ - "B", - "0.0.2", - "mainline", - [ - [ - "C", - "build" - ] - ] - ], - "C1": [ - "C", - "0.1", - "mainline", - [ - [ - "A2", - "install" - ] - ] - ], - "D1": [ - "D", - "0.11", - "mainline", - [ - [ - "A2", - "install" - ] - ] - ] - }, - "source_dicts": { - "A": [ - "mainline", - "0.0.23b" - ], - "B": [ - "mainline", - "0.0.2" - ], - "C": [ - "mainline", - "0.1" - ], - "D": [ - "mainline", - "0.11" - ] - }, - "not_found_components": [] - }, - "msg": "Successful Operation!" - } - }, - { - "input": { - "packagename": "C", - "packtype": "source" - }, - "output": { - "code": "2001", - "data": { - "binary_dicts": { - "A1": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "D1", - "install" - ] - ] - ], - "A2": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "C1", - "install" - ], - [ - "A1", - "install" - ] - ] - ], - "B1": [ - "B", - "0.0.2", - "mainline", - [ - [ - "A", - "build" - ], - [ - "D", - "build" - ] - ] - ], - "B2": [ - "B", - "0.0.2", - "mainline", - [ - [ - "C", - "build" - ] - ] - ], - "C1": [ - "C", - "0.1", - "mainline", - [ - [ - "root", - null - ], - [ - "A2", - "install" - ] - ] - ], - "C2": [ - "C", - "0.1", - "mainline", - [ - [ - "root", - null - ] - ] - ], - "D1": [ - "D", - "0.11", - "mainline", - [ - [ - "A2", - "install" - ] - ] - ] - }, - "source_dicts": { - "A": [ - "mainline", - "0.0.23b" - ], - "B": [ - "mainline", - "0.0.2" - ], - "C": [ - "mainline", - "0.1" - ], - "D": [ - "mainline", - "0.11" - ] - }, - "not_found_components": [] - }, - "msg": "Successful Operation!" 
- } - }, - { - "input": { - "packagename": "A2", - "selfbuild": "0", - "withsubpack": "1" - }, - "output": { - "code": "2001", - "data": { - "binary_dicts": { - "A1": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "D1", - "install" - ] - ] - ], - "A2": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "root", - null - ], - [ - "C1", - "install" - ], - [ - "A1", - "install" - ] - ] - ], - "B1": [ - "B", - "0.0.2", - "mainline", - [ - [ - "A", - "build" - ], - [ - "D", - "build" - ] - ] - ], - "B2": [ - "B", - "0.0.2", - "mainline", - [ - [ - "C", - "build" - ] - ] - ], - "C1": [ - "C", - "0.1", - "mainline", - [ - [ - "A2", - "install" - ] - ] - ], - "D1": [ - "D", - "0.11", - "mainline", - [ - [ - "A2", - "install" - ], - [ - "D2", - "install" - ] - ] - ], - "C2": [ - "C", - "0.1", - "mainline", - [ - [ - "C", - "Subpack" - ] - ] - ], - "D2": [ - "D", - "0.11", - "mainline", - [ - [ - "D", - "Subpack" - ] - ] - ] - }, - "source_dicts": { - "A": [ - "mainline", - "0.0.23b" - ], - "B": [ - "mainline", - "0.0.2" - ], - "C": [ - "mainline", - "0.1" - ], - "D": [ - "mainline", - "0.11" - ] - }, - "not_found_components": [] - }, - "msg": "Successful Operation!" 
- } - }, - { - "input": { - "packagename": "A", - "selfbuild": "0", - "withsubpack": "1", - "packtype": "source" - }, - "output": { - "code": "2001", - "data": { - "binary_dicts": { - "A1": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "root", - null - ], - [ - "D1", - "install" - ] - ] - ], - "A2": [ - "A", - "0.0.23b", - "mainline", - [ - [ - "root", - null - ], - [ - "C1", - "install" - ], - [ - "A1", - "install" - ] - ] - ], - "B1": [ - "B", - "0.0.2", - "mainline", - [ - [ - "A", - "build" - ], - [ - "D", - "build" - ] - ] - ], - "B2": [ - "B", - "0.0.2", - "mainline", - [ - [ - "C", - "build" - ] - ] - ], - "C1": [ - "C", - "0.1", - "mainline", - [ - [ - "A2", - "install" - ] - ] - ], - "D1": [ - "D", - "0.11", - "mainline", - [ - [ - "A2", - "install" - ], - [ - "D2", - "install" - ] - ] - ], - "C2": [ - "C", - "0.1", - "mainline", - [ - [ - "C", - "Subpack" - ] - ] - ], - "D2": [ - "D", - "0.11", - "mainline", - [ - [ - "D", - "Subpack" - ] - ] - ] - }, - "source_dicts": { - "A": [ - "mainline", - "0.0.23b" - ], - "B": [ - "mainline", - "0.0.2" - ], - "C": [ - "mainline", - "0.1" - ], - "D": [ - "mainline", - "0.11" - ] - }, - "not_found_components": [] - }, - "msg": "Successful Operation!" 
- } - } -] \ No newline at end of file diff --git a/packageship/test/common_files/database_file_info.yaml b/packageship/test/common_files/database_file_info.yaml deleted file mode 100644 index f9b468a05e27b811a51ff92239135416a543821c..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/database_file_info.yaml +++ /dev/null @@ -1,4 +0,0 @@ -- database_name: mainline - priority: 1 -- database_name: fedora30 - priority: 2 diff --git a/packageship/test/common_files/db_origin/data_1_bin.sqlite b/packageship/test/common_files/db_origin/data_1_bin.sqlite deleted file mode 100644 index 5a7d5a748d93f43596bd4c29aa3f38c94b80506a..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/db_origin/data_1_bin.sqlite and /dev/null differ diff --git a/packageship/test/common_files/db_origin/data_1_src.sqlite b/packageship/test/common_files/db_origin/data_1_src.sqlite deleted file mode 100644 index 2b053b0a8f3166558bfcf63505421984902def50..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/db_origin/data_1_src.sqlite and /dev/null differ diff --git a/packageship/test/common_files/db_origin/data_2_bin.sqlite b/packageship/test/common_files/db_origin/data_2_bin.sqlite deleted file mode 100644 index aa3d11c5a2a741750fa6ed08279514f9c87532de..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/db_origin/data_2_bin.sqlite and /dev/null differ diff --git a/packageship/test/common_files/db_origin/data_2_src.sqlite b/packageship/test/common_files/db_origin/data_2_src.sqlite deleted file mode 100644 index e4bbb155d066e6ee52f5119563f93fc15cfd7a16..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/db_origin/data_2_src.sqlite and /dev/null differ diff --git a/packageship/test/common_files/dbs/fedora30.db b/packageship/test/common_files/dbs/fedora30.db deleted file mode 100644 index 
cb66065433352c32ea7d4d71c9cdbce45582c38a..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/dbs/fedora30.db and /dev/null differ diff --git a/packageship/test/common_files/dbs/lifecycle.db b/packageship/test/common_files/dbs/lifecycle.db deleted file mode 100644 index 457902d611737b640179d876a3a9e5c67b68c3fa..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/dbs/lifecycle.db and /dev/null differ diff --git a/packageship/test/common_files/dbs/mainline.db b/packageship/test/common_files/dbs/mainline.db deleted file mode 100644 index c78effe5542c9170811ee769d564a45a5ddf498f..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/dbs/mainline.db and /dev/null differ diff --git a/packageship/test/common_files/dbs/maintenance.information.db b/packageship/test/common_files/dbs/maintenance.information.db deleted file mode 100644 index d43b5e4e10a2b922a2931664afe5cb6aba22852f..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/dbs/maintenance.information.db and /dev/null differ diff --git a/packageship/test/common_files/operate_dbs/fedora30.db b/packageship/test/common_files/operate_dbs/fedora30.db deleted file mode 100644 index cb66065433352c32ea7d4d71c9cdbce45582c38a..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/operate_dbs/fedora30.db and /dev/null differ diff --git a/packageship/test/common_files/operate_dbs/lifecycle.db b/packageship/test/common_files/operate_dbs/lifecycle.db deleted file mode 100644 index 27f42bd18c02efbd6be91f531faa4825fae77025..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/operate_dbs/lifecycle.db and /dev/null differ diff --git a/packageship/test/common_files/operate_dbs/mainline.db b/packageship/test/common_files/operate_dbs/mainline.db deleted file mode 100644 index 
c78effe5542c9170811ee769d564a45a5ddf498f..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/operate_dbs/mainline.db and /dev/null differ diff --git a/packageship/test/common_files/operate_dbs/maintenance.information.db b/packageship/test/common_files/operate_dbs/maintenance.information.db deleted file mode 100644 index d43b5e4e10a2b922a2931664afe5cb6aba22852f..0000000000000000000000000000000000000000 Binary files a/packageship/test/common_files/operate_dbs/maintenance.information.db and /dev/null differ diff --git a/packageship/test/common_files/package.ini b/packageship/test/common_files/package.ini deleted file mode 100644 index 1bfb3279f1e413fe6c87c9d7c118c8ec66603115..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/package.ini +++ /dev/null @@ -1,31 +0,0 @@ -[SYSTEM] -init_conf_path = -write_port = 8080 -query_port = 8090 -write_ip_addr = 127.0.0.1 -query_ip_addr = 127.0.0.1 -remote_host = https://api.openeuler.org/pkgmanage - -[LOG] -log_level = INFO -log_name = log_info.log -backup_count = 10 -max_bytes = 314572800 - -[UWSGI] -daemonize = /var/log/uwsgi.log -buffer-size = 65536 -http-timeout = 600 -harakiri = 600 - -[TIMEDTASK] -open = True -hour = 3 -minute = 0 - -[LIFECYCLE] -warehouse_remote = https://gitee.com/openeuler/openEuler-Advisor/raw/master/upstream-info/ -pool_workers = 10 -warehouse = src-openeuler -queue_maxsize = 1000 - diff --git a/packageship/test/common_files/test_true_yaml/CUnit.yaml b/packageship/test/common_files/test_true_yaml/CUnit.yaml deleted file mode 100644 index c8422018718cd23e04003ec2ef9e4629d6490f08..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/test_true_yaml/CUnit.yaml +++ /dev/null @@ -1,6 +0,0 @@ -version_control: NA -src_repo: NA -tag_prefix: NA -seperator: NA -maintainer: taotao -maintainlevel: 4 \ No newline at end of file diff --git a/packageship/test/common_files/test_true_yaml/Judy.yaml 
b/packageship/test/common_files/test_true_yaml/Judy.yaml deleted file mode 100644 index 32bd50d837afb3dc73111d2fad904c9966797434..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/test_true_yaml/Judy.yaml +++ /dev/null @@ -1,6 +0,0 @@ -version_control: NA -src_repo: NA -tag_prefix: NA -seperator: NA -maintainer: taotao -maintainlevel: 3 \ No newline at end of file diff --git a/packageship/test/common_files/test_wrong_format_yaml/CUnit.yaml b/packageship/test/common_files/test_wrong_format_yaml/CUnit.yaml deleted file mode 100644 index 55d600f6a357f64818d5f3de7a9daff0d85c6880..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/test_wrong_format_yaml/CUnit.yaml +++ /dev/null @@ -1,6 +0,0 @@ -version_control: NA -src_repo: NA -tag_prefix: NA -seperator: NA -maintainer: taotao - maintainlevel: 4 \ No newline at end of file diff --git a/packageship/test/common_files/test_wrong_format_yaml/Judy.yaml b/packageship/test/common_files/test_wrong_format_yaml/Judy.yaml deleted file mode 100644 index ed6d215c0a3b56ab87e7bf3d93d03bea11a1d910..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/test_wrong_format_yaml/Judy.yaml +++ /dev/null @@ -1,6 +0,0 @@ -version_control: NA -src_repo: NA -tag_prefix: NA -seperator: NA -maintainer: taotao - maintainlevel: 3 \ No newline at end of file diff --git a/packageship/test/common_files/test_wrong_main_yaml/CUnit.yaml b/packageship/test/common_files/test_wrong_main_yaml/CUnit.yaml deleted file mode 100644 index 6b84e8eb63b84db382680cf21ec790e717431bf5..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/test_wrong_main_yaml/CUnit.yaml +++ /dev/null @@ -1,6 +0,0 @@ -version_control: NA -src_repo: NA -tag_prefix: NA -seperator: NA -maintainer: taotao -maintainlevel: 6 \ No newline at end of file diff --git a/packageship/test/common_files/test_wrong_main_yaml/Judy.yaml b/packageship/test/common_files/test_wrong_main_yaml/Judy.yaml deleted 
file mode 100644 index 32bd50d837afb3dc73111d2fad904c9966797434..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/test_wrong_main_yaml/Judy.yaml +++ /dev/null @@ -1,6 +0,0 @@ -version_control: NA -src_repo: NA -tag_prefix: NA -seperator: NA -maintainer: taotao -maintainlevel: 3 \ No newline at end of file diff --git a/packageship/test/common_files/test_wrong_pkgname_yaml/CUnit.yaml b/packageship/test/common_files/test_wrong_pkgname_yaml/CUnit.yaml deleted file mode 100644 index c8422018718cd23e04003ec2ef9e4629d6490f08..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/test_wrong_pkgname_yaml/CUnit.yaml +++ /dev/null @@ -1,6 +0,0 @@ -version_control: NA -src_repo: NA -tag_prefix: NA -seperator: NA -maintainer: taotao -maintainlevel: 4 \ No newline at end of file diff --git a/packageship/test/common_files/test_wrong_pkgname_yaml/tets.yaml b/packageship/test/common_files/test_wrong_pkgname_yaml/tets.yaml deleted file mode 100644 index 32bd50d837afb3dc73111d2fad904c9966797434..0000000000000000000000000000000000000000 --- a/packageship/test/common_files/test_wrong_pkgname_yaml/tets.yaml +++ /dev/null @@ -1,6 +0,0 @@ -version_control: NA -src_repo: NA -tag_prefix: NA -seperator: NA -maintainer: taotao -maintainlevel: 3 \ No newline at end of file diff --git a/packageship/test/init_test.py b/packageship/test/init_test.py deleted file mode 100644 index fba21fdcb068ed695f1304b06c3ef31fadd4fed6..0000000000000000000000000000000000000000 --- a/packageship/test/init_test.py +++ /dev/null @@ -1,29 +0,0 @@ -import unittest -import datetime -from test.base_code.my_test_runner import MyTestRunner - -RUNNER = MyTestRunner() - - -def import_data_tests(): - """Initialize related test cases""" - - from test.test_module.init_system_tests.test_importdata import ImportData - suite = unittest.TestSuite() - suite.addTests(unittest.TestLoader().loadTestsFromTestCase(ImportData)) - - return RUNNER.run(suite) - - -start_time = 
datetime.datetime.now() -result_4_import = import_data_tests() -stop_time = datetime.datetime.now() - -print('\nA Init Test total of %s test cases were run: \nsuccessful:%s\tfailed:%s\terror:%s\n' % ( - int(result_4_import.testsRun), - int(result_4_import.success_case_count), - int(result_4_import.failure_case_count), - int(result_4_import.err_case_count) -)) - -print('Init Test Total Time: %s' % (stop_time - start_time)) diff --git a/packageship/test/read_test.py b/packageship/test/read_test.py deleted file mode 100644 index 709986b820371ceaf442bd70a350feba84bb8377..0000000000000000000000000000000000000000 --- a/packageship/test/read_test.py +++ /dev/null @@ -1,45 +0,0 @@ -import unittest -import datetime -from test.base_code.my_test_runner import MyTestRunner - -RUNNER = MyTestRunner() - - -def read_data_tests(): - """Test cases with read operations on data""" - from test.test_module.dependent_query_tests.test_install_depend import TestInstallDepend - from test.test_module.dependent_query_tests.test_self_depend import TestSelfDepend - from test.test_module.dependent_query_tests.test_be_depend import TestBeDepend - from test.test_module.repodatas_test.test_get_repodatas import TestGetRepodatas - from test.test_module.dependent_query_tests.test_build_depend import TestBuildDepend - from test.test_module.packages_tests.test_packages import TestPackages - from test.test_module.single_package_tests.test_get_singlepack import TestGetSinglePack - from test.test_module.lifecycle.test_maintainer import TestGetMaintainers - from test.test_module.lifecycle.test_downloadexcel import TestDownloadExcelFile - from test.test_module.lifecycle.test_get_issues import TestGetIssue - from test.test_module.dependinfo_tests.test_dependinfo_self_depend import TestDependInfoSelfDepend - from test.test_module.dependinfo_tests.test_dependinfo_be_depend import TestDependInfoBeDepend - - suite = unittest.TestSuite() - - classes = [TestInstallDepend, TestSelfDepend, TestBeDepend, - 
TestGetRepodatas, TestBuildDepend, TestPackages, - TestGetSinglePack, TestGetMaintainers, TestDownloadExcelFile, - TestGetIssue, TestDependInfoBeDepend, TestDependInfoSelfDepend] - for cls in classes: - suite.addTests(unittest.TestLoader().loadTestsFromTestCase(cls)) - return RUNNER.run(suite) - - -start_time = datetime.datetime.now() -result_4_read = read_data_tests() -stop_time = datetime.datetime.now() - -print('\nA Read Test total of %s test cases were run: \nsuccessful:%s\tfailed:%s\terror:%s\n' % ( - int(result_4_read.testsRun), - int(result_4_read.success_case_count), - int(result_4_read.failure_case_count), - int(result_4_read.err_case_count) -)) - -print('Read Test Total Time: %s' % (stop_time - start_time)) diff --git a/packageship/test/test_module/__init__.py b/packageship/test/test_module/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/packageship/test/test_module/dependent_query_tests/__init__.py b/packageship/test/test_module/dependent_query_tests/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/packageship/test/test_module/dependent_query_tests/test_be_depend.py b/packageship/test/test_module/dependent_query_tests/test_be_depend.py deleted file mode 100644 index 8fcf171acfb82032fa68842c7955988b2a4d6b7b..0000000000000000000000000000000000000000 --- a/packageship/test/test_module/dependent_query_tests/test_be_depend.py +++ /dev/null @@ -1,361 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- -""" -be_depend unittest -""" -import unittest -import json -from test.base_code.read_data_base import ReadTestBase -from test.base_code.common_test_code import compare_two_values, get_correct_json_by_filename -from packageship.application.apps.package.function.constants import ResponseCode -from packageship.application.apps.package.function.searchdb 
import db_priority - - -class TestBeDepend(ReadTestBase): - """ - The dependencies of the package are tested - """ - db_name = db_priority()[0] - - def test_lack_parameter(self): - """ - Requests with missing or incomplete parameters - """ - # No arguments passed - resp = self.client.post("/packages/findBeDepend", - data='{}', - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Only the packagename - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "packagename": "CUnit", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Only the withsubpack - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "withsubpack": "0", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - 
self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Only the dbname - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "dbname": f"{self.db_name}", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # withsubpack not passed - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "packagename": "A", - "dbname": f"{self.db_name}" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone( - resp_dict.get("data"), - msg="Error in data 
information return") - - # Don't preach dbname - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "packagename": "CUnit", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Don't preach packagename - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "dbname": f"{self.db_name}", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # All incoming withsubpack=0 - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "packagename": "A", - "dbname": f"{self.db_name}", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in 
status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone( - resp_dict.get("data"), - msg="Error in data information return") - - # All incoming withsubpack=1 - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "packagename": "A", - "dbname": f"{self.db_name}", - "withsubpack": "1" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone( - resp_dict.get("data"), - msg="Error in data information return") - - def test_wrong_parameter(self): - """ - Parameter error - """ - - # packagename Parameter error - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "packagename": "詹姆斯", - "dbname": f"{self.db_name}", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PACK_NAME_NOT_FOUND, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PACK_NAME_NOT_FOUND), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - 
self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # dbname Parameter error - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "packagename": "ATpy", - "dbname": "asdfgjhk", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.DB_NAME_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.DB_NAME_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # withsubpack Parameter error - resp = self.client.post("/packages/findBeDepend", - data=json.dumps({ - "packagename": "CUnit", - "dbname": f"{self.db_name}", - "withsubpack": "3" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - def test_true_params_result(self): - """ - Compare actual results with the expected results - """ - correct_list = get_correct_json_by_filename("be_depend") - - self.assertNotEqual([], correct_list, msg="Error reading JSON file") - - for correct_data in correct_list: - input_value = correct_data["input"] - resp = 
self.client.post("/packages/findBeDepend", - data=json.dumps(input_value), - content_type="application/json") - output_for_input = correct_data["output"] - resp_dict = json.loads(resp.data) - self.assertTrue(compare_two_values(output_for_input, resp_dict), - msg="The answer is not correct") diff --git a/packageship/test/test_module/dependent_query_tests/test_build_depend.py b/packageship/test/test_module/dependent_query_tests/test_build_depend.py deleted file mode 100644 index 5e0df7ff3c2eaa70d9b717a303eed383154e49ed..0000000000000000000000000000000000000000 --- a/packageship/test/test_module/dependent_query_tests/test_build_depend.py +++ /dev/null @@ -1,161 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- -""" - build_depend unittest -""" -import json -import unittest - -from test.base_code.read_data_base import ReadTestBase -from test.base_code.common_test_code import compare_two_values, get_correct_json_by_filename -from packageship.application.apps.package.function.constants import ResponseCode - - -class TestBuildDepend(ReadTestBase): - """ - class for test build_depend - """ - - def test_empty_source_name_dblist(self): - """ - test empty parameters:sourceName,dbList - :return: - """ - resp = self.client.post("/packages/findBuildDepend", - data="{}", - content_type="application/json") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - resp = self.client.post("/packages/findBuildDepend", - data=json.dumps({"sourceName": "A"}), - 
content_type="application/json") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - - def test_wrong_source_name_dblist(self): - """ - test wrong parameters:sourceName,dbList - :return: None - """ - resp = self.client.post("/packages/findBuildDepend", - data=json.dumps({"sourceName": 0}), - content_type="application/json") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - resp = self.client.post("/packages/findBuildDepend", - data=json.dumps({"sourceName": "qitiandasheng"}), - content_type="application/json") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PACK_NAME_NOT_FOUND, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PACK_NAME_NOT_FOUND), - resp_dict.get("msg"), - msg="Error in status 
prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - resp = self.client.post("/packages/findBuildDepend", - data=json.dumps({"sourceName": "CUnit", - "db_list": [12, 3, 4]}), - content_type="application/json") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - resp = self.client.post("/packages/findBuildDepend", - data=json.dumps({"sourceName": "CUnit", - "db_list": ["shifu", "bajie"] - }), content_type="application/json") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.DB_NAME_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.DB_NAME_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - def test_true_params_result(self): - """ - test_true_params_result - Returns: - - """ - correct_list = get_correct_json_by_filename("build_depend") - - self.assertNotEqual([], correct_list, msg="Error reading JSON file") - - for correct_data in correct_list: - input_value = correct_data["input"] - resp = 
self.client.post("/packages/findBuildDepend", - data=json.dumps(input_value), - content_type="application/json") - output_for_input = correct_data["output"] - resp_dict = json.loads(resp.data) - self.assertTrue(compare_two_values(output_for_input, resp_dict), - msg="The answer is not correct") - - -if __name__ == '__main__': - unittest.main() diff --git a/packageship/test/test_module/dependent_query_tests/test_install_depend.py b/packageship/test/test_module/dependent_query_tests/test_install_depend.py deleted file mode 100644 index 9b9280be69fb930a2ec157e7cfd6d6e254011057..0000000000000000000000000000000000000000 --- a/packageship/test/test_module/dependent_query_tests/test_install_depend.py +++ /dev/null @@ -1,164 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- -""" -TestInstallDepend -""" -import unittest -import json - -from test.base_code.common_test_code import get_correct_json_by_filename, compare_two_values -from test.base_code.read_data_base import ReadTestBase -from packageship.application.apps.package.function.constants import ResponseCode - - -class TestInstallDepend(ReadTestBase): - """ - TestInstallDepend - """ - - def test_empty_binaryName_dbList(self): - """ - test_empty_binaryName_dbList - Returns: - - """ - resp = self.client.post("/packages/findInstallDepend", - data="{}", - content_type="application/json") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - resp = self.client.post("/packages/findInstallDepend", - 
data=json.dumps({"binaryName": "A1"}), - content_type="application/json") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - - def test_wrong_binaryName_dbList(self): - """ - test_empty_binaryName_dbList - Returns: - - """ - resp = self.client.post("/packages/findInstallDepend", - data=json.dumps({"binaryName": 0}), - content_type="application/json") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - resp = self.client.post("/packages/findInstallDepend", - data=json.dumps( - {"binaryName": "qitiandasheng"}), - content_type="application/json") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PACK_NAME_NOT_FOUND, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PACK_NAME_NOT_FOUND), - 
resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - resp = self.client.post("/packages/findInstallDepend", - data=json.dumps({"binaryName": "A1", - "db_list": [12, 3, 4]}), - content_type="application/json") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - resp = self.client.post("/packages/findInstallDepend", - data=json.dumps({"binaryName": "A1", - "db_list": ["shifu", "bajie"] - }), content_type="application/json") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.DB_NAME_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.DB_NAME_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone(resp_dict.get("data"), msg="Error in data information return") - - def test_true_params_result(self): - """ - test_empty_binaryName_dbList - Returns: - - """ - correct_list = get_correct_json_by_filename("install_depend") - - self.assertNotEqual([], correct_list, msg="Error reading JSON file") - - for correct_data in correct_list: - input_value = 
correct_data["input"] - resp = self.client.post("/packages/findInstallDepend", - data=json.dumps(input_value), - content_type="application/json") - output_for_input = correct_data["output"] - resp_dict = json.loads(resp.data) - self.assertTrue(compare_two_values(output_for_input, resp_dict), - msg="The answer is not correct") - - -if __name__ == '__main__': - unittest.main() diff --git a/packageship/test/test_module/dependent_query_tests/test_self_depend.py b/packageship/test/test_module/dependent_query_tests/test_self_depend.py deleted file mode 100644 index 4a2fcb5d1f794f88b8ed8ab45b279ed1df2acc43..0000000000000000000000000000000000000000 --- a/packageship/test/test_module/dependent_query_tests/test_self_depend.py +++ /dev/null @@ -1,289 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- -""" -TestSelfDepend -""" -import unittest -import json - -from test.base_code.common_test_code import get_correct_json_by_filename, compare_two_values -from test.base_code.read_data_base import ReadTestBase -from packageship.application.apps.package.function.constants import ResponseCode -from packageship.application.apps.package.function.searchdb import db_priority - - -class TestSelfDepend(ReadTestBase): - """ - TestSelfDepend - """ - - def test_empty_parameter(self): - """ - test_empty_parameter - Returns: - - """ - resp = self.client.post("/packages/findSelfDepend", - data='{}', - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = 
self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "A1", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNotNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": db_priority() - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNotNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": db_priority(), - "selfbuild": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status information return!") - - 
self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNotNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": db_priority(), - "selfbuild": "0", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNotNone(resp_dict.get("data"), msg="Data return error!") - - def test_wrong_parameter(self): - """ - test_wrong_parameter - Returns: - - """ - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "wukong" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PACK_NAME_NOT_FOUND, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PACK_NAME_NOT_FOUND), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": [1, 2, 3, 4] - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - 
resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": ["bajie", "shifu"] - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.DB_NAME_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.DB_NAME_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": db_priority(), - "selfbuild": "nverguo" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": 
db_priority(), - "selfbuild": "0", - "withsubpack": "pansidong", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": db_priority(), - "selfbuild": "0", - "withsubpack": "0", - "packtype": "pansidaxian" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - def test_true_params_result(self): - """ - test_true_params_result - Returns: - - """ - correct_list = get_correct_json_by_filename("self_depend") - - self.assertNotEqual([], correct_list, msg="Error reading JSON file") - - for correct_data in correct_list: - input_value = correct_data["input"] - resp = self.client.post("/packages/findSelfDepend", - data=json.dumps(input_value), - content_type="application/json") - output_for_input = correct_data["output"] - resp_dict = json.loads(resp.data) - 
self.assertTrue(compare_two_values(output_for_input, resp_dict), - msg="The answer is not correct") - - -if __name__ == '__main__': - unittest.main() diff --git a/packageship/test/test_module/dependinfo_tests/__init__.py b/packageship/test/test_module/dependinfo_tests/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/packageship/test/test_module/dependinfo_tests/test_dependinfo_be_depend.py b/packageship/test/test_module/dependinfo_tests/test_dependinfo_be_depend.py deleted file mode 100644 index fc9d12b03137afc4280dd5a9f23f1ddc86829956..0000000000000000000000000000000000000000 --- a/packageship/test/test_module/dependinfo_tests/test_dependinfo_be_depend.py +++ /dev/null @@ -1,365 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- -""" -Less transmission is always parameter transmission -""" -import unittest -import json -from test.base_code.read_data_base import ReadTestBase -from test.base_code.common_test_code import compare_two_values, get_correct_json_by_filename -from packageship.application.apps.package.function.constants import ResponseCode -from packageship.application.apps.package.function.searchdb import db_priority - - -class TestDependInfoBeDepend(ReadTestBase): - """ - The dependencies of the package are tested - """ - db_name = db_priority()[0] - - def test_lack_parameter(self): - """ - Less transmission is always parameter transmission - """ - # No arguments passed - resp = self.client.post("/dependInfo/beDepend", - data='{}', - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error 
in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Only the packagename - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "packagename": "CUnit", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Only the withsubpack - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "withsubpack": "0", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Only the dbname - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "dbname": f"{self.db_name}", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - 
self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Don't preach withsubpack - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "packagename": "A", - "dbname": f"{self.db_name}" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Don't preach dbname - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "packagename": "CUnit", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error 
in data information return") - - # Don't preach packagename - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "dbname": f"{self.db_name}", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # All incoming withsubpack=0 - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "packagename": "A", - "dbname": f"{self.db_name}", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone( - resp_dict.get("data"), - msg="Error in data information return") - - # All incoming withsubpack=1 - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "packagename": "A", - "dbname": f"{self.db_name}", - "withsubpack": "1" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - 
resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone( - resp_dict.get("data"), - msg="Error in data information return") - - def test_wrong_parameter(self): - """ - Parameter error - """ - - # packagename Parameter error - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "packagename": "詹姆斯", - "dbname": f"{self.db_name}", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PACK_NAME_NOT_FOUND, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PACK_NAME_NOT_FOUND), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # dbname Parameter error - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "packagename": "ATpy", - "dbname": "asdfgjhk", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.DB_NAME_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.DB_NAME_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", 
resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # withsubpack Parameter error - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps({ - "packagename": "CUnit", - "dbname": f"{self.db_name}", - "withsubpack": "3" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status code return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - def test_true_params_result(self): - """ - Results contrast - """ - correct_list = get_correct_json_by_filename("dependinfo_be_depend") - - self.assertNotEqual([], correct_list, msg="Error reading JSON file") - - for correct_data in correct_list: - input_value = correct_data["input"] - resp = self.client.post("/dependInfo/beDepend", - data=json.dumps(input_value), - content_type="application/json") - output_for_input = correct_data["output"] - resp_dict = json.loads(resp.data) - self.assertTrue(compare_two_values(output_for_input, resp_dict), - msg="The answer is not correct") - - -if __name__ == '__main__': - unittest.main() diff --git a/packageship/test/test_module/dependinfo_tests/test_dependinfo_self_depend.py b/packageship/test/test_module/dependinfo_tests/test_dependinfo_self_depend.py deleted file mode 100644 index 650a81d90996af10a70e9b23bb589cfdb94d850c..0000000000000000000000000000000000000000 --- a/packageship/test/test_module/dependinfo_tests/test_dependinfo_self_depend.py +++ /dev/null @@ -1,289 +0,0 @@ 
-#!/usr/bin/python3 -# -*- coding:utf-8 -*- -""" -TestSelfDepend -""" -import unittest -import json - -from test.base_code.common_test_code import get_correct_json_by_filename, compare_two_values -from test.base_code.read_data_base import ReadTestBase -from packageship.application.apps.package.function.constants import ResponseCode -from packageship.application.apps.package.function.searchdb import db_priority - - -class TestDependInfoSelfDepend(ReadTestBase): - """ - TestSelfDepend - """ - - def test_empty_parameter(self): - """ - test_empty_parameter - Returns: - - """ - resp = self.client.post("/dependInfo/selfDepend", - data='{}', - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - "packagename": "A1", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNotNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - 
"packagename": "A1", - "db_list": db_priority() - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNotNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": db_priority(), - "selfbuild": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNotNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": db_priority(), - "selfbuild": "0", - "withsubpack": "0" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status information return!") - - 
self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNotNone(resp_dict.get("data"), msg="Data return error!") - - def test_wrong_parameter(self): - """ - test_wrong_parameter - Returns: - - """ - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - "packagename": "wukong" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PACK_NAME_NOT_FOUND, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PACK_NAME_NOT_FOUND), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": [1, 2, 3, 4] - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": ["bajie", "shifu"] - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.DB_NAME_ERROR, - resp_dict.get("code"), - msg="Error 
in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.DB_NAME_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": db_priority(), - "selfbuild": "nverguo" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": db_priority(), - "selfbuild": "0", - "withsubpack": "pansidong", - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps({ - "packagename": "A1", - "db_list": 
db_priority(), - "selfbuild": "0", - "withsubpack": "0", - "packtype": "pansidaxian" - }), - content_type="application/json") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return!") - self.assertIn("msg", resp_dict, msg="Wrong return format!") - self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status information return!") - - self.assertIn("data", resp_dict, msg="Wrong return format!") - self.assertIsNone(resp_dict.get("data"), msg="Data return error!") - - def test_true_params_result(self): - """ - test_true_params_result - Returns: - - """ - correct_list = get_correct_json_by_filename("dependinfo_self_depend") - - self.assertNotEqual([], correct_list, msg="Error reading JSON file") - - for correct_data in correct_list: - input_value = correct_data["input"] - resp = self.client.post("/dependInfo/selfDepend", - data=json.dumps(input_value), - content_type="application/json") - output_for_input = correct_data["output"] - resp_dict = json.loads(resp.data) - self.assertTrue(compare_two_values(output_for_input, resp_dict), - msg="The answer is not correct") - - -if __name__ == '__main__': - unittest.main() diff --git a/packageship/test/test_module/init_system_tests/__init__.py b/packageship/test/test_module/init_system_tests/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git a/packageship/test/test_module/init_system_tests/test_importdata.py b/packageship/test/test_module/init_system_tests/test_importdata.py deleted file mode 100644 index ea0731bb83dba8a32ce526c8e8cee2653bc93534..0000000000000000000000000000000000000000 --- a/packageship/test/test_module/init_system_tests/test_importdata.py +++ /dev/null @@ -1,245 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- 
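The init-system tests below each back up a config file with `shutil.copyfile`, mutate it, and restore it in a `finally` block. That copyfile/remove/rename dance can be isolated in a small context manager — a sketch with a hypothetical name, not part of the deleted module:

```python
import os
import shutil
import tempfile
from contextlib import contextmanager


@contextmanager
def backed_up(path):
    """Back up *path* to *path*.bak on entry and restore it on exit,
    mirroring the copyfile/remove/rename pattern repeated in the tests."""
    shutil.copyfile(path, path + '.bak')
    try:
        yield path
    finally:
        os.remove(path)
        os.rename(path + '.bak', path)


# Usage: mutate a temp file, then confirm the original content returns.
with tempfile.TemporaryDirectory() as tmp:
    conf = os.path.join(tmp, 'conf.yaml')
    with open(conf, 'w', encoding='utf-8') as f:
        f.write('- dbname: openEuler-20.09\n')
    with backed_up(conf):
        with open(conf, 'w', encoding='utf-8') as f:
            f.write('')            # simulate the "empty yaml" case
    with open(conf, encoding='utf-8') as f:
        print(f.read().startswith('- dbname'))  # -> True
```

A `with backed_up(_config_path):` block would replace every try/finally restore sequence in these tests.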
-""" -test import_databases -""" -import os -import shutil -import unittest -import warnings -from configparser import ConfigParser -import yaml - -from packageship import system_config -from packageship.libs.exception import Error -from packageship.libs.exception import ConfigurationException - -try: - - system_config.SYS_CONFIG_PATH = os.path.join( - os.path.dirname( - system_config.BASE_PATH), - 'test', - 'common_files', - 'package.ini') - - system_config.DATABASE_FOLDER_PATH = os.path.join(os.path.dirname( - system_config.BASE_PATH), 'test', 'init_system_files', 'dbs') - - from test.base_code.init_config_path import init_config - -except Error: - raise Error - -from sqlalchemy.exc import SQLAlchemyError -from packageship.application.initsystem.data_import import InitDataBase -from packageship.libs.exception import ContentNoneException -from packageship.libs.exception import DatabaseRepeatException -from packageship.libs.configutils.readconfig import ReadConfig -from packageship.application.models.package import DatabaseInfo -from packageship.libs.dbutils import DBHelper - - -class ImportData(unittest.TestCase): - """ - test importdatabases - """ - - def setUp(self): - - warnings.filterwarnings("ignore") - - def test_empty_param(self): - """If init is not obtained_ conf_ Path parameter""" - try: - InitDataBase(config_file_path=None).init_data() - except ContentNoneException as error: - self.assertEqual( - error.__class__, - ContentNoneException, - msg="No init in package_ conf_ Path parameter, wrong exception type thrown") - - # Yaml file exists but the content is empty - - try: - _config_path = ReadConfig( - system_config.SYS_CONFIG_PATH).get_system('init_conf_path') - shutil.copyfile(_config_path, _config_path + '.bak') - - with open(_config_path, 'w', encoding='utf-8') as w_f: - w_f.write("") - - InitDataBase(config_file_path=_config_path).init_data() - except ConfigurationException as error: - self.assertEqual( - error.__class__, - ConfigurationException, - 
msg="Yaml file exists, but the content is empty. The exception type is wrong") - finally: - # Restore files - os.remove(_config_path) - os.rename(_config_path + '.bak', _config_path) - - # Yaml file exists but DB exists_ The same with name - try: - _config_path = ReadConfig( - system_config.SYS_CONFIG_PATH).get_system('init_conf_path') - shutil.copyfile(_config_path, _config_path + '.bak') - with open(_config_path, 'r', encoding='utf-8') as file: - origin_yaml = yaml.load(file.read(), Loader=yaml.FullLoader) - for obj in origin_yaml: - obj["dbname"] = "openEuler" - with open(_config_path, 'w', encoding='utf-8') as w_f: - yaml.dump(origin_yaml, w_f) - - InitDataBase(config_file_path=_config_path).init_data() - except DatabaseRepeatException as error: - - self.assertEqual( - error.__class__, - DatabaseRepeatException, - msg="Yaml file exists but DB_ Name duplicate exception type is wrong") - finally: - # Restore files - os.remove(_config_path) - os.rename(_config_path + '.bak', _config_path) - - def test_wrong_param(self): - """If the corresponding current init_ conf_ The directory - specified by path is incorrect""" - try: - # Back up source files - shutil.copyfile( - system_config.SYS_CONFIG_PATH, - system_config.SYS_CONFIG_PATH + ".bak") - # Modify dbtype to "test"_ dbtype" - config = ConfigParser() - config.read(system_config.SYS_CONFIG_PATH) - config.set("SYSTEM", "init_conf_path", "D:\\Users\\conf.yaml") - config.write(open(system_config.SYS_CONFIG_PATH, "w")) - - _config_path = ReadConfig( - system_config.SYS_CONFIG_PATH).get_system('init_conf_path') - InitDataBase(config_file_path=_config_path).init_data() - except FileNotFoundError as error: - self.assertEqual( - error.__class__, - FileNotFoundError, - msg="init_ conf_ Path specified directory is empty exception type is wrong") - finally: - # To restore a file, delete the file first and then rename it back - os.remove(system_config.SYS_CONFIG_PATH) - os.rename( - system_config.SYS_CONFIG_PATH + ".bak", - 
system_config.SYS_CONFIG_PATH) - - def test_dbname(self): - """test dbname""" - try: - _config_path = ReadConfig( - system_config.SYS_CONFIG_PATH).get_system('init_conf_path') - shutil.copyfile(_config_path, _config_path + '.bak') - with open(_config_path, 'r', encoding='utf-8') as file: - origin_yaml = yaml.load(file.read(), Loader=yaml.FullLoader) - for obj in origin_yaml: - obj["dbname"] = "" - with open(_config_path, 'w', encoding='utf-8') as w_f: - yaml.dump(origin_yaml, w_f) - - InitDataBase(config_file_path=_config_path).init_data() - except DatabaseRepeatException as error: - - self.assertEqual( - error.__class__, - DatabaseRepeatException, - msg="Yaml file exists but DB_ Name duplicate exception type is wrong") - finally: - # Restore files - os.remove(_config_path) - os.rename(_config_path + '.bak', _config_path) - - def test_src_db_file(self): - """test src db file""" - try: - _config_path = ReadConfig( - system_config.SYS_CONFIG_PATH).get_system('init_conf_path') - shutil.copyfile(_config_path, _config_path + '.bak') - with open(_config_path, 'r', encoding='utf-8') as file: - origin_yaml = yaml.load(file.read(), Loader=yaml.FullLoader) - for obj in origin_yaml: - obj["src_db_file"] = "" - with open(_config_path, 'w', encoding='utf-8') as w_f: - yaml.dump(origin_yaml, w_f) - - InitDataBase(config_file_path=_config_path).init_data() - except TypeError as error: - - self.assertEqual( - error.__class__, - TypeError, - msg="Yaml file exists but DB_ Name duplicate exception type is wrong") - finally: - # Restore files - os.remove(_config_path) - os.rename(_config_path + '.bak', _config_path) - - def test_priority(self): - """test priority""" - try: - _config_path = ReadConfig( - system_config.SYS_CONFIG_PATH).get_system('init_conf_path') - shutil.copyfile(_config_path, _config_path + '.bak') - with open(_config_path, 'r', encoding='utf-8') as file: - origin_yaml = yaml.load(file.read(), Loader=yaml.FullLoader) - for obj in origin_yaml: - obj["priority"] = "-1" 
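Each of these negative tests parses conf.yaml, blanks or corrupts one field on every database entry, and writes the file back. Independent of PyYAML, the mutation step reduces to a one-line loop; a sketch with a hypothetical helper name and sample data:

```python
def corrupt_field(entries, field, value=""):
    """Set *field* to *value* on every database entry, as the tests do
    for dbname, src_db_file and priority before re-dumping the yaml."""
    for entry in entries:
        entry[field] = value
    return entries


# Sample entry shaped like one conf.yaml record from these tests.
conf = [{"dbname": "openEuler-20.09",
         "src_db_file": "/etc/pkgship/src.sqlite",
         "bin_db_file": "/etc/pkgship/bin.sqlite",
         "priority": 1}]
corrupt_field(conf, "priority", "-1")
print(conf[0]["priority"])  # -> -1
```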
- with open(_config_path, 'w', encoding='utf-8') as w_f: - yaml.dump(origin_yaml, w_f) - InitDataBase(config_file_path=_config_path).init_data() - - with DBHelper(db_name='lifecycle') as data_name: - name_list = data_name.session.query( - DatabaseInfo.name).order_by(DatabaseInfo.priority).all() - init_database_date = [name[0] for name in name_list] - self.assertEqual( - init_database_date, - [], - msg=" Priority must be a positive integer between 0 and 100 ") - except FileNotFoundError: - return - finally: - # Restore files - os.remove(_config_path) - os.rename(_config_path + '.bak', _config_path) - - def test_true_init_data(self): - """ - Initialization of system data - """ - # Normal configuration - try: - _config_path = ReadConfig( - system_config.SYS_CONFIG_PATH).get_system('init_conf_path') - InitDataBase(config_file_path=_config_path).init_data() - with DBHelper(db_name='lifecycle') as data_name: - name_list = data_name.session.query( - DatabaseInfo.name, DatabaseInfo.priority).order_by(DatabaseInfo.priority).all() - data_list = [dict(zip(ven.keys(), ven)) for ven in name_list] - _config_path = ReadConfig( - system_config.SYS_CONFIG_PATH).get_system('init_conf_path') - with open(_config_path, 'r', encoding='utf-8') as file: - origin_yaml = yaml.load(file.read(), Loader=yaml.FullLoader) - origin_list = list() - for item in origin_yaml: - data_dict = dict() - data_dict['name'] = item.get("dbname") - data_dict['priority'] = item.get("priority") - origin_list.append(data_dict) - - self.assertEqual( - data_list, - origin_list, - msg="The name and priority of the data generated by the initialization are correct") - - except (Error, SQLAlchemyError, FileNotFoundError, yaml.YAMLError): - return None diff --git a/packageship/test/test_module/lifecycle/__init__.py b/packageship/test/test_module/lifecycle/__init__.py deleted file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000 diff --git 
a/packageship/test/test_module/lifecycle/test_downloadexcel.py b/packageship/test/test_module/lifecycle/test_downloadexcel.py deleted file mode 100644 index e84994ab9a132a456f1114b1deca0b612cb27efa..0000000000000000000000000000000000000000 --- a/packageship/test/test_module/lifecycle/test_downloadexcel.py +++ /dev/null @@ -1,110 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- -""" -test_get_single_packages -""" -from io import BytesIO -import os -import unittest -import pandas as pd -from test.base_code.read_data_base import ReadTestBase -from packageship import system_config -from packageship.application.apps.package.function.constants import ResponseCode - - -class TestDownloadExcelFile(ReadTestBase): - """ - Download test of excel file - """ - - def test_file_type_error(self): - """ - The file type to be downloaded for the test is incorrect - """ - - response = self.client_request("/lifeCycle/download/xxx") - self.response_json_format(response) - self.assertEqual(ResponseCode.PARAM_ERROR, - response.get("code"), - msg="Error in status code return") - - self.assertIsNone( - response.get("data"), - msg="Error in data information return") - - def _save_download_file(self, response_data, path): - """ - Save the downloaded file - """ - with open(path, 'wb') as f: - f.write(response_data) - - def test_issue_file(self): - """ - Issue file download test - """ - response = self.client.get("/lifeCycle/download/issues") - data_frame = pd.read_excel( - BytesIO(response.data), sheet_name='Summary',engine='xlrd') - datas = data_frame.values.tolist() - self.assertEqual( - 14, len(datas), msg="An error occurred in the downloaded data") - data_dict = dict(zip(data_frame.columns.tolist(), datas[0])) - data = { - 'issue_id': 'I1OQW8', - 'issue_url': 'https://gitee.com/openeuler/openEuler-Advisor/issues/I1PGWQ', - 'issue_content': 'def get_yaml(self, pkg):', - 'issue_title': '【CI加固】对识别修改对周边组件和升级影响', - 'issue_status': 'open', - 'pkg_name': 'dnf', - 'issue_type': 'defect', - 
'related_release': 'hahaxx' - } - self.assertEqual(data, data_dict, - msg='An error occurred in the downloaded data') - - def test_package_file(self): - """ - download packages file - """ - response = self.client.get( - "/lifeCycle/download/packages?table_name=mainline") - - data_frame = pd.read_excel( - BytesIO(response.data), sheet_name='Summary',engine='xlrd') - datas = data_frame.values.tolist() - self.assertEqual( - 5, len(datas), msg="An error occurred in the downloaded data") - data_dict = dict(zip(data_frame.columns.tolist(), datas[0])) - data = { - 'name': 'CUnit', - 'url': 'http://cunit.sourceforge.net/', - 'rpm_license': 'LGPLv2+', - 'version': '2.1.3', - 'release': '21.oe1', - 'release_time': 1.0, - 'used_time': 2.0, - 'latest_version': 3.0, - 'latest_version_time': 4.0, - 'feature': 5, - 'cve': 0, - 'defect': 0, - 'maintainer': 'userA', - 'maintainlevel': 6.0, - } - self.assertEqual(data, data_dict, - msg='An error occurred in the downloaded data') - - def test_package_file_no_table_name(self): - """ - download packages file - """ - response = self.client_request( - "/lifeCycle/download/packages") - self.response_json_format(response) - self.assertEqual(ResponseCode.SERVICE_ERROR, response.get( - 'code'), msg='Error in status code return') - - -if __name__ == '__main__': - unittest.main() diff --git a/packageship/test/test_module/lifecycle/test_get_issues.py b/packageship/test/test_module/lifecycle/test_get_issues.py deleted file mode 100644 index 756a185c090156d59748dd29467c08c1780de573..0000000000000000000000000000000000000000 --- a/packageship/test/test_module/lifecycle/test_get_issues.py +++ /dev/null @@ -1,273 +0,0 @@ -#!/usr/bin/python3 -# -*- coding:utf-8 -*- -""" -test get issues -""" -from test.base_code.common_test_code import get_correct_json_by_filename -from test.base_code.read_data_base import ReadTestBase -import unittest -import json - -from packageship.application.apps.package.function.constants import ResponseCode - - -class 
TestGetIssue(ReadTestBase): - """ - Issues test case - """ - - def test_lack_parameter(self): - """ - Less transmission is always parameter transmission - """ - # No arguments passed - resp = self.client.get("/lifeCycle/issuetrace?page_num=&page_size=") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Only the page_num - resp = self.client.get("/lifeCycle/issuetrace?page_num=1&page_size=") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Only the page_size - resp = self.client.get("/lifeCycle/issuetrace?page_num=&page_size=5") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - 
resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Without page_num - resp = self.client.get("/lifeCycle/issuetrace?page_num=&page_size=5") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # Without page_size - resp = self.client.get("/lifeCycle/issuetrace?page_num=1&page_size=") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - def test_true_params_result(self): - """ - Results contrast - """ - # All incoming - resp = self.client.get("/lifeCycle/issuetrace?page_num=1&page_size=5") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", 
resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - - def test_wrong_parameter(self): - """ - Parameter error - """ - # pkg_name Parameter error - resp = self.client.get( - "/lifeCycle/issuetrace?pkg_name=hhh") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - - # page_num Parameter error - resp = self.client.get( - "/lifeCycle/issuetrace?pkg_name=dnf&page_num=x") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNone( - resp_dict.get("data"), - msg="Error in data information return") - - # page_size Parameter error - resp = self.client.get( - "/lifeCycle/issuetrace?pkg_name=dnf&page_num=1&page_size=x") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.PARAM_ERROR, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, 
msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.PARAM_ERROR), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - - def test_issue_type(self): - """ - test issue type - """ - resp = self.client.get( - "/lifeCycle/issuetype") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - # Compare data - correct_list = get_correct_json_by_filename("issues") - self.assertNotEqual([], correct_list, msg="Error reading JSON file") - correct_data = correct_list[0]["issue_type"] - self.assertTrue(set(correct_data).issubset(set(resp_dict.get("data"))), - msg="The answer is not correct") - - def test_issue_status(self): - """ - test issue status - """ - resp = self.client.get( - "/lifeCycle/issuestatus") - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, - resp_dict.get("code"), - msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get( - ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status prompt return") - - self.assertIn("data", resp_dict, msg="Error in data format return") - - # Compare data - correct_list = get_correct_json_by_filename("issues") - self.assertNotEqual([], correct_list, msg="Error reading JSON file") - correct_data = correct_list[1]["issue_status"] - 
self.assertTrue(set(resp_dict.get("data")).issubset(set(correct_data)),
-                        msg="The answer is not correct")
-
-
-if __name__ == '__main__':
-    unittest.main()
diff --git a/packageship/test/test_module/lifecycle/test_issue_catch.py b/packageship/test/test_module/lifecycle/test_issue_catch.py
deleted file mode 100644
index 2f5eb3afcfa76583306dfdf1930ab7c0dc1b9304..0000000000000000000000000000000000000000
--- a/packageship/test/test_module/lifecycle/test_issue_catch.py
+++ /dev/null
@@ -1,66 +0,0 @@
-#!/usr/bin/python3
-# -*- coding:utf-8 -*-
-"""
-TestGetIssue
-
-"""
-from test.base_code.common_test_code import get_correct_json_by_filename
-from test.base_code.operate_data_base import OperateTestBase
-import unittest
-import json
-
-from packageship.application.apps.package.function.constants import ResponseCode
-
-
-class TestIssueCatch(OperateTestBase):
-    """
-    Test Get Issue info
-    """
-
-    def test_wrong_params(self):
-        """
-        test issue catch
-        """
-        # No arguments passed
-        resp = self.client.post("/lifeCycle/issuecatch",
-                                json='')
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Wrong return format!")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return!")
-        self.assertIn("msg", resp_dict, msg="Wrong return format!")
-        self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.PARAM_ERROR),
-                         resp_dict.get("msg"),
-                         msg="Error in status information return!")
-
-        self.assertIn("data", resp_dict, msg="Wrong return format!")
-        self.assertIsNone(resp_dict.get("data"), msg="Data return error!")
-
-    def test_correct_params(self):
-        # Correct params
-        correct_list = get_correct_json_by_filename("issues")
-
-        self.assertNotEqual([], correct_list, msg="Error reading JSON file")
-
-        input_value = correct_list[2]["input"]
-        resp = self.client.post("/lifeCycle/issuecatch",
-                                json=input_value)
-        resp_dict = json.loads(resp.data)
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.SUCCESS),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-
-
-if __name__ == '__main__':
-    unittest.main()
diff --git a/packageship/test/test_module/lifecycle/test_maintainer.py b/packageship/test/test_module/lifecycle/test_maintainer.py
deleted file mode 100644
index 7162024c1b72828f5f2a32cb89b7f8c932541774..0000000000000000000000000000000000000000
--- a/packageship/test/test_module/lifecycle/test_maintainer.py
+++ /dev/null
@@ -1,31 +0,0 @@
-#!/usr/bin/python3
-# -*- coding:utf-8 -*-
-"""
-test_get_single_packages
-"""
-from test.base_code.read_data_base import ReadTestBase
-import unittest
-from packageship.application.apps.package.function.constants import ResponseCode
-
-
-class TestGetMaintainers(ReadTestBase):
-    """
-    Maintainer list acquisition test
-    """
-
-    def test_maintainer(self):
-        """
-        Test the actual data sheet
-        """
-        response = self.client_request(
-            "/lifeCycle/maintainer")
-        self.response_json_format(response)
-        self.assertEqual(ResponseCode.SUCCESS,
-                         response.get("code"),
-                         msg="Error in status code return")
-        self.assertEqual(['userA', 'userB'], response.get(
-            'data'), msg="The data content is incorrect")
-
-
-if __name__ == '__main__':
-    unittest.main()
diff --git a/packageship/test/test_module/packages_tests/__init__.py b/packageship/test/test_module/packages_tests/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/packageship/test/test_module/packages_tests/test_packages.py b/packageship/test/test_module/packages_tests/test_packages.py
deleted file mode 100644
index 1568a6190d6b97f57ec781e08a357d9d69f6ce8b..0000000000000000000000000000000000000000
--- a/packageship/test/test_module/packages_tests/test_packages.py
+++ /dev/null
@@ -1,403 +0,0 @@
-#!/usr/bin/python3
-# -*- coding:utf-8 -*-
-"""
-packges test
-"""
-from test.base_code.common_test_code import get_correct_json_by_filename, compare_two_values
-from test.base_code.read_data_base import ReadTestBase
-import unittest
-import json
-
-from packageship.application.apps.package.function.constants import ResponseCode
-
-
-class TestPackages(ReadTestBase):
-    """
-    All package test cases
-    """
-
-    def test_miss_required_parameter(self):
-        """
-        Missing required parameters
-        """
-        # test miss all table_name page_num page_size
-        resp = self.client.get("/packages?table_name=&page_num=&page_size=")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # test miss table_name
-
-        resp = self.client.get("/packages?table_name=&page_num=1&page_size=1")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # test miss page_num
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=&page_size=1")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # test miss page_size
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=1&page_size=")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_wrong_required_parameter(self):
-        """
-        wrong required parameters
-        """
-        # test wrong page_num
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=-1&page_size=1")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        # test wrong page_num
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=a&page_size=1")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # test wrong page_num
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=1.1&page_size=1")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # test wrong page_num
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=65536&page_size=1")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # test wrong page_size
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=1&page_size=-1")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # test wrong page_size
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=1&page_size=65536")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # test wrong page_size
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=1&page_size=a")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # test wrong page_size
-        resp = self.client.get(
-            "/packages?table_name=mainline&page_num=1&page_size=1.1")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_true_required_parameter(self):
-        """
-        test_true_required_parameter
-        """
-        resp = self.client.get(
-            "/packages?table_name=fedora30&page_num=1&page_size=1")
-        resp_dict = json.loads(resp.data)
-
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.SUCCESS),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNotNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        correct_list = get_correct_json_by_filename("packages")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS),
-                         resp_dict.get("msg"),
-                         msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertTrue(
-            compare_two_values(
-                resp_dict.get("data"),
-                correct_list),
-            msg="Error in data information return")
-
-    def test_wrong_table_name(self):
-        """
-        test_wrong_table_name
-        """
-        # test wrong table name
-        resp = self.client.get(
-            "/packages?table_name=test&page_num=1&page_size=1")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.TABLE_NAME_NOT_EXIST,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.TABLE_NAME_NOT_EXIST),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_wrong_optional_paramters(self):
-        """
-        test wrong Optional parameters
-        """
-
-        # test wrong maintainlevel
-        resp = self.client.get(
-            "/packages?table_name=mainline&query_pkg_name=&maintainner=&maintainlevel=5&page_num=1&page_size=4")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status code return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_like_srcname_paramters(self):
-        resp = self.client.get(
-            "/packages?table_name=fedora30&page_num=1&page_size=4&query_pkg_name=A")
-        resp_dict = json.loads(resp.data)
-        correct_list = get_correct_json_by_filename(
-            "packages_like_src")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS),
-                         resp_dict.get("msg"),
-                         msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertTrue(
-            compare_two_values(
-                resp_dict.get("data"),
-                correct_list),
-            msg="Error in data information return")
diff --git a/packageship/test/test_module/repodatas_test/__init__.py b/packageship/test/test_module/repodatas_test/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/packageship/test/test_module/repodatas_test/test_delete_repodatas.py b/packageship/test/test_module/repodatas_test/test_delete_repodatas.py
deleted file mode 100644
index 722fd0c192f71ff4b7dae314a94ddee53b45f560..0000000000000000000000000000000000000000
--- a/packageship/test/test_module/repodatas_test/test_delete_repodatas.py
+++ /dev/null
@@ -1,100 +0,0 @@
-#!/usr/bin/python3
-# -*- coding:utf-8 -*-
-"""
-test delete repodatas
-"""
-import os
-import shutil
-
-from sqlalchemy.exc import SQLAlchemyError
-
-from test.base_code.operate_data_base import OperateTestBase
-import json
-from packageship import system_config
-from packageship.libs.exception import Error
-from packageship.application.apps.package.function.constants import ResponseCode
-
-
-class TestDeleteRepodatas(OperateTestBase):
-    """
-    test delete repodata
-    """
-
-    def test_wrong_dbname(self):
-        """Test simulation scenario, dbname is not transmitted"""
-
-        # Scenario 1: the value passed by dbname is empty
-        resp = self.client.delete("/repodatas?dbName=")
-        resp_dict = json.loads(resp.data)
-
-        # assert
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        resp = self.client.delete("/repodatas?dbName=rr")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.DB_NAME_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.DB_NAME_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_true_dbname(self):
-        """
-        Returns:
-        """
-        try:
-            shutil.copytree(
-                system_config.DATABASE_FOLDER_PATH,
-                system_config.DATABASE_FOLDER_PATH + '.bak')
-            resp = self.client.delete("/repodatas?dbName=fedora30")
-            resp_dict = json.loads(resp.data)
-            self.assertIn("code", resp_dict, msg="Error in data format return")
-            self.assertEqual(ResponseCode.SUCCESS,
-                             resp_dict.get("code"),
-                             msg="Error in status code return")
-
-            self.assertIn("msg", resp_dict, msg="Error in data format return")
-            self.assertEqual(
-                ResponseCode.CODE_MSG_MAP.get(
-                    ResponseCode.SUCCESS),
-                resp_dict.get("msg"),
-                msg="Error in status prompt return")
-
-            self.assertIn("data", resp_dict, msg="Error in data format return")
-            self.assertIsNone(
-                resp_dict.get("data"),
-                msg="Error in data information return")
-        except (SQLAlchemyError, FileExistsError, Error):
-            return None
-        finally:
-            shutil.rmtree(system_config.DATABASE_FOLDER_PATH)
-            os.rename(
-                system_config.DATABASE_FOLDER_PATH + '.bak',
-                system_config.DATABASE_FOLDER_PATH)
diff --git a/packageship/test/test_module/repodatas_test/test_get_repodatas.py b/packageship/test/test_module/repodatas_test/test_get_repodatas.py
deleted file mode 100644
index 82dc3adb0f34b58ab9b8e01493c745db1c6fccc5..0000000000000000000000000000000000000000
--- a/packageship/test/test_module/repodatas_test/test_get_repodatas.py
+++ /dev/null
@@ -1,66 +0,0 @@
-#!/usr/bin/python3
-# -*- coding:utf-8 -*-
-"""
-test get repodatas
-"""
-from test.base_code.common_test_code import get_correct_json_by_filename
-from test.base_code.common_test_code import compare_two_values
-from test.base_code.read_data_base import ReadTestBase
-import unittest
-import json
-
-from packageship.application.apps.package.function.constants import ResponseCode
-
-
-class TestGetRepodatas(ReadTestBase):
-    """
-    test get repodatas
-    """
-
-    def test_dbname(self):
-        """no dbName"""
-        correct_list = get_correct_json_by_filename("get_repodatas")
-        self.assertNotEqual([], correct_list, msg="Error reading JSON file")
-        resp = self.client.get("/repodatas")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS),
-                         resp_dict.get("msg"),
-                         msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertTrue(
-            compare_two_values(
-                resp_dict.get("data"),
-                correct_list),
-            msg="Error in data information return")
-
-        resp = self.client.get("/repodatas?ddd")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS),
-                         resp_dict.get("msg"),
-                         msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertTrue(
-            compare_two_values(
-                resp_dict.get("data"),
-                correct_list),
-            msg="Error in data information return")
-
-
-if __name__ == '__main__':
-    unittest.main()
diff --git a/packageship/test/test_module/single_package_tests/__init__.py b/packageship/test/test_module/single_package_tests/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/packageship/test/test_module/single_package_tests/test_get_singlepack.py b/packageship/test/test_module/single_package_tests/test_get_singlepack.py
deleted file mode 100644
index 412d112b7ecdd6b47f527f5b6f09f1005f53a3f8..0000000000000000000000000000000000000000
--- a/packageship/test/test_module/single_package_tests/test_get_singlepack.py
+++ /dev/null
@@ -1,142 +0,0 @@
-#!/usr/bin/python3
-# -*- coding:utf-8 -*-
-"""
-test_get_single_packages
-"""
-from test.base_code.common_test_code import get_correct_json_by_filename
-from test.base_code.common_test_code import compare_two_values
-from test.base_code.read_data_base import ReadTestBase
-import unittest
-import json
-
-from packageship.application.apps.package.function.constants import ResponseCode
-from packageship.application.apps.package.function.searchdb import db_priority
-
-
-class TestGetSinglePack(ReadTestBase):
-    """
-    Single package test case
-    """
-
-    def test_missing_required_parameters(self):
-        """
-        Missing required parameters
-        """
-        # Missing required parameters pkg_name
-        resp = self.client.get(
-            f"packages/packageInfo?pkg_name=&table_name=mainline")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # Missing required parameters table_name
-        resp = self.client.get(f"packages/packageInfo?pkg_name=A&table_name=")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_wrong_parameters(self):
-        """
-        test wrong parramters
-        """
-
-        # Missing required parameters table_name
-        resp = self.client.get(
-            f"packages/packageInfo?pkg_name=A&table_name=test")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.TABLE_NAME_NOT_EXIST,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.TABLE_NAME_NOT_EXIST),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # Missing required parameters pkg_name
-        resp = self.client.get(
-            f"packages/packageInfo?pkg_name=test&table_name=fedora30")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PACK_NAME_NOT_FOUND,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PACK_NAME_NOT_FOUND),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_true_parameters(self):
-        """
-        test true parameters
-        """
-        resp = self.client.get(
-            "/packages/packageInfo?pkg_name=A&table_name=fedora30")
-        resp_dict = json.loads(resp.data)
-
-        correct_list = get_correct_json_by_filename(
-            "get_single_package")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS),
-                         resp_dict.get("msg"),
-                         msg="Error in status prompt return")
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertTrue(
-            compare_two_values(
-                resp_dict.get("data"),
-                correct_list),
-            msg="Error in data information return")
diff --git a/packageship/test/test_module/single_package_tests/test_update_singlepack.py b/packageship/test/test_module/single_package_tests/test_update_singlepack.py
deleted file mode 100644
index b6c8ffb3c1bd51b8170b1453f93c926e0430994e..0000000000000000000000000000000000000000
--- a/packageship/test/test_module/single_package_tests/test_update_singlepack.py
+++ /dev/null
@@ -1,434 +0,0 @@
-#!/usr/bin/python3
-"""TestUpdatePackage"""
-# -*- coding:utf-8 -*-
-import os
-from test.base_code.operate_data_base import OperateTestBase
-from packageship import system_config
-
-import json
-
-from packageship.application.apps.package.function.constants import ResponseCode
-
-
-class TestBatchUpdatePackage(OperateTestBase):
-    """TestUpdatePackage"""
-
-    def test_missing_required_parameters(self):
-        """
-        Parameter error
-        """
-        # all miss required parameters
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({"batch": ""}),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-        resp_dict.get("data")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # all miss wrong parameters
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({"batch": "1"}),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-        resp_dict.get("data")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SPECIFIED_FILE_NOT_EXIST,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.SPECIFIED_FILE_NOT_EXIST),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_read_yaml_update(self):
-        """
-
-        Returns:
-
-        """
-
-        # Missing file path
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({"batch": 1}),
-                               content_type="application/json")
-
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SPECIFIED_FILE_NOT_EXIST,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.SPECIFIED_FILE_NOT_EXIST),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # File path error
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({"batch": 1,
-                                                "filepath": "D\\"}),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SPECIFIED_FILE_NOT_EXIST,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.SPECIFIED_FILE_NOT_EXIST),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # Data error in yaml file
-        yaml_path = os.path.join(
-            os.path.dirname(system_config.BASE_PATH),
-            "test",
-            "common_files",
-            "test_wrong_format_yaml")
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({"filepath": yaml_path,
-                                                "batch": 1
-                                                }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.YAML_FILE_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.YAML_FILE_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        # Data error in yaml file
-        yaml_path = os.path.join(
-            os.path.dirname(system_config.BASE_PATH),
-            "test",
-            "common_files",
-            "test_wrong_main_yaml")
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({"filepath": yaml_path,
-                                                "batch": 1
-                                                }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.YAML_FILE_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.YAML_FILE_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_batch_error(self):
-        """
-        test_batch_error
-        Returns:
-
-        """
-
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({"filepath": "C:\\",
-                                                "batch": 2
-                                                }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_true_yaml(self):
-        """
-
-        Returns:
-
-        """
-        yaml_path = os.path.join(
-            os.path.dirname(system_config.BASE_PATH),
-            "test",
-            "common_files",
-            "test_true_yaml")
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({"filepath": yaml_path,
-                                                "batch": 1
-                                                }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.SUCCESS),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_single_update(self):
-        """
-
-        Returns:
-
-        """
-
-        # Various parameters are missing
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({
-                                   "pkg_name": "",
-                                   "batch": 0
-                               }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-        resp_dict.get("data")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PACK_NAME_NOT_FOUND,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PACK_NAME_NOT_FOUND),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({
-                                   "pkg_name": "CUnit",
-                                   "batch": 0,
-                                   "maintainlevel": "a"
-                               }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-        resp_dict.get("data")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({
-                                   "pkg_name": "CUnit",
-                                   "batch": 0,
-                                   "maintainlevel": "6"
-                               }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-        resp_dict.get("data")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PARAM_ERROR,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PARAM_ERROR),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_ture_name(self):
-        """
-        test table name
-        Returns:
-
-        """
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({
-                                   "pkg_name": "CUnit",
-                                   "batch": 0,
-                                   "maintainlevel": "4"
-                               }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-        resp_dict.get("data")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.SUCCESS),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_pkg_name(self):
-        """
-        test_pkg_name
-        Returns:
-
-        """
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({
-                                   "pkg_name": "",
-                                   "batch": 0,
-                                   "maintainlevel": "4"
-                               }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-        resp_dict.get("data")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.PACK_NAME_NOT_FOUND,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.PACK_NAME_NOT_FOUND),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
-
-    def test_true_updata_single(self):
-        """
-        test_true_single
-        Returns:
-
-        """
-        resp = self.client.put("/lifeCycle/updatePkgInfo",
-                               data=json.dumps({
-                                   "pkg_name": "CUnit",
-                                   "batch": 0,
-                                   "maintainlevel": "4"
-                               }),
-                               content_type="application/json")
-        resp_dict = json.loads(resp.data)
-        resp_dict.get("data")
-
-        self.assertIn("code", resp_dict, msg="Error in data format return")
-        self.assertEqual(ResponseCode.SUCCESS,
-                         resp_dict.get("code"),
-                         msg="Error in status code return")
-
-        self.assertIn("msg", resp_dict, msg="Error in data format return")
-        self.assertEqual(
-            ResponseCode.CODE_MSG_MAP.get(
-                ResponseCode.SUCCESS),
-            resp_dict.get("msg"),
-            msg="Error in status prompt return")
-
-        self.assertIn("data", resp_dict, msg="Error in data format return")
-        self.assertIsNone(
-            resp_dict.get("data"),
-            msg="Error in data information return")
diff --git a/packageship/test/write_test.py b/packageship/test/write_test.py
deleted file mode 100644
index 7d12f13d69e97fb1a485c045f6ce8efa9cf862c4..0000000000000000000000000000000000000000
--- a/packageship/test/write_test.py
+++ /dev/null
@@ -1,44 +0,0 @@
-#!/usr/bin/python3
-# -*- coding:utf-8 -*-
-"""
-Execute all test cases
-"""
-import unittest
-import datetime
-from test.base_code.my_test_runner import MyTestRunner
-
-RUNNER = MyTestRunner()
-
-
-def write_data_tests():
-    """Test cases with write operations on data"""
-
-    from test.test_module.repodatas_test.test_delete_repodatas import TestDeleteRepodatas
-    from test.test_module.single_package_tests.test_update_singlepack import TestBatchUpdatePackage
-    from test.test_module.lifecycle.test_issue_catch import TestIssueCatch
-    suite = unittest.TestSuite()
-
-    classes = [
-        TestDeleteRepodatas,
-        TestBatchUpdatePackage,
-        TestIssueCatch
-    ]
-    for cls in classes:
-        suite.addTests(unittest.TestLoader().loadTestsFromTestCase(cls))
-    return RUNNER.run(suite)
-
-
-start_time = datetime.datetime.now()
-
-result_4_write = write_data_tests()
-
-stop_time = datetime.datetime.now()
-
-print('\nA Write Test total of %s test cases were run: \nsuccessful:%s\tfailed:%s\terror:%s\n' % (
-    int(result_4_write.testsRun),
-    int(result_4_write.success_case_count),
-    int(result_4_write.failure_case_count),
-    int(result_4_write.err_case_count)
-))
-
-print('Write Test Total Time: %s' % (stop_time - start_time)) diff --git a/packageship/web-ui/.browserslistrc b/packageship/web-ui/.browserslistrc deleted file mode 100644 index 214388fe43cdfd7ce1c29cd3e401541ded620dba..0000000000000000000000000000000000000000 --- a/packageship/web-ui/.browserslistrc +++ /dev/null @@ -1,3 +0,0 @@ -> 1% -last 2 versions -not dead diff --git a/packageship/web-ui/.eslintrc.js b/packageship/web-ui/.eslintrc.js deleted file mode 100644 index 948395b666ffdf677d191e6dfaed5278151d8a36..0000000000000000000000000000000000000000 --- a/packageship/web-ui/.eslintrc.js +++ /dev/null @@ -1,21 +0,0 @@ -/** - * @file 代码规范配置文件 - * */ - -module.exports = { - root: true, - env: { - node: true - }, - 'extends': [ - 'plugin:vue/essential', - 'eslint:recommended' - ], - parserOptions: { - parser: 'babel-eslint' - }, - rules: { - 'no-console': process.env.NODE_ENV === 'production' ? 'warn' : 'off', - 'no-debugger': process.env.NODE_ENV === 'production' ? 'warn' : 'off' - } -}; \ No newline at end of file diff --git a/packageship/web-ui/Dockerfile b/packageship/web-ui/Dockerfile deleted file mode 100644 index bc3130832e5599de716f07606dde3ca2549910f4..0000000000000000000000000000000000000000 --- a/packageship/web-ui/Dockerfile +++ /dev/null @@ -1,24 +0,0 @@ -FROM node:alpine as Builder - -MAINTAINER zhangxiaopan - -RUN mkdir -p /home/openeuler/pkgwebui -WORKDIR /home/openeuler/pkgwebui -COPY . 
/home/openeuler/pkgwebui
-
-RUN npm install -g vue && \
-    npm install && \
-    npm run build
-
-FROM nginx:1.19.2
-
-COPY --from=Builder /home/openeuler/pkgwebui/dist /usr/share/nginx/html/
-RUN chmod -R 755 /usr/share/nginx/html
-COPY ./deploy/nginx/default.conf /etc/nginx/conf.d/
-
-ENV RUN_USER nginx
-ENV RUN_GROUP nginx
-EXPOSE 8080
-ENTRYPOINT ["nginx", "-g", "daemon off;"]
-
-
diff --git a/packageship/web-ui/README.md b/packageship/web-ui/README.md
deleted file mode 100644
index 362eb2092a92a02c95fa773b3c3e56b32e020c27..0000000000000000000000000000000000000000
--- a/packageship/web-ui/README.md
+++ /dev/null
@@ -1,24 +0,0 @@
-# openeuler-v2
-
-## Project setup
-```
-npm install
-```
-
-### Compiles and hot-reloads for development
-```
-npm run serve
-```
-
-### Compiles and minifies for production
-```
-npm run build
-```
-
-### Lints and fixes files
-```
-npm run lint
-```
-
-### Customize configuration
-See [Configuration Reference](https://cli.vuejs.org/config/).
diff --git a/packageship/web-ui/babel.config.js b/packageship/web-ui/babel.config.js
deleted file mode 100644
index e5921a25e798e8a22b7f3028d08dd8e0481fd35f..0000000000000000000000000000000000000000
--- a/packageship/web-ui/babel.config.js
+++ /dev/null
@@ -1,18 +0,0 @@
-/**
- * @file Babel configuration file
- * */
-
-module.exports = {
-  presets: [
-    '@vue/cli-plugin-babel/preset'
-  ],
-  'plugins': [
-    [
-      'component',
-      {
-        'libraryName': 'element-ui',
-        'styleLibraryName': 'theme-chalk'
-      }
-    ]
-  ]
-};
\ No newline at end of file
diff --git a/packageship/web-ui/deploy/nginx/default.conf b/packageship/web-ui/deploy/nginx/default.conf
deleted file mode 100644
index 9094297456db871796a0482b493be80503eb9ef9..0000000000000000000000000000000000000000
--- a/packageship/web-ui/deploy/nginx/default.conf
+++ /dev/null
@@ -1,18 +0,0 @@
-server {
-    listen 8080;
-    include /etc/nginx/mime.types;
-    default_type application/octet-stream;
-    server_name localhost;
-
-    #charset koi8-r;
-    #access_log 
/var/log/nginx/host.access.log  main;
-
-    location / {
-        root /usr/share/nginx/html;
-        index /index.html;
-    }
-
-    location /api/ {
-        proxy_pass https://api.openeuler.org/pkgmanage/;
-    }
-}
\ No newline at end of file
diff --git a/packageship/web-ui/package.json b/packageship/web-ui/package.json
deleted file mode 100644
index 1cda30914a7b1515f3af0707ace5f961c5b5263a..0000000000000000000000000000000000000000
--- a/packageship/web-ui/package.json
+++ /dev/null
@@ -1,34 +0,0 @@
-{
-  "name": "Package-Management",
-  "version": "0.1.0",
-  "private": true,
-  "scripts": {
-    "serve": "vue-cli-service serve --port 80",
-    "build": "vue-cli-service build",
-    "lint": "vue-cli-service lint"
-  },
-  "dependencies": {
-    "axios": "^0.19.2",
-    "core-js": "^3.6.5",
-    "echarts": "^4.9.0",
-    "element-ui": "^2.13.2",
-    "vue": "^2.6.11",
-    "vue-i18n": "^8.18.2",
-    "vue-router": "^3.2.0"
-  },
-  "devDependencies": {
-    "@vue/cli-plugin-babel": "~4.4.0",
-    "@vue/cli-plugin-eslint": "~4.4.0",
-    "@vue/cli-plugin-router": "~4.4.0",
-    "@vue/cli-service": "~4.4.0",
-    "babel-eslint": "^10.1.0",
-    "babel-plugin-component": "^1.1.1",
-    "eslint": "^6.7.2",
-    "eslint-plugin-vue": "^6.2.2",
-    "less": "^3.0.4",
-    "less-loader": "^5.0.0",
-    "style-resources-loader": "^1.3.3",
-    "vue-cli-plugin-style-resources-loader": "^0.1.4",
-    "vue-template-compiler": "^2.6.11"
-  }
-}
diff --git a/packageship/web-ui/public/favicon.ico b/packageship/web-ui/public/favicon.ico
deleted file mode 100644
index df36fcfb72584e00488330b560ebcf34a41c64c2..0000000000000000000000000000000000000000
Binary files a/packageship/web-ui/public/favicon.ico and /dev/null differ
diff --git a/packageship/web-ui/public/index.html b/packageship/web-ui/public/index.html
deleted file mode 100644
index 622b5dd01027aad1c8fbc4e4aaacace1ffd9275c..0000000000000000000000000000000000000000
--- a/packageship/web-ui/public/index.html
+++ /dev/null
@@ -1,21 +0,0 @@
-
-
-
-
-
-    <%= htmlWebpackPlugin.options.title %>
-
-
-
-
-
-
-
-    We're sorry but <%= 
htmlWebpackPlugin.options.title %> doesn't work properly without JavaScript enabled.
-      Please enable it to continue.
-
-
-
-
-
\ No newline at end of file
diff --git a/packageship/web-ui/src/App.vue b/packageship/web-ui/src/App.vue
deleted file mode 100644
index 33b4367d78aa21a51d0383bb078fc71859a21579..0000000000000000000000000000000000000000
--- a/packageship/web-ui/src/App.vue
+++ /dev/null
@@ -1,35 +0,0 @@
-
-
-
-
-
-
-
-
-
-
-
-
diff --git a/packageship/web-ui/src/api/issue.js b/packageship/web-ui/src/api/issue.js
deleted file mode 100644
index 6f6a09dfab542881fe41a22110addd02ce190688..0000000000000000000000000000000000000000
--- a/packageship/web-ui/src/api/issue.js
+++ /dev/null
@@ -1,76 +0,0 @@
-/**
- * @file Package-management issue API configuration file
- * */
-
-import appAjax from './../libs/ajax-utils';
-export const issueList = ({
-    pageNum,
-    pageSize,
-    issueType,
-    issueStatus
-}) => {
-    return new Promise((resolve, reject) => {
-        appAjax.postJson({
-            url: '/lifeCycle/issuetrace',
-            type: 'get',
-            params: {
-                page_num: pageNum,
-                page_size: pageSize,
-                issue_type: issueType,
-                issue_status: issueStatus
-            },
-            success(result) {
-                if (result) {
-                    resolve(result);
-                    return;
-                }
-                reject(result);
-            },
-            error(msg) {
-                reject(msg);
-            }
-
-        });
-
-    });
-};
-
-export const issueType = () => {
-    return new Promise((resolve, reject) => {
-        appAjax.postJson({
-            url: '/lifeCycle/issuetype',
-            type: 'get',
-            success(result) {
-                if (result) {
-                    resolve(result);
-                    return;
-                }
-                reject(result);
-            },
-            error(msg) {
-                reject(msg);
-            }
-
-        });
-
-    });
-};
-
-export const issueStatus = () => new Promise((resolve, reject) => {
-    appAjax.postJson({
-        url: '/lifeCycle/issuestatus',
-        type: 'get',
-        success(result) {
-            if (result) {
-                resolve(result);
-                return;
-            }
-            reject(result);
-        },
-        error(msg) {
-            reject(msg);
-        }
-
-    });
-
-});
diff --git a/packageship/web-ui/src/api/repo.js b/packageship/web-ui/src/api/repo.js
deleted file mode 100644
index 
4786ad4c9f588c3a08e3b077a62edac1865da43c..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/api/repo.js +++ /dev/null @@ -1,127 +0,0 @@ -/** - * @file 包管理接口配置文件 - * */ - -import appAjax from './../libs/ajax-utils'; -export const packages = ({ - pageNum, - pageSize, - tableName, - queryPkgName, - maintainer, - maintainlevel -}) => { - return new Promise((resolve, reject) => { - appAjax.postJson({ - url: '/packages', - type: 'get', - params: { - page_num: pageNum, - page_size: pageSize, - table_name: tableName, - query_pkg_name: queryPkgName, - maintainer, - maintainlevel - }, - success(result) { - if (result) { - resolve(result); - return; - } - reject(result); - }, - error(msg) { - reject(msg); - } - - }); - - }); -}; -export const productVersion = () => { - return new Promise((resolve, reject) => { - appAjax.postJson({ - url: '/lifeCycle/tables', - type: 'get', - success(result) { - if (result) { - resolve(result); - return; - } - reject(result); - }, - error(msg) { - reject(msg); - } - - }); - - }); -}; - -export const tableCol = () => { - return new Promise((resolve, reject) => { - appAjax.postJson({ - url: '/packages/tablecol', - type: 'get', - success(result) { - if (result) { - resolve(result); - return; - } - reject(result); - }, - error(msg) { - reject(msg); - } - - }); - - }); -}; - -export const packageDetail = ({table_name, pkg_name}) => { - return new Promise((resolve, reject) => { - appAjax.postJson({ - url: '/packages/packageInfo', - type: 'get', - params: { - table_name, - pkg_name - }, - success(result) { - if (result) { - resolve(result); - return; - } - reject(result); - }, - error(msg) { - reject(msg); - } - - }); - - }); -}; - -export const maintainer = () => { - return new Promise((resolve, reject) => { - appAjax.postJson({ - url: '/lifeCycle/maintainer', - type: 'get', - success(result) { - if (result) { - resolve(result); - return; - } - reject(result); - }, - error(msg) { - reject(msg); - } - - }); - - }); -}; \ No 
newline at end of file diff --git a/packageship/web-ui/src/assets/fonts/FZLTCHJW.TTF b/packageship/web-ui/src/assets/fonts/FZLTCHJW.TTF deleted file mode 100644 index 2ba2430be2cfc3bb6de1f2870b6d5364ad69e1e3..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/FZLTCHJW.TTF and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/FZLTHJW.TTF b/packageship/web-ui/src/assets/fonts/FZLTHJW.TTF deleted file mode 100644 index 379dc975c37291aa56568ff1f078ed12345a280e..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/FZLTHJW.TTF and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/FZLTXIHJW.TTF b/packageship/web-ui/src/assets/fonts/FZLTXIHJW.TTF deleted file mode 100644 index 5800b22a5e222d3a42a35fb91b020f806520ed23..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/FZLTXIHJW.TTF and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/HuaweiSans-Bold.ttf b/packageship/web-ui/src/assets/fonts/HuaweiSans-Bold.ttf deleted file mode 100644 index f3a838546d3141c6684098f77bdaa763db8e5ffe..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/HuaweiSans-Bold.ttf and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/HuaweiSans-Light.ttf b/packageship/web-ui/src/assets/fonts/HuaweiSans-Light.ttf deleted file mode 100644 index d3b3ac183eab2eb6e69b7b49905157b72dffdf47..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/HuaweiSans-Light.ttf and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/HuaweiSans-Medium.ttf b/packageship/web-ui/src/assets/fonts/HuaweiSans-Medium.ttf deleted file mode 100644 index 6878ed7c64b5f0d0ac4c24f3fcbfba9aa90e58a1..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/HuaweiSans-Medium.ttf and /dev/null differ diff --git 
a/packageship/web-ui/src/assets/fonts/HuaweiSans-Regular.ttf b/packageship/web-ui/src/assets/fonts/HuaweiSans-Regular.ttf deleted file mode 100644 index 3118eaffb6d174531b17061c8c133b591256e152..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/HuaweiSans-Regular.ttf and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/Roboto-Black.ttf b/packageship/web-ui/src/assets/fonts/Roboto-Black.ttf deleted file mode 100644 index 86ec2b29ba56a3d6c45f1a8584ff3780fa70c60e..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/Roboto-Black.ttf and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/Roboto-BlackItalic.ttf b/packageship/web-ui/src/assets/fonts/Roboto-BlackItalic.ttf deleted file mode 100644 index 1904c99b2c81e3e278d3c373f397b38049824d4c..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/Roboto-BlackItalic.ttf and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/Roboto-Bold.ttf b/packageship/web-ui/src/assets/fonts/Roboto-Bold.ttf deleted file mode 100644 index 91ec21227866ca9d1cf77ec13660b7b85ec900dd..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/Roboto-Bold.ttf and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/Roboto-Light.ttf b/packageship/web-ui/src/assets/fonts/Roboto-Light.ttf deleted file mode 100644 index d43e943312e0f2c653815dd791d93f94f0abd73f..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/Roboto-Light.ttf and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/Roboto-Medium.ttf b/packageship/web-ui/src/assets/fonts/Roboto-Medium.ttf deleted file mode 100644 index 87983419893a8952c3f286dc56d37fb94e320da0..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/Roboto-Medium.ttf and /dev/null differ diff --git 
a/packageship/web-ui/src/assets/fonts/Roboto-Regular.ttf b/packageship/web-ui/src/assets/fonts/Roboto-Regular.ttf deleted file mode 100644 index 7d9a6c4c32d7e920b549caf531e390733496b6e0..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/Roboto-Regular.ttf and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/icomoon.eot b/packageship/web-ui/src/assets/fonts/icomoon.eot deleted file mode 100644 index 296ebc035c5b8d333fc5b33d192e9aab34674455..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/icomoon.eot and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/icomoon.svg b/packageship/web-ui/src/assets/fonts/icomoon.svg deleted file mode 100644 index 5993cbfc3e08ce844a2560a3c02a5b559c3db9a4..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/assets/fonts/icomoon.svg +++ /dev/null @@ -1,13 +0,0 @@ - - - -Generated by IcoMoon - - - - - - - - - \ No newline at end of file diff --git a/packageship/web-ui/src/assets/fonts/icomoon.ttf b/packageship/web-ui/src/assets/fonts/icomoon.ttf deleted file mode 100644 index 4d22f59786856202df95442ab1af225a53a7a1e3..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/icomoon.ttf and /dev/null differ diff --git a/packageship/web-ui/src/assets/fonts/icomoon.woff b/packageship/web-ui/src/assets/fonts/icomoon.woff deleted file mode 100644 index 04a91af85e46766f2d489439cef5f58fa209467b..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/fonts/icomoon.woff and /dev/null differ diff --git a/packageship/web-ui/src/assets/images/Gitee.png b/packageship/web-ui/src/assets/images/Gitee.png deleted file mode 100644 index bb68b9c00d7cef5091a97d5d589cdbbf76374990..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/images/Gitee.png and /dev/null differ diff --git a/packageship/web-ui/src/assets/images/column.svg 
b/packageship/web-ui/src/assets/images/column.svg deleted file mode 100644 index 48afb28d4def283a58d91965748dcee773919246..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/assets/images/column.svg +++ /dev/null @@ -1,9 +0,0 @@ - - - 形状结合 - - - - - - \ No newline at end of file diff --git a/packageship/web-ui/src/assets/images/dowmload.svg b/packageship/web-ui/src/assets/images/dowmload.svg deleted file mode 100644 index 0dd2ff5cda735dfffeba4f7299169d1c4cc59d54..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/assets/images/dowmload.svg +++ /dev/null @@ -1,9 +0,0 @@ - - - Style - - - - - - \ No newline at end of file diff --git a/packageship/web-ui/src/assets/images/footer-logo.png b/packageship/web-ui/src/assets/images/footer-logo.png deleted file mode 100644 index e029c8d7daad57c231228b2140a505fcf93b8e79..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/images/footer-logo.png and /dev/null differ diff --git a/packageship/web-ui/src/assets/images/lang.png b/packageship/web-ui/src/assets/images/lang.png deleted file mode 100644 index 4d968b6ba46e2d3fabf1d0806c5f32ef96e30c4d..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/images/lang.png and /dev/null differ diff --git a/packageship/web-ui/src/assets/images/logo-mobile.png b/packageship/web-ui/src/assets/images/logo-mobile.png deleted file mode 100644 index d26be4e7ac88b84d44dc593059afcfe274986b28..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/images/logo-mobile.png and /dev/null differ diff --git a/packageship/web-ui/src/assets/images/menu-mobile.png b/packageship/web-ui/src/assets/images/menu-mobile.png deleted file mode 100644 index aaa0e59a59327cde15d49037b5a01f1bee3c1f3b..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/images/menu-mobile.png and /dev/null differ diff --git 
a/packageship/web-ui/src/assets/images/openeuler.png b/packageship/web-ui/src/assets/images/openeuler.png deleted file mode 100644 index f7da72fd7ddd2a0965f8dd90ece4094b00a59e0b..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/images/openeuler.png and /dev/null differ diff --git a/packageship/web-ui/src/assets/images/search.png b/packageship/web-ui/src/assets/images/search.png deleted file mode 100644 index 11ee04ea7e4fa8872035b5b1451e3f8e06954f9b..0000000000000000000000000000000000000000 Binary files a/packageship/web-ui/src/assets/images/search.png and /dev/null differ diff --git a/packageship/web-ui/src/assets/style/base.css b/packageship/web-ui/src/assets/style/base.css deleted file mode 100644 index d7a2de0df496c01ce297142089add4623fb5c045..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/assets/style/base.css +++ /dev/null @@ -1,181 +0,0 @@ -/* CSS Document */ -/*css reset*/ -html { - box-sizing: border-box; -} - -*, -*:before, -*:after { - box-sizing: inherit; -} - -body, -div, -dl, -dt, -dd, -ul, -ol, -li, -h1, -h2, -h3, -h4, -h5, -h6, -pre, -form, -fieldset, -input, -textarea, -p, -blockquote, -th, -td { - margin: 0; - padding: 0; -} - -table { - border-collapse: collapse; - border-spacing: 0; -} - -fieldest, -img { - border: 0; -} - -address, -caption, -cite, -code, -dfn, -em, -strong, -th, -var { - font-style: normal; - font-weight: normal; -} - -ol, -ul { - list-style: none; -} - -caption, -th { - text-align: left; -} - -h1, -h2, -h3, -h4, -h5, -h6 { - font-size: 100%; - font-weight: normal; -} - -p:before, -q:after { - content: ""; -} - -abbr, -acronym { - border: 0; -} - -/*定位*/ -.tl { - text-align: left; -} - -.tc { - text-align: center; -} - -.tr { - text-align: right; -} - -.bc { - margin-left: auto; - margin-right: auto; -} - -.fl { - float: left; -} - -.fr { - float: right; -} - -.cb { - clear: both; -} - -.cl { - clear: left; -} - -.cr { - clear: right; -} - -.clearfix:after { - content: 
"."; - display: block; - height: 0; - clear: both; - visibility: hidden; -} - -.clearfix { - display: inline-block; -} - -@font-face { - font-family: "icomoon"; - src: url(../fonts/icomoon.eot?4mtq8t); - src: url(../fonts/icomoon.eot?4mtq8t#iefix) format('embedded-opentype'), - url(../fonts/icomoon.ttf?4mtq8t) format('truetype'), - url(../fonts/icomoon.woff?4mtq8t) format('woff'), - url(../fonts/icomoon.svg?4mtq8t#icomoon) format('svg'); - font-weight: normal; - font-style: normal; - font-display: block; -} - - [class^="icon-"], - [class*=" icon-"] { - /* use !important to prevent issues with browser extensions that change fonts */ - font-family: "icomoon" !important; - speak: never; - font-style: normal; - font-weight: normal; - font-variant: normal; - text-transform: none; - line-height: 1; - - /* Better Font Rendering =========== */ - -webkit-font-smoothing: antialiased; - -moz-osx-font-smoothing: grayscale; - } - - .icon-menu:before { - content: "\e900"; - } - - .icon-arrow:before { - content: "\e901"; - } - - .icon-search:before { - content: "\e902"; - } \ No newline at end of file diff --git a/packageship/web-ui/src/assets/style/font-cn.css b/packageship/web-ui/src/assets/style/font-cn.css deleted file mode 100644 index 17b9e3ed8495169bafe99e86cd43c92bb2ab589a..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/assets/style/font-cn.css +++ /dev/null @@ -1,17 +0,0 @@ -@font-face { - font-family: "FZLTHJW"; - src: url(../fonts/FZLTHJW.TTF); -} -@font-face { - font-family: "FZLTCHJW"; - src: url(../fonts/FZLTCHJW.TTF); -} - -@font-face { - font-family: "FZLTHJW"; - src: url(../fonts/FZLTHJW.TTF); -} -@font-face { - font-family: "FZLTXIHJW"; - src: url(../fonts/FZLTXIHJW.TTF); -} \ No newline at end of file diff --git a/packageship/web-ui/src/assets/style/font-en.css b/packageship/web-ui/src/assets/style/font-en.css deleted file mode 100644 index 1bb936bb472e11d7e2a518da00c410b692f0737c..0000000000000000000000000000000000000000 --- 
a/packageship/web-ui/src/assets/style/font-en.css +++ /dev/null @@ -1,40 +0,0 @@ -@font-face { - font-family: "HuaweiSans-Bold"; - src: url(../fonts/HuaweiSans-Bold.ttf); -} -@font-face { - font-family: "HuaweiSans-Light"; - src: url(../fonts/HuaweiSans-Light.ttf); -} -@font-face { - font-family: "HuaweiSans-Medium"; - src: url(../fonts/HuaweiSans-Medium.ttf); -} -@font-face { - font-family: "HuaweiSans"; - src: url(../fonts/HuaweiSans-Regular.ttf); -} -@font-face { - font-family: "Roboto-Black"; - src: url(../fonts/Roboto-Black.ttf); -} -@font-face { - font-family: "Roboto-BlackItalic"; - src: url(../fonts/Roboto-BlackItalic.ttf); -} -@font-face { - font-family: "Roboto-Bold"; - src: url(../fonts/Roboto-Bold.ttf); -} -@font-face { - font-family: "Roboto-Light"; - src: url(../fonts/Roboto-Light.ttf); -} -@font-face { - font-family: "Roboto-Medium"; - src: url(../fonts/Roboto-Medium.ttf); -} -@font-face { - font-family: "Roboto-Regular"; - src: url(../fonts/Roboto-Regular.ttf); -} \ No newline at end of file diff --git a/packageship/web-ui/src/assets/style/vars.less b/packageship/web-ui/src/assets/style/vars.less deleted file mode 100644 index 372d0dc7afdfc4396a90d71ebc8dc14393b96d67..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/assets/style/vars.less +++ /dev/null @@ -1,61 +0,0 @@ -@primary-color: #002FA7; //主色 -@primary-color2: #0041BD; //主色2 -@text-white: #fff; -@text-dark: #000; -@text-dark2: #0B162B; -@text-light: rgba(0, 0, 0, 0.85); -@text-lighter: rgba(0, 0, 0, 0.7); -@text-lightest: rgba(0, 0, 0, 0.5); - -.ff-h(){ - font-family: FZLTHJW; -} -.ff-xih(){ - font-family: FZLTXIHJW; -} -.ff-ch(){ - font-family: FZLTCHJW; -} -.ff-hwsans(){ - font-family: HuaweiSans; -} -.ff-hwsans-b(){ - font-family: HuaweiSans-Bold; -} -.ff-hwsans-m(){ - font-family: HuaweiSans-Medium; -} -.ff-pfsc-r(){ - font-family: PingFangSC-Regular; -} - -.fz20(){ - font-size: 20px; -} -.fz18(){ - font-size: 18px; -} -.fz16(){ - font-size: 16px; -} -.fz14(){ - 
font-size: 14px; -} -.fz12(){ - font-size: 12px; -} -.fz60(){ - font-size: 60px; -} -.fz40(){ - font-size: 40px; -} -.fz24(){ - font-size: 24px; -} -.fz36(){ - font-size: 36px; -} -.fz72(){ - font-size: 72px; -} \ No newline at end of file diff --git a/packageship/web-ui/src/components/footer.vue b/packageship/web-ui/src/components/footer.vue deleted file mode 100644 index b61746eeeadf4bc2db773191870df43994944dc7..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/components/footer.vue +++ /dev/null @@ -1,102 +0,0 @@ - - - - - - - diff --git a/packageship/web-ui/src/components/header.vue b/packageship/web-ui/src/components/header.vue deleted file mode 100644 index fc2bd2a70eeefd87b6dfd7fef65f911df699b8b2..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/components/header.vue +++ /dev/null @@ -1,448 +0,0 @@ - - - - - - - - - - - {{ $t("common.gitee") }} - - - - - - - - - - diff --git a/packageship/web-ui/src/config/index.js b/packageship/web-ui/src/config/index.js deleted file mode 100644 index c6f69416206a377add74ddbf175931f1e6d35c7e..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/config/index.js +++ /dev/null @@ -1,7 +0,0 @@ -/** - * @file 站点基础配置文件 - * */ - -export default { - serviceBaseUrl: '/api', -}; \ No newline at end of file diff --git a/packageship/web-ui/src/lang/cn.js b/packageship/web-ui/src/lang/cn.js deleted file mode 100644 index e8003d7380560135601bf1d2821ae90782655424..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/lang/cn.js +++ /dev/null @@ -1,8 +0,0 @@ -/** - * @file 国际化中文配置主入口 - * */ - -module.exports = { - common: require('@/lang/lang-modules/common').cn, - home: require('@/lang/lang-modules/home').cn, -}; \ No newline at end of file diff --git a/packageship/web-ui/src/lang/en.js b/packageship/web-ui/src/lang/en.js deleted file mode 100644 index 9dfea9098077ffd357297cc26dac49dd0619fd23..0000000000000000000000000000000000000000 --- 
a/packageship/web-ui/src/lang/en.js +++ /dev/null @@ -1,8 +0,0 @@ -/** - * @file 国际化英文配置主入口 - * */ - -module.exports = { - common: require('@/lang/lang-modules/common').en, - home: require('@/lang/lang-modules/home').en, -}; \ No newline at end of file diff --git a/packageship/web-ui/src/lang/lang-modules/common.js b/packageship/web-ui/src/lang/lang-modules/common.js deleted file mode 100644 index 880c88b23ec2e6ff5659a2e89fe1e0e8ce4bfb12..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/lang/lang-modules/common.js +++ /dev/null @@ -1,233 +0,0 @@ -/** - * @file 公共模块国际化配置入口 - * */ - -module.exports = { - cn: { - searchPlaceholder: '输入内容', - navRouterConfig: [{ - path: '/download', - name: '下载', - children: [], - class: [] - }, - { - path: '/documentation/documentation', - name: '文档', - children: [], - class: [] - }, - { - path: '', - name: '社区', - subName: '社区玩转指引', - subPath: '/community/community-guidance', - subImg: '', - children: [{ - name: '活动', - path: '/community/event-list' - }, - { - name: '博客', - path: '/community/blog-list' - }, - { - name: '新闻', - path: '/community/news-list' - }, - { - name: '活动', - path: '/community/mailing-list' - } - ], - class: [] - }, - { - path: '', - name: 'SIG', - subName: 'SIG玩转指引', - subPath: '/sig/sig-guidance', - subImg: '', - viewAllName: '查看全部', - viewAllPath: '/sig/sig-list', - children: [{ - name: 'A-Tune', - path: '/sig/sig-detail/1' - }, - { - name: 'Base-service', - path: '/sig/sig-detail/2' - }, - { - name: 'Computing', - path: '/sig/sig-detail/3' - }, - { - name: 'DB', - path: '/sig/sig-detail/4' - }, - { - name: 'GNOME', - path: '/sig/sig-detail/5' - }, - { - name: 'Application', - path: '/sig/sig-detail/6' - }, - { - name: 'Compiler', - path: '/sig/sig-detail/7' - }, - { - name: 'Container', - path: '/sig/sig-detail/8' - }, - { - name: 'Desktop', - path: '/sig/sig-detail/9' - }, - { - name: 'Infrastructure', - path: '/sig/sig-detail/10' - } - ], - class: [] - }, - { - path: '/authentication', 
- name: '认证', - children: [], - class: [] - }, - { - path: '/security', - name: '安全', - children: [], - class: [] - } - ], - PAGE_NAME: 'Package Management', - lang: 'EN', - search: '搜索', - gitee: '码云', - footer: { - leftLogo: 'openEuler', - mail: 'contact@openeuler.org', - copyright: '版权所有 © 2020 openeuler 保留一切权利', - rightList: ['品牌', '法律声明', '隐私政策'] - } - }, - en: { - searchPlaceholder: 'Input content', - navRouterConfig: [{ - path: '/download', - name: 'Download', - children: [], - class: [] - }, - { - path: '/documentation/documentation', - name: 'Documentation', - children: [], - class: [] - }, - { - path: '', - name: 'Community', - subName: '社区玩转指引', - subPath: '/community/community-guidance', - subImg: '', - children: [{ - name: 'Events', - path: '/community/event-list' - }, - { - name: 'Blog', - path: '/community/blog-list' - }, - { - name: 'News', - path: '/community/news-list' - }, - { - name: 'Mailing', - path: '/community/mailing-list' - } - ], - class: [] - }, - { - path: '', - name: 'SIG', - subName: 'SIG Play guide', - subPath: '/sig/sig-guidance', - subImg: '', - viewAllName: 'View All', - viewAllPath: '/sig/sig-list', - children: [{ - name: 'A-Tune', - path: '/sig/sig-detail/1' - }, - { - name: 'Base-service', - path: '/sig/sig-detail/2' - }, - { - name: 'Computing', - path: '/sig/sig-detail/3' - }, - { - name: 'DB', - path: '/sig/sig-detail/4' - }, - { - name: 'GNOME', - path: '/sig/sig-detail/5' - }, - { - name: 'Application', - path: '/sig/sig-detail/6' - }, - { - name: 'Compiler', - path: '/sig/sig-detail/7' - }, - { - name: 'Container', - path: '/sig/sig-detail/8' - }, - { - name: 'Desktop', - path: '/sig/sig-detail/9' - }, - { - name: 'Infrastructure', - path: '/sig/sig-detail/10' - } - ], - class: [] - }, - { - path: '/authentication', - name: 'Authentication', - children: [], - class: [] - }, - { - path: '/security', - name: 'Security', - children: [], - class: [] - } - ], - PAGE_NAME: 'Package Management', - lang: '中', - search: 'search', 
-        gitee: 'gitee',
-        footer: {
-            mail: 'contact@openeuler.org',
-            copyright: 'Copyright © 2020 openEuler. All rights reserved.',
-            rightList: ['TradeMark', 'Legal', 'Privacy']
-        }
-    }
-};
\ No newline at end of file
diff --git a/packageship/web-ui/src/lang/lang-modules/home.js b/packageship/web-ui/src/lang/lang-modules/home.js
deleted file mode 100644
index ae9f3a357bffffeb5cea3bf51372f584fd518599..0000000000000000000000000000000000000000
--- a/packageship/web-ui/src/lang/lang-modules/home.js
+++ /dev/null
@@ -1,88 +0,0 @@
-/**
- * @file i18n configuration entry for the home module
- * */
-
-module.exports = {
-    cn: {
-        MANAGEMENT: '包管理',
-        PACKAGE_INF0: 'Package Info',
-        ISSUE_LIST: 'Issue List',
-        PKG_VERSION: '版本号',
-        SEARCH: '搜索',
-        BTN_DISPLAY: 'Custom Display Column',
-        BTN_EXCEL: '导出表格',
-        BTN_OK: '确认',
-        BTN_CLEAR: '取消',
-        TABLE_LABEL: {
-            NAME: 'Name',
-            MAINTAINER: 'Maintainer',
-            MAIN_LEVEL: 'Maintenance Level'
-        },
-        MOBILE_VERSION: 'Product Version',
-        MOBILE_SEARCH: 'Search Name',
-        PKG_TABLE: {
-            VERSION: 'Version:',
-            RELEASE: 'Release:',
-            URL: 'URL:',
-            LICENSE: 'License:',
-            FEATURE: 'Feature:',
-            MAINTAINER: 'Maintainer:',
-            MAINTIAN_LEVEL: 'Maintainlevel:',
-            REPO_URL: 'Repo URL:',
-            SUMMARY: 'Summary:',
-            DESCRIPTION: 'Description:',
-            SUBPACK: 'Subpack:',
-            REQUIRED: 'Required:',
-            ISSUE_NUM: 'Issue Num:'
-        },
-        ISSUE_TABLE: {
-            ISSUE_ID: 'Issue ID',
-            PACKAGE_NAME: 'Package Name',
-            ISSUE_TITLE: 'Issue Title',
-            ISSUE_TYPE: 'Issue Type',
-            ISSUE_STATUS: 'Issue Status',
-            MAINTAINER: 'Maintainer',
-        }
-    },
-    en: {
-        MANAGEMENT: 'Package Management',
-        PACKAGE_INF0: 'Package Info',
-        ISSUE_LIST: 'Issue List',
-        PKG_VERSION: 'Product Version',
-        SEARCH: 'Search',
-        BTN_DISPLAY: 'Custom Display Column',
-        BTN_EXCEL: 'Export Excel',
-        BTN_OK: 'OK',
-        BTN_CLEAR: 'Clear',
-        TABLE_LABEL: {
-            NAME: 'Name',
-            MAINTAINER: 'Maintainer',
-            MAIN_LEVEL: 'Maintenance Level'
-        },
-        MOBILE_VERSION: 'Product Version',
-        MOBILE_SEARCH: 'Search',
-        PKG_TABLE: {
-            VERSION: 'Version:',
-            RELEASE: 'Release:',
-            URL: 'URL:',
-            LICENSE: 'License:',
-            FEATURE: 'Feature:',
-            MAINTAINER: 'Maintainer:',
-            MAINTIAN_LEVEL: 'Maintainlevel:',
-            REPO_URL: 'Repo URL:',
-            SUMMARY: 'Summary:',
-            DESCRIPTION: 'Description:',
-            SUBPACK: 'Subpack:',
-            REQUIRED: 'Required:',
-            ISSUE_NUM: 'Issue Num:'
-        },
-        ISSUE_TABLE: {
-            ISSUE_ID: 'Issue ID',
-            PACKAGE_NAME: 'Package Name',
-            ISSUE_TITLE: 'Issue Title',
-            ISSUE_TYPE: 'Issue Type',
-            ISSUE_STATUS: 'Issue Status',
-            MAINTAINER: 'Maintainer',
-        }
-    }
-};
\ No newline at end of file
diff --git a/packageship/web-ui/src/libs/ajax-utils.js b/packageship/web-ui/src/libs/ajax-utils.js
deleted file mode 100644
index 5624cef4009f8bbc91f5b9dad5168f606866df3c..0000000000000000000000000000000000000000
--- a/packageship/web-ui/src/libs/ajax-utils.js
+++ /dev/null
@@ -1,43 +0,0 @@
-/**
- * @file axios utility helpers
- * */
-
-import axios from 'axios';
-import config from '@/config';
-import Vue from 'vue';
-
-let postJson = params => {
-    let api = axios.create({
-        baseURL: config.serviceBaseUrl || ''
-    });
-    api.defaults.headers.post['Content-Type'] = 'application/json';
-
-    // Request payload
-    let dataStr = params['data'] && ((typeof (params['data']) === 'object')
-        ? JSON.stringify(params['data']) : params['data']);
-
-    let ajaxParams = {};
-    // Wrap the caller's success callback
-    ajaxParams['success'] = function (d) {
-        const data = typeof d.data == 'string' ? JSON.parse(d.data) : d.data;
-        if (data) {
-            params.success(data);
-        } else {
-            new Vue().$message.error('开小差~请稍后重试。');
-        }
-    };
-
-    return api({
-        method: params['type'] || 'post',
-        url: params['url'],
-        data: dataStr,
-        params: params['params'],
-        responseType: 'json'
-    }).then(ajaxParams['success']).catch(params['error']);
-};
-
-let exportsMethods = {
-    postJson: params => postJson(params)
-};
-
-export default exportsMethods;
\ No newline at end of file
diff --git a/packageship/web-ui/src/main.js b/packageship/web-ui/src/main.js
deleted file mode 100644
index 71ab022755b178a4f8589171d391814f789d6799..0000000000000000000000000000000000000000
--- a/packageship/web-ui/src/main.js
+++ /dev/null
@@ -1,40 +0,0 @@
-/**
- * @file Main entry for the Vue application
- * */
-
-import Vue from 'vue';
-import App from './App.vue';
-import router from './router';
-import ElementUI from 'element-ui';
-import locale from 'element-ui/lib/locale/lang/en';
-import Vue18n from 'vue-i18n';
-import '@/assets/style/base.css';
-import echarts from 'echarts';
-
-
-if (!localStorage.getItem('locale') || localStorage.getItem('locale') === 'zh-en') {
-    import('@/assets/style/font-en.css');
-} else {
-    import('@/assets/style/font-cn.css');
-}
-
-Vue.use(Vue18n);
-Vue.use(ElementUI, {locale});
-Vue.use(echarts)
-Vue.prototype.$echarts = echarts
-
-const i18n = new Vue18n({
-    locale: localStorage.getItem('locale') || 'zh-en',
-    messages: {
-        'zh-cn': require('@/lang/cn'), // Chinese language pack
-        'zh-en': require('@/lang/en') // English language pack
-    }
-});
-
-Vue.config.productionTip = false;
-
-new Vue({
-    i18n,
-    router,
-    render: h => h(App)
-}).$mount('#app');
\ No newline at end of file
diff --git a/packageship/web-ui/src/router/index.js b/packageship/web-ui/src/router/index.js
deleted file mode 100644
index 85733588c29278b300e3b489a9999c641d3585b1..0000000000000000000000000000000000000000
--- a/packageship/web-ui/src/router/index.js
+++ /dev/null
@@ -1,26 +0,0 @@
-/**
- * @file Router configuration entry
- * */
-
-import Vue from 'vue';
-import Router from 'vue-router';
-import routes from './routers';
-
-Vue.use(Router);
-
-const router = new Router({
-    routes,
-    mode: 'history'
-});
-
-const originalPush = Router.prototype.push;
-
-Router.prototype.push = function push(location) {
-    return originalPush.call(this, location).catch(err => err);
-}
-
-router.beforeEach((to, from, next) => {
-    next();
-});
-
-export default router;
\ No newline at end of file
diff --git a/packageship/web-ui/src/router/routers.js b/packageship/web-ui/src/router/routers.js
deleted file mode 100644
index ddaef75d87487ef48386f00cca59f9ac770f8120..0000000000000000000000000000000000000000
--- a/packageship/web-ui/src/router/routers.js
+++ /dev/null
@@ -1,17 +0,0 @@
-/**
- * @file Route definitions
- * */
-
-export default [{
-        path: '/',
-        component: () => import('@/views/home/home.vue')
-    },
-    {
-        path: '/package-detail',
-        component: () => import('@/views/package/package-detail.vue')
-    },
-    {
-        path: '*',
-        component: () => import('@/views/404.vue')
-    }
-];
\ No newline at end of file
diff --git a/packageship/web-ui/src/views/404.vue b/packageship/web-ui/src/views/404.vue
deleted file mode 100644
index 4c4f2f480288f33226b32e0353c7c2ac0270665f..0000000000000000000000000000000000000000
--- a/packageship/web-ui/src/views/404.vue
+++ /dev/null
@@ -1,11 +0,0 @@
-
-
-404 no found!
- - - - diff --git a/packageship/web-ui/src/views/home/depend-info.vue b/packageship/web-ui/src/views/home/depend-info.vue deleted file mode 100644 index 119fc428008a739ec4e531d839145eb500b6d14e..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/views/home/depend-info.vue +++ /dev/null @@ -1,111 +0,0 @@ - - - - Type - - Install Depend - Build Depend - Self Build Depend - Bedepend - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - \ No newline at end of file diff --git a/packageship/web-ui/src/views/home/home.vue b/packageship/web-ui/src/views/home/home.vue deleted file mode 100644 index fa2dde52d8af58da20a55aa4027c7fd7e82bf317..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/views/home/home.vue +++ /dev/null @@ -1,105 +0,0 @@ - - - Package Management - - - Package Info - Issue List - Depend Info - - - - - - - - - - - - - - diff --git a/packageship/web-ui/src/views/home/issue-list.vue b/packageship/web-ui/src/views/home/issue-list.vue deleted file mode 100644 index f854741129a8adf0c80893290aa7976660efcd0f..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/views/home/issue-list.vue +++ /dev/null @@ -1,284 +0,0 @@ - - - - - - Export Excel - - - - - - {{ scope.row.issue_id }} - - - - - - - - - {{ scope.row.issue_type }} - - - - - {{ scope.row.issue_status }} - - - - - {{ scope.row.maintainer }} - - - - - - Issue ID: - {{ item.issue_id }} - - - Package Name: - {{ item.pkg_name }} - - - Issue Title: - {{ item.issue_title }} - - - Issue Type: - {{ item.issue_type }} - - - Issue Status: - {{ item.issue_status }} - - - Maintainer: - {{ item.maintainer }} - - - - - - - - - - - - - - \ No newline at end of file diff --git a/packageship/web-ui/src/views/home/package-info.vue b/packageship/web-ui/src/views/home/package-info.vue deleted file mode 100644 index 3f9d34dd410fb9b88fb7488dce5cba91749647bd..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/views/home/package-info.vue +++ 
/dev/null @@ -1,543 +0,0 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - Custom Display Column - - - - Export Excel - - - - - - - {{item}} - - - - OK - Clear - - - - - - - - {{ scope.row.name }} - - - - - {{ scope.row.maintainer }} - - - - - {{ scope.row.maintainlevel }} - - - - - - - - - Name: - {{ item.name }} - - - Version: - {{ item.version }} - - - Release: - {{ item.release }} - - - URL: - {{ item.url }} - - - License: - {{ item.rpm_license }} - - - Feature: - {{ item.feature }} - - - Maintainer: - {{ item.maintainer }} - - - Maintainlevel: - {{ item.maintainlevel }} - - - Repo URL: - {{ item.gitee_url }} - - - Summary: - {{ item.summary }} - - - Description: - {{ item.description }} - - - Subpack: - {{ item.name }} - - - Required: - {{ item }} - - - Issue Num: - {{ item.issue }} - - - - - - - - - - - - - - \ No newline at end of file diff --git a/packageship/web-ui/src/views/package/package-detail.vue b/packageship/web-ui/src/views/package/package-detail.vue deleted file mode 100644 index 26b43986927706eea9b756d227c7fc1f1d1a6064..0000000000000000000000000000000000000000 --- a/packageship/web-ui/src/views/package/package-detail.vue +++ /dev/null @@ -1,245 +0,0 @@ - - - {{ detailData.pkg_name }} Issues - - - Version: - {{ detailData.version }} - - - Release: - {{ detailData.release }} - - - URL: - {{ detailData.url }} - - - License: - {{ detailData.license }} - - - Feature: - {{ detailData.feature }} - - - Maintainer: - {{ detailData.maintainer }} - - - Maintainlevel: - {{ detailData.maintainlevel }} - - - Repo URL: - {{ detailData.gitee_url }} - - - Summary: - {{ detailData.summary }} - - - Description: - {{ detailData.description }} - - - Subpack: - {{ item.name }} - - - Required: - {{ item }} - - - Issue Num: - {{ detailData.issue }} - - - - {{ tables.name }} - - - - - - - - {{ item }} - - - - - - - - - - - {{ item }} - - - - - - - - - - - - \ No newline at end of file diff --git a/packageship/web-ui/vue.config.js b/packageship/web-ui/vue.config.js 
deleted file mode 100644
index e58c93fd2e42379cfcb27f3b9e9ca02d0e7e9f0e..0000000000000000000000000000000000000000
--- a/packageship/web-ui/vue.config.js
+++ /dev/null
@@ -1,49 +0,0 @@
-/**
- * @file Vue build configuration
- * */
-
-const path = require('path');
-
-const resolve = dir => {
-    return path.join(__dirname, dir);
-};
-
-const BASE_URL = process.env.NODE_ENV === 'production' ? '/' : '/';
-
-module.exports = {
-    publicPath: BASE_URL,
-
-    pluginOptions: {
-        'style-resources-loader': {
-            preProcessor: 'less',
-            patterns: [
-                path.resolve(__dirname, './src/assets/style/vars.less')
-            ]
-        }
-    },
-
-    chainWebpack: config => {
-        config.resolve.alias
-            .set('@', resolve('src'))
-            .set('_c', resolve('src/components'))
-            .set('_libs', resolve('src/libs'));
-    },
-
-    // Set to false so production builds do not generate .map files
-    productionSourceMap: false,
-
-    // Base path for API calls, used to work around cross-origin restrictions; when this proxy is enabled, set the axios baseUrl to '' (an empty string) in the local development environment
-    devServer: {
-        proxy: {
-            '/api': {
-                target: 'https://api.openeuler.org/pkgmanage/',
-                ws: true,
-                changeOrigin: true,
-                pathRewrite: {
-                    '^/api': ''
-                }
-            }
-        }
-
-    }
-};
\ No newline at end of file
diff --git a/patch-tracking/.gitignore b/patch-tracking/.gitignore
deleted file mode 100644
index 283bf0a0cf7a80027a9fff4ffb782238d0105c4c..0000000000000000000000000000000000000000
--- a/patch-tracking/.gitignore
+++ /dev/null
@@ -1,70 +0,0 @@
-# Byte-compiled / optimized / DLL files
-__pycache__/
-*.py[cod]
-*$py.class
-
-# C extensions
-*.so
-
-# Distribution / packaging
-.Python
-env/
-build/
-develop-eggs/
-dist/
-downloads/
-eggs/
-.eggs/
-lib/
-lib64/
-parts/
-sdist/
-var/
-*.egg-info/
-.installed.cfg
-*.egg
-
-# PyInstaller
-# Usually these files are written by a python script from a template
-# before PyInstaller builds the exe, so as to inject date/other infos into it.
-*.manifest
-
-# Installer logs
-pip-log.txt
-pip-delete-this-directory.txt
-
-# Unit test / coverage reports
-htmlcov/
-.tox/
-.coverage
-.coverage.*
-.cache
-nosetests.xml
-coverage.xml
-*,cover
-.hypothesis/
-
-# Translations
-*.mo
-*.pot
-
-# Flask stuff:
-instance/
-.webassets-cache
-
-# pyenv
-.python-version
-
-# dotenv
-.env
-
-# virtualenv
-venv/
-ENV/
-
-# Editors
-.idea/
-
-
-# log file
-*.log
diff --git a/patch-tracking/.pylintrc b/patch-tracking/.pylintrc
deleted file mode 100644
index 856a976236cf79e5d250840ec55589f07e93bd68..0000000000000000000000000000000000000000
--- a/patch-tracking/.pylintrc
+++ /dev/null
@@ -1,595 +0,0 @@
-[MASTER]
-
-# A comma-separated list of package or module names from where C extensions may
-# be loaded. Extensions are loading into the active Python interpreter and may
-# run arbitrary code.
-extension-pkg-whitelist=
-
-# Specify a score threshold to be exceeded before program exits with error.
-fail-under=10
-
-# Add files or directories to the blacklist. They should be base names, not
-# paths.
-ignore=CVS
-
-# Add files or directories matching the regex patterns to the blacklist. The
-# regex matches against base names, not paths.
-ignore-patterns=issue_test,tracking_test
-
-# Python code to execute, usually for sys.path manipulation such as
-# pygtk.require().
-init-hook="from pylint.config import find_pylintrc; import os, sys; sys.path.append(os.path.dirname(find_pylintrc()))"
-
-# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
-# number of processors available to use.
-jobs=1
-
-# Control the amount of potential inferred values when inferring a single
-# object. This can help the performance when dealing with large functions or
-# complex, nested conditions.
-limit-inference-results=100
-
-# List of plugins (as comma separated values of python module names) to load,
-# usually to register additional checkers.
-load-plugins=
-
-# Pickle collected data for later comparisons.
-persistent=yes
-
-# When enabled, pylint would attempt to guess common misconfiguration and emit
-# user-friendly hints instead of false-positive error messages.
-suggestion-mode=yes
-
-# Allow loading of arbitrary C extensions. Extensions are imported into the
-# active Python interpreter and may run arbitrary code.
-unsafe-load-any-extension=no
-
-
-[MESSAGES CONTROL]
-
-# Only show warnings with the listed confidence levels. Leave empty to show
-# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED.
-confidence=
-
-# Disable the message, report, category or checker with the given id(s). You
-# can either give multiple identifiers separated by comma (,) or put this
-# option multiple times (only on the command line, not in the configuration
-# file where it should appear only once). You can also use "--disable=all" to
-# disable everything first and then reenable specific checks. For example, if
-# you want to run only the similarities checker, you can use "--disable=all
-# --enable=similarities". If you want to run only the classes checker, but have
-# no Warning level messages displayed, use "--disable=all --enable=classes
-# --disable=W".
-disable=print-statement, - parameter-unpacking, - unpacking-in-except, - old-raise-syntax, - backtick, - long-suffix, - old-ne-operator, - old-octal-literal, - import-star-module-level, - non-ascii-bytes-literal, - raw-checker-failed, - bad-inline-option, - locally-disabled, - file-ignored, - suppressed-message, - useless-suppression, - deprecated-pragma, - use-symbolic-message-instead, - apply-builtin, - basestring-builtin, - buffer-builtin, - cmp-builtin, - coerce-builtin, - execfile-builtin, - file-builtin, - long-builtin, - raw_input-builtin, - reduce-builtin, - standarderror-builtin, - unicode-builtin, - xrange-builtin, - coerce-method, - delslice-method, - getslice-method, - setslice-method, - no-absolute-import, - old-division, - dict-iter-method, - dict-view-method, - next-method-called, - metaclass-assignment, - indexing-exception, - raising-string, - reload-builtin, - oct-method, - hex-method, - nonzero-method, - cmp-method, - input-builtin, - round-builtin, - intern-builtin, - unichr-builtin, - map-builtin-not-iterating, - zip-builtin-not-iterating, - range-builtin-not-iterating, - filter-builtin-not-iterating, - using-cmp-argument, - eq-without-hash, - div-method, - idiv-method, - rdiv-method, - exception-message-attribute, - invalid-str-codec, - sys-max-int, - bad-python3-import, - deprecated-string-function, - deprecated-str-translate-call, - deprecated-itertools-function, - deprecated-types-field, - next-method-defined, - dict-items-not-iterating, - dict-keys-not-iterating, - dict-values-not-iterating, - deprecated-operator-function, - deprecated-urllib-function, - xreadlines-attribute, - deprecated-sys-function, - exception-escape, - comprehension-escape - -# Enable the message, report, category or checker with the given id(s). You can -# either give multiple identifier separated by comma (,) or put this option -# multiple time (only on the command line, not in the configuration file where -# it should appear only once). 
See also the "--disable" option for examples. -enable=c-extension-no-member - - -[REPORTS] - -# Python expression which should return a score less than or equal to 10. You -# have access to the variables 'error', 'warning', 'refactor', and 'convention' -# which contain the number of messages in each category, as well as 'statement' -# which is the total number of statements analyzed. This score is used by the -# global evaluation report (RP0004). -evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10) - -# Template used to display messages. This is a python new-style format string -# used to format the message information. See doc for all details. -#msg-template= - -# Set the output format. Available formats are text, parseable, colorized, json -# and msvs (visual studio). You can also give a reporter class, e.g. -# mypackage.mymodule.MyReporterClass. -output-format=text - -# Tells whether to display a full report or only the messages. -reports=no - -# Activate the evaluation score. -score=yes - - -[REFACTORING] - -# Maximum number of nested blocks for function / method body -max-nested-blocks=5 - -# Complete name of functions that never returns. When checking for -# inconsistent-return-statements if a never returning function is called then -# it will be considered as an explicit return statement and no message will be -# printed. -never-returning-functions=sys.exit - - -[LOGGING] - -# The type of string formatting that logging methods do. `old` means using % -# formatting, `new` is for `{}` formatting. -logging-format-style=old - -# Logging modules to check that the string format arguments are in logging -# function parameter format. -logging-modules=logging - - -[SPELLING] - -# Limits count of emitted suggestions for spelling mistakes. -max-spelling-suggestions=4 - -# Spelling dictionary name. Available dictionaries: none. To make it work, -# install the python-enchant package. 
-spelling-dict= - -# List of comma separated words that should not be checked. -spelling-ignore-words= - -# A path to a file that contains the private dictionary; one word per line. -spelling-private-dict-file= - -# Tells whether to store unknown words to the private dictionary (see the -# --spelling-private-dict-file option) instead of raising a message. -spelling-store-unknown-words=no - - -[MISCELLANEOUS] - -# List of note tags to take in consideration, separated by a comma. -notes=FIXME, - XXX, - TODO - -# Regular expression of note tags to take in consideration. -#notes-rgx= - - -[TYPECHECK] - -# List of decorators that produce context managers, such as -# contextlib.contextmanager. Add to this list to register other decorators that -# produce valid context managers. -contextmanager-decorators=contextlib.contextmanager - -# List of members which are set dynamically and missed by pylint inference -# system, and so shouldn't trigger E1101 when accessed. Python regular -# expressions are accepted. -generated-members= - -# Tells whether missing members accessed in mixin class should be ignored. A -# mixin class is detected if its name ends with "mixin" (case insensitive). -ignore-mixin-members=yes - -# Tells whether to warn about missing members when the owner of the attribute -# is inferred to be None. -ignore-none=yes - -# This flag controls whether pylint should warn about no-member and similar -# checks whenever an opaque object is returned when inferring. The inference -# can return multiple potential results while evaluating a Python object, but -# some branches might not be evaluated, which results in partial inference. In -# that case, it might be useful to still emit no-member and other checks for -# the rest of the inferred objects. -ignore-on-opaque-inference=yes - -# List of class names for which member attributes should not be checked (useful -# for classes with dynamically set attributes). This supports the use of -# qualified names. 
-ignored-classes=optparse.Values,thread._local,_thread._local - -# List of module names for which member attributes should not be checked -# (useful for modules/projects where namespaces are manipulated during runtime -# and thus existing member attributes cannot be deduced by static analysis). It -# supports qualified module names, as well as Unix pattern matching. -ignored-modules= - -# Show a hint with possible names when a member name was not found. The aspect -# of finding the hint is based on edit distance. -missing-member-hint=yes - -# The minimum edit distance a name should have in order to be considered a -# similar match for a missing member name. -missing-member-hint-distance=1 - -# The total number of similar names that should be taken in consideration when -# showing a hint for a missing member. -missing-member-max-choices=1 - -# List of decorators that change the signature of a decorated function. -signature-mutators= - - -[VARIABLES] - -# List of additional names supposed to be defined in builtins. Remember that -# you should avoid defining new builtins when possible. -additional-builtins= - -# Tells whether unused global variables should be treated as a violation. -allow-global-unused-variables=yes - -# List of strings which can identify a callback function by name. A callback -# name must start or end with one of those strings. -callbacks=cb_, - _cb - -# A regular expression matching the name of dummy variables (i.e. expected to -# not be used). -dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_ - -# Argument names that match this expression will be ignored. Default to name -# with leading underscore. -ignored-argument-names=_.*|^ignored_|^unused_ - -# Tells whether we should check for unused import in __init__ files. -init-import=no - -# List of qualified module names which can have objects that can redefine -# builtins. 
-redefining-builtins-modules=six.moves,past.builtins,future.builtins,builtins,io - - -[FORMAT] - -# Expected format of line ending, e.g. empty (any line ending), LF or CRLF. -expected-line-ending-format= - -# Regexp for a line that is allowed to be longer than the limit. -ignore-long-lines=^\s*(# )??$ - -# Number of spaces of indent required inside a hanging or continued line. -indent-after-paren=4 - -# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1 -# tab). -indent-string=' ' - -# Maximum number of characters on a single line. -max-line-length=120 - -# Maximum number of lines in a module. -max-module-lines=1000 - -# List of optional constructs for which whitespace checking is disabled. `dict- -# separator` is used to allow tabulation in dicts, etc.: {1 : 1,\n222: 2}. -# `trailing-comma` allows a space between comma and closing bracket: (a, ). -# `empty-line` allows space-only lines. -no-space-check=trailing-comma, - dict-separator - -# Allow the body of a class to be on the same line as the declaration if body -# contains single statement. -single-line-class-stmt=no - -# Allow the body of an if to be on the same line as the test if there is no -# else. -single-line-if-stmt=no - - -[SIMILARITIES] - -# Ignore comments when computing similarities. -ignore-comments=yes - -# Ignore docstrings when computing similarities. -ignore-docstrings=yes - -# Ignore imports when computing similarities. -ignore-imports=no - -# Minimum lines number of a similarity. -min-similarity-lines=4 - - -[BASIC] - -# Naming style matching correct argument names. -argument-naming-style=snake_case - -# Regular expression matching correct argument names. Overrides argument- -# naming-style. -#argument-rgx= - -# Naming style matching correct attribute names. -attr-naming-style=snake_case - -# Regular expression matching correct attribute names. Overrides attr-naming- -# style. -#attr-rgx= - -# Bad variable names which should always be refused, separated by a comma. 
-bad-names=foo, - bar, - baz, - toto, - tutu, - tata - -# Bad variable names regexes, separated by a comma. If names match any regex, -# they will always be refused -bad-names-rgxs= - -# Naming style matching correct class attribute names. -class-attribute-naming-style=any - -# Regular expression matching correct class attribute names. Overrides class- -# attribute-naming-style. -#class-attribute-rgx= - -# Naming style matching correct class names. -class-naming-style=PascalCase - -# Regular expression matching correct class names. Overrides class-naming- -# style. -#class-rgx= - -# Naming style matching correct constant names. -const-naming-style=UPPER_CASE - -# Regular expression matching correct constant names. Overrides const-naming- -# style. -#const-rgx= - -# Minimum line length for functions/classes that require docstrings, shorter -# ones are exempt. -docstring-min-length=-1 - -# Naming style matching correct function names. -function-naming-style=snake_case - -# Regular expression matching correct function names. Overrides function- -# naming-style. -#function-rgx= - -# Good variable names which should always be accepted, separated by a comma. -good-names=i, - j, - k, - ex, - Run, - _ - -# Good variable names regexes, separated by a comma. If names match any regex, -# they will always be accepted -good-names-rgxs= - -# Include a hint for the correct naming format with invalid-name. -include-naming-hint=no - -# Naming style matching correct inline iteration names. -inlinevar-naming-style=any - -# Regular expression matching correct inline iteration names. Overrides -# inlinevar-naming-style. -#inlinevar-rgx= - -# Naming style matching correct method names. -method-naming-style=snake_case - -# Regular expression matching correct method names. Overrides method-naming- -# style. -#method-rgx= - -# Naming style matching correct module names. -module-naming-style=snake_case - -# Regular expression matching correct module names. Overrides module-naming- -# style. 
-#module-rgx= - -# Colon-delimited sets of names that determine each other's naming style when -# the name regexes allow several styles. -name-group= - -# Regular expression which should only match function or class names that do -# not require a docstring. -no-docstring-rgx=^_ - -# List of decorators that produce properties, such as abc.abstractproperty. Add -# to this list to register other decorators that produce valid properties. -# These decorators are taken in consideration only for invalid-name. -property-classes=abc.abstractproperty - -# Naming style matching correct variable names. -variable-naming-style=snake_case - -# Regular expression matching correct variable names. Overrides variable- -# naming-style. -#variable-rgx= - - -[STRING] - -# This flag controls whether inconsistent-quotes generates a warning when the -# character used as a quote delimiter is used inconsistently within a module. -check-quote-consistency=no - -# This flag controls whether the implicit-str-concat should generate a warning -# on implicit string concatenation in sequences defined over several lines. -check-str-concat-over-line-jumps=no - - -[IMPORTS] - -# List of modules that can be imported at any level, not just the top level -# one. -allow-any-import-level= - -# Allow wildcard imports from modules that define __all__. -allow-wildcard-with-all=no - -# Analyse import fallback blocks. This can be used to support both Python 2 and -# 3 compatible code, which means that the block might have code that exists -# only in one or another interpreter, leading to false positives when analysed. -analyse-fallback-blocks=no - -# Deprecated modules which should not be used, separated by a comma. -deprecated-modules=optparse,tkinter.tix - -# Create a graph of external dependencies in the given file (report RP0402 must -# not be disabled). -ext-import-graph= - -# Create a graph of every (i.e. internal and external) dependencies in the -# given file (report RP0402 must not be disabled). 
-import-graph= - -# Create a graph of internal dependencies in the given file (report RP0402 must -# not be disabled). -int-import-graph= - -# Force import order to recognize a module as part of the standard -# compatibility libraries. -known-standard-library= - -# Force import order to recognize a module as part of a third party library. -known-third-party=enchant - -# Couples of modules and preferred modules, separated by a comma. -preferred-modules= - - -[CLASSES] - -# List of method names used to declare (i.e. assign) instance attributes. -defining-attr-methods=__init__, - __new__, - setUp, - __post_init__ - -# List of member names, which should be excluded from the protected access -# warning. -exclude-protected=_asdict, - _fields, - _replace, - _source, - _make - -# List of valid names for the first argument in a class method. -valid-classmethod-first-arg=cls - -# List of valid names for the first argument in a metaclass class method. -valid-metaclass-classmethod-first-arg=cls - - -[DESIGN] - -# Maximum number of arguments for function / method. -max-args=5 - -# Maximum number of attributes for a class (see R0902). -max-attributes=7 - -# Maximum number of boolean expressions in an if statement (see R0916). -max-bool-expr=5 - -# Maximum number of branch for function / method body. -max-branches=12 - -# Maximum number of locals for function / method body. -max-locals=15 - -# Maximum number of parents for a class (see R0901). -max-parents=7 - -# Maximum number of public methods for a class (see R0904). -max-public-methods=20 - -# Maximum number of return / yield for function / method body. -max-returns=6 - -# Maximum number of statements in function / method body. -max-statements=50 - -# Minimum number of public methods for a class (see R0903). -min-public-methods=2 - - -[EXCEPTIONS] - -# Exceptions that will emit a warning when being caught. Defaults to -# "BaseException, Exception". 
-overgeneral-exceptions=BaseException, - Exception diff --git a/patch-tracking/.style.yapf b/patch-tracking/.style.yapf deleted file mode 100644 index 1c04a76b1a34ef85b9ac2c32c51016d5af6a5e34..0000000000000000000000000000000000000000 --- a/patch-tracking/.style.yapf +++ /dev/null @@ -1,4 +0,0 @@ -[style] -based_on_style = pep8 -column_limit = 120 -dedent_closing_brackets = True diff --git a/patch-tracking/Pipfile b/patch-tracking/Pipfile deleted file mode 100644 index 65c8b43e3b622defe15a2288384bea5e12ace0a0..0000000000000000000000000000000000000000 --- a/patch-tracking/Pipfile +++ /dev/null @@ -1,22 +0,0 @@ -[[source]] -name = "pypi" -url = "https://pypi.tuna.tsinghua.edu.cn/simple" -verify_ssl = true - -[dev-packages] -pylint = "*" -yapf = "*" -pyopenssl = "*" - -[packages] -flask = "*" -flask-sqlalchemy = "*" -flask-apscheduler = "*" -requests = "*" -werkzeug = "*" -flask-httpauth = "*" -sqlalchemy = "*" -pandas = "*" - -[requires] -python_version = "3.7" diff --git a/patch-tracking/Pipfile.lock b/patch-tracking/Pipfile.lock deleted file mode 100644 index fa544caa1c19c5f3ee208914305a6f09eb6b954a..0000000000000000000000000000000000000000 --- a/patch-tracking/Pipfile.lock +++ /dev/null @@ -1,448 +0,0 @@ -{ - "_meta": { - "hash": { - "sha256": "69f670800c1dbbc64632f716294e7acfb72b3be7bee88a2701745239b39d9935" - }, - "pipfile-spec": 6, - "requires": { - "python_version": "3.7" - }, - "sources": [ - { - "name": "pypi", - "url": "https://pypi.tuna.tsinghua.edu.cn/simple", - "verify_ssl": true - } - ] - }, - "default": { - "apscheduler": { - "hashes": [ - "sha256:3bb5229eed6fbbdafc13ce962712ae66e175aa214c69bed35a06bffcf0c5e244", - "sha256:e8b1ecdb4c7cb2818913f766d5898183c7cb8936680710a4d3a966e02262e526" - ], - "version": "==3.6.3" - }, - "certifi": { - "hashes": [ - "sha256:5930595817496dd21bb8dc35dad090f1c2cd0adfaf21204bf6732ca5d8ee34d3", - "sha256:8fc0819f1f30ba15bdb34cceffb9ef04d99f420f68eb75d901e9560b8749fc41" - ], - "version": "==2020.6.20" - }, - "chardet": { - 
"hashes": [ - "sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae", - "sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691" - ], - "version": "==3.0.4" - }, - "click": { - "hashes": [ - "sha256:d2b5255c7c6349bc1bd1e59e08cd12acbbd63ce649f2588755783aa94dfb6b1a", - "sha256:dacca89f4bfadd5de3d7489b7c8a566eee0d3676333fbb50030263894c38c0dc" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", - "version": "==7.1.2" - }, - "flask": { - "hashes": [ - "sha256:4efa1ae2d7c9865af48986de8aeb8504bf32c7f3d6fdc9353d34b21f4b127060", - "sha256:8a4fdd8936eba2512e9c85df320a37e694c93945b33ef33c89946a340a238557" - ], - "index": "pypi", - "version": "==1.1.2" - }, - "flask-apscheduler": { - "hashes": [ - "sha256:7911d66e449f412d92a1a6c524217f44f4c40a5c92148c60d5189c6c402f87d0" - ], - "index": "pypi", - "version": "==1.11.0" - }, - "flask-httpauth": { - "hashes": [ - "sha256:29e0288869a213c7387f0323b6bf2c7191584fb1da8aa024d9af118e5cd70de7", - "sha256:9e028e4375039a49031eb9ecc40be4761f0540476040f6eff329a31dabd4d000" - ], - "index": "pypi", - "version": "==4.1.0" - }, - "flask-sqlalchemy": { - "hashes": [ - "sha256:05b31d2034dd3f2a685cbbae4cfc4ed906b2a733cff7964ada450fd5e462b84e", - "sha256:bfc7150eaf809b1c283879302f04c42791136060c6eeb12c0c6674fb1291fae5" - ], - "index": "pypi", - "version": "==2.4.4" - }, - "idna": { - "hashes": [ - "sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6", - "sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==2.10" - }, - "itsdangerous": { - "hashes": [ - "sha256:321b033d07f2a4136d3ec762eac9f16a10ccd60f53c0c91af90217ace7ba1f19", - "sha256:b12271b2047cb23eeb98c8b5622e2e5c5e9abd9784a153e9d8ef9cb4dd09d749" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==1.1.0" - }, - 
"jinja2": { - "hashes": [ - "sha256:89aab215427ef59c34ad58735269eb58b1a5808103067f7bb9d5836c651b3bb0", - "sha256:f0a4641d3cf955324a89c04f3d94663aa4d638abe8f733ecd3582848e1c37035" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", - "version": "==2.11.2" - }, - "markupsafe": { - "hashes": [ - "sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473", - "sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161", - "sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235", - "sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5", - "sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42", - "sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff", - "sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b", - "sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1", - "sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e", - "sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183", - "sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66", - "sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b", - "sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1", - "sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15", - "sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1", - "sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e", - "sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b", - "sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905", - "sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735", - "sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d", - "sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e", - 
"sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d", - "sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c", - "sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21", - "sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2", - "sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5", - "sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b", - "sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6", - "sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f", - "sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f", - "sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2", - "sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7", - "sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==1.1.1" - }, - "numpy": { - "hashes": [ - "sha256:082f8d4dd69b6b688f64f509b91d482362124986d98dc7dc5f5e9f9b9c3bb983", - "sha256:1bc0145999e8cb8aed9d4e65dd8b139adf1919e521177f198529687dbf613065", - "sha256:309cbcfaa103fc9a33ec16d2d62569d541b79f828c382556ff072442226d1968", - "sha256:3673c8b2b29077f1b7b3a848794f8e11f401ba0b71c49fbd26fb40b71788b132", - "sha256:480fdd4dbda4dd6b638d3863da3be82873bba6d32d1fc12ea1b8486ac7b8d129", - "sha256:56ef7f56470c24bb67fb43dae442e946a6ce172f97c69f8d067ff8550cf782ff", - "sha256:5a936fd51049541d86ccdeef2833cc89a18e4d3808fe58a8abeb802665c5af93", - "sha256:5b6885c12784a27e957294b60f97e8b5b4174c7504665333c5e94fbf41ae5d6a", - "sha256:667c07063940e934287993366ad5f56766bc009017b4a0fe91dbd07960d0aba7", - "sha256:7ed448ff4eaffeb01094959b19cbaf998ecdee9ef9932381420d514e446601cd", - "sha256:8343bf67c72e09cfabfab55ad4a43ce3f6bf6e6ced7acf70f45ded9ebb425055", - "sha256:92feb989b47f83ebef246adabc7ff3b9a59ac30601c3f6819f8913458610bdcc", - 
"sha256:935c27ae2760c21cd7354402546f6be21d3d0c806fffe967f745d5f2de5005a7", - "sha256:aaf42a04b472d12515debc621c31cf16c215e332242e7a9f56403d814c744624", - "sha256:b12e639378c741add21fbffd16ba5ad25c0a1a17cf2b6fe4288feeb65144f35b", - "sha256:b1cca51512299841bf69add3b75361779962f9cee7d9ee3bb446d5982e925b69", - "sha256:b8456987b637232602ceb4d663cb34106f7eb780e247d51a260b84760fd8f491", - "sha256:b9792b0ac0130b277536ab8944e7b754c69560dac0415dd4b2dbd16b902c8954", - "sha256:c9591886fc9cbe5532d5df85cb8e0cc3b44ba8ce4367bd4cf1b93dc19713da72", - "sha256:cf1347450c0b7644ea142712619533553f02ef23f92f781312f6a3553d031fc7", - "sha256:de8b4a9b56255797cbddb93281ed92acbc510fb7b15df3f01bd28f46ebc4edae", - "sha256:e1b1dc0372f530f26a03578ac75d5e51b3868b9b76cd2facba4c9ee0eb252ab1", - "sha256:e45f8e981a0ab47103181773cc0a54e650b2aef8c7b6cd07405d0fa8d869444a", - "sha256:e4f6d3c53911a9d103d8ec9518190e52a8b945bab021745af4939cfc7c0d4a9e", - "sha256:ed8a311493cf5480a2ebc597d1e177231984c818a86875126cfd004241a73c3e", - "sha256:ef71a1d4fd4858596ae80ad1ec76404ad29701f8ca7cdcebc50300178db14dfc" - ], - "markers": "python_version >= '3.6'", - "version": "==1.19.1" - }, - "pandas": { - "hashes": [ - "sha256:02f1e8f71cd994ed7fcb9a35b6ddddeb4314822a0e09a9c5b2d278f8cb5d4096", - "sha256:13f75fb18486759da3ff40f5345d9dd20e7d78f2a39c5884d013456cec9876f0", - "sha256:35b670b0abcfed7cad76f2834041dcf7ae47fd9b22b63622d67cdc933d79f453", - "sha256:4c73f373b0800eb3062ffd13d4a7a2a6d522792fa6eb204d67a4fad0a40f03dc", - "sha256:5759edf0b686b6f25a5d4a447ea588983a33afc8a0081a0954184a4a87fd0dd7", - "sha256:5a7cf6044467c1356b2b49ef69e50bf4d231e773c3ca0558807cdba56b76820b", - "sha256:69c5d920a0b2a9838e677f78f4dde506b95ea8e4d30da25859db6469ded84fa8", - "sha256:8778a5cc5a8437a561e3276b85367412e10ae9fff07db1eed986e427d9a674f8", - "sha256:9871ef5ee17f388f1cb35f76dc6106d40cb8165c562d573470672f4cdefa59ef", - "sha256:9c31d52f1a7dd2bb4681d9f62646c7aa554f19e8e9addc17e8b1b20011d7522d", - 
"sha256:ab8173a8efe5418bbe50e43f321994ac6673afc5c7c4839014cf6401bbdd0705", - "sha256:ae961f1f0e270f1e4e2273f6a539b2ea33248e0e3a11ffb479d757918a5e03a9", - "sha256:b3c4f93fcb6e97d993bf87cdd917883b7dab7d20c627699f360a8fb49e9e0b91", - "sha256:c9410ce8a3dee77653bc0684cfa1535a7f9c291663bd7ad79e39f5ab58f67ab3", - "sha256:f69e0f7b7c09f1f612b1f8f59e2df72faa8a6b41c5a436dde5b615aaf948f107", - "sha256:faa42a78d1350b02a7d2f0dbe3c80791cf785663d6997891549d0f86dc49125e" - ], - "index": "pypi", - "version": "==1.0.5" - }, - "python-dateutil": { - "hashes": [ - "sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c", - "sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==2.8.1" - }, - "pytz": { - "hashes": [ - "sha256:a494d53b6d39c3c6e44c3bec237336e14305e4f29bbf800b599253057fbb79ed", - "sha256:c35965d010ce31b23eeb663ed3cc8c906275d6be1a34393a1d73a41febf4a048" - ], - "version": "==2020.1" - }, - "requests": { - "hashes": [ - "sha256:b3559a131db72c33ee969480840fff4bb6dd111de7dd27c8ee1f820f4f00231b", - "sha256:fe75cc94a9443b9246fc7049224f75604b113c36acb93f87b80ed42c44cbb898" - ], - "index": "pypi", - "version": "==2.24.0" - }, - "six": { - "hashes": [ - "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259", - "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==1.15.0" - }, - "sqlalchemy": { - "hashes": [ - "sha256:0942a3a0df3f6131580eddd26d99071b48cfe5aaf3eab2783076fbc5a1c1882e", - "sha256:0ec575db1b54909750332c2e335c2bb11257883914a03bc5a3306a4488ecc772", - "sha256:109581ccc8915001e8037b73c29590e78ce74be49ca0a3630a23831f9e3ed6c7", - "sha256:16593fd748944726540cd20f7e83afec816c2ac96b082e26ae226e8f7e9688cf", - "sha256:427273b08efc16a85aa2b39892817e78e3ed074fcb89b2a51c4979bae7e7ba98", - 
"sha256:50c4ee32f0e1581828843267d8de35c3298e86ceecd5e9017dc45788be70a864", - "sha256:512a85c3c8c3995cc91af3e90f38f460da5d3cade8dc3a229c8e0879037547c9", - "sha256:57aa843b783179ab72e863512e14bdcba186641daf69e4e3a5761d705dcc35b1", - "sha256:621f58cd921cd71ba6215c42954ffaa8a918eecd8c535d97befa1a8acad986dd", - "sha256:6ac2558631a81b85e7fb7a44e5035347938b0a73f5fdc27a8566777d0792a6a4", - "sha256:716754d0b5490bdcf68e1e4925edc02ac07209883314ad01a137642ddb2056f1", - "sha256:736d41cfebedecc6f159fc4ac0769dc89528a989471dc1d378ba07d29a60ba1c", - "sha256:8619b86cb68b185a778635be5b3e6018623c0761dde4df2f112896424aa27bd8", - "sha256:87fad64529cde4f1914a5b9c383628e1a8f9e3930304c09cf22c2ae118a1280e", - "sha256:89494df7f93b1836cae210c42864b292f9b31eeabca4810193761990dc689cce", - "sha256:8cac7bb373a5f1423e28de3fd5fc8063b9c8ffe8957dc1b1a59cb90453db6da1", - "sha256:8fd452dc3d49b3cc54483e033de6c006c304432e6f84b74d7b2c68afa2569ae5", - "sha256:adad60eea2c4c2a1875eb6305a0b6e61a83163f8e233586a4d6a55221ef984fe", - "sha256:c26f95e7609b821b5f08a72dab929baa0d685406b953efd7c89423a511d5c413", - "sha256:cbe1324ef52ff26ccde2cb84b8593c8bf930069dfc06c1e616f1bfd4e47f48a3", - "sha256:d05c4adae06bd0c7f696ae3ec8d993ed8ffcc4e11a76b1b35a5af8a099bd2284", - "sha256:d98bc827a1293ae767c8f2f18be3bb5151fd37ddcd7da2a5f9581baeeb7a3fa1", - "sha256:da2fb75f64792c1fc64c82313a00c728a7c301efe6a60b7a9fe35b16b4368ce7", - "sha256:e4624d7edb2576cd72bb83636cd71c8ce544d8e272f308bd80885056972ca299", - "sha256:e89e0d9e106f8a9180a4ca92a6adde60c58b1b0299e1b43bd5e0312f535fbf33", - "sha256:f11c2437fb5f812d020932119ba02d9e2bc29a6eca01a055233a8b449e3e1e7d", - "sha256:f57be5673e12763dd400fea568608700a63ce1c6bd5bdbc3cc3a2c5fdb045274", - "sha256:fc728ece3d5c772c196fd338a99798e7efac7a04f9cb6416299a3638ee9a94cd" - ], - "index": "pypi", - "version": "==1.3.18" - }, - "tzlocal": { - "hashes": [ - "sha256:643c97c5294aedc737780a49d9df30889321cbe1204eac2c2ec6134035a92e44", - 
"sha256:e2cb6c6b5b604af38597403e9852872d7f534962ae2954c7f35efcb1ccacf4a4" - ], - "version": "==2.1" - }, - "urllib3": { - "hashes": [ - "sha256:91056c15fa70756691db97756772bb1eb9678fa585d9184f24534b100dc60f4a", - "sha256:e7983572181f5e1522d9c98453462384ee92a0be7fac5f1413a1e35c56cc0461" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'", - "version": "==1.25.10" - }, - "werkzeug": { - "hashes": [ - "sha256:2de2a5db0baeae7b2d2664949077c2ac63fbd16d98da0ff71837f7d1dea3fd43", - "sha256:6c80b1e5ad3665290ea39320b91e1be1e0d5f60652b964a3070216de83d2e47c" - ], - "index": "pypi", - "version": "==1.0.1" - } - }, - "develop": { - "astroid": { - "hashes": [ - "sha256:2f4078c2a41bf377eea06d71c9d2ba4eb8f6b1af2135bec27bbbb7d8f12bb703", - "sha256:bc58d83eb610252fd8de6363e39d4f1d0619c894b0ed24603b881c02e64c7386" - ], - "markers": "python_version >= '3.5'", - "version": "==2.4.2" - }, - "cffi": { - "hashes": [ - "sha256:267adcf6e68d77ba154334a3e4fc921b8e63cbb38ca00d33d40655d4228502bc", - "sha256:26f33e8f6a70c255767e3c3f957ccafc7f1f706b966e110b855bfe944511f1f9", - "sha256:3cd2c044517f38d1b577f05927fb9729d3396f1d44d0c659a445599e79519792", - "sha256:4a03416915b82b81af5502459a8a9dd62a3c299b295dcdf470877cb948d655f2", - "sha256:4ce1e995aeecf7cc32380bc11598bfdfa017d592259d5da00fc7ded11e61d022", - "sha256:4f53e4128c81ca3212ff4cf097c797ab44646a40b42ec02a891155cd7a2ba4d8", - "sha256:4fa72a52a906425416f41738728268072d5acfd48cbe7796af07a923236bcf96", - "sha256:66dd45eb9530e3dde8f7c009f84568bc7cac489b93d04ac86e3111fb46e470c2", - "sha256:6923d077d9ae9e8bacbdb1c07ae78405a9306c8fd1af13bfa06ca891095eb995", - "sha256:833401b15de1bb92791d7b6fb353d4af60dc688eaa521bd97203dcd2d124a7c1", - "sha256:8416ed88ddc057bab0526d4e4e9f3660f614ac2394b5e019a628cdfff3733849", - "sha256:892daa86384994fdf4856cb43c93f40cbe80f7f95bb5da94971b39c7f54b3a9c", - "sha256:98be759efdb5e5fa161e46d404f4e0ce388e72fbf7d9baf010aff16689e22abe", - 
"sha256:a6d28e7f14ecf3b2ad67c4f106841218c8ab12a0683b1528534a6c87d2307af3", - "sha256:b1d6ebc891607e71fd9da71688fcf332a6630b7f5b7f5549e6e631821c0e5d90", - "sha256:b2a2b0d276a136146e012154baefaea2758ef1f56ae9f4e01c612b0831e0bd2f", - "sha256:b87dfa9f10a470eee7f24234a37d1d5f51e5f5fa9eeffda7c282e2b8f5162eb1", - "sha256:bac0d6f7728a9cc3c1e06d4fcbac12aaa70e9379b3025b27ec1226f0e2d404cf", - "sha256:c991112622baee0ae4d55c008380c32ecfd0ad417bcd0417ba432e6ba7328caa", - "sha256:cda422d54ee7905bfc53ee6915ab68fe7b230cacf581110df4272ee10462aadc", - "sha256:d3148b6ba3923c5850ea197a91a42683f946dba7e8eb82dfa211ab7e708de939", - "sha256:d6033b4ffa34ef70f0b8086fd4c3df4bf801fee485a8a7d4519399818351aa8e", - "sha256:ddff0b2bd7edcc8c82d1adde6dbbf5e60d57ce985402541cd2985c27f7bec2a0", - "sha256:e23cb7f1d8e0f93addf0cae3c5b6f00324cccb4a7949ee558d7b6ca973ab8ae9", - "sha256:effd2ba52cee4ceff1a77f20d2a9f9bf8d50353c854a282b8760ac15b9833168", - "sha256:f90c2267101010de42f7273c94a1f026e56cbc043f9330acd8a80e64300aba33", - "sha256:f960375e9823ae6a07072ff7f8a85954e5a6434f97869f50d0e41649a1c8144f", - "sha256:fcf32bf76dc25e30ed793145a57426064520890d7c02866eb93d3e4abe516948" - ], - "version": "==1.14.1" - }, - "cryptography": { - "hashes": [ - "sha256:0c608ff4d4adad9e39b5057de43657515c7da1ccb1807c3a27d4cf31fc923b4b", - "sha256:0cbfed8ea74631fe4de00630f4bb592dad564d57f73150d6f6796a24e76c76cd", - "sha256:124af7255ffc8e964d9ff26971b3a6153e1a8a220b9a685dc407976ecb27a06a", - "sha256:384d7c681b1ab904fff3400a6909261cae1d0939cc483a68bdedab282fb89a07", - "sha256:45741f5499150593178fc98d2c1a9c6722df88b99c821ad6ae298eff0ba1ae71", - "sha256:4b9303507254ccb1181d1803a2080a798910ba89b1a3c9f53639885c90f7a756", - "sha256:4d355f2aee4a29063c10164b032d9fa8a82e2c30768737a2fd56d256146ad559", - "sha256:51e40123083d2f946794f9fe4adeeee2922b581fa3602128ce85ff813d85b81f", - "sha256:8713ddb888119b0d2a1462357d5946b8911be01ddbf31451e1d07eaa5077a261", - "sha256:8e924dbc025206e97756e8903039662aa58aa9ba357d8e1d8fc29e3092322053", - 
"sha256:8ecef21ac982aa78309bb6f092d1677812927e8b5ef204a10c326fc29f1367e2", - "sha256:8ecf9400d0893836ff41b6f977a33972145a855b6efeb605b49ee273c5e6469f", - "sha256:9367d00e14dee8d02134c6c9524bb4bd39d4c162456343d07191e2a0b5ec8b3b", - "sha256:a09fd9c1cca9a46b6ad4bea0a1f86ab1de3c0c932364dbcf9a6c2a5eeb44fa77", - "sha256:ab49edd5bea8d8b39a44b3db618e4783ef84c19c8b47286bf05dfdb3efb01c83", - "sha256:bea0b0468f89cdea625bb3f692cd7a4222d80a6bdafd6fb923963f2b9da0e15f", - "sha256:bec7568c6970b865f2bcebbe84d547c52bb2abadf74cefce396ba07571109c67", - "sha256:ce82cc06588e5cbc2a7df3c8a9c778f2cb722f56835a23a68b5a7264726bb00c", - "sha256:dea0ba7fe6f9461d244679efa968d215ea1f989b9c1957d7f10c21e5c7c09ad6" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", - "version": "==3.0" - }, - "isort": { - "hashes": [ - "sha256:54da7e92468955c4fceacd0c86bd0ec997b0e1ee80d97f67c35a78b719dccab1", - "sha256:6e811fcb295968434526407adb8796944f1988c5b65e8139058f2014cbe100fd" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==4.3.21" - }, - "lazy-object-proxy": { - "hashes": [ - "sha256:0c4b206227a8097f05c4dbdd323c50edf81f15db3b8dc064d08c62d37e1a504d", - "sha256:194d092e6f246b906e8f70884e620e459fc54db3259e60cf69a4d66c3fda3449", - "sha256:1be7e4c9f96948003609aa6c974ae59830a6baecc5376c25c92d7d697e684c08", - "sha256:4677f594e474c91da97f489fea5b7daa17b5517190899cf213697e48d3902f5a", - "sha256:48dab84ebd4831077b150572aec802f303117c8cc5c871e182447281ebf3ac50", - "sha256:5541cada25cd173702dbd99f8e22434105456314462326f06dba3e180f203dfd", - "sha256:59f79fef100b09564bc2df42ea2d8d21a64fdcda64979c0fa3db7bdaabaf6239", - "sha256:8d859b89baf8ef7f8bc6b00aa20316483d67f0b1cbf422f5b4dc56701c8f2ffb", - "sha256:9254f4358b9b541e3441b007a0ea0764b9d056afdeafc1a5569eee1cc6c1b9ea", - "sha256:9651375199045a358eb6741df3e02a651e0330be090b3bc79f6d0de31a80ec3e", - "sha256:97bb5884f6f1cdce0099f86b907aa41c970c3c672ac8b9c8352789e103cf3156", 
- "sha256:9b15f3f4c0f35727d3a0fba4b770b3c4ebbb1fa907dbcc046a1d2799f3edd142", - "sha256:a2238e9d1bb71a56cd710611a1614d1194dc10a175c1e08d75e1a7bcc250d442", - "sha256:a6ae12d08c0bf9909ce12385803a543bfe99b95fe01e752536a60af2b7797c62", - "sha256:ca0a928a3ddbc5725be2dd1cf895ec0a254798915fb3a36af0964a0a4149e3db", - "sha256:cb2c7c57005a6804ab66f106ceb8482da55f5314b7fcb06551db1edae4ad1531", - "sha256:d74bb8693bf9cf75ac3b47a54d716bbb1a92648d5f781fc799347cfc95952383", - "sha256:d945239a5639b3ff35b70a88c5f2f491913eb94871780ebfabb2568bd58afc5a", - "sha256:eba7011090323c1dadf18b3b689845fd96a61ba0a1dfbd7f24b921398affc357", - "sha256:efa1909120ce98bbb3777e8b6f92237f5d5c8ea6758efea36a473e1d38f7d3e4", - "sha256:f3900e8a5de27447acbf900b4750b0ddfd7ec1ea7fbaf11dfa911141bc522af0" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==1.4.3" - }, - "mccabe": { - "hashes": [ - "sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42", - "sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f" - ], - "version": "==0.6.1" - }, - "pycparser": { - "hashes": [ - "sha256:2d475327684562c3a96cc71adf7dc8c4f0565175cf86b6d7a404ff4c771f15f0", - "sha256:7582ad22678f0fcd81102833f60ef8d0e57288b6b5fb00323d101be910e35705" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==2.20" - }, - "pylint": { - "hashes": [ - "sha256:7dd78437f2d8d019717dbf287772d0b2dbdfd13fc016aa7faa08d67bccc46adc", - "sha256:d0ece7d223fe422088b0e8f13fa0a1e8eb745ebffcb8ed53d3e95394b6101a1c" - ], - "index": "pypi", - "version": "==2.5.3" - }, - "pyopenssl": { - "hashes": [ - "sha256:621880965a720b8ece2f1b2f54ea2071966ab00e2970ad2ce11d596102063504", - "sha256:9a24494b2602aaf402be5c9e30a0b82d4a5c67528fe8fb475e3f3bc00dd69507" - ], - "index": "pypi", - "version": "==19.1.0" - }, - "six": { - "hashes": [ - "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259", - 
"sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced" - ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==1.15.0" - }, - "toml": { - "hashes": [ - "sha256:926b612be1e5ce0634a2ca03470f95169cf16f939018233a670519cb4ac58b0f", - "sha256:bda89d5935c2eac546d648028b9901107a595863cb36bae0c73ac804a9b4ce88" - ], - "version": "==0.10.1" - }, - "wrapt": { - "hashes": [ - "sha256:b62ffa81fb85f4332a4f609cab4ac40709470da05643a082ec1eb88e6d9b97d7" - ], - "version": "==1.12.1" - }, - "yapf": { - "hashes": [ - "sha256:3000abee4c28daebad55da6c85f3cd07b8062ce48e2e9943c8da1b9667d48427", - "sha256:3abf61ba67cf603069710d30acbc88cfe565d907e16ad81429ae90ce9651e0c9" - ], - "index": "pypi", - "version": "==0.30.0" - } - } -} diff --git a/patch-tracking/README.md b/patch-tracking/README.md deleted file mode 100644 index bf1a37e1eb4262fdcd93b426609679ba4afa9ee0..0000000000000000000000000000000000000000 --- a/patch-tracking/README.md +++ /dev/null @@ -1,294 +0,0 @@ -patch-tracking -=== - - -# 简介 - -在 openEuler 发行版开发过程,需要及时更新上游社区各个软件包的最新代码,修改功能 bug 及安全问题,确保发布的 openEuler 发行版尽可能避免缺陷和漏洞。 - -本工具对软件包进行补丁管理,主动监控上游社区提交,自动生成补丁,并自动提交 issue 给对应的 maintainer,同时自动验证补丁基础功能,减少验证工作量支持 maintainer 快速决策。 - -# 架构 - -## C/S架构 - -patch-tracking采用 C/S 架构。 - -服务端(patch-tracking) :负责执行补丁跟踪任务,包括:维护跟踪项,识别上游仓库分支代码变更并形成补丁文件,向 Gitee 提交 issue 及 PR,同时 patch-tracking 提供 RESTful 接口,用于对跟踪项进行增删改查操作。 - -客户端:即命令行工具(patch-tracking-cli),通过调用 patch-tracking 的 RESTful 接口,实现对跟踪项的增删改查操作。 - -## 核心流程 - -* 补丁跟踪服务流程 - -**主要步骤:** - -1. 命令行工具写入跟踪项。 -2. 自动从跟踪项配置的上游仓库(例如Github)获取补丁文件。 -3. 创建临时分支,将获取到的补丁文件提交到临时分支。 -4. 自动提交issue到对应项目,并生成关联 issue 的 PR。 - - - -* Maintainer对提交的补丁处理流程 - -**主要步骤:** -1. Maintainer分析临时分支中的补丁文件,判断是否合入。 -2. 
Run the build; once it succeeds, decide whether to merge the PR. - - - -## Data structures - -* Tracking table - -| No. | Name | Description | Type | Key | Nullable | -|:----:| ----| ----| ----| ----| ----| -| 1 | id | Auto-incremented tracking item ID | int | - | NO | -| 2 | version_control | Version control system type of the upstream SCM | String | - | NO | -| 3 | scm_repo | Upstream SCM repository address | String | - | NO | -| 4 | scm_branch | Tracked branch of the upstream SCM | String | - | NO | -| 5 | scm_commit | Latest processed commit ID of the upstream code | String | - | YES | -| 6 | repo | Gitee repository address of the package source | String | Primary | NO | -| 7 | branch | Gitee repository branch of the package source | String | Primary | NO | -| 8 | enabled | Whether tracking is enabled | Boolean | - | NO | - -* Issue table - -| No. | Name | Description | Type | Key | Nullable | -|:----:| ----| ----| ----| ----| ----| -| 1 | issue | Issue number | String | Primary | NO | -| 2 | repo | Gitee repository address of the package source | String | - | NO | -| 3 | branch | Gitee repository branch of the package source | String | - | NO | - -# Deployment - -## Software download - -Official release repo: https://repo.openeuler.org/ - -rpm package: https://build.openeuler.org/package/show/openEuler:20.09/patch-tracking - - -## Installation - -#### Method 1: install from the repo source - -1. Use dnf to mount the repo source (a 20.09 or newer repo source is required; for details see the [Application Development Guide](https://openeuler.org/zh/docs/20.03_LTS/docs/ApplicationDev/%E5%BC%80%E5%8F%91%E7%8E%AF%E5%A2%83%E5%87%86%E5%A4%87.html)). - -2. Run the following command to download and install `patch-tracking` and its dependencies. - - ```shell script - dnf install patch-tracking - ``` - -#### Method 2: install directly from the rpm package - -1. First install the required dependencies. - - ```shell script - dnf install python3-uWSGI python3-flask python3-Flask-SQLAlchemy python3-Flask-APScheduler python3-Flask-HTTPAuth python3-requests python3-pandas - ``` - -2. Taking `patch-tracking-1.0.0-1.oe1.noarch.rpm` as an example, run the following command to install it. - - ```shell script - rpm -ivh patch-tracking-1.0.0-1.oe1.noarch.rpm - ``` - - -## Generating a certificate -Run the following command to generate a certificate. - -```shell script -openssl req -x509 -days 3650 -subj "/CN=self-signed" \ --nodes -newkey rsa:4096 -keyout self-signed.key -out self-signed.crt -``` - -Copy the generated `self-signed.key` and `self-signed.crt` files to the __/etc/patch-tracking__ directory. - - -## Configuration - -Set the relevant parameters in the configuration file at `/etc/patch-tracking/settings.conf`. - -1. Configure the service listening address. - - ``` - LISTEN = "127.0.0.1:5001" - ``` - -2.
GitHub Token, used to access repository information of upstream open-source software hosted on GitHub. For how to generate a GitHub Token, see [Creating a personal access token](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token). - - ``` - GITHUB_ACCESS_TOKEN = "" - ``` - -3. For a tracked repository hosted on Gitee, configure a Gitee token that has permissions on that repository; it is used to commit patch files and to submit issues and PRs. - - ``` - GITEE_ACCESS_TOKEN = "" - ``` - -4. The database is scanned periodically for new or modified tracking items, and the upstream patch fetch task runs for every tracking item found. Configure the scan interval here, in seconds. - - ``` - SCAN_DB_INTERVAL = 3600 - ``` - -5. While the command-line tool is running, the POST interface requires a username and a password hash for authentication. - - ``` - USER = "admin" - - PASSWORD = "" - ``` - - > The default value of `USER` is `admin`. - - Run the following command to obtain the password hash, where Test@123 is the password being set. - -``` -[root]# generate_password Test@123 -pbkdf2:sha256:150000$w38eLeRm$ebb5069ba3b4dda39a698bd1d9d7f5f848af3bd93b11e0cde2b28e9e34bfbbae -``` - -> The password must meet the following complexity requirements: -> -> * at least 6 characters long -> * contains uppercase letters, lowercase letters, digits, and special characters (~!@#%^*-_=+) - - Put the password hash `pbkdf2:sha256:150000$w38eLeRm$ebb5069ba3b4dda39a698bd1d9d7f5f848af3bd93b11e0cde2b28e9e34bfbbae` between the quotes of `PASSWORD = ""`. - -## Starting the patch tracking service - -The service can be started in either of the following two ways. - -* Via systemd. - - ``` - systemctl start patch-tracking - ``` - -* By running the executable directly. - - ``` - /usr/bin/patch-tracking - ``` - -# Usage - -## Adding a tracking item - -Associate the software repository and branch to be tracked with its upstream open-source repository and branch. This can be done in any of the following three ways. - -### Adding directly on the command line - -Parameter meanings: ->--user: username for POST interface authentication, same as the USER parameter in settings.conf \ ---password: password for POST interface authentication, the actual password string corresponding to the PASSWORD hash in settings.conf \ ---server: URL of the running Patch Tracking service, e.g. 127.0.0.1:5001 \ ---version_control: version control tool of the upstream repository; only github is supported \ ---repo: name of the repository to track, format: organization/repository \ ---branch: branch of the repository to track \ ---scm_repo: name of the tracked upstream repository, GitHub format: organization/repository \ ---scm_branch: branch of the tracked upstream repository \ ---enabled: whether to track the repository automatically - -For example: -```shell script -patch-tracking-cli add --server 127.0.0.1:5001 --user admin --password Test@123 --version_control github --repo testPatchTrack/testPatch1 --branch master --scm_repo BJMX/testPatch01 --scm_branch test --enabled true -``` - -### Adding via a file - -Parameter meanings: ->--server: URL of the running Patch Tracking service, e.g. 127.0.0.1:5001 \ ---user: username for POST interface authentication, same as the USER parameter in settings.conf \ ---password
: password for POST interface authentication, the actual password string corresponding to the PASSWORD hash in settings.conf \ ---file: path to the yaml file - -Write the repository, branch, version control tool, monitoring switch, and other information into a yaml file (e.g. tracking.yaml), and pass the file path to `--file` when calling the command. - -For example: -```shell script -patch-tracking-cli add --server 127.0.0.1:5001 --user admin --password Test@123 --file tracking.yaml -``` - -The yaml file content has the following format; the content to the left of the colons must not be changed, and the content to the right is filled in according to the actual situation. - -```shell script -version_control: github -scm_repo: xxx/xxx -scm_branch: master -repo: xxx/xxx -branch: master -enabled: true -``` - ->version_control: version control tool of the upstream repository; only github is supported \ -scm_repo: name of the tracked upstream repository, GitHub format: organization/repository \ -scm_branch: branch of the tracked upstream repository \ -repo: name of the repository to track, format: organization/repository \ -branch: branch of the repository to track \ -enabled: whether to track the repository automatically - -### Adding via a directory - -Put multiple `xxx.yaml` files into a given directory, e.g. `test_yaml`, and run the following command to record the tracking items of all yaml files in that directory. - -Parameter meanings: ->--user: username for POST interface authentication, same as the USER parameter in settings.conf \ ---password: password for POST interface authentication, the actual password string corresponding to the PASSWORD hash in settings.conf \ ---server: URL of the running Patch Tracking service, e.g. 127.0.0.1:5001 \ ---dir: path of the directory holding the yaml files - -```shell script -patch-tracking-cli add --server 127.0.0.1:5001 --user admin --password Test@123 --dir /home/Work/test_yaml/ -``` - -## Querying tracking items - -Parameter meanings: ->--server: required, URL of the running Patch Tracking service, e.g. 127.0.0.1:5001 \ ---table: required, table to query \ ---repo: optional, repo to query; without this parameter, the whole table is returned \ ---branch: optional, branch to query -```shell script -patch-tracking-cli query --server --table tracking -``` -For example: -```shell script -patch-tracking-cli query --server 127.0.0.1:5001 --table tracking -``` - -## Querying generated Issues - -```shell script -patch-tracking-cli query --server --table issue -``` -For example: -```shell script -patch-tracking-cli query --server 127.0.0.1:5001 --table issue -``` - -## Deleting tracking items - -```shell script -patch-tracking-cli delete --server SERVER --user USER --password PWD --repo REPO [--branch BRANCH] -``` -For example: -```shell script -patch-tracking-cli delete --server 127.0.0.1:5001 --user admin --password Test@123 --repo testPatchTrack/testPatch1 --branch master -``` - -> A single entry for a given repo and branch can be deleted, or all entries for every branch of a given repo can be deleted at once. - - -## Viewing issues and PRs on Gitee
-Log in to Gitee and open the software project being tracked. Under the project's Issues and Pull Requests tabs you can find entries named `[patch tracking] TIME`, e.g. `[patch tracking] 20200713101548`; these entries are the issue and the corresponding PR for the newly generated patch files. - - - diff --git a/patch-tracking/images/Maintainer.jpg b/patch-tracking/images/Maintainer.jpg deleted file mode 100644 index da0d5f1b5d928eca3a0d63795f59c55331136065..0000000000000000000000000000000000000000 Binary files a/patch-tracking/images/Maintainer.jpg and /dev/null differ diff --git a/patch-tracking/images/PatchTracking.jpg b/patch-tracking/images/PatchTracking.jpg deleted file mode 100644 index e12afd6227c18c333f289b9aa71abf608d8058a0..0000000000000000000000000000000000000000 Binary files a/patch-tracking/images/PatchTracking.jpg and /dev/null differ diff --git a/patch-tracking/patch-tracking.spec b/patch-tracking/patch-tracking.spec deleted file mode 100644 index f89b3d06762e604abf779c24ab017859f8d46d87..0000000000000000000000000000000000000000 --- a/patch-tracking/patch-tracking.spec +++ /dev/null @@ -1,68 +0,0 @@ -Summary: This is a tool for automatically tracking upstream repository code patches -Name: patch-tracking -Version: 1.0.2 -Release: 2 -License: Mulan PSL v2 -URL: https://gitee.com/openeuler/openEuler-Advisor -Source0: patch-tracking-%{version}.tar -BuildArch: noarch - - -BuildRequires: python3-setuptools -Requires: python3-uWSGI python3-flask python3-Flask-SQLAlchemy python3-Flask-APScheduler python3-Flask-HTTPAuth -Requires: python3-requests python3-pandas - - -%description -This is a tool for automatically tracking upstream repository code patches - -%prep -%setup -n %{name}-%{version} - -%build -%py3_build - -%install -%py3_install - -%post -sed -i "s|\blogging.conf\b|/etc/patch-tracking/logging.conf|" %{python3_sitelib}/patch_tracking/app.py -sed -i "s|\bsqlite:///db.sqlite\b|sqlite:////var/patch-tracking/db.sqlite|" %{python3_sitelib}/patch_tracking/app.py -sed -i "s|\bsettings.conf\b|/etc/patch-tracking/settings.conf|" %{python3_sitelib}/patch_tracking/app.py -chmod +x
/usr/bin/patch-tracking-cli -chmod +x /usr/bin/patch-tracking -chmod +x /usr/bin/generate_password -sed -i "s|\bpatch-tracking.log\b|/var/log/patch-tracking.log|" /etc/patch-tracking/logging.conf - -%preun -%systemd_preun patch-tracking.service - -%clean -rm -rf $RPM_BUILD_ROOT - -%files -%{python3_sitelib}/* -/etc/patch-tracking/logging.conf -/etc/patch-tracking/settings.conf -/usr/bin/patch-tracking -/usr/bin/patch-tracking-cli -/var/patch-tracking/db.sqlite -/usr/bin/generate_password -/usr/lib/systemd/system/patch-tracking.service - - -%changelog -* Sat Sep 12 2020 chenyanpan - 1.0.2-2 -- Type: bugfix -- DESC: fixed name of python3-Flask-HTTPAuth - -* Fri Sep 11 2020 chenyanpan - 1.0.2-1 -- Type: bugfix -- DESC: fixed issues, specify Requires -- https://gitee.com/src-openeuler/patch-tracking/issues: I1TXTA I1TWVU I1TSG7 I1TYJV I1UAMC I1TYNW -- https://gitee.com/openeuler/docs/issues: I1U54H - - -* Mon Sep 07 2020 chenyanpan - 1.0.1-1 -- Type: bugfix -- DESC: fixed issues related to the validity of service configuration items and command line parameters diff --git a/patch-tracking/patch_tracking/__init__.py b/patch-tracking/patch_tracking/__init__.py deleted file mode 100644 index 10aa07dfba6f285624686888fded0cd26f2dd51b..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/__init__.py +++ /dev/null @@ -1 +0,0 @@ -""" module of patch_tracking """ diff --git a/patch-tracking/patch_tracking/api/__init__.py b/patch-tracking/patch_tracking/api/__init__.py deleted file mode 100644 index 452755011d5bac39b0f419c1739595144db192ce..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/api/__init__.py +++ /dev/null @@ -1 +0,0 @@ -""" module of api """ diff --git a/patch-tracking/patch_tracking/api/auth.py b/patch-tracking/patch_tracking/api/auth.py deleted file mode 100644 index 6916f9bc358fdf6652ff77087dd48177f640e21e..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/api/auth.py +++ /dev/null @@ 
-1,39 +0,0 @@ -""" -http basic auth -""" -import logging -from werkzeug.security import check_password_hash -from flask_httpauth import HTTPBasicAuth -from flask import current_app as app - -logger = logging.getLogger(__name__) - -auth = HTTPBasicAuth() - - -@auth.verify_password -def verify_password(username, password): - """ - verify password - """ - try: - if username == app.config["USER"] and \ - check_password_hash(app.config["PASSWORD"], password): - return username - except ValueError as err: - logger.error(err) - return None - logger.error("verify password failed") - return None - - -if __name__ == "__main__": - try: - print( - check_password_hash( - " pbkdf2:sha256:150000$ClAZjafb$ec0718c193c000e70812a0709919596e7523ab581c25ea6883aadba33c2edf0d", - "Test@123" - ) - ) - except ValueError as err: - print(err) diff --git a/patch-tracking/patch_tracking/api/business.py b/patch-tracking/patch_tracking/api/business.py deleted file mode 100644 index 2152a12de8a34fba0f5e4c82c79990fec47cdfd6..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/api/business.py +++ /dev/null @@ -1,80 +0,0 @@ -""" -api action method -""" -from sqlalchemy import and_ -from patch_tracking.database import db -from patch_tracking.database.models import Tracking, Issue - - -def create_tracking(data): - """ - create tracking - """ - version_control = data.get("version_control") - scm_repo = data.get('scm_repo') - scm_branch = data.get('scm_branch') - scm_commit = data.get('scm_commit') - repo = data.get('repo') - branch = data.get('branch') - enabled = data.get('enabled') - tracking = Tracking(version_control, scm_repo, scm_branch, scm_commit, repo, branch, enabled) - db.session.add(tracking) - db.session.commit() - - -def update_tracking(data): - """ - update tracking - """ - repo = data.get('repo') - branch = data.get('branch') - tracking = Tracking.query.filter(and_(Tracking.repo == repo, Tracking.branch == branch)).one() - tracking.version_control = 
data.get("version_control") - tracking.scm_repo = data.get('scm_repo') - tracking.scm_branch = data.get('scm_branch') - tracking.scm_commit = data.get('scm_commit') - tracking.enabled = data.get('enabled') - db.session.commit() - - -def delete_tracking(repo_, branch_=None): - """ - delete tracking - """ - if branch_: - Tracking.query.filter(Tracking.repo == repo_, Tracking.branch == branch_).delete() - else: - Tracking.query.filter(Tracking.repo == repo_).delete() - db.session.commit() - - -def create_issue(data): - """ - create issue - """ - issue = data.get('issue') - repo = data.get('repo') - branch = data.get('branch') - issue_ = Issue(issue, repo, branch) - db.session.add(issue_) - db.session.commit() - - -def update_issue(data): - """ - update issue - """ - issue = data.get('issue') - issue_ = Issue.query.filter(Issue.issue == issue).one() - issue_.issue = data.get('issue') - db.session.add(issue_) - db.session.commit() - - -def delete_issue(issue): - """ - delete issue - """ - issue_ = Issue.query.filter(Issue.issue == issue).one() - db.session.delete(issue_) - db.session.commit() diff --git a/patch-tracking/patch_tracking/api/constant.py b/patch-tracking/patch_tracking/api/constant.py deleted file mode 100644 index c2de1446e52817fd29c92f73a1db7e2eee8ad0f8..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/api/constant.py +++ /dev/null @@ -1,52 +0,0 @@ -''' - Response contain and code ID -''' -import json - - -class ResponseCode: - """ - Description: response code to web - changeLog: - """ - - SUCCESS = "2001" - INPUT_PARAMETERS_ERROR = "4001" - TRACKING_NOT_FOUND = "4002" - ISSUE_NOT_FOUND = "4003" - - GITHUB_ADDRESS_ERROR = "5001" - GITEE_ADDRESS_ERROR = "5002" - GITHUB_CONNECTION_ERROR = "5003" - GITEE_CONNECTION_ERROR = "5004" - - INSERT_DATA_ERROR = "6004" - DELETE_DB_ERROR = "6001" - CONFIGFILE_PATH_EMPTY = "6002" - DIS_CONNECTION_DB = "6003" - DELETE_DB_NOT_FOUND = "6005" - - CODE_MSG_MAP = { - SUCCESS: "Successful 
Operation!", - INPUT_PARAMETERS_ERROR: "Please enter the correct parameters", - TRACKING_NOT_FOUND: "The tracking you are looking for does not exist", - ISSUE_NOT_FOUND: "The issue you are looking for does not exist", - GITHUB_ADDRESS_ERROR: "The Github address is wrong", - GITEE_ADDRESS_ERROR: "The Gitee address is wrong", - GITHUB_CONNECTION_ERROR: "Unable to connect to the github", - GITEE_CONNECTION_ERROR: "Unable to connect to the gitee", - DELETE_DB_ERROR: "Failed to delete database", - CONFIGFILE_PATH_EMPTY: "Initialization profile does not exist or cannot be found", - DIS_CONNECTION_DB: "Unable to connect to the database, check the database configuration", - DELETE_DB_NOT_FOUND: "The tracking you want to delete does not exist" - } - - @classmethod - def ret_message(cls, code, data=None): - """ - generate response dictionary - """ - return json.dumps({"code": code, "msg": cls.CODE_MSG_MAP[code], "data": data}) - - def __str__(self): - return 'ResponseCode' diff --git a/patch-tracking/patch_tracking/api/issue.py b/patch-tracking/patch_tracking/api/issue.py deleted file mode 100644 index 92a6ac01d96fbe9a64fcfbd81a3b6f83c6da95d9..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/api/issue.py +++ /dev/null @@ -1,34 +0,0 @@ -""" -module of issue API -""" -import logging -from flask import request -from flask import Blueprint -from patch_tracking.database.models import Issue -from patch_tracking.api.constant import ResponseCode - -log = logging.getLogger(__name__) -issue = Blueprint('issue', __name__) - - -@issue.route('', methods=["GET"]) -def get(): - """ - Returns list of issue. 
- """ - if not request.args: - issues = Issue.query.all() - else: - allowed_key = ['repo', 'branch'] - input_params = request.args - data = dict() - for k, param in input_params.items(): - if k in allowed_key: - data[k] = param - else: - return ResponseCode.ret_message(ResponseCode.INPUT_PARAMETERS_ERROR) - issues = Issue.query.filter_by(**data).all() - resp_data = list() - for item in issues: - resp_data.append(item.to_json()) - return ResponseCode.ret_message(code=ResponseCode.SUCCESS, data=resp_data) diff --git a/patch-tracking/patch_tracking/api/tracking.py b/patch-tracking/patch_tracking/api/tracking.py deleted file mode 100644 index 4a8a24cf78e244656c4a4b587eb2b00b9ebf4268..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/api/tracking.py +++ /dev/null @@ -1,117 +0,0 @@ -""" -module of issue API -""" -import logging -from flask import request, Blueprint -from sqlalchemy.exc import SQLAlchemyError -from patch_tracking.database.models import Tracking -from patch_tracking.api.business import create_tracking, update_tracking, delete_tracking -from patch_tracking.api.constant import ResponseCode -from patch_tracking.api.auth import auth - -logger = logging.getLogger(__name__) -tracking = Blueprint('tracking', __name__) - - -@tracking.route('', methods=["DELETE"]) -@auth.login_required -def delete(): - """ - Delete tracking(s). 
- """ - input_params = request.args - keys = list(input_params.keys()) - - if not keys or "repo" not in keys: - return ResponseCode.ret_message(ResponseCode.INPUT_PARAMETERS_ERROR) - - if len(set(keys) - {"repo", "branch"}) != 0: - return ResponseCode.ret_message(ResponseCode.INPUT_PARAMETERS_ERROR) - - try: - if "branch" in keys: - if Tracking.query.filter(Tracking.repo == input_params['repo'], - Tracking.branch == input_params['branch']).first(): - delete_tracking(input_params['repo'], input_params['branch']) - logger.info('Delete tracking repo: %s, branch: %s', input_params['repo'], input_params['branch']) - return ResponseCode.ret_message(code=ResponseCode.SUCCESS) - else: - logger.info( - 'Delete tracking repo: %s, branch: %s not found.', input_params['repo'], input_params['branch'] - ) - return ResponseCode.ret_message(code=ResponseCode.DELETE_DB_NOT_FOUND) - else: - if Tracking.query.filter(Tracking.repo == input_params['repo']).first(): - delete_tracking(input_params['repo']) - logger.info('Delete tracking repo: %s', input_params['repo']) - return ResponseCode.ret_message(code=ResponseCode.SUCCESS) - else: - logger.info('Delete tracking repo: %s not found.', input_params['repo']) - return ResponseCode.ret_message(code=ResponseCode.DELETE_DB_NOT_FOUND) - except SQLAlchemyError as err: - return ResponseCode.ret_message(code=ResponseCode.DELETE_DB_ERROR, data=err) - - -@tracking.route('', methods=["GET"]) -def get(): - """ - Returns list of tracking - """ - if not request.args: - trackings = Tracking.query.all() - else: - allowed_key = ['repo', 'branch', 'enabled'] - input_params = request.args - - data = dict() - for k, param in input_params.items(): - if k in allowed_key: - if k == 'enabled': - param = bool(param == 'true') - data[k] = param - else: - return ResponseCode.ret_message(ResponseCode.INPUT_PARAMETERS_ERROR) - trackings = Tracking.query.filter_by(**data).all() - - resp_data = list() - for item in trackings: - resp_data.append(item.to_json()) - 
return ResponseCode.ret_message(code=ResponseCode.SUCCESS, data=resp_data) - - -@tracking.route('', methods=["POST"]) -@auth.login_required -def post(): - """ - Creates or update a tracking. - """ - required_params = ['version_control', 'scm_repo', 'scm_branch', 'scm_commit', 'repo', 'branch', 'enabled'] - input_params = request.json - data = dict() - for item in input_params: - if item in required_params: - data[item] = input_params[item] - required_params.remove(item) - else: - return ResponseCode.ret_message(ResponseCode.INPUT_PARAMETERS_ERROR) - - if len(required_params) > 1 or (len(required_params) == 1 and required_params[0] != 'scm_commit'): - return ResponseCode.ret_message(ResponseCode.INPUT_PARAMETERS_ERROR) - - if data['version_control'] != 'github': - return ResponseCode.ret_message(ResponseCode.INPUT_PARAMETERS_ERROR) - - track = Tracking.query.filter_by(repo=data['repo'], branch=data['branch']).first() - if track: - try: - update_tracking(data) - logger.info('Update tracking. Data: %s.', data) - except SQLAlchemyError as err: - return ResponseCode.ret_message(code=ResponseCode.INSERT_DATA_ERROR, data=err) - else: - try: - create_tracking(data) - logger.info('Create tracking. 
Data: %s.', data) - except SQLAlchemyError as err: - return ResponseCode.ret_message(code=ResponseCode.INSERT_DATA_ERROR, data=err) - return ResponseCode.ret_message(code=ResponseCode.SUCCESS, data=request.json) diff --git a/patch-tracking/patch_tracking/app.py b/patch-tracking/patch_tracking/app.py deleted file mode 100644 index 84aaad191ea4574ef5da3b4c8002d35843149866..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/app.py +++ /dev/null @@ -1,111 +0,0 @@ -""" -flask app -""" -import logging.config -import os -import sys -from flask import Flask -from patch_tracking.api.issue import issue -from patch_tracking.api.tracking import tracking -from patch_tracking.database import db -from patch_tracking.task import task -from patch_tracking.util import github_api, gitee_api - -logging.config.fileConfig('logging.conf', disable_existing_loggers=False) - -app = Flask(__name__) -logger = logging.getLogger(__name__) - - -def check_token(): - """ check gitee/github token """ - gitee_token = app.config['GITEE_ACCESS_TOKEN'] - github_token = app.config['GITHUB_ACCESS_TOKEN'] - - github_ret = github_api.get_user_info(github_token) - if not github_ret[0]: - logger.error(github_ret[1]) - sys.exit(1) - - gitee_ret = gitee_api.get_user_info(gitee_token) - if not gitee_ret[0]: - logger.error(gitee_ret[1]) - sys.exit(1) - - -def check_listen(listen_param): - """ check LISTEN """ - check_ret = True - if ":" in listen_param and listen_param.count(":") == 1: - host, port = listen_param.split(":") - if int(port) > 65535 or int(port) <= 0: - check_ret = False - if "." 
in host and host.count(".") == 3: - for item in host.split("."): - if int(item) < 0 or int(item) > 255: - check_ret = False - else: - check_ret = False - else: - check_ret = False - return check_ret - - -def check_settings_conf(): - """ - check settings.conf - """ - setting_error = False - required_settings = ['LISTEN', 'GITHUB_ACCESS_TOKEN', 'GITEE_ACCESS_TOKEN', 'SCAN_DB_INTERVAL', 'USER', 'PASSWORD'] - for setting in required_settings: - if setting in app.config: - if app.config[setting] == "": - logger.error('%s is empty in settings.conf.', setting) - setting_error = True - else: - if setting == "LISTEN" and (not check_listen(app.config[setting])): - logger.error('LISTEN error: illegal param in /etc/patch-tracking/settings.conf.') - setting_error = True - if setting == "SCAN_DB_INTERVAL" and int(app.config[setting]) <= 0: - logger.error( - 'SCAN_DB_INTERVAL error: must be greater than zero in /etc/patch-tracking/settings.conf.' - ) - setting_error = True - if setting == "USER" and len(app.config[setting]) > 32: - logger.error('USER value error: user name too long, USER character should less than 32.') - setting_error = True - else: - logger.error('%s not configured in settings.conf.', setting) - setting_error = True - if setting_error: - sys.exit(1) - - -settings_file = os.path.join(os.path.abspath(os.curdir), "settings.conf") -try: - app.config.from_pyfile(settings_file) - check_settings_conf() - app.config["LISTEN"] = app.config["LISTEN"].strip() - app.config["GITHUB_ACCESS_TOKEN"] = app.config["GITHUB_ACCESS_TOKEN"].strip() - app.config["GITEE_ACCESS_TOKEN"] = app.config["GITEE_ACCESS_TOKEN"].strip() - app.config["USER"] = app.config["USER"].strip() - app.config["PASSWORD"] = app.config["PASSWORD"].strip() -except (SyntaxError, NameError): - logger.error('settings.conf content format error.') - sys.exit(1) - -check_token() - -app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///db.sqlite?check_same_thread=False&timeout=30' 
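An aside on the deleted `check_listen` validator just above: it parses the `host:port` LISTEN value with nested conditionals and calls `int()` on unvalidated substrings, so a non-numeric port or empty octet raises `ValueError` instead of returning `False`. A hardened equivalent, offered as a sketch (not a drop-in replacement for the removed code):

```python
def check_listen(listen_param):
    """Validate a "host:port" LISTEN value: a dotted-quad IPv4 host
    plus a TCP port in 1-65535. Unlike the original, non-numeric
    input yields False rather than an uncaught ValueError."""
    if listen_param.count(":") != 1:
        return False
    host, port = listen_param.split(":")
    # Port must be numeric and within the valid TCP range.
    if not port.isdigit() or not 0 < int(port) <= 65535:
        return False
    # Host must be exactly four numeric octets, each 0-255.
    octets = host.split(".")
    if len(octets) != 4:
        return False
    return all(o.isdigit() and 0 <= int(o) <= 255 for o in octets)
```

The same bounds as the original (`port <= 65535`, octets 0-255) are kept; only the failure mode changes.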
-app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False -app.config['SCHEDULER_EXECUTORS'] = {'default': {'type': 'threadpool', 'max_workers': 100}} - -app.register_blueprint(issue, url_prefix="/issue") -app.register_blueprint(tracking, url_prefix="/tracking") - -db.init_app(app) - -task.init(app) - -if __name__ == "__main__": - app.run(ssl_context="adhoc") diff --git a/patch-tracking/patch_tracking/cli/__init__.py b/patch-tracking/patch_tracking/cli/__init__.py deleted file mode 100644 index 872a5094faf3989e273e1ae1c8313d59aaaf3319..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/cli/__init__.py +++ /dev/null @@ -1 +0,0 @@ -""" module of cli """ diff --git a/patch-tracking/patch_tracking/cli/generate_password b/patch-tracking/patch_tracking/cli/generate_password deleted file mode 100644 index e126ac71b18ab645d20ff1f350eee73f93ed80f8..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/cli/generate_password +++ /dev/null @@ -1,96 +0,0 @@ -#!/usr/bin/env python3 -""" -command line to generate password hash by pbkdf2 -""" - -import sys -import re -from werkzeug.security import generate_password_hash - - -def usage(): - """ usage """ - print( - """usage: generate_password PASSWORD - -Requirements: -1. PASSWORD must be within the 'latin1' character set -2. 
PASSWORD strength require: - length must be between 6 and 32 - at least 1 digit [0-9] - at least 1 alphabet [a-z] - at least 1 alphabet of Upper Case [A-Z] - at least 1 special character from [~!@#%^*_+=-] -""" - ) - - -def password_encode_check(password): - """ check if password within the latin1 character set """ - try: - password.encode("latin1") - except UnicodeEncodeError as err: - return str(err) - return None - - -def password_strength_check(password): - """ - Verify the strength of 'password' - Returns a dict indicating the wrong criteria - """ - - # calculating the length - length_error = len(password) < 6 or len(password) > 32 - - # searching for digits - digit_error = re.search(r"\d", password) is None - - # searching for uppercase - uppercase_error = re.search(r"[A-Z]", password) is None - - # searching for lowercase - lowercase_error = re.search(r"[a-z]", password) is None - - # searching for symbols - symbol_error = re.search(r"[~!@#%^*_+=-]", password) is None - - # overall result - password_ok = not (length_error or digit_error or uppercase_error or lowercase_error or symbol_error) - - return { - 'ok': password_ok, - 'error': { - 'length': length_error, - 'digit': digit_error, - 'uppercase': uppercase_error, - 'lowercase': lowercase_error, - 'symbol': symbol_error, - } - } - - -if __name__ == "__main__": - if len(sys.argv) != 2: - usage() - print("Error: One password input allowed.") - sys.exit(1) - - password_ = sys.argv[1] - - ret = password_encode_check(password_) - if ret: - usage() - print("PASSWORD: only latin1 character set are allowed") - sys.exit(1) - - ret = password_strength_check(password_) - if not ret['ok']: - usage() - print("Password strength is not satisfied:") - for item in ret['error']: - if ret['error'][item]: - print("{} not satisfied.".format(item)) - sys.exit(1) - else: - print(generate_password_hash(password_)) diff --git a/patch-tracking/patch_tracking/cli/patch-tracking-cli 
b/patch-tracking/patch_tracking/cli/patch-tracking-cli deleted file mode 100644 index bf88f93a8ea167a7d0e59a77773c6327b7602f21..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/cli/patch-tracking-cli +++ /dev/null @@ -1,11 +0,0 @@ -#!/usr/bin/env python3 - -# -*- coding: utf-8 -*- -import re -import sys - -from patch_tracking.cli.patch_tracking_cli import main - -if __name__ == '__main__': - sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0]) - sys.exit(main()) diff --git a/patch-tracking/patch_tracking/cli/patch_tracking_cli.py b/patch-tracking/patch_tracking/cli/patch_tracking_cli.py deleted file mode 100755 index c2a67f7003fb107f037a94d1c6108cf8d0c62c4c..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/cli/patch_tracking_cli.py +++ /dev/null @@ -1,419 +0,0 @@ -#!/usr/bin/env python3 -""" -command line of creating tracking item -""" -import argparse -import os -import sys -import pandas -import requests -from requests.auth import HTTPBasicAuth -from requests.packages.urllib3.exceptions import InsecureRequestWarning - -requests.packages.urllib3.disable_warnings(InsecureRequestWarning) -pandas.set_option('display.max_rows', None) -pandas.set_option('display.width', None) - - -def query_table(args): - """ - query table - """ - server = args.server - - if args.table == "tracking": - url = '/'.join(['https:/', server, 'tracking']) - params = {'repo': args.repo, 'branch': args.branch} - try: - ret = requests.get(url, params=params, verify=False) - if ret.status_code == 200 and ret.json()['code'] == '2001': - return 'success', ret - - return 'error', ret - except IOError as exception: - return 'error', 'Connect server error: ' + str(exception) - elif args.table == "issue": - url = '/'.join(['https:/', server, 'issue']) - params = {'repo': args.repo, 'branch': args.branch} - try: - ret = requests.get(url, params=params, verify=False) - if ret.status_code == 200 and ret.json()['code'] == '2001': - 
return 'success', ret - - return 'error', ret - except IOError as exception: - return 'error', 'Connect server error: ' + str(exception) - return 'error', 'table ' + args.table + ' not found' - - -def add_param_check_url(params, file_path=None): - """ - check url - """ - scm_url = f"https://github.com/{params['scm_repo']}/tree/{params['scm_branch']}" - url = f"https://gitee.com/{params['repo']}/tree/{params['branch']}" - patch_tracking_url = f"https://{params['server']}" - server_ret = server_check(patch_tracking_url) - if server_ret[0] != 'success': - return 'error' - - scm_ret = repo_branch_check(scm_url) - if scm_ret[0] != 'success': - if file_path: - print( - f"scm_repo: {params['scm_repo']} and scm_branch: {params['scm_branch']} check failed. \n" - f"Error in {file_path}. {scm_ret[1]}" - ) - else: - print(f"scm_repo: {params['scm_repo']} and scm_branch: {params['scm_branch']} check failed. {scm_ret[1]}") - return 'error' - ret = repo_branch_check(url) - if ret[0] != 'success': - if file_path: - print(f"repo: {params['repo']} and branch: {params['branch']} check failed. {ret[1]}. Error in {file_path}") - else: - print(f"repo: {params['repo']} and branch: {params['branch']} check failed. 
{ret[1]}.") - return 'error' - return None - - -def server_check(url): - """ - check if patch_tracking server start - """ - try: - ret = requests.head(url=url, verify=False) - except IOError as exception: - print(f"Error: Cannot connect to {url}, please make sure patch-tracking service is running.") - return 'error', exception - if ret.status_code == 200 or ret.status_code == 404: - return 'success', ret - - print(f"Unexpected Error: {ret.text}") - return 'error', ret.text - - -def repo_branch_check(url): - """ - check if repo/branch exist - """ - headers = { - "User-Agent": - "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) " + - "Ubuntu Chromium/83.0.4103.61 Chrome/83.0.4103.61 Safari/537.36" - } - try: - ret = requests.get(url=url, headers=headers) - except IOError as exception: - return 'error', exception - if ret.status_code == 404: - return 'error', f'{url} not exist.' - if ret.status_code == 200: - return 'success', ret - - return 'error', ret.text - - -def latin1_encode(text): - """ latin1 encode """ - try: - text.encode("latin1") - except UnicodeEncodeError as err: - return str(err) - return None - - -def params_input_track(params, file_path=None): - """ - load tracking from command line arguments - """ - if not check_add_param(params): - return 'error', 'Check input params error' - - if add_param_check_url(params, file_path) == 'error': - return 'error', 'Check input params error.' 
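For reference, the deleted `generate_password` script (earlier in this patch) builds its strength verdict from five independent regex checks over the candidate password. Reassembled as a self-contained sketch, returning a `(ok, failed)` pair instead of the original's nested dict (a presentation choice, not part of the removed code):

```python
import re


def password_strength_check(password):
    """Check five strength criteria; return (ok, failed) where
    `failed` maps each criterion name to True when it is NOT met."""
    failed = {
        "length": not 6 <= len(password) <= 32,              # 6-32 chars
        "digit": re.search(r"\d", password) is None,          # >= 1 of 0-9
        "uppercase": re.search(r"[A-Z]", password) is None,   # >= 1 of A-Z
        "lowercase": re.search(r"[a-z]", password) is None,   # >= 1 of a-z
        "symbol": re.search(r"[~!@#%^*_+=-]", password) is None,
    }
    return not any(failed.values()), failed
```

The symbol character class matches the one enforced by the deleted script's usage text.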
- - repo = params['repo'] - branch = params['branch'] - scm_repo = params['scm_repo'] - scm_branch = params['scm_branch'] - version_control = params['version_control'].lower() - enabled = params['enabled'].lower() - server = params['server'] - user = params['user'] - password = params['password'] - - if enabled not in ["True", "true", "False", "false"]: - print(ADD_USAGE) - return "error", "error: enabled: invalid value: '{}' (choose from 'True', 'true', 'False', 'false')".format( - enabled - ) - - if version_control not in ["github"]: - print(ADD_USAGE) - return "error", "error: version_control: invalid value: '{}' (choose from 'github')".format(version_control) - - err = latin1_encode(user) - if err: - return "error", "ERROR: user: only latin1 character set are allowed." - err = latin1_encode(password) - if err: - return "error", "ERROR: user: only latin1 character set are allowed." - - enabled = bool(enabled == 'true') - - url = '/'.join(['https:/', server, 'tracking']) - data = { - 'version_control': version_control, - 'scm_repo': scm_repo, - 'scm_branch': scm_branch, - 'repo': repo, - 'branch': branch, - 'enabled': enabled - } - try: - ret = requests.post(url, json=data, verify=False, auth=HTTPBasicAuth(user, password)) - except IOError as exception: - return 'error', 'Connect server error: ' + str(exception) - if ret.status_code == 401 or ret.status_code == 403: - return 'error', 'Authenticate Error. Please make sure user and password are correct.' - if ret.status_code == 200 and ret.json()['code'] == '2001': - return 'success', 'created' - - print("status_code: {}, return text: {}".format(ret.status_code, ret.text)) - return 'error', 'Unexpected Error.' 
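Both `params_input_track` above and `delete` below lean on the same two small validators from the deleted CLI module, `latin1_encode` and `check_password_length`. Extracted into a runnable form for reference:

```python
def latin1_encode(text):
    """Return an error message if text falls outside the latin1
    character set, else None (the CLI treats None as a pass)."""
    try:
        text.encode("latin1")
    except UnicodeEncodeError as err:
        return str(err)
    return None


def check_password_length(password):
    """Password length must be between 6 and 32 characters, inclusive."""
    return 6 <= len(password) <= 32
```

Note the original code has a small message bug here: when the *password* fails the latin1 check in `params_input_track`, the error text still reads `ERROR: user: ...`.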
- - -def check_add_param(params): - """check add type param""" - success = True - required_params = ["repo", "branch", "scm_repo", "scm_branch", "version_control", "enabled"] - miss_params = list() - for param in required_params: - if param not in params or not params[param]: - miss_params.append(param) - success = False - if not success: - print( - "patch_tracking_cli add: error: the following arguments are required: --{}".format( - ", --".join(miss_params) - ) - ) - return success - - -def add(args): - """ - add tracking - """ - if not check_password_length(args.password): - print('PASSWORD: Password length must be between 6 and 32') - return - style1 = bool(args.version_control) or bool(args.repo) or bool(args.branch) or bool(args.scm_repo) or bool( - args.scm_branch - ) or bool(args.enabled) - style2 = bool(args.file) - style3 = bool(args.dir) - - if str([style1, style2, style3]).count('True') >= 2: - print("usage:" + ADD_USAGE) - print("patch_tracking_cli add: error: mix different usage style") - return - - if style2: - file_input_track(args.file, args) - elif style3: - dir_input_track(args.dir, args) - else: - params = { - 'repo': args.repo, - 'branch': args.branch, - 'scm_repo': args.scm_repo, - 'scm_branch': args.scm_branch, - 'version_control': args.version_control, - 'enabled': args.enabled, - 'server': args.server, - 'user': args.user, - 'password': args.password - } - ret = params_input_track(params) - if ret[0] == 'success': - print('Tracking successfully.') - else: - print(ret[1]) - - -def delete(args): - """ - delete tracking - """ - server = args.server - user = args.user - password = args.password - - if not check_password_length(password): - print('PASSWORD: Password length must be between 6 and 32') - return - - err = latin1_encode(user) - if err: - print("ERROR: user: only latin1 character set are allowed.") - return - err = latin1_encode(password) - if err: - print("ERROR: password: Only latin1 character set are allowed.") - return - - url = 
'/'.join(['https:/', server, 'tracking']) - if args.branch: - params = {'repo': args.repo, 'branch': args.branch} - else: - params = {'repo': args.repo} - try: - ret = requests.delete(url, params=params, verify=False, auth=HTTPBasicAuth(user, password)) - if ret.status_code == 200 and ret.json()['code'] == '2001': - print('Tracking delete successfully.') - return - if ret.status_code == 200 and ret.json()['code'] == '6005': - print('Delete Nothing. Tracking not exist.') - return - - print("Tracking delete failed. Error: {}".format(ret.text)) - except IOError as exception: - print('Connect server error: ' + str(exception)) - - -def query(args): - """ - query table data - """ - status, ret = query_table(args) - if status == "success": - data_frame = pandas.DataFrame.from_dict(ret.json()["data"], orient="columns") - data_frame.index = range(1, len(data_frame) + 1) - print(data_frame) - else: - print(ret) - - -def file_input_track(file_path, args): - """ - load tracking from file - """ - if os.path.exists(file_path) and os.path.isfile(file_path): - if os.path.splitext(file_path)[-1] != ".yaml": - print('Please input yaml file. Error in {}'.format(file_path)) - return - with open(file_path) as file: - content = file.readlines() - params = dict() - for item in content: - if ":" in item: - k = item.split(':')[0] - value = item.split(':')[1].strip(' ').strip('\n') - params.update({k: value}) - params.update({'server': args.server, 'user': args.user, 'password': args.password}) - ret = params_input_track(params, file_path) - if ret[0] == 'success': - print('Tracking successfully {} for {}'.format(ret[1], file_path)) - else: - print('Tracking failed for {}: {}'.format(file_path, ret[1])) - else: - print('yaml path error. 
Params error in {}'.format(file_path)) - - -def dir_input_track(dir_path, args): - """ - load tracking from dir - """ - if os.path.exists(dir_path) and os.path.isdir(dir_path): - dir_files = os.listdir(dir_path) - if not dir_files: - print('error: dir path empty') - return - for file in dir_files: - if os.path.isfile(os.path.join(dir_path, file)) and os.path.splitext(file)[-1] == ".yaml": - file_path = os.path.join(dir_path, file) - file_input_track(file_path, args) - else: - print('Please input yaml file. Error in {}'.format(file)) - else: - print('error: dir path error. Params error in {}'.format(dir_path)) - - -def check_password_length(password): - """ - Password length must be between 6 and 32 - """ - if 6 <= len(password) <= 32: - return True - return False - - -parser = argparse.ArgumentParser( - prog='patch_tracking_cli', - allow_abbrev=False, - description="command line tool for manipulating patch tracking information" -) -subparsers = parser.add_subparsers(description=None, dest='subparser_name', help='additional help') - -# common argument -common_parser = argparse.ArgumentParser(add_help=False) -common_parser.add_argument("--server", required=True, help="patch tracking daemon server") - -# authentication argument -authentication_parser = argparse.ArgumentParser(add_help=False) -authentication_parser.add_argument('--user', required=True, help='authentication username') -authentication_parser.add_argument('--password', required=True, help='authentication password') - -# add -ADD_USAGE = """ - patch_tracking_cli add --server SERVER --user USER --password PASSWORD - --version_control github --scm_repo SCM_REPO --scm_branch SCM_BRANCH - --repo REPO --branch BRANCH --enabled True - patch_tracking_cli add --server SERVER --user USER --password PASSWORD --file FILE - patch_tracking_cli add --server SERVER --user USER --password PASSWORD --dir DIR""" -parser_add = subparsers.add_parser( - 'add', parents=[common_parser, authentication_parser], help="add 
tracking", usage=ADD_USAGE, allow_abbrev=False -) -parser_add.set_defaults(func=add) -parser_add.add_argument("--version_control", choices=['github'], help="upstream version control system") -parser_add.add_argument("--scm_repo", help="upstream scm repository") -parser_add.add_argument("--scm_branch", help="upstream scm branch") -parser_add.add_argument("--repo", help="source package repository") -parser_add.add_argument("--branch", help="source package branch") -parser_add.add_argument("--enabled", choices=["True", "true", "False", "false"], help="whether tracing is enabled") -parser_add.add_argument('--file', help='import patch tracking from file') -parser_add.add_argument('--dir', help='import patch tracking from files in directory') - -# delete -parser_delete = subparsers.add_parser( - 'delete', parents=[common_parser, authentication_parser], help="delete tracking", allow_abbrev=False -) -parser_delete.set_defaults(func=delete) -parser_delete.add_argument("--repo", required=True, help="source package repository") -parser_delete.add_argument("--branch", help="source package branch") - -# query -parser_query = subparsers.add_parser('query', parents=[common_parser], help="query tracking/issue", allow_abbrev=False) -parser_query.set_defaults(func=query) -parser_query.add_argument("--table", required=True, choices=["tracking", "issue"], help="query tracking or issue") -parser_query.add_argument("--repo", help="source package repository") -parser_query.add_argument("--branch", help="source package branch") - - -def main(): - """main""" - args_ = parser.parse_args() - if args_.subparser_name: - if args_.func(args_) != "success": - sys.exit(1) - else: - sys.exit(0) - else: - parser.print_help() - sys.exit(1) - - -if __name__ == "__main__": - main() diff --git a/patch-tracking/patch_tracking/database/__init__.py b/patch-tracking/patch_tracking/database/__init__.py deleted file mode 100644 index 
83b427cae72e9b47097d8b247a04b8ce1c5efb62..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/database/__init__.py +++ /dev/null @@ -1,14 +0,0 @@ -""" -database init -""" -from flask_sqlalchemy import SQLAlchemy - -db = SQLAlchemy() - - -def reset_database(): - """ - reset database - """ - db.drop_all() - db.create_all() diff --git a/patch-tracking/patch_tracking/database/models.py b/patch-tracking/patch_tracking/database/models.py deleted file mode 100644 index 8aee57cd9c292d03ed6ffc023de86f603e05b0f1..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/database/models.py +++ /dev/null @@ -1,67 +0,0 @@ -""" -module of database model -""" -from patch_tracking.database import db - - -class Tracking(db.Model): - """ - database model of tracking - """ - id = db.Column(db.Integer, autoincrement=True) - version_control = db.Column(db.String(80)) - scm_repo = db.Column(db.String(80)) - scm_branch = db.Column(db.String(80)) - scm_commit = db.Column(db.String(80)) - repo = db.Column(db.String(80), primary_key=True) - branch = db.Column(db.String(80), primary_key=True) - enabled = db.Column(db.Boolean) - - def __init__(self, version_control, scm_repo, scm_branch, scm_commit, repo, branch, enabled=True): - self.version_control = version_control - self.scm_repo = scm_repo - self.scm_branch = scm_branch - self.scm_commit = scm_commit - self.repo = repo - self.branch = branch - self.enabled = enabled - - def __repr__(self): - return '' % (self.repo, self.branch) - - def to_json(self): - """ - convert to json - """ - return { - 'version_control': self.version_control, - 'scm_repo': self.scm_repo, - 'scm_branch': self.scm_branch, - 'scm_commit': self.scm_commit, - 'repo': self.repo, - 'branch': self.branch, - 'enabled': self.enabled - } - - -class Issue(db.Model): - """ - database model of issue - """ - issue = db.Column(db.String(80), primary_key=True) - repo = db.Column(db.String(80)) - branch = db.Column(db.String(80)) - - def 
__init__(self, issue, repo, branch): - self.issue = issue - self.repo = repo - self.branch = branch - - def __repr__(self): - return '' % (self.issue, self.repo, self.branch) - - def to_json(self): - """ - convert to json - """ - return {'issue': self.issue, 'repo': self.repo, 'branch': self.branch} diff --git a/patch-tracking/patch_tracking/database/reset_db.py b/patch-tracking/patch_tracking/database/reset_db.py deleted file mode 100644 index 7581dea51c37ae215585b59826e42704a62387e7..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/database/reset_db.py +++ /dev/null @@ -1,17 +0,0 @@ -""" -reset database -""" -from patch_tracking.app import app -from patch_tracking.database import reset_database - - -def reset(): - """ - reset database - """ - with app.app_context(): - reset_database() - - -if __name__ == "__main__": - reset() diff --git a/patch-tracking/patch_tracking/db.sqlite b/patch-tracking/patch_tracking/db.sqlite deleted file mode 100644 index aa4d6cc3dc7000855b726c6e0300b4cb556f13f1..0000000000000000000000000000000000000000 Binary files a/patch-tracking/patch_tracking/db.sqlite and /dev/null differ diff --git a/patch-tracking/patch_tracking/logging.conf b/patch-tracking/patch_tracking/logging.conf deleted file mode 100644 index 40edbe15461d905612e6927bdc3b535d992a568b..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/logging.conf +++ /dev/null @@ -1,29 +0,0 @@ -[loggers] -keys=root - -[handlers] -keys=console,logfile - -[formatters] -keys=simple - -[logger_root] -level=DEBUG -handlers=console,logfile - -[handler_console] -class=StreamHandler -level=DEBUG -formatter=simple -args=(sys.stdout,) - -[formatter_simple] -format=%(asctime)s|%(name)s|%(filename)s:%(lineno)d|%(threadName)s|%(levelname)s|%(message)s -datefmt= - - -[handler_logfile] -class=handlers.RotatingFileHandler -level=DEBUG -args=('patch-tracking.log', 'a', 1024*1024*100, 10) -formatter=simple diff --git 
a/patch-tracking/patch_tracking/patch-tracking b/patch-tracking/patch_tracking/patch-tracking deleted file mode 100755 index e70ea6751d4de7d77ad70c7dbecee5670f18a644..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/patch-tracking +++ /dev/null @@ -1,10 +0,0 @@ -#!/bin/bash - -app_file=`rpm -ql patch-tracking | grep app.py` -app_path=${app_file%/app.py} -chdir_path=${app_file%/patch_tracking/app.py} - -settings_file='/etc/patch-tracking/settings.conf' - -server=`grep 'LISTEN' $settings_file | awk -F'=' '{print $2}' | sed -e 's/^[ ]"*//g' -e "s/^'*//g" | sed -e 's/"*$//g' -e "s/'*$//g" | sed -e 's/^[ \t]*//g' | sed -e 's/[ \t]*$//g'` -uwsgi --master --https "${server},/etc/patch-tracking/self-signed.crt,/etc/patch-tracking/self-signed.key" --wsgi-file "${app_file}" --callable app --chdir "${chdir_path}" --threads 100 --lazy diff --git a/patch-tracking/patch_tracking/patch-tracking.service b/patch-tracking/patch_tracking/patch-tracking.service deleted file mode 100644 index 9293e003c51c44d95130dd118d7113efdc7bfe7b..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/patch-tracking.service +++ /dev/null @@ -1,16 +0,0 @@ -[Unit] -Description=uWSGI Emperor -After=syslog.target - -[Service] -ExecStart=/usr/bin/patch-tracking -RuntimeDirectory=patch-tracking -Restart=always -RestartSec=10 -KillSignal=SIGQUIT -Type=notify -StandardError=syslog -NotifyAccess=all - -[Install] -WantedBy=multi-user.target diff --git a/patch-tracking/patch_tracking/settings.conf b/patch-tracking/patch_tracking/settings.conf deleted file mode 100644 index 779a498a23953725cb218e86a5aaf8f42b3244d0..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/settings.conf +++ /dev/null @@ -1,17 +0,0 @@ -# server settings -LISTEN = "127.0.0.1:5001" - -# GitHub API settings -GITHUB_ACCESS_TOKEN = "" - -# Gitee API settings -GITEE_ACCESS_TOKEN = "" - -# Time interval -SCAN_DB_INTERVAL = 3600 - -# username -USER = "admin" - -# 
password -PASSWORD = "" diff --git a/patch-tracking/patch_tracking/task/__init__.py b/patch-tracking/patch_tracking/task/__init__.py deleted file mode 100644 index 0d39b9ef4c5e67d6428ebcf2aac379dc8a7222e1..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/task/__init__.py +++ /dev/null @@ -1,6 +0,0 @@ -""" -apscheduler init -""" -from flask_apscheduler import APScheduler - -scheduler = APScheduler() diff --git a/patch-tracking/patch_tracking/task/task.py b/patch-tracking/patch_tracking/task/task.py deleted file mode 100644 index 27408b437a578a6e49436888e86f882b8ae31ebd..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/task/task.py +++ /dev/null @@ -1,107 +0,0 @@ -""" -load job/task of tracking -""" -import datetime -import logging -from patch_tracking.task import scheduler -from patch_tracking.database.models import Tracking -from patch_tracking.util.github_api import GitHubApi -from patch_tracking.api.business import update_tracking - -logger = logging.getLogger(__name__) - - -def init(app): - """ - scheduler jobs init - """ - scan_db_interval = app.config['SCAN_DB_INTERVAL'] - scheduler.init_app(app) - scheduler.add_job( - id='Add Tracking job - Update DB', - func=patch_tracking_task, - trigger='interval', - args=(app, ), - seconds=int(scan_db_interval), - next_run_time=datetime.datetime.now() - ) - - scheduler.add_job( - id=str("Check empty commitID"), - func=check_empty_commit_id, - trigger='interval', - args=(app, ), - seconds=600, - next_run_time=datetime.datetime.now(), - misfire_grace_time=300, - ) - - scheduler.start() - - -def add_job(job_id, func, args): - """ - add job - """ - logger.info("Add Tracking job - %s", job_id) - scheduler.add_job( - id=job_id, func=func, args=args, trigger='date', run_date=datetime.datetime.now(), misfire_grace_time=600 - ) - - -def check_empty_commit_id(flask_app): - """ - check commit ID for empty tracking - """ - with flask_app.app_context(): - new_track = 
get_track_from_db() - github_api = GitHubApi() - for item in new_track: - if item.scm_commit: - continue - status, result = github_api.get_latest_commit(item.scm_repo, item.scm_branch) - if status == 'success': - commit_id = result['latest_commit'] - data = { - 'version_control': item.version_control, - 'repo': item.repo, - 'branch': item.branch, - 'enabled': item.enabled, - 'scm_commit': commit_id, - 'scm_branch': item.scm_branch, - 'scm_repo': item.scm_repo - } - update_tracking(data) - else: - logger.error( - 'Check empty CommitID: Fail to get latest commit id of scm_repo: %s scm_branch: %s. Return val: %s', - item.scm_repo, item.scm_branch, result - ) - - -def get_track_from_db(): - """ - query all trackings from database - """ - all_track = Tracking.query.filter_by(enabled=True) - return all_track - - -def patch_tracking_task(flask_app): - """ - add patch trackings to jobs - """ - with flask_app.app_context(): - all_track = get_track_from_db() - all_job_id = list() - for item in scheduler.get_jobs(): - all_job_id.append(item.id) - for track in all_track: - if track.branch.split('/')[0] != 'patch-tracking': - job_id = str(track.repo + ":" + track.branch) - if job_id not in all_job_id: - add_job( - job_id=job_id, - func='patch_tracking.task.task_apscheduler:upload_patch_to_gitee', - args=(track, ) - ) diff --git a/patch-tracking/patch_tracking/task/task_apscheduler.py b/patch-tracking/patch_tracking/task/task_apscheduler.py deleted file mode 100644 index 63d0f4228203c698670578452c9e835baf4297f9..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/task/task_apscheduler.py +++ /dev/null @@ -1,303 +0,0 @@ -""" -tracking job -""" -import logging -import base64 -import datetime -import time -import random -from sqlalchemy.exc import SQLAlchemyError -from patch_tracking.util.gitee_api import create_branch, upload_patch, create_gitee_issue -from patch_tracking.util.gitee_api import create_pull_request, get_path_content, upload_spec, create_spec 
-from patch_tracking.util.github_api import GitHubApi -from patch_tracking.database.models import Tracking -from patch_tracking.api.business import update_tracking, create_issue -from patch_tracking.task import scheduler -from patch_tracking.util.spec import Spec - -logger = logging.getLogger(__name__) - - -def upload_patch_to_gitee(track): - """ - upload a patch file to Gitee - """ - cur_time = datetime.datetime.now().strftime("%Y%m%d%H%M%S%f") - with scheduler.app.app_context(): - logger.info('[Patch Tracking %s] track.scm_commit_id: %s.', cur_time, track.scm_commit) - patch = get_scm_patch(track) - if patch: - issue = create_patch_issue_pr(patch, cur_time) - if issue: - create_issue_db(issue) - else: - logger.info('[Patch Tracking %s] No issue need to create.', cur_time) - else: - logger.debug('[Patch Tracking %s] No new commit.', cur_time) - - -def get_all_commit_info(scm_repo, db_commit, latest_commit): - """ - get all commit information between two commits - """ - commit_list = list() - github_api = GitHubApi() - - while db_commit != latest_commit: - status, result = github_api.get_commit_info(scm_repo, latest_commit) - logger.debug('get_commit_info: %s %s', status, result) - if status == 'success': - if 'parent' in result: - ret = github_api.get_patch(scm_repo, latest_commit, latest_commit) - logger.debug('get patch api ret: %s', ret) - if ret['status'] == 'success': - result['patch_content'] = ret['api_ret'] - # inverted insert commit_list - commit_list.insert(0, result) - else: - logger.error('Get scm: %s commit: %s patch failed. Result: %s', scm_repo, latest_commit, result) - - latest_commit = result['parent'] - else: - logger.info( - '[Patch Tracking] Successful get scm commit from %s to %s ID/message/time/patch.', db_commit, - latest_commit - ) - break - else: - logger.error( - '[Patch Tracking] Get scm: %s commit: %s ID/message/time failed. 
Result: %s', scm_repo, latest_commit, - result - ) - - return commit_list - - -def get_scm_patch(track): - """ - Traverse the Tracking data table to get the patch file of enabled warehouse. - Different warehouse has different acquisition methods - :return: - """ - github_api = GitHubApi() - scm_dict = dict( - scm_repo=track.scm_repo, - scm_branch=track.scm_branch, - scm_commit=track.scm_commit, - enabled=track.enabled, - repo=track.repo, - branch=track.branch, - version_control=track.version_control - ) - status, result = github_api.get_latest_commit(scm_dict['scm_repo'], scm_dict['scm_branch']) - logger.debug( - 'repo: %s branch: %s. get_latest_commit: %s %s', scm_dict['scm_repo'], scm_dict['scm_branch'], status, result - ) - - if status == 'success': - commit_id = result['latest_commit'] - if not scm_dict['scm_commit']: - data = { - 'version_control': scm_dict['version_control'], - 'repo': scm_dict['repo'], - 'branch': scm_dict['branch'], - 'enabled': scm_dict['enabled'], - 'scm_commit': commit_id, - 'scm_branch': scm_dict['scm_branch'], - 'scm_repo': scm_dict['scm_repo'] - } - update_tracking(data) - logger.info( - '[Patch Tracking] Scm_repo: %s Scm_branch: %s.Get latest commit ID: %s From commit ID: None.', - scm_dict['scm_repo'], scm_dict['scm_branch'], result['latest_commit'] - ) - else: - if commit_id != scm_dict['scm_commit']: - commit_list = get_all_commit_info(scm_dict['scm_repo'], scm_dict['scm_commit'], commit_id) - scm_dict['commit_list'] = commit_list - return scm_dict - logger.info( - '[Patch Tracking] Scm_repo: %s Scm_branch: %s.Get latest commit ID: %s From commit ID: %s. ' - 'Nothing need to do.', scm_dict['scm_repo'], scm_dict['scm_branch'], commit_id, scm_dict['scm_commit'] - ) - else: - logger.error( - '[Patch Tracking] Fail to get latest commit id of scm_repo: %s scm_branch: %s. 
Return val: %s', - scm_dict['scm_repo'], scm_dict['scm_branch'], result - ) - return None - - -def create_patch_issue_pr(patch, cur_time): - """ - Create temporary branches, submit files, and create PR and issue - :return: - """ - issue_dict = dict() - if not patch: - return None - - issue_dict['repo'] = patch['repo'] - issue_dict['branch'] = patch['branch'] - new_branch = 'patch-tracking/' + cur_time - result = create_branch(patch['repo'], patch['branch'], new_branch) - if result == 'success': - logger.info('[Patch Tracking %s] Successful create branch: %s', cur_time, new_branch) - else: - logger.error('[Patch Tracking %s] Fail to create branch: %s', cur_time, new_branch) - return None - patch_lst = list() - issue_table = "| Commit | Datetime | Message |\n| ------ | ------ | ------ |\n" - for latest_commit in patch['commit_list']: - scm_commit_url = '/'.join(['https://github.com', patch['scm_repo'], 'commit', latest_commit['commit_id']]) - latest_commit['message'] = latest_commit['message'].replace("\r", "").replace("\n", "") - issue_table += '| [{}]({}) | {} | {} |'.format( - latest_commit['commit_id'][0:7], scm_commit_url, latest_commit['time'], latest_commit['message'] - ) + '\n' - - patch_file_content = latest_commit['patch_content'] - post_data = { - 'repo': patch['repo'], - 'branch': new_branch, - 'latest_commit_id': latest_commit['commit_id'], - 'patch_file_content': str(patch_file_content), - 'cur_time': cur_time, - 'commit_url': scm_commit_url - } - result = upload_patch(post_data) - if result == 'success': - logger.info( - '[Patch Tracking %s] Successfully upload patch file of commit: %s', cur_time, latest_commit['commit_id'] - ) - else: - logger.error( - '[Patch Tracking %s] Fail to upload patch file of commit: %s', cur_time, latest_commit['commit_id'] - ) - return None - patch_lst.append(str(latest_commit['commit_id'])) - - result = upload_spec_to_repo(patch, patch_lst, cur_time) - if result == "success": - logger.info('[Patch Tracking %s] Successfully 
upload spec file.', cur_time) - else: - logger.error('[Patch Tracking %s] Fail to upload spec file. Result: %s', cur_time, result) - return None - - logger.debug(issue_table) - result = create_gitee_issue(patch['repo'], patch['branch'], issue_table, cur_time) - if result[0] == 'success': - issue_num = result[1] - logger.info('[Patch Tracking %s] Successfully create issue: %s', cur_time, issue_num) - - retry_count = 10 - while retry_count > 0: - ret = create_pull_request(patch['repo'], patch['branch'], new_branch, issue_num, cur_time) - if ret == 'success': - logger.info('[Patch Tracking %s] Successfully create PR of issue: %s.', cur_time, issue_num) - break - else: - logger.warning( - '[Patch Tracking %s] Fail to create PR of issue: %s. Result: %s', cur_time, issue_num, ret - ) - retry_count -= 1 - time.sleep(random.random() * 2) - continue - if retry_count == 0: - logger.error('[Patch Tracking %s] Fail to create PR of issue: %s.', cur_time, issue_num) - return None - - issue_dict['issue'] = issue_num - - data = { - 'version_control': patch['version_control'], - 'repo': patch['repo'], - 'branch': patch['branch'], - 'enabled': patch['enabled'], - 'scm_commit': patch['commit_list'][-1]['commit_id'], - 'scm_branch': patch['scm_branch'], - 'scm_repo': patch['scm_repo'] - } - try: - update_tracking(data) - except SQLAlchemyError as err: - logger.error('[Patch Tracking %s] Fail to update tracking: %s. Result: %s', cur_time, data, err) - else: - logger.error('[Patch Tracking %s] Fail to create issue: %s. 
Result: %s', cur_time, issue_table, result[1]) - return None - - return issue_dict - - -def upload_spec_to_repo(patch, patch_lst, cur_time): - """ - update and upload spec file - """ - new_branch = 'patch-tracking/' + cur_time - - _, repo_name = patch['repo'].split('/') - spec_file = repo_name + '.spec' - - patch_file_lst = [patch + '.patch' for patch in patch_lst] - - log_title = "{} patch-tracking".format(cur_time) - log_content = "append patch file of upstream repository from <{}> to <{}>".format(patch_lst[0], patch_lst[-1]) - - ret = get_path_content(patch['repo'], patch['branch'], spec_file) - if 'content' in ret: - spec_content = str(base64.b64decode(ret['content']), encoding='utf-8') - spec_sha = ret['sha'] - new_spec = modify_spec(log_title, log_content, patch_file_lst, spec_content) - result = update_spec_to_repo(patch['repo'], new_branch, cur_time, new_spec, spec_sha) - else: - spec_content = '' - new_spec = modify_spec(log_title, log_content, patch_file_lst, spec_content) - result = create_spec_to_repo(patch['repo'], new_branch, cur_time, new_spec) - - return result - - -def modify_spec(log_title, log_content, patch_file_lst, spec_content): - """ - modify spec file - """ - spec = Spec(spec_content) - return spec.update(log_title, log_content, patch_file_lst) - - -def update_spec_to_repo(repo, branch, cur_time, spec_content, spec_sha): - """ - update spec file - """ - ret = upload_spec(repo, branch, cur_time, spec_content, spec_sha) - if ret == 'success': - logger.info('[Patch Tracking %s] Successfully update spec file.', cur_time) - else: - logger.error('[Patch Tracking %s] Fail to update spec file. 
Result: %s', cur_time, ret) - - return ret - - -def create_spec_to_repo(repo, branch, cur_time, spec_content): - """ - create new spec file - """ - ret = create_spec(repo, branch, spec_content, cur_time) - if ret == 'success': - logger.info('[Patch Tracking %s] Successfully create spec file.', cur_time) - else: - logger.error('[Patch Tracking %s] Fail to create spec file. Result: %s', cur_time, ret) - - return ret - - -def create_issue_db(issue): - """ - create issue into database - """ - issue_num = issue['issue'] - tracking = Tracking.query.filter_by(repo=issue['repo'], branch=issue['branch']).first() - tracking_repo = tracking.repo - tracking_branch = tracking.branch - data = {'issue': issue_num, 'repo': tracking_repo, 'branch': tracking_branch} - logger.debug('issue data: %s', data) - create_issue(data) diff --git a/patch-tracking/patch_tracking/tests/issue_test.py b/patch-tracking/patch_tracking/tests/issue_test.py deleted file mode 100644 index c9554ebcc3a5ccc60688985b0b6e21f628ba85c2..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/tests/issue_test.py +++ /dev/null @@ -1,191 +0,0 @@ -# pylint: disable=R0801 -''' -Automated testing of the Issue interface, GET requests -''' -import unittest -import json -from patch_tracking.app import app -from patch_tracking.api.business import create_issue -from patch_tracking.database import reset_db -from patch_tracking.api.constant import ResponseCode - - -class TestIssue(unittest.TestCase): - ''' - Automated testing of the Issue interface, GET requests - ''' - def setUp(self) -> None: - ''' - Prepare the environment - :return: - ''' - self.client = app.test_client() - reset_db.reset() - - def test_none_data(self): - ''' - In the absence of data, the GET interface queries all the data - :return: - ''' - with app.app_context(): - - resp = self.client.get("/issue") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - 
self.assertEqual(ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - self.assertEqual(resp_dict.get("data"), [], msg="Error in data information return") - - def test_query_inserted_data(self): - ''' - The GET interface queries existing data - :return: - ''' - with app.app_context(): - data_insert = {"issue": "A", "repo": "A", "branch": "A"} - - create_issue(data_insert) - - resp = self.client.get("/issue?repo=A&branch=A") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - self.assertIn(data_insert, resp_dict.get("data"), msg="Error in data information return") - - def test_find_all_data(self): - ''' - The GET interface queries all the data - :return: - ''' - with app.app_context(): - data_insert_c = {"issue": "C", "repo": "C", "branch": "C"} - data_insert_d = {"issue": "D", "repo": "D", "branch": "D"} - create_issue(data_insert_c) - create_issue(data_insert_d) - resp = self.client.get("/issue") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, 
resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - self.assertIn(data_insert_c, resp_dict.get("data"), msg="Error in data information return") - self.assertIn(data_insert_d, resp_dict.get("data"), msg="Error in data information return") - - def test_find_nonexistent_data(self): - ''' - The GET interface queries data that does not exist - :return: - ''' - with app.app_context(): - - resp = self.client.get("/issue?repo=aa&branch=aa") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - self.assertEqual(resp_dict.get("data"), [], msg="Error in data information return") - - def test_get_error_parameters(self): - ''' - The get interface passes in the wrong parameter - :return: - ''' - with app.app_context(): - data_insert = {"issue": "BB", "repo": "BB", "branch": "BB"} - - create_issue(data_insert) - - resp = self.client.get("/issue?oper=BB&chcnsrb=BB") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.INPUT_PARAMETERS_ERROR, resp_dict.get("code"), msg="Error in status code return" - ) - - 
self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.INPUT_PARAMETERS_ERROR), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertEqual(resp_dict.get("data"), None, msg="Error in data information return") - - def test_get_interface_uppercase(self): - ''' - The get interface uppercase - :return: - ''' - with app.app_context(): - data_insert = {"issue": "CCC", "repo": "CCC", "branch": "CCC"} - - create_issue(data_insert) - - resp = self.client.get("/issue?RrPo=CCC&brANch=CCC") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.INPUT_PARAMETERS_ERROR, resp_dict.get("code"), msg="Error in status code return" - ) - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.INPUT_PARAMETERS_ERROR), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertEqual(resp_dict.get("data"), None, msg="Error in data information return") - - -if __name__ == '__main__': - unittest.main() diff --git a/patch-tracking/patch_tracking/tests/logging.conf b/patch-tracking/patch_tracking/tests/logging.conf deleted file mode 100644 index f153c42fef2e76db461f7128dc024da1c1f79a6e..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/tests/logging.conf +++ /dev/null @@ -1,22 +0,0 @@ -[loggers] -keys=root - -[handlers] -keys=console - -[formatters] -keys=simple - -[logger_root] -level=DEBUG -handlers=console - -[handler_console] -class=StreamHandler -level=DEBUG -formatter=simple -args=(sys.stdout,) - -[formatter_simple] -format=%(asctime)s - %(name)s - %(levelname)s - %(message)s -datefmt= diff --git 
a/patch-tracking/patch_tracking/tests/settings.conf b/patch-tracking/patch_tracking/tests/settings.conf deleted file mode 100644 index 960c2d2e5061ff6628a53d5273faaca384dbd034..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/tests/settings.conf +++ /dev/null @@ -1,17 +0,0 @@ -# server settings -LISTEN = "127.0.0.1:5001" - -# GitHub API settings -GITHUB_ACCESS_TOKEN = "" - -# Gitee API settings -GITEE_ACCESS_TOKEN = "" - -# Time interval -SCAN_DB_INTERVAL = 3600 - -# username -USER = "" - -# password -PASSWORD = "" diff --git a/patch-tracking/patch_tracking/tests/tracking_test.py b/patch-tracking/patch_tracking/tests/tracking_test.py deleted file mode 100644 index 79c1962c9e55763666a129d52074b901565a3b4a..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/tests/tracking_test.py +++ /dev/null @@ -1,432 +0,0 @@ -# -*- coding:utf-8 -*- -''' -Automated testing of the Tracking interface, including POST requests and GET requests -''' -import unittest -import json -from base64 import b64encode -from werkzeug.security import generate_password_hash -from patch_tracking.app import app -from patch_tracking.database import reset_db -from patch_tracking.api.business import create_tracking -from patch_tracking.api.constant import ResponseCode - - -class TestTracking(unittest.TestCase): - ''' - Automated testing of the Tracking interface, including POST requests and GET requests - ''' - def setUp(self) -> None: - ''' - Prepare the environment - :return: - ''' - self.client = app.test_client() - reset_db.reset() - app.config["USER"] = "hello" - app.config["PASSWORD"] = generate_password_hash("world") - - credentials = b64encode(b"hello:world").decode('utf-8') - self.auth = {"Authorization": f"Basic {credentials}"} - - def test_none_data(self): - ''' - In the absence of data, the GET interface queries all the data - :return: - ''' - with app.app_context(): - - resp = self.client.get("/tracking") - - resp_dict = 
json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - self.assertEqual(resp_dict.get("data"), [], msg="Error in data information return") - - def test_find_nonexistent_data(self): - ''' - The GET interface queries data that does not exist - :return: - ''' - with app.app_context(): - - resp = self.client.get("/tracking?repo=aa&branch=aa") - - resp_dict = json.loads(resp.data) - - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - self.assertEqual(resp_dict.get("data"), [], msg="Error in data information return") - - def test_insert_data(self): - ''' - The POST interface inserts data - :return: - ''' - data = { - "version_control": "github", - "scm_repo": "A", - "scm_branch": "A", - "scm_commit": "A", - "repo": "A", - "branch": "A", - "enabled": 0 - } - - resp = self.client.post("/tracking", json=data, content_type="application/json", headers=self.auth) - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - 
self.assertEqual(ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - - def test_query_inserted_data(self): - ''' - The GET interface queries existing data - :return: - ''' - with app.app_context(): - data_insert = { - "version_control": "github", - "scm_repo": "B", - "scm_branch": "B", - "scm_commit": "B", - "repo": "B", - "branch": "B", - "enabled": False - } - - create_tracking(data_insert) - - resp = self.client.get("/tracking?repo=B&branch=B") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - self.assertIn(data_insert, resp_dict.get("data"), msg="Error in data information return") - - def test_only_input_branch(self): - ''' - Get interface queries enter only BRANCH, not REPO - :return: - ''' - with app.app_context(): - data_insert = { - "version_control": "github", - "scm_repo": "C", - "scm_branch": "C", - "scm_commit": "C", - "repo": "C", - "branch": "C", - "enabled": 0 - } - - create_tracking(data_insert) - - resp = self.client.get("/tracking?branch=B") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format 
return") - self.assertEqual( - ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return" - ) - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertEqual(resp_dict.get("data"), None, msg="Error in data information return") - - def test_fewer_parameters(self): - ''' - When the POST interface passes in parameters, fewer parameters must be passed - :return: - ''' - data = {"version_control": "github", "scm_commit": "AA", "repo": "AA", "branch": "AA", "enabled": 1} - - resp = self.client.post("/tracking", json=data, content_type="application/json", headers=self.auth) - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.INPUT_PARAMETERS_ERROR, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.INPUT_PARAMETERS_ERROR), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertEqual(resp_dict.get("data"), None, msg="Error in data information return") - - def test_error_parameters_value(self): - ''' - The post interface passes in the wrong parameter - :return: - ''' - data = {"version_control": "github", "scm_commit": "AA", "repo": "AA", "branch": "AA", "enabled": "AA"} - - resp = self.client.post("/tracking", json=data, content_type="application/json", headers=self.auth) - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.INPUT_PARAMETERS_ERROR, resp_dict.get("code"), msg="Error in status code return") - - 
self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.INPUT_PARAMETERS_ERROR), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertEqual(resp_dict.get("data"), None, msg="Error in data information return") - - def test_post_error_parameters(self): - ''' - The post interface passes in the wrong parameter - :return: - ''' - data = {"version_control": "github", "scm_commit": "AA", "oper": "AA", "hcnarb": "AA", "enabled": "AA"} - - resp = self.client.post("/tracking", json=data, content_type="application/json", headers=self.auth) - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.INPUT_PARAMETERS_ERROR, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.INPUT_PARAMETERS_ERROR), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertEqual(resp_dict.get("data"), None, msg="Error in data information return") - - def test_get_error_parameters(self): - ''' - The get interface passes in the wrong parameter - :return: - ''' - with app.app_context(): - data_insert = { - "version_control": "github", - "scm_repo": "BB", - "scm_branch": "BB", - "scm_commit": "BB", - "repo": "BB", - "branch": "BB", - "enabled": True - } - - create_tracking(data_insert) - - resp = self.client.get("/tracking?oper=B&chcnsrb=B") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.INPUT_PARAMETERS_ERROR, resp_dict.get("code"), msg="Error in status code return" - ) - - self.assertIn("msg", resp_dict, msg="Error in data 
format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.INPUT_PARAMETERS_ERROR), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertEqual(resp_dict.get("data"), None, msg="Error in data information return") - - def test_update_data(self): - ''' - update data - :return: - ''' - with app.app_context(): - data_old = { - "version_control": "github", - "scm_repo": "str", - "scm_branch": "str", - "scm_commit": "str", - "repo": "string", - "branch": "string", - "enabled": False - } - - self.client.post("/tracking", json=data_old, content_type="application/json", headers=self.auth) - - data_new = { - "branch": "string", - "enabled": True, - "repo": "string", - "scm_branch": "string", - "scm_commit": "string", - "scm_repo": "string", - "version_control": "github", - } - - self.client.post("/tracking", json=data_new, content_type="application/json") - - resp = self.client.get("/tracking?repo=string&branch=string") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.SUCCESS), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertIsNotNone(resp_dict.get("data"), msg="Error in data information return") - #self.assertIn(data_new, resp_dict.get("data"), msg="Error in data information return") - - def test_get_interface_uppercase(self): - ''' - The get interface uppercase - :return: - ''' - with app.app_context(): - data_insert = { - "version_control": "github", - "scm_repo": "BBB", - "scm_branch": "BBB", - "scm_commit": "BBB", - "repo": "BBB", - "branch": "BBB", - "enabled": False 
- } - - create_tracking(data_insert) - - resp = self.client.get("/tracking?rep=BBB&BRAnch=BBB") - - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.INPUT_PARAMETERS_ERROR, resp_dict.get("code"), msg="Error in status code return" - ) - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.INPUT_PARAMETERS_ERROR), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertEqual(resp_dict.get("data"), None, msg="Error in data information return") - - def test_version_control_error(self): - ''' - The POST version control error - :return: - ''' - data = { - "version_control": "gitgitgit", - "scm_repo": "A", - "scm_branch": "A", - "scm_commit": "A", - "repo": "A", - "branch": "A", - "enabled": 0 - } - - resp = self.client.post("/tracking", json=data, content_type="application/json", headers=self.auth) - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.INPUT_PARAMETERS_ERROR, resp_dict.get("code"), msg="Error in status code return") - - self.assertIn("msg", resp_dict, msg="Error in data format return") - self.assertEqual( - ResponseCode.CODE_MSG_MAP.get(ResponseCode.INPUT_PARAMETERS_ERROR), - resp_dict.get("msg"), - msg="Error in status code return" - ) - - self.assertIn("data", resp_dict, msg="Error in data format return") - self.assertEqual(resp_dict.get("data"), None, msg="Error in data information return") - - def test_delete_data(self): - """ - The POST interface inserts data - :return: - """ - data = { - "version_control": "github", - "scm_repo": "test_delete", - "scm_branch": "test_delete", - "scm_commit": "test_delete", - "repo": "test_delete1", - "branch": "test_delete1", - "enabled": 0 - } - - 
self.client.post("/tracking", json=data, content_type="application/json", headers=self.auth) - - resp = self.client.delete("/tracking?repo=test_delete1&branch=test_delete1", content_type="application/json", headers=self.auth) - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.SUCCESS, resp_dict.get("code"), msg="Error in status code return") - - def test_delete_not_found(self): - """ - The DELETE interface returns DELETE_DB_NOT_FOUND when the record does not exist - :return: - """ - resp = self.client.delete("/tracking?repo=not_found1&branch=not_found1", content_type="application/json", headers=self.auth) - resp_dict = json.loads(resp.data) - self.assertIn("code", resp_dict, msg="Error in data format return") - self.assertEqual(ResponseCode.DELETE_DB_NOT_FOUND, resp_dict.get("code"), msg="Error in status code return") - - -if __name__ == '__main__': - unittest.main() diff --git a/patch-tracking/patch_tracking/util/__init__.py b/patch-tracking/patch_tracking/util/__init__.py deleted file mode 100644 index 34a27a793c356c83f9c03490a8a02eb25125b633..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/util/__init__.py +++ /dev/null @@ -1 +0,0 @@ -""" module of util """ diff --git a/patch-tracking/patch_tracking/util/gitee_api.py b/patch-tracking/patch_tracking/util/gitee_api.py deleted file mode 100644 index 4a0af0025560298fd529f9df81da43a8e363e208..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/util/gitee_api.py +++ /dev/null @@ -1,219 +0,0 @@ -""" -function of invoking Gitee API -""" -import base64 -import logging -import requests -from flask import current_app -from requests import exceptions - -logger = logging.getLogger(__name__) - -ORG_URL = "https://gitee.com/api/v5/orgs" -REPO_URL = "https://gitee.com/api/v5/repos" - - -def get_request(url, params): - """ - get request - """ - logger.debug("Get request, connect url: %s", url) - try: - response = requests.get(url, 
params=params) - return True, response - except exceptions.ConnectionError as err: - logger.error(err) - return False, 'connection error' - except IOError as err: - logger.error(err) - return False, 'IO error' - - -def post_request(url, data): - """ - post request - """ - logger.debug("Post request, connect url: %s", url) - try: - response = requests.post(url, data=data) - return True, response - except exceptions.ConnectionError as err: - logger.error(err) - return False, 'connection error' - except IOError as err: - logger.error(err) - return False, 'IO error' - - -def put_request(url, data): - """ - put request - """ - logger.debug("Put request, connect url: %s", url) - try: - response = requests.put(url, data=data) - return True, response - except exceptions.ConnectionError as err: - logger.error(err) - return False, 'connection error' - except IOError as err: - logger.error(err) - return False, 'IO error' - - -def get_user_info(token): - """ - get user info - """ - url = "https://gitee.com/api/v5/user" - gitee_token = token - param = {'access_token': gitee_token} - ret, ret_info = get_request(url, params=param) - if ret: - if ret_info.status_code == 200: - return True, ret_info.text - return False, ret_info.json() - - return False, ret_info - - -def get_path_content(repo, branch, path): - """ - get file content - """ - gitee_token = current_app.config['GITEE_ACCESS_TOKEN'] - url = '/'.join([REPO_URL, repo, 'contents', path]) - param = {'access_token': gitee_token, 'ref': branch} - _, ret_info = get_request(url, params=param) - return ret_info.json() - - -def create_branch(repo, branch, new_branch): - """ - create branch - """ - gitee_token = current_app.config['GITEE_ACCESS_TOKEN'] - url = '/'.join([REPO_URL, repo, 'branches']) - data = {'access_token': gitee_token, 'refs': branch, 'branch_name': new_branch} - ret, response = post_request(url, data=data) - if ret: - if response.status_code == 201: - return 'success' - return response.json() - - return response 
- - -def upload_patch(data): - """ - upload patch - """ - gitee_token = current_app.config['GITEE_ACCESS_TOKEN'] - patch_file_name = data['latest_commit_id'] + '.patch' - url = '/'.join([REPO_URL, data['repo'], 'contents', patch_file_name]) - content = base64.b64encode(data['patch_file_content'].encode("utf-8")) - message = '[patch tracking] ' + data['cur_time'] + ' - ' + data['commit_url'] + '\n' - data = {'access_token': gitee_token, 'content': content, 'message': message, 'branch': data['branch']} - ret, response = post_request(url, data=data) - if ret: - if response.status_code == 201: - return 'success' - return response.json() - - return response - - -def create_spec(repo, branch, spec_content, cur_time): - """ - create spec - """ - gitee_token = current_app.config['GITEE_ACCESS_TOKEN'] - owner, repo = repo.split('/') - spec_file_name = repo + '.spec' - url = '/'.join([REPO_URL, owner, repo, 'contents', spec_file_name]) - content = base64.b64encode(spec_content.encode("utf-8")) - message = '[patch tracking] ' + cur_time + ' - ' + 'create spec file' + '\n' - data = {'access_token': gitee_token, 'content': content, 'message': message, 'branch': branch} - ret, response = post_request(url, data=data) - if ret: - if response.status_code == 201: - return 'success' - return response.json() - - return response - - -def upload_spec(repo, branch, cur_time, spec_content, spec_sha): - """ - upload spec - """ - gitee_token = current_app.config['GITEE_ACCESS_TOKEN'] - owner, repo = repo.split('/') - spec_file_name = repo + '.spec' - url = '/'.join([REPO_URL, owner, repo, 'contents', spec_file_name]) - content = base64.b64encode(spec_content.encode("utf-8")) - message = '[patch tracking] ' + cur_time + ' - ' + 'update spec file' + '\n' - data = { - 'access_token': gitee_token, - 'owner': owner, - 'repo': repo, - 'path': spec_file_name, - 'content': content, - 'message': message, - 'branch': branch, - 'sha': spec_sha - } - ret, response = put_request(url, data=data) - if 
ret: - if response.status_code == 200: - return 'success' - return response.json() - - return response - - -def create_gitee_issue(repo, branch, issue_body, cur_time): - """ - create issue - """ - gitee_token = current_app.config['GITEE_ACCESS_TOKEN'] - owner, repo = repo.split('/') - url = '/'.join([REPO_URL, owner, 'issues']) - data = { - 'access_token': gitee_token, - 'repo': repo, - 'title': '[patch tracking] ' + branch + ' ' + cur_time, - 'body': issue_body - } - ret, response = post_request(url, data=data) - if ret: - if response.status_code == 201: - return 'success', response.json()['number'] - return 'error', response.json() - - return 'error', response - - -def create_pull_request(repo, branch, patch_branch, issue_num, cur_time): - """ - create pull request - """ - gitee_token = current_app.config['GITEE_ACCESS_TOKEN'] - owner, repo = repo.split('/') - url = '/'.join([REPO_URL, owner, repo, 'pulls']) - data = { - 'access_token': gitee_token, - 'repo': repo, - 'title': '[patch tracking] ' + cur_time, - 'head': patch_branch, - 'base': branch, - 'body': '#' + issue_num, - "prune_source_branch": "true" - } - ret, response = post_request(url, data=data) - if ret: - if response.status_code == 201: - return 'success' - return response.json() - - return response diff --git a/patch-tracking/patch_tracking/util/github_api.py b/patch-tracking/patch_tracking/util/github_api.py deleted file mode 100644 index 214d290f99931f6f210e473e2bbb295e8a7fb84c..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/util/github_api.py +++ /dev/null @@ -1,156 +0,0 @@ -""" -functionality of invoking GitHub API -""" -import time -import logging -import requests -from requests import exceptions -from flask import current_app - -logger = logging.getLogger(__name__) - - -def get_user_info(token): - """ - get user info - """ - url = "https://api.github.com/user" - count = 30 - token = 'token ' + token - headers = { - 'User-Agent': 'Mozilla/5.0', - 'Authorization': 
token, - 'Content-Type': 'application/json', - 'Connection': 'close', - 'method': 'GET', - 'Accept': 'application/json' - } - while count > 0: - try: - ret = requests.get(url, headers=headers) - if ret.status_code == 200: - return True, ret.text - return False, ret.json() - except exceptions.ConnectionError as err: - logger.warning(err) - time.sleep(10) - count -= 1 - continue - except UnicodeEncodeError: - return False, 'github token is bad credentials.' - except IOError as error: - return False, error - if count == 0: - logger.error('Fail to connect to github: %s after retry 30 times.', url) - return False, 'connect error' - - -class GitHubApi(object): - """ - Encapsulates GitHub functionality - """ - def __init__(self): - github_token = current_app.config['GITHUB_ACCESS_TOKEN'] - token = 'token ' + github_token - self.headers = { - 'User-Agent': 'Mozilla/5.0', - 'Authorization': token, - 'Content-Type': 'application/json', - 'Connection': 'close', - 'method': 'GET', - 'Accept': 'application/json' - } - - def api_request(self, url): - """ - request GitHub API - """ - logger.debug("Connect url: %s", url) - count = 30 - while count > 0: - try: - response = requests.get(url, headers=self.headers) - return response - except exceptions.ConnectionError as err: - logger.warning(err) - time.sleep(10) - count -= 1 - continue - except IOError as err: - logger.error(err) - return False - if count == 0: - logger.error('Fail to connect to github: %s after retry 30 times.', url) - return False - - def get_commit_info(self, repo_url, commit_id): - """ - get commit info - """ - res_dict = dict() - api_url = 'https://api.github.com/repos' - url = '/'.join([api_url, repo_url, 'commits', commit_id]) - ret = self.api_request(url) - if ret: - if ret.status_code == 200: - res_dict['commit_id'] = commit_id - res_dict['message'] = ret.json()['commit']['message'] - res_dict['time'] = ret.json()['commit']['author']['date'] - if 'parents' in ret.json() and ret.json()['parents']: - 
res_dict['parent'] = ret.json()['parents'][0]['sha'] - return 'success', res_dict - - logger.error('%s failed. Return val: %s', url, ret) - return 'error', ret.json() - return 'error', 'connect error' - - def get_latest_commit(self, repo_url, branch): - """ - get latest commit_ID, commit_message, commit_date - :param repo_url: - :param branch: - :return: res_dict - """ - api_url = 'https://api.github.com/repos' - url = '/'.join([api_url, repo_url, 'branches', branch]) - ret = self.api_request(url) - res_dict = dict() - if ret: - if ret.status_code == 200: - res_dict['latest_commit'] = ret.json()['commit']['sha'] - res_dict['message'] = ret.json()['commit']['commit']['message'] - res_dict['time'] = ret.json()['commit']['commit']['committer']['date'] - return 'success', res_dict - - logger.error('%s failed. Return val: %s', url, ret) - return 'error', ret.json() - - return 'error', 'connect error' - - def get_patch(self, repo_url, scm_commit, last_commit): - """ - get patch - """ - api_url = 'https://github.com' - if scm_commit != last_commit: - commit = scm_commit + '...' + last_commit + '.diff' - else: - commit = scm_commit + '^...' + scm_commit + '.diff' - ret_dict = dict() - - url = '/'.join([api_url, repo_url, 'compare', commit]) - ret = self.api_request(url) - if ret: - if ret.status_code == 200: - patch_content = ret.text - ret_dict['status'] = 'success' - ret_dict['api_ret'] = patch_content - else: - logger.error('%s failed. Return val: %s', url, ret) - ret_dict['status'] = 'error' - ret_dict['api_ret'] = ret.text - else: - ret_dict['status'] = 'error' - ret_dict['api_ret'] = 'fail to connect github by api.' 
- - return ret_dict diff --git a/patch-tracking/patch_tracking/util/spec.py b/patch-tracking/patch_tracking/util/spec.py deleted file mode 100644 index 84f6b9d23f119e893a4487b8481d7aaf39d0ea21..0000000000000000000000000000000000000000 --- a/patch-tracking/patch_tracking/util/spec.py +++ /dev/null @@ -1,121 +0,0 @@ -""" -functionality of modifying the spec file -""" - -import re - - -class Spec: - """ - functionality of updating the spec file - """ - def __init__(self, content): - self._lines = content.splitlines() - self.version = "0.0" - self.release = {"num": 0, "lineno": 0} - self.source_lineno = 0 - self.patch = {"threshold": 6000, "max_num": 0, "lineno": 0} - self.changelog_lineno = 0 - - # Guard against the empty-file case - if len(self._lines) == 0: - self._lines.append("") - - # Find the line number of the last occurrence of each directive - for i, line in enumerate(self._lines): - match_find = re.match(r"[ \t]*Version:[ \t]*([\d.]+)", line) - if match_find: - self.version = match_find[1] - continue - - match_find = re.match(r"[ \t]*Release:[ \t]*([\d.]+)", line) - if match_find: - self.release["num"] = int(match_find[1]) - self.release["lineno"] = i - continue - - match_find = re.match(r"[ \t]*%changelog", line) - if match_find: - self.changelog_lineno = i - continue - - match_find = re.match(r"[ \t]*Source([\d]*):", line) - if match_find: - self.source_lineno = i - continue - - match_find = re.match(r"[ \t]*Patch([\d]+):", line) - if match_find: - num = int(match_find[1]) - self.patch["lineno"] = 0 - if num > self.patch["max_num"]: - self.patch["max_num"] = num - self.patch["lineno"] = i - continue - - if self.patch["lineno"] == 0: - self.patch["lineno"] = self.source_lineno - - if self.patch["max_num"] < self.patch["threshold"]: - self.patch["max_num"] = self.patch["threshold"] - else: - self.patch["max_num"] += 1 - - def update(self, log_title, log_content, patches): - """ - Update items in spec file - """ - self.release["num"] += 1 - self._lines[self.release["lineno"] - ] = re.sub(r"[\d]+", str(self.release["num"]), 
self._lines[self.release["lineno"]]) - - log_title = "* " + log_title + " " + self.version + "-" + str(self.release["num"]) - log_content = "- " + log_content - self._lines.insert(self.changelog_lineno + 1, log_title + "\n" + log_content + "\n") - - patch_list = [] - for patch in patches: - patch_list.append("Patch" + str(self.patch["max_num"]) + ": " + patch) - self.patch["max_num"] += 1 - self._lines.insert(self.patch["lineno"] + 1, "\n".join(patch_list)) - - return self.__str__() - - def __str__(self): - return "\n".join(self._lines) - - -if __name__ == "__main__": - SPEC_CONTENT = """Name: diffutils -Version: 3.7 -Release: 3 - -Source: ftp://ftp.gnu.org/gnu/diffutils/diffutils-%{version}.tar.xz - -Patch: diffutils-cmp-s-empty.patch - -%changelog -* Mon Nov 11 2019 shenyangyang 3.7-3 -- DESC:delete unneeded comments - -* Thu Oct 24 2019 shenyangyang 3.7-2 -- Type:enhancement -""" - - s = Spec(SPEC_CONTENT) - s.update("Mon Nov 11 2019 patch-tracking", "DESC:add patch files", [ - "xxx.patch", - "yyy.patch", - ]) - - print(s) - - SPEC_CONTENT = """""" - - s = Spec(SPEC_CONTENT) - s.update("Mon Nov 11 2019 patch-tracking", "DESC:add patch files", [ - "xxx.patch", - "yyy.patch", - ]) - - print(s) diff --git a/patch-tracking/setup.py b/patch-tracking/setup.py deleted file mode 100644 index 3425bc1a8a624644ece91cd7e1b8ace4e74af6fb..0000000000000000000000000000000000000000 --- a/patch-tracking/setup.py +++ /dev/null @@ -1,25 +0,0 @@ -""" -setup about building of patch tracking -""" -import setuptools - -setuptools.setup( - name='patch-tracking', - version='1.0.0', - packages=setuptools.find_packages(), - url='https://openeuler.org/zh/', - license='Mulan PSL v2', - author='ChenYanpan', - author_email='chenyanpan@huawei.com', - description='This is a tool for automatically tracking upstream repository code patches', - requires=['requests', 'flask', 'flask_restx', 'Flask_SQLAlchemy', 'Flask_APScheduler'], - data_files=[ - ('/etc/patch-tracking/', 
['patch_tracking/settings.conf']), - ('/etc/patch-tracking/', ['patch_tracking/logging.conf']), - ('/var/patch-tracking/', ['patch_tracking/db.sqlite']), - ('/usr/bin/', ['patch_tracking/cli/patch-tracking-cli']), - ('/usr/bin/', ['patch_tracking/patch-tracking']), - ('/usr/bin/', ['patch_tracking/cli/generate_password']), - ('/usr/lib/systemd/system/', ['patch_tracking/patch-tracking.service']), - ], -)
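The deleted `github_api.py` above wraps every GitHub call in the same retry loop: up to 30 attempts, sleeping 10 seconds after each connection failure before giving up with `False`. A minimal, self-contained sketch of that pattern follows; the names `request_with_retry` and `fetch` are hypothetical, and the built-in `ConnectionError` stands in for `requests.exceptions.ConnectionError` so the sketch has no third-party dependency.

```python
import time


def request_with_retry(fetch, count=30, delay=10):
    """Call fetch() until it succeeds, retrying on ConnectionError.

    Mirrors the loop in GitHubApi.api_request: up to `count` attempts
    with a fixed `delay` between them; returns False when exhausted.
    """
    while count > 0:
        try:
            return fetch()
        except ConnectionError:
            # Transient network failure: wait, then spend one retry.
            time.sleep(delay)
            count -= 1
    return False
```

In the real module the callable would be something like `lambda: requests.get(url, headers=self.headers)`; extracting the loop this way lets the backoff logic be unit-tested with `delay=0` and a fake `fetch`.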