diff --git a/.gitignore b/.gitignore index ece2343af46ee4da2e69abb2801f9d8e491db24e..1e7ad161f8aa86d6bffaba3c0376809388a9204c 100644 --- a/.gitignore +++ b/.gitignore @@ -1,6 +1,7 @@ *.iml .idea .DS_Store +*.class assembly/target @@ -20,4 +21,14 @@ plugins/azkaban/linkis-jobtype/target/ plugins/linkis/linkis-appjoint-entrance/target/ sendemail-appjoint/sendemail-core/target/ -visualis-appjoint/appjoint/target/ \ No newline at end of file +visualis-appjoint/appjoint/target/ + +dss-user-manager/target/ +logs +### Example user template template +### Example user template + +# IntelliJ project files + +out +gen diff --git a/.vscode/settings.json b/.vscode/settings.json new file mode 100644 index 0000000000000000000000000000000000000000..0a9b7a79be01873fb12ae4b77bb81503435372d3 --- /dev/null +++ b/.vscode/settings.json @@ -0,0 +1,5 @@ +{ + "editor.formatOnPaste": true, + "editor.formatOnType": true, + "editor.formatOnSave": true +} \ No newline at end of file diff --git a/README-ZH.md b/README-ZH.md index 89b69eb635adaff434d3e7d170474d5be8a1dae0..bcfcfb709b671613a87d986f7d9f6840047f5f90 100644 --- a/README-ZH.md +++ b/README-ZH.md @@ -7,13 +7,15 @@ ## 引言 -DataSphere Studio(简称DSS)是微众银行大数据平台——WeDataSphere,自研的一站式数据应用开发管理门户。 +DataSphere Studio(简称DSS)是微众银行自研的一站式数据应用开发管理门户。 -基于 [**Linkis**](https://github.com/WeBankFinTech/Linkis) 计算中间件构建,可轻松整合上层各数据应用系统,让数据应用开发变得简洁又易用。 +基于插拔式的集成框架设计,及计算中间件 [**Linkis**](https://github.com/WeBankFinTech/Linkis) ,可轻松接入上层各种数据应用系统,让数据开发变得简洁又易用。 -DataSphere Studio定位为数据应用开发门户,闭环涵盖数据应用开发全流程。在统一的UI下,以工作流式的图形化拖拽开发体验,满足从数据导入、脱敏清洗、分析挖掘、质量检测、可视化展现、定时调度到数据输出应用等,数据应用开发全流程场景需求。 +在统一的UI下,DataSphere Studio以工作流式的图形化拖拽开发体验,将满足从数据交换、脱敏清洗、分析挖掘、质量检测、可视化展现、定时调度到数据输出应用等,数据应用开发全流程场景需求。 -借助于Linkis计算中间件的连接、复用与简化能力,DSS天生便具备了金融级高并发、高可用、多租户隔离和资源管控等执行与调度能力。 +**DSS通过插拔式的集成框架设计,让用户可以根据需要,简单快速替换DSS已集成的各种功能组件,或新增功能组件。** + +借助于 [**Linkis**](https://github.com/WeBankFinTech/Linkis) 计算中间件的连接、复用与简化能力,DSS天生便具备了金融级高并发、高可用、多租户隔离和资源管控等执行与调度能力。 ## 界面预览 @@ -37,10 +39,14 @@ DSS主要特点:       
 4、工作流调度工具——[Azkaban](https://azkaban.github.io/) +        **DSS插拔式的框架设计模式,允许用户快速替换DSS已集成的各个Web系统**。如:将Scriptis替换成Zeppelin,将Azkaban替换成DolphinScheduler。 + ![DSS一站式](images/zh_CN/readme/onestop.gif) ### 二、基于Linkis计算中间件,打造独有的AppJoint设计理念 +        AppJoint,是DSS可以简单快速集成各种上层Web系统的核心概念。 +        AppJoint——应用关节,定义了一套统一的前后台接入规范,可让外部数据应用系统快速简单地接入,成为DSS数据应用开发中的一环。        DSS通过串联多个AppJoint,编排成一条支持实时执行和定时调度的工作流,用户只需简单拖拽即可完成数据应用的全流程开发。 @@ -53,6 +59,10 @@ DSS主要特点: ### 四、已集成的数据应用组件 +        DSS通过实现多个AppJoint,已集成了丰富多样的各种上层数据应用系统,基本可满足用户的数据开发需求。 + +        **用户如果有需要,也可以轻松集成新的数据应用系统,以替换或丰富DSS的数据应用开发流程。** +        1、DSS的调度能力——Azkaban AppJoint            用户的很多数据应用,通常希望具备周期性的调度能力。 @@ -113,6 +123,21 @@ DSS主要特点:            空节点、子工作流节点。 +        8、**节点扩展** + +            **根据需要,用户可以简单快速替换DSS已集成的各种功能组件,或新增功能组件。** + +## Demo试用环境 + +       由于DataSphereStudio支持执行脚本风险较高,WeDataSphere Demo环境的隔离没有做完,考虑到大家都在咨询Demo环境,决定向社区先定向发放邀请码,接受企业和组织的试用申请。 + +       如果您想试用Demo环境,请加入DataSphere Studio社区用户群(**加群方式请翻到本文档末尾处**),联系团队成员获取邀请码。 + +       WeDataSphere Demo环境用户注册页面:https://sandbox.webank.com/wds/dss/#/register + +       WeDataSphere Demo环境登录页面:https://sandbox.webank.com/wds/dss/ + +       我们会尽快解决环境隔离问题,争取早日向社区完全开放WeDataSphere Demo环境。 ## 与类似系统对比 @@ -132,7 +157,7 @@ DSS主要特点: ## 快速安装使用 -点我进入[快速安装使用](docs/zh_CN/ch2/DSS快速安装使用文档.md) +点我进入[快速安装使用](docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md) ## 架构 @@ -140,11 +165,35 @@ DSS主要特点: ## 文档列表 +#### 1. 安装编译文档 + +[快速安装使用文档](docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md) + +[**DSS安装常见问题列表**](docs/zh_CN/ch1/DSS安装常见问题列表.md) + [DSS编译文档](docs/zh_CN/ch1/DSS编译文档.md) +#### 2. 使用文档 + +[快速使用文档](docs/zh_CN/ch3/DataSphere_Studio_QuickStart.md) + [用户手册](docs/zh_CN/ch3/DSS_User_Manual.md) -[外部系统快速接入DSS](docs/zh_CN/ch4/第三方系统接入DSS指南.md) +#### 3. 
AppJoint插件安装文档 + +**以下为手动安装相关插件的指南,DSS一键安装【标准版】已自动安装了以下插件,可忽略。** + +[DSS的Azkaban AppJoint插件安装指南](docs/zh_CN/ch4/如何接入调度系统Azkaban.md) + +[DSS的Qualitis AppJoint插件安装指南](https://github.com/WeBankFinTech/Qualitis/blob/master/docs/zh_CN/ch1/%E6%8E%A5%E5%85%A5%E5%B7%A5%E4%BD%9C%E6%B5%81%E6%8C%87%E5%8D%97.md) + +#### 4. 第三方系统如何接入文档 + +[DSS如何快速集成第三方系统](docs/zh_CN/ch4/第三方系统接入DSS指南.md) + +#### 5. 架构文档 + +[DSS工程发布到调度系统的架构设计](docs/zh_CN/ch4/DSS工程发布调度系统架构设计.md) 更多文档,敬请期待! @@ -154,4 +203,4 @@ DSS主要特点: ## License -DSS is under the Apache 2.0 license. See the [License](LICENSE) file for details. \ No newline at end of file +DSS is under the Apache 2.0 license. See the [License](LICENSE) file for details. diff --git a/assembly/pom.xml b/assembly/pom.xml index f41a9835759ba35f30cc872e475a2439c869dfb7..53958159640dc4d67cdbec05b2de3821ff1b8092 100644 --- a/assembly/pom.xml +++ b/assembly/pom.xml @@ -22,7 +22,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 @@ -103,7 +103,52 @@ com.fasterxml.jackson.core jackson-core - 2.9.6 + 2.10.0 + + + net.databinder.dispatch + dispatch-core_2.11 + 0.11.2 + + + net.databinder.dispatch + dispatch-json4s-jackson_2.11 + 0.11.2 + + + org.apache.htrace + htrace-core + 3.1.0-incubating + + + org.apache.commons + commons-math3 + 3.1.1 + + + org.apache.httpcomponents + httpclient + 4.5.4 + + + org.apache.httpcomponents + httpcore + 4.4.7 + + + com.ning + async-http-client + 1.8.10 + + + commons-beanutils + commons-beanutils + 1.7.0 + + + commons-beanutils + commons-beanutils-core + 1.8.0 diff --git a/assembly/src/main/assembly/assembly.xml b/assembly/src/main/assembly/assembly.xml index 56f67f1e259989c12a48d0a41f1c5969c3c179c9..de757f0d8c42c830a10b1e502ec2c37a88ef58ec 100644 --- a/assembly/src/main/assembly/assembly.xml +++ b/assembly/src/main/assembly/assembly.xml @@ -112,6 +112,16 @@ + + + ${project.parent.basedir}/dss-azkaban-scheduler-appjoint/target/ + + share/appjoints/schedulis + + *.zip + + + 
${project.parent.basedir}/dss-flow-execution-entrance/target/ diff --git a/bin/checkEnv.sh b/bin/checkEnv.sh new file mode 100644 index 0000000000000000000000000000000000000000..d51bd5ca2180724ceff987c9100c8d91393c471e --- /dev/null +++ b/bin/checkEnv.sh @@ -0,0 +1,45 @@ +#!/bin/sh +# +# Copyright 2019 WeBank +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +say() { + printf 'check command fail \n %s\n' "$1" +} + +err() { + say "$1" >&2 + exit 1 +} + +check_cmd() { + command -v "$1" > /dev/null 2>&1 +} + +need_cmd() { + if ! check_cmd "$1"; then + err "need '$1' (your linux command not found)" + fi +} +echo "<-----start to check used cmd---->" +need_cmd yum +need_cmd java +need_cmd mysql +need_cmd unzip +need_cmd expect +need_cmd telnet +need_cmd tar +need_cmd sed +need_cmd dos2unix +echo "<-----end to check used cmd---->" diff --git a/bin/checkServices.sh b/bin/checkServices.sh new file mode 100644 index 0000000000000000000000000000000000000000..72df04be43a2baf5327d621dd67f7a254ee8479b --- /dev/null +++ b/bin/checkServices.sh @@ -0,0 +1,91 @@ +# +# Copyright 2019 WeBank +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +#!/bin/sh +source ~/.bash_profile + +shellDir=`dirname $0` +workDir=`cd ${shellDir}/..;pwd` + +##load config +export LINKIS_DSS_CONF_FILE=${LINKIS_DSS_CONF_FILE:-"${workDir}/conf/config.sh"} +export DISTRIBUTION=${DISTRIBUTION:-"${workDir}/conf/config.sh"} +source ${LINKIS_DSS_CONF_FILE} +source ${DISTRIBUTION} + +MICRO_SERVICE_NAME=$1 +MICRO_SERVICE_IP=$2 +MICRO_SERVICE_PORT=$3 + +local_host="`hostname --fqdn`" + +ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') + +function isLocal(){ + if [ "$1" == "127.0.0.1" ];then + return 0 + elif [ $1 == "localhost" ]; then + return 0 + elif [ $1 == $local_host ]; then + return 0 + elif [ $1 == $ipaddr ]; then + return 0 + fi + return 1 +} + +function executeCMD(){ + isLocal $1 + flag=$? + echo "Is local "$flag + if [ $flag == "0" ];then + eval $2 + else + ssh -p $SSH_PORT $1 $2 + fi + +} + +#echo "<--------------------------------------------------------------------------->" +#echo "Start to Check if your microservice:$MICRO_SERVICE_NAME is normal via telnet" +#echo "" +#if ! 
executeCMD $SERVER_IP "test -e $DSS_INSTALL_HOME/$MICRO_SERVICE_NAME"; then +# echo "$MICRO_SERVICE_NAME is not installed,the check steps will be skipped" +# exit 0 +#fi +echo "===========================================================" +echo $MICRO_SERVICE_NAME +echo $MICRO_SERVICE_IP +echo $MICRO_SERVICE_PORT +echo "===========================================================" + +if [ $MICRO_SERVICE_NAME == "visualis-server" ] && [ $MICRO_SERVICE_IP == "127.0.0.1" ]; then + MICRO_SERVICE_IP=$ipaddr +fi + +result=`echo -e "\n" | telnet $MICRO_SERVICE_IP $MICRO_SERVICE_PORT 2>/dev/null | grep Connected | wc -l` +if [ $result -eq 1 ]; then + echo "$MICRO_SERVICE_NAME is ok." +else + echo "<--------------------------------------------------------------------------->" + echo "ERROR your $MICRO_SERVICE_NAME microservice did not start successfully !!! ERROR logs as follows :" + echo "PLEASE CHECK DETAIL LOG,LOCATION:$DSS_INSTALL_HOME/$MICRO_SERVICE_NAME/logs/linkis.out" + echo '<------------------------------------------------------------->' + executeCMD $MICRO_SERVICE_IP "tail -n 50 $DSS_INSTALL_HOME/$MICRO_SERVICE_NAME/logs/*.out" + echo '<-------------------------------------------------------------->' + echo "PLEASE CHECK DETAIL LOG,LOCATION:$DSS_INSTALL_HOME/$MICRO_SERVICE_NAME/logs/linkis.out" + exit 1 +fi + diff --git a/bin/install.sh b/bin/install.sh index 81770ba9839a9d2ebd09095aee5e8e2b74ad798e..7a0b46db6d2df9296c28d573efef434b49743d25 100644 --- a/bin/install.sh +++ b/bin/install.sh @@ -1,5 +1,21 @@ +# +# Copyright 2019 WeBank +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# #!/bin/sh #Actively load user env + source ~/.bash_profile shellDir=`dirname $0` @@ -22,7 +38,6 @@ elif [[ "$OSTYPE" == "win32" ]]; then echo "dss not support Windows operating system" exit 1 elif [[ "$OSTYPE" == "freebsd"* ]]; then - txt="" else echo "Operating system unknown, please tell us(submit issue) for better service" @@ -54,63 +69,108 @@ else fi } -function checkPythonAndJava(){ - python --version - isSuccess "execute python --version" - java -version - isSuccess "execute java --version" + +say() { + printf 'check command fail \n %s\n' "$1" } -function checkHadoopAndHive(){ - hdfs version - isSuccess "execute hdfs version" - hive --help - #isSuccess "execute hive -h" +err() { + say "$1" >&2 + exit 1 } -function checkSpark(){ - spark-submit --version - isSuccess "execute spark-submit --version" +check_cmd() { + command -v "$1" > /dev/null 2>&1 } -##install env:expect, -sudo yum install -y expect -isSuccess "install expect" +need_cmd() { + if ! 
check_cmd "$1"; then + err "need '$1' (command not found)" + fi +} -##install env:telnet, -sudo yum install -y telnet -isSuccess "install telnet" +#check env +sh ${workDir}/bin/checkEnv.sh +isSuccess "check env" ##load config echo "step1:load config" -source ${workDir}/conf/config.sh -source ${workDir}/conf/db.sh +export DSS_CONFIG_PATH=${DSS_CONFIG_PATH:-"${workDir}/conf/config.sh"} +export DSS_DB_CONFIG_PATH=${DSS_DB_CONFIG_PATH:-"${workDir}/conf/db.sh"} +export DISTRIBUTION=${DISTRIBUTION:-"${workDir}/conf/config.sh"} +source ${DSS_CONFIG_PATH} +source ${DSS_DB_CONFIG_PATH} +source ${DISTRIBUTION} isSuccess "load config" local_host="`hostname --fqdn`" +ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') + +function isLocal(){ + if [ "$1" == "127.0.0.1" ];then + return 0 + elif [ $1 == "localhost" ]; then + return 0 + elif [ $1 == $local_host ]; then + return 0 + elif [ $1 == $ipaddr ]; then + return 0 + fi + return 1 +} -##env check -echo "Please enter the mode selection such as: 1" -echo " 1: Simple" -echo " 2: Standard" -echo "" +function executeCMD(){ + isLocal $1 + flag=$? + if [ $flag == "0" ];then + echo "Is local execution:$2" + eval $2 + else + echo "Is remote execution:$2" + ssh -p $SSH_PORT $1 $2 + fi +} -INSTALL_MODE=1 +function copyFile(){ + isLocal $1 + flag=$? 
+ src=$2 + dest=$3 + if [ $flag == "0" ];then + echo "Is local cp " + eval "cp -r $src $dest" + else + echo "Is remote cp " + scp -r -P $SSH_PORT $src $1:$dest + fi +} -read -p "Please input the choice:" idx -if [[ '1' = "$idx" ]];then - INSTALL_MODE=1 - echo "You chose Simple installation mode" - #check for Java +##install mode choice +if [ "$INSTALL_MODE" == "" ];then + echo "Please enter the mode selection such as: 1" + echo " 1: Lite" + echo " 2: Simple" + echo " 3: Standard" + echo "" + read -p "Please input the choice:" idx + INSTALL_MODE=$idx +fi + +if [[ '1' = "$INSTALL_MODE" ]];then + echo "You chose lite installation mode" checkJava - #check for mysql SERVER_NAME=MYSQL EXTERNAL_SERVER_IP=$MYSQL_HOST EXTERNAL_SERVER_PORT=$MYSQL_PORT checkExternalServer - -elif [[ '2' = "$idx" ]];then - INSTALL_MODE=2 +elif [[ '2' = "$INSTALL_MODE" ]];then + echo "You chose simple installation mode" + checkJava + SERVER_NAME=MYSQL + EXTERNAL_SERVER_IP=$MYSQL_HOST + EXTERNAL_SERVER_PORT=$MYSQL_PORT + checkExternalServer +elif [[ '3' = "$INSTALL_MODE" ]];then echo "You chose Standard installation mode" #check for Java checkJava @@ -123,20 +183,23 @@ elif [[ '2' = "$idx" ]];then SERVER_NAME=Qualitis EXTERNAL_SERVER_IP=$QUALITIS_ADRESS_IP EXTERNAL_SERVER_PORT=$QUALITIS_ADRESS_PORT + if [[ $IGNORECHECK = "" ]];then checkExternalServer + fi #check azkaban serivice SERVER_NAME=AZKABAN EXTERNAL_SERVER_IP=$AZKABAN_ADRESS_IP EXTERNAL_SERVER_PORT=$AZKABAN_ADRESS_PORT + if [[ $IGNORECHECK = "" ]];then checkExternalServer - fi else echo "no choice,exit!" exit 1 fi -##env check -echo "Do you want to clear Dss table information in the database?" +##init db +echo "Do you want to clear DSS table information in the database?" echo " 1: Do not execute table-building statements" echo " 2: Dangerous! Clear all data and rebuild the tables."
echo "" @@ -155,66 +218,127 @@ else exit 1 fi +echo "create hdfs directory and local directory" +if [ "$WORKSPACE_USER_ROOT_PATH" != "" ] +then + localRootDir=$WORKSPACE_USER_ROOT_PATH + if [[ $WORKSPACE_USER_ROOT_PATH == file://* ]];then + localRootDir=${WORKSPACE_USER_ROOT_PATH#file://} + mkdir -p $localRootDir/$deployUser + sudo chmod -R 775 $localRootDir/$deployUser + elif [[ $WORKSPACE_USER_ROOT_PATH == hdfs://* ]];then + localRootDir=${WORKSPACE_USER_ROOT_PATH#hdfs://} + hdfs dfs -mkdir -p $localRootDir/$deployUser + hdfs dfs -chmod -R 775 $localRootDir/$deployUser + else + echo "does not support $WORKSPACE_USER_ROOT_PATH filesystem types" + fi +isSuccess "create $WORKSPACE_USER_ROOT_PATH directory" +fi + + +if [ "$RESULT_SET_ROOT_PATH" != "" ] +then + localRootDir=$RESULT_SET_ROOT_PATH + if [[ $RESULT_SET_ROOT_PATH == file://* ]];then + localRootDir=${RESULT_SET_ROOT_PATH#file://} + mkdir -p $localRootDir/$deployUser + sudo chmod -R 775 $localRootDir/$deployUser + elif [[ $RESULT_SET_ROOT_PATH == hdfs://* ]];then + localRootDir=${RESULT_SET_ROOT_PATH#hdfs://} + hdfs dfs -mkdir -p $localRootDir/$deployUser + hdfs dfs -chmod -R 775 $localRootDir/$deployUser + else + echo "does not support $RESULT_SET_ROOT_PATH filesystem types" + fi +isSuccess "create $RESULT_SET_ROOT_PATH directory" +fi + + +if [ "$WDS_SCHEDULER_PATH" != "" ] +then + localRootDir=$WDS_SCHEDULER_PATH + if [[ $WDS_SCHEDULER_PATH == file://* ]];then + localRootDir=${WDS_SCHEDULER_PATH#file://} + mkdir -p $localRootDir + sudo chmod -R 775 $localRootDir + elif [[ $WDS_SCHEDULER_PATH == hdfs://* ]];then + localRootDir=${WDS_SCHEDULER_PATH#hdfs://} + hdfs dfs -mkdir -p $localRootDir + hdfs dfs -chmod -R 775 $localRootDir + else + echo "does not support $WDS_SCHEDULER_PATH filesystem types" + fi +isSuccess "create $WDS_SCHEDULER_PATH directory" +fi + + ##init db if [[ '2' = "$MYSQL_INSTALL_MODE" ]];then - mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB 
--default-character-set=utf8 -e "source ${workDir}/db/dss_ddl.sql" - isSuccess "source linkis_ddl.sql" - LOCAL_IP="`ifconfig | grep 'inet' | grep -v '127.0.0.1' | cut -d: -f2 | awk '{ print $2}'`" - if [ $GATEWAY_INSTALL_IP == "127.0.0.1" ];then - echo "GATEWAY_INSTALL_IP is equals 127.0.0.1 ,we will change it to ip address" - GATEWAY_INSTALL_IP_2=$LOCAL_IP + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/dss_ddl.sql" + isSuccess "source dss_ddl.sql" + LOCAL_IP=$ipaddr + if [ $GATEWAY_INSTALL_IP == "127.0.0.1" ];then + echo "GATEWAY_INSTALL_IP is equals 127.0.0.1 ,we will change it to ip address" + GATEWAY_INSTALL_IP_2=$LOCAL_IP else - GATEWAY_INSTALL_IP_2=$GATEWAY_INSTALL_IP + GATEWAY_INSTALL_IP_2=$GATEWAY_INSTALL_IP fi - echo $GATEWAY_INSTALL_IP_2 + #echo $GATEWAY_INSTALL_IP_2 sed -i "s/GATEWAY_INSTALL_IP_2/$GATEWAY_INSTALL_IP_2/g" ${workDir}/db/dss_dml.sql sed -i "s/GATEWAY_PORT/$GATEWAY_PORT/g" ${workDir}/db/dss_dml.sql - if [ $AZKABAN_ADRESS_IP == "127.0.0.1" ];then - echo "AZKABAN_ADRESS_IP is equals 127.0.0.1 ,we will change it to ip address" - AZKABAN_ADRESS_IP_2=$LOCAL_IP - else - AZKABAN_ADRESS_IP_2=$AZKABAN_ADRESS_IP + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/dss_dml.sql" + isSuccess "source dss_dml.sql" + + if [[ '2' = "$INSTALL_MODE" ]] || [[ '3' = "$INSTALL_MODE" ]];then + echo "visualis support,visualis database will be initialized !" 
+ if [ $VISUALIS_NGINX_IP == "127.0.0.1" ]||[ $VISUALIS_NGINX_IP == "0.0.0.0" ];then + echo "VISUALIS_NGINX_IP is equals $VISUALIS_NGINX_IP ,we will change it to ip address" + VISUALIS_NGINX_IP_2=$LOCAL_IP + else + VISUALIS_NGINX_IP_2=$VISUALIS_NGINX_IP + fi + #echo $VISUALIS_NGINX_IP_2 + sed -i "s/VISUALIS_NGINX_IP_2/$VISUALIS_NGINX_IP_2/g" ${workDir}/db/visualis.sql + sed -i "s/VISUALIS_NGINX_PORT/$VISUALIS_NGINX_PORT/g" ${workDir}/db/visualis.sql + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/visualis.sql" + isSuccess "source visualis.sql" + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/davinci.sql" + isSuccess "source davinci.sql" fi - echo $AZKABAN_ADRESS_IP_2 - sed -i "s/AZKABAN_ADRESS_IP_2/$AZKABAN_ADRESS_IP_2/g" ${workDir}/db/dss_dml.sql - sed -i "s/AZKABAN_ADRESS_PORT/$AZKABAN_ADRESS_PORT/g" ${workDir}/db/dss_dml.sql - if [ $VISUALIS_NGINX_IP == "127.0.0.1" ]||[ $VISUALIS_NGINX_IP == "0.0.0.0" ];then - echo "VISUALIS_NGINX_IP is equals $VISUALIS_NGINX_IP ,we will change it to ip address" - VISUALIS_NGINX_IP_2=$LOCAL_IP - else - VISUALIS_NGINX_IP_2=$VISUALIS_NGINX_IP - fi - echo $VISUALIS_NGINX_IP_2 - sed -i "s/VISUALIS_NGINX_IP_2/$VISUALIS_NGINX_IP_2/g" ${workDir}/db/dss_dml.sql - sed -i "s/VISUALIS_NGINX_PORT/$VISUALIS_NGINX_PORT/g" ${workDir}/db/dss_dml.sql - mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/dss_dml.sql" - isSuccess "source linkis_dml.sql" - echo "Rebuild the table" -fi -##env check -echo "Do you want to clear davinci table information in the database ? If you have not installed davinci environment,you must input '2',if you have davinci installed,choice 1." -echo " 1: Do not execute table-building statements" -echo "WARN:" -echo " 2: Dangerous! Clear all data and rebuild the tables." 
-echo "" -DAVINCI_INSTALL_MODE=1 -read -p "Please input the choice:" idx -if [[ '2' = "$idx" ]];then - DAVINCI_INSTALL_MODE=2 - echo "You chose rebuild davinci's table !!! start rebuild all tables" - mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/davinci.sql" - isSuccess "source davinci.sql" - echo "" -elif [[ '1' = "$idx" ]];then - DAVINCI_INSTALL_MODE=1 - echo "You chose not execute table-building statements" - echo "" -else - echo "no choice,exit!" - exit 1 + if [[ '3' = "$INSTALL_MODE" ]];then + echo "azkaban and qualitis support, azkaban and qualitis database will be initialized !" + #azkaban + if [ $AZKABAN_ADRESS_IP == "127.0.0.1" ];then + echo "AZKABAN_ADRESS_IP is equals 127.0.0.1 ,we will change it to ip address" + AZKABAN_ADRESS_IP_2=$LOCAL_IP + else + AZKABAN_ADRESS_IP_2=$AZKABAN_ADRESS_IP + fi + echo $AZKABAN_ADRESS_IP_2 + sed -i "s/AZKABAN_ADRESS_IP_2/$AZKABAN_ADRESS_IP_2/g" ${workDir}/db/azkaban.sql + sed -i "s/AZKABAN_ADRESS_PORT/$AZKABAN_ADRESS_PORT/g" ${workDir}/db/azkaban.sql + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/azkaban.sql" + isSuccess "source azkaban.sql" + #qualitis + if [ $QUALITIS_ADRESS_IP == "127.0.0.1" ];then + echo "QUALITIS_ADRESS_IP is equals 127.0.0.1 ,we will change it to ip address" + QUALITIS_ADRESS_IP_2=$LOCAL_IP + else + QUALITIS_ADRESS_IP_2=$QUALITIS_ADRESS_IP + fi + echo $QUALITIS_ADRESS_IP_2 + sed -i "s/QUALITIS_ADRESS_IP_2/$QUALITIS_ADRESS_IP_2/g" ${workDir}/db/qualitis.sql + sed -i "s/QUALITIS_ADRESS_PORT/$QUALITIS_ADRESS_PORT/g" ${workDir}/db/qualitis.sql + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD -D$MYSQL_DB --default-character-set=utf8 -e "source ${workDir}/db/qualitis.sql" + isSuccess "source qualitis.sql" + fi fi +##Deal special symbol '#' +HIVE_META_PASSWORD=$(echo ${HIVE_META_PASSWORD//'#'/'\#'}) 
+MYSQL_PASSWORD=$(echo ${MYSQL_PASSWORD//'#'/'\#'}) ###linkis Eurkea info SERVER_IP=$EUREKA_INSTALL_IP @@ -236,24 +360,30 @@ then SERVER_IP=$local_host fi -if ! ssh -p $SSH_PORT $SERVER_IP test -e $SERVER_HOME; then - ssh -p $SSH_PORT $SERVER_IP "sudo mkdir -p $SERVER_HOME;sudo chown -R $deployUser:$deployUser $SERVER_HOME" +if ! executeCMD $SERVER_IP "test -e $SERVER_HOME"; then + executeCMD $SERVER_IP "sudo mkdir -p $SERVER_HOME;sudo chown -R $deployUser:$deployUser $SERVER_HOME" isSuccess "create the dir of $SERVERNAME" fi echo "$SERVERNAME-step2:copy install package" -scp -P $SSH_PORT ${workDir}/share/$PACKAGE_DIR/$SERVERNAME.zip $SERVER_IP:$SERVER_HOME +copyFile $SERVER_IP ${workDir}/share/$PACKAGE_DIR/$SERVERNAME.zip $SERVER_HOME + +if ! executeCMD $SERVER_IP "test -e $SERVER_HOME/lib"; then + copyFile $SERVER_IP ${workDir}/lib $SERVER_HOME +fi + +#copyFile $SERVER_IP ${workDir}/lib $SERVER_HOME isSuccess "copy ${SERVERNAME}.zip" -ssh -p $SSH_PORT $SERVER_IP "cd $SERVER_HOME/;rm -rf $SERVERNAME-bak; mv -f $SERVERNAME $SERVERNAME-bak" -ssh -p $SSH_PORT $SERVER_IP "cd $SERVER_HOME/;unzip $SERVERNAME.zip > /dev/null" -ssh -p $SSH_PORT $SERVER_IP "cd $workDir/;scp -r lib/* $SERVER_HOME/$SERVERNAME/lib" +executeCMD $SERVER_IP "cd $SERVER_HOME/;rm -rf $SERVERNAME-bak; mv -f $SERVERNAME $SERVERNAME-bak" +executeCMD $SERVER_IP "cd $SERVER_HOME/;unzip $SERVERNAME.zip > /dev/null" +executeCMD $SERVER_IP "cd $SERVER_HOME/;scp -r lib/* $SERVER_HOME/$SERVERNAME/lib" isSuccess "unzip ${SERVERNAME}.zip" echo "$SERVERNAME-step3:subsitution conf" SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/application.yml -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#port:.*#port: $SERVER_PORT#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#defaultZone:.*#defaultZone: $EUREKA_URL#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#hostname:.*#hostname: $SERVER_IP#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#port:.*#port: $SERVER_PORT#g\" 
$SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#defaultZone:.*#defaultZone: $EUREKA_URL#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#hostname:.*#hostname: $SERVER_IP#g\" $SERVER_CONF_PATH" isSuccess "subsitution conf of $SERVERNAME" } ##function end @@ -267,16 +397,16 @@ then SERVER_IP=$local_host fi -if ! ssh -p $SSH_PORT $SERVER_IP test -e $SERVER_HOME; then - ssh -p $SSH_PORT $SERVER_IP "sudo mkdir -p $SERVER_HOME;sudo chown -R $deployUser:$deployUser $SERVER_HOME" +if ! executeCMD $SERVER_IP "test -e $SERVER_HOME"; then + executeCMD $SERVER_IP "sudo mkdir -p $SERVER_HOME;sudo chown -R $deployUser:$deployUser $SERVER_HOME" isSuccess "create the dir of $SERVERNAME" fi echo "$SERVERNAME-step2:copy install package" -scp -P $SSH_PORT ${workDir}/share/$PACKAGE_DIR/$SERVERNAME.zip $SERVER_IP:$SERVER_HOME +copyFile $SERVER_IP ${workDir}/share/$PACKAGE_DIR/$SERVERNAME.zip $SERVER_HOME isSuccess "copy ${SERVERNAME}.zip" -ssh -p $SSH_PORT $SERVER_IP "cd $SERVER_HOME/;rm -rf $SERVERNAME-bak; mv -f $SERVERNAME $SERVERNAME-bak" -ssh -p $SSH_PORT $SERVER_IP "cd $SERVER_HOME/;unzip $SERVERNAME.zip > /dev/null" +executeCMD $SERVER_IP "cd $SERVER_HOME/;rm -rf $SERVERNAME-bak; mv -f $SERVERNAME $SERVERNAME-bak" +executeCMD $SERVER_IP "cd $SERVER_HOME/;unzip $SERVERNAME.zip > /dev/null" isSuccess "unzip ${SERVERNAME}.zip" } ##function end @@ -291,20 +421,20 @@ then SERVER_IP=$local_host fi -if ! ssh -p $SSH_PORT $SERVER_IP test -e $SERVER_HOME/$APPJOINTPARENT; then - ssh -p $SSH_PORT $SERVER_IP "sudo mkdir -p $SERVER_HOME/$APPJOINTPARENT;sudo chown -R $deployUser:$deployUser $SERVER_HOME/$APPJOINTPARENT" +if ! 
executeCMD $SERVER_IP "test -e $SERVER_HOME/$APPJOINTPARENT"; then + executeCMD $SERVER_IP "sudo mkdir -p $SERVER_HOME/$APPJOINTPARENT;sudo chown -R $deployUser:$deployUser $SERVER_HOME/$APPJOINTPARENT" isSuccess "create the dir of $SERVER_HOME/$APPJOINTPARENT;" fi echo "$APPJOINTNAME-step2:copy install package" -scp -P $SSH_PORT $workDir/share/appjoints/$APPJOINTNAME/*.zip $SERVER_IP:$SERVER_HOME/$APPJOINTPARENT +copyFile $SERVER_IP $workDir/share/appjoints/$APPJOINTNAME/*.zip $SERVER_HOME/$APPJOINTPARENT isSuccess "copy ${APPJOINTNAME}.zip" -ssh -p $SSH_PORT $SERVER_IP "cd $SERVER_HOME/$APPJOINTPARENT/;unzip -o dss-$APPJOINTNAME-appjoint.zip > /dev/null;rm -rf dss-$APPJOINTNAME-appjoint.zip" +executeCMD $SERVER_IP "cd $SERVER_HOME/$APPJOINTPARENT/;unzip -o dss-*-appjoint.zip > /dev/null;rm -rf dss-*-appjoint.zip" isSuccess "install ${APPJOINTNAME}.zip" } ##function end -##Dss-Server Install +##dss-Server install PACKAGE_DIR=dss/dss-server SERVERNAME=dss-server SERVER_IP=$DSS_SERVER_INSTALL_IP @@ -315,16 +445,19 @@ installPackage ###update Dss-Server linkis.properties echo "$SERVERNAME-step4:update linkis.properties" SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/linkis.properties -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.linkis.server.mybatis.datasource.url.*#wds.linkis.server.mybatis.datasource.url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}?characterEncoding=UTF-8#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.linkis.server.mybatis.datasource.username.*#wds.linkis.server.mybatis.datasource.username=$MYSQL_USER#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.linkis.server.mybatis.datasource.password.*#wds.linkis.server.mybatis.datasource.password=$MYSQL_PASSWORD#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.dss.appjoint.scheduler.azkaban.address.*#wds.dss.appjoint.scheduler.azkaban.address=http://${AZKABAN_ADRESS_IP}:${AZKABAN_ADRESS_PORT}#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT 
$SERVER_IP "sed -i \"s#wds.linkis.gateway.ip.*#wds.linkis.gateway.ip=$GATEWAY_INSTALL_IP#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.dataworlcloud.gateway.port.*#wds.dataworlcloud.gateway.port=$GATEWAY_PORT#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.dss.appjoint.scheduler.project.store.dir.*#wds.dss.appjoint.scheduler.project.store.dir=$WDS_SCHEDULER_PATH#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.server.mybatis.datasource.url.*#wds.linkis.server.mybatis.datasource.url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}?characterEncoding=UTF-8#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.server.mybatis.datasource.username.*#wds.linkis.server.mybatis.datasource.username=$MYSQL_USER#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.server.mybatis.datasource.password.*#wds.linkis.server.mybatis.datasource.password=$MYSQL_PASSWORD#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.dss.appjoint.scheduler.azkaban.address.*#wds.dss.appjoint.scheduler.azkaban.address=http://${AZKABAN_ADRESS_IP}:${AZKABAN_ADRESS_PORT}#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.gateway.ip.*#wds.linkis.gateway.ip=$GATEWAY_INSTALL_IP#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.gateway.port.*#wds.linkis.gateway.port=$GATEWAY_PORT#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.dss.appjoint.scheduler.project.store.dir.*#wds.dss.appjoint.scheduler.project.store.dir=$WDS_SCHEDULER_PATH#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "echo "$deployUser=$deployUser" >> $SERVER_HOME/$SERVERNAME/conf/token.properties" isSuccess "subsitution linkis.properties of $SERVERNAME" echo "<----------------$SERVERNAME:end------------------->" echo "" + +if [[ '2' = "$INSTALL_MODE" ]]||[[ '3' = "$INSTALL_MODE" ]];then ##Flow execution Install PACKAGE_DIR=dss/dss-flow-execution-entrance 
SERVERNAME=dss-flow-execution-entrance @@ -336,9 +469,9 @@ installPackage ###Update flow execution linkis.properties echo "$SERVERNAME-step4:update linkis.properties" SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/linkis.properties -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.linkis.entrance.config.logPath.*#wds.linkis.entrance.config.logPath=$WORKSPACE_USER_ROOT_PATH#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.linkis.resultSet.store.path.*#wds.linkis.resultSet.store.path=$RESULT_SET_ROOT_PATH#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.linkis.gateway.url.*#wds.linkis.gateway.url=http://${GATEWAY_INSTALL_IP}:${GATEWAY_PORT}#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.entrance.config.logPath.*#wds.linkis.entrance.config.logPath=$WORKSPACE_USER_ROOT_PATH#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.resultSet.store.path.*#wds.linkis.resultSet.store.path=$RESULT_SET_ROOT_PATH#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.gateway.url.*#wds.linkis.gateway.url=http://${GATEWAY_INSTALL_IP}:${GATEWAY_PORT}#g\" $SERVER_CONF_PATH" isSuccess "subsitution linkis.properties of $SERVERNAME" echo "<----------------$SERVERNAME:end------------------->" echo "" @@ -353,8 +486,8 @@ installPackage ###Update appjoint entrance linkis.properties echo "$SERVERNAME-step4:update linkis.properties" SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/linkis.properties -ssh $SERVER_IP "sed -i \"s#wds.linkis.entrance.config.logPath.*#wds.linkis.entrance.config.logPath=$WORKSPACE_USER_ROOT_PATH#g\" $SERVER_CONF_PATH" -ssh $SERVER_IP "sed -i \"s#wds.linkis.resultSet.store.path.*#wds.linkis.resultSet.store.path=$RESULT_SET_ROOT_PATH#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.entrance.config.logPath.*#wds.linkis.entrance.config.logPath=$WORKSPACE_USER_ROOT_PATH#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i 
\"s#wds.linkis.resultSet.store.path.*#wds.linkis.resultSet.store.path=$RESULT_SET_ROOT_PATH#g\" $SERVER_CONF_PATH" isSuccess "subsitution linkis.properties of $SERVERNAME" echo "<----------------$SERVERNAME:end------------------->" echo "" @@ -370,22 +503,25 @@ installVisualis echo "$SERVERNAME-step4:update linkis.properties" SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/linkis.properties if [ $VISUALIS_NGINX_IP == "127.0.0.1" ]||[ $VISUALIS_NGINX_IP == "0.0.0.0" ]; then - VISUALIS_NGINX_IP=$local_host + VISUALIS_NGINX_IP=$ipaddr +fi +if [ $VISUALIS_SERVER_INSTALL_IP == "127.0.0.1" ]||[ $VISUALIS_SERVER_INSTALL_IP == "0.0.0.0" ]; then + VISUALIS_SERVER_INSTALL_IP=$ipaddr fi -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.linkis.entrance.config.logPath.*#wds.linkis.entrance.config.logPath=$WORKSPACE_USER_ROOT_PATH#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.linkis.resultSet.store.path.*#wds.linkis.resultSet.store.path=$RESULT_SET_ROOT_PATH#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.dss.visualis.gateway.ip.*#wds.dss.visualis.gateway.ip=$GATEWAY_INSTALL_IP#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#wds.dss.visualis.gateway.port.*#wds.dss.visualis.gateway.port=$GATEWAY_PORT#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.entrance.config.logPath.*#wds.linkis.entrance.config.logPath=$WORKSPACE_USER_ROOT_PATH#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.linkis.resultSet.store.path.*#wds.linkis.resultSet.store.path=$RESULT_SET_ROOT_PATH#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.dss.visualis.gateway.ip.*#wds.dss.visualis.gateway.ip=$GATEWAY_INSTALL_IP#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#wds.dss.visualis.gateway.port.*#wds.dss.visualis.gateway.port=$GATEWAY_PORT#g\" $SERVER_CONF_PATH" SERVER_CONF_PATH=$SERVER_HOME/$SERVERNAME/conf/application.yml -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#address: 127.0.0.1#address: 
$VISUALIS_SERVER_INSTALL_IP#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#port: 9007#port: $VISUALIS_SERVER_PORT#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#url: http://0.0.0.0:0000/dss/visualis#url: http://$VISUALIS_NGINX_IP:$VISUALIS_NGINX_PORT/dss/visualis#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#address: 0.0.0.0#address: $VISUALIS_NGINX_IP#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#port: 0000#port: $VISUALIS_NGINX_PORT#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#defaultZone: http://127.0.0.1:20303/eureka/#defaultZone: http://$EUREKA_INSTALL_IP:$EUREKA_PORT/eureka/#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#url: jdbc:mysql://127.0.0.1:3306/xxx?characterEncoding=UTF-8#url: jdbc:mysql://$MYSQL_HOST:$MYSQL_PORT/$MYSQL_DB?characterEncoding=UTF-8#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#username: xxx#username: $MYSQL_USER#g\" $SERVER_CONF_PATH" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#password: xxx#password: $MYSQL_PASSWORD#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#address: 127.0.0.1#address: $VISUALIS_SERVER_INSTALL_IP#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#port: 9007#port: $VISUALIS_SERVER_PORT#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#url: http://0.0.0.0:0000/dss/visualis#url: http://$VISUALIS_NGINX_IP:$VISUALIS_NGINX_PORT/dss/visualis#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#address: 0.0.0.0#address: $VISUALIS_NGINX_IP#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#port: 0000#port: $VISUALIS_NGINX_PORT#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#defaultZone: http://127.0.0.1:20303/eureka/#defaultZone: http://$EUREKA_INSTALL_IP:$EUREKA_PORT/eureka/#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#url: jdbc:mysql://127.0.0.1:3306/xxx?characterEncoding=UTF-8#url: 
jdbc:mysql://$MYSQL_HOST:$MYSQL_PORT/$MYSQL_DB?characterEncoding=UTF-8#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#username: xxx#username: $MYSQL_USER#g\" $SERVER_CONF_PATH" +executeCMD $SERVER_IP "sed -i \"s#password: xxx#password: $MYSQL_PASSWORD#g\" $SERVER_CONF_PATH" isSuccess "subsitution linkis.properties of $SERVERNAME" echo "<----------------$SERVERNAME:end------------------->" echo "" @@ -397,10 +533,10 @@ APPJOINTNAME=datachecker installAppjoints echo "$APPJOINTNAME:subsitution conf" APPJOINTNAME_CONF_PATH_PATENT=$SERVER_HOME/$APPJOINTPARENT/$APPJOINTNAME/appjoint.properties -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#job.datachecker.jdo.option.url.*#job.datachecker.jdo.option.url=$HIVE_META_URL#g\" $APPJOINTNAME_CONF_PATH_PATENT" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#job.datachecker.jdo.option.username.*#job.datachecker.jdo.option.username=$HIVE_META_USER#g\" $APPJOINTNAME_CONF_PATH_PATENT" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#job.datachecker.jdo.option.password.*#job.datachecker.jdo.option.password=$HIVE_META_PASSWORD#g\" $APPJOINTNAME_CONF_PATH_PATENT" -isSuccess "subsitution conf of $SERVERNAME" +executeCMD $SERVER_IP "sed -i \"s#job.datachecker.jdo.option.url.*#job.datachecker.jdo.option.url=$HIVE_META_URL#g\" $APPJOINTNAME_CONF_PATH_PATENT" +executeCMD $SERVER_IP "sed -i \"s#job.datachecker.jdo.option.username.*#job.datachecker.jdo.option.username=$HIVE_META_USER#g\" $APPJOINTNAME_CONF_PATH_PATENT" +executeCMD $SERVER_IP "sed -i \"s#job.datachecker.jdo.option.password.*#job.datachecker.jdo.option.password=$HIVE_META_PASSWORD#g\" $APPJOINTNAME_CONF_PATH_PATENT" +isSuccess "subsitution conf of datachecker" echo "<----------------datachecker appjoint install end------------------->" echo "" echo "<----------------eventchecker appjoint install start------------------->" @@ -410,10 +546,10 @@ APPJOINTNAME=eventchecker installAppjoints echo "$APPJOINTNAME:subsitution conf" 
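Every `ssh -p $SSH_PORT host "cmd"` call in these hunks is replaced by `executeCMD`, which the patch defines in `start-all.sh`/`stop-all.sh`: it runs the command with `eval` when the target names the install machine itself and only falls back to SSH for genuinely remote hosts, so single-node installs no longer require passwordless SSH to localhost. A self-contained sketch of that dispatch (it uses `hostname -I` for brevity where the patch parses `ip addr` with awk; `SSH_PORT` defaults to 22 here as an assumption):

```shell
local_host=$(hostname --fqdn)
# First global IPv4 address; the patch derives this from `ip addr` + awk.
ipaddr=$(hostname -I 2>/dev/null | awk '{print $1}')

isLocal(){
    # Returns 0 (true) when $1 names this machine.
    case "$1" in
        127.0.0.1|localhost|"$local_host"|"$ipaddr") return 0 ;;
    esac
    return 1
}

executeCMD(){
    if isLocal "$1"; then
        eval "$2"                           # run in-process, no SSH round-trip
    else
        ssh -p "${SSH_PORT:-22}" "$1" "$2"  # remote execution, as before
    fi
}

executeCMD 127.0.0.1 "echo local-branch-taken"
```

The design trade-off is that the same command string must survive both `eval` locally and one extra level of shell parsing on the remote side, which is why the installer's `sed` arguments carry escaped inner quotes.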
APPJOINTNAME_CONF_PATH_PATENT=$SERVER_HOME/$APPJOINTPARENT/$APPJOINTNAME/appjoint.properties -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#msg.eventchecker.jdo.option.url.*#msg.eventchecker.jdo.option.url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}?characterEncoding=UTF-8#g\" $APPJOINTNAME_CONF_PATH_PATENT" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#msg.eventchecker.jdo.option.username.*#msg.eventchecker.jdo.option.username=$MYSQL_USER#g\" $APPJOINTNAME_CONF_PATH_PATENT" -ssh -p $SSH_PORT $SERVER_IP "sed -i \"s#msg.eventchecker.jdo.option.password.*#msg.eventchecker.jdo.option.password=$MYSQL_PASSWORD#g\" $APPJOINTNAME_CONF_PATH_PATENT" -isSuccess "subsitution conf of $SERVERNAME" +executeCMD $SERVER_IP "sed -i \"s#msg.eventchecker.jdo.option.url.*#msg.eventchecker.jdo.option.url=jdbc:mysql://${MYSQL_HOST}:${MYSQL_PORT}/${MYSQL_DB}?characterEncoding=UTF-8#g\" $APPJOINTNAME_CONF_PATH_PATENT" +executeCMD $SERVER_IP "sed -i \"s#msg.eventchecker.jdo.option.username.*#msg.eventchecker.jdo.option.username=$MYSQL_USER#g\" $APPJOINTNAME_CONF_PATH_PATENT" +executeCMD $SERVER_IP "sed -i \"s#msg.eventchecker.jdo.option.password.*#msg.eventchecker.jdo.option.password=$MYSQL_PASSWORD#g\" $APPJOINTNAME_CONF_PATH_PATENT" +isSuccess "subsitution conf of eventchecker" echo "<----------------$APPJOINTNAME:end------------------->" echo "" echo "<----------------visualis appjoint install start------------------->" @@ -422,6 +558,10 @@ APPJOINTNAME=visualis #visualis appjoint install installAppjoints echo "<----------------$APPJOINTNAME:end------------------->" +fi + +##lite and sample version does not install qualitis APPJoint and scheduis APPJoint +if [[ '3' = "$INSTALL_MODE" ]];then echo "" echo "<----------------qualitis appjoint install start------------------->" APPJOINTPARENT=dss-appjoints @@ -429,6 +569,15 @@ APPJOINTNAME=qualitis #qualitis appjoint install installAppjoints APPJOINTNAME_CONF_PATH_PATENT=$SERVER_HOME/$APPJOINTPARENT/$APPJOINTNAME/appjoint.properties -ssh -p 
$SSH_PORT $SERVER_IP "sed -i \"s#baseUrl=http://127.0.0.1:8090#baseUrl=http://$QUALITIS_ADRESS_IP:$QUALITIS_ADRESS_PORT#g\" $APPJOINTNAME_CONF_PATH_PATENT" +executeCMD $SERVER_IP "sed -i \"s#baseUrl=http://127.0.0.1:8090#baseUrl=http://$QUALITIS_ADRESS_IP:$QUALITIS_ADRESS_PORT#g\" $APPJOINTNAME_CONF_PATH_PATENT" isSuccess "subsitution conf of qualitis" -echo "<----------------$APPJOINTNAME:end------------------->" \ No newline at end of file +echo "<----------------$APPJOINTNAME:end------------------->" +echo "" +echo "<----------------schedulis appjoint install start------------------->" +APPJOINTPARENT=dss-appjoints +APPJOINTNAME=schedulis +#schedulis appjoint install +installAppjoints +isSuccess "subsitution conf of schedulis" +echo "<----------------$APPJOINTNAME:end------------------->" +fi diff --git a/bin/start-all.sh b/bin/start-all.sh index abc0e64ed90e4451dc21af31330af7cb5bff803d..98a07f6bdc1c83612eab8fa71ffd162fa104f539 100644 --- a/bin/start-all.sh +++ b/bin/start-all.sh @@ -15,22 +15,24 @@ # limitations under the License. # - - # Start all dss applications info="We will start all dss applications, it will take some time, please wait" echo ${info} #Actively load user env +source /etc/profile source ~/.bash_profile -workDir=`dirname "${BASH_SOURCE-$0}"` -workDir=`cd "$workDir"; pwd` +shellDir=`dirname $0` +workDir=`cd ${shellDir}/..;pwd` -CONF_DIR="${workDir}"/../conf -CONF_FILE=${CONF_DIR}/config.sh +CONF_DIR="${workDir}"/conf +export LINKIS_DSS_CONF_FILE=${LINKIS_DSS_CONF_FILE:-"${CONF_DIR}/config.sh"} +export DISTRIBUTION=${DISTRIBUTION:-"${CONF_DIR}/config.sh"} +source $LINKIS_DSS_CONF_FILE +source ${DISTRIBUTION} function isSuccess(){ if [ $? 
-ne 0 ]; then echo "ERROR: " + $1 @@ -39,20 +41,43 @@ else echo "INFO:" + $1 fi } +local_host="`hostname --fqdn`" -sudo yum -y install dos2unix +ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') + +function isLocal(){ + if [ "$1" == "127.0.0.1" ];then + return 0 + elif [ $1 == "localhost" ]; then + return 0 + elif [ $1 == $local_host ]; then + return 0 + elif [ $1 == $ipaddr ]; then + return 0 + fi + return 1 +} +function executeCMD(){ + isLocal $1 + flag=$? + echo "Is local "$flag + if [ $flag == "0" ];then + eval $2 + else + ssh -p $SSH_PORT $1 $2 + fi -local_host="`hostname --fqdn`" +} #if there is no LINKIS_INSTALL_HOME,we need to source config again if [ -z ${DSS_INSTALL_HOME} ];then echo "Warning: DSS_INSTALL_HOME does not exist, we will source config" - if [ ! -f "${CONF_FILE}" ];then + if [ ! -f "${LINKIS_DSS_CONF_FILE}" ];then echo "Error: can not find config file, start applications failed" exit 1 else - source ${CONF_FILE} + source ${LINKIS_DSS_CONF_FILE} fi fi @@ -60,12 +85,29 @@ function startApp(){ echo "<-------------------------------->" echo "Begin to start $SERVER_NAME" SERVER_BIN=${DSS_INSTALL_HOME}/${SERVER_NAME}/bin -SERVER_START_CMD="source ~/.bash_profile;cd ${SERVER_BIN}; dos2unix ./* > /dev/null 2>&1; dos2unix ../conf/* > /dev/null 2>&1;sh start-${SERVER_NAME}.sh > /dev/null 2>&1 &" +#echo $SERVER_BIN +SERVER_LOCAL_START_CMD="dos2unix ${SERVER_BIN}/* > /dev/null 2>&1; dos2unix ${SERVER_BIN}/../conf/* > /dev/null 2>&1;sh ${SERVER_BIN}/start-${SERVER_NAME}.sh > /dev/null 2>&1 &" +SERVER_REMOTE_START_CMD="source /etc/profile;source ~/.bash_profile;cd ${SERVER_BIN}; dos2unix ./* > /dev/null 2>&1; dos2unix ../conf/* > /dev/null 2>&1; sh start-${SERVER_NAME}.sh > /dev/null 2>&1" + +if test -z "$SERVER_IP" +then + SERVER_IP=$local_host +fi -if [ -n "${SERVER_IP}" ];then - ssh ${SERVER_IP} "${SERVER_START_CMD}" +if ! 
executeCMD $SERVER_IP "test -e $SERVER_BIN"; then + echo "<-------------------------------->" + echo "$SERVER_NAME is not installed,the start steps will be skipped" + echo "<-------------------------------->" + return +fi + +isLocal $SERVER_IP +flag=$? +echo "Is local "$flag +if [ $flag == "0" ];then + eval $SERVER_LOCAL_START_CMD else - ssh ${local_host} "${SERVER_START_CMD}" + ssh -p $SSH_PORT $SERVER_IP $SERVER_REMOTE_START_CMD fi isSuccess "End to start $SERVER_NAME" echo "<-------------------------------->" @@ -87,7 +129,61 @@ SERVER_NAME=linkis-appjoint-entrance SERVER_IP=$APPJOINT_ENTRANCE_INSTALL_IP startApp +#visualis-server SERVER_NAME=visualis-server SERVER_IP=$VISUALIS_SERVER_INSTALL_IP startApp +echo "" +echo "Start to check all dss microservice" +echo "" + +function checkServer(){ +echo "<-------------------------------->" +echo "Begin to check $SERVER_NAME" +if test -z "$SERVER_IP" +then + SERVER_IP=$local_host +fi + +SERVER_BIN=${SERVER_HOME}/${SERVER_NAME}/bin + +if ! executeCMD $SERVER_IP "test -e ${DSS_INSTALL_HOME}/${SERVER_NAME}"; then + echo "$SERVER_NAME is not installed,the checkServer steps will be skipped" + return +fi + +sh $workDir/bin/checkServices.sh $SERVER_NAME $SERVER_IP $SERVER_PORT +isSuccess "start $SERVER_NAME " +sleep 3 +echo "<-------------------------------->" +} + +#check dss-server +SERVER_NAME=dss-server +SERVER_IP=$DSS_SERVER_INSTALL_IP +SERVER_PORT=$DSS_SERVER_PORT +checkServer + + +#check dss-flow-execution-entrance +SERVER_NAME=dss-flow-execution-entrance +SERVER_IP=$FLOW_EXECUTION_INSTALL_IP +SERVER_PORT=$FLOW_EXECUTION_PORT +checkServer + +#check linkis-appjoint-entrance +SERVER_NAME=linkis-appjoint-entrance +SERVER_IP=$APPJOINT_ENTRANCE_INSTALL_IP +SERVER_PORT=$APPJOINT_ENTRANCE_PORT +checkServer + + +#check visualis-server +sleep 10 #visualis service need more time to register +SERVER_NAME=visualis-server +SERVER_IP=$VISUALIS_SERVER_INSTALL_IP +SERVER_PORT=$VISUALIS_SERVER_PORT +checkServer + +echo "DSS started 
successfully" diff --git a/bin/stop-all.sh b/bin/stop-all.sh index f1a2c36810bd590c5f1b17348b65fa082a2bf62d..838b9babc9218f313cc3b138533d20b4525db6cf 100644 --- a/bin/stop-all.sh +++ b/bin/stop-all.sh @@ -29,7 +29,12 @@ workDir=`cd "$workDir"; pwd` CONF_DIR="${workDir}"/../conf -CONF_FILE=${CONF_DIR}/config.sh +export LINKIS_DSS_CONF_FILE=${LINKIS_DSS_CONF_FILE:-"${CONF_DIR}/config.sh"} +export DISTRIBUTION=${DISTRIBUTION:-"${CONF_DIR}/config.sh"} +source ${DISTRIBUTION} + +local_host="`hostname --fqdn`" +ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') function isSuccess(){ if [ $? -ne 0 ]; then @@ -40,18 +45,40 @@ else fi } +function isLocal(){ + if [ "$1" == "127.0.0.1" ];then + return 0 + elif [ $1 == "localhost" ]; then + return 0 + elif [ $1 == $local_host ]; then + return 0 + elif [ $1 == $ipaddr ]; then + return 0 + fi + return 1 +} +function executeCMD(){ + isLocal $1 + flag=$? + echo "Is local "$flag + if [ $flag == "0" ];then + eval $2 + else + ssh -p $SSH_PORT $1 $2 + fi + +} -local_host="`hostname --fqdn`" #if there is no LINKIS_INSTALL_HOME,we need to source config again if [ -z ${DSS_INSTALL_HOME} ];then echo "Warning: DSS_INSTALL_HOME does not exist, we will source config" - if [ ! -f "${CONF_FILE}" ];then + if [ ! 
-f "${LINKIS_DSS_CONF_FILE}" ];then echo "Error: can not find config file, stop applications failed" exit 1 else - source ${CONF_FILE} + source ${LINKIS_DSS_CONF_FILE} fi fi @@ -59,13 +86,26 @@ function stopAPP(){ echo "<-------------------------------->" echo "Begin to stop $SERVER_NAME" SERVER_BIN=${DSS_INSTALL_HOME}/${SERVER_NAME}/bin -SERVER_STOP_CMD="source ~/.bash_profile;cd ${SERVER_BIN}; dos2unix ./* > /dev/null 2>&1; dos2unix ../conf/* > /dev/null 2>&1; sh stop-${SERVER_NAME}.sh" -if [ -n "${SERVER_IP}" ];then - ssh -p $SSH_PORT ${SERVER_IP} "${SERVER_STOP_CMD}" +SERVER_LOCAL_STOP_CMD="sh ${SERVER_BIN}/stop-${SERVER_NAME}.sh" +SERVER_REMOTE_STOP_CMD="source /etc/profile;source ~/.bash_profile;cd ${SERVER_BIN}; sh stop-${SERVER_NAME}.sh " +if test -z "$SERVER_IP" +then + SERVER_IP=$local_host +fi + +if ! executeCMD $SERVER_IP "test -e ${DSS_INSTALL_HOME}/${SERVER_NAME}"; then + echo "$SERVER_NAME is not installed,the stop steps will be skipped" + return +fi + +isLocal $SERVER_IP +flag=$? 
+echo "Is local "$flag +if [ $flag == "0" ];then + eval $SERVER_LOCAL_STOP_CMD else - ssh -p $SSH_PORT ${local_host} "${SERVER_STOP_CMD}" + ssh -p $SSH_PORT $SERVER_IP $SERVER_REMOTE_STOP_CMD fi -isSuccess "End to stop $SERVER_NAME" echo "<-------------------------------->" sleep 3 } @@ -84,7 +124,10 @@ stopAPP SERVER_NAME=linkis-appjoint-entrance SERVER_IP=$APPJOINT_ENTRANCE_INSTALL_IP stopAPP + #visualis-server SERVER_NAME=visualis-server SERVER_IP=$VISUALIS_SERVER_INSTALL_IP -stopAPP \ No newline at end of file +stopAPP + +echo "stop-all shell script executed completely" diff --git a/conf/config.sh b/conf/config.sh index b0310192ea4e029e2c059d3b156383777d884cee..74c913ddc6e6b61048eb4fe8b1670344cfa87d47 100644 --- a/conf/config.sh +++ b/conf/config.sh @@ -1,12 +1,13 @@ +#!/bin/sh + +shellDir=`dirname $0` +workDir=`cd ${shellDir}/..;pwd` + ### deploy user deployUser=hadoop ### The install home path of DSS,Must provided -DSS_INSTALL_HOME=/appcom/Install/DSS - -### Linkis EUREKA information. -EUREKA_INSTALL_IP=127.0.0.1 # Microservices Service Registration Discovery Center -EUREKA_PORT=20303 +DSS_INSTALL_HOME=$workDir ### Specifies the user workspace, which is used to store the user's script files and log files. ### Generally local directory @@ -14,11 +15,6 @@ WORKSPACE_USER_ROOT_PATH=file:///tmp/linkis/ ### Path to store job ResultSet:file or hdfs path RESULT_SET_ROOT_PATH=hdfs:///tmp/linkis -### 1、DataCheck APPJOINT,This service is used to provide DataCheck capability. -HIVE_META_URL=jdbc:mysql://127.0.0.1:3306/linkis?characterEncoding=UTF-8 -HIVE_META_USER=xxx -HIVE_META_PASSWORD=xxx - ################### The install Configuration of all Micro-Services ##################### # # NOTICE: @@ -43,6 +39,10 @@ APPJOINT_ENTRANCE_PORT=9005 FLOW_EXECUTION_INSTALL_IP=127.0.0.1 FLOW_EXECUTION_PORT=9006 +### Linkis EUREKA information. 
+EUREKA_INSTALL_IP=127.0.0.1 # Microservices Service Registration Discovery Center +EUREKA_PORT=20303 + ### Linkis Gateway information GATEWAY_INSTALL_IP=127.0.0.1 GATEWAY_PORT=9001 @@ -50,26 +50,31 @@ GATEWAY_PORT=9001 ### SSH Port SSH_PORT=22 -#for azkaban +### 1、DataCheck APPJOINT,This service is used to provide DataCheck capability. +HIVE_META_URL=jdbc:mysql://127.0.0.1:3306/hivemeta?characterEncoding=UTF-8 +HIVE_META_USER=xxx +HIVE_META_PASSWORD=xxx + +#Used to store the azkaban project transformed by DSS WDS_SCHEDULER_PATH=file:///appcom/tmp/wds/scheduler ###The IP address and port are written into the database here, so be sure to plan ahead ## visualis-server VISUALIS_SERVER_INSTALL_IP=127.0.0.1 VISUALIS_SERVER_PORT=9007 -### visualis nginx acess ip -VISUALIS_NGINX_IP=0.0.0.0 -VISUALIS_NGINX_PORT=9009 +### visualis nginx acess ip,keep consistent with DSS front end +VISUALIS_NGINX_IP=127.0.0.1 +VISUALIS_NGINX_PORT=8088 ### Eventchecker APPJOINT ### This service is used to provide Eventchecker capability. it's config in db.sh same as dss-server. 
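`config.sh` is now reached through `LINKIS_DSS_CONF_FILE`, which `start-all.sh` and `stop-all.sh` export with a `${VAR:-default}` expansion: an operator can point every script at an alternate config by exporting the variable once, and the bundled `conf/config.sh` is used otherwise. A small sketch of the fallback behaviour (the paths are illustrative):

```shell
CONF_DIR=/opt/dss/conf

# With the variable unset, ${VAR:-default} yields the bundled config.sh.
unset LINKIS_DSS_CONF_FILE
export LINKIS_DSS_CONF_FILE=${LINKIS_DSS_CONF_FILE:-"${CONF_DIR}/config.sh"}
echo "$LINKIS_DSS_CONF_FILE"    # /opt/dss/conf/config.sh

# A pre-set value wins over the default.
LINKIS_DSS_CONF_FILE=/etc/dss/custom.sh
export LINKIS_DSS_CONF_FILE=${LINKIS_DSS_CONF_FILE:-"${CONF_DIR}/config.sh"}
echo "$LINKIS_DSS_CONF_FILE"    # /etc/dss/custom.sh
```

`${VAR:-default}` also treats an empty (but set) variable as unset; the stricter `${VAR-default}` would keep an explicit empty value.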
#azkaban address for check AZKABAN_ADRESS_IP=127.0.0.1 -AZKABAN_ADRESS_PORT=8091 +AZKABAN_ADRESS_PORT=8081 #qualitis.address for check QUALITIS_ADRESS_IP=127.0.0.1 QUALITIS_ADRESS_PORT=8090 -DSS_VERSION=0.5.0 \ No newline at end of file +DSS_VERSION=0.9.1 diff --git a/datachecker-appjoint/pom.xml b/datachecker-appjoint/pom.xml index ce032823da436f2c28887087e771a8dc934a3aeb..d609c3ab32df5450344f7810e0217c784b1db75b 100644 --- a/datachecker-appjoint/pom.xml +++ b/datachecker-appjoint/pom.xml @@ -22,7 +22,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 diff --git a/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataCheckerDao.java b/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataCheckerDao.java index d3caeac010d6a5ea717734e289cf76699c52d03d..d84a0d4bc620fe521b140816a82959e57c3481bf 100644 --- a/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataCheckerDao.java +++ b/datachecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/connector/DataCheckerDao.java @@ -158,8 +158,9 @@ public class DataCheckerDao { private PreparedStatement getStatement(Connection conn, String dataObject) throws SQLException { String dataScape = dataObject.contains("{") ? 
"Partition" : "Table"; - String dbName = dataObject.split("\\.")[0]; - String tableName = dataObject.split("\\.")[1]; + String[] dataObjectArray = dataObject.split("\\."); + String dbName = dataObjectArray[0]; + String tableName = dataObjectArray[1]; if(dataScape.equals("Partition")) { Pattern pattern = Pattern.compile("\\{([^\\}]+)\\}"); Matcher matcher = pattern.matcher(dataObject); @@ -174,11 +175,13 @@ public class DataCheckerDao { pstmt.setString(2, tableName); pstmt.setString(3, partitionName); return pstmt; - } else { + } else if(dataObjectArray.length == 2){ PreparedStatement pstmt = conn.prepareCall(SQL_SOURCE_TYPE_JOB_TABLE); pstmt.setString(1, dbName); pstmt.setString(2, tableName); return pstmt; + }else { + throw new SQLException("Incorrect input format for dataObject "+ dataObject); } } diff --git a/db/azkaban.sql b/db/azkaban.sql new file mode 100644 index 0000000000000000000000000000000000000000..7f18b33087b3c3f989c25ac87b5d32d0f4573435 --- /dev/null +++ b/db/azkaban.sql @@ -0,0 +1,4 @@ +INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'schedulis', NULL, '0', '1', NULL, '0', NULL, NULL, '1', NULL, NULL); +UPDATE `dss_application` SET url = 'http://AZKABAN_ADRESS_IP_2:AZKABAN_ADRESS_PORT', project_url = 'http://AZKABAN_ADRESS_IP_2:AZKABAN_ADRESS_PORT/manager?project=${projectName}',homepage_url = 'http://AZKABAN_ADRESS_IP_2:AZKABAN_ADRESS_PORT/homepage' WHERE `name` in ('schedulis'); +SELECT @shcedulis_id:=id FROM `dss_application` WHERE `name` = 'schedulis'; +insert into dss_workflow_node values(null,null,'linkis.shell.sh',@shcedulis_id,1,1,0,1,null); diff --git a/db/dss_ddl.sql b/db/dss_ddl.sql index 5c016e0b1b86ef881a742bf4aa9006382c254162..cdaf8fb1a7cfc700c75ec2511780fdc81ac410ee 100644 --- a/db/dss_ddl.sql +++ b/db/dss_ddl.sql @@ -154,6 +154,7 @@ CREATE TABLE `dss_project` ( `name` 
varchar(200) COLLATE utf8_bin DEFAULT NULL, `source` varchar(50) COLLATE utf8_bin DEFAULT NULL COMMENT 'Source of the dss_project', `description` text COLLATE utf8_bin, + `workspace_id` bigint(20) DEFAULT 1, `user_id` bigint(20) DEFAULT NULL, `create_time` datetime DEFAULT NULL, `create_by` bigint(20) DEFAULT NULL, @@ -294,3 +295,156 @@ CREATE TABLE `event_status` ( `msg_id` int(11) NOT NULL COMMENT '消息的最大消费id', PRIMARY KEY (`receiver`,`topic`,`msg_name`) ) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='消息消费状态表'; + + +-- ---------------------------- +-- Table structure for dss_workspace +-- ---------------------------- +DROP TABLE IF EXISTS `dss_workspace`; +CREATE TABLE `dss_workspace` ( + `id` bigint(20) NOT NULL AUTO_INCREMENT, + `name` varchar(255) DEFAULT NULL, + `label` varchar(255) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `department` varchar(255) DEFAULT NULL, + `product` varchar(255) DEFAULT NULL, + `source` varchar(255) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`), + UNIQUE KEY `name` (`name`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + +-- ---------------------------- +-- Table structure for dss_onestop_menu +-- ---------------------------- +DROP TABLE IF EXISTS `dss_onestop_menu`; +CREATE TABLE `dss_onestop_menu` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `name` varchar(64) DEFAULT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT 1, + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + +-- ---------------------------- +-- Table 
structure for dss_onestop_menu_application +-- ---------------------------- +DROP TABLE IF EXISTS `dss_onestop_menu_application`; +CREATE TABLE `dss_onestop_menu_application` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `application_id` int(20) DEFAULT NULL, + `onestop_menu_id` int(20) NOT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `desc_en` varchar(255) DEFAULT NULL, + `desc_cn` varchar(255) DEFAULT NULL, + `labels_en` varchar(255) DEFAULT NULL, + `labels_cn` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT NULL, + `access_button_en` varchar(64) DEFAULT NULL, + `access_button_cn` varchar(64) DEFAULT NULL, + `manual_button_en` varchar(64) DEFAULT NULL, + `manual_button_cn` varchar(64) DEFAULT NULL, + `manual_button_url` varchar(255) DEFAULT NULL, + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + +-- ---------------------------- +-- Table structure for dss_onestop_user_favorites +-- ---------------------------- +DROP TABLE IF EXISTS `dss_onestop_user_favorites`; +CREATE TABLE `dss_onestop_user_favorites` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `username` varchar(64) DEFAULT NULL, + `workspace_id` bigint(20) DEFAULT 1, + `menu_application_id` int(20) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + +-- ---------------------------- +-- Table structure for dss_homepage_demo_menu +-- ---------------------------- +DROP TABLE IF EXISTS `dss_homepage_demo_menu`; +CREATE TABLE `dss_homepage_demo_menu` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + 
`name` varchar(64) DEFAULT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT 1, + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + +-- ---------------------------- +-- Table structure for dss_homepage_demo_instance +-- ---------------------------- +DROP TABLE IF EXISTS `dss_homepage_demo_instance`; +CREATE TABLE `dss_homepage_demo_instance` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `menu_id` int(20) DEFAULT NULL, + `name` varchar(64) DEFAULT NULL, + `url` varchar(128) DEFAULT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT 1, + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `click_num` int(11) DEFAULT 0, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + +-- ---------------------------- +-- Table structure for dss_homepage_video +-- ---------------------------- +DROP TABLE IF EXISTS `dss_homepage_video`; +CREATE TABLE `dss_homepage_video` ( + `id` int(20) NOT NULL AUTO_INCREMENT, + `name` varchar(64) DEFAULT NULL, + `url` varchar(128) DEFAULT NULL, + `title_en` varchar(64) DEFAULT NULL, + `title_cn` varchar(64) DEFAULT NULL, + `description` varchar(255) DEFAULT NULL, + `is_active` tinyint(1) DEFAULT 1, + `icon` varchar(255) DEFAULT NULL, + `order` int(2) DEFAULT NULL, + `play_num` int(11) DEFAULT 0, + `create_by` varchar(255) DEFAULT NULL, + `create_time` datetime DEFAULT NULL, + `last_update_time` 
datetime DEFAULT NULL, + `last_update_user` varchar(30) DEFAULT NULL, + PRIMARY KEY (`id`) +) ENGINE=InnoDB DEFAULT CHARSET=utf8; + diff --git a/db/dss_dml.sql b/db/dss_dml.sql index 3ee9fca202377fc2a7d6ad8e3347d57bdef60902..78bf09bd45bd1d873cc0cc4b670e1e140790b888 100644 --- a/db/dss_dml.sql +++ b/db/dss_dml.sql @@ -1,21 +1,17 @@ INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'linkis', null, '0', '1', NULL, '0', '/home', NULL, '0', '/home', NULL); -INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'visualis', null, '0', '1', NULL, '0', NULL, NULL, '1', NULL, NULL); -INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'schedulis', NULL, '0', '1', NULL, '0', NULL, NULL, '1', NULL, NULL); INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'workflow', null, '0', '1', NULL, '0', '/workflow', NULL, '0', '/project', NULL); INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'console', null, '0', '1', NULL, '0', '/console', NULL, '0', '/console', NULL); SELECT @linkis_appid:=id from dss_application WHERE `name` = 'linkis'; -SELECT @visualis_appid:=id from dss_application WHERE `name` = 'visualis'; SELECT @workflow_appid:=id from dss_application WHERE `name` = 
'workflow'; INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.python.python', @linkis_appid, '1', '1', '0', '1', NULL); INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.spark.py', @linkis_appid, '1', '1', '0', '1', NULL); INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.spark.sql', @linkis_appid, '1', '1', '0', '1', NULL); INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.spark.scala', @linkis_appid, '1', '1', '0', '1', NULL); INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.hive.hql', @linkis_appid, '1', '1', '0', '1', NULL); +INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.jdbc.jdbc', @linkis_appid, '1', '1', '0', '1', NULL); INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.control.empty', @linkis_appid, '1', '1', '0', '0', NULL); INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) 
VALUES (NULL, NULL, 'linkis.appjoint.sendemail', @linkis_appid, '1', '1', '0', '0', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.visualis.display', @visualis_appid, '1', '1', '1', '1', NULL); -INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.visualis.dashboard', @visualis_appid, '1', '1', '1', '1', NULL); INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.eventchecker.eventsender', @linkis_appid, '1', '1', '0', '0', NULL); INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.eventchecker.eventreceiver', @linkis_appid, '1', '1', '0', '0', NULL); INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.datachecker', @linkis_appid, '1', '1', '0', '0', NULL); @@ -28,15 +24,7 @@ INSERT INTO `dss_project_taxonomy` (`id`, `name`, `description`, `creator_id`, ` INSERT INTO `dss_flow_taxonomy` (`id`, `name`, `description`, `creator_id`, `create_time`, `update_time`, `project_id`) VALUES (NULL, 'My workflow', NULL, NULL, NULL,NULL, '-1'); UPDATE `dss_application` SET url = 'http://GATEWAY_INSTALL_IP_2:GATEWAY_PORT' WHERE `name` in('linkis','workflow'); -UPDATE `dss_application` SET url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT' WHERE `name` in('visualis'); -UPDATE 
`dss_application` SET project_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/project/${projectId}',homepage_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/projects' WHERE `name` in('visualis'); -UPDATE `dss_application` SET url = 'http://AZKABAN_ADRESS_IP_2:AZKABAN_ADRESS_PORT', project_url = 'http://AZKABAN_ADRESS_IP_2:AZKABAN_ADRESS_PORT/manager?project=${projectName}',homepage_url = 'http://AZKABAN_ADRESS_IP_2:AZKABAN_ADRESS_PORT/homepage' WHERE `name` in('schedulis'); -UPDATE `dss_workflow_node` SET jump_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/project/${projectId}/display/${nodeId}' where node_type = 'linkis.appjoint.visualis.display'; -UPDATE `dss_workflow_node` SET jump_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/project/${projectId}/portal/${nodeId}/portalName/${nodeName}' where node_type = 'linkis.appjoint.visualis.dashboard'; - -INSERT INTO `linkis_application` (`id`, `name`, `chinese_name`, `description`) VALUES (NULL, 'visualis', NULL, NULL); -select @application_id:=id from `linkis_application` where `name` = 'visualis'; INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'spark.executor.instances', '取值范围:1-40,单位:个', '执行器实例最大并发数', @application_id, '2', 'NumInterval', '[1,40]', '0', '0', '2'); INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'spark.executor.cores', '取值范围:1-8,单位:个', '执行器核心个数', @application_id, '2', 'NumInterval', '[1,2]', '1', '0', '1'); INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'spark.executor.memory', '取值范围:3-15,单位:G', 
'执行器内存大小', @application_id, '3', 'NumInterval', '[3,15]', '0', '0', '3'); @@ -59,4 +47,62 @@ insert into `linkis_config_key_tree` VALUES(NULL,@key_id2,@tree_id1); insert into `linkis_config_key_tree` VALUES(NULL,@key_id3,@tree_id1); insert into `linkis_config_key_tree` VALUES(NULL,@key_id4,@tree_id1); insert into `linkis_config_key_tree` VALUES(NULL,@key_id5,@tree_id1); -insert into `linkis_config_key_tree` VALUES(NULL,@key_id6,@tree_id2); \ No newline at end of file +insert into `linkis_config_key_tree` VALUES(NULL,@key_id6,@tree_id2); + +#-----------------------jdbc------------------- + +select @application_id:=id from `linkis_application` where `name` = 'nodeexecution'; +INSERT INTO `linkis_application` (`id`, `name`, `chinese_name`, `description`) SELECT NULL,'nodeexecution',`chinese_name`,`description` FROM linkis_application WHERE @application_id IS NULL LIMIT 1 ; +select @jdbc_id:=id from `linkis_application` where `name` = 'jdbc'; + +INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'jdbc.url', '格式:', 'jdbc连接地址', @application_id, NULL , 'None', NULL , '0', '0', '1'); +INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'jdbc.username', NULL , 'jdbc连接用户名', @application_id, NULL, 'None', NULL , '0', '0', '1'); +INSERT INTO `linkis_config_key` (`id`, `key`, `description`, `name`, `application_id`, `default_value`, `validate_type`, `validate_range`, `is_hidden`, `is_advanced`, `level`) VALUES (NULL, 'jdbc.password', NULL , 'jdbc连接密码', @application_id, NULL , 'None', NULL , '0', '0', '1'); + +select @key_id1:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'jdbc.url'; +select @key_id2:=id from `linkis_config_key` where `application_id` = @application_id and 
`key` = 'jdbc.username'; +select @key_id3:=id from `linkis_config_key` where `application_id` = @application_id and `key` = 'jdbc.password'; + +SELECT @tree_id1:=t.id from linkis_config_tree t LEFT JOIN linkis_application a on t.application_id = a.id WHERE t.`name` = 'jdbc连接设置' and a.`name` = 'jdbc'; + +insert into `linkis_config_key_tree` VALUES(NULL,@key_id1,@tree_id1); +insert into `linkis_config_key_tree` VALUES(NULL,@key_id2,@tree_id1); +insert into `linkis_config_key_tree` VALUES(NULL,@key_id3,@tree_id1); + +INSERT INTO dss_workspace (id, name, label, description, department, product, source, create_by, create_time, last_update_time, last_update_user) VALUES (1, 'default', 'default', 'default user workspace', NULL, NULL, 'create by user', 'root', NULL, NULL, 'root'); + +INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 1, '工作流编辑执行', 'https://github.com/WeBankFinTech/DataSphereStudio', 'workflow edit execution', '工作流编辑执行', '工作流编辑执行', 1, NULL, 1, 0, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 1, '工作流串联可视化', 'https://github.com/WeBankFinTech/DataSphereStudio', 'workflow series visualization', '工作流串联可视化', '工作流串联可视化', 1, NULL, 2, 0, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 1, '工作流调度执行跑批', 'https://github.com/WeBankFinTech/DataSphereStudio', 'workflow scheduling execution run batch', '工作流调度执行跑批', '工作流调度执行跑批', 1, NULL, 3, 0, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, 
description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 2, '某业务日常运营报表', 'https://github.com/WeBankFinTech/DataSphereStudio', 'business daily operation report', '某业务日常运营报表', '某业务日常运营报表', 1, NULL, 1, 0, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 2, '某业务机器学习建模预测', 'https://github.com/WeBankFinTech/DataSphereStudio', 'business machine learning modeling prediction', '某业务机器学习建模预测', '某业务机器学习建模预测', 1, NULL, 2, 0, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 2, '某业务导出营销用户列表', 'https://github.com/WeBankFinTech/DataSphereStudio', 'business export marketing user list', '某业务导出营销用户列表', '某业务导出营销用户列表', 1, NULL, 3, 0, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 3, '数据大屏体验', 'https://github.com/WeBankFinTech/DataSphereStudio', 'data big screen experience', '数据大屏体验', '数据大屏体验', 1, NULL, 1, 0, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, 3, '数据仪表盘体验', 'https://github.com/WeBankFinTech/DataSphereStudio', 'data dashboard experience', '数据仪表盘体验', '数据仪表盘体验', 1, NULL, 2, 0, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_instance (id, menu_id, name, url, title_en, title_cn, description, is_active, icon, `order`, click_num, create_by, create_time, last_update_time, 
last_update_user) VALUES (NULL, 3, '可视化挂件快速体验', 'https://github.com/WeBankFinTech/DataSphereStudio', 'visual widgets quick experience', '可视化挂件快速体验', '可视化挂件快速体验', 1, NULL, 3, 0, NULL, NULL, NULL, NULL); + +INSERT INTO dss_homepage_demo_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (1, 'workflow', 'workflow', '工作流', '工作流', 1, NULL, 1, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (2, 'application', 'application', '应用场景', '应用场景', 1, NULL, 2, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_demo_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (3, 'visualization', 'visualization', '可视化', '可视化', 1, NULL, 3, NULL, NULL, NULL, NULL); + +INSERT INTO dss_homepage_video (id, name, url, title_en, title_cn, description, is_active, icon, `order`, play_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, '10秒教你搭建工作流', 'https://sandbox.webank.com/wds/dss/videos/1.mp4', '10 sec how to build workflow', '10秒教你搭建工作流', '10秒教你搭建工作流', 1, NULL, 1, 0, NULL, NULL, NULL, NULL); +INSERT INTO dss_homepage_video (id, name, url, title_en, title_cn, description, is_active, icon, `order`, play_num, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, '10秒教你发邮件', 'https://sandbox.webank.com/wds/dss/videos/1.mp4', '10 sec how to send email', '10秒教你发邮件', '10秒教你发邮件', 1, NULL, 2, 0, NULL, NULL, NULL, NULL); + +INSERT INTO dss_onestop_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (1, '应用开发', 'application development', '应用开发', '应用开发描述', 1, NULL, NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu (id, name, title_en, 
title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (2, '数据分析', 'data analysis', '数据分析', '数据分析描述', 1, NULL, NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (3, '生产运维', 'production operation', '生产运维', '生产运维描述', 1, NULL, NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (4, '数据质量', 'data quality', '数据质量', '数据质量描述', 1, NULL, NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu (id, name, title_en, title_cn, description, is_active, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (5, '管理员功能', 'administrator function', '管理员功能', '管理员功能描述', 0, NULL, NULL, NULL, NULL, NULL, NULL); + +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 1, 'workflow development', '工作流开发', 'Workflow development is a data application development tool created by WeDataSphere with Linkis as the kernel.', '工作流开发是微众银行微数域(WeDataSphere)打造的数据应用开发工具,以任意桥(Linkis)做为内核,将满足从数据交换、脱敏清洗、分析挖掘、质量检测、可视化展现、定时调度到数据输出等数据应用开发全流程场景需求。', 'workflow, data warehouse development', '工作流,数仓开发', 1, 'enter workflow development', '进入工作流开发', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-workflow|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, 
access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 1, 'StreamSQL development', 'StreamSQL开发', 'Real-time application development is a streaming solution jointly built by WeDataSphere, Boss big data team and China Telecom ctcloud Big data team.', '实时应用开发是微众银行微数域(WeDataSphere)、Boss直聘大数据团队 和 中国电信天翼云大数据团队 社区联合共建的流式解决方案,以 Linkis 做为内核,基于 Flink Engine 构建的批流统一的 Flink SQL,助力实时化转型。', 'streaming, realtime', '流式,实时', 0, 'under union construction', '联合共建中', 'related information', '相关资讯', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-scriptis|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 1, 'Data service development', '数据服务开发', 'Data service is a unified API service jointly built by WeDataSphere and Ihome Big data Team. 
With Linkis and DataSphere Studio as the kernel.', '数据服务是微众银行微数域(WeDataSphere)与 艾佳生活大数据团队 社区联合共建的统一API服务,以 Linkis 和 DataSphere Studio 做为内核,提供快速将 Scriptis 脚本生成数据API的能力,协助企业统一管理对内对外的API服务。', 'API, data service', 'API,数据服务', 0, 'under union construction', '联合共建中', 'related information', '相关资讯', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-scriptis|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 2, 'Scriptis', 'Scriptis', 'Scriptis is a one-stop interactive data exploration analysis tool built by WeDataSphere, uses Linkis as the kernel.', 'Scriptis是微众银行微数域(WeDataSphere)打造的一站式交互式数据探索分析工具,以任意桥(Linkis)做为内核,提供多种计算存储引擎(如Spark、Hive、TiSpark等)、Hive数据库管理功能、资源(如Yarn资源、服务器资源)管理、应用管理和各种用户资源(如UDF、变量等)管理的能力。', 'scripts development,IDE', '脚本开发,IDE', 1, 'enter Scriptis', '进入Scriptis', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-scriptis|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 2, 'Visualis', 'Visualis', 'Visualis is a data visualization BI tool based on Davinci, with Linkis as the kernel, it supports the analysis mode of data development exploration.', 'Visualis是基于宜信开源项目Davinci开发的数据可视化BI工具,以任意桥(Linkis)做为内核,支持拖拽式报表定义、图表联动、钻取、全局筛选、多维分析、实时查询等数据开发探索的分析模式,并做了水印、数据质量校验等金融级增强。', 'visualization, statement', '可视化,报表', 1, 'enter Visualis', '进入Visualis', 'user manual', 
'用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-visualis|rgb(0, 153, 255)', NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 3, 'Schedulis', 'Schedulis', 'Description for Schedulis.', 'Schedulis描述', 'scheduling, workflow', '调度,工作流', 1, 'enter Schedulis', '进入Schedulis', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-schedule|rgb(102, 102, 204)', NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 3, 'Application operation center', '应用运维中心', 'Description for Application operation center.', '应用运维中心描述', 'production, operation', '生产,运维', 0, 'enter application operation center', '进入应用运维中心', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-scriptis|rgb(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 4, 'Qualitis', 'Qualitis', 'Qualitis is a financial and one-stop data quality management platform that provides data quality model definition, visualization and monitoring of data quality results', 
'Qualitis是一套金融级、一站式的数据质量管理平台,提供了数据质量模型定义,数据质量结果可视化、可监控等功能,并用一整套统一的流程来定义和检测数据集的质量并及时报告问题。', 'product, operations', '生产,运维', 1, 'enter Qualitis', '进入Qualitis', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-qualitis|rgb(51, 153, 153)', NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 4, 'Exchangis', 'Exchangis', 'Exchangis is a lightweight, high scalability, data exchange platform, support for structured and unstructured data transmission between heterogeneous data sources.', 'Exchangis是一个轻量级的、高扩展性的数据交换平台,支持对结构化及无结构化的异构数据源之间的数据传输,在应用层上具有数据权限管控、节点服务高可用和多租户资源隔离等业务特性,而在数据层上又具有传输架构多样化、模块插件化和组件低耦合等架构特点。', 'user manual', '生产,运维', 1, 'enter Exchangis', '进入Exchangis', 'user manual', '用户手册', 'https://github.com/WeBankFinTech/DataSphereStudio', 'fi-exchange|(102, 102, 255)', NULL, NULL, NULL, NULL, NULL); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES (NULL, NULL, 5, 'Workspace management', '工作空间管理', NULL, NULL, NULL, NULL, 1, 'workspace management', '工作空间管理', null, null, null, 'fi-scriptis|rgb(102, 102, 255)', null, null, null, null, null); +INSERT INTO dss_onestop_menu_application (id, application_id, onestop_menu_id, title_en, title_cn, desc_en, desc_cn, labels_en, labels_cn, is_active, access_button_en, access_button_cn, manual_button_en, manual_button_cn, manual_button_url, icon, `order`, create_by, create_time, last_update_time, last_update_user) VALUES 
(NULL, NULL, 5, 'User resources management', '用户资源管理', NULL, NULL, NULL, NULL, 1, 'user resource management', '用户资源管理', null, null, null, 'fi-scriptis|rgb(102, 102, 255)', null, null, null, null, null); + diff --git a/db/qualitis.sql b/db/qualitis.sql new file mode 100644 index 0000000000000000000000000000000000000000..04fb15cfb2b3b5c88b0b89a5af5af1bd744a441e --- /dev/null +++ b/db/qualitis.sql @@ -0,0 +1,3 @@ +INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'qualitis', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT', '0', '1', NULL, '1', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT/#/projects/list?id=${projectId}&flow=true', NULL, '1', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT/#/dashboard', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT/qualitis/api/v1/redirect'); +SELECT @qualitis_appid:=id from dss_application WHERE `name` = 'qualitis'; +INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.qualitis', @qualitis_appid, NULL, '1', '0', '1', 'http://QUALITIS_ADRESS_IP_2:QUALITIS_ADRESS_PORT/#/addGroupTechniqueRule?tableType=1&id=${projectId}&ruleGroupId=${ruleGroupId}&nodeId=${nodeId}'); \ No newline at end of file diff --git a/db/visualis.sql b/db/visualis.sql new file mode 100644 index 0000000000000000000000000000000000000000..71070c7bdacd8b016b45cdc89e6951cb6193db66 --- /dev/null +++ b/db/visualis.sql @@ -0,0 +1,34 @@ +INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'visualis', null, '0', '1', NULL, '0', NULL, NULL, '1', NULL, NULL); +SELECT @visualis_appid:=id 
from dss_application WHERE `name` = 'visualis'; +INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.visualis.display', @visualis_appid, '1', '1', '1', '1', NULL); +INSERT INTO `dss_workflow_node` (`id`, `icon`, `node_type`, `application_id`, `submit_to_scheduler`, `enable_copy`, `should_creation_before_node`, `support_jump`, `jump_url`) VALUES (NULL, NULL, 'linkis.appjoint.visualis.dashboard', @visualis_appid, '1', '1', '1', '1', NULL); +UPDATE `dss_application` SET url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT' WHERE `name` in('visualis'); +UPDATE `dss_application` SET project_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/project/${projectId}',homepage_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/projects' WHERE `name` in('visualis'); +UPDATE `dss_workflow_node` SET jump_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/project/${projectId}/display/${nodeId}' where node_type = 'linkis.appjoint.visualis.display'; +UPDATE `dss_workflow_node` SET jump_url = 'http://VISUALIS_NGINX_IP_2:VISUALIS_NGINX_PORT/dss/visualis/#/project/${projectId}/portal/${nodeId}/portalName/${nodeName}' where node_type = 'linkis.appjoint.visualis.dashboard'; +INSERT INTO `linkis_application` (`id`, `name`, `chinese_name`, `description`) VALUES (NULL, 'visualis', NULL, NULL); +select @application_id:=id from `linkis_application` where `name` = 'visualis'; diff --git a/docs/en_US/ch1/DataSphereStudio_Compile_Manual.md b/docs/en_US/ch1/DataSphereStudio_Compile_Manual.md index da81b87fab4ab4c388d248cbc396f02b2cf9268f..d1b5408a2f5294da294e50551f412b3cb3711370 100644 --- 
a/docs/en_US/ch1/DataSphereStudio_Compile_Manual.md +++ b/docs/en_US/ch1/DataSphereStudio_Compile_Manual.md @@ -6,8 +6,8 @@ ```xml - 0.5.0 - 0.9.1 + 0.9.1 + 0.9.4 2.11.8 1.8 3.3.3 diff --git a/docs/en_US/ch2/Azkaban_LinkisJobType_Deployment_Manual.md b/docs/en_US/ch2/Azkaban_LinkisJobType_Deployment_Manual.md index 2560b0674bf7a0b9216783848e27ee3e5886ba98..33cb3808cb9e3f94bf5f3cfda26ef42ba82705df 100644 --- a/docs/en_US/ch2/Azkaban_LinkisJobType_Deployment_Manual.md +++ b/docs/en_US/ch2/Azkaban_LinkisJobType_Deployment_Manual.md @@ -2,7 +2,7 @@ ## 1. Ready work -1.Click [release](https://github.com/WeBankFinTech/DataSphereStudio/releases/download/0.5.0/linkis-jobtype-0.5.0.zip) to select the corresponding installation package to download: +1.Click [release](https://github.com/WeBankFinTech/DataSphereStudio/releases/download/0.8.0/linkis-jobtype-0.8.0.zip) to select the corresponding installation package to download: - linkis-jobtype-$version.zip diff --git a/docs/en_US/ch2/DSS Quick Installation Guide.md b/docs/en_US/ch2/DSS Quick Installation Guide.md index f8837393dc6149f4735e85151d6aa20553242d27..bc36886351d509e536cc5343fac4e15215c31913 100644 --- a/docs/en_US/ch2/DSS Quick Installation Guide.md +++ b/docs/en_US/ch2/DSS Quick Installation Guide.md @@ -17,7 +17,7 @@ DSS also implements the integration of many external systems, such as [Qualitis] DSS environment configuration can be divided into three steps, including basic software installation, backend environment configuration, and frontend environment configuration. The details are as below: ### 2.1 Frontend and backend basic software installation -Linkis standard version (above 0.9.1). How to install [Linkis](https://github.com/WeBankFinTech/Linkis/blob/master/docs/en_US/ch1/deploy.md) +Linkis standard version (above 0.9.4). How to install [Linkis](https://github.com/WeBankFinTech/Linkis/blob/master/docs/en_US/ch1/deploy.md) JDK (above 1.8.0_141). 
How to install [JDK](https://www.runoob.com/java/java-environment-setup.html) @@ -103,7 +103,7 @@ dss_port="8088" linkis_url="http://127.0.0.1:9001" # dss ip address -dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}') +dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') ``` The environment is ready, click me to enter ****[4. Installation and use](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/en_US/ch2/DSS%20Quick%20Installation%20Guide.md#four-installation-and-use)** @@ -111,7 +111,7 @@ The environment is ready, click me to enter ****[4. Installation and use](https: ## Three Standard DSS environment configuration preparation The standard DSS environment preparation is also divided into three parts, the frontEnd-end and back-end basic software installation, back-end environment preparation, and frontEnd-end environment preparation. The details are as follows: ### 3.1 frontEnd and BackEnd basic software installation -Linkis standard version (above 0.9.1), [How to install Linkis](https://github.com/WeBankFinTech/Linkis/blob/master/docs/en_US/ch1/deploy.md) +Linkis standard version (above 0.9.4), [How to install Linkis](https://github.com/WeBankFinTech/Linkis/blob/master/docs/en_US/ch1/deploy.md) JDK (above 1.8.0_141), How to install [JDK](https://www.runoob.com/java/java-environment-setup.html) @@ -219,7 +219,7 @@ dss_port="8088" linkis_url="http://127.0.0.1:9001" # dss ip address -dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}') +dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') ``` The environment is ready, click me to enter **[Four Installation and use](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/en_US/ch2/DSS%20Quick%20Installation%20Guide.md#four-installation-and-use)** diff 
--git a/docs/en_US/ch2/DSS_0.9.0_upgrade_notes.md b/docs/en_US/ch2/DSS_0.9.0_upgrade_notes.md new file mode 100644 index 0000000000000000000000000000000000000000..cee4ec591e07d0afd3fa1186e9d5f411f768457c --- /dev/null +++ b/docs/en_US/ch2/DSS_0.9.0_upgrade_notes.md @@ -0,0 +1,16 @@ +# DSS 0.9.0 upgrade notes + +In DSS 0.9.0, the concept of "workspace" is introduced. If you upgrade from DSS 0.7 or DSS 0.8 to DSS 0.9.0, the following adjustment is needed after completing platform deployment: the field `application_id` of table `dss_onestop_menu_application` is NULL by default and is a foreign key referencing field `id` of table `dss_application`. Fill `application_id` with the matching `id` values from `dss_application`, according to the actual situation of your business systems, so that the workspace is connected with each application. +E.g: +``` +-- Update application_id corresponding to workflow application +UPDATE dss_onestop_menu_application SET application_id = 2 WHERE id = 1; +-- Update application_id corresponding to Scriptis application +UPDATE dss_onestop_menu_application SET application_id = 1 WHERE id = 4; +``` +In addition, for users who have deployed DSS 0.8.0 or below, the following adjustment is also required: since the field `workspace_id` is added to table `dss_project` as a foreign key referencing field `id` of table `dss_workspace`, the following command needs to be executed: +``` +ALTER TABLE dss_project ADD workspace_id bigint(20) DEFAULT 1; +``` +By default, existing projects belong to the default workspace (workspace_id=1); users may add more workspaces according to their actual situation and adjust the workspace of existing projects as needed. 
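+After running updates like the ones above, a quick check can confirm the mapping. The query below is a sketch (the exact ids depend on your own `dss_application` contents); it lists active menu entries that are still not linked to any application:
+```sql
+-- Active menu applications not yet linked to a dss_application row
+SELECT m.id, m.title_en, m.application_id
+FROM dss_onestop_menu_application m
+LEFT JOIN dss_application a ON m.application_id = a.id
+WHERE m.is_active = 1 AND a.id IS NULL;
+```
+Any row returned here still needs its `application_id` filled in.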
diff --git a/docs/en_US/ch2/DSS_0.9.1_upgrade_notes.md b/docs/en_US/ch2/DSS_0.9.1_upgrade_notes.md new file mode 100644 index 0000000000000000000000000000000000000000..f1261483f4688060e841c56048b9350c7bfa184c --- /dev/null +++ b/docs/en_US/ch2/DSS_0.9.1_upgrade_notes.md @@ -0,0 +1,85 @@ +# DSS 0.9.1 upgrade notes + +## Environmental description + +------ + +1. The user who deploys DSS on the installation node must have permission to create directories in HDFS. + 1) If the hadoop cluster adopts the kerberos authentication mechanism, execute the kinit -kt command on the client to keep the ticket from expiring, for example: kinit -kt /etc/security/keytabs/hdfs.keytab yarn/xxxxxxxx + 2) If the hadoop cluster uses the simple authentication mechanism, grant permissions with hdfs dfs -chmod, such as: hdfs dfs -chmod 775 /user/hive/xx +2. The user who deploys DSS must have permission to create hive databases: + 1) If the hadoop cluster uses the simple authentication mechanism, you can grant it as follows: + hive> set system:user:name=dss; + hive> grant all to user dss; + +Currently the script does not perform this authorization automatically; the user needs to execute the commands manually. + 2) If the Hadoop cluster adopts kerberos authentication, the kinit command is executed automatically in our script to obtain the ticket, so there is no need to run it manually. The user only needs to configure the kerberos-related parameters; for the specific configuration, see the kerberos configuration section. + +The newly created user should be configured in hive.users.in.admin.role in hive-site.xml. + +3. LDAP must be installed (user authentication only supports LDAP), and there must be ou=user and ou=group entries in LDAP, such as ou=user,dc=baidu,dc=com and ou=group,dc=baidu,dc=com. + LDAP version 2.4.x is supported; support for other versions has not been verified. +4. 
Install the sshpass service: yum -y install sshpass + +## Upgrade installation instructions + +------ + +The jar packages involved in this change, under the dss-server/lib directory: dss-application-0.9.1.jar, dss-server-0.9.1.jar, dss-user-manager-0.9.1.jar +Front-end static files: web +After replacing the above files with the latest ones, modify the following configuration files + +### Installation and configuration file instructions + +1. kerberos related + +Function description: If the Hadoop cluster uses the kerberos authentication mechanism, the newly created user will be granted kerberos permissions +Configuration file path: dss-server/conf/linkis.properties + +``` + Parameters: + wds.linkis.kerberos.enable.switch --Whether the cluster adopts the kerberos authentication mechanism, 0-do not use kerberos 1-use kerberos. If the Hadoop cluster does not use the kerberos authentication mechanism, none of the following parameters need to be configured. + wds.linkis.kerberos.keytab.path --The storage location of the keytab on the DSS installation node, which can be arbitrarily specified, such as /etc/security/keytabs + wds.linkis.kerberos.kdc.node --IP of the node running the KDC service, such as 192.168.1.1 + wds.linkis.kerberos.ssh.port --SSH port number of the node running the KDC service, generally 22 + wds.linkis.kerberos.kdc.user.name --A linux user name on the KDC node; the user must have sudo permission (very important!!!), used for ssh operations + wds.linkis.kerberos.kdc.user.password --The login password of the KDC node user mentioned above, used for ssh operations + wds.linkis.kerberos.realm --The kerberos realm of the hadoop cluster; please consult the cluster operation and maintenance personnel + wds.linkis.kerberos.admin --A user granted the admin role on kerberos (such as hdfs, very important!!!, otherwise the authorization cannot be completed) +``` + +2. 
metastore related + +Function description: create Hive databases for newly created users and grant them permissions +Parameter configuration file: dss-server/conf/linkis.properties + +``` +Parameters: + wds.linkis.metastore.hive.hdfs.base.path --The path where hive warehouse data is stored on hdfs, such as /user/hive/warehouse + wds.dss.deploy.path --dss_linkis installation package path, such as /usr/local/dss_linkis +``` + +3. ldap related + Function description: Create a new Entry under ou=user and ou=group of ldap for user login authentication + Parameter configuration file path: tools/bin/ldap_user.py + +``` +LDAP_HOST -- IP of the server where the ldap service is installed, such as 192.168.11.11 +LDAP_BASEDN --The upper dn of ou=user or ou=group, such as dc=example,dc=com +LDAP_ADMIN -- The dn of the user logging in to the ldap service, such as cn=admin,dc=example,dc=cn +LDAP_PASS --Password for logging in to the ldap service +The first line in the ldap_user.py file is #!/usr/local/tools/venv/bin/python; replace /usr/local with the installation path of dss_linkis +``` + +## User manual + +------ + +1. Access address [http://url](http://url/):port/#/userManger; you need to log in as a super user (the installation user) +2. Server configuration: + If the hadoop cluster uses the simple authentication mechanism, the user needs to add the IP, login user name (with sudo permission), and password of each server in the yarn cluster. The underlying principle is that the server where DSS is installed will ssh to each server in the yarn cluster and then create a linux user. + +If the hadoop cluster adopts the kerberos authentication mechanism, just add any IP (such as 127.0.0.1), username, and password. If none is added, the interface will report an exception; subsequent versions will fix this bug. + +3. 
WorkspaceRootPath, hdfsRootPath, resultRootPath, schedulerPath, the DSS installation directory, and the Azkaban installation directory. The default values are consistent with the configuration in the installation configuration file config.sh. Each directory can be either an hdfs directory, starting with hdfs:///, or a linux directory, starting with file:///. +4. The bottom layer will create a hive database for the user, with database name xx_default, and grant insert, delete, update and select permissions. \ No newline at end of file diff --git a/docs/en_US/ch4/Web Compilation.md b/docs/en_US/ch4/Web Compilation.md new file mode 100644 index 0000000000000000000000000000000000000000..1f22f0a29e4e76b5dc61834526e802c1ab6b6940 --- /dev/null +++ b/docs/en_US/ch4/Web Compilation.md @@ -0,0 +1,102 @@ +# Compilation + +## Getting Started + +### Prerequisites + +Install Node.js on your computer. Download link: [http://nodejs.cn/download/](http://nodejs.cn/download/). We recommend using the latest stable version. + +**Only do this step the first time.** + +### Installation + +Run the following commands in terminal: + +``` +git clone https://github.com/WeBankFinTech/DataSphereStudio.git +cd DataSphereStudio/web +npm install +``` + + Commands explanation: + +1. Pull the remote repository to local: `git clone https://github.com/WeBankFinTech/DataSphereStudio.git` + +2. Change to the web directory of the project: `cd DataSphereStudio/web` + +3. Install all dependencies required for the project: `npm install` + +**Only do this step the first time.** + +### Configuration + +You need to configure the port address and socket address of the backend server in the .env.development file in the root directory. + +``` +// Port address of backend server +VUE_APP_MN_CONFIG_PREFIX=http://yourIp:yourPort/yourPath +// Socket address +VUE_APP_MN_CONFIG_SOCKET=/yourSocketPath +``` + +You can refer to the official documentation of vue-cli for a detailed explanation. 
[Modes and environment variables](https://cli.vuejs.org/guide/mode-and-env.html#modes) + +### Building project + +You can run the following command in terminal to build the project: + +``` +npm run build +``` + +A folder named "dist" will appear in your project's root directory if the command runs successfully, and you can deploy "dist" directly to your static server. + +### How to run + +If you want to run the project in your local browser and see your changes take effect as you edit the code, run the following command in terminal: + +``` +npm run serve +``` + +Access the application in a browser (Chrome recommended) via the link: [http://localhost:8080/](http://localhost:8080/) . + +When you run the project this way, changes you make to the code are dynamically reflected in the browser. + +**Notes: Since the frontend and backend are developed separately, when running in a local browser you need to allow cross-domain access in order to reach the port of the backend server.** + +e.g. Chrome browser: + +Configuration in Windows: + +1. Close all browser windows. + +2. Create a shortcut of Chrome, right-click it and choose "Properties", then go to the "Shortcut" tab, find "Target" and append `--args --disable-web-security --user-data-dir=C:\MyChromeDevUserData` to it. +3. Use the shortcut to open the browser. + +Configuration in MacOS: + +Run the following command. (You need to replace "yourname" in the path. 
If it's not working, check the path of MyChromeDevUserData on your machine and copy its path to the place right after "--user-data-dir=") + +``` +open -n /Applications/Google\ Chrome.app/ --args --disable-web-security --user-data-dir=/Users/yourname/MyChromeDevUserData/ +``` + +### FAQ + +#### Failed installation when running npm install + +Try to use Taobao npm mirror: + +``` +npm install -g cnpm --registry=https://registry.npm.taobao.org +``` + +Next, run the following command instead of npm install: + +``` +cnpm install +``` + +Note that you can still use `npm run serve` and `npm run build` to run and build project. \ No newline at end of file diff --git "a/docs/zh_CN/ch1/DSS\345\256\211\350\243\205\345\270\270\350\247\201\351\227\256\351\242\230\345\210\227\350\241\250.md" "b/docs/zh_CN/ch1/DSS\345\256\211\350\243\205\345\270\270\350\247\201\351\227\256\351\242\230\345\210\227\350\241\250.md" new file mode 100644 index 0000000000000000000000000000000000000000..151c25c6ab8f2c9ef74af26134e725d7fb0ecb4e --- /dev/null +++ "b/docs/zh_CN/ch1/DSS\345\256\211\350\243\205\345\270\270\350\247\201\351\227\256\351\242\230\345\210\227\350\241\250.md" @@ -0,0 +1,117 @@ +## DSS安装常见问题列表 + +**本文档汇总DSS安装过程中所有问题列表及解决方式,为社区用户安装DSS提供参考。** + + +#### (1) 创建工程失败:add scheduler project用户token为空 + +``` +{"method":null,"status":1,"message":"error code(错误码): 90002, error message(错误信息): add scheduler project failederrCode: 90019 ,desc: errCode: 90020 ,desc: 用户token为空 ,ip: dss.com ,port: 9004 ,serviceKind: dss-server ,ip: dss.com ,port: 9004 ,serviceKind: dss-server.","data":{"errorMsg":{"serviceKind":"dss-server","level":2,"port":9004,"errCode":90002,"ip":"dss.com","desc":"add scheduler project failederrCode: 90019 ,desc: errCode: 90020 ,desc: 用户token为空 ,ip: dss.com ,port: 9004 ,serviceKind: dss-server ,ip: dss.com ,port: 9004 ,serviceKind: dss-server"}}} + +``` + +确保dss-server的token.properties中添加了此用户,并保持与 azkaban 的 azkaban-users.xml用户一致 +以hadoop用户为例: +1、在dss-server的token.properties添加 
+hadoop=hadoop +2、在azkaban azkaban-users.xml 文件 添加 + - 0.5.0 - 0.9.1 + 0.9.1 + 0.9.4 2.11.8 1.8 3.3.3 diff --git a/docs/zh_CN/ch2/Azkaban_LinkisJobType_Deployment_Manual.md b/docs/zh_CN/ch2/Azkaban_LinkisJobType_Deployment_Manual.md index 846dc7aedfa8cff8915826c98f27b946310c2734..adc075174e465cf13ede08516dfbc769a312df9a 100644 --- a/docs/zh_CN/ch2/Azkaban_LinkisJobType_Deployment_Manual.md +++ b/docs/zh_CN/ch2/Azkaban_LinkisJobType_Deployment_Manual.md @@ -22,19 +22,19 @@ cd linkis/bin/ LINKIS_GATEWAY_URL=http://127.0.0.1:9001 ## linkis的GateWay地址 ##Linkis gateway token defaultWS-AUTH -LINKIS_GATEWAY_TOKEN=WS-AUTH ## Linkis的代理Token,该参数可以用默认的 +LINKIS_GATEWAY_TOKEN=WS-AUTH ## Linkis的代理Token,该参数可以用默认值 ##Azkaban executor host -AZKABAN_EXECUTOR_HOST=127.0.0.1 ## AZKABAN执行器机器IP +AZKABAN_EXECUTOR_HOST=127.0.0.1 ## 如果Azkaban是单机安装则该IP就是机器IP,如果是分布式安装为Azkaban执行器机器IP, ### SSH Port SSH_PORT=22 ## SSH端口 ##Azkaban executor dir -AZKABAN_EXECUTOR_DIR=/tmp/Install/AzkabanInstall/executor ## 执行器的安装目录,最后不需要带上/ +AZKABAN_EXECUTOR_DIR=/tmp/Install/AzkabanInstall/executor ## 如果Azkaban是单机安装则该目录是Azkaban的安装目录,如果是分布式安装为执行器的安装目录,注意:最后不需要带上/ ##Azkaban executor plugin reload url -AZKABAN_EXECUTOR_URL=http://127.0.0.1:12321/executor?action=reloadJobTypePlugins ##这里只需要修改IP和端口即可 +AZKABAN_EXECUTOR_URL=http://$AZKABAN_EXECUTOR_HOST:12321/executor?action=reloadJobTypePlugins ##这里只需要修改IP和端口即可,该地址为Azkaban重载插件的地址。 ``` ## 3. 
执行安装脚本 ``` diff --git "a/docs/zh_CN/ch2/DSS_0.9.0_\345\215\207\347\272\247\350\257\264\346\230\216.md" "b/docs/zh_CN/ch2/DSS_0.9.0_\345\215\207\347\272\247\350\257\264\346\230\216.md" new file mode 100644 index 0000000000000000000000000000000000000000..619959583c66664fefdb388cb4accb29a76b9201 --- /dev/null +++ "b/docs/zh_CN/ch2/DSS_0.9.0_\345\215\207\347\272\247\350\257\264\346\230\216.md" @@ -0,0 +1,16 @@ +# DSS-0.9.0升级说明 + +本次DSS-0.9.0版本新增用户工作空间(workspace)概念,如果您是从 DSS0.7 或 DSS0.8 升级到 DSS0.9.0,在完成平台部署后,需对数据库表作如下调整: +dss_onestop_menu_application表中的application_id字段默认为空,该字段与dss_application表的id字段关联,需根据用户业务系统的实际情况与dss_application表进行关联,将用户工作空间与各应用打通。例如: +``` +-- 更新workflow应用对应的application_id +UPDATE dss_onestop_menu_application SET application_id = 2 WHERE id = 1; +-- 更新Scriptis应用对应的application_id +UPDATE dss_onestop_menu_application SET application_id = 1 WHERE id = 4; +``` +此外,对于已部署DSS-0.8.0及以下版本的用户,还需做如下调整: +dss_project表新增workspace_id字段,该字段与dss_workspace表的id字段关联,需在数据库执行如下命令: +``` +ALTER TABLE dss_project ADD workspace_id bigint(20) DEFAULT 1; +``` +默认情况下,所有原有项目都将归属默认工作空间(workspace_id=1),用户可根据实际情况新增用户空间,并调整原有项目的所属工作空间。 \ No newline at end of file diff --git "a/docs/zh_CN/ch2/DSS_0.9.1_\345\215\207\347\272\247\350\257\264\346\230\216.md" "b/docs/zh_CN/ch2/DSS_0.9.1_\345\215\207\347\272\247\350\257\264\346\230\216.md" new file mode 100644 index 0000000000000000000000000000000000000000..203b315f10895741bc5f5ca1ae8be21c89e818b4 --- /dev/null +++ "b/docs/zh_CN/ch2/DSS_0.9.1_\345\215\207\347\272\247\350\257\264\346\230\216.md" @@ -0,0 +1,72 @@ +# DSS-0.9.1升级说明 + +## 环境说明 + +1. 安装DSS节点上部署DSS用户必须有hdfs创建目录的权限 + 1)如果hadoop集群采用kerberos认证机制,为防止票据过期,则需要在客户端执行 kinit -kt 命令,比如:kinit -kt /etc/security/keytabs/hdfs.keytab yarn/xxxxxxxx + 2)如果hadoop集群采用simple认证机制,则使用 hdfs dfs -chmod 授权,比如:hdfs dfs -chmod 775 /user/hive/xx +2. 
安装DSS节点上部署DSS的用户具有创建hive database权限问题: + 1)如果hadoop集群采用simple认证机制,可以尝试如下方式授权: + hive>set system:user:name=dss; + hive> grant all to user dss; + 目前并未在脚本中自动授权,需要用户手动执行命令。 + 2)如果hadoop集群采用kerberos认证,在我们的脚本中自动执行了kinit命令以获取票据,不需要手工执行命令,用户只需要配置kerberos相关的参数,具体配置见kerberos配置章节。 + 新建的用户要在 hive-site.xml 中hive.users.in.admin.role配置。 +3. 必须安装有LDAP(用户认证只支持LDAP),ldap中必须有ou=user和ou=group条目,比如:ou=user,dc=baidu,dc=com和ou=group,dc=baidu,dc=com。ldap版本支持2.4.x,其他版本的支持情况待验证 +4. 安装sshpass服务:yum -y install sshpass + +## 版本升级安装说明 + +本次改动涉及的jar包:dss-server/lib目录下: dss-application-0.9.1.jar,dss-server-0.9.1.jar,dss-user-manager-0.9.1.jar +前端静态文件:web +将以上文件替换成最新版本后,再修改下面的配置文件 + +### 安装及配置文件说明 + +1. kerberos相关 + 功能说明:如果hadoop集群采用kerberos认证机制,则会给新建的用户授予kerberos权限 + 配置文件路径:dss-server/conf/linkis.properties + +``` + 参数: + wds.linkis.kerberos.enable.switch --集群是否采用kerberos认证机制,0-不采用kerberos 1-采用kerberos。如果hadoop集群不采用kerberos认证机制,则下面的参数都不需要配置。 + wds.linkis.kerberos.keytab.path --keytab在DSS安装节点上的存放位置,可以任意指定,比如 /etc/security/keytabs + wds.linkis.kerberos.kdc.node --部署KDC服务节点IP,比如192.168.1.1 + wds.linkis.kerberos.ssh.port --部署KDC服务节点SSH端口号,一般都是22 + wds.linkis.kerberos.kdc.user.name --部署KDC节点上的一个linux用户名,该用户必须有sudo权限(重要,重要!!!),用于ssh操作 + wds.linkis.kerberos.kdc.user.password --上面提到的kdc节点用户的登录密码,用于ssh操作 + wds.linkis.kerberos.realm --kerberos管理hadoop集群的域名,请咨询集群运维人员。 + wds.linkis.kerberos.admin --kerberos上的一个被授予admin角色的用户(如hdfs,非常重要!!!!,否则无法完成授权) +``` + +2. 
metastore相关 + 功能说明:给新建用户创建hive库,并授予新建用户权限 + 参数配置文件:dss-server/conf/linkis.properties + +``` + 参数: + wds.linkis.metastore.hive.hdfs.base.path --hive仓库数据存储在hdfs上的路径,比如 /user/hive/warehouse + wds.dss.deploy.path --dss_linkis安装包路径,比如 /usr/local/dss_linkis +``` + +3. ldap相关 +功能说明:在ldap的ou=user和ou=group下新建一个Entry,用于用户登录验证 +参数配置文件路径:安装包下的tools/bin/ldap_user.py + +``` +LDAP_HOST -- 安装ldap服务IP,比如192.168.11.11 +LDAP_BASEDN --ou=user或ou=group的上层dn,比如 dc=example,dc=com +LDAP_ADMIN -- 登录ldap服务的用户dn 比如 'cn=admin,dc=example,dc=cn' +LDAP_PASS --登录ldap服务的密码 +ldap_user.py文件中第一行 #!/usr/local/tools/venv/bin/python,将/usr/local换成dss_linkis的安装路径 +``` + +## 使用说明 + +1. 访问地址 http://url:port/#/userManger,需要用超级用户(安装用户)登录 +2. 服务器配置: + 如果hadoop集群采用的是simple认证机制,则用户需要添加yarn集群各服务器的ip、登录用户名(具有sudo权限)、密码。其底层的原理是安装dss的服务器会ssh到yarn集群的各服务器,然后创建linux用户。 + 如果hadoop集群采用的是kerberos认证机制,则随便添加一个ip(比如127.0.0.1)、用户名和密码。如果不添加,接口会报异常,后续版本会修复此bug +3. workspaceRootPath,hdfsRootPath,resultRootPath,schedulerPath,DSS安装目录,Azkaban安装目录。 + 其默认值和安装配置文件config.sh里配置的保持一致。其目录既可以是hdfs目录,以hdfs:///开头,也可以是linux目录,以file:///开头 +4. 底层会给用户创建hive库,库名:xx_default,并赋予增删改查权限 \ No newline at end of file diff --git a/docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md b/docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md new file mode 100644 index 0000000000000000000000000000000000000000..f1600779103204e837adde7f16994e00f3242384 --- /dev/null +++ b/docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md @@ -0,0 +1,449 @@ +# DataSphere Studio快速安装使用文档 + +由于DataSphere Studio依赖于[Linkis](https://github.com/WeBankFinTech/Linkis),本文档提供了以下两种部署方式供您选择: + +1. DSS & Linkis 一键部署 + +     该模式适合于DSS和Linkis都没有安装的情况。 + +     进入[DSS & Linkis安装环境准备](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md#%E4%B8%80dss--linkis%E5%AE%89%E8%A3%85%E7%8E%AF%E5%A2%83%E5%87%86%E5%A4%87) + +2. 
DSS 一键部署 + +     该模式适合于Linkis已经安装,需要安装DSS的情况。 + +     进入[DSS快速安装文档](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md) + +     **请根据实际情况,选择合理安装方式**。 + +## 一、DSS & Linkis安装环境准备 + +**根据安装难度,我们提供了以下两种环境准备方式,请根据需要选择:** + +1. **精简版** + +     没有任何安装难度,适合于调研和学习,10分钟即可部署起来。 + +     支持的功能有: + +- 数据开发IDE - Scriptis,仅支持:执行Python和JDBC脚本 +- Linkis管理台 + +**进入[DSS & Linkis精简版环境准备](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md#%E4%BA%8Cdss--linkis%E7%B2%BE%E7%AE%80%E7%89%88%E7%8E%AF%E5%A2%83%E5%87%86%E5%A4%87)** + +2. **标准版**: + +     有一定的安装难度,体现在Hadoop、Hive和Spark版本不同时,可能需要重新编译,可能会出现包冲突问题。 + +适合于试用和生产使用,2~3小时即可部署起来。 + +     支持的功能有: + +- 数据开发IDE - Scriptis + +- 工作流实时执行 + +- 信号功能和邮件功能 + +- 数据可视化 - Visualis + +- 数据质量 - Qualitis(**单机版**) + +- 工作流定时调度 - Azkaban(**单机版**) + +- Linkis管理台 + +**进入[DSS & Linkis标准版环境准备](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md#%E4%B8%89dss--linkis%E6%A0%87%E5%87%86%E7%89%88%E7%8E%AF%E5%A2%83%E5%87%86%E5%A4%87)** + +---- + +## 二、DSS & Linkis精简版环境准备 + +### a. 基础软件安装 + +        下面的软件必装: + +- MySQL (5.5+),[如何安装MySQL](https://www.runoob.com/mysql/mysql-install.html) +- JDK (1.8.0_141以上),[如何安装JDK](https://www.runoob.com/java/java-environment-setup.html) +- Python(2.x和3.x都支持),[如何安装Python](https://www.runoob.com/python/python-install.html) +- Nginx,[如何安装Nginx](https://www.tecmint.com/install-nginx-on-centos-7/) + +### b. 创建用户 + +        例如: **部署用户是hadoop账号** + +1. 在部署机器上创建部署用户,用于安装 + +```bash + sudo useradd hadoop +``` + +2. 因为Linkis的服务是以 sudo -u ${linux-user} 方式来切换引擎,从而执行作业,所以部署用户需要有 sudo 权限,而且是免密的。 + +```bash + vi /etc/sudoers +``` + + hadoop ALL=(ALL) NOPASSWD: ALL + +3. **如果您的Python想拥有画图功能,则还需在安装节点,安装画图模块**。命令如下: + +```bash + python -m pip install matplotlib +``` + +### c. 
安装包准备 + +**如果您想使用DSS & Linkis全家桶一键部署安装包(1.3GB)([点我进入下载页面](https://github.com/WeBankFinTech/DataSphereStudio/releases)),直接解压即可,以下步骤可忽略。** + +下列步骤为用户自行编译或者去各个组件release页面下载安装包: +1. 下载安装包 +- [wedatasphere-linkis-x.x.x-dist.tar.gz](https://github.com/WeBankFinTech/Linkis/releases) +- [wedatasphere-dss-x.x.x-dist.tar.gz](https://github.com/WeBankFinTech/DataSphereStudio/releases) +- [wedatasphere-dss-web-x.x.x-dist.zip](https://github.com/WeBankFinTech/DataSphereStudio/releases) +- [linkis-jobtype-x.x.x.zip](https://github.com/WeBankFinTech/DataSphereStudio/releases) +- azkaban-solo-server-x.x.x.tar.gz +- [wedatasphere-qualitis-x.x.x.zip](https://github.com/WeBankFinTech/Qualitis/releases) + +2. 下载DSS&LINKIS[一键部署脚本](https://share.weiyun.com/58yxh3n),并解压,再将上述所下载的安装包放置于该目录下,目录层级如下: + +```text +├── dss_linkis # 一键部署主目录 + ├── backup # 用于兼容Linkis老版本的安装启动脚本 + ├── bin # 用于一键安装启动DSS+Linkis + ├── conf # 一键部署的配置文件 + ├── azkaban-solo-server-x.x.x.tar.gz #azkaban安装包 + ├── linkis-jobtype-x.x.x.zip #linkis jobtype安装包 + ├── wedatasphere-dss-x.x.x-dist.tar.gz # DSS后台安装包 + ├── wedatasphere-dss-web-x.x.x-dist.zip # DSS前端安装包 + ├── wedatasphere-linkis-x.x.x-dist.tar.gz # Linkis安装包 + ├── wedatasphere-qualitis-x.x.x.zip # Qualitis安装包 +``` +**注意事项:** +1. Azkaban: 社区没有提供单独的release安装包,用户需要自行编译后,将安装包放置于安装目录下。 +2. DSS: 用户自行编译DSS安装包,会缺失visualis-server部分,因此,visualis-server也需要用户自行编译。从[visualis项目](https://github.com/WeBankFinTech/Visualis)编译打包后放置于wedatasphere-dss-x.x.x-dist.tar.gz的share/visualis-server目录下,否则dss安装时可能报找不到visualis安装包。 + +### d. 
修改配置 + +将conf目录下的config.sh.lite.template,修改为config.sh + +```shell + cp conf/config.sh.lite.template conf/config.sh +``` + +**精简版可以不修改任何配置参数**,当然您也可以按需修改相关配置参数。 + +``` + vi conf/config.sh + + SSH_PORT=22 #ssh默认端口 + deployUser="`whoami`" #默认获取当前用户为部署用户 + WORKSPACE_USER_ROOT_PATH=file:///tmp/linkis/ ##工作空间路径,默认为本地路径,尽量提前创建并授予写权限 + RESULT_SET_ROOT_PATH=file:///tmp/linkis ##结果集路径,默认为本地路径,尽量提前创建并授予写权限 + DSS_NGINX_IP=127.0.0.1 #DSS Nginx访问IP + DSS_WEB_PORT=8088 #DSS Web页面访问端口 + +``` + +```properties + # 说明:通常情况下,精简版,上述参数默认情况均可不做修改,即可直接安装使用 + +``` + +### e. 修改数据库配置 + +```bash + vi conf/db.sh +``` + +```properties + # 设置数据库的连接信息 + MYSQL_HOST= + MYSQL_PORT= + MYSQL_DB= + MYSQL_USER= + MYSQL_PASSWORD= +``` + + +```properties + # 说明:此为必须配置参数,并确保可以从本机进行访问,验证方式: + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD +``` + +精简版配置修改完毕,进入[安装和使用](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md#%E5%9B%9B%E5%AE%89%E8%A3%85%E5%92%8C%E4%BD%BF%E7%94%A8) + +## 三、DSS & Linkis标准版环境准备 + +### a. 基础软件安装 + +        下面的软件必装: + +- MySQL (5.5+),[如何安装MySQL](https://www.runoob.com/mysql/mysql-install.html) + +- JDK (1.8.0_141以上),[如何安装JDK](https://www.runoob.com/java/java-environment-setup.html) + +- Python(2.x和3.x都支持),[如何安装Python](https://www.runoob.com/python/python-install.html) + +- Nginx,[如何安装Nginx](https://www.tecmint.com/install-nginx-on-centos-7/) + +        下面的服务必须可从本机访问: + +- Hadoop(**2.7.2,Hadoop其他版本需自行编译Linkis**) + +- Hive(**1.2.1,Hive其他版本需自行编译Linkis**) + +- Spark(**支持2.0以上所有版本**) + +### b. 创建用户 + +        例如: **部署用户是hadoop账号** + +1. 在所有需要部署的机器上创建部署用户,用于安装 + +```bash + sudo useradd hadoop +``` + +2. 因为Linkis的服务是以 sudo -u ${linux-user} 方式来切换引擎,从而执行作业,所以部署用户需要有 sudo 权限,而且是免密的。 + +```bash + vi /etc/sudoers +``` + +```properties + hadoop ALL=(ALL) NOPASSWD: ALL +``` + +3. 
确保部署 DSS 和 Linkis 的服务器可正常访问Hadoop、Hive和Spark。 + +        **部署DSS 和 Linkis 的服务器,不要求必须安装Hadoop,但要求hdfs命令必须可用,如:hdfs dfs -ls /**。 + +        **如果想使用Linkis的Spark,部署 Linkis 的服务器,要求spark-sql命令必须可以正常启动一个spark application**。 + +        **在每台安装节点设置如下的全局环境变量**,以便Linkis能正常读取Hadoop、Hive和Spark的配置文件,具备访问Hadoop、Hive和Spark的能力。 + +        修改安装用户hadoop的.bashrc,命令如下: + +```bash + vim /home/hadoop/.bashrc +``` + +        下方为环境变量示例: + +```bash + #JDK + export JAVA_HOME=/nemo/jdk1.8.0_141 + #HADOOP + export HADOOP_CONF_DIR=/appcom/config/hadoop-config + #Hive + export HIVE_CONF_DIR=/appcom/config/hive-config + #Spark + export SPARK_HOME=/appcom/Install/spark + export SPARK_CONF_DIR=/appcom/config/spark-config + export PYSPARK_ALLOW_INSECURE_GATEWAY=1 # Pyspark必须加的参数 +``` + +4. **如果您的Pyspark想拥有画图功能,则还需在所有安装节点,安装画图模块**。命令如下: + +```bash + python -m pip install matplotlib +``` + +### c. 安装包准备 + +**如果您想使用DSS & Linkis全家桶一键部署安装包(1.3GB)([点我进入下载页面](https://github.com/WeBankFinTech/DataSphereStudio/releases)),直接解压即可,以下步骤可忽略。** + +下列步骤为用户自行编译或者去各个组件release页面下载安装包: +1. 下载安装包 +- [wedatasphere-linkis-x.x.x-dist.tar.gz](https://github.com/WeBankFinTech/Linkis/releases) +- [wedatasphere-dss-x.x.x-dist.tar.gz](https://github.com/WeBankFinTech/DataSphereStudio/releases) +- [wedatasphere-dss-web-x.x.x-dist.zip](https://github.com/WeBankFinTech/DataSphereStudio/releases) +- [linkis-jobtype-x.x.x.zip](https://github.com/WeBankFinTech/DataSphereStudio/releases) +- azkaban-solo-server-x.x.x.tar.gz +- [wedatasphere-qualitis-x.x.x.zip](https://github.com/WeBankFinTech/Qualitis/releases) + +2. 
下载DSS&LINKIS[一键部署脚本](https://share.weiyun.com/58yxh3n),并解压,再将上述所下载的安装包放置于该目录下,目录层级如下: + +```text +├── dss_linkis # 一键部署主目录 + ├── backup # 用于兼容Linkis老版本的安装启动脚本 + ├── bin # 用于一键安装启动DSS+Linkis + ├── conf # 一键部署的配置文件 + ├── azkaban-solo-server-x.x.x.tar.gz #azkaban安装包 + ├── linkis-jobtype-x.x.x.zip #linkis jobtype安装包 + ├── wedatasphere-dss-x.x.x-dist.tar.gz # DSS后台安装包 + ├── wedatasphere-dss-web-x.x.x-dist.zip # DSS前端安装包 + ├── wedatasphere-linkis-x.x.x-dist.tar.gz # Linkis安装包 + ├── wedatasphere-qualitis-x.x.x.zip # Qualitis安装包 +``` +**注意事项:** +1. Azkaban: 社区没有提供单独的release安装包,用户需要自行编译后的将安装包放置于安装目录下。 +2. DSS: 用户自行编译DSS安装包,会缺失visualis-server部分,因此,visualis-server也需要用户自行编译。从[visualis项目](https://github.com/WeBankFinTech/Visualis)编译打包后放置于wedatasphere-dss-x.x.x-dist.tar.gz的share/visualis-server目录下,否则安装时可能报找不到visualis安装包。 + +### d. 修改配置 + +将conf目录下的config.sh.stand.template,修改为config.sh + +```shell + cp conf/config.sh.stand.template conf/config.sh +``` + +您可以按需修改相关配置参数: + +``` + vi conf/config.sh +``` + +参数说明如下: +```properties + WORKSPACE_USER_ROOT_PATH=file:///tmp/linkis/ ##本地工作空间路径,默认为本地路径,尽量提前创建并授于写权限 + HDFS_USER_ROOT_PATH=hdfs:///tmp/linkis ##hdfs工作空间路径,默认为本地路径,尽量提前创建并授于写权限 + RESULT_SET_ROOT_PATH=hdfs:///tmp/linkis ##结果集路径,默认为本地路径,尽量提前创建并授于写权限 + WDS_SCHEDULER_PATH=file:///appcom/tmp/wds/scheduler ##DSS工程转换为azkaban工程后的存储路径 + #DSS Web,注意distribution.sh中VISUALIS_NGINX的IP和端口必须和此处保持一致 + DSS_NGINX_IP=127.0.0.1 #DSS Nginx访问IP + DSS_WEB_PORT=8088 #DSS Web页面访问端口 + ##hive metastore的地址 + HIVE_META_URL=jdbc:mysql://127.0.0.1:3306/metastore?useUnicode=true + HIVE_META_USER=xxx + HIVE_META_PASSWORD=xxx + ###hadoop配置文件目录 + HADOOP_CONF_DIR=/appcom/config/hadoop-config + ###hive配置文件目录 + HIVE_CONF_DIR=/appcom/config/hive-config + ###spark配置文件目录 + SPARK_CONF_DIR=/appcom/config/spark-config + ###azkaban服务端IP地址及端口,单机版安装时请勿修改 + AZKABAN_ADRESS_IP=127.0.0.1 + AZKABAN_ADRESS_PORT=8081 + ####Qualitis服务端IP地址及端口,单机版安装时请勿修改 + QUALITIS_ADRESS_IP=127.0.0.1 + QUALITIS_ADRESS_PORT=8090 + +``` + +### e. 
使用分布式模式 + +        如果您打算将DSS和Linkis都部署在同一台服务器上, 本步骤可以跳过。 + +        如果您打算将 DSS 和 Linkis 部署在多台服务器上,首先,您需要为这些服务器配置ssh免密登陆。 + +        [如何配置SSH免密登陆](https://www.jianshu.com/p/0922095f69f3) + +        同时,您还需要修改分布式部署模式下的distribution.sh配置文件,使分布式部署生效。 + +```shell + vi conf/distribution.sh +``` + +``` + 说明:LINKIS和DSS的微服务IP地址和端口,可配置成远程地址,例如您想把LINKIS和DSS安装在不同的机器上,那么只需把linkis各项微服务的IP地址修改成与DSS不同的IP即可。 + +``` + +### f. 修改数据库配置 + +```bash + vi conf/db.sh +``` + +```properties + # 设置数据库的连接信息 + MYSQL_HOST= + MYSQL_PORT= + MYSQL_DB= + MYSQL_USER= + MYSQL_PASSWORD= +``` + +```properties + # 说明:此为必须配置参数,并确保可以从本机进行访问,验证方式: + mysql -h$MYSQL_HOST -P$MYSQL_PORT -u$MYSQL_USER -p$MYSQL_PASSWORD +``` + +标准版配置修改完毕,进入[安装和使用](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS_LINKIS_Quick_Install.md#%E5%9B%9B%E5%AE%89%E8%A3%85%E5%92%8C%E4%BD%BF%E7%94%A8) + +---- + + +## 四、安装和使用 + +### 1. 执行安装脚本: + +```bash + sh bin/install.sh +``` +注意:安装脚本有两处使用了相对路径,为了正确安装,请按照以上命令执行。 + +### 2. 安装步骤 + +- 该安装脚本会检查各项集成环境命令,如果没有请按照提示进行安装,以下命令为必须项: +_yum java mysql unzip expect telnet tar sed dos2unix nginx_ + +- 安装过程中如果有很多cp命令提示您是否覆盖安装,说明您的系统配置了别名。可输入alias查看,如果存在cp、mv、rm的别名,去掉后就不会再有大量提示。 + +- install.sh脚本会询问您安装模式。 +安装模式分为精简版、标准版,请根据您准备的环境情况,选择合适的安装模式。 + +- install.sh脚本会询问您是否需要初始化数据库并导入元数据,linkis和dss 均会询问。 + +     **第一次安装**必须选是。 + +### 3. 是否安装成功: + +     通过查看控制台打印的日志信息查看是否安装成功。 + +     如果有错误信息,可以查看具体报错原因。 + +     您也可以通过查看我们的[安装常见问题](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch1/DSS%E5%AE%89%E8%A3%85%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98%E5%88%97%E8%A1%A8.md),获取问题的解答。 + +### 4. 
启动服务 + +#### (1) 启动服务: + +     在安装目录执行以下命令,启动所有服务: + +```shell + sh bin/start-all.sh > start.log 2>start_error.log +``` + +     如果启动产生了错误信息,可以查看具体报错原因。启动后,各项微服务都会进行**通信检测**,如果有异常则可以帮助用户定位异常日志和原因。 + +     您可以通过查看我们的[启动常见问题](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch1/DSS%E5%AE%89%E8%A3%85%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98%E5%88%97%E8%A1%A8.md),获取问题的解答。 + +#### (2) 查看是否启动成功 + +     可以在Eureka界面查看 Linkis & DSS 后台各微服务的启动情况。如下图,若您的Eureka主页(**启动日志会打印此访问地址**)出现以下微服务,则表示服务都启动成功,可以正常对外提供服务了: + + ![Eureka](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/images/zh_CN/chapter2/quickInstallUse/quickInstall.png) + +#### (3) 谷歌浏览器访问: + +请使用**谷歌浏览器**访问以下前端地址: + +`http://DSS_NGINX_IP:DSS_WEB_PORT` **启动日志会打印此访问地址**。登陆时管理员的用户名和密码均为部署用户名,如部署用户为hadoop,则管理员的用户名/密码为:hadoop/hadoop。 + +如果您想支持更多用户登录,详见 [Linkis LDAP](https://github.com/WeBankFinTech/Linkis/wiki/%E9%83%A8%E7%BD%B2%E5%92%8C%E7%BC%96%E8%AF%91%E9%97%AE%E9%A2%98%E6%80%BB%E7%BB%93) + +如何快速使用DSS, 点我进入 [DSS快速使用文档](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch3/DSS_User_Manual.md) + +【DSS用户手册】提供了更加详尽的使用方法,点我进入 [DSS用户手册](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch3/DSS_User_Manual.md) +#### (4) 停止服务: +     在安装目录执行以下命令,停止所有服务:sh bin/stop-all.sh + +**注意** +1. 如果用户想启动和停止**单个应用**,可修改启动脚本注释掉其他应用的启动和停止命令即可。 + +2. 
如果用户想启动和停止**单个微服务**,则可进入该微服务安装目录下执行sh bin/start-微服务名称.sh或sh bin/stop-微服务名称.sh + + +**附Qualitis及Azkaban单机版安装包资源:** + +- 腾讯云链接:https://share.weiyun.com/5fBPVIV + +- 密码:cwnhgw + +- 百度云链接:https://pan.baidu.com/s/1DYvm_KTljQpbdk6ZPx6K9g + +- 密码:3lnk diff --git "a/docs/zh_CN/ch2/DSS\345\277\253\351\200\237\345\256\211\350\243\205\344\275\277\347\224\250\346\226\207\346\241\243.md" "b/docs/zh_CN/ch2/DSS\345\277\253\351\200\237\345\256\211\350\243\205\344\275\277\347\224\250\346\226\207\346\241\243.md" index 54a3f1774f33e7baf864ab167552743b5a13af4e..a8fda0a9b5c3675c032ec433166de6d251bf0bf8 100644 --- "a/docs/zh_CN/ch2/DSS\345\277\253\351\200\237\345\256\211\350\243\205\344\275\277\347\224\250\346\226\207\346\241\243.md" +++ "b/docs/zh_CN/ch2/DSS\345\277\253\351\200\237\345\256\211\350\243\205\344\275\277\347\224\250\346\226\207\346\241\243.md" @@ -1,34 +1,38 @@ -# # 如何快速安装使用DataSphereStudio +## 如何快速安装使用DataSphereStudio ### DataSphereStudio安装分为前端部分和后台部分,安装之前首先需要确定前、后端安装环境。 ### 一、确定您的安装环境 -####         DataSphereStudio根据组件丰富程度,安装环境略有差异,分为简单版和标准版,其区别如下: +####         DataSphereStudio根据组件丰富程度,安装环境略有差异,分为精简版、简单版和标准版,其区别如下: ---- +1. **精简版**: -1. 
**简单版**: +      最少环境依赖,前后端基础环境部分仅需准备:[Linkis](https://github.com/WeBankFinTech/Linkis)、JAVA、MYSQL、[Nginx](https://www.nginx.com/) ,您即刻能享受到DSS已集成的数据开发Scriptis服务。 -      最少环境依赖,前后端基础环境部分仅需准备:[Linkis](https://github.com/WeBankFinTech/Linkis)、JAVA、MYSQL、[Nginx](https://www.nginx.com/) ,您即刻能享受到DSS已集成的数据开发Scriptis、工作流实时执行,可视化,邮件发送,DATACHECK和EVENTCHECK服务。 - -        点我进入[简单版DSS环境配置准备](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E4%BA%8C%E7%AE%80%E5%8D%95%E7%89%88dss%E7%8E%AF%E5%A2%83%E9%85%8D%E7%BD%AE%E5%87%86%E5%A4%87) +        点我进入[精简版DSS环境配置准备](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E4%BA%8C%E7%B2%BE%E7%AE%80%E7%89%88dss%E7%8E%AF%E5%A2%83%E9%85%8D%E7%BD%AE%E5%87%86%E5%A4%87) ---- +2. **简单版**: + +      较少环境依赖,前后端基础环境部分仅需准备:[Linkis](https://github.com/WeBankFinTech/Linkis)、JAVA、MYSQL、[Nginx](https://www.nginx.com/) ,您即刻能享受到DSS已集成的数据开发Scriptis、工作流实时执行,可视化,邮件发送,DATACHECK和EVENTCHECK服务。 +        点我进入[简单版DSS环境配置准备](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E4%B8%89%E7%AE%80%E5%8D%95%E7%89%88dss%E7%8E%AF%E5%A2%83%E9%85%8D%E7%BD%AE%E5%87%86%E5%A4%87) -2. **标准版** +---- +3. 
**标准版**           DSS还实现了很多外部系统的集成,如[Qualitis](https://github.com/WeBankFinTech/Qualitis)和[Azkaban](https://github.com/azkaban/azkaban),如果您想使用这些外部系统,则需要在简单版的基础上,提前安装和启动好Qualitis和Azkaban服务,并确保其能够正常提供服务,并在配置中指定对应服务的IP和端口。 -          点我进入[标准版DSS环境配置准备](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E4%B8%89%E6%A0%87%E5%87%86%E7%89%88dss%E7%8E%AF%E5%A2%83%E9%85%8D%E7%BD%AE%E5%87%86%E5%A4%87) +          点我进入[标准版DSS环境配置准备](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E5%9B%9B%E6%A0%87%E5%87%86%E7%89%88dss%E7%8E%AF%E5%A2%83%E9%85%8D%E7%BD%AE%E5%87%86%E5%A4%87) ---- -## 二、简单版DSS环境配置准备 +## 二、精简版DSS环境配置准备 DSS环境配置准备分为三部分,前后端基础软件安装、后端环境配置准备和前端环配置境准备,详细介绍如下: ### 2.1 前后端基础软件安装 -Linkis标准版(0.9.1及以上),[如何安装Linkis](https://github.com/WeBankFinTech/Linkis/wiki/%E5%A6%82%E4%BD%95%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8Linkis) +Linkis精简版(0.9.4及以上),[如何安装Linkis](https://github.com/WeBankFinTech/Linkis/wiki/%E5%A6%82%E4%BD%95%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8Linkis) JDK (1.8.0_141以上),[如何安装JDK](https://www.runoob.com/java/java-environment-setup.html) @@ -47,7 +51,7 @@ Nginx,[如何安装Nginx](https://www.tecmint.com/install-nginx-on-centos-7/) sudo useradd hadoop ``` -##### 注意:用户需要有sudo权限,且可免密登陆本机。[如何配置SSH免密登陆](https://www.jianshu.com/p/0922095f69f3) +##### 注意:用户需要有sudo权限,且可免密登陆本机。[如何配置SSH免密登陆](https://linuxconfig.org/passwordless-ssh) ``` vi /etc/sudoers @@ -61,7 +65,7 @@ Nginx,[如何安装Nginx](https://www.tecmint.com/install-nginx-on-centos-7/) ``` tar -xvf wedatasphere-dss-x.x.x-dist.tar.gz ``` -##### 注意:如果安装包是用户自行编译的,则需要把[visualis-server安装包](https://github.com/WeBankFinTech/Visualis/releases)复制到DSS安装目录的share/visualis-server文件夹下,以供自动化安装使用 + ### c. 
修改基础配置 @@ -74,18 +78,125 @@ Nginx,[如何安装Nginx](https://www.tecmint.com/install-nginx-on-centos-7/) deployUser=hadoop #指定部署用户 - DSS_INSTALL_HOME=/appcom/Install/DSS #指定DSS的安装目录 + DSS_INSTALL_HOME=$workDir #默认为上一级目录 + + WORKSPACE_USER_ROOT_PATH=file:///tmp/Linkis #指定用户根目录,存储用户的脚本文件和日志文件等,是用户的工作空间。 - EUREKA_INSTALL_IP=127.0.0.1 #Linkis的 EUREKA 服务主机IP地址 + RESULT_SET_ROOT_PATH=hdfs:///tmp/linkis # 结果集文件路径,用于存储Job的结果集文件 - EUREKA_PORT=20303 #Linkis的 EUREKA 服务端口号 +``` + +### d. 修改数据库配置 + +```bash + vi conf/db.sh +``` + +```properties + # 设置DSS-Server和Eventchecker AppJoint的数据库的连接信息,需要和linkis保持同库 + MYSQL_HOST= + MYSQL_PORT= + MYSQL_DB= + MYSQL_USER= + MYSQL_PASSWORD= +``` + +### 2.3 前端环境配置准备 + +### a、下载安装包 + + 点击[release](https://github.com/WeBankFinTech/DataSphereStudio/releases) 下载对应安装包,并在安装目录进行解压: + +```bash + unzip wedatasphere-dss-web-x.x.x-dist.zip +``` + +##### 注意:如果DSS前端安装包是用户自行编译的,则需要把[visualis前端安装包](https://github.com/WeBankFinTech/Visualis/releases)复制到DSS前端安装目录的dss/visualis文件夹下,以供自动化安装使用 + +### b、配置修改 + +    进入前端工作目录,在该目录下编辑: + + +```bash + vi conf/config.sh +``` + +更改dss的前端端口和后端linkis的gateway的IP地址及端口 + +``` +# Configuring front-end ports +dss_port="8088" + +# URL of the backend linkis gateway +linkis_url="http://127.0.0.1:9001" + +# dss ip address +dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') +``` + + 环境准备完毕,点我进入 [五、安装和使用](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E4%BA%94%E5%AE%89%E8%A3%85%E5%92%8C%E4%BD%BF%E7%94%A8) + + +---- + +## 三、简单版DSS环境配置准备 +DSS环境配置准备分为三部分,前后端基础软件安装、后端环境配置准备和前端环境配置准备,详细介绍如下: +### 3.1 前后端基础软件安装 +Linkis简单版(0.9.4及以上),[如何安装Linkis](https://github.com/WeBankFinTech/Linkis/wiki/%E5%A6%82%E4%BD%95%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8Linkis) + +JDK (1.8.0_141以上),[如何安装JDK](https://www.runoob.com/java/java-environment-setup.html) + +MySQL 
(5.5+),[如何安装MySQL](https://www.runoob.com/mysql/mysql-install.html) + +Nginx,[如何安装Nginx](https://www.tecmint.com/install-nginx-on-centos-7/) + +### 3.2 后端环境配置准备 + +### a. 创建用户 + + 例如: **部署用户是hadoop账号** + +在部署机器上创建部署用户,使用该用户进行安装。 +``` + sudo useradd hadoop +``` + +##### 注意:用户需要有sudo权限,且可免密登陆本机。[如何配置SSH免密登陆](https://linuxconfig.org/passwordless-ssh) +``` + vi /etc/sudoers + + hadoop ALL=(ALL) NOPASSWD: ALL +``` + +### b. 安装包准备 + +从DSS已发布的release中([点击这里进入下载页面](https://github.com/WeBankFinTech/DataSphereStudio/releases)),下载对应安装包。先解压安装包到安装目录,并对解压后的文件进行配置修改 + +``` + tar -xvf wedatasphere-dss-x.x.x-dist.tar.gz +``` +##### 注意:如果安装包是用户自行编译的,则需要把[visualis-server安装包](https://github.com/WeBankFinTech/Visualis/releases)复制到DSS安装目录的share/visualis-server文件夹下,以供自动化安装使用 + +### c. 修改基础配置 + +``` + vi conf/config.sh +``` + +```properties + + + deployUser=hadoop #指定部署用户 + + DSS_INSTALL_HOME=$workDir #默认为上一级目录 WORKSPACE_USER_ROOT_PATH=file:///tmp/Linkis #指定用户根目录,存储用户的脚本文件和日志文件等,是用户的工作空间。 RESULT_SET_ROOT_PATH=hdfs:///tmp/linkis # 结果集文件路径,用于存储Job的结果集文件 #用于DATACHECK校验 - HIVE_META_URL=jdbc:mysql://127.0.0.1:3306/linkis?characterEncoding=UTF-8 + HIVE_META_URL=jdbc:mysql://127.0.0.1:3306/hivemeta?characterEncoding=UTF-8 HIVE_META_USER=xxx HIVE_META_PASSWORD=xxx @@ -98,7 +209,7 @@ Nginx,[如何安装Nginx](https://www.tecmint.com/install-nginx-on-centos-7/) ``` ```properties - # 设置DSS-Server和Eventchecker AppJoint的数据库的连接信息。 + # 设置DSS-Server和Eventchecker AppJoint的数据库的连接信息,需要和linkis保持同库 MYSQL_HOST= MYSQL_PORT= MYSQL_DB= @@ -106,7 +217,7 @@ Nginx,[如何安装Nginx](https://www.tecmint.com/install-nginx-on-centos-7/) MYSQL_PASSWORD= ``` -### 2.3 前端端环境配置准备 +### 3.3 前端环境配置准备 ### a、下载安装包 @@ -132,15 +243,15 @@ dss_port="8088" linkis_url="http://127.0.0.1:9001" # dss ip address -dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}') +dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') ``` - 
环境准备完毕,点我进入 [四、安装和使用](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E5%9B%9B%E5%AE%89%E8%A3%85%E5%92%8C%E4%BD%BF%E7%94%A8) + 环境准备完毕,点我进入 [五、安装和使用](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E4%BA%94%E5%AE%89%E8%A3%85%E5%92%8C%E4%BD%BF%E7%94%A8) -## 三、标准版DSS环境配置准备 +## 四、标准版DSS环境配置准备 标准版DSS环境准备也分为三部分,前后端基础软件安装、后端环境准备和前端环境准备,详细介绍如下: -### 3.1 前后端基础软件安装 -Linkis标准版(0.9.1及以上),[如何安装Linkis](https://github.com/WeBankFinTech/Linkis/wiki/%E5%A6%82%E4%BD%95%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8Linkis) +### 4.1 前后端基础软件安装 +Linkis标准版(0.9.4及以上),[如何安装Linkis](https://github.com/WeBankFinTech/Linkis/wiki/%E5%A6%82%E4%BD%95%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8Linkis) JDK (1.8.0_141以上),[如何安装JDK](https://www.runoob.com/java/java-environment-setup.html) @@ -153,7 +264,7 @@ Qualitis [如何安装Qualitis](https://github.com/WeBankFinTech/Qualitis/blob/m Azkaban [如何安装Azkaban](https://github.com/azkaban/azkaban) ##### 注意:支持Azkaban调度需要配套安装linkis-jobtype,请点击[Linkis jobType安装文档](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/Azkaban_LinkisJobType_Deployment_Manual.md) -### 3.2 后端环境配置准备 +### 4.2 后端环境配置准备 ### a. 
创建用户 @@ -164,7 +275,7 @@ Azkaban [如何安装Azkaban](https://github.com/azkaban/azkaban) sudo useradd hadoop ``` -##### 注意:用户需要有sudo权限,且可免密登陆本机。[如何配置SSH免密登陆](https://www.jianshu.com/p/0922095f69f3) +##### 注意:用户需要有sudo权限,且可免密登陆本机。[如何配置SSH免密登陆](https://linuxconfig.org/passwordless-ssh) ``` vi /etc/sudoers @@ -191,20 +302,16 @@ Azkaban [如何安装Azkaban](https://github.com/azkaban/azkaban) deployUser=hadoop #指定部署用户 - DSS_INSTALL_HOME=/appcom/Install/DSS #指定DSS的安装目录 - - EUREKA_INSTALL_IP=127.0.0.1 #Linkis的 EUREKA 服务主机IP地址 - - EUREKA_PORT=20303 #Linkis的 EUREKA 服务端口号 + DSS_INSTALL_HOME=$workDir #默认为上一级目录 WORKSPACE_USER_ROOT_PATH=file:///tmp/Linkis #指定用户根目录,存储用户的脚本文件和日志文件等,是用户的工作空间。 RESULT_SET_ROOT_PATH=hdfs:///tmp/linkis # 结果集文件路径,用于存储Job的结果集文件 - WDS_SCHEDULER_PATH=file:///appcom/tmp/wds/scheduler #Azkaban工程存储目录 + WDS_SCHEDULER_PATH=file:///appcom/tmp/wds/scheduler #DSS工程转换成Azkaban工程后zip包的存储路径 #1、用于DATACHECK - HIVE_META_URL=jdbc:mysql://127.0.0.1:3306/linkis?characterEncoding=UTF-8 + HIVE_META_URL=jdbc:mysql://127.0.0.1:3306/hivemeta?characterEncoding=UTF-8 HIVE_META_USER=xxx HIVE_META_PASSWORD=xxx #2、用于Qualitis @@ -223,7 +330,7 @@ Azkaban [如何安装Azkaban](https://github.com/azkaban/azkaban) ``` ```properties - # 设置DSS-Server和Eventchecker AppJoint的数据库的连接信息。 + # 设置DSS-Server和Eventchecker AppJoint的数据库的连接信息,需要和linkis保持同库 MYSQL_HOST= MYSQL_PORT= MYSQL_DB= @@ -231,7 +338,7 @@ Azkaban [如何安装Azkaban](https://github.com/azkaban/azkaban) MYSQL_PASSWORD= ``` -### 3.3 前端环境配置准备 +### 4.3 前端环境配置准备 ### a、下载安装包 @@ -258,14 +365,14 @@ dss_port="8088" linkis_url="http://127.0.0.1:9001" # dss ip address -dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}') +dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') ``` - 环境准备完毕,点我进入 
[四、安装和使用](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E5%9B%9B%E5%AE%89%E8%A3%85%E5%92%8C%E4%BD%BF%E7%94%A8) + 环境准备完毕,点我进入 [五、安装和使用](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/DSS%E5%BF%AB%E9%80%9F%E5%AE%89%E8%A3%85%E4%BD%BF%E7%94%A8%E6%96%87%E6%A1%A3.md#%E4%BA%94%E5%AE%89%E8%A3%85%E5%92%8C%E4%BD%BF%E7%94%A8) -# 四、安装和使用 +## 五、安装和使用 -## 4.1. DataSphereStudio 后台安装: +### 5.1. DataSphereStudio 后台安装: ### a. 执行安装脚本: @@ -277,24 +384,20 @@ dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/ - install.sh脚本会询问您安装模式。 -     安装模式就是简单模式或标准模式,请根据您准备的环境情况,选择合适的安装模式,简单模式和标准模式都会检查mysql服务,标准模式还会检测Qualitis服务和Azkaban服务,如果检测失败会直接退出安装。 +     安装模式就是简单模式或标准模式,请根据您准备的环境情况,选择合适的安装模式,精简版、简单模式和标准模式都会检查mysql服务,标准模式还会检测Qualitis服务和Azkaban外部server服务,如果检测失败会直接退出安装。 + +- 安装过程中如果有很多cp命令提示您是否覆盖安装,说明您的系统为cp、mv、rm等命令配置了别名。可输入alias查看,去掉这些别名后,就不会再有大量提示。 - install.sh脚本会询问您是否需要初始化数据库并导入元数据。      因为担心用户重复执行install.sh脚本,把数据库中的用户数据清空,所以在install.sh执行时,会询问用户是否需要初始化数据库并导入元数据。      **第一次安装**必须选是。 -- install.sh脚本会询问您是否需要初始化使用[davinci](https://github.com/edp963/davinci)所依赖的库表,如果您没有安装过davinci,则需要进行初始化建表,如果您已经安装了davinci,则无需再次初始化。 -     因为担心用户会mysql中已安装好的davinci数据清空,所以在install.sh执行时,会询问用户是否需要初始化。 -     **第一次安装**必须选是。 - ### c. 是否安装成功:         通过查看控制台打印的日志信息查看是否安装成功。 -        如果有错误信息,可以查看具体报错原因。 - -        您也可以通过查看我们的[常见问题](https://github.com/WeBankFinTech/DataSphereStudio/wiki/FAQ),获取问题的解答。 +        您也可以通过查看我们的[常见问题](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch1/DSS%E5%AE%89%E8%A3%85%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98%E5%88%97%E8%A1%A8.md),获取问题的解答。 ### d. 
启动DataSphereStudio后台服务 @@ -316,7 +419,7 @@ dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/ ![Eureka](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/images/zh_CN/chapter2/quickInstallUse/quickInstall.png) -## 4.2 DataSphereStudio前端安装 +### 5.2 DataSphereStudio前端安装 ### a、部署 @@ -341,7 +444,7 @@ dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/ 添加如下内容: ``` server { - listen 8080;# 访问端口 + listen 8088;# 访问端口 server_name localhost; #charset koi8-r; #access_log /var/log/nginx/host.access.log main; @@ -387,7 +490,7 @@ server { ### 2.将前端包拷贝到对应的目录: ```/appcom/Install/DSS/FRONT; # 前端包安装目录 ``` -##### 注意: 手动安装DSS前端,则需要到DSS前端安装目录的dss/visualis文件夹下,解压visualis前端安装包。 +##### 注意: 手动安装DSS前端,则需要到DSS前端安装目录的dss/visualis文件夹下,解压visualis前端安装包,用于自动化安装visualis前端。 ### 3.启动服务 ```sudo systemctl restart nginx``` @@ -395,54 +498,10 @@ server { ### 4.谷歌浏览器访问: ```http://nginx_ip:nginx_port``` -如何详细使用DSS, 点我进入 [DSS详细使用文档](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch3/DSS_User_Manual.md) - -## 4.3、常见问题 - -(1)用户token为空 - -``` -sudo vi dss-server/conf/token.properties -``` - -添加用户 - -``` -xxx=xxx -``` - -(2)visualis执行报错 - -``` -Caused by: java.lang.Exception: /data/DSSInstall/visualis-server/bin/phantomjsis not executable! 
-``` - -下载 [driver驱动](https://phantomjs.org/download.html),把phantomjs二进制文件放入visualis-server的bin目录下即可。 - - - -(3)上传文件大小限制 - -``` -sudo vi /etc/nginx/nginx.conf -``` - -更改上传大小 - -``` -client_max_body_size 200m -``` - - (4)接口超时 +**DSS登录用户和登录密码都是部署DSS的Linux用户名,更多用户配置,详见** [Linkis LDAP](https://github.com/WeBankFinTech/Linkis/wiki/%E9%83%A8%E7%BD%B2%E5%92%8C%E7%BC%96%E8%AF%91%E9%97%AE%E9%A2%98%E6%80%BB%E7%BB%93) -``` -sudo vi /etc/nginx/conf.d/dss.conf -``` +如何详细使用DSS, 点我进入 [DSS快速使用文档](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch3/DSS_User_Manual.md) +### 5.3、常见问题 -更改接口超时时间 - -``` -proxy_read_timeout 600s -``` - +[DSS安装常见问题](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch1/DSS%E5%AE%89%E8%A3%85%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98%E5%88%97%E8%A1%A8.md) diff --git a/docs/zh_CN/ch3/DSS_User_Manual.md b/docs/zh_CN/ch3/DSS_User_Manual.md index 470ae9c7c2c85a0ac86e064d695c060af9004b27..5d91c0d02ef1df272a20505267f7a68fc7fb3866 100644 --- a/docs/zh_CN/ch3/DSS_User_Manual.md +++ b/docs/zh_CN/ch3/DSS_User_Manual.md @@ -1,3 +1,9 @@ +## 快速登录 +      为了方便用户使用,系统默认通过使用Linkis的部署用户名进行登录,比如是hadoop部署的可以直接通过 用户:hadoop,密码:hadoop(密码就是用户名)来进行登录。 首先输入前端容器地址:192.168.xx.xx:8888 接着输入用户名密码:hadoop/hadoop +![quick_start00](/images/zh_CN/chapter3/quickstart/quick_start00.png) + +__注意:__ 如果要支持多用户登录,DSS的用户登录依赖Linkis,需要在linkis-GateWay的配置里面进行配置,Linkis-GateWay默认支持LDAP。 + ## 1 功能简介        DSS作为一站式数据应用开发门户,定位为闭环涵盖数据应用的全流程,满足从数据ETL、数据研发、可视化展现、数据治理、数据输出到工作流调度的数据应用全生命周期开发场景,现已经开源的组件包括如下图所示: @@ -38,7 +44,17 @@ 3. 
工程复制:以工程的最新版本为源工程,复制出新工程,初始版本工作流内容为源工程最新版本的工作流。注意:**工程名是唯一,不可重复** ## 3工作流——workflow -### 3.1 工作流编排 +### 3.1 工作流spark节点 +       spark节点分别支持sql、pyspark、scala三种方式执行spark任务,使用时只需将节点拖拽至工作台后编写代码即可。 +### 3.2 工作流hive节点 +       hive节点支持sql方式执行hive任务,使用时只需将节点拖拽至工作台后编写hivesql代码即可。 +### 3.3 工作流python节点 +       python节点支持执行python任务,使用时只需将节点拖拽至工作台后编写python代码即可。 +### 3.4 工作流shell节点 +       shell节点支持执行shell命令或者脚本运行,使用时只需将节点拖拽至工作台后编写shell命令即可。 +### 3.5 工作流jdbc节点 +       jdbc节点支持以jdbc方式运行sql命令,使用时只需将节点拖拽至工作台后编写sql即可,**注意需要提前在linkis console管理台配置jdbc连接信息。** +### 3.6 工作流编排        当点击一个对应的工程后,既可以进入工程首页,在工程首页可以做工作流的编排。 1. 首先需要创建工作流 ![workflow01](/images/zh_CN/chapter3/manual/workflow01.png) @@ -47,7 +63,7 @@ 3. 节点支持右键功能包括,删除、依赖选择、复制等基本功能,同时数据开发节点还支持脚本关联 ![workflow03](/images/zh_CN/chapter3/manual/workflow03.png) -### 3.2 工作流节点打开 +### 3.7 工作流节点打开 节点支持双击打开: 1. 数据开发节点:点开后即可进入Scriptis进行脚本编辑 ![workflow04](/images/zh_CN/chapter3/manual/workflow04.png) @@ -58,14 +74,14 @@ 4. 数据可视化节点:跳转到对应的可视化编辑页面 ![workflow07](/images/zh_CN/chapter3/manual/workflow07.png) -### 3.3 层级切换 +### 3.8 层级切换 1. 支持多层级切换:支持快速工程切换、支持在工作流页面切换工作流、支持在单个工作流中切换节点 ![workflow08](/images/zh_CN/chapter3/manual/workflow08.png) 2. 右上脚支持多组件快速切换,在切换后进入的组件的内容都只与该工程相关,让用户更加清晰的去定义工程和业务的内容: ![functions](/images/zh_CN/chapter3/manual/functions.png) -### 3.4 参数和资源设置 +### 3.9 参数和资源设置 1. 工作流上下文信息设置,支持工作流参数、变量、代理用户等 @@ -80,26 +96,26 @@ open("flow://test.txt", encoding="utf-8") #工作流级资源文件使用flow:/ open("node://test.txt", encoding="utf-8") #节点级资源文件使用node://开头 ``` -### 3.5 工作流实时执行 +### 3.10 工作流实时执行 1. 除了功能节点中的subflow会跳过执行,连接节点会作为空节点运行,其他都支持实时执行 ![workflow11](/images/zh_CN/chapter3/manual/workflow11.png) 2. 用户编辑好工作流后点击执行就可以将工作流进行运行,您将看到实时的工作流运行起来可以看到现在运行节点的时间,同时可以右键节点打开节点的管理台去展示该节点的进度,运行结果,运行日志等。支持任务停止等功能 ![workflow12](/images/zh_CN/chapter3/manual/workflow12.png) -### 3.6 工作流调度执行 +### 3.11 工作流调度执行 1. 
DSS的工程支持发布调度,默认支持发布到Azkaban,同样DSS的调度部分做了深层次的抽象可以做到对其他的调度系统快速支持。发布前会对工作流进行解析,以确保工作流是可以调度运行的: ![workflow13](/images/zh_CN/chapter3/manual/workflow13.png) 2. 发布后即可到调度系统中进行查看,比如去Azkaban页面上进行查看: ![workflow14](/images/zh_CN/chapter3/manual/workflow14.png) 3. DSS如何对接调度系统可以参考:[]() -### 3.7 工作流版本 +### 3.12 工作流版本 1. 工作流创建完成后,具有初始版本,版本号为v000001,直接点击工作流图标时,默认打开工作流的最新版本 2. 可以查看工作流的版本,方便您进行历史版本查看: ![workflow15](/images/zh_CN/chapter3/manual/workflow15.png) -### 3.8 工作流布局修改 +### 3.13 工作流布局修改 1. 工作流格式化:当工作流节点过多,界面太乱时。可以点击节点编辑页的右上方第四个“格式化”按钮。快速美化节点界面: ![workflow16](/images/zh_CN/chapter3/manual/workflow16.png) 如果格式化后不满意,可再次点击节点编辑页的右上方第五个“恢复”按钮,恢复到之前的状态: diff --git a/docs/zh_CN/ch3/DSS_User_Tests1_Scala.md b/docs/zh_CN/ch3/DSS_User_Tests1_Scala.md new file mode 100644 index 0000000000000000000000000000000000000000..70c0b0331766ee5516f8bb625c957986d023a953 --- /dev/null +++ b/docs/zh_CN/ch3/DSS_User_Tests1_Scala.md @@ -0,0 +1,82 @@ +# DSS用户测试样例1:Scala + +DSS用户测试样例的目的是为平台新用户提供一组测试样例,用于熟悉DSS的常见操作,并验证DSS平台的正确性 + +![image-20200408211243941](../../../images/zh_CN/chapter3/tests/home.png) + +## 1.1 Spark Core(入口函数sc) + +在Scriptis中,已经默认为您注册了SparkContext,所以直接使用sc即可: + +### 1.1.1 单Value算子(Map算子为例) + +```scala +val rddMap = sc.makeRDD(Array((1,"a"),(1,"d"),(2,"b"),(3,"c")),4) +val res = rddMap.mapValues(data=>{data+"||||"}) +res.collect().foreach(data=>println(data._1+","+data._2)) +``` + +### 1.1.2 双Value算子(union算子为例) + +```scala +val rdd1 = sc.makeRDD(1 to 5) +val rdd2 = sc.makeRDD(6 to 10) +val rddCustom = rdd1.union(rdd2) +rddCustom.collect().foreach(println) +``` + +### 1.1.3 K-V算子(reduceByKey算子为例子) + +```scala +val rdd1 = sc.makeRDD(List(("female",1),("male",2),("female",3),("male",4))) +val rdd2 = rdd1.reduceByKey((x,y)=>x+y) +rdd2.collect().foreach(println) +``` + +### 1.1.4 执行算子(以上collect算子为例) + +### 1.1.5 从hdfs上读取文件并做简单执行 + +```scala +case class Person(name:String,age:String) +val file = sc.textFile("/test.txt") +val person = file.map(line=>{ + val values=line.split(",") + + 
Person(values(0),values(1)) +}) +val df = person.toDF() +df.select($"name").show() +``` + + + +## 1.2 UDF函数测试 + +### 1.2.1 函数定义 + + + +```scala +def ScalaUDF3(str: String): String = "hello, " + str + "this is a third attempt" +``` + +### 1.2.2 注册函数 + +函数-》个人函数-》右击新增spark函数=》注册方式同常规spark开发 + +​ ![img](../../../images/zh_CN/chapter3/tests/udf1.png) + +## 1.3 UDAF函数测试 + +### 1.3.1 Jar包上传 + +​ idea上开发一个求平均值的udaf函数,打成jar(wordcount)包,上传dss jar文件夹。 + +​ ![img](../../../images/zh_CN/chapter3/tests/udf2.png) + +### 1.3.2 注册函数 + +函数-》个人函数-》右击新增普通函数=》注册方式同常规spark开发 + + ![img](../../../images/zh_CN/chapter3/tests/udf-3.png) \ No newline at end of file diff --git a/docs/zh_CN/ch3/DSS_User_Tests2_Hive.md b/docs/zh_CN/ch3/DSS_User_Tests2_Hive.md new file mode 100644 index 0000000000000000000000000000000000000000..800277ca094926677c644d7c88b84ba8e9550c9f --- /dev/null +++ b/docs/zh_CN/ch3/DSS_User_Tests2_Hive.md @@ -0,0 +1,148 @@ +# DSS用户测试样例2:Hive + +DSS用户测试样例的目的是为平台新用户提供一组测试样例,用于熟悉DSS的常见操作,并验证DSS平台的正确性 + +![image-20200408211243941](../../../images/zh_CN/chapter3/tests/home.png) + +## 2.1 数仓建表 + +​ 进入“数据库”页面,点击“+”,依次输入表信息、表结构和分区信息即可创建数据库表: + +image-20200408212604929 + +​ ![img](../../../images/zh_CN/chapter3/tests/hive2.png) + +​ 通过以上流程,分别创建部门表dept、员工表emp和分区员工表emp_partition,建表语句如下: + +```sql +create external table if not exists default.dept( + deptno int, + dname string, + loc int +) +row format delimited fields terminated by '\t'; + +create external table if not exists default.emp( + empno int, + ename string, + job string, + mgr int, + hiredate string, + sal double, + comm double, + deptno int +) +row format delimited fields terminated by '\t'; + +create table if not exists emp_partition( + empno int, + ename string, + job string, + mgr int, + hiredate string, + sal double, + comm double, + deptno int +) +partitioned by (month string) +row format delimited fields terminated by '\t'; +``` + +**导入数据** + +目前需要通过后台手动批量导入数据,可以通过insert方法从页面插入数据 + +```sql +load data local inpath 
'dept.txt' into table default.dept; +load data local inpath 'emp.txt' into table default.emp; +load data local inpath 'emp1.txt' into table default.emp_partition; +load data local inpath 'emp2.txt' into table default.emp_partition; +load data local inpath 'emp2.txt' into table default.emp_partition; +``` + +其它数据按照上述语句导入,样例数据文件路径在:`examples\ch3` + +## 2.2 基本SQL语法测试 + +### 2.2.1 简单查询 + +```sql +select * from dept; +``` + +### 2.2.2 Join连接 + +```sql +select * from emp +left join dept +on emp.deptno = dept.deptno; +``` + +### 2.2.3 聚合函数 + +```sql +select dept.dname, avg(sal) as avg_salary +from emp left join dept +on emp.deptno = dept.deptno +group by dept.dname; +``` + +### 2.2.4 内置函数 + +```sql +select ename, job,sal, +rank() over(partition by job order by sal desc) sal_rank +from emp; +``` + +### 2.2.5 分区表简单查询 + +```sql +show partitions emp_partition; +select * from emp_partition where month='202001'; +``` + +### 2.2.6 分区表联合查询 + +```sql +select * from emp_partition where month='202001' +union +select * from emp_partition where month='202002' +union +select * from emp_partition where month='202003' +``` + +## 2.3 UDF函数测试 + +### 2.3.1 Jar包上传 + +进入Scriptis页面后,右键目录路径上传jar包: + +​ ![img](../../../images/zh_CN/chapter3/tests/hive3.png) + +测试样例jar包在`examples\ch3\rename.jar` + +### 2.3.2 自定义函数 + +进入“UDF函数”选项(如1),右击“个人函数”目录,选择“新增函数”: + +(截图略) + +输入函数名称、选择jar包、并填写注册格式、输入输出格式即可创建函数: + + ![img](../../../images/zh_CN/chapter3/tests/hive5.png) + +(截图略) + +获得的函数如下: + +​ ![img](../../../images/zh_CN/chapter3/tests/hive7.png) + +### 2.3.3 利用自定义函数进行SQL查询 + +完成函数注册后,可进入工作空间页面创建.hql文件使用函数: + +```sql +select deptno,ename, rename(ename) as new_name +from emp; +``` diff --git a/docs/zh_CN/ch3/DSS_User_Tests3_SparkSQL.md b/docs/zh_CN/ch3/DSS_User_Tests3_SparkSQL.md new file mode 100644 index 0000000000000000000000000000000000000000..aaf2fb44d7096e8de1b805145bd60cf43858c05f --- /dev/null +++ b/docs/zh_CN/ch3/DSS_User_Tests3_SparkSQL.md @@ -0,0 +1,61 @@ +# 
DSS用户测试样例3:SparkSQL + +DSS用户测试样例的目的是为平台新用户提供一组测试样例,用于熟悉DSS的常见操作,并验证DSS平台的正确性 + +![image-20200408211243941](../../../images/zh_CN/chapter3/tests/home.png) + +## 3.1RDD与DataFrame转换 + +### 3.1.1 RDD转为DataFrame + +```scala +case class MyList(id:Int) + +val lis = List(1,2,3,4) + +val listRdd = sc.makeRDD(lis) +import spark.implicits._ +val df = listRdd.map(value => MyList(value)).toDF() + +df.show() +``` + +### 3.1.2 DataFrame转为RDD + +```scala +case class MyList(id:Int) + +val lis = List(1,2,3,4) +val listRdd = sc.makeRDD(lis) +import spark.implicits._ +val df = listRdd.map(value => MyList(value)).toDF() +println("------------------") + +val dfToRdd = df.rdd + +dfToRdd.collect().foreach(print(_)) +``` + +## 3.2 DSL语法风格实现 + +```scala +val df = df1.union(df2) +val dfSelect = df.select($"department") +dfSelect.show() +``` + +## 3.3 SQL语法风格实现(入口函数sqlContext) + +```scala +val df = df1.union(df2) + +df.createOrReplaceTempView("dfTable") +val innerSql = """ + SELECT department + FROM dfTable + """ +val sqlDF = sqlContext.sql(innerSql) +sqlDF.show() +``` + +​ \ No newline at end of file diff --git "a/docs/zh_CN/ch4/DSS\345\267\245\347\250\213\345\217\221\345\270\203\350\260\203\345\272\246\347\263\273\347\273\237\346\236\266\346\236\204\350\256\276\350\256\241.md" "b/docs/zh_CN/ch4/DSS\345\267\245\347\250\213\345\217\221\345\270\203\350\260\203\345\272\246\347\263\273\347\273\237\346\236\266\346\236\204\350\256\276\350\256\241.md" new file mode 100644 index 0000000000000000000000000000000000000000..1dc37857d79463d5b48767c454edf0603d9dd401 --- /dev/null +++ "b/docs/zh_CN/ch4/DSS\345\267\245\347\250\213\345\217\221\345\270\203\350\260\203\345\272\246\347\263\273\347\273\237\346\236\266\346\236\204\350\256\276\350\256\241.md" @@ -0,0 +1,30 @@ +# DataSphere Studio发布调度系统架构设计 + + + +## 一、背景 + + 目前在大数据领域存在许多种批量定时调度系统,如Azkaban、Airflow、EasyScheduler等,DSS支持将设计好的DAG工作流 +发布到不同的调度系统,系统默认支持了发布到Azkaban的实现。在DSS中主要完工作流的编排设计,节点的参数设置, +脚本代码编写,图表设计等需要交互式的操作,还可以在DSS中实时执行,并调试好所有节点的可执行代码。发布到调度系统后 
+,由调度系统根据定时任务的配置,定时调度执行。 + +## 二、架构设计 + +![发布调度架构图](../../../images/zh_CN/charpter3/publish/publichtoscheduling.png) + +## 三、发布流程 + +(1)从数据库读取最新版本的工程、工作流信息,获取所有的保存在BML库工作流JSON文件。 + +(2)将上面的数据库内容,JSON文件内容分别转成DSS中的DSSProject,DSSFlow,如果存在子flow,则需要一并设置到flow中,保持原来的层级关系和依赖关系,构建好DSSProject,其中包含了工程下所有的DSSFlow。 + 一个工作流JSON包含了所有节点的定义,并存储了节点之间的依赖关系,以及工作流自身的属性信息。 + +(3)将DSSProject经过工程转换器转成SchedulerProject,转成SchedulerProject的过程中,同时完成了DSSJSONFlow到SchedulerFlow的转换,也完成了DSSNode到SchedulerNode的转换。 + +(4)使用ProjectTuning对整个SchedulerProject工程进行tuning操作,用于完成工程发布前的整体调整操作,在Azkaban的实现中主要完成了工程的路径设置和工作流的存储路径设置。 + +(5)ProjectPublishHook操作,hook可以根据不同的调度系统进行实现,且可分为发布前的hook和发布后的hook,这些都会被统一执行。 + 发布前的hook包含对工程的解析,工作流的解析,节点的解析,以及生成对应的资源文件,属性文件,节点描述文件等。这个需要根据不同的调度系统进行实现。 + +(6)发布工程,打包好经过转换、解析生成的工程目录文件,并上传到对应的调度系统。 diff --git "a/docs/zh_CN/ch4/\345\211\215\347\253\257\347\274\226\350\257\221\346\226\207\346\241\243.md" "b/docs/zh_CN/ch4/\345\211\215\347\253\257\347\274\226\350\257\221\346\226\207\346\241\243.md" new file mode 100644 index 0000000000000000000000000000000000000000..ee447703af6eba7e0527c1909a2744df6de1f864 --- /dev/null +++ "b/docs/zh_CN/ch4/\345\211\215\347\253\257\347\274\226\350\257\221\346\226\207\346\241\243.md" @@ -0,0 +1,86 @@ +# 编译文档中文版 + +## 启动流程 + +### 一、安装Node.js +将Node.js下载到电脑本地,安装即可。下载地址:[http://nodejs.cn/download/](http://nodejs.cn/download/) (建议使用最新的稳定版本) +**该步骤仅第一次使用时需要执行。** + +### 二、安装项目 +在终端命令行中执行以下指令: + +``` +git clone ${ipAddress} +cd DataSphereStudio/web +npm install +``` + +指令简介: +1. 将项目包从远程仓库拉取到电脑本地:git clone ${ipAddress} +2. 进入项目包根目录:cd DataSphereStudio/web +3. 
安装项目所需依赖:npm install + +**该步骤仅第一次使用时需要执行。** + +### 三、配置 +您需要在代码中进行一些配置,如后端接口地址,后端socket地址等,如根目录下的.env.development文件: + +``` +// 后端接口地址 +VUE_APP_MN_CONFIG_PREFIX=http://yourIp:yourPort/yourPath +// 后端socket地址 +VUE_APP_MN_CONFIG_SOCKET=/yourSocketPath +``` + +配置的具体解释可参考vue-cli官方文档:[环境变量和模式](https://cli.vuejs.org/zh/guide/mode-and-env.html#%E7%8E%AF%E5%A2%83%E5%8F%98%E9%87%8F%E5%92%8C%E6%A8%A1%E5%BC%8F) + +### 打包项目 +您可以通过在终端命令行执行以下指令对项目进行打包,生成压缩后的代码: + +``` +npm run build +``` + +该指令成功执行后,项目根目录下会出现一个名叫 “dist” 的文件夹,该文件夹即为打包好的代码。您可以直接将该文件夹放进您的静态服务器中。 + +### 运行项目 +如果您想在本地浏览器上运行该项目并且改动代码查看效果,需要在终端命令行中执行以下指令: + +``` +npm run serve +``` + +在浏览器中(建议Chrome浏览器)通过链接访问应用:[http://localhost:8080/](http://localhost:8080/) . +当您使用该方式运行项目时,您对代码的改动产生的效果,会动态体现在浏览器上。 + +**注意:因为项目采用前后端分离开发,所以在本地浏览器上运行时,需要对浏览器进行设置跨域才能访问后端接口:** + +比如chrome浏览器: +windows系统下的配置方式: +1. 关闭所有的chrome浏览器。 +2. 新建一个chrome快捷方式,右键“属性”,“快捷方式”选项卡里选择“目标”,添加  --args --disable-web-security --user-data-dir=C:\MyChromeDevUserData +3. 通过快捷方式打开chrome浏览器 +mac系统下的配置方式: +在终端命令行执行以下命令(需要替换路径中的yourname,若还不生效请检查您机器上MyChromeDevUserData文件夹的位置并将路径复制到下面指令的“--user-data-dir=”后面) + +``` +open -n /Applications/Google\ Chrome.app/ --args --disable-web-security --user-data-dir=/Users/yourname/MyChromeDevUserData/ +``` + + +### 常见问题 + +#### npm install无法成功 +如果遇到该情况,可以使用国内的淘宝npm镜像: + +``` +npm install -g cnpm --registry=https://registry.npm.taobao.org +``` + +接着,通过执行以下指令代替npm install指令 + +``` +cnpm install +``` + +注意,项目启动和打包时,仍然可以使用npm run build和npm run serve指令 \ No newline at end of file diff --git "a/docs/zh_CN/ch4/\345\246\202\344\275\225\346\216\245\345\205\245\350\260\203\345\272\246\347\263\273\347\273\237Azkaban.md" "b/docs/zh_CN/ch4/\345\246\202\344\275\225\346\216\245\345\205\245\350\260\203\345\272\246\347\263\273\347\273\237Azkaban.md" new file mode 100644 index 0000000000000000000000000000000000000000..a6e307784e4e74eea88b3cba8fa97dc6b762c3d1 --- /dev/null +++ 
"b/docs/zh_CN/ch4/\345\246\202\344\275\225\346\216\245\345\205\245\350\260\203\345\272\246\347\263\273\347\273\237Azkaban.md" @@ -0,0 +1,42 @@ +## DSS如何手动安装接入调度系统Azkaban +           Azkaban目前是作为一个SchedulerAppJoint在DSS-SERVER中使用,通过AzkabanSchedulerAppJoint实现了Azkaban的工程服务和安全认证服务, + 主要提供了工程的创建、更新、发布、删除,以及安全认证服务相关的代理登录,Cookie保存等。 + + **前提条件:用户已经安装部署好社区版Azkaban-3.X以上版本。**[如何安装Azkaban](https://github.com/azkaban/azkaban) +#### **步骤:** +**1、Azkaban APPJoint安装及配置** + + 进入DSS安装包解压目录,复制share/appjoints/schedulis/dss-azkaban-appjoint.zip到DSS安装目录的dss-appjoints/schedulis文件夹下,解压即可。 + +**2、修改dss-server配置目录中linkis.properties配置,增加如下参数:** + +``` +wds.dss.appjoint.scheduler.azkaban.address=http://IP地址:端口 #Azkaban的http地址 +wds.dss.appjoint.scheduler.project.store.dir=file:///appcom/tmp/wds/scheduler #Azkaban发布包临时存储目录 +``` + +**3、数据库中dss_application表修改** + + 修改DSS数据库dss_application表中schedulis记录行,修改url的连接IP地址和端口,保持与Azkaban Server实际地址一致。 + 示例SQL: + +``` +INSERT INTO `dss_application` (`id`, `name`, `url`, `is_user_need_init`, `level`, `user_init_url`, `exists_project_service`, `project_url`, `enhance_json`, `if_iframe`, `homepage_url`, `redirect_url`) VALUES (NULL, 'schedulis', NULL, '0', '1', NULL, '0', NULL, NULL, '1', NULL, NULL); + +UPDATE `dss_application` SET url = 'http://IP地址:端口', project_url = 'http://IP地址:端口/manager?project=${projectName}',homepage_url = 'http://IP地址:端口/homepage' WHERE `name` in + ('schedulis'); +``` + +**4、Azkaban JobType插件安装** + +您还需为Azkaban安装一个JobType插件: linkis-jobtype,请点击[Linkis jobType安装文档](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch2/Azkaban_LinkisJobType_Deployment_Manual.md) + +**5、用户token配置** + +##### 请在DSS-SERVER服务conf目录的token.properties文件中,配置用户名和密码信息,关联DSS和Azkaban用户,因为用户通过DSS创建工程后,要发布到azkaban,用户必须保持一致。示例: + +``` + 用户名=密码 +``` + +说明:由于每个公司都有各自的登录认证系统,这里只提供简单实现,用户可以实现SchedulerSecurityService定义自己的登录认证方法。azkaban用户管理可参考[Azkaban-3.x 
用户管理](https://cloud.tencent.com/developer/article/1492734)及[官网](https://azkaban.readthedocs.io/en/latest/userManager.html) diff --git "a/docs/zh_CN/ch4/\347\254\254\344\270\211\346\226\271\347\263\273\347\273\237\346\216\245\345\205\245DSS\346\214\207\345\215\227.md" "b/docs/zh_CN/ch4/\347\254\254\344\270\211\346\226\271\347\263\273\347\273\237\346\216\245\345\205\245DSS\346\214\207\345\215\227.md" index 73173626eb6de9a37ba75a3fd1aa4689521649c7..690682aabe2a6e3fe74111757bbbaf47a69fbd9e 100644 --- "a/docs/zh_CN/ch4/\347\254\254\344\270\211\346\226\271\347\263\273\347\273\237\346\216\245\345\205\245DSS\346\214\207\345\215\227.md" +++ "b/docs/zh_CN/ch4/\347\254\254\344\270\211\346\226\271\347\263\273\347\273\237\346\216\245\345\205\245DSS\346\214\207\345\215\227.md" @@ -31,28 +31,142 @@         NodeService是用来解决用户在DSS提交的任务在第三方系统生成相应的任务的问题。用户如果在DSS系统的工作流中新建了一个工作流节点并进行任务的编辑,第三方系统需要同步感知到 - 4.getNodeExecution -        NodeExecution接口是用来将任务提交到第三方系统进行执行的接口,NodeExecution接口有支持短时间任务的NodeExecution和支持长时间任务的LongTermNodeExecution。一般短时间任务,如邮件发送等,可以直接实现NodeExecution接口,并重写execute方法,DSS系统同步等待任务结束。另外的长时间任务,如数据质量检测等,可以实现LongTermNodeExecution接口,并重写submit方法,返回一个NodeExecutionAction,DSS系统通过这个NodeExecutionAction可以向第三方系统获取任务的日志、状态等。 +        NodeExecution接口是用来将任务提交到第三方系统进行执行的接口,NodeExecution +接口有支持短时间任务的NodeExecution和支持长时间任务的LongTermNodeExecution。一般短时间任务,如邮件发送等,可以直接实现NodeExecution接口,并重写execute方法,DSS系统同步等待任务结束。另外的长时间任务,如数据质量检测等,可以实现LongTermNodeExecution接口,并重写submit方法,返回一个NodeExecutionAction,DSS系统通过这个NodeExecutionAction可以向第三方系统获取任务的日志、状态等。 #### 3.第三方系统接入DSS的实现(以Visualis为例) -        Visualis是微众银行WeDataSphere开源的一款商业BI工具,DSS集成Visualis系统之后可以获得数据可视化的能力。Visualis接入DSS系统的代码在DSS项目中已经同步开源,下面将以开源代码为例,对步骤进行罗列分析。 +        Visualis是微众银行WeDataSphere开源的一款商业BI工具,DSS集成Visualis系统之后可以获得数据可视化的能力。 +Visualis接入DSS系统的代码在DSS项目中已经同步开源,下面将以开源代码为例,对步骤进行罗列分析。 Visualis接入的DSS系统的步骤如下: **3.1.Visualis实现AppJoint接口** -        Visualis实现的 AppJoint接口的实现类是VisualisAppjoint。查看VisualisAppjoint的代码可知,它在init方法时候,初始化了自己实现的SecurityService、 
NodeService and NodeExecution.
+        The implementation class of the AppJoint interface in Visualis is VisualisAppjoint. As its code shows, its init method
+instantiates its own implementations of SecurityService, NodeService and NodeExecution.
+```java
+    public void init(String baseUrl, Map params) {
+        securityService = new VisualisSecurityService();
+        securityService.setBaseUrl(baseUrl);
+        nodeExecution = new VisualisNodeExecution();
+        nodeExecution.setBaseUrl(baseUrl);
+        nodeService = new VisualisNodeService();
+        nodeService.setBaseUrl(baseUrl);
+    }
+```

**3.2. Visualis implements the SecurityService interface**

-        The class with which Visualis implements the SecurityService interface is VisualisSecurityService, which overrides the login method. To support authorized login, Visualis provides a token; the DSS gateway authorizes this token, thereby achieving user authentication.
+        The class with which Visualis implements the SecurityService interface is VisualisSecurityService,
+which overrides the login method. To support authorized login, Visualis provides a token; the DSS gateway authorizes this token, thereby achieving user authentication.
+
+```java
+public class VisualisSecurityService extends AppJointUrlImpl implements SecurityService {
+    @Override
+    public Session login(String user) throws AppJointErrorException {
+        VisualisSession visualisSession = new VisualisSession();
+        visualisSession.setUser(user);
+        visualisSession.getParameters().put("Token-User",user);
+        visualisSession.getParameters().put("Token-Code","WS-AUTH");
+        return visualisSession;
+    }
+
+    @Override
+    public void logout(String user) {
+
+    }
+}
+```

**3.3. Visualis implements the NodeService interface**

-        The class with which Visualis implements the NodeService interface is VisualisNodeService, which overrides the createNode, deleteNode and updateNode methods; these three methods synchronously create the task metadata in the third-party system. For example, the createNode method calls a Visualis HTTP interface to create a Visualis task under the same project in the Visualis system.
+        The class with which Visualis implements the NodeService interface is VisualisNodeService, which overrides the createNode,
+deleteNode and updateNode methods; these three methods synchronously create the task metadata in the third-party system. For example, the createNode method calls a Visualis HTTP interface to create a Visualis task under the same project in the Visualis system.
+
+```java
+    @Override
+    public Map createNode(Session session, AppJointNode node,
+        Map requestBody) throws AppJointErrorException {
+        if (DisplayNodeService.getNodeType().equals(node.getNodeType())) {
+            return DisplayNodeService.createNode(session, getBaseUrl(), String.valueOf(node.getProjectId()), node.getNodeType(), requestBody);
+        } else if (DashboardNodeService.getNodeType().equals(node.getNodeType())) {
+            return DashboardNodeService.createNode(session, getBaseUrl(), String.valueOf(node.getProjectId()), node.getNodeType(), requestBody);
+        } else {
+            throw new AppJointErrorException(42002, "cannot recognize the nodeType " + node.getNodeType());
+        }
+    }
+
+    @Override
+    public void deleteNode(Session session, AppJointNode node) throws AppJointErrorException {
+        if (DisplayNodeService.getNodeType().equals(node.getNodeType())) {
+            DisplayNodeService.deleteNode(session, getBaseUrl(), String.valueOf(node.getProjectId()), node.getNodeType(), node.getJobContent());
+        } else if (DashboardNodeService.getNodeType().equals(node.getNodeType())) {
+            DashboardNodeService.deleteNode(session, getBaseUrl(), String.valueOf(node.getProjectId()), node.getNodeType(), node.getJobContent());
+        } else {
+            throw new AppJointErrorException(42002, "cannot recognize the nodeType " + node.getNodeType());
+        }
+    }
+
+    @Override
+    public Map updateNode(Session session, AppJointNode node,
+        Map requestBody) throws AppJointErrorException {
+        if (DisplayNodeService.getNodeType().equals(node.getNodeType())) {
+            return DisplayNodeService.updateNode(session, getBaseUrl(), node.getProjectId(), node.getNodeType(), requestBody);
+        } else if (DashboardNodeService.getNodeType().equals(node.getNodeType())) {
+            return DashboardNodeService.updateNode(session, getBaseUrl(), node.getProjectId(), node.getNodeType(), requestBody);
+        } else {
+            throw new AppJointErrorException(42002, "cannot recognize the nodeType " + node.getNodeType());
+        }
+    }
+```

**3.4. Visualis implements the NodeExecution interface**

-        The class with which Visualis implements the NodeExecution interface is VisualisNodeExecution, which overrides the execute method. The method takes two parameters, Node and NodeContext; from the NodeContext we can obtain the user, the DSS gateway address, and the token verified by the gateway. With these, we can build an HTTP request, send it to the third-party system Visualis, and obtain its response. NodeContext provides methods for writing result sets; since Visualis result sets are generally displayed as pictures, at the end of the execute method Visualis obtains from the nodeContext a PictureResultSetWriter that supports picture writing, and writes the result set with it.
+        The class with which Visualis implements the NodeExecution interface is VisualisNodeExecution, which overrides the execute method.
+The method takes two parameters, Node and NodeContext; from the NodeContext we can obtain the user, the DSS gateway address, and the token verified by the gateway.
+With these, we can build an HTTP request, send it to the third-party system Visualis, and obtain its response. NodeContext provides methods for writing result sets;
+since Visualis result sets are generally displayed as pictures, at the end of the execute method Visualis obtains from the nodeContext a PictureResultSetWriter that supports picture writing, and writes the result set with it.
+```scala
+    override def execute(node: AppJointNode, nodeContext: NodeContext, session: Session): NodeExecutionResponse = node match {
+        case commonAppJointNode: CommonAppJointNode =>
+            val appJointResponse = new CompletedNodeExecutionResponse()
+            val idMap = commonAppJointNode.getJobContent
+            val id = idMap.values().iterator().next().toString
+            val url = if(commonAppJointNode.getNodeType.toLowerCase.contains(DISPLAY)) getDisplayPreviewUrl(nodeContext.getGatewayUrl, id)
+            else if(commonAppJointNode.getNodeType.toLowerCase.contains(DASHBOARD)) getDashboardPreviewUrl(nodeContext.getGatewayUrl, id)
+            else {
+                appJointResponse.setIsSucceed(false)
+                appJointResponse.setErrorMsg("不支持的appJoint类型:" + node.getNodeType)
+                return appJointResponse
+            }
+            var response = ""
+            val headers = nodeContext.getTokenHeader(nodeContext.getUser)
+            nodeContext.appendLog(LogUtils.generateInfo(s"Ready to download preview picture from $url."))
+            Utils.tryCatch(download(url, null, headers.toMap,
+                input => Utils.tryFinally{
+                    val os = new ByteArrayOutputStream()
+                    IOUtils.copy(input, os)
+                    response = new String(Base64.getEncoder.encode(os.toByteArray))
+                    //response = IOUtils.toString(input, ServerConfiguration.BDP_SERVER_ENCODING.getValue)
+                }(IOUtils.closeQuietly(input)))){ t =>
+                val errException = new ErrorException(70063, "failed to do visualis request")
+                errException.initCause(t)
+                appJointResponse.setException(errException)
+                appJointResponse.setIsSucceed(false)
+                appJointResponse.setErrorMsg(s"用户${nodeContext.getUser}请求Visualis失败!URL为: " + url)
+                return appJointResponse
+            }
+            nodeContext.appendLog(LogUtils.generateInfo("Preview picture downloaded, now ready to write results."))
+            val imagesBytes = response
+            val resultSetWriter = nodeContext.createPictureResultSetWriter()
+            Utils.tryFinally{
+                resultSetWriter.addMetaData(new LineMetaData())
+                resultSetWriter.addRecord(new LineRecord(imagesBytes))
+            }(IOUtils.closeQuietly(resultSetWriter))
+            appJointResponse.setIsSucceed(true)
+            appJointResponse
+    }
+```

**3.5. Updating the database contents (dss-application module)**

@@ -66,7 +180,7 @@ The steps for integrating Visualis into DSS are as follows:
| url | 10 | e.g. http://127.0.0.1:8080 |
| is_user_need_init | whether user initialization is required | default: no |
| user_init_url | user initialization url | default: empty |
-| exists_project_service | whether the system has its own projectService; if so, an appjoint implementing projectService0 must be written | |
+| exists_project_service | whether the system has its own projectService; if so, an appjoint implementing projectService must be written | |
| enhance_json | enhanced json, passed in as a map when the appjoint is initialized | |
| homepage_url | homepage url of the integrated system | |
| direct_url | redirect url of the integrated system | |
@@ -97,11 +211,11 @@ The steps for integrating Visualis into DSS are as follows:

**3.6. Front-end changes**

- 3.6.1 Add the node type
-Modify the src/js/service/nodeType.js file to add the Qualitis node type.
+Modify the src/js/service/nodeType.js file to add the Visualis node type.
- 3.6.2 Add the node icon
Copy the node icon to the src/js/module/process/images/ path; currently only the SVG format is supported.
- 3.6.3 Add the node configuration
-Modify the src/js/module/process/shape.js file to add the Qualitis node configuration.
+Modify the src/js/module/process/shape.js file to add the Visualis node configuration.
- 3.6.4 Modify the homepage node single-click event
Modify the src/js/module/process/index.vue file to add the node single-click event and its handling logic.
- 3.6.5 Modify the workflow node double-click event
@@ -109,7 +223,13 @@ The steps for integrating Visualis into DSS are as follows:

**3.7. Compile into a jar and place it in the designated location**

-        Once the interfaces above are implemented, an AppJoint is complete. After packaging, it must be placed in the designated location.
+The jar must be placed in both the dss-server and linkis-appjoint-entrance microservices. Taking linkis-appjoint-entrance as an example (dss-server is handled the same way),
+there is an appjoints directory at the same level as lib under linkis-appjoint-entrance; its layout is shown in Figure 3-3.

![Example of the appjoints directory](/images/zh_CN/chapter4/appjoints.png)

Figure 3-3 Example of the appjoints directory

-        Create a visualis directory under the appjoints directory. The visualis directory must contain a lib directory, which holds the jar built from the VisualisAppJoint implementation. Any dependency jars that the DSS system does not already ship must also be placed in this lib directory; for example, the sendemail AppJoint needs mail-sending dependencies, so those dependencies are placed in lib together with the built jar. In addition, configuration parameters needed by the AppJoint can be placed in appjoints.properties; the AppJointLoader provided by DSS reads these parameters into a Map, which is passed in when the AppJoint's init method is called.
+        Create a visualis directory under the appjoints directory.
+The visualis directory must contain a lib directory, which holds the jar built from the VisualisAppJoint implementation.
+Any dependency jars that the DSS system does not already ship must also be placed in this lib directory; for example, the sendemail AppJoint needs mail-sending dependencies,
+so those dependencies are placed in lib together with the built jar.
+In addition, configuration parameters needed by the AppJoint can be placed in appjoints.properties; the AppJointLoader provided by DSS reads these parameters into a Map, which is passed in when the AppJoint's init method is called.
diff --git a/dss-appjoint-auth/pom.xml b/dss-appjoint-auth/pom.xml
index cbf74d63163c5b5d453e59215ade45baa47bffe8..3a368b8666b98b078f44a0c4501d8dc7b646a0be 100644
--- a/dss-appjoint-auth/pom.xml
+++ b/dss-appjoint-auth/pom.xml
@@ -22,7 +22,7 @@
     <parent>
         <artifactId>dss</artifactId>
         <groupId>com.webank.wedatasphere.dss</groupId>
-        <version>0.5.0</version>
+        <version>0.9.1</version>
     </parent>
     <modelVersion>4.0.0</modelVersion>
@@ -33,6 +33,16 @@
         <dependency>
             <artifactId>linkis-gateway-httpclient-support</artifactId>
             <version>${linkis.version}</version>
         </dependency>
+        <dependency>
+            <groupId>com.webank.wedatasphere.linkis</groupId>
+            <artifactId>linkis-common</artifactId>
+            <version>${linkis.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>javax.servlet</groupId>
+            <artifactId>javax.servlet-api</artifactId>
+            <version>3.1.0</version>
+        </dependency>
diff --git a/dss-appjoint-auth/src/main/scala/com/webank/wedatasphere/dss/appjoint/auth/impl/AppJointAuthImpl.scala b/dss-appjoint-auth/src/main/scala/com/webank/wedatasphere/dss/appjoint/auth/impl/AppJointAuthImpl.scala
index 8414615149472474159e3557f4bb2d512c5caaf7..15dd7e30727d5c0aaf7ecc56a24292689523ad26 100644
--- a/dss-appjoint-auth/src/main/scala/com/webank/wedatasphere/dss/appjoint/auth/impl/AppJointAuthImpl.scala
+++ b/dss-appjoint-auth/src/main/scala/com/webank/wedatasphere/dss/appjoint/auth/impl/AppJointAuthImpl.scala
@@ -24,8 +24,9 @@
 import com.webank.wedatasphere.dss.appjoint.auth.{AppJointAuth, RedirectMsg}
 import com.webank.wedatasphere.linkis.common.utils.Logging
 import com.webank.wedatasphere.linkis.httpclient.dws.DWSHttpClient
 import com.webank.wedatasphere.linkis.httpclient.dws.config.DWSClientConfigBuilder
-import javax.servlet.http.{Cookie, HttpServletRequest} +import javax.servlet.http.HttpServletRequest import org.apache.commons.io.IOUtils +import org.apache.http.impl.cookie.BasicClientCookie import scala.collection.JavaConversions._ @@ -38,7 +39,8 @@ class AppJointAuthImpl private() extends AppJointAuth with Logging { private def getBaseUrl(dssUrl: String): String = { val uri = new URI(dssUrl) - uri.getScheme + "://" + uri.getHost + ":" + uri.getPort + val dssPort = if(uri.getPort != -1) uri.getPort else 80 + uri.getScheme + "://" + uri.getHost + ":" + dssPort } protected def getDWSClient(dssUrl: String): DWSHttpClient = { @@ -67,7 +69,7 @@ class AppJointAuthImpl private() extends AppJointAuth with Logging { val index = cookie.indexOf("=") val key = cookie.substring(0, index).trim val value = cookie.substring(index + 1).trim - userInfoAction.addCookie(new Cookie(key, value)) + userInfoAction.addCookie(new BasicClientCookie(key, value)) } val redirectMsg = new RedirectMsgImpl redirectMsg.setRedirectUrl(request.getParameter(AppJointAuthImpl.REDIRECT_KEY)) diff --git a/dss-appjoint-core/pom.xml b/dss-appjoint-core/pom.xml index 70a534b66ca9a057bf3de7df2a2b047a413c18a9..7140b33656c25b07c08e4ceb4e3263e6d50e55c4 100644 --- a/dss-appjoint-core/pom.xml +++ b/dss-appjoint-core/pom.xml @@ -22,7 +22,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 @@ -50,6 +50,12 @@ dss-common ${dss.version} + + + com.webank.wedatasphere.linkis + linkis-httpclient + ${linkis.version} + diff --git a/dss-appjoint-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/execution/scheduler/ListenerEventBusNodeExecutionScheduler.scala b/dss-appjoint-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/execution/scheduler/ListenerEventBusNodeExecutionScheduler.scala index a355270ccb1b44ea94ce98f3dd5b2329ed029f12..51d167eaff1110b8657f62ac32574e529e0954e4 100644 --- 
a/dss-appjoint-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/execution/scheduler/ListenerEventBusNodeExecutionScheduler.scala +++ b/dss-appjoint-core/src/main/scala/com/webank/wedatasphere/dss/appjoint/execution/scheduler/ListenerEventBusNodeExecutionScheduler.scala @@ -17,6 +17,8 @@ package com.webank.wedatasphere.dss.appjoint.execution.scheduler +import java.util.concurrent.ArrayBlockingQueue + import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException import com.webank.wedatasphere.dss.appjoint.execution.common.{AsyncNodeExecutionResponse, CompletedNodeExecutionResponse, LongTermNodeExecutionAction} import com.webank.wedatasphere.dss.appjoint.execution.conf.NodeExecutionConfiguration._ @@ -55,7 +57,7 @@ class ListenerEventBusNodeExecutionScheduler(eventQueueCapacity: Int, name: Stri val field1 = ru.typeOf[ListenerEventBus[_, _]].decl(ru.TermName("eventQueue")).asMethod val result = listenerEventBusClass.reflectMethod(field1) result() match { - case queue: BlockingLoopArray[AsyncNodeExecutionResponseEvent] => queue + case queue: ArrayBlockingQueue[AsyncNodeExecutionResponseEvent] => queue } } @@ -104,18 +106,18 @@ class ListenerEventBusNodeExecutionScheduler(eventQueueCapacity: Int, name: Stri protected def addEvent(event: AsyncNodeExecutionResponseEvent): Unit = synchronized { listenerEventBus.post(event) - event.getResponse.getAction match { - case longTermAction: LongTermNodeExecutionAction => - longTermAction.setSchedulerId(eventQueue.max) - case _ => - } +// event.getResponse.getAction match { +// case longTermAction: LongTermNodeExecutionAction => +// longTermAction.setSchedulerId(eventQueue.max) +// case _ => +// } } - override def removeAsyncResponse(action: LongTermNodeExecutionAction): Unit = - getAsyncResponse(action).setCompleted(true) + override def removeAsyncResponse(action: LongTermNodeExecutionAction): Unit = { + + } - override def getAsyncResponse(action: LongTermNodeExecutionAction): AsyncNodeExecutionResponse = - 
eventQueue.get(action.getSchedulerId).getResponse + override def getAsyncResponse(action: LongTermNodeExecutionAction): AsyncNodeExecutionResponse = null override def start(): Unit = listenerEventBus.start() diff --git a/dss-appjoint-loader/pom.xml b/dss-appjoint-loader/pom.xml index c3045655cc6fbd6d9fcb3749e6e3f133234bfc46..99ab6ac8be11a5a922c8f1627e2e32ad380fba02 100644 --- a/dss-appjoint-loader/pom.xml +++ b/dss-appjoint-loader/pom.xml @@ -22,12 +22,12 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 dss-appjoint-loader - 0.5.0 + 0.9.1 diff --git a/dss-application/pom.xml b/dss-application/pom.xml index 2b6fd94fc477be0e380b9837743676e4f5acd088..8be0e043489410b559bcd0f94fa73b43dc99493c 100644 --- a/dss-application/pom.xml +++ b/dss-application/pom.xml @@ -23,7 +23,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 dss-application @@ -47,6 +47,12 @@ dss-appjoint-loader ${dss.version} + + org.apache.htrace + htrace-core + 3.2.0-incubating + compile + diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/conf/ApplicationConf.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/conf/ApplicationConf.java index 5eca714b6eccc307c910d0eaaafb368c307611c5..1194eb427176e5d5012b78fd85c3396bbb9e15a9 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/conf/ApplicationConf.java +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/conf/ApplicationConf.java @@ -26,4 +26,27 @@ import com.webank.wedatasphere.linkis.common.conf.CommonVars; public class ApplicationConf { public static final CommonVars FAQ = CommonVars.apply("wds.linkis.application.dws.params",""); + + public static final String SUPER_USER_NAME = CommonVars.apply("wds.linkis.super.user.name","").getValue(); + public static final String WORKSPACE_USER_ROOT_PATH = CommonVars.apply("wds.linkis.workspace.user.root.path","").getValue(); + public static final String HDFS_USER_ROOT_PATH = 
CommonVars.apply("wds.linkis.hdfs.user.root.path","").getValue(); + public static final String RESULT_SET_ROOT_PATH = CommonVars.apply("wds.linkis.result.set.root.path","").getValue(); + public static final String WDS_SCHEDULER_PATH = CommonVars.apply("wds.linkis.scheduler.path","").getValue(); + public static final String WDS_USER_PATH = CommonVars.apply("wds.linkis.user.path","hdfs:///user").getValue(); + public static final String DSS_INSTALL_DIR = CommonVars.apply("wds.linkis.dss.install.dir","").getValue(); + public static final String AZKABAN_INSTALL_DIR = CommonVars.apply("wds.linkis.azkaban.install.dir","").getValue(); + + + + + + + + + + + + + + } diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/DSSApplicationUserMapper.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/DSSApplicationUserMapper.java new file mode 100644 index 0000000000000000000000000000000000000000..f2f59e06b152fc4766d336081d75d54030621c36 --- /dev/null +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/DSSApplicationUserMapper.java @@ -0,0 +1,31 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.application.dao; + +import com.webank.wedatasphere.dss.application.entity.DSSUser; + +/** + * Created by chaogefeng on 2019/10/11. 
+ */ +public interface DSSApplicationUserMapper { + DSSUser getUserByName(String username); + + void registerDssUser(DSSUser userDb); + + void updateUserFirstLogin(Long userId); +} diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/impl/DSSUserMapper.xml b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/impl/DSSApplicationUserMapper.xml similarity index 94% rename from dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/impl/DSSUserMapper.xml rename to dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/impl/DSSApplicationUserMapper.xml index ff731d7c6456f15c6e437f4bb361f0549f089b94..040cb3eb02934804bc2dfd1c932f090d82e77f8b 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/impl/DSSUserMapper.xml +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/impl/DSSApplicationUserMapper.xml @@ -19,7 +19,7 @@ - + id,`username`,`name`,`is_first_login` @@ -29,7 +29,7 @@ select * from dss_user where `username` = #{username} - + INSERT INTO dss_user() VALUES (#{id},#{username},#{name},#{isFirstLogin}) diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/entity/DSSUser.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/entity/DSSUser.java index 43323288a449430d67fe3494302175a355eda43e..73dd36aca0488a1ababd43fb3ba4f692f1ec91a1 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/entity/DSSUser.java +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/entity/DSSUser.java @@ -25,6 +25,14 @@ public class DSSUser { private String username; private String name; private Boolean isFirstLogin; + private boolean isSuperUser = false; + + public boolean getIsSuperUser() { + return isSuperUser; + } + public void setIsSuperUser(boolean isSuperUser) { + this.isSuperUser = isSuperUser; + } public Long getId() { return id; 
diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/entity/WorkSpacePath.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/entity/WorkSpacePath.java new file mode 100644 index 0000000000000000000000000000000000000000..4f2b258915a9245175488ae75be3ed351b741a26 --- /dev/null +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/entity/WorkSpacePath.java @@ -0,0 +1,59 @@ +package com.webank.wedatasphere.dss.application.entity; + +public class WorkSpacePath { + private String workspaceRootPath; + private String hdfsRootPath; + private String resultRootPath; + private String schedulerPath; + private String userPath; + + public String getUserPath() { + return userPath; + } + + public void setUserPath(String userPath) { + this.userPath = userPath; + } + + + + public String getWorkspaceRootPath() { + return workspaceRootPath; + } + + public void setWorkspaceRootPath(String workspaceRootPath) { + this.workspaceRootPath = workspaceRootPath; + } + + public String getHdfsRootPath() { + return hdfsRootPath; + } + + public void setHdfsRootPath(String hdfsRootPath) { + this.hdfsRootPath = hdfsRootPath; + } + + public String getResultRootPath() { + return resultRootPath; + } + + public void setResultRootPath(String resultRootPath) { + this.resultRootPath = resultRootPath; + } + + public String getSchedulerPath() { + return schedulerPath; + } + + public void setSchedulerPath(String schedulerPath) { + this.schedulerPath = schedulerPath; + } + + + + + + + + +} diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/handler/UserFirstLoginHandler.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/handler/UserFirstLoginHandler.java index 5c5a93c7755ddfd49f84a55076a40f8d11cc2932..f902c5ba3ddea435b3c855a7f67a772363c05e90 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/handler/UserFirstLoginHandler.java +++ 
b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/handler/UserFirstLoginHandler.java @@ -18,7 +18,7 @@ package com.webank.wedatasphere.dss.application.handler; import com.webank.wedatasphere.dss.application.entity.DSSUser; -import com.webank.wedatasphere.dss.application.service.DSSUserService; +import com.webank.wedatasphere.dss.application.service.DSSApplicationUserService; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; @@ -33,7 +33,7 @@ public class UserFirstLoginHandler implements Handler { private Logger logger = LoggerFactory.getLogger(this.getClass()); @Autowired - private DSSUserService dssUserService; + private DSSApplicationUserService dssApplicationUserService; @Override public int getOrder() { @@ -44,7 +44,7 @@ public class UserFirstLoginHandler implements Handler { public void handle(DSSUser user) { logger.info("UserFirstLoginHandler:"); synchronized (user.getUsername().intern()){ - DSSUser userDb = dssUserService.getUserByName(user.getUsername()); + DSSUser userDb = dssApplicationUserService.getUserByName(user.getUsername()); if(userDb == null){ logger.info("User first enter dss, insert table dss_user"); userDb = new DSSUser(); @@ -52,7 +52,7 @@ public class UserFirstLoginHandler implements Handler { userDb.setName(user.getName()); userDb.setFirstLogin(true); userDb.setId(user.getId()); - dssUserService.registerDSSUser(userDb); + dssApplicationUserService.registerDssUser(userDb); } // TODO: 2019/11/29 update firstLogin user = userDb; diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/restful/ApplicationRestfulApi.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/restful/ApplicationRestfulApi.java index c0fa339629c52798a6fd02c914cfaa83f205bb7a..10552b25f2c639edc6445721d6a49003ec6a2149 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/restful/ApplicationRestfulApi.java 
+++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/restful/ApplicationRestfulApi.java @@ -17,12 +17,14 @@ package com.webank.wedatasphere.dss.application.restful; +import com.webank.wedatasphere.dss.application.conf.ApplicationConf; import com.webank.wedatasphere.dss.application.entity.Application; import com.webank.wedatasphere.dss.application.entity.DSSUser; import com.webank.wedatasphere.dss.application.entity.DSSUserVO; +import com.webank.wedatasphere.dss.application.entity.WorkSpacePath; import com.webank.wedatasphere.dss.application.handler.ApplicationHandlerChain; import com.webank.wedatasphere.dss.application.service.ApplicationService; -import com.webank.wedatasphere.dss.application.service.DSSUserService; +import com.webank.wedatasphere.dss.application.service.DSSApplicationUserService; import com.webank.wedatasphere.dss.application.util.ApplicationUtils; import com.webank.wedatasphere.linkis.server.Message; import com.webank.wedatasphere.linkis.server.security.SecurityFilter; @@ -37,6 +39,8 @@ import javax.ws.rs.Produces; import javax.ws.rs.core.Context; import javax.ws.rs.core.MediaType; import javax.ws.rs.core.Response; +import java.util.ArrayList; +import java.util.HashMap; import java.util.List; /** @@ -51,7 +55,7 @@ public class ApplicationRestfulApi { @Autowired private ApplicationService applicationService; @Autowired - private DSSUserService dataworkisUserService; + private DSSApplicationUserService dataworkisUserService; @Autowired private ApplicationHandlerChain applicationHandlerChain; @@ -70,7 +74,31 @@ public class ApplicationRestfulApi { } DSSUser dssUser = dataworkisUserService.getUserByName(username); DSSUserVO dataworkisUserVO = new DSSUserVO(); + String superUserName = ApplicationConf.SUPER_USER_NAME; + if(username.equals(superUserName)){ + dssUser.setIsSuperUser(true); + }else{ + dssUser.setIsSuperUser(false); + } + dataworkisUserVO.setBasic(dssUser); return 
Message.messageToResponse(Message.ok().data("applications",applicationList).data("userInfo",dataworkisUserVO)); } + + + @GET + @Path("paths") + public Response getWorkSpace(@Context HttpServletRequest req) throws Exception { + WorkSpacePath workSpacePath = new WorkSpacePath(); + workSpacePath.setHdfsRootPath(ApplicationConf.HDFS_USER_ROOT_PATH); + workSpacePath.setResultRootPath(ApplicationConf.RESULT_SET_ROOT_PATH); + workSpacePath.setSchedulerPath(ApplicationConf.WDS_SCHEDULER_PATH); + workSpacePath.setWorkspaceRootPath(ApplicationConf.WORKSPACE_USER_ROOT_PATH); + workSpacePath.setUserPath(ApplicationConf.WDS_USER_PATH); + ArrayList> responses = ApplicationUtils.convertToMap(workSpacePath); + return Message.messageToResponse(Message.ok().data("paths",responses) + .data("dssInstallDir", ApplicationConf.DSS_INSTALL_DIR) + .data("azkakanDir", ApplicationConf.AZKABAN_INSTALL_DIR)); + + } } diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/DSSApplicationUserService.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/DSSApplicationUserService.java new file mode 100644 index 0000000000000000000000000000000000000000..a798dc4357275a6d6b8ccf359f01bea10babcf8c --- /dev/null +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/DSSApplicationUserService.java @@ -0,0 +1,32 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.application.service; + +import com.webank.wedatasphere.dss.application.entity.DSSUser; + +/** + * Created by chaogefeng on 2019/10/11. + */ +public interface DSSApplicationUserService { + + DSSUser getUserByName(String username); + + void registerDssUser(DSSUser userDb); + + void updateUserFirstLogin(Long id); +} diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/ApplicationServiceImpl.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/ApplicationServiceImpl.java index 36fcde3eb848506fd3fc6a40a4ede38e939b7d86..271acac867e6452519c8e5bf96e2a5cc89499e8b 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/ApplicationServiceImpl.java +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/ApplicationServiceImpl.java @@ -65,7 +65,12 @@ public class ApplicationServiceImpl implements ApplicationService { @Override public AppJoint getAppjoint(String nodeType) throws AppJointErrorException { - Application application = getApplicationbyNodeType(nodeType); + Application application; + if(nodeType.equals("schedulis")){ + application = getApplication(nodeType); + }else { + application = getApplicationbyNodeType(nodeType); + } AppJoint appJoint = null; try { appJoint = loadAppjoint(application); diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/DSSUserServiceImpl.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/DSSApplicationUserServiceImpl.java similarity index 66% rename from dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/DSSUserServiceImpl.java rename to dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/DSSApplicationUserServiceImpl.java index 
71db0e5df1f8518631b4ae01db0a76f64fb79c7e..c8fae7a6319455d06fba518c8231f175e2fc030b 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/DSSUserServiceImpl.java +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/DSSApplicationUserServiceImpl.java @@ -17,9 +17,9 @@ package com.webank.wedatasphere.dss.application.service.impl; -import com.webank.wedatasphere.dss.application.dao.DSSUserMapper; +import com.webank.wedatasphere.dss.application.dao.DSSApplicationUserMapper; import com.webank.wedatasphere.dss.application.entity.DSSUser; -import com.webank.wedatasphere.dss.application.service.DSSUserService; +import com.webank.wedatasphere.dss.application.service.DSSApplicationUserService; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Service; @@ -27,23 +27,23 @@ import org.springframework.stereotype.Service; * Created by chaogefeng on 2019/10/11. */ @Service -public class DSSUserServiceImpl implements DSSUserService { +public class DSSApplicationUserServiceImpl implements DSSApplicationUserService { @Autowired - private DSSUserMapper dssUserMapper; + private DSSApplicationUserMapper dssApplicationUserMapper; @Override public DSSUser getUserByName(String username) { - return dssUserMapper.getUserByName(username); + return dssApplicationUserMapper.getUserByName(username); } @Override - public void registerDSSUser(DSSUser userDb) { - dssUserMapper.registerDSSUser( userDb); + public void registerDssUser(DSSUser userDb) { + dssApplicationUserMapper.registerDssUser( userDb); } @Override public void updateUserFirstLogin(Long id) { - dssUserMapper.updateUserFirstLogin(id); + dssApplicationUserMapper.updateUserFirstLogin(id); } } diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/LinkisUserServiceImpl.java 
b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/LinkisUserServiceImpl.java index 60f11bdf3d2bfb3af9f8777233676c0cc1c885fe..f40f286eccb56a2c4883c07fd2cd8799e2463f47 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/LinkisUserServiceImpl.java +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/impl/LinkisUserServiceImpl.java @@ -16,7 +16,7 @@ */ package com.webank.wedatasphere.dss.application.service.impl; -import com.webank.wedatasphere.dss.application.dao.DSSUserMapper; +import com.webank.wedatasphere.dss.application.dao.DSSApplicationUserMapper; import com.webank.wedatasphere.dss.application.dao.LinkisUserMapper; import com.webank.wedatasphere.dss.application.entity.DSSUser; import com.webank.wedatasphere.dss.application.entity.LinkisUser; @@ -33,7 +33,7 @@ public class LinkisUserServiceImpl implements LinkisUserService { @Autowired private LinkisUserMapper linkisUserMapper; @Autowired - private DSSUserMapper dssUserMapper; + private DSSApplicationUserMapper dssApplicationUserMapper; @Override public LinkisUser getUserByName(String username) { @@ -55,6 +55,6 @@ public class LinkisUserServiceImpl implements LinkisUserService { dssUser.setName(userDb.getName()); dssUser.setUsername(userDb.getUserName()); dssUser.setFirstLogin(userDb.getFirstLogin()); - dssUserMapper.registerDSSUser(dssUser); + dssApplicationUserMapper.registerDssUser(dssUser); } } diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/util/ApplicationUtils.java b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/util/ApplicationUtils.java index 4f0fabd1b9d319e1e5fa34219cece63243bb293e..af65bd32271b51091dbd25c872bc27272927a9de 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/util/ApplicationUtils.java +++ b/dss-application/src/main/java/com/webank/wedatasphere/dss/application/util/ApplicationUtils.java @@ 
-17,12 +17,16 @@ package com.webank.wedatasphere.dss.application.util; +import com.webank.wedatasphere.dss.application.entity.WorkSpacePath; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import java.io.UnsupportedEncodingException; +import java.lang.reflect.Field; import java.net.URLEncoder; +import java.util.ArrayList; +import java.util.HashMap; /** * Created by chaogefeng on 2019/11/20. @@ -46,8 +50,36 @@ public class ApplicationUtils { return String.format(REDIRECT_FORMAT,redirectUrl,URLEndoder(url)); } - public static void main(String[] args) throws DSSErrorException { - System.out.println(redirectUrlFormat("http://127.0..0.1:8090/qualitis/api/v1/redirect","http://127.0..0.1:8090/#/projects/list?id={projectId}&flow=true")); + + public static ArrayList> convertToMap(Object obj) + throws Exception { + + ArrayList> mapList = new ArrayList(); + Field[] fields = obj.getClass().getDeclaredFields(); + for (int i = 0, len = fields.length; i < len; i++) { + String varName = fields[i].getName(); + boolean accessFlag = fields[i].isAccessible(); + fields[i].setAccessible(true); + HashMap map = new HashMap<>(); + Object o = fields[i].get(obj); + if (o != null){ + map.put("key",varName); + map.put("value",o.toString()); + mapList.add(map); + } + + fields[i].setAccessible(accessFlag); + } + + return mapList; + } + + public static void main(String[] args) throws Exception { +// System.out.println(redirectUrlFormat("http://127.0..0.1:8090/qualitis/api/v1/redirect","http://127.0..0.1:8090/#/projects/list?id={projectId}&flow=true")); + + WorkSpacePath workSpacePath = new WorkSpacePath(); + workSpacePath.setWorkspaceRootPath("/"); + System.out.println(convertToMap(workSpacePath)); } } diff --git a/dss-azkaban-scheduler-appjoint/pom.xml b/dss-azkaban-scheduler-appjoint/pom.xml index 56215f52ffcbaa471892dc6eec42aaa970137832..4f67a636b8f732d68543f58bf6f04ce5e45a7708 100644 --- 
a/dss-azkaban-scheduler-appjoint/pom.xml +++ b/dss-azkaban-scheduler-appjoint/pom.xml @@ -22,22 +22,24 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 dss-azkaban-scheduler-appjoint - - org.springframework.boot - spring-boot-autoconfigure - 2.0.3.RELEASE - + + + + + com.webank.wedatasphere.dss dss-scheduler-appjoint-core ${dss.version} + provided + true org.apache.httpcomponents @@ -48,6 +50,8 @@ com.webank.wedatasphere.dss dss-application ${dss.version} + provided + true @@ -66,6 +70,35 @@ org.apache.maven.plugins maven-jar-plugin + + org.apache.maven.plugins + maven-assembly-plugin + 2.3 + false + + + make-assembly + package + + single + + + + src/main/assembly/distribution.xml + + + + + + false + dss-azkaban-appjoint + false + false + + src/main/assembly/distribution.xml + + + diff --git a/dss-azkaban-scheduler-appjoint/src/main/assembly/distribution.xml b/dss-azkaban-scheduler-appjoint/src/main/assembly/distribution.xml new file mode 100644 index 0000000000000000000000000000000000000000..2898561240c9e9f235ce199252a4fcfa6834b80e --- /dev/null +++ b/dss-azkaban-scheduler-appjoint/src/main/assembly/distribution.xml @@ -0,0 +1,136 @@ + + + + dss-azkaban-appjoint + + zip + + true + schedulis + + + + + + lib + true + true + false + true + true + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + ${basedir}/src/main/resources + + appjoint.properties + + 0777 + / + unix + + + + ${basedir}/src/main/resources + + log4j.properties + log4j2.xml + + 0777 + conf + unix + + + + + + diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/AzkabanSchedulerAppJoint.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/AzkabanSchedulerAppJoint.java index 9a062d5c95cd8cd56f2b95fb6d56f9dcac3594cf..8f6e4bb8a9ad8176ae76731e697a12fdce5f63f1 100644 --- 
a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/AzkabanSchedulerAppJoint.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/AzkabanSchedulerAppJoint.java @@ -2,33 +2,34 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.azkaban; import com.webank.wedatasphere.dss.appjoint.scheduler.SchedulerAppJoint; import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.conf.AzkabanConf; +import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.hooks.LinkisAzkabanProjectPublishHook; +import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.parser.AzkabanProjectParser; +import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.tuning.AzkabanProjectTuning; +import com.webank.wedatasphere.dss.appjoint.scheduler.hooks.ProjectPublishHook; +import com.webank.wedatasphere.dss.appjoint.scheduler.parser.ProjectParser; import com.webank.wedatasphere.dss.appjoint.scheduler.service.SchedulerProjectService; import com.webank.wedatasphere.dss.appjoint.scheduler.service.SchedulerSecurityService; import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.service.AzkabanProjectService; import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.service.AzkabanSecurityService; -import com.webank.wedatasphere.dss.application.entity.Application; +import com.webank.wedatasphere.dss.appjoint.scheduler.tuning.ProjectTuning; +import com.webank.wedatasphere.dss.appjoint.service.AppJointUrlImpl; import com.webank.wedatasphere.dss.application.service.ApplicationService; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Component; import org.springframework.util.StringUtils; - -import javax.annotation.PostConstruct; import java.io.IOException; import java.util.Map; /** * Created by cooperyang on 2019/9/16. 
*/ -@Component -public final class AzkabanSchedulerAppJoint implements SchedulerAppJoint { + +public final class AzkabanSchedulerAppJoint extends AppJointUrlImpl implements SchedulerAppJoint { private static final Logger LOGGER = LoggerFactory.getLogger(AzkabanSchedulerAppJoint.class); private SchedulerSecurityService securityService; private SchedulerProjectService projectService; - @Autowired private ApplicationService applicationService; @Override @@ -36,25 +37,17 @@ public final class AzkabanSchedulerAppJoint implements SchedulerAppJoint { return "schedulis"; } - @PostConstruct - public void beforeInit(){ - Application schedulis = applicationService.getApplication("schedulis"); - String basicUrl = null; - if(schedulis != null){ - basicUrl = schedulis.getUrl(); - } + + @Override + public void init(String basicUrl, Map params) { LOGGER.info("read schedulerAppJoint url from db{}",basicUrl); if(StringUtils.isEmpty(basicUrl)){ basicUrl = AzkabanConf.AZKABAN_BASE_URL.getValue(); LOGGER.warn("basic url in db is empty,read it from conf{}",basicUrl); } - init(basicUrl,null); - } - - @Override - public void init(String basicUrl, Map params) { securityService = new AzkabanSecurityService(); securityService.setBaseUrl(basicUrl); + projectService = new AzkabanProjectService(); projectService.setBaseUrl(basicUrl); } @@ -64,6 +57,22 @@ public final class AzkabanSchedulerAppJoint implements SchedulerAppJoint { return this.securityService; } + @Override + public ProjectParser getProjectParser() { + return new AzkabanProjectParser(); + } + + @Override + public ProjectTuning getProjectTuning() { + return new AzkabanProjectTuning(); + } + + @Override + public ProjectPublishHook[] getProjectPublishHooks() { + ProjectPublishHook[] projectPublishHooks = {new LinkisAzkabanProjectPublishHook()}; + return projectPublishHooks; + } + @Override public SchedulerProjectService getProjectService() { return this.projectService; diff --git 
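The `AzkabanSchedulerAppJoint` hunk above replaces Spring's `@Component`/`@Autowired`/`@PostConstruct` wiring with an explicit `init(basicUrl, params)` lifecycle call, falling back to `AzkabanConf.AZKABAN_BASE_URL` when the URL read from the DB is empty. A minimal sketch of that pattern, with illustrative names and URLs (not the real DSS API):

```java
import java.util.Map;

// Sketch of the plugin-style lifecycle this PR moves to: the framework hands the
// AppJoint its base URL, and the AppJoint falls back to a configured default when
// that URL is empty. CONF_DEFAULT_URL stands in for AzkabanConf.AZKABAN_BASE_URL.
public class SchedulerAppJointSketch {
    static final String CONF_DEFAULT_URL = "http://localhost:8081";

    private String baseUrl;

    public void init(String basicUrl, Map<String, Object> params) {
        if (basicUrl == null || basicUrl.isEmpty()) {
            basicUrl = CONF_DEFAULT_URL; // same fallback the real init performs
        }
        this.baseUrl = basicUrl;
        // ...construct securityService / projectService against baseUrl here
    }

    public String getBaseUrl() {
        return baseUrl;
    }
}
```

The payoff of this change is that the AppJoint no longer needs a Spring context: it can be packaged as a plain zip (see the new `maven-assembly-plugin` configuration) and instantiated reflectively by the host.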
a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/conf/AzkabanConf.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/conf/AzkabanConf.java index 40fe989b5ded0f0b0fe3449753684618f5f717fe..49ea9948ad282f57abdb62ac2305c39fd39e0f79 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/conf/AzkabanConf.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/conf/AzkabanConf.java @@ -10,4 +10,5 @@ public class AzkabanConf { public static final CommonVars AZKABAN_BASE_URL = CommonVars.apply("wds.dss.appjoint.scheduler.azkaban.address", ""); public static final CommonVars DEFAULT_STORE_PATH = CommonVars.apply("wds.dss.appjoint.scheduler.project.store.dir", "/appcom/tmp/wds/dss"); + public static final CommonVars AZKABAN_LOGIN_PWD = CommonVars.apply("wds.dss.appjoint.scheduler.azkaban.login.passwd", "password"); } diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanFlowPublishHook.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanFlowPublishHook.java index 813c30e2688c9cd778c2e7072ebb52c3f87faa5e..73cec2164c9c3b34c73d59ff8503ecd12b78c4d2 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanFlowPublishHook.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanFlowPublishHook.java @@ -13,9 +13,6 @@ import org.apache.commons.io.FileUtils; import org.apache.commons.io.IOUtils; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Component; - import 
java.io.File; import java.io.FileOutputStream; import java.util.List; @@ -24,11 +21,15 @@ import java.util.Map; /** * Created by allenlliu on 2019/9/20. */ -@Component + public class LinkisAzkabanFlowPublishHook extends AbstractFlowPublishHook { private static final Logger LOGGER = LoggerFactory.getLogger(LinkisAzkabanFlowPublishHook.class); + public LinkisAzkabanFlowPublishHook(){ + NodePublishHook[] nodePublishHooks = {new LinkisAzkabanNodePublishHook()}; + setNodeHooks(nodePublishHooks); + } @Override public void prePublish(SchedulerFlow flow) throws DSSErrorException { @@ -102,7 +103,7 @@ public class LinkisAzkabanFlowPublishHook extends AbstractFlowPublishHook { super.postPublish(flow); } - @Autowired + @Override public void setNodeHooks(NodePublishHook[] nodePublishHooks) { super.setNodeHooks(nodePublishHooks); diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanNodePublishHook.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanNodePublishHook.java index 1e34802485a80ebef57262e91764d9a402621003..702275f40c966583a94476c1958d3c166adadc21 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanNodePublishHook.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanNodePublishHook.java @@ -13,21 +13,20 @@ import org.apache.commons.io.FileUtils; import org.apache.commons.io.IOUtils; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Component; - import java.io.File; import java.io.FileOutputStream; import java.util.List; -@Component + public class LinkisAzkabanNodePublishHook extends AbstractNodePublishHook { private static final Logger LOGGER = 
LoggerFactory.getLogger(LinkisAzkabanNodePublishHook.class); - - @Autowired private LinkisJobConverter linkisJobConverter; + public LinkisAzkabanNodePublishHook(){ + this.linkisJobConverter = new LinkisJobConverter(); + } + @Override public void prePublish(SchedulerNode schedulerNode) throws DSSErrorException { writeNodeTojobLocal(schedulerNode); @@ -60,7 +59,7 @@ public class LinkisAzkabanNodePublishHook extends AbstractNodePublishHook { } private void writeNodeResourcesToLocal(SchedulerNode schedulerNode) throws DSSErrorException { - List nodeResources = schedulerNode.getDWSNode().getResources(); + List nodeResources = schedulerNode.getDssNode().getResources(); if(nodeResources == null || nodeResources.isEmpty()) {return;} FileOutputStream os = null; try { diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanProjectPublishHook.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanProjectPublishHook.java index 6e9b164c1fa2e976573786b201600f6b6399d710..40effab61b6954979e43f15cebb38ac784ffbc07 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanProjectPublishHook.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/hooks/LinkisAzkabanProjectPublishHook.java @@ -14,9 +14,6 @@ import org.apache.commons.io.FileUtils; import org.apache.commons.io.IOUtils; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Component; - import java.io.File; import java.io.FileOutputStream; import java.util.ArrayList; @@ -25,13 +22,16 @@ import java.util.List; /** * Created by cooperyang on 2019/9/18 */ -@Component + public class LinkisAzkabanProjectPublishHook extends AbstractProjectPublishHook { 
private static final Logger LOGGER = LoggerFactory.getLogger( LinkisAzkabanProjectPublishHook.class); + public LinkisAzkabanProjectPublishHook(){ + FlowPublishHook[] flowPublishHooks = {new LinkisAzkabanFlowPublishHook()}; + setFlowPublishHooks(flowPublishHooks); + } @Override - @Autowired public void setFlowPublishHooks(FlowPublishHook[] flowPublishHooks) { super.setFlowPublishHooks(flowPublishHooks); } @@ -75,7 +75,7 @@ public class LinkisAzkabanProjectPublishHook extends AbstractProjectPublishHook } private void writeProjectResourcesToLocal(AzkabanSchedulerProject publishProject)throws DSSErrorException { - List resources = publishProject.getDWSProject().getProjectResources(); + List resources = publishProject.getDssProject().getProjectResources(); FileOutputStream os = null; try { String storePath = publishProject.getStorePath(); diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/linkisjob/AzkabanSubFlowJobTuning.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/linkisjob/AzkabanSubFlowJobTuning.java index d7028cceaf6d446d05b29ff942d911a7d4091200..4b31aa9dd4cbc819251faa6c888859e679b79378 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/linkisjob/AzkabanSubFlowJobTuning.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/linkisjob/AzkabanSubFlowJobTuning.java @@ -1,11 +1,11 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.linkisjob; -import org.springframework.stereotype.Component; + /** * Created by cooperyang on 2019/11/1. 
*/ -@Component + public class AzkabanSubFlowJobTuning implements LinkisJobTuning { @Override diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/linkisjob/LinkisJobConverter.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/linkisjob/LinkisJobConverter.java index 6f497c8cd343a3c5694fb7afb4a15454b2ee40e9..ef6f3f3b1c7e084ea9c62b0a7ee4a7499abd69c2 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/linkisjob/LinkisJobConverter.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/linkisjob/LinkisJobConverter.java @@ -7,9 +7,6 @@ import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.entity.LinkisAzkab import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.entity.LinkisAzkabanShareNode; import com.webank.wedatasphere.dss.appjoint.scheduler.constant.SchedulerAppJointConstant; import org.apache.commons.lang.StringUtils; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Component; - import java.util.Arrays; import java.util.HashMap; import java.util.List; @@ -18,10 +15,13 @@ import java.util.Map; /** * Created by cooperyang on 2019/11/1. 
*/ -@Component + public class LinkisJobConverter { - @Autowired + public LinkisJobConverter(){ + LinkisJobTuning[] linkisJobTunings = {new AzkabanSubFlowJobTuning()}; + this.linkisJobTunings = linkisJobTunings; + } private LinkisJobTuning[] linkisJobTunings; public String conversion(LinkisAzkabanSchedulerNode schedulerNode){ @@ -79,7 +79,7 @@ public class LinkisJobConverter { } private void convertDependencies(LinkisAzkabanSchedulerNode schedulerNode,LinkisJob job){ - List dependencys = schedulerNode.getDWSNode().getDependencys(); + List dependencys = schedulerNode.getDssNode().getDependencys(); if(dependencys != null && !dependencys.isEmpty()) { StringBuilder dependencies = new StringBuilder(); dependencys.forEach(d ->dependencies.append(d + ",")); @@ -88,12 +88,12 @@ public class LinkisJobConverter { } private void convertProxyUser(LinkisAzkabanSchedulerNode schedulerNode,LinkisJob job){ - String userProxy = schedulerNode.getDWSNode().getUserProxy(); + String userProxy = schedulerNode.getDssNode().getUserProxy(); if(!StringUtils.isEmpty(userProxy)) job.setProxyUser(userProxy); } private void convertConfiguration(LinkisAzkabanSchedulerNode schedulerNode,LinkisJob job){ - Map params = schedulerNode.getDWSNode().getParams(); + Map params = schedulerNode.getDssNode().getParams(); if (params != null && !params.isEmpty()) { Object configuration = params.get("configuration"); String confprefix = "node.conf."; @@ -103,7 +103,7 @@ public class LinkisJobConverter { } private void convertJobCommand(LinkisAzkabanSchedulerNode schedulerNode,LinkisJob job){ - Map jobContent = schedulerNode.getDWSNode().getJobContent(); + Map jobContent = schedulerNode.getDssNode().getJobContent(); if(jobContent != null) { jobContent.remove("jobParams"); job.setCommand(new Gson().toJson(jobContent)); diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/AzkabanFlowParser.java 
b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/AzkabanFlowParser.java index 57934278dceb25e98893faa5738911058797d1c0..0fe73508373b997b9197f973837d4d4ce6fe68e7 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/AzkabanFlowParser.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/AzkabanFlowParser.java @@ -3,27 +3,34 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.parser.AbstractFlowParser; import com.webank.wedatasphere.dss.appjoint.scheduler.parser.NodeParser; import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.entity.AzkabanSchedulerFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSJSONFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSJSONFlow; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerFlow; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Component; +import java.util.ArrayList; + -@Component public class AzkabanFlowParser extends AbstractFlowParser { + public AzkabanFlowParser(){ + ArrayList list = new ArrayList<>(); + list.add(new LinkisAzkabanNodeParser()); + list.add(new LinkisAzkabanSendEmailNodeParser()); + NodeParser[] nodeParsers =new NodeParser[list.size()]; + list.toArray(nodeParsers); + setNodeParsers(nodeParsers); + } + @Override protected SchedulerFlow createSchedulerFlow() { return new AzkabanSchedulerFlow(); } @Override - @Autowired public void setNodeParsers(NodeParser[] nodeParsers) { super.setNodeParsers(nodeParsers); } @Override - public Boolean ifFlowCanParse(DWSJSONFlow flow) { + public Boolean ifFlowCanParse(DSSJSONFlow flow) { return true; } diff --git 
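The `LinkisJobConverter` changes above turn a node's upstream `dependencys` list into a job property for the generated Azkaban `.job` file. The collapsed diff shows the list being appended comma-separated; a sketch of that conversion, assuming the Azkaban flow 1.0 `dependencies=` property format and using `String.join` in place of the loop-and-append seen in the diff:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the node -> .job conversion idea from LinkisJobConverter: upstream
// dependencies become one comma-separated "dependencies" property line.
public class JobPropsDemo {

    public static String dependenciesLine(List<String> dependencys) {
        if (dependencys == null || dependencys.isEmpty()) {
            return null; // no property emitted for a node with no upstreams
        }
        return "dependencies=" + String.join(",", dependencys);
    }

    public static void main(String[] args) {
        System.out.println(dependenciesLine(Arrays.asList("node_1", "node_2")));
    }
}
```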
a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/AzkabanProjectParser.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/AzkabanProjectParser.java index 14952e465deb0495b71596d3e248c25e662b572b..3895d7c68753f95c5765f3dad9362fb3715f4e8a 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/AzkabanProjectParser.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/AzkabanProjectParser.java @@ -6,15 +6,17 @@ import com.webank.wedatasphere.dss.appjoint.scheduler.parser.AbstractProjectPars import com.webank.wedatasphere.dss.appjoint.scheduler.parser.FlowParser; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Component; - /** * Created by allenlliu on 2019/9/16. 
*/ -@Component + public class AzkabanProjectParser extends AbstractProjectParser { + public AzkabanProjectParser(){ + FlowParser[] flowParsers = {new AzkabanFlowParser()}; + setFlowParsers(flowParsers); + } + private static final Logger logger = LoggerFactory.getLogger(AzkabanProjectParser.class); @Override @@ -23,7 +25,6 @@ public class AzkabanProjectParser extends AbstractProjectParser { } @Override - @Autowired public void setFlowParsers(FlowParser[] flowParsers) { super.setFlowParsers(flowParsers); } diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/LinkisAzkabanNodeParser.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/LinkisAzkabanNodeParser.java index c6084b66de84bff880f0d8fdebfc2fdaa7338a07..e9c8b86a90974b4280f91a04859995c4b79dc5cb 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/LinkisAzkabanNodeParser.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/LinkisAzkabanNodeParser.java @@ -1,27 +1,25 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.entity.LinkisAzkabanSchedulerNode; -import com.webank.wedatasphere.dss.common.entity.node.DWSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerNode; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.stereotype.Component; -@Component public class LinkisAzkabanNodeParser extends AzkabanNodeParser { private static final Logger LOGGER = LoggerFactory.getLogger(LinkisAzkabanNodeParser.class); @Override - public SchedulerNode parseNode(DWSNode dwsNode) { + public SchedulerNode parseNode(DSSNode dssNode) { LinkisAzkabanSchedulerNode schedulerNode 
= new LinkisAzkabanSchedulerNode(); - schedulerNode.setDWSNode(dwsNode); + schedulerNode.setDssNode(dssNode); return schedulerNode; } @Override - public Boolean ifNodeCanParse(DWSNode dwsNode) { + public Boolean ifNodeCanParse(DSSNode dssNode) { //预留 return true; } diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/LinkisAzkabanSendEmailNodeParser.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/LinkisAzkabanSendEmailNodeParser.java index a46dd0be879050d912b437231e022c8af882f9f1..723b35f7e848d196da8f725d8848c3bce372c8da 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/LinkisAzkabanSendEmailNodeParser.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/parser/LinkisAzkabanSendEmailNodeParser.java @@ -5,9 +5,9 @@ import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.entity.LinkisAzkab import com.webank.wedatasphere.dss.appjoint.scheduler.entity.ReadNode; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerNode; import com.webank.wedatasphere.dss.appjoint.scheduler.parser.SendEmailNodeParser; -import org.springframework.stereotype.Component; -@Component + + public class LinkisAzkabanSendEmailNodeParser extends SendEmailNodeParser { @Override diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/service/AzkabanProjectService.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/service/AzkabanProjectService.java index 2f2f98e4d04bf7828d0b5dda8875f4f5a3d7232b..eb536b14b7f27e780c01c456ae950aacc4be4ce5 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/service/AzkabanProjectService.java +++ 
b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/service/AzkabanProjectService.java @@ -80,11 +80,11 @@ public final class AzkabanProjectService extends AppJointUrlImpl implements Sche params.add(new BasicNameValuePair("name", project.getName())); params.add(new BasicNameValuePair("description", project.getDescription())); HttpPost httpPost = new HttpPost(projectUrl); - httpPost.addHeader(HTTP.CONTENT_ENCODING, "UTF-8"); + httpPost.addHeader(HTTP.CONTENT_ENCODING, HTTP.IDENTITY_CODING); CookieStore cookieStore = new BasicCookieStore(); cookieStore.addCookie(session.getCookies()[0]); - HttpEntity entity = EntityBuilder.create().setContentEncoding("UTF-8"). - setContentType(ContentType.create("application/x-www-form-urlencoded", Consts.UTF_8)) + HttpEntity entity = EntityBuilder.create() + .setContentType(ContentType.create("application/x-www-form-urlencoded", Consts.UTF_8)) .setParameters(params).build(); httpPost.setEntity(entity); CloseableHttpClient httpClient = null; diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/service/AzkabanSecurityService.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/service/AzkabanSecurityService.java index 5e8d0bcdc6a0725b78f075f4fd03f81d9bd5df49..b24fe94e7654dbbbdffc19dec7d148b55b8c841e 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/service/AzkabanSecurityService.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/service/AzkabanSecurityService.java @@ -1,12 +1,14 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.service; import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; +import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.conf.AzkabanConf; import 
com.webank.wedatasphere.dss.appjoint.scheduler.service.SchedulerSecurityService; import com.webank.wedatasphere.dss.appjoint.service.AppJointUrlImpl; import com.webank.wedatasphere.dss.appjoint.service.session.Session; import com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.constant.AzkabanConstant; import com.webank.wedatasphere.linkis.common.utils.Utils; import org.apache.commons.io.IOUtils; +import org.apache.http.HttpEntity; import org.apache.http.NameValuePair; import org.apache.http.client.CookieStore; import org.apache.http.client.entity.UrlEncodedFormEntity; @@ -18,6 +20,7 @@ import org.apache.http.impl.client.BasicCookieStore; import org.apache.http.impl.client.CloseableHttpClient; import org.apache.http.impl.client.HttpClients; import org.apache.http.message.BasicNameValuePair; +import org.apache.http.util.EntityUtils; import org.slf4j.Logger; import org.slf4j.LoggerFactory; @@ -40,23 +43,34 @@ public final class AzkabanSecurityService extends AppJointUrlImpl implements Sch private ConcurrentHashMap sessionCache = new ConcurrentHashMap<>(); private String securityUrl; private static final String USER_NAME_KEY = "username"; - private static final String USER_TOKEN_KEY = "userpwd"; + private static final String USER_TOKEN_KEY = AzkabanConf.AZKABAN_LOGIN_PWD.getValue(); private static final String SESSION_ID_KEY = "azkaban.browser.session.id"; private static Properties userToken ; static { Utils.defaultScheduler().scheduleAtFixedRate(()->{ - LOGGER.info("开始读取用户token文件"); + LOGGER.info("load azkaban-user.xml"); Properties properties = new Properties(); try { properties.load(AzkabanSecurityService.class.getClassLoader().getResourceAsStream(AzkabanConstant.TOKEN_FILE_NAME)); userToken = properties; } catch (IOException e) { - LOGGER.error("读取文件失败:",e); + LOGGER.error("load error:",e); } },0,10, TimeUnit.MINUTES); } + public void reloadToken(){ + LOGGER.info("reload azkaban-user.xml"); + Properties properties = new Properties(); + try { + 
properties.load(AzkabanSecurityService.class.getClassLoader().getResourceAsStream(AzkabanConstant.TOKEN_FILE_NAME)); + userToken = properties; + } catch (IOException e) { + LOGGER.error("reload error:",e); + } + } + @Override public void setBaseUrl(String baseUrl) { this.securityUrl = baseUrl + "/checkin"; @@ -83,7 +97,7 @@ public final class AzkabanSecurityService extends AppJointUrlImpl implements Sch } } - private Session getSession(String user, String token) throws IOException { + private Session getSession(String user, String token) throws IOException, AppJointErrorException { HttpPost httpPost = new HttpPost(securityUrl); List params = new ArrayList<>(); params.add(new BasicNameValuePair(USER_NAME_KEY, user)); @@ -94,17 +108,24 @@ public final class AzkabanSecurityService extends AppJointUrlImpl implements Sch CloseableHttpClient httpClient = null; CloseableHttpResponse response = null; HttpClientContext context; + String responseContent; try { httpClient = HttpClients.custom().setDefaultCookieStore(cookieStore).build(); context = HttpClientContext.create(); response = httpClient.execute(httpPost, context); + HttpEntity entity = response.getEntity(); + responseContent = EntityUtils.toString(entity,"utf-8"); + LOGGER.info("Get azkaban response code is "+ response.getStatusLine().getStatusCode()+",response: "+responseContent); + if(response.getStatusLine().getStatusCode() != 200){ + throw new AppJointErrorException(90041, responseContent); + } } finally { IOUtils.closeQuietly(response); IOUtils.closeQuietly(httpClient); } List cookies = context.getCookieStore().getCookies(); Optional session = cookies.stream().filter(this::findSessionId).map(this::cookieToSession).findFirst(); - return session.orElseThrow(() -> new IllegalAccessError("azkaban登录失败:无此用户")); + return session.orElseThrow(() -> new AppJointErrorException(90041,"Get azkaban session is null : "+ responseContent)); } private boolean findSessionId(Cookie cookie) { diff --git 
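The `AzkabanSecurityService` changes above re-read the user token properties file every 10 minutes and add an on-demand `reloadToken()`, so newly added users are picked up without a restart. The reload idiom — load into a fresh `Properties` and swap the reference — can be sketched as follows, with an in-memory stream standing in for the classpath token file:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

// Sketch of the token-store reload in AzkabanSecurityService: build the new
// Properties fully, then publish it with a single volatile write, so readers
// never observe a half-loaded table.
public class TokenReloadDemo {
    private volatile Properties userToken = new Properties();

    public void reload(byte[] fileContent) throws IOException {
        Properties fresh = new Properties();
        fresh.load(new ByteArrayInputStream(fileContent));
        userToken = fresh; // atomic reference swap
    }

    public String tokenFor(String user) {
        return userToken.getProperty(user);
    }

    public static void main(String[] args) throws IOException {
        TokenReloadDemo demo = new TokenReloadDemo();
        demo.reload("alice=secret1\nbob=secret2\n".getBytes(StandardCharsets.UTF_8));
        System.out.println(demo.tokenFor("alice"));
    }
}
```

The real service drives this from `Utils.defaultScheduler().scheduleAtFixedRate(...)`; any periodic executor works, as long as the swap stays a single reference assignment.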
a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/AzkabanProjectTuning.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/AzkabanProjectTuning.java index d930a513fd22a762741d543c66e2523c51e63ba8..b4cec7a03edb2f8d4ada61c4b671319b92d96f3f 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/AzkabanProjectTuning.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/AzkabanProjectTuning.java @@ -7,20 +7,26 @@ import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerFlow; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerProject; import com.webank.wedatasphere.dss.appjoint.scheduler.tuning.AbstractProjectTuning; import com.webank.wedatasphere.dss.appjoint.scheduler.tuning.FlowTuning; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Component; - import java.io.File; import java.text.SimpleDateFormat; +import java.util.ArrayList; import java.util.Date; import java.util.List; /** * Created by cooperyang on 2019/9/26. 
*/ -@Component + public class AzkabanProjectTuning extends AbstractProjectTuning { + public AzkabanProjectTuning(){ + ArrayList list = new ArrayList<>(); + list.add(new LinkisAzkabanFlowTuning()); + list.add(new LinkisShareNodeFlowTuning()); + FlowTuning[] flowTunings =new FlowTuning[list.size()]; + setFlowTunings(list.toArray(flowTunings)); + } + @Override public SchedulerProject tuningSchedulerProject(SchedulerProject schedulerProject) { if(ifProjectCanTuning(schedulerProject)){ @@ -39,7 +45,7 @@ public class AzkabanProjectTuning extends AbstractProjectTuning { azkabanSchedulerFlow.setStorePath(projectStorePath + File.separator + azkabanSchedulerFlow.getName()); } - @Autowired + @Override public void setFlowTunings(FlowTuning[] flowTunings) { super.setFlowTunings(flowTunings); @@ -54,7 +60,7 @@ public class AzkabanProjectTuning extends AbstractProjectTuning { SimpleDateFormat dateFormat = new SimpleDateFormat(AzkabanSchedulerProject.DATE_FORMAT); Date date = new Date(); String dataStr = dateFormat.format(date); - String userName = azkabanSchedulerProject.getDWSProject().getUserName(); + String userName = azkabanSchedulerProject.getDssProject().getUserName(); String name = azkabanSchedulerProject.getName(); String storePath = AzkabanConf.DEFAULT_STORE_PATH.getValue() + File.separator + userName + File.separator + dataStr + File.separator +name; diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/LinkisAzkabanFlowTuning.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/LinkisAzkabanFlowTuning.java index 4cf5248e41d1288eabf21266d061b07fc6a38aaa..42c9962f136bce09f615fc0decfb85126ea4efe3 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/LinkisAzkabanFlowTuning.java +++ 
b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/LinkisAzkabanFlowTuning.java @@ -9,17 +9,13 @@ import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerFlow; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerNode; import com.webank.wedatasphere.dss.appjoint.scheduler.tuning.AbstractFlowTuning; import com.webank.wedatasphere.dss.appjoint.scheduler.tuning.NodeTuning; -import com.webank.wedatasphere.dss.common.entity.node.DWSNodeDefault; -import org.apache.commons.lang.StringUtils; -import org.springframework.stereotype.Component; - +import com.webank.wedatasphere.dss.common.entity.node.DSSNodeDefault; import java.io.File; import java.util.ArrayList; import java.util.List; import java.util.Map; //DefaultFlowTuning修改为AzkabanFlowTuning -@Component public class LinkisAzkabanFlowTuning extends AbstractFlowTuning { @@ -62,7 +58,7 @@ public class LinkisAzkabanFlowTuning extends AbstractFlowTuning { } private SchedulerFlow addEndNodeForFlowName(SchedulerFlow flow) { - DWSNodeDefault endNode = new DWSNodeDefault(); + DSSNodeDefault endNode = new DSSNodeDefault(); List endNodeList = getFlowEndJobList(flow); endNode.setId(flow.getName() + "_"); endNode.setName(flow.getName() + "_"); @@ -74,7 +70,7 @@ public class LinkisAzkabanFlowTuning extends AbstractFlowTuning { endNodeList.forEach(tmpNode -> endNode.addDependency(tmpNode.getName())); } LinkisAzkabanSchedulerNode azkabanSchedulerNode = new LinkisAzkabanSchedulerNode(); - azkabanSchedulerNode.setDWSNode(endNode); + azkabanSchedulerNode.setDssNode(endNode); flow.getSchedulerNodes().add((azkabanSchedulerNode)); return flow; } @@ -84,7 +80,7 @@ public class LinkisAzkabanFlowTuning extends AbstractFlowTuning { for (SchedulerNode job : flow.getSchedulerNodes()) { int flag = 0; for (SchedulerEdge link : flow.getSchedulerEdges()) { - if (job.getId().equals(link.getDWSEdge().getSource())) { + if 
(job.getId().equals(link.getDssEdge().getSource())) { flag = 1; } } diff --git a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/LinkisShareNodeFlowTuning.java b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/LinkisShareNodeFlowTuning.java index 8d84bb1e1450ecc129d6d3650bedca0448b93b3b..2e01c25821b81bc86d10d030572b89aff4dcf96c 100644 --- a/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/LinkisShareNodeFlowTuning.java +++ b/dss-azkaban-scheduler-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/azkaban/tuning/LinkisShareNodeFlowTuning.java @@ -7,9 +7,9 @@ import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerFlow; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.ShareNode; import com.webank.wedatasphere.dss.appjoint.scheduler.tuning.AbstractShareNodeFlowTuning; import com.webank.wedatasphere.dss.appjoint.scheduler.tuning.NodeTuning; -import org.springframework.stereotype.Component; -@Component + + public class LinkisShareNodeFlowTuning extends AbstractShareNodeFlowTuning { @Override diff --git a/dss-azkaban-scheduler-appjoint/src/main/resources/appjoint.properties b/dss-azkaban-scheduler-appjoint/src/main/resources/appjoint.properties new file mode 100644 index 0000000000000000000000000000000000000000..e5983a4595fa0c0a79c6b20cd289c51ca8dd850e --- /dev/null +++ b/dss-azkaban-scheduler-appjoint/src/main/resources/appjoint.properties @@ -0,0 +1,21 @@ +# +# Copyright 2019 WeBank +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# + + + + + diff --git a/dss-azkaban-scheduler-appjoint/src/main/resources/log4j.properties b/dss-azkaban-scheduler-appjoint/src/main/resources/log4j.properties new file mode 100644 index 0000000000000000000000000000000000000000..0807e6087704a1a31f2c6d41042fec441d301a85 --- /dev/null +++ b/dss-azkaban-scheduler-appjoint/src/main/resources/log4j.properties @@ -0,0 +1,37 @@ +# +# Copyright 2019 WeBank +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# +# + +### set log levels ### + +log4j.rootCategory=INFO,console + +log4j.appender.console=org.apache.log4j.ConsoleAppender +log4j.appender.console.Threshold=INFO +log4j.appender.console.layout=org.apache.log4j.PatternLayout +#log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n +log4j.appender.console.layout.ConversionPattern= %d{ISO8601} %-5p (%t) %p %c{1} - %m%n + + +log4j.appender.com.webank.bdp.ide.core=org.apache.log4j.DailyRollingFileAppender +log4j.appender.com.webank.bdp.ide.core.Threshold=INFO +log4j.additivity.com.webank.bdp.ide.core=false +log4j.appender.com.webank.bdp.ide.core.layout=org.apache.log4j.PatternLayout +log4j.appender.com.webank.bdp.ide.core.Append=true +log4j.appender.com.webank.bdp.ide.core.File=logs/linkis.log +log4j.appender.com.webank.bdp.ide.core.layout.ConversionPattern= %d{ISO8601} %-5p (%t) [%F:%M(%L)] - %m%n + +log4j.logger.org.springframework=INFO diff --git a/dss-azkaban-scheduler-appjoint/src/main/resources/log4j2.xml b/dss-azkaban-scheduler-appjoint/src/main/resources/log4j2.xml new file mode 100644 index 0000000000000000000000000000000000000000..3923cd9f39ff28b9b7c08f01e783fb271d36ee8f --- /dev/null +++ b/dss-azkaban-scheduler-appjoint/src/main/resources/log4j2.xml @@ -0,0 +1,39 @@ + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dss-common/pom.xml b/dss-common/pom.xml index 1502a7674af513ab49df1369c8971f5f7bb66f0a..4fe637f11ba1a9eac9618a105cba58eb941a5856 100644 --- a/dss-common/pom.xml +++ b/dss-common/pom.xml @@ -23,7 +23,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 dss-common diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSFlow.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSFlow.java similarity index 86% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSFlow.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSFlow.java 
index 15ca42b8ce63371bae9736d0a3c1773f96606762..ff768f791e25f38fef1ccbeb01f12bbb7ebbe8d1 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSFlow.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSFlow.java @@ -24,7 +24,7 @@ import java.util.stream.Collectors; /** * Created by enjoyyin on 2019/5/14. */ -public class DWSFlow implements Flow { +public class DSSFlow implements Flow { private Long id; private String name; private Boolean state; //0,1 indicate published and not published @@ -38,11 +38,11 @@ public class DWSFlow implements Flow { private Boolean hasSaved;//0 disable, 1 enable; 0 means the flow has never been saved and is ignored when publishing private String uses; - private List<DWSFlowVersion> versions; //keep the variable name versions instead of flowVersions so the front end needs no changes - private List<DWSFlow> children; + private List<DSSFlowVersion> versions; //keep the variable name versions instead of flowVersions so the front end needs no changes + private List<DSSFlow> children; private String flowType; - private DWSFlowVersion latestVersion; + private DSSFlowVersion latestVersion; public Integer getRank() { @@ -86,27 +86,27 @@ public class DWSFlow implements Flow { @Override public void addFlowVersion(FlowVersion flowVersion) { - this.versions.add((DWSFlowVersion) flowVersion); + this.versions.add((DSSFlowVersion) flowVersion); } @Override - public List<DWSFlow> getChildren() { + public List<DSSFlow> getChildren() { return children; } @Override public void setChildren(List children) { - this.children = children.stream().map(f ->(DWSFlow)f).collect(Collectors.toList()); + this.children = children.stream().map(f ->(DSSFlow)f).collect(Collectors.toList()); } @Override - public List<DWSFlowVersion> getFlowVersions() { + public List<DSSFlowVersion> getFlowVersions() { return this.versions; } @Override public void setFlowVersions(List flowVersions) { - this.versions = flowVersions.stream().map(f ->(DWSFlowVersion)f).collect(Collectors.toList()); + this.versions = flowVersions.stream().map(f ->(DSSFlowVersion)f).collect(Collectors.toList()); } @Override @@ -176,11 +176,11 @@ public class DWSFlow implements Flow { this.hasSaved = hasSaved; } -
public DWSFlowVersion getLatestVersion() { + public DSSFlowVersion getLatestVersion() { return latestVersion; } - public void setLatestVersion(DWSFlowVersion latestVersion) { + public void setLatestVersion(DSSFlowVersion latestVersion) { this.latestVersion = latestVersion; } diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSFlowPublishHistory.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSFlowPublishHistory.java similarity index 94% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSFlowPublishHistory.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSFlowPublishHistory.java index e9c4537b6caed05e080b771c48579f8e6c3536da..7fe3e9ad427ceca5219a66fd0b35c7e727191789 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSFlowPublishHistory.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSFlowPublishHistory.java @@ -20,5 +20,5 @@ package com.webank.wedatasphere.dss.common.entity.flow; /** * Created by enjoyyin on 2019/9/19. 
*/ -public class DWSFlowPublishHistory { +public class DSSFlowPublishHistory { } diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSFlowVersion.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSFlowVersion.java similarity index 92% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSFlowVersion.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSFlowVersion.java index 43c319541e8e683660ebc6cf1eb3fc8fe9ecc4c9..ce8559097a15f4f1304a9d1dbe5c1892db5925ce 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSFlowVersion.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSFlowVersion.java @@ -22,7 +22,7 @@ import java.util.Date; /** * Created by enjoyyin on 2019/9/19. */ -public class DWSFlowVersion implements FlowVersion, Comparable<DWSFlowVersion> { +public class DSSFlowVersion implements FlowVersion, Comparable<DSSFlowVersion> { private Long id; private Long flowID; private String source; @@ -31,7 +31,7 @@ public class DWSFlowVersion implements FlowVersion, Comparable<DWSFlowVersion> { private Date updateTime; private Long updatorID; private String version; - private DWSFlowPublishHistory publishHistory; + private DSSFlowPublishHistory publishHistory; private String json; private String updator; private Boolean isNotPublished; //true: never published, false: already published @@ -146,16 +146,16 @@ public class DWSFlowVersion implements FlowVersion, Comparable<DWSFlowVersion> { this.updatorID = updatorID; } - public DWSFlowPublishHistory getPublishHistory() { + public DSSFlowPublishHistory getPublishHistory() { return publishHistory; } - public void setPublishHistory(DWSFlowPublishHistory publishHistory) { + public void setPublishHistory(DSSFlowPublishHistory publishHistory) { this.publishHistory = publishHistory; } @Override - public int compareTo(DWSFlowVersion o) { + public int compareTo(DSSFlowVersion o) { Integer v1 =
Integer.valueOf(this.version.substring(1, version.length())); Integer v2 = Integer.valueOf(o.version.substring(1,o.version.length())); return v2 - v1; diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSJSONFlow.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSJSONFlow.java similarity index 84% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSJSONFlow.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSJSONFlow.java index 1c6e42cc67000b5d799104dd92f4afbdacf0460e..ae9b986318fadfd72aa0dbc2d471fe75a822a4ab 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DWSJSONFlow.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/flow/DSSJSONFlow.java @@ -23,10 +23,10 @@ import java.util.stream.Collectors; /** * Created by enjoyyin on 2019/5/14. */ -public class DWSJSONFlow extends DWSFlow { +public class DSSJSONFlow extends DSSFlow { private String json; - private List<DWSJSONFlow> children; + private List<DSSJSONFlow> children; public String getJson() { return json; @@ -38,11 +38,11 @@ public class DWSJSONFlow extends DWSFlow { @Override public void setChildren(List children) { - this.children = children.stream().map(f ->(DWSJSONFlow)f).collect(Collectors.toList()); + this.children = children.stream().map(f ->(DSSJSONFlow)f).collect(Collectors.toList()); } @Override - public List<DWSJSONFlow> getChildren() { + public List<DSSJSONFlow> getChildren() { return children; } } diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSEdge.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSEdge.java similarity index 97% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSEdge.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSEdge.java index
e255d2a917587d9f810ffaa6005074b2ec746181..c4b30593af81d529e525189320431b1c233a93bb 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSEdge.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSEdge.java @@ -20,7 +20,7 @@ package com.webank.wedatasphere.dss.common.entity.node; /** * Created by enjoyyin on 2019/5/14. */ -public interface DWSEdge { +public interface DSSEdge { String getSource(); diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSEdgeDefault.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSEdgeDefault.java similarity index 96% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSEdgeDefault.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSEdgeDefault.java index 4cf094bb4299293d7e6b0e3f1df3bbc54411d7b9..3e6bc8b1397517922bc7b6529096dd56756c1aba 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSEdgeDefault.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSEdgeDefault.java @@ -20,7 +20,7 @@ package com.webank.wedatasphere.dss.common.entity.node; /** * Created by enjoyyin on 2019/5/14. 
*/ -public class DWSEdgeDefault implements DWSEdge { +public class DSSEdgeDefault implements DSSEdge { private String source; private String target; private String sourceLocation; @@ -68,7 +68,7 @@ public class DWSEdgeDefault implements DWSEdge { @Override public String toString() { - return "DWSEdge{" + + return "DSSEdge{" + "source='" + source + '\'' + ", target='" + target + '\'' + ", sourceLocation='" + sourceLocation + '\'' + diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSNode.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSNode.java similarity index 96% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSNode.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSNode.java index d7fa7e870e369dd3fe5b50e0aa56ee97ffef1f73..8f6575c83626867b3777f6a311aacf87995f7eb7 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSNode.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSNode.java @@ -25,7 +25,7 @@ import java.util.Map; /** * Created by enjoyyin on 2019/5/14. 
*/ -public interface DWSNode extends Node { +public interface DSSNode extends Node { Layout getLayout(); diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSNodeDefault.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSNodeDefault.java similarity index 98% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSNodeDefault.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSNodeDefault.java index b7748016f587355c5fae2cf1fa09895ca6a79f6c..2d416dfb0231f3287acbbf8077bc1ab117e9325f 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DWSNodeDefault.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/node/DSSNodeDefault.java @@ -26,7 +26,7 @@ import java.util.Map; /** * Created by enjoyyin on 2019/5/14. */ -public class DWSNodeDefault implements DWSNode { +public class DSSNodeDefault implements DSSNode { private Layout layout; private String id; private String jobType; diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSJSONProject.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSJSONProject.java similarity index 69% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSJSONProject.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSJSONProject.java index 05202dac488bd5b6246a1b7d52ca9d4cfcc85a32..ffd9026fc336e64ccffe92d9496fefb9502e7609 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSJSONProject.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSJSONProject.java @@ -17,8 +17,8 @@ package com.webank.wedatasphere.dss.common.entity.project; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; -import 
com.webank.wedatasphere.dss.common.entity.flow.DWSJSONFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSJSONFlow; import java.util.List; import java.util.stream.Collectors; @@ -26,16 +26,16 @@ import java.util.stream.Collectors; /** * Created by allenlliu on 2019/9/16. */ -public class DWSJSONProject extends DWSProject { - private List<DWSJSONFlow> flows; +public class DSSJSONProject extends DSSProject { + private List<DSSJSONFlow> flows; @Override - public List<DWSJSONFlow> getFlows() { + public List<DSSJSONFlow> getFlows() { return this.flows; } @Override - public void setFlows(List flows) { - this.flows = flows.stream().map(f ->(DWSJSONFlow)f).collect(Collectors.toList()); + public void setFlows(List flows) { + this.flows = flows.stream().map(f ->(DSSJSONFlow)f).collect(Collectors.toList()); } } diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSProject.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSProject.java similarity index 88% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSProject.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSProject.java index 3f7ad9c2ebf03611a2794a7ea00b7bf0945afadd..4454362d8e4767869763b074de98339da1f77308 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSProject.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSProject.java @@ -18,7 +18,7 @@ package com.webank.wedatasphere.dss.common.entity.project; import com.webank.wedatasphere.dss.common.entity.Resource; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; import java.util.Date; import java.util.List; @@ -27,7 +27,7 @@ import java.util.stream.Collectors; /** * Created by enjoyyin on 2019/9/16.
*/ -public class DWSProject implements Project { +public class DSSProject implements Project { private Long id; private String name; @@ -48,14 +48,15 @@ public class DWSProject implements Project { private String product; private Integer applicationArea; private String business; + private Long workspaceId; - private DWSProjectVersion latestVersion; + private DSSProjectVersion latestVersion; private Boolean isNotPublish; private String userName; private String projectGroup; private List projectVersions; - private List<DWSFlow> flows; + private List<DSSFlow> flows; private List projectResources; public List getProjectResources() { @@ -66,12 +67,12 @@ public class DWSProject implements Project { this.projectResources = projectResources; } - public List<DWSFlow> getFlows() { + public List<DSSFlow> getFlows() { return flows; } - public void setFlows(List flows) { - this.flows = flows.stream().map(f -> (DWSFlow) f).collect(Collectors.toList()); + public void setFlows(List flows) { + this.flows = flows.stream().map(f -> (DSSFlow) f).collect(Collectors.toList()); } public String getUserName() { @@ -122,11 +123,11 @@ public class DWSProject implements Project { this.initialOrgID = initialOrgID; } - public DWSProjectVersion getLatestVersion() { + public DSSProjectVersion getLatestVersion() { return latestVersion; } - public void setLatestVersion(DWSProjectVersion latestVersion) { + public void setLatestVersion(DSSProjectVersion latestVersion) { this.latestVersion = latestVersion; } @@ -174,13 +175,13 @@ public class DWSProject implements Project { @Override public void setProjectVersions(List projectVersions) { - this.projectVersions = projectVersions.stream().map(f -> (DWSProjectVersion) f).collect(Collectors.toList()); + this.projectVersions = projectVersions.stream().map(f -> (DSSProjectVersion) f).collect(Collectors.toList()); } @Override public void addProjectVersion(ProjectVersion projectVersion) { - this.projectVersions.add((DWSProjectVersion) projectVersion); +
this.projectVersions.add((DSSProjectVersion) projectVersion); } @Override @@ -282,4 +283,12 @@ public class DWSProject implements Project { public void setBusiness(String business) { this.business = business; } + + public Long getWorkspaceId() { + return workspaceId; + } + + public void setWorkspaceId(Long workspaceId) { + this.workspaceId = workspaceId; + } } diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSProjectPublishHistory.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSProjectPublishHistory.java similarity index 98% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSProjectPublishHistory.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSProjectPublishHistory.java index 7ea9b1ee0e955cf4c94e3f465dcc5c77e37a2cda..dca93dd48757493cedc89decc36be6057686e14a 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSProjectPublishHistory.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSProjectPublishHistory.java @@ -22,7 +22,7 @@ import java.util.Date; /** * Created by enjoyyin on 2019/5/14. 
*/ -public class DWSProjectPublishHistory { +public class DSSProjectPublishHistory { private Long id; private Long projectVersionID; private Date createTime; diff --git a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSProjectVersion.java b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSProjectVersion.java similarity index 91% rename from dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSProjectVersion.java rename to dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSProjectVersion.java index 534a9631f27b8a9970f5277f2d903128cbe0f1a9..5731ecf8e32b5ea5b30b7ab0919ec159604d367f 100644 --- a/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DWSProjectVersion.java +++ b/dss-common/src/main/java/com/webank/wedatasphere/dss/common/entity/project/DSSProjectVersion.java @@ -22,7 +22,7 @@ import java.util.Date; /** * Created by enjoyyin on 2019/9/18. */ -public class DWSProjectVersion implements ProjectVersion { +public class DSSProjectVersion implements ProjectVersion { private Long id; private Long projectID; @@ -33,9 +33,7 @@ public class DWSProjectVersion implements ProjectVersion { private Integer lock; private String updator; private Boolean isNotPublish; - private DWSProjectPublishHistory publishHistory; - - + private DSSProjectPublishHistory publishHistory; @Override public String getVersion() { @@ -118,11 +116,11 @@ public class DWSProjectVersion implements ProjectVersion { isNotPublish = notPublish; } - public DWSProjectPublishHistory getPublishHistory() { + public DSSProjectPublishHistory getPublishHistory() { return publishHistory; } - public void setPublishHistory(DWSProjectPublishHistory publishHistory) { + public void setPublishHistory(DSSProjectPublishHistory publishHistory) { this.publishHistory = publishHistory; } } diff --git 
a/dss-common/src/main/scala/com/webank/wedatasphere/dss/common/protocol/RequestDWSProject.scala b/dss-common/src/main/scala/com/webank/wedatasphere/dss/common/protocol/RequestDSSProject.scala similarity index 90% rename from dss-common/src/main/scala/com/webank/wedatasphere/dss/common/protocol/RequestDWSProject.scala rename to dss-common/src/main/scala/com/webank/wedatasphere/dss/common/protocol/RequestDSSProject.scala index 92d01f765eda269f922fdd2fd744a646cd30796f..b9b3da46af09a07c489eadde358c91bae65499ce 100644 --- a/dss-common/src/main/scala/com/webank/wedatasphere/dss/common/protocol/RequestDWSProject.scala +++ b/dss-common/src/main/scala/com/webank/wedatasphere/dss/common/protocol/RequestDSSProject.scala @@ -20,6 +20,6 @@ package com.webank.wedatasphere.dss.common.protocol /** * Created by enjoyyin on 2019/11/8. */ -case class RequestDWSProject(flowId:Long,version:String,projectVersionId:Long) +case class RequestDSSProject(flowId:Long, version:String, projectVersionId:Long) case class RequestDSSApplication(name:String) \ No newline at end of file diff --git a/dss-flow-execution-entrance/bin/start-dss-flow-execution-entrance.sh b/dss-flow-execution-entrance/bin/start-dss-flow-execution-entrance.sh index 5ce15dc90fc2664f0d19bab5803bf9cc5cae5916..9bd4a006cb085748d11a68fa776a81b7cac1913c 100644 --- a/dss-flow-execution-entrance/bin/start-dss-flow-execution-entrance.sh +++ b/dss-flow-execution-entrance/bin/start-dss-flow-execution-entrance.sh @@ -1,33 +1,49 @@ #!/bin/bash - cd `dirname $0` cd .. HOME=`pwd` -export DWS_ENGINE_MANAGER_HOME=$HOME -export DWS_ENGINE_MANAGER_PID=$HOME/bin/linkis.pid +export SERVER_PID=$HOME/bin/linkis.pid +export SERVER_LOG_PATH=$HOME/logs +export SERVER_CLASS=com.webank.wedatasphere.linkis.DataWorkCloudApplication -if [[ -f "${DWS_ENGINE_MANAGER_PID}" ]]; then - pid=$(cat ${DWS_ENGINE_MANAGER_PID}) - if kill -0 ${pid} >/dev/null 2>&1; then - echo "FlowExecution Entrance is already running." 
- return 0; - fi +if test -z "$SERVER_HEAP_SIZE" +then + export SERVER_HEAP_SIZE="512M" fi -export DWS_ENGINE_MANAGER_LOG_PATH=$HOME/logs -export DWS_ENGINE_MANAGER_HEAP_SIZE="1G" -export DWS_ENGINE_MANAGER_JAVA_OPTS="-Xms$DWS_ENGINE_MANAGER_HEAP_SIZE -Xmx$DWS_ENGINE_MANAGER_HEAP_SIZE -XX:+UseG1GC -XX:MaxPermSize=500m -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=11730" +if test -z "$SERVER_JAVA_OPTS" +then + export SERVER_JAVA_OPTS=" -Xmx$SERVER_HEAP_SIZE -XX:+UseG1GC -Xloggc:$HOME/logs/linkis-gc.log" +fi -echo $HOME/lib/ +if [[ -f "${SERVER_PID}" ]]; then + pid=$(cat ${SERVER_PID}) + if kill -0 ${pid} >/dev/null 2>&1; then + echo "Server is already running." + exit 1 + fi +fi -nohup java $DWS_ENGINE_MANAGER_JAVA_OPTS -cp $HOME/conf:$HOME/lib/* com.webank.wedatasphere.linkis.DataWorkCloudApplication 2>&1 > $DWS_ENGINE_MANAGER_LOG_PATH/linkis.out & +nohup java $SERVER_JAVA_OPTS -cp $HOME/conf:$HOME/lib/* $SERVER_CLASS 2>&1 > $SERVER_LOG_PATH/linkis.out & pid=$! if [[ -z "${pid}" ]]; then - echo "FlowExecution Entrance start failed!" + echo "server $SERVER_NAME start failed!" exit 1 else - echo "FlowExecution Entrance start succeeded!" - echo $pid > $DWS_ENGINE_MANAGER_PID + echo "server $SERVER_NAME start succeeded!" 
+ echo $pid > $SERVER_PID sleep 1 fi + + + + + + + + + + + + diff --git a/dss-flow-execution-entrance/pom.xml b/dss-flow-execution-entrance/pom.xml index b50e61c3b30e76d54df389d87897daab3af2f059..669ca30cd5e8f3f6becfd34a27e2e09ac5c5631b 100644 --- a/dss-flow-execution-entrance/pom.xml +++ b/dss-flow-execution-entrance/pom.xml @@ -22,7 +22,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 @@ -33,12 +33,28 @@ com.webank.wedatasphere.linkis linkis-ujes-entrance ${linkis.version} + + + org.apache.poi + ooxml-schemas + + + + + com.webank.wedatasphere.linkis + linkis-cloudRPC + ${linkis.version} - com.webank.wedatasphere.dss dss-linkis-node-execution ${dss.version} + + + com.ibm.icu + icu4j + + diff --git a/dss-flow-execution-entrance/src/main/assembly/distribution.xml b/dss-flow-execution-entrance/src/main/assembly/distribution.xml index c080c0c09f3d072b737dc14a3f0e9aaa148057c8..bb09aad22b3893133d130e2c507835657cab13ae 100644 --- a/dss-flow-execution-entrance/src/main/assembly/distribution.xml +++ b/dss-flow-execution-entrance/src/main/assembly/distribution.xml @@ -59,16 +59,16 @@ aopalliance:aopalliance:jar asm:asm:jar cglib:cglib:jar - com.amazonaws:aws-java-sdk-autoscaling:jar - com.amazonaws:aws-java-sdk-core:jar - com.amazonaws:aws-java-sdk-ec2:jar - com.amazonaws:aws-java-sdk-route53:jar - com.amazonaws:aws-java-sdk-sts:jar - com.amazonaws:jmespath-java:jar + + + + + + com.fasterxml.jackson.core:jackson-annotations:jar com.fasterxml.jackson.core:jackson-core:jar com.fasterxml.jackson.core:jackson-databind:jar - com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:jar + com.fasterxml.jackson.datatype:jackson-datatype-jdk8:jar com.fasterxml.jackson.datatype:jackson-datatype-jsr310:jar com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:jar @@ -84,7 +84,6 @@ com.google.code.gson:gson:jar com.google.guava:guava:jar com.google.inject:guice:jar - com.google.protobuf:protobuf-java:jar com.netflix.archaius:archaius-core:jar com.netflix.eureka:eureka-client:jar 
com.netflix.eureka:eureka-core:jar @@ -100,7 +99,6 @@ com.netflix.ribbon:ribbon-loadbalancer:jar com.netflix.ribbon:ribbon-transport:jar com.netflix.servo:servo-core:jar - com.ning:async-http-client:jar com.sun.jersey.contribs:jersey-apache-client4:jar com.sun.jersey:jersey-client:jar com.sun.jersey:jersey-core:jar @@ -113,15 +111,10 @@ com.webank.wedatasphere.linkis:linkis-common:jar com.webank.wedatasphere.linkis:linkis-module:jar commons-beanutils:commons-beanutils:jar - commons-beanutils:commons-beanutils-core:jar - commons-cli:commons-cli:jar commons-codec:commons-codec:jar commons-collections:commons-collections:jar commons-configuration:commons-configuration:jar - commons-daemon:commons-daemon:jar commons-dbcp:commons-dbcp:jar - commons-digester:commons-digester:jar - commons-httpclient:commons-httpclient:jar commons-io:commons-io:jar commons-jxpath:commons-jxpath:jar commons-lang:commons-lang:jar @@ -129,7 +122,6 @@ commons-net:commons-net:jar commons-pool:commons-pool:jar io.micrometer:micrometer-core:jar - io.netty:netty:jar io.netty:netty-all:jar io.netty:netty-buffer:jar io.netty:netty-codec:jar @@ -146,41 +138,21 @@ javax.annotation:javax.annotation-api:jar javax.inject:javax.inject:jar javax.servlet:javax.servlet-api:jar - javax.servlet.jsp:jsp-api:jar javax.validation:validation-api:jar javax.websocket:javax.websocket-api:jar javax.ws.rs:javax.ws.rs-api:jar javax.xml.bind:jaxb-api:jar javax.xml.stream:stax-api:jar joda-time:joda-time:jar - log4j:log4j:jar mysql:mysql-connector-java:jar - net.databinder.dispatch:dispatch-core_2.11:jar - net.databinder.dispatch:dispatch-json4s-jackson_2.11:jar org.antlr:antlr-runtime:jar org.antlr:stringtemplate:jar - org.apache.commons:commons-compress:jar org.apache.commons:commons-math:jar - org.apache.commons:commons-math3:jar - org.apache.curator:curator-client:jar - org.apache.curator:curator-framework:jar - org.apache.curator:curator-recipes:jar - org.apache.directory.api:api-asn1-api:jar - 
org.apache.directory.api:api-util:jar - org.apache.directory.server:apacheds-i18n:jar - org.apache.directory.server:apacheds-kerberos-codec:jar - org.apache.hadoop:hadoop-annotations:jar - org.apache.hadoop:hadoop-auth:jar - org.apache.hadoop:hadoop-common:jar - org.apache.hadoop:hadoop-hdfs:jar - org.apache.htrace:htrace-core:jar org.apache.httpcomponents:httpclient:jar - org.apache.httpcomponents:httpcore:jar org.apache.logging.log4j:log4j-api:jar org.apache.logging.log4j:log4j-core:jar org.apache.logging.log4j:log4j-jul:jar org.apache.logging.log4j:log4j-slf4j-impl:jar - org.apache.zookeeper:zookeeper:jar org.aspectj:aspectjweaver:jar org.bouncycastle:bcpkix-jdk15on:jar org.bouncycastle:bcprov-jdk15on:jar @@ -194,7 +166,6 @@ org.eclipse.jetty:jetty-continuation:jar org.eclipse.jetty:jetty-http:jar org.eclipse.jetty:jetty-io:jar - org.eclipse.jetty:jetty-jndi:jar org.eclipse.jetty:jetty-plus:jar org.eclipse.jetty:jetty-security:jar org.eclipse.jetty:jetty-server:jar @@ -210,7 +181,6 @@ org.eclipse.jetty.websocket:websocket-common:jar org.eclipse.jetty.websocket:websocket-server:jar org.eclipse.jetty.websocket:websocket-servlet:jar - org.fusesource.leveldbjni:leveldbjni-all:jar org.glassfish.hk2:class-model:jar org.glassfish.hk2:config-types:jar org.glassfish.hk2.external:aopalliance-repackaged:jar @@ -243,13 +213,10 @@ org.json4s:json4s-ast_2.11:jar org.json4s:json4s-core_2.11:jar org.json4s:json4s-jackson_2.11:jar - org.jsoup:jsoup:jar org.jvnet.mimepull:mimepull:jar org.jvnet:tiger-types:jar org.latencyutils:LatencyUtils:jar org.mortbay.jasper:apache-el:jar - org.mortbay.jetty:jetty:jar - org.mortbay.jetty:jetty-util:jar org.ow2.asm:asm-analysis:jar org.ow2.asm:asm-commons:jar org.ow2.asm:asm-tree:jar @@ -296,11 +263,8 @@ org.springframework:spring-jcl:jar org.springframework:spring-web:jar org.springframework:spring-webmvc:jar - org.tukaani:xz:jar org.yaml:snakeyaml:jar - software.amazon.ion:ion-java:jar - xerces:xercesImpl:jar - xmlenc:xmlenc:jar + 
xmlpull:xmlpull:jar xpp3:xpp3_min:jar diff --git a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowExecutionAppJointSignalSharedJob.java b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowExecutionAppJointSignalSharedJob.java index 8d0735e5811fd6cccf8f542c413ffefa0478b659..d0d556aadfa8db32afa4bde3bb2e90a73140659f 100644 --- a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowExecutionAppJointSignalSharedJob.java +++ b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowExecutionAppJointSignalSharedJob.java @@ -17,17 +17,28 @@ package com.webank.wedatasphere.dss.flow.execution.entrance.job; -import com.webank.wedatasphere.dss.common.entity.node.DWSNode; import com.webank.wedatasphere.dss.flow.execution.entrance.conf.FlowExecutionEntranceConfiguration; +import com.webank.wedatasphere.dss.linkis.node.execution.job.JobSignalKeyCreator; import com.webank.wedatasphere.dss.linkis.node.execution.job.SignalSharedJob; import java.util.Map; /** - * Created by peacewong on 2019/11/14. + * Created by johnnwang on 2019/11/14. 
*/ -public class FlowExecutionAppJointSignalSharedJob extends FlowExecutionAppJointLinkisSharedJob implements SignalSharedJob { +public class FlowExecutionAppJointSignalSharedJob extends FlowExecutionAppJointLinkisJob implements SignalSharedJob { + private JobSignalKeyCreator signalKeyCreator; + + @Override + public JobSignalKeyCreator getSignalKeyCreator() { + return this.signalKeyCreator; + } + + @Override + public void setSignalKeyCreator(JobSignalKeyCreator signalKeyCreator) { + this.signalKeyCreator = signalKeyCreator; + } @Override public String getMsgSaveKey() { diff --git a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowExecutionJobSignalKeyCreator.java b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowExecutionJobSignalKeyCreator.java new file mode 100644 index 0000000000000000000000000000000000000000..e284b44d8e0fbcb4332e6b75863d934a835f3702 --- /dev/null +++ b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowExecutionJobSignalKeyCreator.java @@ -0,0 +1,39 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.flow.execution.entrance.job; + +import com.webank.wedatasphere.dss.flow.execution.entrance.conf.FlowExecutionEntranceConfiguration; +import com.webank.wedatasphere.dss.linkis.node.execution.job.Job; +import com.webank.wedatasphere.dss.linkis.node.execution.job.JobSignalKeyCreator; +import com.webank.wedatasphere.dss.linkis.node.execution.job.SignalSharedJob; + +public class FlowExecutionJobSignalKeyCreator implements JobSignalKeyCreator { + + @Override + public String getSignalKeyByJob(Job job) { + String projectId = job.getJobProps().get(FlowExecutionEntranceConfiguration.PROJECT_NAME()); + String flowId = job.getJobProps().get(FlowExecutionEntranceConfiguration.FLOW_NAME()); + String flowExecId = job.getJobProps().get(FlowExecutionEntranceConfiguration.FLOW_EXEC_ID()); + return projectId + "." + flowId + "." + flowExecId; + } + + @Override + public String getSignalKeyBySignalSharedJob(SignalSharedJob job) { + return getSignalKeyByJob((Job)job); + } +} diff --git a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionFlowParser.java b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionFlowParser.java index 66388106e1db6ae2d842e7133bf4bd51fe4dfe73..fbd8bdd7624310ae2e9eff06e5a04252461ff434 100644 --- a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionFlowParser.java +++ b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionFlowParser.java @@ -19,7 +19,7 @@ package com.webank.wedatasphere.dss.flow.execution.entrance.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.parser.AbstractFlowParser; import com.webank.wedatasphere.dss.appjoint.scheduler.parser.NodeParser; -import com.webank.wedatasphere.dss.common.entity.flow.DWSJSONFlow; +import 
com.webank.wedatasphere.dss.common.entity.flow.DSSJSONFlow; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Component; @@ -36,7 +36,7 @@ public class FlowExecutionFlowParser extends AbstractFlowParser { } @Override - public Boolean ifFlowCanParse(DWSJSONFlow flow) { + public Boolean ifFlowCanParse(DSSJSONFlow flow) { return true; } diff --git a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionNodeParser.java b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionNodeParser.java index 379aa14e5a2908f75dc9b7b0647aaba03a5d66ab..97c5824b98d1b5e7c7ff8046f3ffe7bc40dfcec2 100644 --- a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionNodeParser.java +++ b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionNodeParser.java @@ -19,7 +19,7 @@ package com.webank.wedatasphere.dss.flow.execution.entrance.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerNode; import com.webank.wedatasphere.dss.appjoint.scheduler.parser.AbstractNodeParser; -import com.webank.wedatasphere.dss.common.entity.node.DWSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; import com.webank.wedatasphere.dss.flow.execution.entrance.entity.FlowExecutionNode; import org.springframework.stereotype.Component; @@ -31,14 +31,14 @@ import org.springframework.stereotype.Component; public class FlowExecutionNodeParser extends AbstractNodeParser { @Override - public SchedulerNode parseNode(DWSNode dwsNode) { + public SchedulerNode parseNode(DSSNode dssNode) { FlowExecutionNode node = new FlowExecutionNode(); - node.setDWSNode(dwsNode); + node.setDssNode(dssNode); return node; } @Override - public Boolean ifNodeCanParse(DWSNode dwsNode) { + public Boolean 
ifNodeCanParse(DSSNode dssNode) { return true; } diff --git a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionReadNodeParser.java b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionReadNodeParser.java index 29f964c5d7c2d2dde919fe8db9ee999c2abe183f..1c04fdfd2a5efaabd2555959bdd4277c8ed16606 100644 --- a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionReadNodeParser.java +++ b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/parser/FlowExecutionReadNodeParser.java @@ -20,7 +20,7 @@ package com.webank.wedatasphere.dss.flow.execution.entrance.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.ReadNode; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerNode; import com.webank.wedatasphere.dss.appjoint.scheduler.parser.AbstractReadNodeParser; -import com.webank.wedatasphere.dss.common.entity.node.DWSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; import com.webank.wedatasphere.dss.flow.execution.entrance.entity.FlowExecutionNode; import com.webank.wedatasphere.dss.flow.execution.entrance.entity.FlowExecutonReadNode; import com.webank.wedatasphere.dss.flow.execution.entrance.utils.FlowExecutionUtils; @@ -39,8 +39,8 @@ public class FlowExecutionReadNodeParser extends AbstractReadNodeParser { } @Override - public Boolean ifNodeCanParse(DWSNode dwsNode) { - return FlowExecutionUtils.isReadNode(dwsNode.getNodeType()); + public Boolean ifNodeCanParse(DSSNode dssNode) { + return FlowExecutionUtils.isReadNode(dssNode.getNodeType()); } @Override diff --git a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/restful/FlowExecutionRestfulApi.java 
b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/restful/FlowExecutionRestfulApi.java index 807de6029932a06ac529b6976eb5a1d5cbeeb362..8ad6dcb51f55643db660841314e55a67deb177af 100644 --- a/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/restful/FlowExecutionRestfulApi.java +++ b/dss-flow-execution-entrance/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/restful/FlowExecutionRestfulApi.java @@ -70,6 +70,12 @@ public class FlowExecutionRestfulApi { message = Message.ok("Successfully get job execution info"); message.setMethod("/api/entrance/" + id + "/execution"); message.setStatus(0); + long nowTime = System.currentTimeMillis(); + flowEntranceJob.getFlowContext().getRunningNodes().forEach((k, v) -> { + if (v != null) { + v.setNowTime(nowTime); + } + }); message.data("runningJobs", FlowContext$.MODULE$.convertView(flowEntranceJob.getFlowContext().getRunningNodes())); List> pendingList = FlowContext$.MODULE$.convertView(flowEntranceJob.getFlowContext().getPendingNodes()); pendingList.addAll(FlowContext$.MODULE$.convertView(flowEntranceJob.getFlowContext().getSkippedNodes())); diff --git a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/FlowContext.scala b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/FlowContext.scala index b0192fb5447636084c9d820e3c3b644c4b963210..f03081a5e141346231754828d5edc2cbac538d2e 100644 --- a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/FlowContext.scala +++ b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/FlowContext.scala @@ -53,7 +53,7 @@ object FlowContext { def changedNodeState(fromMap: util.Map[String, NodeRunner], toMap: util.Map[String, NodeRunner], node: SchedulerNode,info:String): Unit = { - val nodeName = node.getDWSNode.getName + 
val nodeName = node.getDssNode.getName if (fromMap.containsKey(nodeName)) { val runner = fromMap.get(nodeName) runner.setNodeExecutedInfo(info) diff --git a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/conf/FlowExecutionEntranceConfiguration.scala b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/conf/FlowExecutionEntranceConfiguration.scala index 91cdeedd6d9b9ed327faebf859a0438de6eba1ba..245a23687fc093cbc479018f32ede0216ca4aae8 100644 --- a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/conf/FlowExecutionEntranceConfiguration.scala +++ b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/conf/FlowExecutionEntranceConfiguration.scala @@ -44,9 +44,10 @@ object FlowExecutionEntranceConfiguration { val NODE_STATUS_POLLER_THREAD_SIZE = CommonVars("wds.dds.flow.node.status.poller.thread.size", 20) - val NODE_STATUS_POLLER_SCHEDULER_TIME = CommonVars("wds.dds.flow.node.status.poller.scheduler.time", 2) + val NODE_STATUS_POLLER_SCHEDULER_TIME = CommonVars("wds.dds.flow.node.status.poller.scheduler.time", 5) val FLOW_EXECUTION_SCHEDULER_POOL_SIZE = CommonVars("wds.linkis.flow.execution.pool.size", 30) + val NODE_STATUS_INTERVAL = CommonVars("wds.dds.flow.node.status.poller.interval.time", 3000) val COMMAND = "command" diff --git a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/execution/DefaultFlowExecution.scala b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/execution/DefaultFlowExecution.scala index 04b56936e86d4aa19658a0ea961ff04785b705ab..c1164c9d3d6f12db510cd936bc0db62ff34da38e 100644 --- a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/execution/DefaultFlowExecution.scala +++ 
b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/execution/DefaultFlowExecution.scala @@ -16,7 +16,7 @@ */ package com.webank.wedatasphere.dss.flow.execution.entrance.execution -import java.util + import java.util.concurrent.{Executors, LinkedBlockingQueue, TimeUnit} import com.webank.wedatasphere.dss.flow.execution.entrance.conf.FlowExecutionEntranceConfiguration @@ -31,13 +31,11 @@ import scala.collection.mutable.ArrayBuffer /** - * Created by peacewong on 2019/11/5. - */ + * Created by johnnwang on 2019/11/5. + */ @Service class DefaultFlowExecution extends FlowExecution with Logging { - private val executeService = Utils.newCachedThreadPool(FlowExecutionEntranceConfiguration.FLOW_EXECUTION_POOL_SIZE.getValue, - "DefaultFlowExecution",true) private val nodeRunnerQueue: LinkedBlockingQueue[NodeRunner] = new LinkedBlockingQueue[NodeRunner]() @@ -63,7 +61,7 @@ class DefaultFlowExecution extends FlowExecution with Logging { // submit node runner runningNodes.add(runner) } else { - info(s"This node ${runner.getNode.getDWSNode.getName} Skipped in execution") + info(s"This node ${runner.getNode.getDssNode.getName} Skipped in execution") runner.fromScheduledTunToState(NodeExecutionState.Skipped) } } @@ -74,6 +72,7 @@ class DefaultFlowExecution extends FlowExecution with Logging { if (pollerCount < FlowExecutionEntranceConfiguration.NODE_STATUS_POLLER_THREAD_SIZE.getValue){ scheduledThreadPool.scheduleAtFixedRate(new NodeExecutionStatusPoller(nodeRunnerQueue), 1, FlowExecutionEntranceConfiguration.NODE_STATUS_POLLER_SCHEDULER_TIME.getValue ,TimeUnit.SECONDS) + pollerCount = pollerCount + 1 } } } diff --git a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowEntranceJob.scala b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowEntranceJob.scala index 
225492781c67553ec131f96138190d3abc614b2c..42bf395d4670e989416f0d2598e5c5c4854cae56 100644 --- a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowEntranceJob.scala +++ b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/FlowEntranceJob.scala @@ -18,7 +18,7 @@ package com.webank.wedatasphere.dss.flow.execution.entrance.job import com.webank.wedatasphere.dss.appjoint.scheduler.entity.{SchedulerFlow, SchedulerNode} -import com.webank.wedatasphere.dss.common.entity.project.DWSProject +import com.webank.wedatasphere.dss.common.entity.project.DSSProject import com.webank.wedatasphere.dss.flow.execution.entrance.exception.FlowExecutionErrorException import com.webank.wedatasphere.dss.flow.execution.entrance.{FlowContext, FlowContextImpl} import com.webank.wedatasphere.dss.flow.execution.entrance.listener.NodeRunnerListener @@ -44,7 +44,7 @@ class FlowEntranceJob extends EntranceExecutionJob with NodeRunnerListener { private val flowContext: FlowContext = new FlowContextImpl - @BeanProperty var dwsProject: DWSProject = _ + @BeanProperty var dwsProject: DSSProject = _ def setFlow(flow: SchedulerFlow): Unit = this.flow = flow @@ -80,7 +80,7 @@ class FlowEntranceJob extends EntranceExecutionJob with NodeRunnerListener { override def onStatusChanged(fromState: NodeExecutionState, toState: NodeExecutionState, node: SchedulerNode): Unit = { - val nodeName = node.getDWSNode.getName + val nodeName = node.getDssNode.getName toState match { case NodeExecutionState.Failed => printLog(s"Failed to execute node($nodeName),prepare to kill flow job", "ERROR") diff --git a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/parser/FlowJobFlowParser.scala b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/parser/FlowJobFlowParser.scala index 
5c5bf8ede596108414c0b656ab12f2f881b66afa..a088e146f7fc3f49a68148bb41d4c8cd14522e62 100644 --- a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/parser/FlowJobFlowParser.scala +++ b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/parser/FlowJobFlowParser.scala @@ -18,8 +18,8 @@ package com.webank.wedatasphere.dss.flow.execution.entrance.job.parser import com.webank.wedatasphere.dss.appjoint.scheduler.entity.{AbstractSchedulerProject, SchedulerFlow} -import com.webank.wedatasphere.dss.common.entity.project.DWSProject -import com.webank.wedatasphere.dss.common.protocol.RequestDWSProject +import com.webank.wedatasphere.dss.common.entity.project.DSSProject +import com.webank.wedatasphere.dss.common.protocol.RequestDSSProject import com.webank.wedatasphere.dss.flow.execution.entrance.conf.FlowExecutionEntranceConfiguration import com.webank.wedatasphere.dss.flow.execution.entrance.entity.FlowExecutionCode import com.webank.wedatasphere.dss.flow.execution.entrance.job.FlowEntranceJob @@ -52,8 +52,8 @@ class FlowJobFlowParser extends FlowEntranceJobParser with Logging { val code = flowEntranceJob.jobToExecuteRequest().code val flowExecutionCode = LinkisJobExecutionUtils.gson.fromJson(code, classOf[FlowExecutionCode]) - getDWSProjectByCode(flowExecutionCode) match { - case dwsProject: DWSProject => + getDSSProjectByCode(flowExecutionCode) match { + case dwsProject: DSSProject => val project = this.flowExecutionProjectParser.parseProject(dwsProject) @@ -76,8 +76,8 @@ class FlowJobFlowParser extends FlowEntranceJobParser with Logging { info(s"${flowEntranceJob.getId} finished to parse flow") } - private def getDWSProjectByCode(flowExecutionCode: FlowExecutionCode) = { - val req = new RequestDWSProject(flowExecutionCode.getFlowId, flowExecutionCode.getVersion, flowExecutionCode.getProjectVersionId) + private def getDSSProjectByCode(flowExecutionCode: FlowExecutionCode) = { + 
val req = new RequestDSSProject(flowExecutionCode.getFlowId, flowExecutionCode.getVersion, flowExecutionCode.getProjectVersionId) Sender.getSender(FlowExecutionEntranceConfiguration.SCHEDULER_APPLICATION.getValue).ask(req) } diff --git a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/parser/FlowJobNodeParser.scala b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/parser/FlowJobNodeParser.scala index 158a8aebde57f51830b6bb78074544744276dc41..2f18486fa34b1ed9af82865c9d8b23aac337c562 100644 --- a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/parser/FlowJobNodeParser.scala +++ b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/job/parser/FlowJobNodeParser.scala @@ -19,6 +19,7 @@ package com.webank.wedatasphere.dss.flow.execution.entrance.job.parser import java.util +import com.webank.wedatasphere.dss.flow.execution.entrance.conf.FlowExecutionEntranceConfiguration import com.webank.wedatasphere.dss.flow.execution.entrance.conf.FlowExecutionEntranceConfiguration._ import com.webank.wedatasphere.dss.flow.execution.entrance.exception.FlowExecutionErrorException import com.webank.wedatasphere.dss.flow.execution.entrance.job.FlowEntranceJob @@ -34,8 +35,8 @@ import org.springframework.core.annotation.Order import org.springframework.stereotype.Component /** - * Created by peacewong on 2019/11/6. - */ + * Created by johnnwang on 2019/11/6. 
+ */ @Order(2) @Component @@ -52,7 +53,7 @@ class FlowJobNodeParser extends FlowEntranceJobParser with Logging{ val nodeName = node.getName val propsMap = new util.HashMap[String, String]() - val proxyUser = if (node.getDWSNode.getUserProxy == null) flowEntranceJob.getUser else node.getDWSNode.getUserProxy + val proxyUser = if (node.getDssNode.getUserProxy == null) flowEntranceJob.getUser else node.getDssNode.getUserProxy propsMap.put(PROJECT_NAME, project.getName) propsMap.put(FLOW_NAME, flow.getName) propsMap.put(JOB_ID, nodeName) @@ -61,12 +62,12 @@ class FlowJobNodeParser extends FlowEntranceJobParser with Logging{ propsMap.put(LinkisJobExecutionConfiguration.LINKIS_TYPE, node.getNodeType) propsMap.put(PROXY_USER, proxyUser) - propsMap.put(COMMAND, LinkisJobExecutionUtils.gson.toJson(node.getDWSNode.getJobContent)) + propsMap.put(COMMAND, LinkisJobExecutionUtils.gson.toJson(node.getDssNode.getJobContent)) - var params = node.getDWSNode.getParams + var params = node.getDssNode.getParams if (params == null) { params = new util.HashMap[String,AnyRef]() - node.getDWSNode.setParams(params) + node.getDssNode.setParams(params) } val flowVar = new util.HashMap[String, AnyRef]() val properties = flow.getFlowProperties @@ -76,6 +77,8 @@ class FlowJobNodeParser extends FlowEntranceJobParser with Logging{ } } + propsMap.put(FlowExecutionEntranceConfiguration.FLOW_EXEC_ID, flowEntranceJob.getId) + params.put(PROPS_MAP, propsMap) params.put(FLOW_VAR_MAP, flowVar) params.put(PROJECT_RESOURCES, project.getProjectResources) diff --git a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/AppJointJobBuilder.scala b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/AppJointJobBuilder.scala index 02505695a8c46d76b9b5f38419a60353c52a9192..f8173adfb671bab4eb6f9d280ce3fb58353f2f3e 100644 --- 
a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/AppJointJobBuilder.scala +++ b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/AppJointJobBuilder.scala @@ -26,13 +26,28 @@ import com.webank.wedatasphere.dss.flow.execution.entrance.job._ import com.webank.wedatasphere.dss.flow.execution.entrance.utils.FlowExecutionUtils import com.webank.wedatasphere.dss.linkis.node.execution.conf.LinkisJobExecutionConfiguration import com.webank.wedatasphere.dss.linkis.node.execution.entity.BMLResource +import com.webank.wedatasphere.dss.linkis.node.execution.execution.impl.LinkisNodeExecutionImpl import com.webank.wedatasphere.dss.linkis.node.execution.job._ +import com.webank.wedatasphere.dss.linkis.node.execution.parser.JobParamsParser import org.apache.commons.lang.StringUtils /** - * Created by peacewong on 2019/11/5. - */ + * Created by johnnwang on 2019/11/5. + */ object AppJointJobBuilder { + + val signalKeyCreator = new FlowExecutionJobSignalKeyCreator + + init() + + def init(): Unit ={ + val jobParamsParser = new JobParamsParser + + jobParamsParser.setSignalKeyCreator(signalKeyCreator) + + LinkisNodeExecutionImpl.getLinkisNodeExecution.asInstanceOf[LinkisNodeExecutionImpl].registerJobParser(jobParamsParser) + } + def builder():FlowBuilder = new FlowBuilder class FlowBuilder extends Builder { @@ -95,9 +110,10 @@ object AppJointJobBuilder { override protected def createSignalSharedJob(isLinkisType: Boolean): SignalSharedJob = { if(isLinkisType){ - null + null } else { val signalJob = new FlowExecutionAppJointSignalSharedJob + signalJob.setSignalKeyCreator(signalKeyCreator) signalJob.setJobProps(this.jobProps) signalJob } @@ -123,12 +139,12 @@ object AppJointJobBuilder { } override protected def fillLinkisJobInfo(linkisJob: LinkisJob): Unit = { - this.node.getDWSNode.getParams.get(FlowExecutionEntranceConfiguration.NODE_CONFIGURATION_KEY) match { + 
this.node.getDssNode.getParams.get(FlowExecutionEntranceConfiguration.NODE_CONFIGURATION_KEY) match { case configuration:util.Map[String, AnyRef] => linkisJob.setConfiguration(configuration) case _ => } - this.node.getDWSNode.getParams.remove(FlowExecutionEntranceConfiguration.FLOW_VAR_MAP) match { + this.node.getDssNode.getParams.remove(FlowExecutionEntranceConfiguration.FLOW_VAR_MAP) match { case flowVar:util.Map[String, AnyRef] => linkisJob.setVariables(flowVar) case _ => @@ -137,13 +153,13 @@ object AppJointJobBuilder { } override protected def fillCommonLinkisJobInfo(linkisAppjointJob: CommonLinkisJob): Unit = { - linkisAppjointJob.setJobResourceList(FlowExecutionUtils.resourcesAdaptation(this.node.getDWSNode.getResources)) - this.node.getDWSNode.getParams.remove(FlowExecutionEntranceConfiguration.PROJECT_RESOURCES) match { + linkisAppjointJob.setJobResourceList(FlowExecutionUtils.resourcesAdaptation(this.node.getDssNode.getResources)) + this.node.getDssNode.getParams.remove(FlowExecutionEntranceConfiguration.PROJECT_RESOURCES) match { case projectResources:util.List[Resource] => linkisAppjointJob.setProjectResourceList(FlowExecutionUtils.resourcesAdaptation(projectResources)) case _ => } - this.node.getDWSNode.getParams.remove(FlowExecutionEntranceConfiguration.FLOW_RESOURCES) match { + this.node.getDssNode.getParams.remove(FlowExecutionEntranceConfiguration.FLOW_RESOURCES) match { case flowResources:util.HashMap[String, util.List[BMLResource]] => linkisAppjointJob.setFlowNameAndResources(flowResources) case _ => diff --git a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/DefaultNodeRunner.scala b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/DefaultNodeRunner.scala index a9462dda8a312fd0915d2d65626941ae3b6596a1..fbd13c84932f9fd9e884cc890c303d581c5c64f8 100644 --- 
a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/DefaultNodeRunner.scala +++ b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/DefaultNodeRunner.scala @@ -27,11 +27,12 @@ import com.webank.wedatasphere.dss.flow.execution.entrance.log.FlowExecutionLog import com.webank.wedatasphere.dss.flow.execution.entrance.node.NodeExecutionState.NodeExecutionState import com.webank.wedatasphere.dss.linkis.node.execution.execution.impl.LinkisNodeExecutionImpl import com.webank.wedatasphere.dss.linkis.node.execution.listener.LinkisExecutionListener +import com.webank.wedatasphere.linkis.common.exception.ErrorException import com.webank.wedatasphere.linkis.common.utils.{Logging, Utils} /** - * Created by peacewong on 2019/11/5. - */ + * Created by johnnwang on 2019/11/5. + */ class DefaultNodeRunner extends NodeRunner with Logging { private var node: SchedulerNode = _ @@ -48,6 +49,10 @@ class DefaultNodeRunner extends NodeRunner with Logging { private var startTime: Long = _ + private var nowTime:Long = _ + + private var lastGetStatusTime: Long = 0 + override def getNode: SchedulerNode = this.node def setNode(schedulerNode: SchedulerNode): Unit = { @@ -64,6 +69,15 @@ class DefaultNodeRunner extends NodeRunner with Logging { } override def isLinkisJobCompleted: Boolean = Utils.tryCatch{ + + val interval = System.currentTimeMillis() - lastGetStatusTime + + if ( interval < FlowExecutionEntranceConfiguration.NODE_STATUS_INTERVAL.getValue){ + return false + } + + lastGetStatusTime = System.currentTimeMillis() + if(NodeExecutionState.isCompleted(getStatus)) return true val toState = NodeExecutionState.withName(LinkisNodeExecutionImpl.getLinkisNodeExecution.getState(this.linkisJob)) if (NodeExecutionState.isCompleted(toState)) { @@ -75,9 +89,11 @@ class DefaultNodeRunner extends NodeRunner with Logging { } else { false } - }{ t => - warn(s"Failed to get ${this.node.getName} linkis job 
states", t) - false + }{ + case e:ErrorException => logger.warn(s"failed to get ${this.node.getName} state", e) + false + case t :Throwable => logger.error(s"failed to get ${this.node.getName} state", t) + true } override def setNodeRunnerListener(nodeRunnerListener: NodeRunnerListener): Unit = this.nodeRunnerListener = nodeRunnerListener @@ -87,7 +103,7 @@ class DefaultNodeRunner extends NodeRunner with Logging { override def run(): Unit = { info(s"start to run node of ${node.getName}") try { - val jobProps = node.getDWSNode.getParams.remove(FlowExecutionEntranceConfiguration.PROPS_MAP) match { + val jobProps = node.getDssNode.getParams.remove(FlowExecutionEntranceConfiguration.PROPS_MAP) match { case propsMap: util.Map[String, String] => propsMap case _ => new util.HashMap[String, String]() } @@ -102,7 +118,7 @@ class DefaultNodeRunner extends NodeRunner with Logging { } LinkisNodeExecutionImpl.getLinkisNodeExecution.runJob(this.linkisJob) - info(s"Finished to run node of ${node.getName}") + info(s"start to run node of ${node.getName}") /*LinkisNodeExecutionImpl.getLinkisNodeExecution.waitForComplete(this.linkisJob) val listener = LinkisNodeExecutionImpl.getLinkisNodeExecution.asInstanceOf[LinkisExecutionListener] val toState = LinkisNodeExecutionImpl.getLinkisNodeExecution.getState(this.linkisJob) @@ -142,4 +158,7 @@ class DefaultNodeRunner extends NodeRunner with Logging { override def setStartTime(startTime: Long): Unit = this.startTime = startTime + override def getNowTime(): Long = this.nowTime + + override def setNowTime(nowTime: Long): Unit = this.nowTime = nowTime } diff --git a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/NodeRunner.scala b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/NodeRunner.scala index 5e090fe29e8e5e96b00acf2f00a19af7e761e9cf..f0a40fd57a3b997378234688ef4a6f9216529ba4 100644 --- 
a/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/NodeRunner.scala +++ b/dss-flow-execution-entrance/src/main/scala/com/webank/wedatasphere/dss/flow/execution/entrance/node/NodeRunner.scala @@ -75,6 +75,10 @@ abstract class NodeRunner extends Runnable with Logging{ def setStartTime(startTime: Long): Unit + def getNowTime():Long + + def setNowTime(nowTime: Long):Unit + protected def transitionState(toState: NodeExecutionState): Unit = Utils.tryAndWarn{ if (getStatus == toState) return info(s"from state $getStatus to $toState") diff --git a/dss-linkis-node-execution/pom.xml b/dss-linkis-node-execution/pom.xml index f09266cdf7f129fd2e9df4b188c0ef16fa895732..f773b12f2459bed429c2aa3d4404d2e93f41a341 100644 --- a/dss-linkis-node-execution/pom.xml +++ b/dss-linkis-node-execution/pom.xml @@ -24,7 +24,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 dss-linkis-node-execution @@ -33,13 +33,13 @@ com.webank.wedatasphere.linkis linkis-ujes-client - 0.9.1 + ${linkis.version} com.webank.wedatasphere.linkis linkis-workspace-httpclient - 0.9.1 + ${linkis.version} diff --git a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/WorkflowContextImpl.java b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/WorkflowContextImpl.java index e92881356996baacd0ce000a52b98748c966b393..8f2ff84fac9d9495c24747ee0268a5c25f128a77 100644 --- a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/WorkflowContextImpl.java +++ b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/WorkflowContextImpl.java @@ -21,12 +21,13 @@ import com.google.common.cache.Cache; import com.google.common.cache.CacheBuilder; import com.webank.wedatasphere.dss.linkis.node.execution.conf.LinkisJobExecutionConfiguration; import com.webank.wedatasphere.dss.linkis.node.execution.entity.ContextInfo; +import 
org.apache.commons.lang.StringUtils; import java.util.*; import java.util.concurrent.TimeUnit; /** - * Created by peacewong on 2019/9/26. + * Created by johnnwang on 2019/9/26. */ public class WorkflowContextImpl implements WorkflowContext { @@ -79,7 +80,7 @@ public class WorkflowContextImpl implements WorkflowContext { while (keys.hasNext()) { String key = keys.next(); if (key.startsWith(keyPrefix)) { - map.put(key, getValue(key)); + map.put(StringUtils.substringAfter(key, keyPrefix), getValue(key)); } } return map; diff --git a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/conf/LinkisJobExecutionConfiguration.java b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/conf/LinkisJobExecutionConfiguration.java index 852262bb57b8c231df20fc63dba1fe991d77ab41..9b75d73a01b6b4267b8d8fccbf12e0349e8ae496 100644 --- a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/conf/LinkisJobExecutionConfiguration.java +++ b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/conf/LinkisJobExecutionConfiguration.java @@ -68,7 +68,7 @@ public class LinkisJobExecutionConfiguration { public final static CommonVars LINKIS_CONNECTION_TIMEOUT = CommonVars.apply("wds.linkis.flow.connection.timeout",30000); - public final static CommonVars LINKIS_JOB_REQUEST_STATUS_TIME = CommonVars.apply("wds.linkis.flow.connection.timeout",1000); + public final static CommonVars LINKIS_JOB_REQUEST_STATUS_TIME = CommonVars.apply("wds.linkis.flow.connection.timeout",3000); public final static CommonVars LINKIS_ADMIN_USER = CommonVars.apply("wds.linkis.client.flow.adminuser","ws"); diff --git a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/execution/impl/LinkisNodeExecutionImpl.java 
b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/execution/impl/LinkisNodeExecutionImpl.java index 635ce6ac03f6ba93ed5704e33bb99808404c49f3..264e8288c461de3a4b1a49718bc53931229c053a 100644 --- a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/execution/impl/LinkisNodeExecutionImpl.java +++ b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/execution/impl/LinkisNodeExecutionImpl.java @@ -23,7 +23,6 @@ import com.webank.wedatasphere.dss.linkis.node.execution.execution.LinkisNodeExe import com.webank.wedatasphere.dss.linkis.node.execution.job.SharedJob; import com.webank.wedatasphere.dss.linkis.node.execution.job.SignalSharedJob; import com.webank.wedatasphere.dss.linkis.node.execution.listener.LinkisExecutionListener; -import com.webank.wedatasphere.dss.linkis.node.execution.parser.JobParamsParser; import com.webank.wedatasphere.dss.linkis.node.execution.parser.JobRuntimeParamsParser; import com.webank.wedatasphere.dss.linkis.node.execution.service.impl.BuildJobActionImpl; import com.webank.wedatasphere.dss.linkis.node.execution.conf.LinkisJobExecutionConfiguration; @@ -51,7 +50,7 @@ public class LinkisNodeExecutionImpl implements LinkisNodeExecution , LinkisExec private LinkisNodeExecutionImpl() { registerJobParser(new CodeParser()); - registerJobParser(new JobParamsParser()); + /*registerJobParser(new JobParamsParser());*/ registerJobParser(new JobRuntimeParamsParser()); } @@ -107,7 +106,7 @@ public class LinkisNodeExecutionImpl implements LinkisNodeExecution , LinkisExec job.getLogFromLine(), LinkisJobExecutionConfiguration.LOG_SIZE.getValue()); - job.setLogFromLint(jobLogResult.fromLine()); + job.setLogFromLine(jobLogResult.fromLine()); ArrayList logArray = jobLogResult.getLog(); @@ -191,12 +190,7 @@ public class LinkisNodeExecutionImpl implements LinkisNodeExecution , LinkisExec if (job instanceof SignalSharedJob){ SignalSharedJob 
signalSharedJob = (SignalSharedJob) job; String result = getResult(job, 0, -1); - String msgSaveKey = signalSharedJob.getMsgSaveKey(); - String key = SignalSharedJob.PREFIX ; - if (StringUtils.isNotEmpty(msgSaveKey)){ - key = key + msgSaveKey; - } - WorkflowContext.getAppJointContext().setValue(key, result , -1); + WorkflowContext.getAppJointContext().setValue(signalSharedJob.getSharedKey(), result , -1); } else if(job instanceof SharedJob){ String taskId = job.getJobExecuteResult().getTaskID(); job.getLogObj().info("Set shared info:" + taskId); diff --git a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/AbstractAppJointLinkisJob.java b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/AbstractAppJointLinkisJob.java index 1bce73c8af2a2d1b06513f33bf96e09feb85fecf..7d9e47f069fa704ba45561674cd6b655ce68d211 100644 --- a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/AbstractAppJointLinkisJob.java +++ b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/AbstractAppJointLinkisJob.java @@ -146,7 +146,7 @@ public abstract class AbstractAppJointLinkisJob extends AppJointLinkisJob { } @Override - public void setLogFromLint(int index) { + public void setLogFromLine(int index) { this.logFromLine = index; } diff --git a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/AbstractCommonLinkisJob.java b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/AbstractCommonLinkisJob.java index 2d836eaacc497fab8cb691f23e69cebf388db1d1..c7f74f2348faa674b8414191e56f775fa9b345dc 100644 --- a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/AbstractCommonLinkisJob.java +++ 
b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/AbstractCommonLinkisJob.java @@ -154,7 +154,7 @@ public abstract class AbstractCommonLinkisJob extends CommonLinkisJob { } @Override - public void setLogFromLint(int index) { + public void setLogFromLine(int index) { this.logFromLine = index; } diff --git a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/Job.java b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/Job.java index ac25c724b474baf519e5d8918b695d641a82d191..1811ad105e43b4ddd06bb57b3d1cc2e0415d7dd4 100644 --- a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/Job.java +++ b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/Job.java @@ -63,6 +63,6 @@ public interface Job { int getLogFromLine(); - void setLogFromLint(int index); + void setLogFromLine(int index); } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/DWSUserMapper.java b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/JobSignalKeyCreator.java similarity index 74% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/DWSUserMapper.java rename to dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/JobSignalKeyCreator.java index 25daa48acb84a7c39e88fd4a48db5c7177b0ef71..ce64bf90175edc4c8665a148264f0d01495e5f11 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/DWSUserMapper.java +++ b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/JobSignalKeyCreator.java @@ -15,13 +15,11 @@ * */ -package com.webank.wedatasphere.dss.server.dao; +package com.webank.wedatasphere.dss.linkis.node.execution.job; -import org.apache.ibatis.annotations.Mapper; +public interface 
JobSignalKeyCreator { + String getSignalKeyByJob(Job job); -public interface DWSUserMapper { - Long getUserID(String userName); - - String getuserName(Long userID); + String getSignalKeyBySignalSharedJob(SignalSharedJob job); } diff --git a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/SignalSharedJob.java b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/SignalSharedJob.java index a71a12555af1ffced4f89c461420ad882d92d24c..13a9b8b444cce7ef0598692bdf96dc6fd936b9e4 100644 --- a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/SignalSharedJob.java +++ b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/job/SignalSharedJob.java @@ -24,6 +24,10 @@ public interface SignalSharedJob extends SharedJob { String PREFIX = "signal."; + JobSignalKeyCreator getSignalKeyCreator(); + + void setSignalKeyCreator(JobSignalKeyCreator signalKeyCreator); + @Override default int getSharedNum() { return -1; @@ -31,7 +35,7 @@ public interface SignalSharedJob extends SharedJob { @Override default String getSharedKey() { - return PREFIX + getMsgSaveKey(); + return PREFIX + getSignalKeyCreator().getSignalKeyBySignalSharedJob(this) + "." 
+ getMsgSaveKey(); } String getMsgSaveKey(); diff --git a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/parser/JobParamsParser.java b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/parser/JobParamsParser.java index fca769d8a63310c68f5c3d6dd5fecad892db6c14..27174fbcc8f3f9e047f359b33c95b2c5353e1b11 100644 --- a/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/parser/JobParamsParser.java +++ b/dss-linkis-node-execution/src/main/java/com/webank/wedatasphere/dss/linkis/node/execution/parser/JobParamsParser.java @@ -19,18 +19,32 @@ package com.webank.wedatasphere.dss.linkis.node.execution.parser; import com.google.gson.reflect.TypeToken; import com.webank.wedatasphere.dss.linkis.node.execution.WorkflowContext; +import com.webank.wedatasphere.dss.linkis.node.execution.job.JobSignalKeyCreator; import com.webank.wedatasphere.dss.linkis.node.execution.job.LinkisJob; import com.webank.wedatasphere.dss.linkis.node.execution.job.Job; import com.webank.wedatasphere.dss.linkis.node.execution.job.SignalSharedJob; import com.webank.wedatasphere.dss.linkis.node.execution.utils.LinkisJobExecutionUtils; - +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import java.util.Collection; +import java.util.HashMap; import java.util.List; import java.util.Map; /** - * Created by peacewong on 2019/11/3. + * Created by johnnwang on 2019/11/3. 
*/ public class JobParamsParser implements JobParser { + private static final Logger LOGGER = LoggerFactory.getLogger(JobParamsParser.class); + private JobSignalKeyCreator signalKeyCreator; + + public JobSignalKeyCreator getSignalKeyCreator() { + return signalKeyCreator; + } + + public void setSignalKeyCreator(JobSignalKeyCreator signalKeyCreator) { + this.signalKeyCreator = signalKeyCreator; + } @Override public void parseJob(Job job) throws Exception { @@ -44,11 +58,19 @@ public class JobParamsParser implements JobParser { Map flowVariables = linkisJob.getVariables(); putParamsMap(job.getParams(), "variable", flowVariables); //put signal info - Map sharedValue = WorkflowContext.getAppJointContext().getSubMapByPrefix(SignalSharedJob.PREFIX); + Map sharedValue = WorkflowContext.getAppJointContext() + .getSubMapByPrefix(SignalSharedJob.PREFIX + this.getSignalKeyCreator().getSignalKeyByJob(job)); if (sharedValue != null) { - putParamsMap(job.getParams(), "variable", sharedValue); + Collection values = sharedValue.values(); + for(Object value : values){ + List> list = LinkisJobExecutionUtils.gson.fromJson(value.toString(), List.class); + Map totalMap = new HashMap<>(); + for (Map kv : list) { + totalMap.putAll(kv); + } + putParamsMap(job.getParams(), "variable", totalMap); + } } - // put configuration Map configuration = linkisJob.getConfiguration(); putParamsMap(job.getParams(), "configuration", configuration); diff --git a/dss-scheduler-appjoint-core/pom.xml b/dss-scheduler-appjoint-core/pom.xml index 1af112fdb18acfbf6b8e4a83c12f370824d300fc..72f112372a9812e7381c9275623754e1a1fb6c12 100644 --- a/dss-scheduler-appjoint-core/pom.xml +++ b/dss-scheduler-appjoint-core/pom.xml @@ -22,7 +22,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 @@ -48,6 +48,11 @@ ${dss.version} + + com.google.code.gson + gson + 2.8.5 + diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/SchedulerAppJoint.java 
b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/SchedulerAppJoint.java index 28450d82ed1e6238d24d542f467256a7833ffcfd..9923d90c73138d220051ada6886f441391c89f49 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/SchedulerAppJoint.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/SchedulerAppJoint.java @@ -18,8 +18,11 @@ package com.webank.wedatasphere.dss.appjoint.scheduler; import com.webank.wedatasphere.dss.appjoint.AppJoint; +import com.webank.wedatasphere.dss.appjoint.scheduler.hooks.ProjectPublishHook; +import com.webank.wedatasphere.dss.appjoint.scheduler.parser.ProjectParser; import com.webank.wedatasphere.dss.appjoint.scheduler.service.SchedulerProjectService; import com.webank.wedatasphere.dss.appjoint.scheduler.service.SchedulerSecurityService; +import com.webank.wedatasphere.dss.appjoint.scheduler.tuning.ProjectTuning; import java.io.Closeable; @@ -32,4 +35,10 @@ public interface SchedulerAppJoint extends AppJoint, Closeable { SchedulerSecurityService getSecurityService(); + ProjectParser getProjectParser(); + ProjectTuning getProjectTuning(); + + ProjectPublishHook[] getProjectPublishHooks(); + + } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/AbstractSchedulerNode.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/AbstractSchedulerNode.java index 167d61fda6a5c45ebbdef52a572a6c76fd99949f..a5139d7d9c449a2b7e3ab60cb41fe386e0d2936b 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/AbstractSchedulerNode.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/AbstractSchedulerNode.java @@ -17,7 +17,7 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.entity; -import 
com.webank.wedatasphere.dss.common.entity.node.DWSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; import java.util.List; @@ -26,65 +26,65 @@ import java.util.List; */ public abstract class AbstractSchedulerNode implements SchedulerNode { - private DWSNode dwsNode; + private DSSNode dssNode; @Override - public DWSNode getDWSNode() { - return this.dwsNode; + public DSSNode getDssNode() { + return this.dssNode; } @Override - public void setDWSNode(DWSNode dwsNode) { - this.dwsNode = dwsNode; + public void setDssNode(DSSNode dssNode) { + this.dssNode = dssNode; } @Override public String getId() { - return dwsNode.getId(); + return dssNode.getId(); } @Override public void setId(String id) { - dwsNode.setId(id); + dssNode.setId(id); } @Override public String getNodeType() { - return dwsNode.getNodeType(); + return dssNode.getNodeType(); } @Override public void setNodeType(String nodeType) { - dwsNode.setNodeType(nodeType); + dssNode.setNodeType(nodeType); } @Override public String getName() { - return dwsNode.getName(); + return dssNode.getName(); } @Override public void setName(String name) { - dwsNode.setName(name); + dssNode.setName(name); } @Override public void addDependency(String nodeName) { - dwsNode.addDependency(nodeName); + dssNode.addDependency(nodeName); } @Override public void setDependency(List dependency) { - dwsNode.setDependency(dependency); + dssNode.setDependency(dependency); } @Override public void removeDependency(String nodeName) { - dwsNode.removeDependency(nodeName); + dssNode.removeDependency(nodeName); } @Override public List getDependencys() { - return dwsNode.getDependencys(); + return dssNode.getDependencys(); } } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/AbstractSchedulerProject.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/AbstractSchedulerProject.java index 
6cfd51647f3e44b53d99aafb985ef36f5b023c7b..3a8ad55eb8defa03badcdcd127fb410ac0cbafe7 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/AbstractSchedulerProject.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/AbstractSchedulerProject.java @@ -17,7 +17,7 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.entity; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; import com.webank.wedatasphere.dss.common.entity.project.ProjectVersion; import java.util.List; @@ -32,7 +32,7 @@ public abstract class AbstractSchedulerProject implements SchedulerProject { private String name; private String description; - private DWSProject dwsProject; + private DSSProject dssProject; private List schedulerFlows; private List projectVersions; @@ -101,12 +101,12 @@ public abstract class AbstractSchedulerProject implements SchedulerProject { } @Override - public DWSProject getDWSProject() { - return this.dwsProject; + public DSSProject getDssProject() { + return this.dssProject; } @Override - public void setDWSProject(DWSProject dwsProject) { - this.dwsProject = dwsProject; + public void setDssProject(DSSProject dssProject) { + this.dssProject = dssProject; } } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerEdge.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerEdge.java index 582f0a4fb7de03087f86c0fd995a621aad9e74cf..aee1e72292f126cb50bb40bbbe223a830b69a426 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerEdge.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerEdge.java @@ -17,13 +17,13 @@ package 
com.webank.wedatasphere.dss.appjoint.scheduler.entity; -import com.webank.wedatasphere.dss.common.entity.node.DWSEdge; +import com.webank.wedatasphere.dss.common.entity.node.DSSEdge; /** * Created by enjoyyin on 2019/9/7. */ public interface SchedulerEdge { - DWSEdge getDWSEdge(); + DSSEdge getDssEdge(); - void setDWSEdge(DWSEdge dwsEdge); + void setDssEdge(DSSEdge dssEdge); } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerEdgeDefault.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerEdgeDefault.java index b9f9fb96010108fee14e57ac553aa4a3e281381c..db663fe540387fa57a2f40296093e55c11ab2057 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerEdgeDefault.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerEdgeDefault.java @@ -17,22 +17,22 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.entity; -import com.webank.wedatasphere.dss.common.entity.node.DWSEdge; +import com.webank.wedatasphere.dss.common.entity.node.DSSEdge; /** * Created by allenlliu on 2019/9/19. 
*/ public class SchedulerEdgeDefault implements SchedulerEdge { - private DWSEdge dwsEdge; + private DSSEdge dssEdge; @Override - public DWSEdge getDWSEdge() { - return dwsEdge; + public DSSEdge getDssEdge() { + return dssEdge; } @Override - public void setDWSEdge(DWSEdge dwsEdge) { - this.dwsEdge = dwsEdge; + public void setDssEdge(DSSEdge dssEdge) { + this.dssEdge = dssEdge; } } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerNode.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerNode.java index 1d756efdd225b092a34600b604e595b1cd598c42..612c2e25d6a99e8e5581ec1d768d7c7f450c803f 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerNode.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerNode.java @@ -18,14 +18,14 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.entity; -import com.webank.wedatasphere.dss.common.entity.node.DWSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; import com.webank.wedatasphere.dss.common.entity.node.Node; /** * Created by enjoyyin on 2019/9/7. 
*/ public interface SchedulerNode extends Node { - DWSNode getDWSNode(); + DSSNode getDssNode(); - void setDWSNode(DWSNode dwsNode); + void setDssNode(DSSNode dssNode); } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerProject.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerProject.java index ca3d8362e547b1a612f47ed988e5d734e918477d..d4158b0f4c1eb0cd3c65e49bef1e5fbf708c91f7 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerProject.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/entity/SchedulerProject.java @@ -17,13 +17,13 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.entity; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; import com.webank.wedatasphere.dss.common.entity.project.Project; /** * Created by enjoyyin on 2019/9/16. 
*/ public interface SchedulerProject extends Project { - DWSProject getDWSProject(); - void setDWSProject(DWSProject dwsProject); + DSSProject getDssProject(); + void setDssProject(DSSProject dssProject); } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractFlowParser.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractFlowParser.java index 546cb32ed604b5deedec646abd39863168503aca..96a9f2aed6f1055f14316e24ca80263efce2bd95 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractFlowParser.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractFlowParser.java @@ -27,12 +27,12 @@ import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerEdgeDefaul import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerNode; import com.webank.wedatasphere.dss.common.entity.Resource; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSJSONFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSJSONFlow; import com.webank.wedatasphere.dss.common.entity.flow.Flow; -import com.webank.wedatasphere.dss.common.entity.node.DWSEdge; -import com.webank.wedatasphere.dss.common.entity.node.DWSEdgeDefault; -import com.webank.wedatasphere.dss.common.entity.node.DWSNode; -import com.webank.wedatasphere.dss.common.entity.node.DWSNodeDefault; +import com.webank.wedatasphere.dss.common.entity.node.DSSEdge; +import com.webank.wedatasphere.dss.common.entity.node.DSSEdgeDefault; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNodeDefault; import org.springframework.beans.BeanUtils; import java.util.*; @@ -64,39 +64,39 @@ public abstract class AbstractFlowParser implements 
FlowParser { protected void dealFlowProperties(Flow flow){} @Override - public SchedulerFlow parseFlow(DWSJSONFlow flow) { + public SchedulerFlow parseFlow(DSSJSONFlow flow) { downloadFlowResources(); dealFlowResources(); dealFlowProperties(flow); - return resolveDWSJSONFlow(flow); + return resolveDSSJSONFlow(flow); } - // 解析DWSJSONFlow,生成DWSNode - public SchedulerFlow resolveDWSJSONFlow(DWSJSONFlow jsonFlow){ + // 解析DSSJSONFlow,生成DSSNode + public SchedulerFlow resolveDSSJSONFlow(DSSJSONFlow jsonFlow){ SchedulerFlow schedulerFlow = createSchedulerFlow(); BeanUtils.copyProperties(jsonFlow,schedulerFlow,"children"); JsonParser parser = new JsonParser(); JsonObject jsonObject = parser.parse(jsonFlow.getJson()).getAsJsonObject(); JsonArray nodeJsonArray = jsonObject.getAsJsonArray("nodes"); Gson gson = new Gson(); - List dwsNodes = gson.fromJson(nodeJsonArray, new TypeToken>() { + List dssNodes = gson.fromJson(nodeJsonArray, new TypeToken>() { }.getType()); List schedulerNodeList = new ArrayList<>(); List schedulerEdgeList = new ArrayList<>(); - for (DWSNode dwsNode : dwsNodes) { + for (DSSNode dssNode : dssNodes) { Optional firstNodeParser = Arrays.stream(getNodeParsers()) - .filter(p -> p.ifNodeCanParse(dwsNode)) + .filter(p -> p.ifNodeCanParse(dssNode)) .sorted((p1, p2) -> p2.getOrder() - p1.getOrder()) .findFirst(); - SchedulerNode schedulerNode = firstNodeParser.orElseThrow(()->new IllegalArgumentException("NodeParser个数应该大于0")).parseNode(dwsNode); + SchedulerNode schedulerNode = firstNodeParser.orElseThrow(()->new IllegalArgumentException("NodeParser个数应该大于0")).parseNode(dssNode); schedulerNodeList.add(schedulerNode); } JsonArray edgeJsonArray = jsonObject.getAsJsonArray("edges"); - List dwsEdges = gson.fromJson(edgeJsonArray, new TypeToken>() { + List dssEdges = gson.fromJson(edgeJsonArray, new TypeToken>() { }.getType()); - for (DWSEdge dwsEdge : dwsEdges) { + for (DSSEdge dssEdge : dssEdges) { SchedulerEdge schedulerEdge = new SchedulerEdgeDefault(); - 
schedulerEdge.setDWSEdge(dwsEdge); + schedulerEdge.setDssEdge(dssEdge); schedulerEdgeList.add(schedulerEdge); } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractProjectParser.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractProjectParser.java index 27ef1c1c25a2ee6575c24a93e2eae5d6c224e0f6..a7d481d24b092dd3945d73a967c66cc1bb25b96c 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractProjectParser.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractProjectParser.java @@ -21,10 +21,10 @@ import com.webank.wedatasphere.dss.appjoint.scheduler.entity.AbstractSchedulerPr import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerProjectVersionForFlows; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerFlow; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerProject; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSJSONFlow; -import com.webank.wedatasphere.dss.common.entity.project.DWSJSONProject; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSJSONFlow; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSJSONProject; import com.webank.wedatasphere.dss.common.entity.project.ProjectVersionForFlows; import org.springframework.beans.BeanUtils; @@ -50,27 +50,27 @@ public abstract class AbstractProjectParser implements ProjectParser { return flowParsers; } - public DWSJSONProject parseToDWSJSONProject(DWSProject dwsProject){ - DWSJSONProject dwsjsonProject = new DWSJSONProject(); - 
BeanUtils.copyProperties(dwsProject,dwsjsonProject,"flows","projectVersions"); - List dwsFlows = dwsProject.getFlows(); - List dwsjsonFlows = dwsFlows.stream().map(this::toDWSJsonFlow).collect(Collectors.toList()); - dwsjsonProject.setFlows(dwsjsonFlows); - return dwsjsonProject; + public DSSJSONProject parseToDssJsonProject(DSSProject dssProject){ + DSSJSONProject dssJsonProject = new DSSJSONProject(); + BeanUtils.copyProperties(dssProject, dssJsonProject,"flows","projectVersions"); + List dwsFlows = dssProject.getFlows(); + List dssJsonFlows = dwsFlows.stream().map(this::toDssJsonFlow).collect(Collectors.toList()); + dssJsonProject.setFlows(dssJsonFlows); + return dssJsonProject; } - private DWSJSONFlow toDWSJsonFlow(DWSFlow dwsFlow){ - DWSJSONFlow dwsjsonFlow = new DWSJSONFlow(); - BeanUtils.copyProperties(dwsFlow,dwsjsonFlow,"children","flowVersions"); - dwsjsonFlow.setJson(dwsFlow.getLatestVersion().getJson()); - if(dwsFlow.getChildren() != null){ - dwsjsonFlow.setChildren(dwsFlow.getChildren().stream().map(this::toDWSJsonFlow).collect(Collectors.toList())); + private DSSJSONFlow toDssJsonFlow(DSSFlow dssFlow){ + DSSJSONFlow dssJsonFlow = new DSSJSONFlow(); + BeanUtils.copyProperties(dssFlow, dssJsonFlow,"children","flowVersions"); + dssJsonFlow.setJson(dssFlow.getLatestVersion().getJson()); + if(dssFlow.getChildren() != null){ + dssJsonFlow.setChildren(dssFlow.getChildren().stream().map(this::toDssJsonFlow).collect(Collectors.toList())); } - return dwsjsonFlow; + return dssJsonFlow; } - public SchedulerProject parseProject(DWSJSONProject project){ + public SchedulerProject parseProject(DSSJSONProject project){ AbstractSchedulerProject schedulerProject = createSchedulerProject(); SchedulerProjectVersionForFlows projectVersionForFlows = new SchedulerProjectVersionForFlows(); schedulerProject.setProjectVersions(new ArrayList()); @@ -81,23 +81,23 @@ public abstract class AbstractProjectParser implements ProjectParser { return schedulerProject; } - private 
SchedulerFlow invokeFlowParser(ProjectVersionForFlows projectVersionForFlows, DWSJSONFlow dwsjsonFlow, FlowParser[] flowParsers){ - List flowParsersF = Arrays.stream(flowParsers).filter(f -> f.ifFlowCanParse(dwsjsonFlow)).collect(Collectors.toList()); + private SchedulerFlow invokeFlowParser(ProjectVersionForFlows projectVersionForFlows, DSSJSONFlow dssJsonFlow, FlowParser[] flowParsers){ + List flowParsersF = Arrays.stream(flowParsers).filter(f -> f.ifFlowCanParse(dssJsonFlow)).collect(Collectors.toList()); // TODO: 2019/9/25 如果flowParsers数量>1 ||<=0抛出异常 - SchedulerFlow schedulerFlow = flowParsersF.get(0).parseFlow(dwsjsonFlow); + SchedulerFlow schedulerFlow = flowParsersF.get(0).parseFlow(dssJsonFlow); //收集所有的不分层级的flow? projectVersionForFlows.addFlow(schedulerFlow); - if(dwsjsonFlow.getChildren() != null){ - List schedulerFlows = dwsjsonFlow.getChildren().stream().map(f -> invokeFlowParser(projectVersionForFlows,f, flowParsers)).collect(Collectors.toList()); + if(dssJsonFlow.getChildren() != null){ + List schedulerFlows = dssJsonFlow.getChildren().stream().map(f -> invokeFlowParser(projectVersionForFlows,f, flowParsers)).collect(Collectors.toList()); schedulerFlow.setChildren(schedulerFlows); } return schedulerFlow; } @Override - public SchedulerProject parseProject(DWSProject dwsProject) { - SchedulerProject schedulerProject = parseProject(parseToDWSJSONProject(dwsProject)); - schedulerProject.setDWSProject(dwsProject); + public SchedulerProject parseProject(DSSProject dssProject) { + SchedulerProject schedulerProject = parseProject(parseToDssJsonProject(dssProject)); + schedulerProject.setDssProject(dssProject); return schedulerProject; } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractReadNodeParser.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractReadNodeParser.java index 
6cd767f7208fb7da3bfda4bf9eb8d907e0174206..68e8d970e74fe87642a5bf1768845d733b0b4284 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractReadNodeParser.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/AbstractReadNodeParser.java @@ -18,7 +18,7 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerNode; -import com.webank.wedatasphere.dss.common.entity.node.DWSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.ReadNode; import java.util.Arrays; @@ -32,7 +32,7 @@ public abstract class AbstractReadNodeParser implements ContextNodeParser { @Override public String[] getShareNodeIds(SchedulerNode node) { //需根据节点的参数进行解析生成 - Map jobParams = node.getDWSNode().getParams(); + Map jobParams = node.getDssNode().getParams(); if(jobParams == null) return null; Map configuration =(Map) jobParams.get("configuration"); Map runtime = (Map) configuration.get("runtime"); @@ -57,16 +57,16 @@ public abstract class AbstractReadNodeParser implements ContextNodeParser { @Override public ReadNode parseNode(SchedulerNode node) { ReadNode readNode = createReadNode(); - readNode.setDWSNode(node.getDWSNode()); + readNode.setDssNode(node.getDssNode()); readNode.setSchedulerNode(node); readNode.setShareNodeIds(getShareNodeIds(node)); return readNode; } @Override - public SchedulerNode parseNode(DWSNode dwsNode) { + public SchedulerNode parseNode(DSSNode dssNode) { SchedulerNode schedulerNode = createSchedulerNode(); - schedulerNode.setDWSNode(dwsNode); + schedulerNode.setDssNode(dssNode); return parseNode(schedulerNode); } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/FlowParser.java 
b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/FlowParser.java index b6b49897c659b11f5f18dc80fff910c9ed8a9c11..85369debcee72e70ac6a9c04401dae23cca12f94 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/FlowParser.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/FlowParser.java @@ -19,18 +19,18 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSJSONFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSJSONFlow; /** * Created by enjoyyin on 2019/9/7. */ public interface FlowParser { - SchedulerFlow parseFlow(DWSJSONFlow flow); + SchedulerFlow parseFlow(DSSJSONFlow flow); void setNodeParsers(NodeParser[] nodeParsers); NodeParser[] getNodeParsers(); - Boolean ifFlowCanParse(DWSJSONFlow flow); + Boolean ifFlowCanParse(DSSJSONFlow flow); } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/NodeParser.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/NodeParser.java index 602d386235fa62c7cb88e89327a021c5a12fe4a2..ebba917a160250fe95e6fffc776168cb056329b0 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/NodeParser.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/NodeParser.java @@ -18,14 +18,14 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerNode; -import com.webank.wedatasphere.dss.common.entity.node.DWSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; import com.webank.wedatasphere.dss.appjoint.scheduler.order.Order; /** 
* Created by enjoyyin on 2019/9/7. */ public interface NodeParser extends Order { - SchedulerNode parseNode(DWSNode dwsNode); + SchedulerNode parseNode(DSSNode dssNode); - Boolean ifNodeCanParse(DWSNode dwsNode); + Boolean ifNodeCanParse(DSSNode dssNode); } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/ProjectParser.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/ProjectParser.java index 13ffd4362c146974712bc0be30db60551f4a0850..db00db311a81fab39791d7d730d67d456dd8bf60 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/ProjectParser.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/ProjectParser.java @@ -18,13 +18,13 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.entity.SchedulerProject; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; /** * Created by enjoyyin on 2019/9/16. 
*/ public interface ProjectParser { - SchedulerProject parseProject(DWSProject dwsProject); + SchedulerProject parseProject(DSSProject dssProject); void setFlowParsers(FlowParser[] flowParser); FlowParser[] getFlowParsers(); diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/SendEmailNodeParser.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/SendEmailNodeParser.java index f4511a8d39f52ad798f200e1c84b65084838b77a..1bd3606f0d96daf0fe1a1e0c97821a0a3c8db766 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/SendEmailNodeParser.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/parser/SendEmailNodeParser.java @@ -18,7 +18,7 @@ package com.webank.wedatasphere.dss.appjoint.scheduler.parser; import com.webank.wedatasphere.dss.appjoint.scheduler.constant.SchedulerAppJointConstant; -import com.webank.wedatasphere.dss.common.entity.node.DWSNode; +import com.webank.wedatasphere.dss.common.entity.node.DSSNode; import java.util.Map; @@ -28,9 +28,9 @@ import java.util.Map; public abstract class SendEmailNodeParser extends AbstractReadNodeParser { @Override - public Boolean ifNodeCanParse(DWSNode dwsNode) { + public Boolean ifNodeCanParse(DSSNode dssNode) { // check that this is a sendemail node and that its category is node - Map params = dwsNode.getParams(); + Map params = dssNode.getParams(); if(params != null && !params.isEmpty()){ Object configuration = params.get(SchedulerAppJointConstant.CONFIGURATION); if(configuration instanceof Map){ @@ -38,7 +38,7 @@ public abstract class SendEmailNodeParser extends AbstractReadNodeParser { if(runtime instanceof Map){ Object category = ((Map) runtime).get(SchedulerAppJointConstant.CATEGORY); if(category != null && SchedulerAppJointConstant.NODE.equals(category.toString())){ - return SchedulerAppJointConstant.SENDEMAIL_NODE_TYPE.equals(dwsNode.getNodeType());
+ return SchedulerAppJointConstant.SENDEMAIL_NODE_TYPE.equals(dssNode.getNodeType()); } } } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/service/SchedulerSecurityService.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/service/SchedulerSecurityService.java index 6113d3a5ccd2a2a98853454854b75fa5c9dfd2fc..a9652e52c4e4399406ef47bdbf8df6b216d71646 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/service/SchedulerSecurityService.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/service/SchedulerSecurityService.java @@ -23,4 +23,5 @@ import com.webank.wedatasphere.dss.appjoint.service.SecurityService; * Created by enjoyyin on 2019/10/12. */ public interface SchedulerSecurityService extends SecurityService { + void reloadToken(); } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/tuning/AbstractFlowTuning.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/tuning/AbstractFlowTuning.java index f55ca135c883f20bf93009cd969c20f79d7b6730..42423a69d66d92a087b3db30a6d30fc28e2ea20f 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/tuning/AbstractFlowTuning.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/tuning/AbstractFlowTuning.java @@ -75,8 +75,8 @@ public abstract class AbstractFlowTuning implements FlowTuning { private List resolveDependencys(SchedulerNode node,List schedulerNodes, List flowEdges) { List dependencys = new ArrayList<>(); flowEdges.forEach(edge -> { - if (edge.getDWSEdge().getTarget().equals(node.getId())) { - dependencys.add(schedulerNodes.stream().filter(n ->edge.getDWSEdge().getSource().equals(n.getId())).findFirst().get().getName()); + if 
(edge.getDssEdge().getTarget().equals(node.getId())) { + dependencys.add(schedulerNodes.stream().filter(n ->edge.getDssEdge().getSource().equals(n.getId())).findFirst().get().getName()); } }); @@ -112,7 +112,7 @@ public abstract class AbstractFlowTuning implements FlowTuning { private void setProxyUser(SchedulerFlow schedulerFlow) { String proxyUser = getProxyUser(schedulerFlow); if(StringUtils.isNotBlank(proxyUser)) { - schedulerFlow.getSchedulerNodes().forEach(node -> node.getDWSNode().setUserProxy(proxyUser)); + schedulerFlow.getSchedulerNodes().forEach(node -> node.getDssNode().setUserProxy(proxyUser)); schedulerFlow.setUserProxy(proxyUser); } } diff --git a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/tuning/AbstractShareNodeFlowTuning.java b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/tuning/AbstractShareNodeFlowTuning.java index 84493b510ff93f20ebbbf539f630f55b30137834..81dc0d5d31e831e630388eacdb9fe3cb9a3b74b5 100644 --- a/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/tuning/AbstractShareNodeFlowTuning.java +++ b/dss-scheduler-appjoint-core/src/main/java/com/webank/wedatasphere/dss/appjoint/scheduler/tuning/AbstractShareNodeFlowTuning.java @@ -66,9 +66,8 @@ public abstract class AbstractShareNodeFlowTuning extends AbstractFlowTuning imp Map res = new HashMap<>(); // iterate over readNodes, converting the NodeIds into a collection of names, filtering out ids of nodes that were deleted but still linger in the content Arrays.stream(readNodes).filter(rn ->rn.getShareNodeIds() != null).forEach(rn ->{ - List names = Arrays.stream(rn.getShareNodeIds()).filter(id->flow.getSchedulerNodes().stream().filter(sn -> !id.equals(sn.getId())).findFirst().isPresent()).
- map(id -> flow.getSchedulerNodes().stream().filter(sn -> id.equals(sn.getId())).findFirst().get().getName()).collect(Collectors.toList()); - rn.setShareNodeIds(names.toArray(new String[0])); + rn.setShareNodeIds(Arrays.stream(rn.getShareNodeIds()).filter(id -> flow.getSchedulerNodes().stream().anyMatch(sn -> id.equals(sn.getId()))). + map(id -> flow.getSchedulerNodes().stream().filter(sn -> id.equals(sn.getId())).findFirst().get().getName()).toArray(String[]::new)); }); Stream.of(readNodes).forEach(x -> { @@ -84,7 +83,7 @@ public abstract class AbstractShareNodeFlowTuning extends AbstractFlowTuning imp if(schedulerNode != null) { int shareTimes = (nameAndNumMap.get(key)).intValue(); ShareNode shareNode = createShareNode(); - shareNode.setDWSNode(schedulerNode.getDWSNode()); + shareNode.setDssNode(schedulerNode.getDssNode()); shareNode.setSchedulerNode(schedulerNode); shareNode.setShareTimes(shareTimes); res.put(shareNode, shareTimes); diff --git a/dss-server/bin/start-dss-server.sh b/dss-server/bin/start-dss-server.sh index e53e44a1678971c460843ff5c1ec5a0534350e5a..518cd8da17321678d614a766d5d50bb30db3efe0 100644 --- a/dss-server/bin/start-dss-server.sh +++ b/dss-server/bin/start-dss-server.sh @@ -1,33 +1,39 @@ #!/bin/bash - cd `dirname $0` cd .. HOME=`pwd` -export DWS_ENGINE_MANAGER_HOME=$HOME -export DWS_ENGINE_MANAGER_PID=$HOME/bin/linkis.pid +export SERVER_PID=$HOME/bin/linkis.pid +export SERVER_LOG_PATH=$HOME/logs +export SERVER_CLASS=com.webank.wedatasphere.dss.DSSSpringApplication + +if test -z "$SERVER_HEAP_SIZE" +then + export SERVER_HEAP_SIZE="512M" +fi + +if test -z "$SERVER_JAVA_OPTS" +then + export SERVER_JAVA_OPTS=" -Xmx$SERVER_HEAP_SIZE -XX:+UseG1GC -Xloggc:$HOME/logs/linkis-gc.log" +fi -if [[ -f "${DWS_ENGINE_MANAGER_PID}" ]]; then - pid=$(cat ${DWS_ENGINE_MANAGER_PID}) +if [[ -f "${SERVER_PID}" ]]; then + pid=$(cat ${SERVER_PID}) if kill -0 ${pid} >/dev/null 2>&1; then - echo "DSS SERVER is already running." 
- return 0; + echo "Server is already running." + exit 1 fi fi -export DWS_ENGINE_MANAGER_LOG_PATH=$HOME/logs -export DWS_ENGINE_MANAGER_HEAP_SIZE="1G" -export DWS_ENGINE_MANAGER_JAVA_OPTS="-Xms$DWS_ENGINE_MANAGER_HEAP_SIZE -Xmx$DWS_ENGINE_MANAGER_HEAP_SIZE -XX:+UseG1GC -XX:MaxPermSize=500m -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=11729" - -nohup java $DWS_ENGINE_MANAGER_JAVA_OPTS -cp $HOME/conf:$HOME/lib/* com.webank.wedatasphere.dss.DSSSpringApplication 2>&1 > $DWS_ENGINE_MANAGER_LOG_PATH/linkis.out & +nohup java $SERVER_JAVA_OPTS -cp $HOME/conf:$HOME/lib/* $SERVER_CLASS > $SERVER_LOG_PATH/linkis.out 2>&1 & pid=$! if [[ -z "${pid}" ]]; then - echo "DSS SERVER start failed!" - sleep 1 + echo "DSS server start failed!" exit 1 else - echo "DSS SERVER start succeeded!" - echo $pid > $DWS_ENGINE_MANAGER_PID + echo "DSS server start succeeded!" + echo $pid > $SERVER_PID sleep 1 fi -exit 1 + + diff --git a/dss-server/pom.xml b/dss-server/pom.xml index 1da0c251efcafdfcb30e139ee04dbb455293c335..76d1446ddbd7c7a1c946406e429fb223098feb8d 100644 --- a/dss-server/pom.xml +++ b/dss-server/pom.xml @@ -22,7 +22,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 @@ -60,21 +60,21 @@ ${dss.version} - - com.webank.wedatasphere.dss - dss-azkaban-scheduler-appjoint - - - spring-cloud-starter-openfeign - org.springframework.cloud - - - spring-cloud-starter-netflix-eureka-client - org.springframework.cloud - - - ${dss.version} - + + + + + + + + + + + + + + + org.springframework.cloud @@ -93,6 +93,10 @@ gson com.google.code.gson + + jsr311-api + javax.ws.rs + @@ -117,7 +121,7 @@ com.webank.wedatasphere.linkis - 0.9.1 + ${linkis.version} @@ -142,6 +146,37 @@ 4.12 test + + com.webank.wedatasphere.dss + dss-scheduler-appjoint-core + 0.9.1 + + + com.webank.wedatasphere.dss + dss-user-manager + 0.9.1 + compile + + + com.github.rholder + guava-retrying + 2.0.0 + + + dom4j + dom4j + 1.6.1 + + + com.typesafe + config + 1.4.1 + + + xml-apis + xml-apis
+ 1.4.01 + diff --git a/dss-server/src/main/assembly/distribution.xml b/dss-server/src/main/assembly/distribution.xml index ffa6656d8caedf6528fc7ae686d4e1a4d0c316af..0da5c182ca24d94f89be5c8562d89f11ebd0344b 100644 --- a/dss-server/src/main/assembly/distribution.xml +++ b/dss-server/src/main/assembly/distribution.xml @@ -41,16 +41,16 @@ aopalliance:aopalliance:jar asm:asm:jar cglib:cglib:jar - com.amazonaws:aws-java-sdk-autoscaling:jar - com.amazonaws:aws-java-sdk-core:jar - com.amazonaws:aws-java-sdk-ec2:jar - com.amazonaws:aws-java-sdk-route53:jar - com.amazonaws:aws-java-sdk-sts:jar - com.amazonaws:jmespath-java:jar + + + + + + com.fasterxml.jackson.core:jackson-annotations:jar com.fasterxml.jackson.core:jackson-core:jar com.fasterxml.jackson.core:jackson-databind:jar - com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:jar + com.fasterxml.jackson.datatype:jackson-datatype-jdk8:jar com.fasterxml.jackson.datatype:jackson-datatype-jsr310:jar com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:jar @@ -82,7 +82,6 @@ com.netflix.ribbon:ribbon-loadbalancer:jar com.netflix.ribbon:ribbon-transport:jar com.netflix.servo:servo-core:jar - com.ning:async-http-client:jar com.sun.jersey.contribs:jersey-apache-client4:jar com.sun.jersey:jersey-client:jar com.sun.jersey:jersey-core:jar @@ -137,8 +136,6 @@ joda-time:joda-time:jar log4j:log4j:jar mysql:mysql-connector-java:jar - net.databinder.dispatch:dispatch-core_2.11:jar - net.databinder.dispatch:dispatch-json4s-jackson_2.11:jar org.antlr:antlr-runtime:jar org.antlr:stringtemplate:jar org.apache.commons:commons-compress:jar @@ -176,7 +173,9 @@ org.eclipse.jetty:jetty-continuation:jar org.eclipse.jetty:jetty-http:jar org.eclipse.jetty:jetty-io:jar + org.eclipse.jetty:jetty-plus:jar org.eclipse.jetty:jetty-security:jar org.eclipse.jetty:jetty-server:jar @@ -225,7 +224,9 @@ org.json4s:json4s-ast_2.11:jar org.json4s:json4s-core_2.11:jar org.json4s:json4s-jackson_2.11:jar + org.jvnet.mimepull:mimepull:jar org.jvnet:tiger-types:jar 
org.latencyutils:LatencyUtils:jar @@ -280,7 +281,7 @@ org.springframework:spring-webmvc:jar org.tukaani:xz:jar org.yaml:snakeyaml:jar - software.amazon.ion:ion-java:jar + xerces:xercesImpl:jar xmlenc:xmlenc:jar xmlpull:xmlpull:jar @@ -299,6 +300,24 @@ conf unix + + ${project.parent.basedir}/dss-user-manager/src/main/resources/config + + * + + 0777 + conf/config + unix + + + ${project.parent.basedir}/dss-user-manager/src/main/resources/default + + * + + 0777 + conf/default + unix + ${basedir}/bin diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/DSSSpringApplication.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/DSSSpringApplication.java index e255b125f1ddc647eab58379fcbfd375f001ea39..8a7344f9b2bc5dd796baffb612bff413d6c433f4 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/DSSSpringApplication.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/DSSSpringApplication.java @@ -161,7 +161,7 @@ public class DSSSpringApplication extends SpringBootServletInitializer { } DWCException.setApplicationName(serviceInstance.getApplicationName()); DWCException.setHostname(Utils.getComputerName()); - DWCException.setHostPort(Integer.parseInt(applicationContext.getEnvironment().getProperty("server.port"))); + DWCException.setHostPort(Integer.parseInt(applicationContext.getEnvironment().getProperty("server.port","9004"))); } private static void setServiceInstance(ServiceInstance serviceInstance) throws Exception { diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/constant/DSSServerConstant.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/constant/DSSServerConstant.java index 3aebc85926349212223ce51a392fb5d753225985..2c5711e154d562a1efbc4b57f4753d5f99aa2b57 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/constant/DSSServerConstant.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/constant/DSSServerConstant.java @@ -19,12 +19,15 @@ package 
com.webank.wedatasphere.dss.server.constant; public class DSSServerConstant { - public static final String DWS_PROJECT_FIRST_VERSION = "v000001"; - public static final String DWS_PROJECT_FIRST_VERSION_COMMENT = "first version"; - public static final String DWS_PROJECT_SOURCE = "create by user"; + public static final String DSS_PROJECT_FIRST_VERSION = "v000001"; + public static final String DSS_PROJECT_FIRST_VERSION_COMMENT = "first version"; + public static final String DSS_PROJECT_SOURCE = "create by user"; + public static final String DSS_WORKSPACE_SOURCE = "create by user"; public static final String PROJECT_VERSION_ID = "projectVersionID"; - public static final String PUBLISH_FLOW_REPORT_FORMATE = "工作流名:%s,版本号:%s,工作流内容为空,请自行修改或者删除"; - public static final String EMVEDDEDFLOWID ="\"embeddedFlowId\":" ; + public static final String PUBLISH_FLOW_REPORT_FORMATE = "the workflow name is %s, the version number is %s, and the workflow content is empty; please modify or delete it"; + public static final String EMVEDDEDFLOWID = "\"embeddedFlowId\":"; public static final String VERSION_FORMAT = "%06d"; public static final String VERSION_PREFIX = "v"; + public static final String SUPER_USER_LOGIN_ERROR = "please log in as a super user"; + } diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/DSSUserMapper.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/DSSUserMapper.java similarity index 86% rename from dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/DSSUserMapper.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/DSSUserMapper.java index fe81ae46a4f932181fe1d73480e3a92dbf9461f2..31d4974e126c6dca01576fa76412506a0bc6b134 100644 --- a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/dao/DSSUserMapper.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/DSSUserMapper.java @@ -15,14 +15,15 @@ * */ -package
com.webank.wedatasphere.dss.application.dao; +package com.webank.wedatasphere.dss.server.dao; import com.webank.wedatasphere.dss.application.entity.DSSUser; -/** - * Created by chaogefeng on 2019/10/11. - */ public interface DSSUserMapper { + Long getUserID(String userName); + + String getuserName(Long userID); + DSSUser getUserByName(String username); void registerDSSUser(DSSUser userDb); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/FlowMapper.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/FlowMapper.java index 52fa6fb2be38db2860b02d6a9256ba49919893b2..ea830d7cf83a7b3119935cd648f1a472e605cca2 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/FlowMapper.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/FlowMapper.java @@ -18,8 +18,8 @@ package com.webank.wedatasphere.dss.server.dao; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlowVersion; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlowVersion; import org.apache.ibatis.annotations.Param; import org.springframework.dao.DuplicateKeyException; @@ -27,23 +27,23 @@ import java.util.List; public interface FlowMapper { - DWSFlow selectFlowByID(Long id); + DSSFlow selectFlowByID(Long id); - List listFlowByTaxonomyID(@Param("projectID") Long projectID, @Param("taxonomyID") Long taxonomyID, @Param("isRootFlow") Boolean isRootFlow); + List listFlowByTaxonomyID(@Param("projectID") Long projectID, @Param("taxonomyID") Long taxonomyID, @Param("isRootFlow") Boolean isRootFlow); - List listFlowVersionsByFlowID(@Param("flowID") Long flowID, @Param("projectVersionID") Long projectVersionID); + List listFlowVersionsByFlowID(@Param("flowID") Long flowID, @Param("projectVersionID") Long projectVersionID); - void insertFlow(DWSFlow dwsFlow) throws DuplicateKeyException; + void 
insertFlow(DSSFlow dssFlow) throws DuplicateKeyException; - void insertFlowVersion(DWSFlowVersion version); + void insertFlowVersion(DSSFlowVersion version); - void batchInsertFlowVersion(@Param("flowVersions") List flowVersions); + void batchInsertFlowVersion(@Param("flowVersions") List flowVersions); void insertFlowRelation(@Param("flowID") Long flowID, @Param("parentFlowID") Long parentFlowID); - DWSFlowVersion selectVersionByFlowID(@Param("flowID") Long flowID, @Param("version") String version, @Param("projectVersionID") Long projectVersionID); + DSSFlowVersion selectVersionByFlowID(@Param("flowID") Long flowID, @Param("version") String version, @Param("projectVersionID") Long projectVersionID); - void updateFlowBaseInfo(DWSFlow dwsFlow) throws DuplicateKeyException; + void updateFlowBaseInfo(DSSFlow dssFlow) throws DuplicateKeyException; List selectSubFlowIDByParentFlowID(Long parentFlowID); @@ -55,17 +55,17 @@ public interface FlowMapper { Long selectParentFlowIDByFlowID(Long flowID); - List listFlowByProjectID(Long projectID); + List listFlowByProjectID(Long projectID); - List listVersionByFlowIDAndProjectVersionID(@Param("flowID") Long flowID, @Param("projectVersionID") Long projectVersionID); + List listVersionByFlowIDAndProjectVersionID(@Param("flowID") Long flowID, @Param("projectVersionID") Long projectVersionID); Boolean noVersions(Long flowID); - List listLastFlowVersionsByProjectVersionID(@Param("projectVersionID") Long projectVersionId); + List listLastFlowVersionsByProjectVersionID(@Param("projectVersionID") Long projectVersionId); - List listLatestRootFlowVersionByProjectVersionID(Long projectVersionID); + List listLatestRootFlowVersionByProjectVersionID(Long projectVersionID); - void batchUpdateFlowVersion(List flowVersions); + void batchUpdateFlowVersion(List flowVersions); Long getParentFlowID(Long flowID); } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/FlowTaxonomyMapper.java 
b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/FlowTaxonomyMapper.java index f5c3dc1ec1fd4b74a8189857791948a2a0f41190..640e8b7d0386fe6cac8098ccd00b34db30f02258 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/FlowTaxonomyMapper.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/FlowTaxonomyMapper.java @@ -18,7 +18,7 @@ package com.webank.wedatasphere.dss.server.dao; -import com.webank.wedatasphere.dss.server.entity.DWSFlowTaxonomy; +import com.webank.wedatasphere.dss.server.entity.DSSFlowTaxonomy; import org.apache.ibatis.annotations.Param; import org.springframework.dao.DuplicateKeyException; @@ -26,11 +26,11 @@ import java.util.List; public interface FlowTaxonomyMapper { - DWSFlowTaxonomy selectFlowTaxonomyByID(Long id); + DSSFlowTaxonomy selectFlowTaxonomyByID(Long id); - void insertFlowTaxonomy(DWSFlowTaxonomy dwsFlowTaxonomy) throws DuplicateKeyException; + void insertFlowTaxonomy(DSSFlowTaxonomy dssFlowTaxonomy) throws DuplicateKeyException; - void updateFlowTaxonomy(DWSFlowTaxonomy dwsFlowTaxonomy) throws DuplicateKeyException; + void updateFlowTaxonomy(DSSFlowTaxonomy dssFlowTaxonomy) throws DuplicateKeyException; Long hasFlows(Long flowTaxonomyID); @@ -47,5 +47,5 @@ public interface FlowTaxonomyMapper { void deleteFlowTaxonomyByProjectID(Long projectID); - List listFlowTaxonomyByProjectID(Long projectID); + List listFlowTaxonomyByProjectID(Long projectID); } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/ProjectMapper.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/ProjectMapper.java index b5757d1f8b8d1e54d33c6b98e6b2ccfb02d6deed..c2ae56786da573d7a3cae71fea4010fc95e30e1a 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/ProjectMapper.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/ProjectMapper.java @@ -18,9 +18,9 @@ package com.webank.wedatasphere.dss.server.dao; -import 
com.webank.wedatasphere.dss.common.entity.project.DWSProject; -import com.webank.wedatasphere.dss.common.entity.project.DWSProjectPublishHistory; -import com.webank.wedatasphere.dss.common.entity.project.DWSProjectVersion; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProjectPublishHistory; +import com.webank.wedatasphere.dss.common.entity.project.DSSProjectVersion; import org.apache.ibatis.annotations.Param; import java.util.List; @@ -28,19 +28,19 @@ import java.util.Map; public interface ProjectMapper { - DWSProject selectProjectByID(Long id); + DSSProject selectProjectByID(Long id); - DWSProjectVersion selectLatestVersionByProjectID(Long projectID); + DSSProjectVersion selectLatestVersionByProjectID(Long projectID); - DWSProject selectProjectByVersionID(Long projectVersionID); + DSSProject selectProjectByVersionID(Long projectVersionID); - void addProject(DWSProject dwsProject); + void addProject(DSSProject dssProject); - void addProjectVersion(DWSProjectVersion dwsProjectVersion); + void addProjectVersion(DSSProjectVersion dssProjectVersion); void updateDescription(@Param("projectID") Long projectID, @Param("description") String description, @Param("product")String product ,@Param("applicationArea")Integer applicationArea ,@Param("business")String business); - List listProjectVersionsByProjectID(Long projectID); + List listProjectVersionsByProjectID(Long projectID); Boolean noPublished(Long projectID); @@ -48,15 +48,15 @@ public interface ProjectMapper { void deleteProjectBaseInfo(long projectID); - DWSProjectVersion selectProjectVersionByID(Long id); + DSSProjectVersion selectProjectVersionByID(Long id); - DWSProjectVersion selectProjectVersionByProjectIDAndVersionID(@Param("projectID") Long projectId, @Param("version") String version); + DSSProjectVersion selectProjectVersionByProjectIDAndVersionID(@Param("projectID") Long projectId, @Param("version") String version); Integer 
updateLock(@Param("lock") Integer lock, @Param("projectVersionID") Long projectVersionID); - DWSProjectPublishHistory selectProjectPublishHistoryByProjectVersionID(Long projectVersionID); + DSSProjectPublishHistory selectProjectPublishHistoryByProjectVersionID(Long projectVersionID); - void insertPublishHistory(DWSProjectPublishHistory dwsProjectPublishHistory); + void insertPublishHistory(DSSProjectPublishHistory dssProjectPublishHistory); void updatePublishHistoryState(@Param("projectVersionID") Long projectVersionID, @Param("status") Integer status); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/ProjectTaxonomyMapper.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/ProjectTaxonomyMapper.java index 3d87bde6b7c9051cd8b6e5bb264647af789c74a4..21fc44e41a1e8c61899c0d210938b128a4ee103c 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/ProjectTaxonomyMapper.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/ProjectTaxonomyMapper.java @@ -17,8 +17,8 @@ package com.webank.wedatasphere.dss.server.dao; -import com.webank.wedatasphere.dss.server.entity.DWSProjectTaxonomy; -import com.webank.wedatasphere.dss.server.entity.DWSProjectTaxonomyRelation; +import com.webank.wedatasphere.dss.server.entity.DSSProjectTaxonomy; +import com.webank.wedatasphere.dss.server.entity.DSSProjectTaxonomyRelation; import org.apache.ibatis.annotations.Param; import org.springframework.dao.DuplicateKeyException; @@ -26,16 +26,16 @@ import java.util.List; public interface ProjectTaxonomyMapper { - DWSProjectTaxonomy selectProjectTaxonomyByID(Long id); - DWSProjectTaxonomyRelation selectProjectTaxonomyRelationByTaxonomyIdOrProjectId(Long taxonomyIdOrProjectId); - List listProjectTaxonomyByUser(String userName); + DSSProjectTaxonomy selectProjectTaxonomyByID(Long id); + DSSProjectTaxonomyRelation selectProjectTaxonomyRelationByTaxonomyIdOrProjectId(Long taxonomyIdOrProjectId); + List 
listProjectTaxonomyByUser(String userName); //-------------------- List listProjectIDByTaxonomyID(@Param("taxonomyID") Long taxonomyID, @Param("userName") String userName); - void insertProjectTaxonomy(DWSProjectTaxonomy dwsProjectTaxonomy) throws DuplicateKeyException; + void insertProjectTaxonomy(DSSProjectTaxonomy dssProjectTaxonomy) throws DuplicateKeyException; - void updateProjectTaxonomy(DWSProjectTaxonomy dwsProjectTaxonomy) throws DuplicateKeyException; + void updateProjectTaxonomy(DSSProjectTaxonomy dssProjectTaxonomy) throws DuplicateKeyException; Long hasProjects(Long projectTaxonomyID); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/WorkspaceMapper.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/WorkspaceMapper.java new file mode 100644 index 0000000000000000000000000000000000000000..c3fbaca2224984ebf5fb388c3bdd26fe76654592 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/WorkspaceMapper.java @@ -0,0 +1,68 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.server.dao; + +import com.webank.wedatasphere.dss.server.dto.response.*; +import com.webank.wedatasphere.dss.server.entity.*; +import com.webank.wedatasphere.dss.server.dto.response.HomepageDemoInstanceVo; +import com.webank.wedatasphere.dss.server.dto.response.HomepageDemoMenuVo; +import com.webank.wedatasphere.dss.server.dto.response.HomepageVideoVo; +import com.webank.wedatasphere.dss.server.dto.response.WorkspaceFavoriteVo; +import org.apache.ibatis.annotations.Param; + +import java.util.List; + +/** + * Created by schumiyi on 2020/6/22 + */ +public interface WorkspaceMapper { + + List getWorkspaces(); + + List findByWorkspaceName(String name); + + void addWorkSpace(DSSWorkspace dssWorkspace); + + List getHomepageDemoMenusEn(); + List getHomepageDemoMenusCn(); + + List getHomepageInstancesByMenuIdCn(Long id); + List getHomepageInstancesByMenuIdEn(Long id); + + List getHomepageVideosEn(); + List getHomepageVideosCn(); + + DSSWorkspace getWorkspaceById(Long workspaceId); + + List getManagementMenuCn(); + List getManagementMenuEn(); + + List getApplicationMenuCn(); + List getApplicationMenuEn(); + + List getMenuAppInstancesCn(Long id); + List getMenuAppInstancesEn(Long id); + + List getWorkspaceFavoritesCn(@Param("username") String username, @Param("workspaceId") Long workspaceId); + + List getWorkspaceFavoritesEn(@Param("username") String username, @Param("workspaceId") Long workspaceId); + + void addFavorite(DSSFavorite dssFavorite); + + void deleteFavorite(Long favouritesId); +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/dwsUserMapper.xml b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/dwsUserMapper.xml index 65fbd5eb2ce4e0d6b9938fc6e039359655ab632c..ac65dce4fcbe03f8d9161cabfdfe3a2b58ed6993 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/dwsUserMapper.xml +++ 
b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/dwsUserMapper.xml @@ -19,7 +19,7 @@ - + diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/flowMapper.xml b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/flowMapper.xml index 38d66f96cf5cd8d5aab3488671974ffb53be0c58..37dae091a48b524a5947f65a38a3b885f5685eb2 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/flowMapper.xml +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/flowMapper.xml @@ -29,11 +29,11 @@ id,`flow_id`,`source`,`version`,`json_path`,`comment`,`update_time`,`updator_id`,`project_version_id` - SELECT * FROM `dss_flow` WHERE id = #{id} - SELECT f.* FROM @@ -44,7 +44,7 @@ AND f.project_id=#{projectID} - - - SELECT fv.*, bdu.`name` AS 'updator', ISNULL(fph.id) AS 'is_not_published' @@ -86,19 +86,19 @@ - + INSERT INTO dss_flow () VALUES (#{id},#{name},#{state},#{source},#{description},#{createTime},#{creatorID},#{isRootFlow},#{rank},#{projectID},#{hasSaved},#{uses}) - + INSERT INTO dss_flow_version () VALUES (#{id},#{flowID},#{source},#{version},#{jsonPath},#{comment},#{updateTime},#{updatorID},#{projectVersionID}) - + INSERT INTO dss_flow_version ( `flow_id`,`source`,`version`,`json_path`,`comment`,`update_time`,`updator_id`,`project_version_id`) VALUES @@ -107,7 +107,7 @@ - + UPDATE dss_flow_version source= #{fv.source} @@ -127,7 +127,7 @@ (#{flowID},#{parentFlowID}) - SELECT fv.*, bdu.`name` AS 'updator', ISNULL(fph.id) AS 'is_not_published' @@ -141,7 +141,7 @@ AND fv.project_version_id = #{projectVersionID} - + UPDATE dss_flow name=#{name}, @@ -186,7 +186,7 @@ flow_id = #{flowID} - SELECT * FROM @@ -195,7 +195,7 @@ project_id = #{projectID} - SELECT fv.*, bdu.`name` AS 'updator', ISNULL(fph.id) AS 'is_not_published' @@ -218,7 +218,7 @@ LIMIT 1 - select from dss_flow_version as fv where version = @@ -228,7 +228,7 @@ and fv.flow_id=flow_id ) ; - SELECT fv.* FROM diff --git 
a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/flowTaxonomyMapper.xml b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/flowTaxonomyMapper.xml index 60b04c3e77ed9bdd0b6cb9966d361514cfde6a27..c681f768ca87092091f175dfd8cf8290dfdac28d 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/flowTaxonomyMapper.xml +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/flowTaxonomyMapper.xml @@ -25,17 +25,17 @@ id, `name`,`description`,`creator_id`,`create_time`,`update_time`,`project_id` - SELECT * FROM `dss_flow_taxonomy` WHERE id = #{id} - + INSERT INTO dss_flow_taxonomy () VALUES (#{id},#{name},#{description},#{creatorID},#{createTime},#{updateTime},#{projectID}) - + UPDATE dss_flow_taxonomy name=#{name}, @@ -84,7 +84,7 @@ - SELECT * FROM diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/projectMapper.xml b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/projectMapper.xml index 1ff0893b93e3b82b73ec2182b95526b6ce097b56..a1bc4ab8cce5a734797f90a4eb7e5b270789e31a 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/projectMapper.xml +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/projectMapper.xml @@ -22,7 +22,7 @@ - id,`name`,`source`,`description`,`org_id`,`visibility`,`is_transfer`,`initial_org_id`,`user_id`,`create_time`,`create_by`,`product`,`application_area`,`business` + id,`name`,`source`,`description`,`org_id`,`visibility`,`is_transfer`,`initial_org_id`,`user_id`,`create_time`,`create_by`,`product`,`application_area`,`business`,`workspace_id` @@ -33,7 +33,7 @@ id,`project_version_id`,`create_time`,`creator_id`,`update_time`,`comment`,`state`,`version_path`,`expire_time` - SELECT p.*, MIN(ISNULL(pph.id)) AS 'is_not_publish' FROM @@ -44,11 +44,11 @@ p.id = #{id} - SELECT * FROM dss_project_version WHERE project_id = #{projectID} ORDER BY id DESC LIMIT 1 - SELECT wp.* 
FROM @@ -58,13 +58,13 @@ wpv.id = #{projrctVersionID} - + INSERT INTO dss_project () VALUES - (#{id},#{name},#{source},#{description},#{orgID},#{visibility},#{isTransfer},#{initialOrgID},#{userID},#{createTime},#{createBy},#{product},#{applicationArea},#{business}) + (#{id},#{name},#{source},#{description},#{orgID},#{visibility},#{isTransfer},#{initialOrgID},#{userID},#{createTime},#{createBy},#{product},#{applicationArea},#{business},#{workspaceId}) - + INSERT INTO dss_project_version () VALUES (#{id},#{projectID},#{version},#{comment},#{updateTime},#{updatorID},#{lock}) @@ -76,11 +76,11 @@ WHERE id = #{projectID} - - SELECT pv.*, u.`name` AS 'updator', ( @@ -126,11 +126,11 @@ id = #{projectID} - SELECT * from dss_project_version WHERE id = #{id} - SELECT * from dss_project_version WHERE project_id = #{projectID} and version = #{version} @@ -138,7 +138,7 @@ UPDATE `dss_project_version` set `lock` = `lock` +1 WHERE id= #{projectVersionID} AND `lock` = #{lock} - SELECT pph.*, u.`name` as 'creator' FROM @@ -148,7 +148,7 @@ pph.project_version_id = #{projectVersionID} - + INSERT INTO dss_project_publish_history () VALUES (#{id},#{projectVersionID},#{createTime},#{createID},#{updateTime},#{comment},#{state},#{versionPath},#{expireTime}) diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/projectTaxonomyMapper.xml b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/projectTaxonomyMapper.xml index 418736c883ebdb9907fc9801e3f948a8efb9fdb1..9c0333953c311b2f3186acef3933cd08ca8f8d37 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/projectTaxonomyMapper.xml +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/projectTaxonomyMapper.xml @@ -23,16 +23,16 @@ id, `name`,`description`,`creator_id`,`create_time`,`update_time` - SELECT * FROM `dss_project_taxonomy` WHERE id = #{id} - SELECT taxonomy_id as taxonomyId , project_id as projectId , creator_id as creatorId FROM 
`dss_project_taxonomy_relation` WHERE project_id = #{taxonomyIdOrProjectId} - SELECT pt.* FROM @@ -54,13 +54,13 @@ AND bdu.`name` = #{userName} - + INSERT INTO dss_project_taxonomy () VALUES (#{id},#{name},#{description},#{creatorID},#{createTime},#{updateTime}) - + UPDATE dss_project_taxonomy name=#{name}, diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/workspaceMapper.xml b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/workspaceMapper.xml new file mode 100644 index 0000000000000000000000000000000000000000..e1fc05ed9df6d8d7d8ab8e396f30d141785c19a6 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dao/impl/workspaceMapper.xml @@ -0,0 +1,200 @@ + + + + + + + + + id,`name`,`label`,`description`,`department`,`product`,`source`,`create_by`,`create_time`,`last_update_user`,`last_update_time` + + + + m.`id`,m.`title_cn` AS `title`, m.`desc_cn` AS `description`,m.`labels_cn` AS `labels`, + m.`access_button_cn` AS `access_button`,m.`manual_button_cn` AS `manualButton`,m.`is_active`, + m.`manual_button_url`,m.`icon`,m.`order`,app.`homepage_url` AS `access_button_url`,app.project_url, app.`name` + + + m.`id`,m.`title_en` AS `title`, m.`desc_en` AS `description`,m.`labels_en` AS `labels`, + m.`access_button_en` AS `access_button`,m.`manual_button_en` AS `manualButton`,m.`is_active`, + m.`manual_button_url`,m.`icon`,m.`order`,app.`homepage_url` AS `access_button_url`,app.project_url, app.`name` + + + + id,`username`,`workspace_id`,`menu_application_id`,`order`,`create_by`,`create_time`,`last_update_user`,`last_update_time` + + + + + + + + + + INSERT INTO dss_workspace () + VALUES + (#{id},#{name},#{label},#{description},#{department},#{product},#{source},#{createBy},#{createTime},#{lastUpdateUser},#{lastUpdateTime}) + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + INSERT INTO dss_onestop_user_favorites () + VALUES + 
(#{id},#{username},#{workspaceId},#{menuApplicationId},#{order},#{createBy},#{createTime},#{lastUpdateUser},#{lastUpdateTime}) + + + + DELETE + FROM + dss_onestop_user_favorites + WHERE + id = #{favouritesId} + + \ No newline at end of file diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/HomepageDemoInstanceVo.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/HomepageDemoInstanceVo.java new file mode 100644 index 0000000000000000000000000000000000000000..b880f0f2adb82a1b43d91c13f4c55e43b3e575a0 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/HomepageDemoInstanceVo.java @@ -0,0 +1,96 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ +package com.webank.wedatasphere.dss.server.dto.response; + +/** + * Created by schumiyi on 2020/6/23 + */ +public class HomepageDemoInstanceVo { + + private Long id; + private Long menuId; + private String name; + private String url; + private String title; + private String description; + private String icon; + private Integer order; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public Long getMenuId() { + return menuId; + } + + public void setMenuId(Long menuId) { + this.menuId = menuId; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + public String getUrl() { + return url; + } + + public void setUrl(String url) { + this.url = url; + } + + public String getTitle() { + return title; + } + + public void setTitle(String title) { + this.title = title; + } + + public String getDescription() { + return description; + } + + public void setDescription(String description) { + this.description = description; + } + + public String getIcon() { + return icon; + } + + public void setIcon(String icon) { + this.icon = icon; + } + + public Integer getOrder() { + return order; + } + + public void setOrder(Integer order) { + this.order = order; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/HomepageDemoMenuVo.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/HomepageDemoMenuVo.java new file mode 100644 index 0000000000000000000000000000000000000000..c7d4fdcbd3b7bbc0067096a9148720ea995d1eb0 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/HomepageDemoMenuVo.java @@ -0,0 +1,89 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ +package com.webank.wedatasphere.dss.server.dto.response; + +import java.util.List; + +/** + * Created by schumiyi on 2020/6/23 + */ +public class HomepageDemoMenuVo { + + private Long id; + private String name; + private String title; + private String description; + private String icon; + private Integer order; + private List<HomepageDemoInstanceVo> demoInstances; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + public String getTitle() { + return title; + } + + public void setTitle(String title) { + this.title = title; + } + + public String getDescription() { + return description; + } + + public void setDescription(String description) { + this.description = description; + } + + public String getIcon() { + return icon; + } + + public void setIcon(String icon) { + this.icon = icon; + } + + public Integer getOrder() { + return order; + } + + public void setOrder(Integer order) { + this.order = order; + } + + public List<HomepageDemoInstanceVo> getDemoInstances() { + return demoInstances; + } + + public void setDemoInstances(List<HomepageDemoInstanceVo> demoInstances) { + this.demoInstances = demoInstances; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/HomepageVideoVo.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/HomepageVideoVo.java new file mode 100644 index 0000000000000000000000000000000000000000..0f9ce857df810f98d57187836da696c6b1ed3cfc --- /dev/null +++ 
b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/HomepageVideoVo.java @@ -0,0 +1,78 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ +package com.webank.wedatasphere.dss.server.dto.response; + +/** + * Created by schumiyi on 2020/6/23 + */ +public class HomepageVideoVo { + + private Long id; + private String name; + private String url; + private String title; + private String description; + private Integer order; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + public String getUrl() { + return url; + } + + public void setUrl(String url) { + this.url = url; + } + + public String getTitle() { + return title; + } + + public void setTitle(String title) { + this.title = title; + } + + public String getDescription() { + return description; + } + + public void setDescription(String description) { + this.description = description; + } + + public Integer getOrder() { + return order; + } + + public void setOrder(Integer order) { + this.order = order; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/OnestopMenuAppInstanceVo.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/OnestopMenuAppInstanceVo.java new file mode 100644 index 
0000000000000000000000000000000000000000..f22ca6b33740f1c77cc8c984ac8bcb1657852dc2 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/OnestopMenuAppInstanceVo.java @@ -0,0 +1,140 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ +package com.webank.wedatasphere.dss.server.dto.response; + +/** + * Created by schumiyi on 2020/6/24 + */ +public class OnestopMenuAppInstanceVo { + private Long id; + private String title; + private String description; + private String labels; + private String accessButton; + private String accessButtonUrl; + private String manualButton; + private String manualButtonUrl; + private String projectUrl; + private String name; + private Boolean isActive; + private String icon; + private Integer order; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getTitle() { + return title; + } + + public void setTitle(String title) { + this.title = title; + } + + public String getDescription() { + return description; + } + + public void setDescription(String description) { + this.description = description; + } + + public String getLabels() { + return labels; + } + + public void setLabels(String labels) { + this.labels = labels; + } + + public String getAccessButton() { + return accessButton; + } + + public void setAccessButton(String accessButton) { + this.accessButton = accessButton; + } + + public 
String getAccessButtonUrl() { + return accessButtonUrl; + } + + public void setAccessButtonUrl(String accessButtonUrl) { + this.accessButtonUrl = accessButtonUrl; + } + + public String getManualButton() { + return manualButton; + } + + public void setManualButton(String manualButton) { + this.manualButton = manualButton; + } + + public String getManualButtonUrl() { + return manualButtonUrl; + } + + public void setManualButtonUrl(String manualButtonUrl) { + this.manualButtonUrl = manualButtonUrl; + } + + public String getIcon() { + return icon; + } + + public void setIcon(String icon) { + this.icon = icon; + } + + public Integer getOrder() { + return order; + } + + public void setOrder(Integer order) { + this.order = order; + } + + public String getProjectUrl() { + return projectUrl; + } + + public void setProjectUrl(String projectUrl) { + this.projectUrl = projectUrl; + } + + public Boolean getActive() { + return isActive; + } + + public void setActive(Boolean active) { + isActive = active; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/OnestopMenuVo.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/OnestopMenuVo.java new file mode 100644 index 0000000000000000000000000000000000000000..214b5d853936e650d1dc8e7cd9cea1a82b33e9cd --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/OnestopMenuVo.java @@ -0,0 +1,62 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.server.dto.response; + +import java.util.List; + +/** + * Created by schumiyi on 2020/6/24 + */ +public class OnestopMenuVo { + private Long id; + private String title; + private Integer order; + private List<OnestopMenuAppInstanceVo> appInstances; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getTitle() { + return title; + } + + public void setTitle(String title) { + this.title = title; + } + + public Integer getOrder() { + return order; + } + + public void setOrder(Integer order) { + this.order = order; + } + + public List<OnestopMenuAppInstanceVo> getAppInstances() { + return appInstances; + } + + public void setAppInstances(List<OnestopMenuAppInstanceVo> appInstances) { + this.appInstances = appInstances; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/WorkspaceDepartmentVo.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/WorkspaceDepartmentVo.java new file mode 100644 index 0000000000000000000000000000000000000000..4fc436a33bf72aaa8ae34f1322e5eb6c466351d6 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/WorkspaceDepartmentVo.java @@ -0,0 +1,44 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.server.dto.response; + +/** + * Created by schumiyi on 2020/6/23 + */ +public class WorkspaceDepartmentVo { + + private Long id; + + private String name; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/WorkspaceFavoriteVo.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/WorkspaceFavoriteVo.java new file mode 100644 index 0000000000000000000000000000000000000000..f9953d6c58619810bfcc6a53b427c28c96b9b9a0 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/dto/response/WorkspaceFavoriteVo.java @@ -0,0 +1,82 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.server.dto.response; + +/** + * Created by Adamyuanyuan on 2020/6/25 + */ +public class WorkspaceFavoriteVo { + private Long id; + + private Long menuApplicationId; + + private String name; + + private String url; + + private String icon; + + private String title; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + public Long getMenuApplicationId() { + return menuApplicationId; + } + + public void setMenuApplicationId(Long menuApplicationId) { + this.menuApplicationId = menuApplicationId; + } + + public String getUrl() { + return url; + } + + public void setUrl(String url) { + this.url = url; + } + + public String getIcon() { + return icon; + } + + public void setIcon(String icon) { + this.icon = icon; + } + + public String getTitle() { + return title; + } + + public void setTitle(String title) { + this.title = title; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/BaseEntity.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/BaseEntity.java new file mode 100644 index 0000000000000000000000000000000000000000..d01380d9e295ac50f20f201d32442749292eb8cf --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/BaseEntity.java @@ -0,0 +1,49 @@ +package com.webank.wedatasphere.dss.server.entity; + +import java.util.Date; + +/** + * Created by schumiyi on 2020/6/22 + */ +public class BaseEntity { + + private String createBy; + + private Date createTime = new Date(); + + private Date lastUpdateTime = new Date(); + + private String lastUpdateUser; + + public String getCreateBy() { + return createBy; + } + + public void setCreateBy(String createBy) { + this.createBy = createBy; + } + + public Date getCreateTime() { + return createTime; + } + + public void setCreateTime(Date 
createTime) { + this.createTime = createTime; + } + + public Date getLastUpdateTime() { + return lastUpdateTime; + } + + public void setLastUpdateTime(Date lastUpdateTime) { + this.lastUpdateTime = lastUpdateTime; + } + + public String getLastUpdateUser() { + return lastUpdateUser; + } + + public void setLastUpdateUser(String lastUpdateUser) { + this.lastUpdateUser = lastUpdateUser; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSFavorite.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSFavorite.java new file mode 100644 index 0000000000000000000000000000000000000000..9d9956442a07609ea8d8bd6cc1055b41d3deff19 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSFavorite.java @@ -0,0 +1,73 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ +package com.webank.wedatasphere.dss.server.entity; + +/** + * Created by Adamyuanyuan on 2020/6/25 + */ +public class DSSFavorite extends BaseEntity{ + + private Long id; + + private String username; + + private Long workspaceId; + + private Long menuApplicationId; + + private Integer order; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getUsername() { + return username; + } + + public void setUsername(String username) { + this.username = username; + } + + public Long getWorkspaceId() { + return workspaceId; + } + + public void setWorkspaceId(Long workspaceId) { + this.workspaceId = workspaceId; + } + + public Long getMenuApplicationId() { + return menuApplicationId; + } + + public void setMenuApplicationId(Long menuApplicationId) { + this.menuApplicationId = menuApplicationId; + } + + public Integer getOrder() { + return order; + } + + public void setOrder(Integer order) { + this.order = order; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DWSFlowTaxonomy.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSFlowTaxonomy.java similarity index 86% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DWSFlowTaxonomy.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSFlowTaxonomy.java index f5d20f659b28f4c59af63b31c7af74ecf0878254..314e21ea5dbc02a0d5bfa63416ad6d3be96aa7b6 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DWSFlowTaxonomy.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSFlowTaxonomy.java @@ -17,13 +17,13 @@ package com.webank.wedatasphere.dss.server.entity; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; import java.util.Date; import java.util.List; -public class DWSFlowTaxonomy { +public class DSSFlowTaxonomy { 
private Long id; private String name; private String description; @@ -31,14 +31,14 @@ public class DWSFlowTaxonomy { private Date createTime; private Date updateTime; private Long projectID; - private List<DWSFlow> dwsFlowList; + private List<DSSFlow> dssFlowList; - public List<DWSFlow> getDwsFlowList() { - return dwsFlowList; + public List<DSSFlow> getDssFlowList() { + return dssFlowList; } - public void setDwsFlowList(List<DWSFlow> dwsFlowList) { - this.dwsFlowList = dwsFlowList; + public void setDssFlowList(List<DSSFlow> dssFlowList) { + this.dssFlowList = dssFlowList; } public Long getId() { diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DWSProjectTaxonomy.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSProjectTaxonomy.java similarity index 84% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DWSProjectTaxonomy.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSProjectTaxonomy.java index eccf702a0423bce493a6ac6bb26a0e1117f7f75d..302b8067520dfb7983aff813352ede127b0e14c8 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DWSProjectTaxonomy.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSProjectTaxonomy.java @@ -17,27 +17,27 @@ package com.webank.wedatasphere.dss.server.entity; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; import java.util.Date; import java.util.List; -public class DWSProjectTaxonomy { +public class DSSProjectTaxonomy { private Long id; private String name; private String description; private Long creatorID; private Date createTime; private Date updateTime; - private List<DWSProject> dwsProjectList; + private List<DSSProject> dssProjectList; - public List<DWSProject> getDwsProjectList() { - return dwsProjectList; + public List<DSSProject> getDssProjectList() { + return dssProjectList; } - public void setDwsProjectList(List<DWSProject> dwsProjectList) { - this.dwsProjectList = 
dwsProjectList; + public void setDssProjectList(List<DSSProject> dssProjectList) { + this.dssProjectList = dssProjectList; } public Long getId() { diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DWSProjectTaxonomyRelation.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSProjectTaxonomyRelation.java similarity index 96% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DWSProjectTaxonomyRelation.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSProjectTaxonomyRelation.java index 80f38328aef5831bebef4b4832373c6bafb1117c..accdf3e4b184ba0cb6f2c41867e435afda37eb39 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DWSProjectTaxonomyRelation.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSProjectTaxonomyRelation.java @@ -17,7 +17,7 @@ package com.webank.wedatasphere.dss.server.entity; -public class DWSProjectTaxonomyRelation { +public class DSSProjectTaxonomyRelation { private Long taxonomyId; private Long projectId; private Long creatorId; diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSWorkspace.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSWorkspace.java new file mode 100644 index 0000000000000000000000000000000000000000..305bffeb4c993c3acfae54c4b3bc42127a6f7420 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/entity/DSSWorkspace.java @@ -0,0 +1,93 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ +package com.webank.wedatasphere.dss.server.entity; + +/** + * Created by schumiyi on 2020/6/22 + */ +public class DSSWorkspace extends BaseEntity { + + private Long id; + + private String name; + + private String label; + + private String description; + + private String department; + + private String product; + + private String source; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + public String getLabel() { + return label; + } + + public void setLabel(String label) { + this.label = label; + } + + public String getDescription() { + return description; + } + + public void setDescription(String description) { + this.description = description; + } + + public String getDepartment() { + return department; + } + + public void setDepartment(String department) { + this.department = department; + } + + public String getProduct() { + return product; + } + + public void setProduct(String product) { + this.product = product; + } + + public String getSource() { + return source; + } + + public void setSource(String source) { + this.source = source; + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/function/FunctionInvoker.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/function/FunctionInvoker.java index be5232f51ddbd9db9a0806fdb0dffbda46753ac9..d4ef8bc4bddfa9be2c5c3ddd9d11d59c22834f6a 100644 --- 
a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/function/FunctionInvoker.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/function/FunctionInvoker.java @@ -25,7 +25,7 @@ import com.webank.wedatasphere.dss.appjoint.service.SecurityService; import com.webank.wedatasphere.dss.appjoint.service.session.Session; import com.webank.wedatasphere.dss.application.entity.Application; import com.webank.wedatasphere.dss.application.service.ApplicationService; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; import com.webank.wedatasphere.dss.common.entity.project.Project; import com.webank.wedatasphere.dss.server.dao.ProjectMapper; import org.apache.commons.math3.util.Pair; @@ -66,7 +66,7 @@ public class FunctionInvoker { return jobContent; } - public List> projectServiceAddFunction(DWSProject project, ProjectServiceAddFunction projectServiceAddFunction, List<AppJoint> appJoints) throws AppJointErrorException { + public List> projectServiceAddFunction(DSSProject project, ProjectServiceAddFunction projectServiceAddFunction, List<AppJoint> appJoints) throws AppJointErrorException { ArrayList> projects = new ArrayList<>(); for (AppJoint appJoint : appJoints) { Project appJointProject = null; @@ -84,7 +84,7 @@ return projects; } - public void projectServiceFunction(DWSProject project, ProjectServiceFunction projectServiceFunction, List<AppJoint> appJoints) throws AppJointErrorException { + public void projectServiceFunction(DSSProject project, ProjectServiceFunction projectServiceFunction, List<AppJoint> appJoints) throws AppJointErrorException { for (AppJoint appJoint : appJoints) { Session session = null; if(appJoint.getSecurityService() != null){ diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/function/FunctionPool.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/function/FunctionPool.java index 
3ace828c7b1ca2404e186418e1f27daa1c33b88f..7efe477b2ccb62224e48f0ff88615396a1643726 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/function/FunctionPool.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/function/FunctionPool.java @@ -26,6 +26,7 @@ import java.util.Map; public class FunctionPool { public static NodeServiceFunction deleteNode = (NodeService nodeService, Session session, AppJointNode node, Map requestBody)->{ + node.setJobContent(requestBody); nodeService.deleteNode(session,node); return null; }; diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/lock/LockAspect.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/lock/LockAspect.java index b41287dddc05ff40b0d3c28897c5480e0a36c55b..473f224c517e1c830019696c90715a84f2a8d449 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/lock/LockAspect.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/lock/LockAspect.java @@ -19,7 +19,7 @@ package com.webank.wedatasphere.dss.server.lock; import com.webank.wedatasphere.dss.server.constant.DSSServerConstant; import com.webank.wedatasphere.dss.server.dao.ProjectMapper; -import com.webank.wedatasphere.dss.common.entity.project.DWSProjectVersion; +import com.webank.wedatasphere.dss.common.entity.project.DSSProjectVersion; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import org.apache.commons.lang.ArrayUtils; import org.aspectj.lang.ProceedingJoinPoint; @@ -70,11 +70,11 @@ public class LockAspect implements Ordered { return point.proceed(); } logger.info("projectVersionID为:" + projectVersionID); - DWSProjectVersion dwsProjectVersion = projectMapper.selectProjectVersionByID(projectVersionID); - Integer lock = dwsProjectVersion.getLock(); + DSSProjectVersion dssProjectVersion = projectMapper.selectProjectVersionByID(projectVersionID); + Integer lock = dssProjectVersion.getLock(); try { Object proceed = point.proceed(); - 
judge(lockAnnotation, dwsProjectVersion, lock, projectVersionID); + judge(lockAnnotation, dssProjectVersion, lock, projectVersionID); return proceed; } catch (Exception e) { logger.info("执行过程出现异常", e); @@ -82,16 +82,16 @@ public class LockAspect implements Ordered { } } - private void judge(Lock lockAnnotation, DWSProjectVersion dwsProjectVersion, Integer lock, Long projectVersionID) throws DSSErrorException { + private void judge(Lock lockAnnotation, DSSProjectVersion dssProjectVersion, Integer lock, Long projectVersionID) throws DSSErrorException { if (lockAnnotation.type().equals(LockEnum.ADD)) { logger.info("projectVersion会增加"); - List dwsProjectVersions = projectMapper.listProjectVersionsByProjectID(dwsProjectVersion.getProjectID()); - if (dwsProjectVersions.size() < 2 || !dwsProjectVersions.get(1).getId().equals(projectVersionID)) { + List dssProjectVersions = projectMapper.listProjectVersionsByProjectID(dssProjectVersion.getProjectID()); + if (dssProjectVersions.size() < 2 || !dssProjectVersions.get(1).getId().equals(projectVersionID)) { throw new DSSErrorException(67457, "已经有别的用户对此project进行了版本更新操作,不能进行此操作!"); } } else { logger.info("projectVersion不会增加"); - DWSProjectVersion latest = projectMapper.selectLatestVersionByProjectID(dwsProjectVersion.getProjectID()); + DSSProjectVersion latest = projectMapper.selectLatestVersionByProjectID(dssProjectVersion.getProjectID()); if (!latest.getId().equals(projectVersionID)) { throw new DSSErrorException(67455, "目前project版本已经不是最新,不能进行此操作!"); } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/AppJointNodeOperate.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/AppJointNodeOperate.java index 5c7dd7ab4b05cb50f2e98c3f0247e0e6786bad0d..276e99d3e29c2452057fe964f22e201aaaaab19f 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/AppJointNodeOperate.java +++ 
b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/AppJointNodeOperate.java @@ -20,11 +20,10 @@ package com.webank.wedatasphere.dss.server.operate; import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; import com.webank.wedatasphere.dss.appjoint.execution.core.CommonAppJointNode; -import com.webank.wedatasphere.dss.appjoint.service.NodeService; import com.webank.wedatasphere.dss.server.function.FunctionInvoker; import com.webank.wedatasphere.dss.server.function.FunctionPool; import com.webank.wedatasphere.dss.server.function.NodeServiceFunction; -import com.webank.wedatasphere.dss.server.service.DWSFlowService; +import com.webank.wedatasphere.dss.server.service.DSSFlowService; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Component; @@ -41,17 +40,17 @@ public class AppJointNodeOperate implements Operate { } @Override - public void add(DWSFlowService dwsFlowService, Op op) throws AppJointErrorException { + public void add(DSSFlowService dssFlowService, Op op) throws AppJointErrorException { invokeNodeServiceFunction(op,FunctionPool.createNode); } @Override - public void update(DWSFlowService dwsFlowService,Op op) throws AppJointErrorException { + public void update(DSSFlowService dssFlowService, Op op) throws AppJointErrorException { invokeNodeServiceFunction(op,FunctionPool.updateNode); } @Override - public void delete(DWSFlowService dwsFlowService,Op op) throws AppJointErrorException { + public void delete(DSSFlowService dssFlowService, Op op) throws AppJointErrorException { invokeNodeServiceFunction(op, FunctionPool.deleteNode); } @@ -64,7 +63,4 @@ public class AppJointNodeOperate implements Operate { node.setNodeType(op.getNodeType()); functionInvoker.nodeServiceFunction(userName,op.getParams(),node,function); } - - - } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/Operate.java 
b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/Operate.java index 75beee6d017824de675dc794468017da54f1d592..ec44935c8df7a55885d0d70e27e784f6ff07c47e 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/Operate.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/Operate.java @@ -20,7 +20,7 @@ package com.webank.wedatasphere.dss.server.operate; import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; -import com.webank.wedatasphere.dss.server.service.DWSFlowService; +import com.webank.wedatasphere.dss.server.service.DSSFlowService; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; @@ -28,9 +28,9 @@ public interface Operate { boolean canInvokeOperate(Op op); - void add(DWSFlowService dwsFlowService, Op op) throws DSSErrorException, AppJointErrorException; + void add(DSSFlowService dssFlowService, Op op) throws DSSErrorException, AppJointErrorException; - void update(DWSFlowService dwsFlowService, Op op) throws DSSErrorException, AppJointErrorException; + void update(DSSFlowService dssFlowService, Op op) throws DSSErrorException, AppJointErrorException; - void delete(DWSFlowService dwsFlowService, Op op) throws DSSErrorException, AppJointErrorException; + void delete(DSSFlowService dssFlowService, Op op) throws DSSErrorException, AppJointErrorException; } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/SubFlowOperate.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/SubFlowOperate.java index 96db7ba6233fd380f83fc3f7f7ccee667b7eb9a8..8f3d28661a89c737badbfdca7c8ecb2014fda3f9 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/SubFlowOperate.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/operate/SubFlowOperate.java @@ -18,8 +18,8 @@ package com.webank.wedatasphere.dss.server.operate; -import 
com.webank.wedatasphere.dss.server.service.DWSFlowService; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; +import com.webank.wedatasphere.dss.server.service.DSSFlowService; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import org.slf4j.Logger; import org.slf4j.LoggerFactory; @@ -41,33 +41,33 @@ public class SubFlowOperate implements Operate { } @Override - public void add(DWSFlowService dwsFlowService, Op op)throws DSSErrorException { - afterOperateSubFlow(dwsFlowService,op); + public void add(DSSFlowService dssFlowService, Op op)throws DSSErrorException { + afterOperateSubFlow(dssFlowService,op); } @Override - public void update(DWSFlowService dwsFlowService,Op op) throws DSSErrorException { + public void update(DSSFlowService dssFlowService, Op op) throws DSSErrorException { logger.info("name:{},description:{}",op.getParams().get("name"),op.getParams().get("description")); - DWSFlow dwsFlow = new DWSFlow(); - dwsFlow.setId(op.getId()); - dwsFlow.setName(op.getParams().get("name").toString()); - dwsFlow.setDescription(op.getParams().get("description").toString()); - dwsFlowService.updateFlowBaseInfo(dwsFlow, Long.valueOf(op.getParams().get("projectVersionID").toString()), null); - afterOperateSubFlow(dwsFlowService,op); + DSSFlow dssFlow = new DSSFlow(); + dssFlow.setId(op.getId()); + dssFlow.setName(op.getParams().get("name").toString()); + dssFlow.setDescription(op.getParams().get("description").toString()); + dssFlowService.updateFlowBaseInfo(dssFlow, Long.valueOf(op.getParams().get("projectVersionID").toString()), null); + afterOperateSubFlow(dssFlowService,op); } @Override - public void delete(DWSFlowService dwsFlowService,Op op) throws DSSErrorException { + public void delete(DSSFlowService dssFlowService, Op op) throws DSSErrorException { logger.info("delete subFlow{}",op.getId()); - dwsFlowService.batchDeleteFlow(Arrays.asList(op.getId()), 
Long.valueOf(op.getParams().get("projectVersionID").toString())); - afterOperateSubFlow(dwsFlowService,op); + dssFlowService.batchDeleteFlow(Arrays.asList(op.getId()), Long.valueOf(op.getParams().get("projectVersionID").toString())); + afterOperateSubFlow(dssFlowService,op); } - private void afterOperateSubFlow(DWSFlowService dwsFlowService,Op op) throws DSSErrorException { + private void afterOperateSubFlow(DSSFlowService dssFlowService, Op op) throws DSSErrorException { //更新工作流基本信息 - DWSFlow dwsFlow = new DWSFlow(); - dwsFlow.setId(op.getId()); - dwsFlow.setHasSaved(true); - dwsFlowService.updateFlowBaseInfo(dwsFlow, Long.valueOf(op.getParams().get("projectVersionID").toString()), null); + DSSFlow dssFlow = new DSSFlow(); + dssFlow.setId(op.getId()); + dssFlow.setHasSaved(true); + dssFlowService.updateFlowBaseInfo(dssFlow, Long.valueOf(op.getParams().get("projectVersionID").toString()), null); } } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishJobFactory.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishJobFactory.java index 304fc444e8cb63d22de021d523de8dc6e962854b..fff485910450b3865917276299cc88c7e21803dd 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishJobFactory.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishJobFactory.java @@ -19,7 +19,7 @@ package com.webank.wedatasphere.dss.server.publish; import com.webank.wedatasphere.dss.server.conf.DSSServerConf; -import com.webank.wedatasphere.dss.server.service.DWSProjectService; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Component; @@ -31,7 +31,7 @@ public class PublishJobFactory { @Autowired private PublishManager publishManager; @Autowired - private DWSProjectService projectService; + private DSSProjectService projectService; public 
PublishSubmitJob createSubmitPublishJob(Long projectVersionID, String user, String comment){ PublishSubmitJob job = new PublishSubmitJob(projectService, user, comment, projectVersionID); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishManager.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishManager.java index 60e9205de222d847798966d79c45df8f67fa4ed8..c1f29c4e290eecfe24d511559311cb997d6be478 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishManager.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishManager.java @@ -18,8 +18,8 @@ package com.webank.wedatasphere.dss.server.publish; -import com.webank.wedatasphere.dss.server.service.DWSProjectService; -import com.webank.wedatasphere.dss.common.entity.project.DWSProjectPublishHistory; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; +import com.webank.wedatasphere.dss.common.entity.project.DSSProjectPublishHistory; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import com.webank.wedatasphere.linkis.common.utils.Utils; import org.slf4j.Logger; @@ -40,7 +40,7 @@ public class PublishManager implements PublishListner { private Logger logger = LoggerFactory.getLogger(this.getClass()); @Autowired - private DWSProjectService projectService; + private DSSProjectService projectService; private Map cacheMap = new ConcurrentHashMap<>(); @@ -105,7 +105,7 @@ public class PublishManager implements PublishListner { if (cacheMap.get(projectVersionID) == null) { synchronized (cacheMap) { if (cacheMap.get(projectVersionID) == null) { - DWSProjectPublishHistory history = projectService.getPublishHistoryByID(projectVersionID); + DSSProjectPublishHistory history = projectService.getPublishHistoryByID(projectVersionID); if (history == null) { logger.info("创建一个发布job" + projectVersionID); projectService.createPublishHistory(comment, creatorID, 
projectVersionID); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishSubmitJob.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishSubmitJob.java index 769bd9dbfd87a26500212d1a745907e4207c4ba7..af04335f5efd7a0f57a29eb9b98fae886fdb42b3 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishSubmitJob.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/publish/PublishSubmitJob.java @@ -18,16 +18,16 @@ package com.webank.wedatasphere.dss.server.publish; -import com.webank.wedatasphere.dss.server.service.DWSProjectService; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; public class PublishSubmitJob extends PublishJob { - private DWSProjectService projectService; + private DSSProjectService projectService; private String comment; - public PublishSubmitJob(DWSProjectService projectService, String user, String comment, Long projectVersionID) { + public PublishSubmitJob(DSSProjectService projectService, String user, String comment, Long projectVersionID) { this.projectService = projectService; this.user = user; this.comment = comment; diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/FlowRestfulApi.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/FlowRestfulApi.java index 74b8520342f0d2d06f08659a4f9ec23353cc75c4..102518263b0a32f0cc88f45867b255413868ce3e 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/FlowRestfulApi.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/FlowRestfulApi.java @@ -19,11 +19,11 @@ package com.webank.wedatasphere.dss.server.restful; import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; -import com.webank.wedatasphere.dss.server.service.DWSFlowService; -import com.webank.wedatasphere.dss.server.service.DWSProjectService; -import 
com.webank.wedatasphere.dss.server.service.DWSUserService; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlowVersion; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; +import com.webank.wedatasphere.dss.server.service.DSSFlowService; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; +import com.webank.wedatasphere.dss.server.service.DSSUserService; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlowVersion; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import com.webank.wedatasphere.dss.server.operate.Op; import com.webank.wedatasphere.dss.server.publish.PublishManager; @@ -54,9 +54,9 @@ import java.util.List; public class FlowRestfulApi { @Autowired - private DWSFlowService flowService; + private DSSFlowService flowService; @Autowired - private DWSUserService dwsUserService; + private DSSUserService dssUserService; @Autowired private PublishManager publishManager; @@ -65,7 +65,7 @@ public class FlowRestfulApi { @GET @Path("/listAllFlowVersions") public Response listAllVersions(@Context HttpServletRequest req, @QueryParam("id")Long flowID,@QueryParam("projectVersionID")Long projectVersionID) { - List versions = flowService.listAllFlowVersions(flowID,projectVersionID); + List versions = flowService.listAllFlowVersions(flowID,projectVersionID); return Message.messageToResponse(Message.ok().data("versions",versions)); } @@ -83,30 +83,30 @@ public class FlowRestfulApi { String uses = json.get("uses") == null?null:json.get("uses").getTextValue(); if(taxonomyID == null && parentFlowID == null) throw new DSSErrorException(90009,"请求选择工作流分类"); publishManager.checkeIsPublishing(projectVersionID); - DWSFlow dwsFlow = new DWSFlow(); - dwsFlow.setProjectID(projectService.getProjectByProjectVersionID(projectVersionID).getId()); - dwsFlow.setName(name); - dwsFlow.setDescription(description); - 
dwsFlow.setCreatorID(dwsUserService.getUserID(userName)); - dwsFlow.setCreateTime(new Date()); - dwsFlow.setState(false); - dwsFlow.setSource("create by user"); - dwsFlow.setUses(uses); + DSSFlow dssFlow = new DSSFlow(); + dssFlow.setProjectID(projectService.getProjectByProjectVersionID(projectVersionID).getId()); + dssFlow.setName(name); + dssFlow.setDescription(description); + dssFlow.setCreatorID(dssUserService.getUserID(userName)); + dssFlow.setCreateTime(new Date()); + dssFlow.setState(false); + dssFlow.setSource("create by user"); + dssFlow.setUses(uses); if(parentFlowID == null){ - dwsFlow.setRootFlow(true); - dwsFlow.setRank(0); - dwsFlow.setHasSaved(true); - dwsFlow = flowService.addRootFlow(dwsFlow,taxonomyID,projectVersionID); + dssFlow.setRootFlow(true); + dssFlow.setRank(0); + dssFlow.setHasSaved(true); + dssFlow = flowService.addRootFlow(dssFlow,taxonomyID,projectVersionID); }else { - dwsFlow.setRootFlow(false); + dssFlow.setRootFlow(false); Integer rank = flowService.getParentRank(parentFlowID); // TODO: 2019/6/3 并发问题考虑for update - dwsFlow.setRank(rank+1); - dwsFlow.setHasSaved(false); - dwsFlow = flowService.addSubFlow(dwsFlow,parentFlowID,projectVersionID); + dssFlow.setRank(rank+1); + dssFlow.setHasSaved(false); + dssFlow = flowService.addSubFlow(dssFlow,parentFlowID,projectVersionID); } // TODO: 2019/5/16 空值校验,重复名校验 - return Message.messageToResponse(Message.ok().data("flow",dwsFlow)); + return Message.messageToResponse(Message.ok().data("flow", dssFlow)); } @POST @@ -121,12 +121,12 @@ public class FlowRestfulApi { publishManager.checkeIsPublishing(projectVersionID); // TODO: 2019/6/13 projectVersionID的更新校验 //这里可以不做事务 - DWSFlow dwsFlow = new DWSFlow(); - dwsFlow.setId(flowID); - dwsFlow.setName(name); - dwsFlow.setDescription(description); - dwsFlow.setUses(uses); - flowService.updateFlowBaseInfo(dwsFlow,projectVersionID,taxonomyID); + DSSFlow dssFlow = new DSSFlow(); + dssFlow.setId(flowID); + dssFlow.setName(name); + 
dssFlow.setDescription(description); + dssFlow.setUses(uses); + flowService.updateFlowBaseInfo(dssFlow,projectVersionID,taxonomyID); return Message.messageToResponse(Message.ok()); } @@ -134,14 +134,14 @@ public class FlowRestfulApi { @Path("/get") public Response get(@Context HttpServletRequest req, @QueryParam("id")Long flowID,@QueryParam("version")String version,@QueryParam("projectVersionID")Long projectVersionID) throws DSSErrorException { // TODO: 2019/5/23 id空值判断 - DWSFlow dwsFlow; + DSSFlow dssFlow; if (StringUtils.isEmpty(version)){ - dwsFlow = flowService.getLatestVersionFlow(flowID,projectVersionID); - dwsFlow.setFlowVersions(Arrays.asList(dwsFlow.getLatestVersion())); + dssFlow = flowService.getLatestVersionFlow(flowID,projectVersionID); + dssFlow.setFlowVersions(Arrays.asList(dssFlow.getLatestVersion())); }else { - dwsFlow = flowService.getOneVersionFlow(flowID, version,projectVersionID); + dssFlow = flowService.getOneVersionFlow(flowID, version,projectVersionID); } - return Message.messageToResponse(Message.ok().data("flow",dwsFlow)); + return Message.messageToResponse(Message.ok().data("flow", dssFlow)); } @POST @@ -175,6 +175,6 @@ public class FlowRestfulApi { } @Autowired - private DWSProjectService projectService; + private DSSProjectService projectService; } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/FlowTaxonomyRestfulApi.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/FlowTaxonomyRestfulApi.java index 5e6f3be56481e77fff71d03095dc307119eb116c..bc43c5ab680dd3c060e0bfcc8ea8098281db3e7e 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/FlowTaxonomyRestfulApi.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/FlowTaxonomyRestfulApi.java @@ -18,10 +18,10 @@ package com.webank.wedatasphere.dss.server.restful; -import com.webank.wedatasphere.dss.server.service.DWSFlowTaxonomyService; -import 
com.webank.wedatasphere.dss.server.service.DWSProjectService; -import com.webank.wedatasphere.dss.server.service.DWSUserService; -import com.webank.wedatasphere.dss.server.entity.DWSFlowTaxonomy; +import com.webank.wedatasphere.dss.server.service.DSSFlowTaxonomyService; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; +import com.webank.wedatasphere.dss.server.service.DSSUserService; +import com.webank.wedatasphere.dss.server.entity.DSSFlowTaxonomy; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import com.webank.wedatasphere.dss.server.publish.PublishManager; import com.webank.wedatasphere.linkis.server.Message; @@ -48,11 +48,11 @@ import java.util.Date; public class FlowTaxonomyRestfulApi { @Autowired - private DWSFlowTaxonomyService flowTaxonomyService; + private DSSFlowTaxonomyService flowTaxonomyService; @Autowired - private DWSUserService dwsUserService; + private DSSUserService dssUserService; @Autowired - private DWSProjectService projectService; + private DSSProjectService projectService; @Autowired private PublishManager publishManager; @@ -65,15 +65,15 @@ public class FlowTaxonomyRestfulApi { Long projectVersionID = json.get("projectVersionID").getLongValue(); publishManager.checkeIsPublishing(projectVersionID); // TODO: 2019/5/16 空值校验,重复名校验 - DWSFlowTaxonomy dwsFlowTaxonomy = new DWSFlowTaxonomy(); + DSSFlowTaxonomy dssFlowTaxonomy = new DSSFlowTaxonomy(); Date date = new Date(); - dwsFlowTaxonomy.setName(name); - dwsFlowTaxonomy.setDescription(description); - dwsFlowTaxonomy.setCreatorID(dwsUserService.getUserID(userName)); - dwsFlowTaxonomy.setCreateTime(date); - dwsFlowTaxonomy.setUpdateTime(date); - dwsFlowTaxonomy.setProjectID(projectService.getProjectByProjectVersionID(projectVersionID).getId()); - flowTaxonomyService.addFlowTaxonomy(dwsFlowTaxonomy,projectVersionID); + dssFlowTaxonomy.setName(name); + dssFlowTaxonomy.setDescription(description); + 
dssFlowTaxonomy.setCreatorID(dssUserService.getUserID(userName)); + dssFlowTaxonomy.setCreateTime(date); + dssFlowTaxonomy.setUpdateTime(date); + dssFlowTaxonomy.setProjectID(projectService.getProjectByProjectVersionID(projectVersionID).getId()); + flowTaxonomyService.addFlowTaxonomy(dssFlowTaxonomy,projectVersionID); return Message.messageToResponse(Message.ok()); } @POST @@ -86,12 +86,12 @@ public class FlowTaxonomyRestfulApi { publishManager.checkeIsPublishing(projectVersionID); // TODO: 2019/6/13 projectVersionID的更新校验 // TODO: 2019/5/16 空值校验,重复名校验 - DWSFlowTaxonomy dwsFlowTaxonomy = new DWSFlowTaxonomy(); - dwsFlowTaxonomy.setId(id); - dwsFlowTaxonomy.setName(name); - dwsFlowTaxonomy.setDescription(description); - dwsFlowTaxonomy.setUpdateTime(new Date()); - flowTaxonomyService.updateFlowTaxonomy(dwsFlowTaxonomy,projectVersionID); + DSSFlowTaxonomy dssFlowTaxonomy = new DSSFlowTaxonomy(); + dssFlowTaxonomy.setId(id); + dssFlowTaxonomy.setName(name); + dssFlowTaxonomy.setDescription(description); + dssFlowTaxonomy.setUpdateTime(new Date()); + flowTaxonomyService.updateFlowTaxonomy(dssFlowTaxonomy,projectVersionID); return Message.messageToResponse(Message.ok()); } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/NodeRestfulApi.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/NodeRestfulApi.java index 69df3212667d31ce6959d53778f38e9f7a9637e7..ec1c43a228ba8544299dd34900fe5f2706624797 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/NodeRestfulApi.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/NodeRestfulApi.java @@ -20,7 +20,6 @@ package com.webank.wedatasphere.dss.server.restful; import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; import com.webank.wedatasphere.dss.appjoint.execution.core.CommonAppJointNode; -import com.webank.wedatasphere.dss.appjoint.service.NodeService; import 
com.webank.wedatasphere.dss.application.entity.Application; import com.webank.wedatasphere.dss.application.service.ApplicationService; import com.webank.wedatasphere.dss.application.util.ApplicationUtils; @@ -28,7 +27,7 @@ import com.webank.wedatasphere.dss.server.entity.NodeInfo; import com.webank.wedatasphere.dss.server.function.FunctionInvoker; import com.webank.wedatasphere.dss.server.function.FunctionPool; import com.webank.wedatasphere.dss.server.function.NodeServiceFunction; -import com.webank.wedatasphere.dss.server.service.DWSNodeInfoService; +import com.webank.wedatasphere.dss.server.service.DSSNodeInfoService; import com.webank.wedatasphere.linkis.server.Message; import com.webank.wedatasphere.linkis.server.security.SecurityFilter; import org.slf4j.Logger; @@ -54,7 +53,7 @@ public class NodeRestfulApi { private Logger logger = LoggerFactory.getLogger(this.getClass()); @Autowired - private DWSNodeInfoService nodeInfoService; + private DSSNodeInfoService nodeInfoService; @Autowired private ApplicationService applicationService; diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/ProjectRestfulApi.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/ProjectRestfulApi.java index f8ef9237fa8dcbb09bd632d48cb9c1b695a2b299..be22ec75824d233f393d23fd1ee423563a0fd632 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/ProjectRestfulApi.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/ProjectRestfulApi.java @@ -19,16 +19,16 @@ package com.webank.wedatasphere.dss.server.restful; import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; -import com.webank.wedatasphere.dss.common.entity.project.DWSProjectVersion; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProjectVersion; import 
com.webank.wedatasphere.dss.common.exception.DSSErrorException; -import com.webank.wedatasphere.dss.server.dao.DWSUserMapper; +import com.webank.wedatasphere.dss.server.dao.DSSUserMapper; import com.webank.wedatasphere.dss.server.dao.ProjectMapper; import com.webank.wedatasphere.dss.server.entity.ApplicationArea; import com.webank.wedatasphere.dss.server.publish.*; -import com.webank.wedatasphere.dss.server.service.DWSProjectService; -import com.webank.wedatasphere.dss.server.service.DWSProjectTaxonomyService; -import com.webank.wedatasphere.dss.server.service.DWSUserService; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; +import com.webank.wedatasphere.dss.server.service.DSSProjectTaxonomyService; +import com.webank.wedatasphere.dss.server.service.DSSUserService; import com.webank.wedatasphere.linkis.server.Message; import com.webank.wedatasphere.linkis.server.security.SecurityFilter; import org.codehaus.jackson.JsonNode; @@ -53,11 +53,11 @@ import java.util.concurrent.Future; public class ProjectRestfulApi { @Autowired - private DWSProjectTaxonomyService projectTaxonomyService; + private DSSProjectTaxonomyService projectTaxonomyService; @Autowired - private DWSProjectService projectService; + private DSSProjectService projectService; @Autowired - private DWSUserService dwsUserService; + private DSSUserService dssUserService; @Autowired private ProjectMapper projectMapper; @Autowired @@ -66,12 +66,12 @@ public class ProjectRestfulApi { private PublishManager publishManager; @Autowired - private DWSUserMapper dwsUserMapper; + private DSSUserMapper dssUserMapper; @GET @Path("/listAllProjectVersions") public Response listAllVersions(@Context HttpServletRequest req, @QueryParam("id") Long projectID) { - List versions = projectService.listAllProjectVersions(projectID); + List versions = projectService.listAllProjectVersions(projectID); return Message.messageToResponse(Message.ok().data("versions", versions)); } @@ -101,8 +101,9 @@ public class 
ProjectRestfulApi { String product = json.get("product").getTextValue(); Integer applicationArea = json.get("applicationArea").getIntValue(); String business = json.get("business").getTextValue(); + Long workspaceId = json.get("workspaceId") == null ? 1 : json.get("workspaceId").getLongValue(); // TODO: 2019/5/16 空值校验,重复名校验 - projectService.addProject(userName, name, description, taxonomyID,product,applicationArea,business); + projectService.addProject(userName, name, description, taxonomyID,product,applicationArea,business, workspaceId); return Message.messageToResponse(Message.ok()); } @@ -144,7 +145,7 @@ public class ProjectRestfulApi { String userName = SecurityFilter.getLoginUsername(req); Long projectID = json.get("projectID").getLongValue(); String projectName = json.get("projectName") == null ? null : json.get("projectName").getTextValue(); - DWSProjectVersion maxVersion = projectMapper.selectLatestVersionByProjectID(projectID); + DSSProjectVersion maxVersion = projectMapper.selectLatestVersionByProjectID(projectID); projectService.copyProject( maxVersion.getId(),projectID, projectName, userName); return Message.messageToResponse(Message.ok()); } @@ -154,8 +155,8 @@ public class ProjectRestfulApi { public Response copyProjectVersion(@Context HttpServletRequest req, JsonNode json) throws InterruptedException, DSSErrorException { String userName = SecurityFilter.getLoginUsername(req); Long copyprojectVersionID = json.get("projectVersionID").getLongValue(); - DWSProjectVersion currentVersion = projectMapper.selectProjectVersionByID(copyprojectVersionID); - DWSProjectVersion maxVersion = projectMapper.selectLatestVersionByProjectID(currentVersion.getProjectID()); + DSSProjectVersion currentVersion = projectMapper.selectProjectVersionByID(copyprojectVersionID); + DSSProjectVersion maxVersion = projectMapper.selectLatestVersionByProjectID(currentVersion.getProjectID()); projectService.copyProjectVersionMax( maxVersion.getId(), 
maxVersion,currentVersion,userName,null); return Message.messageToResponse(Message.ok()); @@ -167,8 +168,8 @@ public class ProjectRestfulApi { String userName = SecurityFilter.getLoginUsername(req); Long projectID = json.get("id").getLongValue(); String comment = json.get("comment").getTextValue(); - DWSProject latestVersionProject = projectService.getLatestVersionProject(projectID); - publishManager.addPublishCache(latestVersionProject.getLatestVersion().getId(),dwsUserService.getUserID(userName),comment); + DSSProject latestVersionProject = projectService.getLatestVersionProject(projectID); + publishManager.addPublishCache(latestVersionProject.getLatestVersion().getId(), dssUserService.getUserID(userName),comment); PublishSubmitJob job = publishJobFactory.createSubmitPublishJob(latestVersionProject.getLatestVersion().getId(), userName, comment); Future submit = PublishThreadPool.get().submit(job); PublishSubmitJobDeamon deamon = publishJobFactory.createSubmitPublishJobDeamon(submit, job); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/ProjectTaxonomyRestfulApi.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/ProjectTaxonomyRestfulApi.java index f29e1a147b36b7cb03764fe4ea98029aba391fe5..4852c0fe9bb46bf58b29be5104a9771920797e5c 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/ProjectTaxonomyRestfulApi.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/ProjectTaxonomyRestfulApi.java @@ -18,9 +18,9 @@ package com.webank.wedatasphere.dss.server.restful; -import com.webank.wedatasphere.dss.server.service.DWSProjectTaxonomyService; -import com.webank.wedatasphere.dss.server.service.DWSUserService; -import com.webank.wedatasphere.dss.server.entity.DWSProjectTaxonomy; +import com.webank.wedatasphere.dss.server.service.DSSProjectTaxonomyService; +import com.webank.wedatasphere.dss.server.service.DSSUserService; +import 
com.webank.wedatasphere.dss.server.entity.DSSProjectTaxonomy; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import com.webank.wedatasphere.linkis.server.Message; import com.webank.wedatasphere.linkis.server.security.SecurityFilter; @@ -46,9 +46,9 @@ import java.util.Date; public class ProjectTaxonomyRestfulApi { @Autowired - private DWSProjectTaxonomyService projectTaxonomyService; + private DSSProjectTaxonomyService projectTaxonomyService; @Autowired - private DWSUserService dwsUserService; + private DSSUserService dssUserService; @POST @Path("/addProjectTaxonomy") @@ -57,14 +57,14 @@ public class ProjectTaxonomyRestfulApi { String name = json.get("name").getTextValue(); String description = json.get("description").getTextValue(); // TODO: 2019/5/16 空值校验,重复名校验 - DWSProjectTaxonomy dwsProjectTaxonomy = new DWSProjectTaxonomy(); + DSSProjectTaxonomy dssProjectTaxonomy = new DSSProjectTaxonomy(); Date date = new Date(); - dwsProjectTaxonomy.setName(name); - dwsProjectTaxonomy.setDescription(description); - dwsProjectTaxonomy.setCreatorID(dwsUserService.getUserID(userName)); - dwsProjectTaxonomy.setCreateTime(date); - dwsProjectTaxonomy.setUpdateTime(date); - projectTaxonomyService.addProjectTaxonomy(dwsProjectTaxonomy); + dssProjectTaxonomy.setName(name); + dssProjectTaxonomy.setDescription(description); + dssProjectTaxonomy.setCreatorID(dssUserService.getUserID(userName)); + dssProjectTaxonomy.setCreateTime(date); + dssProjectTaxonomy.setUpdateTime(date); + projectTaxonomyService.addProjectTaxonomy(dssProjectTaxonomy); return Message.messageToResponse(Message.ok()); } @POST @@ -74,12 +74,12 @@ public class ProjectTaxonomyRestfulApi { String description = json.get("description") == null?null:json.get("description").getTextValue(); Long id = json.get("id").getLongValue(); // TODO: 2019/5/16 空值校验,重复名校验 - DWSProjectTaxonomy dwsProjectTaxonomy = new DWSProjectTaxonomy(); - dwsProjectTaxonomy.setId(id); - dwsProjectTaxonomy.setName(name); - 
dwsProjectTaxonomy.setDescription(description); - dwsProjectTaxonomy.setUpdateTime(new Date()); - projectTaxonomyService.updateProjectTaxonomy(dwsProjectTaxonomy); + DSSProjectTaxonomy dssProjectTaxonomy = new DSSProjectTaxonomy(); + dssProjectTaxonomy.setId(id); + dssProjectTaxonomy.setName(name); + dssProjectTaxonomy.setDescription(description); + dssProjectTaxonomy.setUpdateTime(new Date()); + projectTaxonomyService.updateProjectTaxonomy(dssProjectTaxonomy); return Message.messageToResponse(Message.ok()); } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/TreeRestfulApi.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/TreeRestfulApi.java index 1b9d2806b5067184b4fb81268376b8cd6617d3d3..fde36296e8c084603e3676622da8a935fd7c75db 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/TreeRestfulApi.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/TreeRestfulApi.java @@ -26,13 +26,11 @@ import com.webank.wedatasphere.linkis.server.Message; import com.webank.wedatasphere.linkis.server.security.SecurityFilter; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Component; +import org.springframework.web.bind.annotation.RequestParam; import scala.Enumeration; import javax.servlet.http.HttpServletRequest; -import javax.ws.rs.Consumes; -import javax.ws.rs.GET; -import javax.ws.rs.Path; -import javax.ws.rs.Produces; +import javax.ws.rs.*; import javax.ws.rs.core.Context; import javax.ws.rs.core.MediaType; import javax.ws.rs.core.Response; diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/UserManagerApi.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/UserManagerApi.java new file mode 100644 index 0000000000000000000000000000000000000000..a249e7ae3b4a2b9d182e8a7c28702615a8c8d1d8 --- /dev/null +++ 
b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/UserManagerApi.java @@ -0,0 +1,92 @@ +package com.webank.wedatasphere.dss.server.restful; + +import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; +import com.webank.wedatasphere.dss.appjoint.scheduler.SchedulerAppJoint; +import com.webank.wedatasphere.dss.application.conf.ApplicationConf; +import com.webank.wedatasphere.dss.application.service.ApplicationService; +import com.webank.wedatasphere.dss.server.constant.DSSServerConstant; +import com.webank.wedatasphere.linkis.server.Message; +import com.webank.wedatasphere.linkis.server.security.SecurityFilter; +import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody; +import com.webank.wedatasphpere.dss.user.service.AbsCommand; +import com.webank.wedatasphpere.dss.user.service.impl.UserAuthorizationClient; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +import javax.servlet.http.HttpServletRequest; +import javax.ws.rs.*; +import javax.ws.rs.core.Context; +import javax.ws.rs.core.MediaType; +import javax.ws.rs.core.Response; + +/** + * @program: user-authorization + * @description: External API of Luban, covering provisioning and user registration + * @create: 2020-08-12 14:24 + **/ + +@Component +@Path("/dss") +@Produces(MediaType.APPLICATION_JSON) +@Consumes(MediaType.APPLICATION_JSON) +public class UserManagerApi { + + private UserAuthorizationClient client = new UserAuthorizationClient(); + private Logger logger = LoggerFactory.getLogger(this.getClass()); + + @Autowired + private ApplicationService applicationService; + + private SchedulerAppJoint schedulerAppJoint; + + @POST + @Path("/user") + public Response createUser(@Context HttpServletRequest req, AuthorizationBody body) { + String username = SecurityFilter.getLoginUsername(req); + String superUserName = ApplicationConf.SUPER_USER_NAME; 
if(!username.equals(superUserName)){ + return Message.messageToResponse(Message.error(DSSServerConstant.SUPER_USER_LOGIN_ERROR)); + } + + try { + String result = client.authorization(body); + + if(result.equals(AbsCommand.SUCCESS)){ + schedulerAppJoint = getSchedulerAppJoint(); + if(schedulerAppJoint != null){ + try{ + schedulerAppJoint.getSecurityService().reloadToken(); + }catch (Throwable throwable){ + logger.warn("Failed to reload the Schedulis token, ignoring.", throwable); + } + + } + return Message.messageToResponse(Message.ok()); + }else { + return Message.messageToResponse(Message.error(AbsCommand.ERROR)); + } + } catch (Exception e) { + return Message.messageToResponse(Message.error(e.getMessage())); + } + + + + + } + + private SchedulerAppJoint getSchedulerAppJoint(){ + + if(schedulerAppJoint == null){ + try { + schedulerAppJoint = (SchedulerAppJoint)applicationService.getAppjoint("schedulis"); + } catch (AppJointErrorException e) { + logger.error("Schedule system init failed!", e); + } + } + return schedulerAppJoint; + } + + +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/WorkspaceRestfulApi.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/WorkspaceRestfulApi.java new file mode 100644 index 0000000000000000000000000000000000000000..4f85a55a4dc435ef5680b8d3c079d59c16b3bdff --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/restful/WorkspaceRestfulApi.java @@ -0,0 +1,172 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.server.restful; + +import com.webank.wedatasphere.dss.server.dto.response.HomepageDemoMenuVo; +import com.webank.wedatasphere.dss.server.dto.response.HomepageVideoVo; +import com.webank.wedatasphere.dss.server.dto.response.OnestopMenuVo; +import com.webank.wedatasphere.dss.server.entity.DSSWorkspace; +import com.webank.wedatasphere.dss.server.dto.response.WorkspaceDepartmentVo; +import com.webank.wedatasphere.dss.server.dto.response.*; +import com.webank.wedatasphere.dss.server.service.DSSUserService; +import com.webank.wedatasphere.dss.server.service.DSSWorkspaceService; +import com.webank.wedatasphere.linkis.server.Message; +import com.webank.wedatasphere.linkis.server.security.SecurityFilter; +import org.codehaus.jackson.JsonNode; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +import javax.servlet.http.HttpServletRequest; +import javax.ws.rs.*; +import javax.ws.rs.core.Context; +import javax.ws.rs.core.MediaType; +import javax.ws.rs.core.Response; +import java.util.List; + +/** + * Created by schumiyi on 2020/6/19 + */ +@Component +@Path("/dss") +@Produces(MediaType.APPLICATION_JSON) +@Consumes(MediaType.APPLICATION_JSON) +public class WorkspaceRestfulApi { + + @Autowired + private DSSWorkspaceService dssWorkspaceService; + + @Autowired + private DSSUserService dssUserService; + + @GET + @Path("/workspaces") + public Response getAllWorkspaces(@Context HttpServletRequest req) { + // TODO: Order By time + List<DSSWorkspace> workspaces = dssWorkspaceService.getWorkspaces(); + return Message.messageToResponse(Message.ok().data("workspaces", workspaces)); + } + + @GET + @Path("/workspaces/{id}") + public Response getWorkspacesById(@Context HttpServletRequest req, @PathParam("id") Long id) { + DSSWorkspace workspace = dssWorkspaceService.getWorkspacesById(id); 
+ return Message.messageToResponse(Message.ok().data("workspace", workspace)); + } + + @GET + @Path("/workspaces/departments") + public Response getAllWorkspaceDepartments(@Context HttpServletRequest req) { + List<WorkspaceDepartmentVo> departments = dssWorkspaceService.getWorkSpaceDepartments(); + return Message.messageToResponse(Message.ok().data("departments", departments)); + } + + @GET + @Path("/workspaces/exists") + public Response getUsernameExistence(@Context HttpServletRequest req, @QueryParam("name") String name) { + boolean exists = dssWorkspaceService.existWorkspaceName(name); + return Message.messageToResponse(Message.ok().data("workspaceNameExists", exists)); + } + + @POST + @Path("/workspaces") + public Response addWorkspace(@Context HttpServletRequest req, JsonNode json) { + String userName = SecurityFilter.getLoginUsername(req); + String name = json.get("name").getTextValue(); + if (dssWorkspaceService.existWorkspaceName(name)) { + return Message.messageToResponse(Message.error("Workspace name already exists")); + } + String department = json.get("department").getTextValue(); + String label = json.get("label").getTextValue(); + String description = json.get("description").getTextValue(); + Long workspaceId = dssWorkspaceService.addWorkspace(userName, name, department, label, description); + return Message.messageToResponse(Message.ok().data("workspaceId", workspaceId)); + } + + @GET + @Path("/workspaces/demos") + public Response getAllHomepageDemos(@Context HttpServletRequest req) { + String header = req.getHeader("Content-language"); + boolean isChinese = header != null && "zh-CN".equals(header.trim()); + List<HomepageDemoMenuVo> homepageDemos = dssWorkspaceService.getHomepageDemos(isChinese); + return Message.messageToResponse(Message.ok().data("demos", homepageDemos)); + } + + @GET + @Path("/workspaces/videos") + public Response getAllVideos(@Context HttpServletRequest req) { + String header = req.getHeader("Content-language"); + boolean isChinese = header != null && "zh-CN".equals(header.trim()); + List<HomepageVideoVo> homepageVideos = 
dssWorkspaceService.getHomepageVideos(isChinese); + return Message.messageToResponse(Message.ok().data("videos", homepageVideos)); + } + + @GET + @Path("workspaces/{workspaceId}/managements") + public Response getWorkspaceManagements(@Context HttpServletRequest req, @PathParam("workspaceId") Long workspaceId) { + String header = req.getHeader("Content-language"); + boolean isChinese = header != null && "zh-CN".equals(header.trim()); + String username = SecurityFilter.getLoginUsername(req); + + List<OnestopMenuVo> managements = dssWorkspaceService.getWorkspaceManagements(workspaceId, username, isChinese); + return Message.messageToResponse(Message.ok().data("managements", managements)); + } + + @GET + @Path("workspaces/{workspaceId}/applications") + public Response getWorkspaceApplications(@Context HttpServletRequest req, @PathParam("workspaceId") Long workspaceId) { + String header = req.getHeader("Content-language"); + boolean isChinese = header != null && "zh-CN".equals(header.trim()); + String username = SecurityFilter.getLoginUsername(req); + List<OnestopMenuVo> applications = dssWorkspaceService.getWorkspaceApplications(workspaceId, username, isChinese); + return Message.messageToResponse(Message.ok().data("applications", applications)); + } + + @GET + @Path("/workspaces/{workspaceId}/favorites") + public Response getWorkspaceFavorites(@Context HttpServletRequest req, @PathParam("workspaceId") Long workspaceId) { + String header = req.getHeader("Content-language"); + boolean isChinese = header != null && "zh-CN".equals(header.trim()); + String username = SecurityFilter.getLoginUsername(req); + List favorites = dssWorkspaceService.getWorkspaceFavorites(workspaceId, username, isChinese); + return Message.messageToResponse(Message.ok().data("favorites", favorites)); + } + + /** + * Add an application to the favorites, and return the id of the new favorite + * + * @param req + * @param json + * @return + */ + @POST + @Path("/workspaces/{workspaceId}/favorites") + public Response addFavorite(@Context HttpServletRequest req, @PathParam("workspaceId") Long workspaceId, JsonNode json) { + String username = 
SecurityFilter.getLoginUsername(req); + Long menuApplicationId = json.get("menuApplicationId").getLongValue(); + Long favoriteId = dssWorkspaceService.addFavorite(username, workspaceId, menuApplicationId); + return Message.messageToResponse(Message.ok().data("favoriteId", favoriteId)); + } + + @DELETE + @Path("/workspaces/{workspaceId}/favorites/{favouritesId}") + public Response deleteFavorite(@Context HttpServletRequest req, @PathParam("workspaceId") Long workspaceId, @PathParam("favouritesId") Long favouritesId) { + String username = SecurityFilter.getLoginUsername(req); + Long favoriteId = dssWorkspaceService.deleteFavorite(username, favouritesId); + return Message.messageToResponse(Message.ok().data("favoriteId", favoriteId)); + } +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSFlowService.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSFlowService.java similarity index 64% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSFlowService.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSFlowService.java index 384748ee4b44b872af39e6b461ba014f9a8e7576..12d0f577098db18ee17616841660a8e3ca331ca2 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSFlowService.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSFlowService.java @@ -17,52 +17,48 @@ package com.webank.wedatasphere.dss.server.service; - - - - import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlowVersion; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlowVersion; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import com.webank.wedatasphere.dss.server.operate.Op; 
import java.util.List; -public interface DWSFlowService { - DWSFlow getFlowByID(Long id); +public interface DSSFlowService { + DSSFlow getFlowByID(Long id); - List<DWSFlowVersion> listAllFlowVersions(Long flowID, Long projectVersionID); + List<DSSFlowVersion> listAllFlowVersions(Long flowID, Long projectVersionID); - DWSFlow addRootFlow(DWSFlow dwsFlow, Long taxonomyID, Long projectVersionID) throws DSSErrorException; + DSSFlow addRootFlow(DSSFlow dssFlow, Long taxonomyID, Long projectVersionID) throws DSSErrorException; - DWSFlow addSubFlow(DWSFlow dwsFlow, Long parentFlowID, Long projectVersionID) throws DSSErrorException; + DSSFlow addSubFlow(DSSFlow dssFlow, Long parentFlowID, Long projectVersionID) throws DSSErrorException; /** - * Get the latest version of the dwsFlow by flowID; the version info is in latestVersion + * Get the latest version of the DSSFlow by flowID; the version info is in latestVersion * @param flowID * @return */ - DWSFlow getLatestVersionFlow(Long flowID, Long projectVersionID) throws DSSErrorException; + DSSFlow getLatestVersionFlow(Long flowID, Long projectVersionID) throws DSSErrorException; /** - * Get a dwsFlow by flowID and a version number; the version info is the first element of the versions array + * Get a DSSFlow by flowID and a version number; the version info is the first element of the versions array * @param flowID * @return */ - DWSFlow getOneVersionFlow(Long flowID, String version, Long projectVersionID); + DSSFlow getOneVersionFlow(Long flowID, String version, Long projectVersionID); /** - * Get the latest json from the dwsFlow object; actually a flowID alone should be enough here - * @param dwsFlow + * Get the latest json from the DSSFlow object; actually a flowID alone should be enough here + * @param dssFlow * @return */ -/* String getLatestJsonByFlow(DWSFlow dwsFlow); +/* String getLatestJsonByFlow(DSSFlow dssFlow); - DWSFlow getLatestVersionFlow(Long ProjectID,String flowName);*/ + DSSFlow getLatestVersionFlow(Long ProjectID,String flowName);*/ - void updateFlowBaseInfo(DWSFlow dwsFlow, Long projectVersionID, Long taxonomyID) throws DSSErrorException; + void updateFlowBaseInfo(DSSFlow dssFlow, Long projectVersionID, Long taxonomyID) throws DSSErrorException; void updateFlowTaxonomyRelation(Long flowID, Long taxonomyID) throws 
DSSErrorException; @@ -72,7 +68,7 @@ public interface DWSFlowService { Integer getParentRank(Long flowID); - DWSFlowVersion getLatestVersionByFlowIDAndProjectVersionID(Long flowID, Long projectVersionID); + DSSFlowVersion getLatestVersionByFlowIDAndProjectVersionID(Long flowID, Long projectVersionID); Long getParentFlowID(Long id); } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSFlowTaxonomyService.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSFlowTaxonomyService.java similarity index 71% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSFlowTaxonomyService.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSFlowTaxonomyService.java index db0fcf3a8e78ae76cd279c85574e9026cd6f949f..f105779a359c78714a327be2d9739771fbab4c48 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSFlowTaxonomyService.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSFlowTaxonomyService.java @@ -18,22 +18,22 @@ package com.webank.wedatasphere.dss.server.service; -import com.webank.wedatasphere.dss.server.entity.DWSFlowTaxonomy; +import com.webank.wedatasphere.dss.server.entity.DSSFlowTaxonomy; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import java.util.List; -public interface DWSFlowTaxonomyService { - DWSFlowTaxonomy getFlowTaxonomyByID(Long id); +public interface DSSFlowTaxonomyService { + DSSFlowTaxonomy getFlowTaxonomyByID(Long id); //--------- - List<DWSFlowTaxonomy> listAllFlowTaxonomy(Long projectVersionID, Boolean isRootFlow); - List<DWSFlowTaxonomy> listFlowTaxonomy(Long projectVersionID, Long flowTaxonomyID, Boolean isRootFlow); + List<DSSFlowTaxonomy> listAllFlowTaxonomy(Long projectVersionID, Boolean isRootFlow); + List<DSSFlowTaxonomy> listFlowTaxonomy(Long projectVersionID, Long flowTaxonomyID, Boolean isRootFlow); - void addFlowTaxonomy(DWSFlowTaxonomy dwsFlowTaxonomy, Long projectVersionID) throws DSSErrorException; + 
void addFlowTaxonomy(DSSFlowTaxonomy dssFlowTaxonomy, Long projectVersionID) throws DSSErrorException; - void updateFlowTaxonomy(DWSFlowTaxonomy dwsFlowTaxonomy, Long projectVersionID) throws DSSErrorException; + void updateFlowTaxonomy(DSSFlowTaxonomy dssFlowTaxonomy, Long projectVersionID) throws DSSErrorException; boolean hasFlows(Long flowTaxonomyID); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSNodeInfoService.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSNodeInfoService.java similarity index 95% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSNodeInfoService.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSNodeInfoService.java index 846c69f4b0d0b1d6714e918a94dd4ecf85ad9795..dc7303f52f8f74dcde5f23697965c684821e752c 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSNodeInfoService.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSNodeInfoService.java @@ -26,7 +26,7 @@ import com.webank.wedatasphere.dss.server.entity.NodeInfo; import java.util.List; -public interface DWSNodeInfoService { +public interface DSSNodeInfoService { List getNodeType(String f); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSProjectService.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSProjectService.java similarity index 70% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSProjectService.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSProjectService.java index 1f40d208877eaa43bf5bdb37246bf31077930b03..1cad4fd681e30ba791394bbb57b5634da0726209 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSProjectService.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSProjectService.java @@ 
-17,40 +17,38 @@ package com.webank.wedatasphere.dss.server.service; - - import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; -import com.webank.wedatasphere.dss.common.entity.project.DWSProjectPublishHistory; -import com.webank.wedatasphere.dss.common.entity.project.DWSProjectVersion; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProjectPublishHistory; +import com.webank.wedatasphere.dss.common.entity.project.DSSProjectVersion; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; -import com.webank.wedatasphere.dss.common.protocol.RequestDWSProject; +import com.webank.wedatasphere.dss.common.protocol.RequestDSSProject; import java.util.Date; import java.util.List; -public interface DWSProjectService { +public interface DSSProjectService { - DWSProject getProjectByID(Long id); + DSSProject getProjectByID(Long id); - Long addProject(String userName, String name, String description, Long taxonomyID,String product,Integer applicationArea,String business) throws DSSErrorException, AppJointErrorException; + Long addProject(String userName, String name, String description, Long taxonomyID,String product,Integer applicationArea,String business, Long workspaceId) throws DSSErrorException, AppJointErrorException; void updateProject(long projectID, String name, String description, String userName , String product ,Integer applicationArea ,String business) throws AppJointErrorException; void deleteProject(long projectID, String userName, Boolean ifDelScheduler) throws DSSErrorException; - DWSProject getLatestVersionProject(Long projectID); + DSSProject getLatestVersionProject(Long projectID); - DWSProject getProjectByProjectVersionID(Long projectVersionID); + DSSProject getProjectByProjectVersionID(Long projectVersionID); Boolean isPublished(Long projectID); - List<DWSProjectVersion> 
listAllProjectVersions(Long projectID); + List<DSSProjectVersion> listAllProjectVersions(Long projectID); Long copyProject(Long projectVersionID, Long projectID, String projectName, String userName) throws DSSErrorException, InterruptedException, AppJointErrorException; - void copyProjectVersionMax(Long projectVersionID, DWSProjectVersion maxVersion, DWSProjectVersion copyVersion, String userName, Long WTSSprojectID) throws DSSErrorException, InterruptedException; + void copyProjectVersionMax(Long projectVersionID, DSSProjectVersion maxVersion, DSSProjectVersion copyVersion, String userName, Long WTSSprojectID) throws DSSErrorException, InterruptedException; void publish(Long projectVersionID, String userName, String comment) throws DSSErrorException, InterruptedException, AppJointErrorException; @@ -58,9 +56,9 @@ public interface DWSProjectService { void updatePublishHistory(Long projectVersionID, Integer status, Date updateTime); - DWSProjectPublishHistory getPublishHistoryByID(Long projectVersionID); + DSSProjectPublishHistory getPublishHistoryByID(Long projectVersionID); - DWSProject getExecutionDWSProject(RequestDWSProject requestDWSProject) throws DSSErrorException; + DSSProject getExecutionDSSProject(RequestDSSProject requestDSSProject) throws DSSErrorException; Long getAppjointProjectID(Long projectID, String nodeType); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSProjectTaxonomyService.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSProjectTaxonomyService.java similarity index 67% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSProjectTaxonomyService.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSProjectTaxonomyService.java index 77161fac66ca08cd285df3688f8d7fe837e4e48f..cebabbf63b46864b03cce2ada8acdafab5c1db93 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSProjectTaxonomyService.java +++ 
b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSProjectTaxonomyService.java @@ -17,29 +17,26 @@ package com.webank.wedatasphere.dss.server.service; - - - -import com.webank.wedatasphere.dss.server.entity.DWSProjectTaxonomy; +import com.webank.wedatasphere.dss.server.entity.DSSProjectTaxonomy; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import java.util.List; -public interface DWSProjectTaxonomyService { +public interface DSSProjectTaxonomyService { - DWSProjectTaxonomy getProjectTaxonomyByID(Long id); + DSSProjectTaxonomy getProjectTaxonomyByID(Long id); - List<DWSProjectTaxonomy> listProjectTaxonomyByUser(String userName); + List<DSSProjectTaxonomy> listProjectTaxonomyByUser(String userName); //---------------------- - List<DWSProjectTaxonomy> listAllProjectTaxonomy(String userName); + List<DSSProjectTaxonomy> listAllProjectTaxonomy(String userName, Long workspaceId); - List<DWSProjectTaxonomy> listProjectTaxonomy(Long taxonomyID, String userName); + List<DSSProjectTaxonomy> listProjectTaxonomy(Long taxonomyID, String userName); - void addProjectTaxonomy(DWSProjectTaxonomy dwsProjectTaxonomy) throws DSSErrorException; + void addProjectTaxonomy(DSSProjectTaxonomy dssProjectTaxonomy) throws DSSErrorException; - void updateProjectTaxonomy(DWSProjectTaxonomy dwsProjectTaxonomy) throws DSSErrorException; + void updateProjectTaxonomy(DSSProjectTaxonomy dssProjectTaxonomy) throws DSSErrorException; boolean hasProjects(Long projectTaxonomyID); diff --git a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/DSSUserService.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSUserService.java similarity index 67% rename from dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/DSSUserService.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSUserService.java index 3333fd4ffe79007b92588be7d759a4d24c5cc1b1..2c2760297cd1f264fb19132cc64fcdc06a01939e 100644 --- 
a/dss-application/src/main/java/com/webank/wedatasphere/dss/application/service/DSSUserService.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSUserService.java @@ -15,18 +15,9 @@ * */ -package com.webank.wedatasphere.dss.application.service; +package com.webank.wedatasphere.dss.server.service; -import com.webank.wedatasphere.dss.application.entity.DSSUser; -/** - * Created by chaogefeng on 2019/10/11. - */ public interface DSSUserService { - - DSSUser getUserByName(String username); - - void registerDSSUser(DSSUser userDb); - - void updateUserFirstLogin(Long id); + Long getUserID(String userName); } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSWorkspaceService.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSWorkspaceService.java new file mode 100644 index 0000000000000000000000000000000000000000..f88c221ea2e021b89ebcbfbe026d1ea4a6bcbdb5 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DSSWorkspaceService.java @@ -0,0 +1,63 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + +package com.webank.wedatasphere.dss.server.service; + +import com.webank.wedatasphere.dss.server.dto.response.HomepageDemoMenuVo; +import com.webank.wedatasphere.dss.server.dto.response.HomepageVideoVo; +import com.webank.wedatasphere.dss.server.dto.response.OnestopMenuVo; +import com.webank.wedatasphere.dss.server.entity.DSSWorkspace; +import com.webank.wedatasphere.dss.server.dto.response.WorkspaceDepartmentVo; +import com.webank.wedatasphere.dss.server.dto.response.*; + +import java.util.List; + +/** + * Created by schumiyi on 2020/6/22 + */ +public interface DSSWorkspaceService { + List<DSSWorkspace> getWorkspaces(); + + Long addWorkspace(String userName, String name, String department, String label, String description); + + boolean existWorkspaceName(String name); + + List<WorkspaceDepartmentVo> getWorkSpaceDepartments(); + + List<HomepageDemoMenuVo> getHomepageDemos(boolean isChinese); + + List<HomepageVideoVo> getHomepageVideos(boolean isChinese); + + List<OnestopMenuVo> getWorkspaceManagements(Long workspaceId, String username, boolean isChinese); + + List<OnestopMenuVo> getWorkspaceApplications(Long workspaceId, String username, boolean isChinese); + + DSSWorkspace getWorkspacesById(Long id); + + /** + * Query the user's favorite applications; for a new user, insert two default favorites into the database: Scriptis and Workflow + * @param workspaceId + * @param username + * @param isChinese + * @return + */ + List getWorkspaceFavorites(Long workspaceId, String username, boolean isChinese); + + Long addFavorite(String username, Long workspaceId, Long menuApplicationId); + + Long deleteFavorite(String username, Long favouritesId); +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSFlowServiceImpl.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSFlowServiceImpl.java similarity index 70% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSFlowServiceImpl.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSFlowServiceImpl.java index 
c4ce9228a3f2dd60219fd9b6e9895b4eb48ef060..218aacc449b72b5edb6ccd28d13edd8e760e54a1 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSFlowServiceImpl.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSFlowServiceImpl.java @@ -19,18 +19,18 @@ package com.webank.wedatasphere.dss.server.service.impl; import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlowVersion; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlowVersion; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; -import com.webank.wedatasphere.dss.server.dao.DWSUserMapper; +import com.webank.wedatasphere.dss.server.dao.DSSUserMapper; import com.webank.wedatasphere.dss.server.dao.FlowMapper; import com.webank.wedatasphere.dss.server.dao.FlowTaxonomyMapper; import com.webank.wedatasphere.dss.server.lock.Lock; import com.webank.wedatasphere.dss.server.operate.Op; import com.webank.wedatasphere.dss.server.operate.Operate; import com.webank.wedatasphere.dss.server.service.BMLService; -import com.webank.wedatasphere.dss.server.service.DWSFlowService; -import com.webank.wedatasphere.dss.server.service.DWSProjectService; +import com.webank.wedatasphere.dss.server.service.DSSFlowService; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; @@ -43,7 +43,7 @@ import java.util.stream.Collectors; @Service -public class DWSFlowServiceImpl implements DWSFlowService { +public class DSSFlowServiceImpl implements DSSFlowService { private Logger logger = LoggerFactory.getLogger(this.getClass()); @@ -52,23 +52,23 @@ public class DWSFlowServiceImpl implements DWSFlowService { 
@Autowired private FlowTaxonomyMapper flowTaxonomyMapper; @Autowired - private DWSUserMapper dwsUserMapper; + private DSSUserMapper dssUserMapper; @Autowired - private DWSProjectService projectService; + private DSSProjectService projectService; @Autowired private Operate[] operates; @Autowired private BMLService bmlService; @Override - public DWSFlow getFlowByID(Long id) { + public DSSFlow getFlowByID(Long id) { return flowMapper.selectFlowByID(id); } @Override - public List<DWSFlowVersion> listAllFlowVersions(Long flowID, Long projectVersionID) { - List<DWSFlowVersion> versions = flowMapper.listFlowVersionsByFlowID(flowID, projectVersionID).stream().sorted((v1, v2) -> { + public List<DSSFlowVersion> listAllFlowVersions(Long flowID, Long projectVersionID) { + List<DSSFlowVersion> versions = flowMapper.listFlowVersionsByFlowID(flowID, projectVersionID).stream().sorted((v1, v2) -> { return v1.compareTo(v2); }).collect(Collectors.toList()); return versions; @@ -77,76 +77,76 @@ public class DWSFlowServiceImpl implements DWSFlowService { @Lock @Transactional(rollbackFor = DSSErrorException.class) @Override - public DWSFlow addRootFlow(DWSFlow dwsFlow, Long taxonomyID, Long projectVersionID) throws DSSErrorException { + public DSSFlow addRootFlow(DSSFlow dssFlow, Long taxonomyID, Long projectVersionID) throws DSSErrorException { try { - flowMapper.insertFlow(dwsFlow); + flowMapper.insertFlow(dssFlow); } catch (DuplicateKeyException e) { logger.info(e.getMessage()); throw new DSSErrorException(90003, "工作流名不能重复"); } // Upload an empty file via the resource API to obtain the resourceId (jsonPath) and version - Map<String, Object> bmlReturnMap = bmlService.upload(dwsUserMapper.getuserName(dwsFlow.getCreatorID()), "", UUID.randomUUID().toString() + ".json"); + Map<String, Object> bmlReturnMap = bmlService.upload(dssUserMapper.getuserName(dssFlow.getCreatorID()), "", UUID.randomUUID().toString() + ".json"); // Insert the version record into the database - DWSFlowVersion version = new DWSFlowVersion(); - version.setFlowID(dwsFlow.getId()); + DSSFlowVersion version = new DSSFlowVersion(); + version.setFlowID(dssFlow.getId());
version.setSource("create by user"); version.setJsonPath(bmlReturnMap.get("resourceId").toString()); version.setVersion(bmlReturnMap.get("version").toString()); version.setUpdateTime(new Date()); - version.setUpdatorID(dwsFlow.getCreatorID()); + version.setUpdatorID(dssFlow.getCreatorID()); // TODO: 2019/6/12 This should be passed in from the front end version.setProjectVersionID(projectVersionID); flowMapper.insertFlowVersion(version); // Insert the taxonomy relation into the database - flowTaxonomyMapper.insertFlowTaxonomyRelation(taxonomyID, dwsFlow.getId()); - return dwsFlow; + flowTaxonomyMapper.insertFlowTaxonomyRelation(taxonomyID, dssFlow.getId()); + return dssFlow; } @Lock @Transactional(rollbackFor = DSSErrorException.class) @Override - public DWSFlow addSubFlow(DWSFlow dwsFlow, Long parentFlowID, Long projectVersionID) throws DSSErrorException { + public DSSFlow addSubFlow(DSSFlow dssFlow, Long parentFlowID, Long projectVersionID) throws DSSErrorException { Long taxonomyID = flowTaxonomyMapper.selectTaxonomyIDByFlowID(parentFlowID); - DWSFlow parentFlow = flowMapper.selectFlowByID(parentFlowID); - dwsFlow.setProjectID(parentFlow.getProjectID()); - DWSFlow subFlow = addRootFlow(dwsFlow, taxonomyID, projectVersionID); + DSSFlow parentFlow = flowMapper.selectFlowByID(parentFlowID); + dssFlow.setProjectID(parentFlow.getProjectID()); + DSSFlow subFlow = addRootFlow(dssFlow, taxonomyID, projectVersionID); // Insert the flow relation into the database flowMapper.insertFlowRelation(subFlow.getId(), parentFlowID); return subFlow; } @Override - public DWSFlow getLatestVersionFlow(Long flowID, Long projectVersionID) throws DSSErrorException { - DWSFlow dwsFlow = getFlowByID(flowID); - DWSFlowVersion dwsFlowVersion = getLatestVersionByFlowIDAndProjectVersionID(flowID, projectVersionID); - if (dwsFlowVersion == null) throw new DSSErrorException(90011, "该工作流已经被删除,请重新创建"); - String userName = dwsUserMapper.getuserName(dwsFlowVersion.getUpdatorID()); - Map<String, Object> query = bmlService.query(userName, dwsFlowVersion.getJsonPath(), dwsFlowVersion.getVersion()); -
dwsFlowVersion.setJson(query.get("string").toString()); - dwsFlow.setLatestVersion(dwsFlowVersion); - return dwsFlow; + public DSSFlow getLatestVersionFlow(Long flowID, Long projectVersionID) throws DSSErrorException { + DSSFlow dssFlow = getFlowByID(flowID); + DSSFlowVersion dssFlowVersion = getLatestVersionByFlowIDAndProjectVersionID(flowID, projectVersionID); + if (dssFlowVersion == null) throw new DSSErrorException(90011, "该工作流已经被删除,请重新创建"); + String userName = dssUserMapper.getuserName(dssFlowVersion.getUpdatorID()); + Map<String, Object> query = bmlService.query(userName, dssFlowVersion.getJsonPath(), dssFlowVersion.getVersion()); + dssFlowVersion.setJson(query.get("string").toString()); + dssFlow.setLatestVersion(dssFlowVersion); + return dssFlow; } @Override - public DWSFlow getOneVersionFlow(Long flowID, String version, Long projectVersionID) { - DWSFlow dwsFlow = getFlowByID(flowID); - DWSFlowVersion dwsFlowVersion = flowMapper.selectVersionByFlowID(flowID, version, projectVersionID); - String userName = dwsUserMapper.getuserName(dwsFlowVersion.getUpdatorID()); - Map<String, Object> query = bmlService.query(userName, dwsFlowVersion.getJsonPath(), dwsFlowVersion.getVersion()); - dwsFlowVersion.setJson(query.get("string").toString()); - dwsFlow.setFlowVersions(Arrays.asList(dwsFlowVersion)); - dwsFlow.setLatestVersion(dwsFlowVersion); - return dwsFlow; + public DSSFlow getOneVersionFlow(Long flowID, String version, Long projectVersionID) { + DSSFlow dssFlow = getFlowByID(flowID); + DSSFlowVersion dssFlowVersion = flowMapper.selectVersionByFlowID(flowID, version, projectVersionID); + String userName = dssUserMapper.getuserName(dssFlowVersion.getUpdatorID()); + Map<String, Object> query = bmlService.query(userName, dssFlowVersion.getJsonPath(), dssFlowVersion.getVersion()); + dssFlowVersion.setJson(query.get("string").toString()); + dssFlow.setFlowVersions(Arrays.asList(dssFlowVersion)); + dssFlow.setLatestVersion(dssFlowVersion); + return dssFlow; } /* @Override - public String getLatestJsonByFlow(DWSFlow
dwsFlow) { - DWSFlow latestVersionFlow = getLatestVersionFlow(dwsFlow.getId()); + public String getLatestJsonByFlow(DSSFlow DSSFlow) { + DSSFlow latestVersionFlow = getLatestVersionFlow(DSSFlow.getId()); return latestVersionFlow.getLatestVersion().getJson(); } @Override - public DWSFlow getLatestVersionFlow(Long ProjectID, String flowName) { + public DSSFlow getLatestVersionFlow(Long ProjectID, String flowName) { Long flowID = flowMapper.selectFlowIDByProjectIDAndFlowName(ProjectID, flowName); return getLatestVersionFlow(flowID); }*/ @@ -154,22 +154,22 @@ public class DWSFlowServiceImpl implements DWSFlowService { @Lock @Transactional(rollbackFor = DSSErrorException.class) @Override - public void updateFlowBaseInfo(DWSFlow dwsFlow, Long projectVersionID, Long taxonomyID) throws DSSErrorException { + public void updateFlowBaseInfo(DSSFlow dssFlow, Long projectVersionID, Long taxonomyID) throws DSSErrorException { try { - flowMapper.updateFlowBaseInfo(dwsFlow); + flowMapper.updateFlowBaseInfo(dssFlow); } catch (DuplicateKeyException e) { logger.info(e.getMessage()); throw new DSSErrorException(90003, "工作流名不能重复"); } - if (taxonomyID != null) updateFlowTaxonomyRelation(dwsFlow.getId(), taxonomyID); + if (taxonomyID != null) updateFlowTaxonomyRelation(dssFlow.getId(), taxonomyID); } @Override public void updateFlowTaxonomyRelation(Long flowID, Long taxonomyID) throws DSSErrorException { - DWSFlow dwsFlow = getFlowByID(flowID); + DSSFlow dssFlow = getFlowByID(flowID); Long oldTaxonomyID = flowTaxonomyMapper.selectTaxonomyIDByFlowID(flowID); - if (!dwsFlow.getRootFlow() && (!oldTaxonomyID.equals(taxonomyID))) throw new DSSErrorException(90010, "子工作流不允许更新分类id"); - if (!dwsFlow.getRootFlow() && (oldTaxonomyID.equals(taxonomyID))) return; + if (!dssFlow.getRootFlow() && (!oldTaxonomyID.equals(taxonomyID))) throw new DSSErrorException(90010, "子工作流不允许更新分类id"); + if (!dssFlow.getRootFlow() && (oldTaxonomyID.equals(taxonomyID))) return; // The taxonomy id of the sub-flows must be updated at the same time List<Long> subFlowIDList =
flowMapper.selectSubFlowIDByParentFlowID(flowID); subFlowIDList.add(flowID); @@ -213,17 +213,17 @@ public class DWSFlowServiceImpl implements DWSFlowService { String resourceId = getLatestVersionByFlowIDAndProjectVersionID(flowID, projectVersionID).getJsonPath(); // Upload the file to obtain the resourceId and version; for a save they should already exist Map<String, Object> bmlReturnMap = bmlService.update(userName, resourceId, jsonFlow); - DWSFlowVersion dwsFlowVersion = new DWSFlowVersion(); - dwsFlowVersion.setUpdatorID(dwsUserMapper.getUserID(userName)); - dwsFlowVersion.setUpdateTime(new Date()); - dwsFlowVersion.setVersion(bmlReturnMap.get("version").toString()); - dwsFlowVersion.setJsonPath(resourceId); - dwsFlowVersion.setComment(comment); - dwsFlowVersion.setFlowID(flowID); - dwsFlowVersion.setSource("保存更新"); - dwsFlowVersion.setProjectVersionID(projectVersionID); + DSSFlowVersion dssFlowVersion = new DSSFlowVersion(); + dssFlowVersion.setUpdatorID(dssUserMapper.getUserID(userName)); + dssFlowVersion.setUpdateTime(new Date()); + dssFlowVersion.setVersion(bmlReturnMap.get("version").toString()); + dssFlowVersion.setJsonPath(resourceId); + dssFlowVersion.setComment(comment); + dssFlowVersion.setFlowID(flowID); + dssFlowVersion.setSource("保存更新"); + dssFlowVersion.setProjectVersionID(projectVersionID); // Insert the record into the version table - flowMapper.insertFlowVersion(dwsFlowVersion); + flowMapper.insertFlowVersion(dssFlowVersion); return bmlReturnMap.get("version").toString(); } @@ -233,8 +233,8 @@ public class DWSFlowServiceImpl implements DWSFlowService { } @Override - public DWSFlowVersion getLatestVersionByFlowIDAndProjectVersionID(Long flowID, Long projectVersionID) { - List<DWSFlowVersion> versions = flowMapper.listVersionByFlowIDAndProjectVersionID(flowID, projectVersionID) + public DSSFlowVersion getLatestVersionByFlowIDAndProjectVersionID(Long flowID, Long projectVersionID) { + List<DSSFlowVersion> versions = flowMapper.listVersionByFlowIDAndProjectVersionID(flowID, projectVersionID) .stream().sorted((v1, v2) -> { return v1.compareTo(v2); }).collect(Collectors.toList());
@@ -262,21 +262,21 @@ public class DWSFlowServiceImpl implements DWSFlowService { deleteFlow(subFlowID, projectVersionID); } for (Long subFlowID : subFlowIDs) { - deleteDWSDB(subFlowID, projectVersionID); + deleteDSSDB(subFlowID, projectVersionID); // TODO: 2019/6/5 deletion of workflows already published to wtss? // TODO: 2019/6/5 deletion of the resources in the json // TODO: 2019/6/5 transaction guarantees } - deleteDWSDB(flowId, projectVersionID); + deleteDSSDB(flowId, projectVersionID); } - private void deleteDWSDB(Long flowID, Long projectVersionID) { + private void deleteDSSDB(Long flowID, Long projectVersionID) { flowMapper.deleteFlowVersions(flowID, projectVersionID); if (projectVersionID == null || (flowMapper.noVersions(flowID) != null && flowMapper.noVersions(flowID))) { flowMapper.deleteFlowBaseInfo(flowID); flowMapper.deleteFlowRelation(flowID); flowTaxonomyMapper.deleteFlowTaxonomyRelation(flowID); } - // Phase one has no workflow publishing, so the dws workflow publish table need not be deleted + // Phase one has no workflow publishing, so the DSS workflow publish table need not be deleted } } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSFlowTaxonomyServiceImpl.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSFlowTaxonomyServiceImpl.java similarity index 58% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSFlowTaxonomyServiceImpl.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSFlowTaxonomyServiceImpl.java index 6d2d4d56b1772d1b67e5db043249a3cfa9f2e638..0cbba745b2f18f1695bfb7d779034f203f1ad07c 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSFlowTaxonomyServiceImpl.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSFlowTaxonomyServiceImpl.java @@ -17,15 +17,15 @@ package com.webank.wedatasphere.dss.server.service.impl; -import com.webank.wedatasphere.dss.server.service.DWSFlowService; -import com.webank.wedatasphere.dss.server.service.DWSFlowTaxonomyService; -import
com.webank.wedatasphere.dss.server.service.DWSProjectService; +import com.webank.wedatasphere.dss.server.service.DSSFlowService; +import com.webank.wedatasphere.dss.server.service.DSSFlowTaxonomyService; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; import com.webank.wedatasphere.dss.server.dao.FlowMapper; import com.webank.wedatasphere.dss.server.dao.FlowTaxonomyMapper; -import com.webank.wedatasphere.dss.server.entity.DWSFlowTaxonomy; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlowVersion; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; +import com.webank.wedatasphere.dss.server.entity.DSSFlowTaxonomy; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlowVersion; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import com.webank.wedatasphere.dss.server.lock.Lock; import org.slf4j.Logger; @@ -41,7 +41,7 @@ import java.util.stream.Collectors; @Service -public class DWSFlowTaxonomyServiceImpl implements DWSFlowTaxonomyService { +public class DSSFlowTaxonomyServiceImpl implements DSSFlowTaxonomyService { private Logger logger = LoggerFactory.getLogger(this.getClass()); @Autowired @@ -51,60 +51,60 @@ public class DWSFlowTaxonomyServiceImpl implements DWSFlowTaxonomyService { private FlowMapper flowMapper; @Autowired - private DWSProjectService projectService; + private DSSProjectService projectService; @Autowired - private DWSFlowService flowService; + private DSSFlowService flowService; @Override - public DWSFlowTaxonomy getFlowTaxonomyByID(Long id) { + public DSSFlowTaxonomy getFlowTaxonomyByID(Long id) { return flowTaxonomyMapper.selectFlowTaxonomyByID(id); } - private List<DWSFlow> listFlowByTaxonomyID(Long projectID, Long taxonomyID, Boolean isRootFlow){ + private List<DSSFlow>
listFlowByTaxonomyID(Long projectID, Long taxonomyID, Boolean isRootFlow){ return flowMapper.listFlowByTaxonomyID(projectID,taxonomyID,isRootFlow); } // TODO: 2019/5/16 JSON serialization of the cascaded query result seems broken; query in a loop for now @Override - public List<DWSFlowTaxonomy> listAllFlowTaxonomy(Long projectVersionID,Boolean isRootFlow) { - DWSProject dwsProject = projectService.getProjectByProjectVersionID(projectVersionID); - List<DWSFlowTaxonomy> dwsFlowTaxonomies = listFlowTaxonomyByProjectID(dwsProject.getId()); - for (DWSFlowTaxonomy dwsFlowTaxonomy : dwsFlowTaxonomies) { - List<DWSFlow> dwsFlowList = listFlowByTaxonomyID(dwsProject.getId(),dwsFlowTaxonomy.getId(),isRootFlow); - for (DWSFlow dwsFlow : dwsFlowList) { - DWSFlowVersion version = flowService.getLatestVersionByFlowIDAndProjectVersionID(dwsFlow.getId(),projectVersionID); - dwsFlow.setLatestVersion(version); + public List<DSSFlowTaxonomy> listAllFlowTaxonomy(Long projectVersionID, Boolean isRootFlow) { + DSSProject dssProject = projectService.getProjectByProjectVersionID(projectVersionID); + List<DSSFlowTaxonomy> dwsFlowTaxonomies = listFlowTaxonomyByProjectID(dssProject.getId()); + for (DSSFlowTaxonomy dssFlowTaxonomy : dwsFlowTaxonomies) { + List<DSSFlow> dssFlowList = listFlowByTaxonomyID(dssProject.getId(), dssFlowTaxonomy.getId(),isRootFlow); + for (DSSFlow dssFlow : dssFlowList) { + DSSFlowVersion version = flowService.getLatestVersionByFlowIDAndProjectVersionID(dssFlow.getId(),projectVersionID); + dssFlow.setLatestVersion(version); } - dwsFlowTaxonomy.setDwsFlowList(dwsFlowList.stream().filter(f ->f.getLatestVersion() !=null).collect(Collectors.toList())); + dssFlowTaxonomy.setDssFlowList(dssFlowList.stream().filter(f ->f.getLatestVersion() !=null).collect(Collectors.toList())); } return dwsFlowTaxonomies; } - private List<DWSFlowTaxonomy> listFlowTaxonomyByProjectID(Long projectID) { + private List<DSSFlowTaxonomy> listFlowTaxonomyByProjectID(Long projectID) { return flowTaxonomyMapper.listFlowTaxonomyByProjectID(projectID); } // With the projectID it should also be possible to filter the flows under "my flow taxonomy" across different users @Override - public List<DWSFlowTaxonomy> listFlowTaxonomy(Long projectVersionID,Long flowTaxonomyID,
Boolean isRootFlow) { - DWSFlowTaxonomy flowTaxonomy = getFlowTaxonomyByID(flowTaxonomyID); - DWSProject dwsProject = projectService.getProjectByProjectVersionID(projectVersionID); - List<DWSFlow> dwsFlowList = listFlowByTaxonomyID(dwsProject.getId(),flowTaxonomyID,isRootFlow); - for (DWSFlow dwsFlow : dwsFlowList) { - DWSFlowVersion version = flowService.getLatestVersionByFlowIDAndProjectVersionID(dwsFlow.getId(),projectVersionID); - dwsFlow.setLatestVersion(version); + public List<DSSFlowTaxonomy> listFlowTaxonomy(Long projectVersionID, Long flowTaxonomyID, Boolean isRootFlow) { + DSSFlowTaxonomy flowTaxonomy = getFlowTaxonomyByID(flowTaxonomyID); + DSSProject dssProject = projectService.getProjectByProjectVersionID(projectVersionID); + List<DSSFlow> dssFlowList = listFlowByTaxonomyID(dssProject.getId(),flowTaxonomyID,isRootFlow); + for (DSSFlow dssFlow : dssFlowList) { + DSSFlowVersion version = flowService.getLatestVersionByFlowIDAndProjectVersionID(dssFlow.getId(),projectVersionID); + dssFlow.setLatestVersion(version); } - flowTaxonomy.setDwsFlowList(dwsFlowList.stream().filter(f ->f.getLatestVersion() !=null).collect(Collectors.toList())); + flowTaxonomy.setDssFlowList(dssFlowList.stream().filter(f ->f.getLatestVersion() !=null).collect(Collectors.toList())); return Arrays.asList(flowTaxonomy); } @Lock @Transactional(rollbackFor = DSSErrorException.class) @Override - public void addFlowTaxonomy(DWSFlowTaxonomy dwsFlowTaxonomy,Long projectVersionID) throws DSSErrorException { + public void addFlowTaxonomy(DSSFlowTaxonomy dssFlowTaxonomy, Long projectVersionID) throws DSSErrorException { try { - flowTaxonomyMapper.insertFlowTaxonomy(dwsFlowTaxonomy); + flowTaxonomyMapper.insertFlowTaxonomy(dssFlowTaxonomy); }catch (DuplicateKeyException e){ logger.info(e.getMessage()); throw new DSSErrorException(90005,"工作流名分类名不能重复"); @@ -114,9 +114,9 @@ public class DWSFlowTaxonomyServiceImpl implements DWSFlowTaxonomyService { @Lock @Transactional(rollbackFor = DSSErrorException.class) @Override - public void
updateFlowTaxonomy(DWSFlowTaxonomy dwsFlowTaxonomy,Long projectVersionID) throws DSSErrorException { + public void updateFlowTaxonomy(DSSFlowTaxonomy dssFlowTaxonomy, Long projectVersionID) throws DSSErrorException { try { - flowTaxonomyMapper.updateFlowTaxonomy(dwsFlowTaxonomy); + flowTaxonomyMapper.updateFlowTaxonomy(dssFlowTaxonomy); }catch (DuplicateKeyException e){ logger.info(e.getMessage()); throw new DSSErrorException(90005,"工作流名分类名不能重复"); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSNodeInfoServiceImpl.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSNodeInfoServiceImpl.java similarity index 85% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSNodeInfoServiceImpl.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSNodeInfoServiceImpl.java index bfbdfa5d510a3408d16046a0970bc042cab0a3a1..0b9571f265df88065dcef79fc872dc4f8d924f13 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSNodeInfoServiceImpl.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSNodeInfoServiceImpl.java @@ -17,10 +17,9 @@ package com.webank.wedatasphere.dss.server.service.impl; -import com.webank.wedatasphere.dss.application.entity.Application; import com.webank.wedatasphere.dss.server.dao.NodeInfoMapper; import com.webank.wedatasphere.dss.server.entity.NodeInfo; -import com.webank.wedatasphere.dss.server.service.DWSNodeInfoService; +import com.webank.wedatasphere.dss.server.service.DSSNodeInfoService; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Service; @@ -28,7 +27,7 @@ import java.util.List; @Service -public class DWSNodeInfoServiceImpl implements DWSNodeInfoService { +public class DSSNodeInfoServiceImpl implements DSSNodeInfoService { @Autowired private NodeInfoMapper nodeInfoMapper; diff --git 
a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSProjectServiceImpl.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSProjectServiceImpl.java similarity index 63% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSProjectServiceImpl.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSProjectServiceImpl.java index 9934d2373c4d64eeac0cada8b051b77ec1e7efb9..9d007cebc5c2f1e2451cff7e9f17f93723f8243f 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSProjectServiceImpl.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSProjectServiceImpl.java @@ -27,25 +27,25 @@ import com.webank.wedatasphere.dss.appjoint.scheduler.tuning.ProjectTuning; import com.webank.wedatasphere.dss.appjoint.service.ProjectService; import com.webank.wedatasphere.dss.application.entity.Application; import com.webank.wedatasphere.dss.application.service.ApplicationService; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlow; -import com.webank.wedatasphere.dss.common.entity.flow.DWSFlowVersion; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; -import com.webank.wedatasphere.dss.common.entity.project.DWSProjectPublishHistory; -import com.webank.wedatasphere.dss.common.entity.project.DWSProjectVersion; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlow; +import com.webank.wedatasphere.dss.common.entity.flow.DSSFlowVersion; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProjectPublishHistory; +import com.webank.wedatasphere.dss.common.entity.project.DSSProjectVersion; import com.webank.wedatasphere.dss.common.entity.project.Project; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; -import 
com.webank.wedatasphere.dss.common.protocol.RequestDWSProject; +import com.webank.wedatasphere.dss.common.protocol.RequestDSSProject; import com.webank.wedatasphere.dss.common.utils.DSSExceptionUtils; import com.webank.wedatasphere.dss.server.constant.DSSServerConstant; import com.webank.wedatasphere.dss.server.dao.*; -import com.webank.wedatasphere.dss.server.entity.DWSFlowTaxonomy; -import com.webank.wedatasphere.dss.server.entity.DWSProjectTaxonomyRelation; +import com.webank.wedatasphere.dss.server.entity.DSSFlowTaxonomy; +import com.webank.wedatasphere.dss.server.entity.DSSProjectTaxonomyRelation; import com.webank.wedatasphere.dss.server.function.FunctionInvoker; import com.webank.wedatasphere.dss.server.lock.Lock; import com.webank.wedatasphere.dss.server.lock.LockEnum; import com.webank.wedatasphere.dss.server.service.BMLService; -import com.webank.wedatasphere.dss.server.service.DWSFlowService; -import com.webank.wedatasphere.dss.server.service.DWSProjectService; +import com.webank.wedatasphere.dss.server.service.DSSFlowService; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; import com.webank.wedatasphere.dss.server.util.ThreadPoolTool; import org.apache.commons.lang.StringUtils; import org.apache.commons.math3.util.Pair; @@ -63,30 +63,23 @@ import java.util.stream.Stream; @Service -public class DWSProjectServiceImpl implements DWSProjectService { +public class DSSProjectServiceImpl implements DSSProjectService { private Logger logger = LoggerFactory.getLogger(this.getClass()); @Autowired private ProjectTaxonomyMapper projectTaxonomyMapper; @Autowired - private DWSUserMapper dwsUserMapper; + private DSSUserMapper dssUserMapper; @Autowired private FlowMapper flowMapper; @Autowired private FlowTaxonomyMapper flowTaxonomyMapper; @Autowired - private DWSFlowService flowService; + private DSSFlowService flowService; @Autowired private ProjectMapper projectMapper; - @Autowired - private SchedulerAppJoint schedulerAppJoint; - @Autowired - 
private ProjectParser projectParser; - @Autowired - private ProjectTuning projectTuning; - @Autowired - private ProjectPublishHook[] projectPublishHooks; + @Autowired private BMLService bmlService; @Autowired @@ -94,30 +87,32 @@ public class DWSProjectServiceImpl implements DWSProjectService { @Autowired private FunctionInvoker functionInvoker; + private SchedulerAppJoint schedulerAppJoint=null; + @Override - public DWSProject getProjectByID(Long id) { + public DSSProject getProjectByID(Long id) { /*JdbcProjectImpl instance = wtssdbConnector.getInjector().getInstance(JdbcProjectImpl.class); Project project = instance.fetchProjectById(id.intValue()); - DWSProject dwsProject = EntityUtils.Project2DWSProject(project);*/ + DSSProject DSSProject = EntityUtils.Project2DSSProject(project);*/ return projectMapper.selectProjectByID(id); } @Transactional(rollbackFor = {DSSErrorException.class,AppJointErrorException.class}) @Override - public Long addProject(String userName, String name, String description, Long taxonomyID,String product,Integer applicationArea,String business) throws DSSErrorException, AppJointErrorException { - DWSProject dwsProject = new DWSProject(); - dwsProject.setUserName(userName); - dwsProject.setName(name); - dwsProject.setDescription(description); + public Long addProject(String userName, String name, String description, Long taxonomyID,String product,Integer applicationArea,String business, Long workspaceId) throws DSSErrorException, AppJointErrorException { + DSSProject dssProject = new DSSProject(); + dssProject.setUserName(userName); + dssProject.setName(name); + dssProject.setDescription(description); // Create the scheduler project if(existSchesulis()){ - createSchedulerProject(dwsProject); + createSchedulerProject(dssProject); } // Create the appjoint project - Map appjointProjectIDAndAppID = createAppjointProject(dwsProject); - Long userID = dwsUserMapper.getUserID(userName); + Map appjointProjectIDAndAppID = createAppjointProject(dssProject); + Long userID =
dssUserMapper.getUserID(userName); // Create DSS's own project - Pair<Long, Long> pair = addDWSProject(userID, name, description,product,applicationArea,business); + Pair<Long, Long> pair = addDSSProject(userID, name, description,product,applicationArea,business, workspaceId); // Add the relations projectTaxonomyMapper.addProjectTaxonomyRelation(pair.getFirst(), taxonomyID, userID); if(!appjointProjectIDAndAppID.isEmpty())projectMapper.addAccessProjectRelation(appjointProjectIDAndAppID,pair.getFirst()); @@ -126,7 +121,7 @@ public class DWSProjectServiceImpl implements DWSProjectService { - private Map createAppjointProject(DWSProject project) throws DSSErrorException, AppJointErrorException { + private Map createAppjointProject(DSSProject project) throws DSSErrorException, AppJointErrorException { Map applicationProjectIDs = new HashMap(); List> pairs = functionInvoker.projectServiceAddFunction(project, ProjectService::createProject, applicationService.listAppjoint()); for (Pair pair : pairs) { @@ -137,32 +132,37 @@ public class DWSProjectServiceImpl implements DWSProjectService { return applicationProjectIDs; } - private Pair<Long, Long> addDWSProject(Long userID, String name, String description,String product,Integer applicationArea,String business) { - DWSProject dwsProject = new DWSProject(); - dwsProject.setUserID(userID); - dwsProject.setName(name); - dwsProject.setDescription(description); - dwsProject.setSource(DSSServerConstant.DWS_PROJECT_SOURCE); - dwsProject.setCreateTime(new Date()); - dwsProject.setCreateBy(userID); - dwsProject.setProduct(product); - dwsProject.setApplicationArea(applicationArea); - dwsProject.setBusiness(business); - projectMapper.addProject(dwsProject); - DWSProjectVersion dwsProjectVersion = new DWSProjectVersion(); - dwsProjectVersion.setComment(DSSServerConstant.DWS_PROJECT_FIRST_VERSION_COMMENT); - dwsProjectVersion.setProjectID(dwsProject.getId()); - dwsProjectVersion.setUpdateTime(new Date()); - dwsProjectVersion.setUpdatorID(userID); -
dwsProjectVersion.setVersion(DSSServerConstant.DWS_PROJECT_FIRST_VERSION); - dwsProjectVersion.setLock(0); - projectMapper.addProjectVersion(dwsProjectVersion); - return new Pair(dwsProject.getId(),dwsProjectVersion.getId()); + private Pair<Long, Long> addDSSProject(Long userID, String name, String description, String product, Integer applicationArea, String business, Long workspaceId) { + DSSProject dssProject = new DSSProject(); + dssProject.setUserID(userID); + dssProject.setName(name); + dssProject.setDescription(description); + dssProject.setSource(DSSServerConstant.DSS_PROJECT_SOURCE); + dssProject.setCreateTime(new Date()); + dssProject.setCreateBy(userID); + dssProject.setProduct(product); + dssProject.setApplicationArea(applicationArea); + dssProject.setBusiness(business); + dssProject.setWorkspaceId(workspaceId); + projectMapper.addProject(dssProject); + DSSProjectVersion dssProjectVersion = new DSSProjectVersion(); + dssProjectVersion.setComment(DSSServerConstant.DSS_PROJECT_FIRST_VERSION_COMMENT); + dssProjectVersion.setProjectID(dssProject.getId()); + dssProjectVersion.setUpdateTime(new Date()); + dssProjectVersion.setUpdatorID(userID); + dssProjectVersion.setVersion(DSSServerConstant.DSS_PROJECT_FIRST_VERSION); + dssProjectVersion.setLock(0); + projectMapper.addProjectVersion(dssProjectVersion); + return new Pair(dssProject.getId(), dssProjectVersion.getId()); } - private void createSchedulerProject(DWSProject dwsProject) throws DSSErrorException { + private void createSchedulerProject(DSSProject dssProject) throws DSSErrorException { try { - functionInvoker.projectServiceAddFunction(dwsProject,ProjectService::createProject,Arrays.asList(schedulerAppJoint)); + if(getSchedulerAppJoint() != null) { + functionInvoker.projectServiceAddFunction(dssProject, ProjectService::createProject, Arrays.asList(getSchedulerAppJoint())); +}else{ + logger.error("Add scheduler project failed for scheduler appjoint is null"); + } } catch (Exception e) { logger.error("add scheduler
project failed,", e); throw new DSSErrorException(90002, "add scheduler project failed" + e.getMessage()); @@ -175,11 +175,15 @@ public class DWSProjectServiceImpl implements DWSProjectService { // The name cannot be modified // Update the description in wtssProject if(!StringUtils.isBlank(description)){ - DWSProject project = getProjectByID(projectID); + DSSProject project = getProjectByID(projectID); project.setUserName(userName); project.setDescription(description); if(existSchesulis()){ - functionInvoker.projectServiceFunction(project,ProjectService::updateProject,Arrays.asList(schedulerAppJoint)); + if(getSchedulerAppJoint() != null) { + functionInvoker.projectServiceFunction(project,ProjectService::updateProject,Arrays.asList(getSchedulerAppJoint())); +}else{ + logger.error("Update scheduler project failed for scheduler appjoint is null"); + } } functionInvoker.projectServiceFunction(project,ProjectService::updateProject,applicationService.listAppjoint()); } @@ -190,11 +194,15 @@ public class DWSProjectServiceImpl implements DWSProjectService { @Override public void deleteProject(long projectID, String userName, Boolean ifDelScheduler) throws DSSErrorException { try { - DWSProject project = getProjectByID(projectID); + DSSProject project = getProjectByID(projectID); project.setUserName(userName); if(ifDelScheduler){ if(existSchesulis()){ - functionInvoker.projectServiceFunction(project,ProjectService::deleteProject,Arrays.asList(schedulerAppJoint)); + if(getSchedulerAppJoint() != null) { + functionInvoker.projectServiceFunction(project, ProjectService::deleteProject, Arrays.asList(getSchedulerAppJoint())); +}else{ + logger.error("Delete scheduler project failed for scheduler appjoint is null"); + } } } functionInvoker.projectServiceFunction(project,ProjectService::deleteProject,applicationService.listAppjoint()); @@ -205,19 +213,19 @@ public class DWSProjectServiceImpl implements DWSProjectService { throw new DSSErrorException(90012, errorMsg); }
flowTaxonomyMapper.deleteFlowTaxonomyByProjectID(projectID); - List<DWSFlow> dwsFlowList = flowMapper.listFlowByProjectID(projectID); - flowService.batchDeleteFlow(dwsFlowList.stream().map(f -> f.getId()).distinct().collect(Collectors.toList()), null); + List<DSSFlow> dssFlowList = flowMapper.listFlowByProjectID(projectID); + flowService.batchDeleteFlow(dssFlowList.stream().map(f -> f.getId()).distinct().collect(Collectors.toList()), null); projectMapper.deleteProjectVersions(projectID); projectMapper.deleteProjectBaseInfo(projectID); projectTaxonomyMapper.deleteProjectTaxonomyRelationByProjectID(projectID); } @Override - public DWSProject getLatestVersionProject(Long projectID) { - DWSProject dwsProject = getProjectByID(projectID); - DWSProjectVersion dwsProjectVersion = projectMapper.selectLatestVersionByProjectID(projectID); - dwsProject.setLatestVersion(dwsProjectVersion); - return dwsProject; + public DSSProject getLatestVersionProject(Long projectID) { + DSSProject dssProject = getProjectByID(projectID); + DSSProjectVersion dssProjectVersion = projectMapper.selectLatestVersionByProjectID(projectID); + dssProject.setLatestVersion(dssProjectVersion); + return dssProject; } /** @@ -227,9 +235,9 @@ public class DWSProjectServiceImpl implements DWSProjectService { * @return */ @Override - public DWSProject getProjectByProjectVersionID(Long projectVersionID) { - DWSProject dwsProject = projectMapper.selectProjectByVersionID(projectVersionID); - return dwsProject; + public DSSProject getProjectByProjectVersionID(Long projectVersionID) { + DSSProject dssProject = projectMapper.selectProjectByVersionID(projectVersionID); + return dssProject; } @Override @@ -238,13 +246,13 @@ public class DWSProjectServiceImpl implements DWSProjectService { } @Override - public List<DWSProjectVersion> listAllProjectVersions(Long projectID) { - List<DWSProjectVersion> dwsProjectVersions = projectMapper.listProjectVersionsByProjectID(projectID); - for (DWSProjectVersion dwsProjectVersion : dwsProjectVersions) { - DWSProjectPublishHistory
publishHistory = projectMapper.selectProjectPublishHistoryByProjectVersionID(dwsProjectVersion.getId()); - dwsProjectVersion.setPublishHistory(publishHistory); + public List<DSSProjectVersion> listAllProjectVersions(Long projectID) { + List<DSSProjectVersion> dssProjectVersions = projectMapper.listProjectVersionsByProjectID(projectID); + for (DSSProjectVersion dssProjectVersion : dssProjectVersions) { + DSSProjectPublishHistory publishHistory = projectMapper.selectProjectPublishHistoryByProjectVersionID(dssProjectVersion.getId()); + dssProjectVersion.setPublishHistory(publishHistory); } - return dwsProjectVersions; + return dssProjectVersions; } @@ -252,53 +260,63 @@ @Transactional(rollbackFor = {DSSErrorException.class, InterruptedException.class,AppJointErrorException.class}) @Override public void publish(Long projectVersionID, String userName, String comment) throws DSSErrorException, InterruptedException, AppJointErrorException { - // TODO: 2019/9/24 the try/catch and json download should be moved into the parser - // 1. Assemble the dwsProject - DWSProject dwsProject = projectMapper.selectProjectByVersionID(projectVersionID); - dwsProject.setUserName(dwsUserMapper.getuserName(dwsProject.getUserID())); - logger.info(userName + "-开始发布工程:" + dwsProject.getName() + "版本ID为:" + projectVersionID); - ArrayList<DWSFlow> dwsFlows = new ArrayList<>(); - List<DWSFlowVersion> dwsFlowVersionList = flowMapper.listLatestRootFlowVersionByProjectVersionID(projectVersionID); - for (DWSFlowVersion dwsFlowVersion : dwsFlowVersionList) { - DWSFlow dwsFlow = flowMapper.selectFlowByID(dwsFlowVersion.getFlowID()); - String json = (String) bmlService.query(userName, dwsFlowVersion.getJsonPath(), dwsFlowVersion.getVersion()).get("string"); - if (!dwsFlow.getHasSaved()) { - logger.info("工作流{}从未保存过,忽略",dwsFlow.getName()); - } else if(StringUtils.isNotBlank(json)){ - dwsFlowVersion.setJson(json); - dwsFlow.setLatestVersion(dwsFlowVersion); - createPublishProject(userName, dwsFlowVersion.getFlowID(), dwsFlow, projectVersionID); -
dwsFlows.add(dwsFlow); - } else { - String warnMsg = String.format(DSSServerConstant.PUBLISH_FLOW_REPORT_FORMATE, dwsFlow.getName(), dwsFlowVersion.getVersion()); - logger.info(warnMsg); - throw new DSSErrorException(90013, warnMsg); + + SchedulerAppJoint schedulerAppJoint = getSchedulerAppJoint(); + if(schedulerAppJoint != null) { + ProjectParser projectParser = schedulerAppJoint.getProjectParser(); + ProjectTuning projectTuning = schedulerAppJoint.getProjectTuning(); + ProjectPublishHook[] projectPublishHooks = schedulerAppJoint.getProjectPublishHooks(); + // TODO: 2019/9/24 the try/catch and json download should be moved into the parser + // 1. Assemble the DSSProject + DSSProject dssProject = projectMapper.selectProjectByVersionID(projectVersionID); + dssProject.setUserName(dssUserMapper.getuserName(dssProject.getUserID())); + logger.info(userName + "-开始发布工程:" + dssProject.getName() + "版本ID为:" + projectVersionID); + ArrayList<DSSFlow> dssFlows = new ArrayList<>(); + List<DSSFlowVersion> dssFlowVersionList = flowMapper.listLatestRootFlowVersionByProjectVersionID(projectVersionID); + for (DSSFlowVersion dssFlowVersion : dssFlowVersionList) { + DSSFlow dssFlow = flowMapper.selectFlowByID(dssFlowVersion.getFlowID()); + String json = (String) bmlService.query(userName, dssFlowVersion.getJsonPath(), dssFlowVersion.getVersion()).get("string"); + if (!dssFlow.getHasSaved()) { + logger.info("工作流{}从未保存过,忽略", dssFlow.getName()); + } else if (StringUtils.isNotBlank(json)) { + dssFlowVersion.setJson(json); + dssFlow.setLatestVersion(dssFlowVersion); + createPublishProject(userName, dssFlowVersion.getFlowID(), dssFlow, projectVersionID); + dssFlows.add(dssFlow); + } else { + String warnMsg = String.format(DSSServerConstant.PUBLISH_FLOW_REPORT_FORMATE, dssFlow.getName(), dssFlowVersion.getVersion()); + logger.info(warnMsg); + throw new DSSErrorException(90013, warnMsg); + } } + if (dssFlows.isEmpty()) throw new DSSErrorException(90007, "该工程没有可以发布的工作流,请检查工作流是否都为空"); + dssProject.setFlows(dssFlows); + // 2. DSSProject assembled; start publishing + SchedulerProject
schedulerProject = projectParser.parseProject(dssProject); + projectTuning.tuningSchedulerProject(schedulerProject); + Stream.of(projectPublishHooks).forEach(DSSExceptionUtils.handling(hook -> hook.prePublish(schedulerProject))); + (schedulerAppJoint.getProjectService()).publishProject(schedulerProject, schedulerAppJoint.getSecurityService().login(userName)); + Stream.of(projectPublishHooks).forEach(DSSExceptionUtils.handling(hook -> hook.postPublish(schedulerProject))); + //3.发布完成后复制工程 + DSSProjectVersion dssProjectVersion = projectMapper.selectProjectVersionByID(projectVersionID); + copyProjectVersionMax(projectVersionID, dssProjectVersion, dssProjectVersion, userName, null); + }else { + logger.error("SchedulerAppJoint is null"); + throw new DSSErrorException(90014, "SchedulerAppJoint is null"); } - if (dwsFlows.isEmpty()) throw new DSSErrorException(90007, "该工程没有可以发布的工作流,请检查工作流是否都为空"); - dwsProject.setFlows(dwsFlows); - //2.封装dwsProject完成,开始发布 - SchedulerProject schedulerProject = projectParser.parseProject(dwsProject); - projectTuning.tuningSchedulerProject(schedulerProject); - Stream.of(projectPublishHooks).forEach(DSSExceptionUtils.handling(hook -> hook.prePublish(schedulerProject))); - (schedulerAppJoint.getProjectService()).publishProject(schedulerProject, schedulerAppJoint.getSecurityService().login(userName)); - Stream.of(projectPublishHooks).forEach(DSSExceptionUtils.handling(hook -> hook.postPublish(schedulerProject))); - //3.发布完成后复制工程 - DWSProjectVersion dwsProjectVersion = projectMapper.selectProjectVersionByID(projectVersionID); - copyProjectVersionMax(projectVersionID, dwsProjectVersion, dwsProjectVersion, userName, null); } @Override public Long createPublishHistory(String comment, Long creatorID, Long projectVersionID) { - DWSProjectPublishHistory dwsProjectPublishHistory = new DWSProjectPublishHistory(); - dwsProjectPublishHistory.setComment(comment); - dwsProjectPublishHistory.setCreateID(creatorID); - dwsProjectPublishHistory.setCreateTime(new 
Date()); - dwsProjectPublishHistory.setUpdateTime(new Date()); - dwsProjectPublishHistory.setProjectVersionID(projectVersionID); - dwsProjectPublishHistory.setState(0); - projectMapper.insertPublishHistory(dwsProjectPublishHistory); - return dwsProjectPublishHistory.getId(); + DSSProjectPublishHistory dssProjectPublishHistory = new DSSProjectPublishHistory(); + dssProjectPublishHistory.setComment(comment); + dssProjectPublishHistory.setCreateID(creatorID); + dssProjectPublishHistory.setCreateTime(new Date()); + dssProjectPublishHistory.setUpdateTime(new Date()); + dssProjectPublishHistory.setProjectVersionID(projectVersionID); + dssProjectPublishHistory.setState(0); + projectMapper.insertPublishHistory(dssProjectPublishHistory); + return dssProjectPublishHistory.getId(); } @Override @@ -307,18 +325,18 @@ public class DWSProjectServiceImpl implements DWSProjectService { } @Override - public DWSProjectPublishHistory getPublishHistoryByID(Long projectVersionID) { + public DSSProjectPublishHistory getPublishHistoryByID(Long projectVersionID) { return projectMapper.selectProjectPublishHistoryByProjectVersionID(projectVersionID); } @Override - public DWSProject getExecutionDWSProject(RequestDWSProject requestDWSProject) throws DSSErrorException { - DWSFlow dwsFlow = flowService.getOneVersionFlow(requestDWSProject.flowId(), requestDWSProject.version(),requestDWSProject.projectVersionId()); - DWSProject dwsProject = projectMapper.selectProjectByVersionID(requestDWSProject.projectVersionId()); - dwsProject.setUserName(dwsUserMapper.getuserName(dwsProject.getUserID())); - DWSFlow returnFlow = recursiveGenerateParentFlow(dwsFlow,requestDWSProject); - dwsProject.setFlows(Arrays.asList(returnFlow)); - return dwsProject; + public DSSProject getExecutionDSSProject(RequestDSSProject requestDSSProject) throws DSSErrorException { + DSSFlow dssFlow = flowService.getOneVersionFlow(requestDSSProject.flowId(), requestDSSProject.version(), requestDSSProject.projectVersionId()); + 
DSSProject dssProject = projectMapper.selectProjectByVersionID(requestDSSProject.projectVersionId()); + dssProject.setUserName(dssUserMapper.getuserName(dssProject.getUserID())); + DSSFlow returnFlow = recursiveGenerateParentFlow(dssFlow, requestDSSProject); + dssProject.setFlows(Arrays.asList(returnFlow)); + return dssProject; } @Override @@ -335,43 +353,43 @@ public class DWSProjectServiceImpl implements DWSProjectService { return appjointProjectID == null? projectID:appjointProjectID; } - private DWSFlow recursiveGenerateParentFlow(DWSFlow dwsFlow,RequestDWSProject requestDWSProject) throws DSSErrorException { - DWSFlow returnFlow = null; - Long parentFlowID = flowService.getParentFlowID(dwsFlow.getId()); + private DSSFlow recursiveGenerateParentFlow(DSSFlow dssFlow, RequestDSSProject requestDSSProject) throws DSSErrorException { + DSSFlow returnFlow = null; + Long parentFlowID = flowService.getParentFlowID(dssFlow.getId()); if(parentFlowID != null){ //对于当前执行的工作流的父工作流,直接找其最新的版本 - DWSFlow parentFlow = flowService.getLatestVersionFlow(parentFlowID,requestDWSProject.projectVersionId()); - parentFlow.setChildren(Arrays.asList(dwsFlow)); - returnFlow = recursiveGenerateParentFlow(parentFlow,requestDWSProject); + DSSFlow parentFlow = flowService.getLatestVersionFlow(parentFlowID, requestDSSProject.projectVersionId()); + parentFlow.setChildren(Arrays.asList(dssFlow)); + returnFlow = recursiveGenerateParentFlow(parentFlow, requestDSSProject); }else { - returnFlow = dwsFlow; + returnFlow = dssFlow; } return returnFlow; } - private void createPublishProject(String userName, Long parentFlowID, DWSFlow dwsFlowParent, Long projectVersionID) throws DSSErrorException { + private void createPublishProject(String userName, Long parentFlowID, DSSFlow dssFlowParent, Long projectVersionID) throws DSSErrorException { List subFlowIDS = flowMapper.selectSubFlowIDByParentFlowID(parentFlowID); - ArrayList dwsFlows = new ArrayList<>(); + ArrayList dssFlows = new ArrayList<>(); for (Long 
subFlowID : subFlowIDS) { - DWSFlowVersion dwsFlowVersion = flowService.getLatestVersionByFlowIDAndProjectVersionID(subFlowID, projectVersionID); - if (dwsFlowVersion != null) { //subFlowIDS通过flow关联查出来的,但是有可能最新版本的project对已有的flows做了删除 - DWSFlow dwsFlow = flowMapper.selectFlowByID(dwsFlowVersion.getFlowID()); - String json = (String) bmlService.query(userName, dwsFlowVersion.getJsonPath(), dwsFlowVersion.getVersion()).get("string"); - if (!dwsFlow.getHasSaved()) { - logger.info("工作流{}从未保存过,忽略",dwsFlow.getName()); + DSSFlowVersion dssFlowVersion = flowService.getLatestVersionByFlowIDAndProjectVersionID(subFlowID, projectVersionID); + if (dssFlowVersion != null) { //subFlowIDS通过flow关联查出来的,但是有可能最新版本的project对已有的flows做了删除 + DSSFlow dssFlow = flowMapper.selectFlowByID(dssFlowVersion.getFlowID()); + String json = (String) bmlService.query(userName, dssFlowVersion.getJsonPath(), dssFlowVersion.getVersion()).get("string"); + if (!dssFlow.getHasSaved()) { + logger.info("工作流{}从未保存过,忽略", dssFlow.getName()); }else if (StringUtils.isNotBlank(json)){ - dwsFlowVersion.setJson(json); - dwsFlow.setLatestVersion(dwsFlowVersion); - createPublishProject(userName, subFlowID, dwsFlow, projectVersionID); - dwsFlows.add(dwsFlow); + dssFlowVersion.setJson(json); + dssFlow.setLatestVersion(dssFlowVersion); + createPublishProject(userName, subFlowID, dssFlow, projectVersionID); + dssFlows.add(dssFlow); } else { - String warnMsg = String.format(DSSServerConstant.PUBLISH_FLOW_REPORT_FORMATE, dwsFlow.getName(), dwsFlowVersion.getVersion()); + String warnMsg = String.format(DSSServerConstant.PUBLISH_FLOW_REPORT_FORMATE, dssFlow.getName(), dssFlowVersion.getVersion()); logger.info(warnMsg); throw new DSSErrorException(90013, warnMsg); } } } - dwsFlowParent.setChildren(dwsFlows); + dssFlowParent.setChildren(dssFlows); } /* @@ -381,17 +399,17 @@ public class DWSProjectServiceImpl implements DWSProjectService { @Transactional(rollbackFor = {DSSErrorException.class, 
InterruptedException.class,AppJointErrorException.class}) @Override public Long copyProject(Long projectVersionID, Long projectID, String projectName, String userName) throws DSSErrorException, InterruptedException, AppJointErrorException { - DWSProject project = projectMapper.selectProjectByID(projectID); + DSSProject project = projectMapper.selectProjectByID(projectID); if (StringUtils.isNotEmpty(projectName)) {project.setName(projectName);} - DWSProjectTaxonomyRelation projectTaxonomyRelation = projectTaxonomyMapper.selectProjectTaxonomyRelationByTaxonomyIdOrProjectId(projectID); + DSSProjectTaxonomyRelation projectTaxonomyRelation = projectTaxonomyMapper.selectProjectTaxonomyRelationByTaxonomyIdOrProjectId(projectID); //添加至wtss的project数据库,获取projectID project.setUserName(userName); if(existSchesulis()){ createSchedulerProject(project); } Map appjointProjectIDAndAppID = createAppjointProject(project); - Long userID = dwsUserMapper.getUserID(userName); - //添加至dws的project数据库,这里的projectID应该不需要自增 + Long userID = dssUserMapper.getUserID(userName); + //添加至DSS的project数据库,这里的projectID应该不需要自增 //目前是相同数据库,需要自增id project.setUserID(userID); project.setCreateTime(new Date()); @@ -399,7 +417,7 @@ public class DWSProjectServiceImpl implements DWSProjectService { projectMapper.addProject(project); if(!appjointProjectIDAndAppID.isEmpty())projectMapper.addAccessProjectRelation(appjointProjectIDAndAppID,project.getId()); projectTaxonomyMapper.addProjectTaxonomyRelation(project.getId(), projectTaxonomyRelation.getTaxonomyId(), userID); - DWSProjectVersion maxVersion = projectMapper.selectLatestVersionByProjectID(projectID); + DSSProjectVersion maxVersion = projectMapper.selectLatestVersionByProjectID(projectID); copyProjectVersionMax(maxVersion.getId(), maxVersion, maxVersion, userName, project.getId()); return project.getId(); } @@ -420,19 +438,19 @@ public class DWSProjectServiceImpl implements DWSProjectService { @Lock(type = LockEnum.ADD) @Transactional(rollbackFor = 
{DSSErrorException.class, InterruptedException.class}) @Override - public void copyProjectVersionMax(Long projectVersionID, DWSProjectVersion maxVersion, DWSProjectVersion copyVersion, String userName, Long WTSSprojectID) throws DSSErrorException, InterruptedException { + public void copyProjectVersionMax(Long projectVersionID, DSSProjectVersion maxVersion, DSSProjectVersion copyVersion, String userName, Long WTSSprojectID) throws DSSErrorException, InterruptedException { // copy project_version String maxVersionNum = generateNewVersion(maxVersion.getVersion()); if (null != WTSSprojectID) { - copyVersion.setVersion(DSSServerConstant.DWS_PROJECT_FIRST_VERSION); + copyVersion.setVersion(DSSServerConstant.DSS_PROJECT_FIRST_VERSION); copyVersion.setProjectID(WTSSprojectID); } else { copyVersion.setVersion(maxVersionNum); } - Long userID = dwsUserMapper.getUserID(userName); + Long userID = dssUserMapper.getUserID(userName); copyVersion.setUpdatorID(userID); copyVersion.setUpdateTime(new Date()); - List flowVersions = flowMapper.listLastFlowVersionsByProjectVersionID(copyVersion.getId()) + List flowVersions = flowMapper.listLastFlowVersionsByProjectVersionID(copyVersion.getId()) .stream().sorted((o1, o2) -> Integer.valueOf(o1.getFlowID().toString()) - Integer.valueOf(o2.getFlowID().toString())) .collect(Collectors.toList()); Long oldProjectVersionID = copyVersion.getId(); @@ -443,20 +461,20 @@ public class DWSProjectServiceImpl implements DWSProjectService { // copy flow if (null != WTSSprojectID) { flowVersions.stream().forEach(f -> { - DWSFlow flow = flowMapper.selectFlowByID(f.getFlowID()); + DSSFlow flow = flowMapper.selectFlowByID(f.getFlowID()); Long parentFlowID = flowMapper.selectParentFlowIDByFlowID(flow.getId()); if (parentFlowID != null) {subAndParentFlowIDMap.put(flow.getId(), parentFlowID);} }); - for (DWSFlowVersion fv : flowVersions) { + for (DSSFlowVersion fv : flowVersions) { // 添加所有父子到map中 - DWSFlow flow = flowMapper.selectFlowByID(fv.getFlowID()); + 
DSSFlow flow = flowMapper.selectFlowByID(fv.getFlowID()); flow.setCreatorID(userID); flow.setName(flow.getName()); flow.setProjectID(copyVersion.getProjectID()); flow.setCreateTime(new Date()); Long taxonomyID = flowTaxonomyMapper.selectTaxonomyIDByFlowID(flow.getId()); - DWSFlowTaxonomy flowTaxonomy = flowTaxonomyMapper.selectFlowTaxonomyByID(taxonomyID); + DSSFlowTaxonomy flowTaxonomy = flowTaxonomyMapper.selectFlowTaxonomyByID(taxonomyID); //新增flow相关数据 fv.setOldFlowID(flow.getId()); flow.setId(null); @@ -476,19 +494,19 @@ public class DWSProjectServiceImpl implements DWSProjectService { if (null != taxonomyID){flowTaxonomyMapper.insertFlowTaxonomyRelation(flowTaxonomy.getId(), flow.getId());} fv.setFlowID(flow.getId()); } - for (DWSFlowVersion fv : flowVersions) { + for (DSSFlowVersion fv : flowVersions) { if (subAndParentFlowIDMap.get(fv.getFlowID()) != null){flowMapper.insertFlowRelation(fv.getFlowID(), subAndParentFlowIDMap.get(fv.getFlowID()));} } } // copy flow_version if (flowVersions.size() > 0) { - ThreadPoolTool tool = new ThreadPoolTool(5, flowVersions); - tool.setCallBack(new ThreadPoolTool.CallBack() { + ThreadPoolTool tool = new ThreadPoolTool(5, flowVersions); + tool.setCallBack(new ThreadPoolTool.CallBack() { @Override - public void method(List flowVersions) { - for (DWSFlowVersion fv : flowVersions) { + public void method(List flowVersions) { + for (DSSFlowVersion fv : flowVersions) { // 工作流版本的json文件,都是需要重新上传到bml - Map bmlQueryMap = bmlService.download(dwsUserMapper.getuserName(fv.getUpdatorID()), fv.getJsonPath(), fv.getVersion()); + Map bmlQueryMap = bmlService.download(dssUserMapper.getuserName(fv.getUpdatorID()), fv.getJsonPath(), fv.getVersion()); BufferedReader bufferedReader = new BufferedReader(new InputStreamReader((InputStream) bmlQueryMap.get("is"))); StringBuilder sb = new StringBuilder(); String s = null; @@ -576,4 +594,16 @@ public class DWSProjectServiceImpl implements DWSProjectService { }).collect(Collectors.toList()); } + + 
private SchedulerAppJoint getSchedulerAppJoint(){ + if(schedulerAppJoint == null){ + try { + schedulerAppJoint = (SchedulerAppJoint)applicationService.getAppjoint("schedulis"); + } catch (AppJointErrorException e) { + logger.error("Schedule system init failed!", e); + } + } + return schedulerAppJoint; + } + } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSProjectTaxonomyServiceImpl.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSProjectTaxonomyServiceImpl.java similarity index 61% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSProjectTaxonomyServiceImpl.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSProjectTaxonomyServiceImpl.java index 1d573c556506222988203305686cf3ef5ab8950f..2151151679f09197ad9e6b654520d698d1f135c6 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSProjectTaxonomyServiceImpl.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSProjectTaxonomyServiceImpl.java @@ -18,11 +18,11 @@ package com.webank.wedatasphere.dss.server.service.impl; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; import com.webank.wedatasphere.dss.server.dao.ProjectTaxonomyMapper; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; -import com.webank.wedatasphere.dss.server.entity.DWSProjectTaxonomy; -import com.webank.wedatasphere.dss.server.service.DWSProjectService; -import com.webank.wedatasphere.dss.server.service.DWSProjectTaxonomyService; +import com.webank.wedatasphere.dss.server.entity.DSSProjectTaxonomy; +import com.webank.wedatasphere.dss.server.service.DSSProjectService; +import com.webank.wedatasphere.dss.server.service.DSSProjectTaxonomyService; import com.webank.wedatasphere.dss.common.exception.DSSErrorException; import org.slf4j.Logger; import org.slf4j.LoggerFactory; @@ -36,22 +36,22 @@ 
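getSchedulerAppJoint() above resolves the "schedulis" appjoint on first use and caches it in the field, leaving it null when the lookup fails so that every caller can guard and log. A minimal standalone sketch of that lazy, cached lookup (`CachedLookup` is a hypothetical name for illustration, not part of this patch):

```java
import java.util.function.Supplier;

class CachedLookup<T> {
    private T cached;
    private final Supplier<T> resolver;

    CachedLookup(Supplier<T> resolver) {
        this.resolver = resolver;
    }

    // Resolve once on first call and cache the result; a failed lookup
    // leaves the cache null so the next call retries, as in the patch.
    T get() {
        if (cached == null) {
            try {
                cached = resolver.get(); // e.g. applicationService.getAppjoint("schedulis")
            } catch (RuntimeException e) {
                // swallowed here, mirroring the patch: the caller sees null and logs
            }
        }
        return cached;
    }
}
```

Note the patch's version (and this sketch) is not thread-safe; two concurrent first calls may both invoke the resolver, which is harmless only if the lookup is idempotent.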
import java.util.List; @Service -public class DWSProjectTaxonomyServiceImpl implements DWSProjectTaxonomyService { +public class DSSProjectTaxonomyServiceImpl implements DSSProjectTaxonomyService { private Logger logger = LoggerFactory.getLogger(this.getClass()); @Autowired private ProjectTaxonomyMapper projectTaxonomyMapper; @Autowired - private DWSProjectService projectService; + private DSSProjectService projectService; @Override - public DWSProjectTaxonomy getProjectTaxonomyByID(Long id) { + public DSSProjectTaxonomy getProjectTaxonomyByID(Long id) { return projectTaxonomyMapper.selectProjectTaxonomyByID(id); } @Override - public List listProjectTaxonomyByUser(String userName) { + public List listProjectTaxonomyByUser(String userName) { return projectTaxonomyMapper.listProjectTaxonomyByUser(userName); } @@ -62,37 +62,40 @@ public class DWSProjectTaxonomyServiceImpl implements DWSProjectTaxonomyService @Override - public List listAllProjectTaxonomy(String userName) { - List dwsProjectTaxonomies = listProjectTaxonomyByUser(userName); - for (DWSProjectTaxonomy dwsProjectTaxonomy : dwsProjectTaxonomies) { - List projectIDs = listProjectIDByTaxonomyID(dwsProjectTaxonomy.getId(), userName); - ArrayList dwsProjectList = new ArrayList<>(); + public List listAllProjectTaxonomy(String userName, Long workspaceId) { + List dssProjectTaxonomies = listProjectTaxonomyByUser(userName); + for (DSSProjectTaxonomy dssProjectTaxonomy : dssProjectTaxonomies) { + List projectIDs = listProjectIDByTaxonomyID(dssProjectTaxonomy.getId(), userName); + ArrayList dssProjectList = new ArrayList<>(); for (Long projectID : projectIDs) { - DWSProject dwsProject = projectService.getLatestVersionProject(projectID); - dwsProjectList.add(dwsProject); + DSSProject dssProject = projectService.getLatestVersionProject(projectID); + // 只选择返回属于这个workspace的project,(某些用户拥有多个workspace的不同project) + if(workspaceId.equals(dssProject.getWorkspaceId())) { + dssProjectList.add(dssProject); + } } - 
dwsProjectTaxonomy.setDwsProjectList(dwsProjectList); + dssProjectTaxonomy.setDssProjectList(dssProjectList); } - return dwsProjectTaxonomies; + return dssProjectTaxonomies; } @Override - public List listProjectTaxonomy(Long taxonomyID, String userName) { - DWSProjectTaxonomy dwsProjectTaxonomy = getProjectTaxonomyByID(taxonomyID); - List projectIDs = listProjectIDByTaxonomyID(dwsProjectTaxonomy.getId(), userName); - ArrayList dwsProjectList = new ArrayList<>(); + public List listProjectTaxonomy(Long taxonomyID, String userName) { + DSSProjectTaxonomy dssProjectTaxonomy = getProjectTaxonomyByID(taxonomyID); + List projectIDs = listProjectIDByTaxonomyID(dssProjectTaxonomy.getId(), userName); + ArrayList dssProjectList = new ArrayList<>(); for (Long projectID : projectIDs) { - DWSProject dwsProject = projectService.getLatestVersionProject(projectID); - dwsProjectList.add(dwsProject); + DSSProject dssProject = projectService.getLatestVersionProject(projectID); + dssProjectList.add(dssProject); } - dwsProjectTaxonomy.setDwsProjectList(dwsProjectList); - return Arrays.asList(dwsProjectTaxonomy); + dssProjectTaxonomy.setDssProjectList(dssProjectList); + return Arrays.asList(dssProjectTaxonomy); } @Override - public void addProjectTaxonomy(DWSProjectTaxonomy dwsProjectTaxonomy) throws DSSErrorException { + public void addProjectTaxonomy(DSSProjectTaxonomy dssProjectTaxonomy) throws DSSErrorException { try { - projectTaxonomyMapper.insertProjectTaxonomy(dwsProjectTaxonomy); + projectTaxonomyMapper.insertProjectTaxonomy(dssProjectTaxonomy); } catch (DuplicateKeyException e) { logger.info(e.getMessage()); throw new DSSErrorException(90004, "工程分类名不能重复"); @@ -100,9 +103,9 @@ public class DWSProjectTaxonomyServiceImpl implements DWSProjectTaxonomyService } @Override - public void updateProjectTaxonomy(DWSProjectTaxonomy dwsProjectTaxonomy) throws DSSErrorException { + public void updateProjectTaxonomy(DSSProjectTaxonomy dssProjectTaxonomy) throws DSSErrorException { try { - 
projectTaxonomyMapper.updateProjectTaxonomy(dwsProjectTaxonomy); + projectTaxonomyMapper.updateProjectTaxonomy(dssProjectTaxonomy); } catch (DuplicateKeyException e) { logger.info(e.getMessage()); throw new DSSErrorException(90004, "工程分类名不能重复"); diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSUserServiceImpl.java b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSUserServiceImpl.java similarity index 71% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSUserServiceImpl.java rename to dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSUserServiceImpl.java index 8c2ec4c72a3edc715bc099ea581fe83be90493b9..60973c98f6974dde7d93ef150aa59fcc858a825f 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DWSUserServiceImpl.java +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSUserServiceImpl.java @@ -18,18 +18,20 @@ package com.webank.wedatasphere.dss.server.service.impl; -import com.webank.wedatasphere.dss.server.dao.DWSUserMapper; -import com.webank.wedatasphere.dss.server.service.DWSUserService; +import com.webank.wedatasphere.dss.application.entity.DSSUser; +import com.webank.wedatasphere.dss.server.dao.DSSUserMapper; +import com.webank.wedatasphere.dss.server.service.DSSUserService; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Service; @Service -public class DWSUserServiceImpl implements DWSUserService { +public class DSSUserServiceImpl implements DSSUserService { @Autowired - private DWSUserMapper dwsUserMapper; + private DSSUserMapper dssUserMapper; + @Override public Long getUserID(String userName) { - return dwsUserMapper.getUserID(userName); + return dssUserMapper.getUserID(userName); } } diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSWorkspaceServiceImpl.java 
b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSWorkspaceServiceImpl.java new file mode 100644 index 0000000000000000000000000000000000000000..19a27befb17732a021933fdc06c248eef1b8c6a4 --- /dev/null +++ b/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/impl/DSSWorkspaceServiceImpl.java @@ -0,0 +1,155 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.server.service.impl; + +import com.webank.wedatasphere.dss.server.constant.DSSServerConstant; +import com.webank.wedatasphere.dss.server.dao.WorkspaceMapper; +import com.webank.wedatasphere.dss.server.dto.response.*; +import com.webank.wedatasphere.dss.server.entity.*; +import com.webank.wedatasphere.dss.server.service.DSSWorkspaceService; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Service; + +import java.util.ArrayList; +import java.util.List; + +/** + * Created by schumiyi on 2020/6/22 + */ +@Service +public class DSSWorkspaceServiceImpl implements DSSWorkspaceService { + + @Autowired + private WorkspaceMapper workspaceMapper; + + @Override + public List getWorkspaces() { + + return workspaceMapper.getWorkspaces(); + } + + @Override + public DSSWorkspace getWorkspacesById(Long id) { + return workspaceMapper.getWorkspaceById(id); + } + + + @Override + public Long addWorkspace(String userName, String name, String department, 
String label, String description) { + DSSWorkspace dssWorkspace = new DSSWorkspace(); + dssWorkspace.setName(name); + dssWorkspace.setDepartment(department); + dssWorkspace.setDescription(description); + dssWorkspace.setLabel(label); + dssWorkspace.setCreateBy(userName); + dssWorkspace.setSource(DSSServerConstant.DSS_WORKSPACE_SOURCE); + dssWorkspace.setLastUpdateUser(userName); + workspaceMapper.addWorkSpace(dssWorkspace); + return dssWorkspace.getId(); + } + + @Override + public boolean existWorkspaceName(String name) { + return !workspaceMapper.findByWorkspaceName(name).isEmpty(); + } + + @Override + public List getWorkSpaceDepartments() { + // TODO: service层和dao层完善 + WorkspaceDepartmentVo dp = new WorkspaceDepartmentVo(); + dp.setId(1L); + dp.setName("应用开发组"); + WorkspaceDepartmentVo di = new WorkspaceDepartmentVo(); + di.setId(2L); + di.setName("平台研发组"); + List departments = new ArrayList<>(); + departments.add(dp); + departments.add(di); + return departments; + } + + @Override + public List getHomepageDemos(boolean isChinese) { + List demoMenuVos = isChinese ? workspaceMapper.getHomepageDemoMenusCn() : workspaceMapper.getHomepageDemoMenusEn(); + for (HomepageDemoMenuVo demoMenuVo : demoMenuVos) { + Long menuId = demoMenuVo.getId(); + List demoInstanceVos = isChinese ? workspaceMapper.getHomepageInstancesByMenuIdCn(menuId) : workspaceMapper.getHomepageInstancesByMenuIdEn(menuId); + demoMenuVo.setDemoInstances(demoInstanceVos); + } + return demoMenuVos; + } + + @Override + public List getHomepageVideos(boolean isChinese) { + return isChinese ? workspaceMapper.getHomepageVideosCn() : workspaceMapper.getHomepageVideosEn(); + } + + @Override + public List getWorkspaceManagements(Long workspaceId, String username, boolean isChinese) { + if (!isAdminUser(workspaceId, username)) { + return new ArrayList<>(); + } + List managementMenuVos = isChinese ? 
workspaceMapper.getManagementMenuCn() : workspaceMapper.getManagementMenuEn(); + return getMenuAppInstances(managementMenuVos, isChinese); + } + + private List getMenuAppInstances(List menuVos, boolean isChinese) { + for (OnestopMenuVo menuVo : menuVos) { + Long menuId = menuVo.getId(); + List menuAppInstanceVos = isChinese ? workspaceMapper.getMenuAppInstancesCn(menuId) : workspaceMapper.getMenuAppInstancesEn(menuId); + menuVo.setAppInstances(menuAppInstanceVos); + } + return menuVos; + } + + @Override + public List getWorkspaceApplications(Long workspaceId, String username, boolean isChinese) { + List applicationMenuVos = isChinese ? workspaceMapper.getApplicationMenuCn() : workspaceMapper.getApplicationMenuEn(); + return getMenuAppInstances(applicationMenuVos, isChinese); + } + + @Override + public List getWorkspaceFavorites(Long workspaceId, String username, boolean isChinese) { + return isChinese ? workspaceMapper.getWorkspaceFavoritesCn(username, workspaceId) : workspaceMapper.getWorkspaceFavoritesEn(username, workspaceId); + } + + @Override + public Long addFavorite(String username, Long workspaceId, Long menuApplicationId) { + DSSFavorite dssFavorite = new DSSFavorite(); + dssFavorite.setUsername(username); + dssFavorite.setWorkspaceId(workspaceId); + dssFavorite.setMenuApplicationId(menuApplicationId); + // todo: order will from the front end + dssFavorite.setOrder(1); + dssFavorite.setCreateBy(username); + dssFavorite.setLastUpdateUser(username); + workspaceMapper.addFavorite(dssFavorite); + return dssFavorite.getId(); + } + + @Override + public Long deleteFavorite(String username, Long favouritesId) { + workspaceMapper.deleteFavorite(favouritesId); + return favouritesId; + } + + private boolean isAdminUser(Long workspaceId, String username) { + DSSWorkspace workspace = workspaceMapper.getWorkspaceById(workspaceId); + return username != null && workspace != null && username.equals(workspace.getCreateBy()); + } +} diff --git 
a/dss-server/src/main/resources/default/AddschedulerUser.sh b/dss-server/src/main/resources/default/AddschedulerUser.sh new file mode 100644 index 0000000000000000000000000000000000000000..116c2d96b8958c7c507270f46c268e1e3d6906c9 --- /dev/null +++ b/dss-server/src/main/resources/default/AddschedulerUser.sh @@ -0,0 +1,13 @@ +#!/bin/bash + +user=$1 +password=$2 +installDir=$3 + + +if grep -qi "^${user}=" "$installDir/token.properties"; + then + sed -i "s/^$user=.*/$user=$password/" "$installDir/token.properties" +else + echo "$user=$password" >> "$installDir/token.properties" +fi diff --git a/dss-server/src/main/resources/default/CreateLdapAccount.sh b/dss-server/src/main/resources/default/CreateLdapAccount.sh new file mode 100644 index 0000000000000000000000000000000000000000..1318ab16b7cfcbf42e818795650e77ba4bf10fed --- /dev/null +++ b/dss-server/src/main/resources/default/CreateLdapAccount.sh @@ -0,0 +1,18 @@ +#!/bin/bash +source /etc/profile +server_host=$1 +server_login_user=$2 +server_login_password=$3 +server_python_path=$4 +server_ldap_source_path=$5 +ldap_user=$6 +ldap_password=$7 +echo "$server_login_password ssh $server_login_user@$server_host sudo python $server_python_path add_with_pw $ldap_user -p $ldap_password" +sshpass -p $server_login_password ssh $server_login_user@$server_host "sudo python $server_python_path add_with_pw $ldap_user -p $ldap_password" +#sshpass -p $server_login_password ssh $server_login_user@$server_host "sudo su - root -c 'source /etc/profile && source $server_ldap_source_path && sudo python $server_python_path add_with_pw $ldap_user -p $ldap_password && deactivate'" + +echo "******************LDAP USER CREATED***********************" + + + + diff --git a/dss-server/src/main/resources/default/CreateLinuxUser.sh b/dss-server/src/main/resources/default/CreateLinuxUser.sh new file mode 100644 index 0000000000000000000000000000000000000000..fabcac1c5ae7990a22638f67c9826b8a78e630ed --- /dev/null +++
b/dss-server/src/main/resources/default/CreateLinuxUser.sh @@ -0,0 +1,35 @@ +#!/bin/bash + +source /etc/profile +server_user_strs=$1 +echo $server_user_strs +add_user_name=$2 +add_user_password=$3 +server_user_array=(${server_user_strs//,/ }) +for server_user_str in ${server_user_array[@]} +do + server_user_info=(${server_user_str//#/ }) + server_host=${server_user_info[0]} + server_user_name=${server_user_info[1]} + server_user_password=${server_user_info[2]} + echo "${server_host},${server_user_name},${server_user_password}" + + sudo sshpass -p $server_user_password ssh -o ConnectTimeout=1 $server_user_name@$server_host "echo success" + [ $? -ne 0 ] && echo "Failed to log in to host ${server_host}" && exit 254 +done + +echo "************Server connectivity check passed, creating the user*****************" + +for server_user_str in ${server_user_array[@]} +do + + server_user_info=(${server_user_str//#/ }) + server_host=${server_user_info[0]} + server_user_name=${server_user_info[1]} + server_user_password=${server_user_info[2]} + + #sshpass -p $server_user_password ssh $server_user_name@$server_host "sudo useradd $add_user_name && echo $add_user_password |sudo -i passwd --stdin $add_user_name" + sshpass -p $server_user_password ssh $server_user_name@$server_host "sudo useradd $add_user_name -s /sbin/nologin" + + [ $? 
-ne 0 ] && echo "Failed to create user on host ${server_host}" && exit 254 +done diff --git a/dss-server/src/main/resources/default/HdfsPath.sh b/dss-server/src/main/resources/default/HdfsPath.sh new file mode 100644 index 0000000000000000000000000000000000000000..cd765486c508cba04b42baa2a17a13b285c93946 --- /dev/null +++ b/dss-server/src/main/resources/default/HdfsPath.sh @@ -0,0 +1,6 @@ +#!/bin/bash + +user=$1 +dir=$2 +hdfs dfs -mkdir -p $dir +hdfs dfs -chown $user:$user $dir \ No newline at end of file diff --git a/dss-server/src/main/resources/default/LinuxPath.sh b/dss-server/src/main/resources/default/LinuxPath.sh new file mode 100644 index 0000000000000000000000000000000000000000..bc55860b1037a0fd4497d3e9b12f9dccb117e190 --- /dev/null +++ b/dss-server/src/main/resources/default/LinuxPath.sh @@ -0,0 +1,7 @@ +#!/bin/bash + +user=$1 +dir=$2 +echo $1 $2; +sudo mkdir -p $dir +sudo chown $user:$user $dir \ No newline at end of file diff --git a/dss-server/src/main/resources/linkis.properties b/dss-server/src/main/resources/linkis.properties index d23e8850cdf838c5a3a0d6977870d5d62ca54ac8..bd39846ef3167ed7f2103c3625c80342b3f14ea8 100644 --- a/dss-server/src/main/resources/linkis.properties +++ b/dss-server/src/main/resources/linkis.properties @@ -17,7 +17,7 @@ wds.linkis.test.mode=true -wds.linkis.server.mybatis.datasource.url=jdbc:mysql://127.0.0.1:3306/ +wds.linkis.server.mybatis.datasource.url=jdbc:mysql://127.0.0.1:3306/linkis?characterEncoding=UTF-8 wds.linkis.server.mybatis.datasource.username= @@ -38,9 +38,34 @@ wds.linkis.server.mybatis.typeAliasesPackage=com.webank.wedatasphere.dss.server. 
wds.linkis.server.mybatis.BasePackage=com.webank.wedatasphere.dss.server.dao,com.webank.wedatasphere.dss.application.dao ##azkaban -wds.dss.appjoint.scheduler.azkaban.address=http://127.0.0.1:8091 +wds.dss.appjoint.scheduler.azkaban.address=http://0.0.0.0:8081 wds.linkis.gateway.ip=127.0.0.1 wds.linkis.gateway.port=9001 wds.dss.appjoint.scheduler.project.store.dir=file:///appcom/tmp/wds/scheduler +wds.linkis.super.user.name=root +wds.linkis.workspace.user.root.path=file:///tmp/linkis/ +wds.linkis.hdfs.user.root.path=hdfs:///tmp/linkis +wds.linkis.result.set.root.path=hdfs:///tmp/linkis +wds.linkis.scheduler.path=file:///appcom/tmp/wds/scheduler +wds.linkis.user.path=hdfs:///user +wds.linkis.dss.install.dir=/usr/local/dss_linkis/dss/dss-server +wds.linkis.azkaban.install.dir=/usr/local/dss_linkis/azkaban + +wds.linkis.metastore.hive.hdfs.base.path=/user/hive/warehouse +wds.linkis.metastore.script.path=default/Metastore.sh +wds.linkis.metastore.db.tail=_default + +wds.linkis.kerberos.realm= +wds.linkis.kerberos.admin= +wds.linkis.kerberos.enable.switch=0 +wds.linkis.kerberos.script.path=default/Kerberos.sh +wds.linkis.kerberos.keytab.path=/etc/security/keytabs +wds.linkis.kerberos.kdc.node= +wds.linkis.kerberos.kdc.user.name= +wds.linkis.kerberos.kdc.user.password= +wds.linkis.kerberos.ssh.port=22 +wds.dss.deploy.path=/usr/local/dss_linkis +wds.dss.user.account.command.class=com.webank.wedatasphpere.dss.user.service.impl.LinuxUserCommand,com.webank.wedatasphpere.dss.user.service.impl.KerberosCommand,com.webank.wedatasphpere.dss.user.service.impl.LdapCommand,com.webank.wedatasphpere.dss.user.service.impl.WorkspaceCommand,com.webank.wedatasphpere.dss.user.service.impl.MetastoreCommand,com.webank.wedatasphpere.dss.user.service.impl.AzkabanCommand +wds.dss.scheduler.url=/schedule/system \ No newline at end of file diff --git a/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/crumb/CrumbFactory.scala 
b/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/crumb/CrumbFactory.scala index 9c631c4c2c89ec469904480c7da0f969e65f3582..35df203a7abbd105fcdb05a63785d87ecab8a9cd 100644 --- a/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/crumb/CrumbFactory.scala +++ b/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/crumb/CrumbFactory.scala @@ -18,8 +18,8 @@ package com.webank.wedatasphere.dss.server.crumb import com.webank.wedatasphere.dss.server.entity.CrumbType.CrumbType -import com.webank.wedatasphere.dss.server.entity.{Crumb, CrumbType, DWSFlowTaxonomy, DWSProjectTaxonomy} -import com.webank.wedatasphere.dss.server.service.impl.{DWSFlowServiceImpl, DWSFlowTaxonomyServiceImpl, DWSProjectServiceImpl, DWSProjectTaxonomyServiceImpl} +import com.webank.wedatasphere.dss.server.entity.{Crumb, CrumbType, DSSFlowTaxonomy, DSSProjectTaxonomy} +import com.webank.wedatasphere.dss.server.service.impl.{DSSFlowServiceImpl, DSSFlowTaxonomyServiceImpl, DSSProjectServiceImpl, DSSProjectTaxonomyServiceImpl} import org.springframework.beans.factory.annotation.Autowired import org.springframework.stereotype.Component @@ -29,13 +29,13 @@ import scala.collection.mutable.ArrayBuffer @Component class CrumbFactory { @Autowired - private var projectService: DWSProjectServiceImpl = _ + private var projectService: DSSProjectServiceImpl = _ @Autowired - private var projectTaxonomyService:DWSProjectTaxonomyServiceImpl = _ + private var projectTaxonomyService:DSSProjectTaxonomyServiceImpl = _ @Autowired - private var flowService: DWSFlowServiceImpl = _ + private var flowService: DSSFlowServiceImpl = _ @Autowired - private var flowTaxonomyService:DWSFlowTaxonomyServiceImpl = _ + private var flowTaxonomyService:DSSFlowTaxonomyServiceImpl = _ def createCrumbs(crumbType: CrumbType, params: Array[String]): Array[Crumb] = { crumbType match { @@ -104,7 +104,7 @@ class CrumbFactory { def createCrumbData(crumbType: CrumbType, params: java.util.Map[String, 
String],userName:String): Any = { crumbType match { - case CrumbType.All =>createAllData(userName) + case CrumbType.All =>createAllData(userName, params.get("workspaceId").toLong) case CrumbType.SortProject =>createSortProjectData(params.get("projectTaxonomyID").toLong,userName) case CrumbType.Project =>createProjectData(params.get("projectVersionID").toLong,params.get("isRootFlow").toBoolean) case CrumbType.SortFlow =>createSortFlowData(params.get("projectVersionID").toLong,params.get("flowTaxonomyID").toLong,params.get("isRootFlow").toBoolean) @@ -112,19 +112,19 @@ class CrumbFactory { } } - private def createAllData(userName:String):java.util.List[DWSProjectTaxonomy] ={ - projectTaxonomyService.listAllProjectTaxonomy(userName) + private def createAllData(userName:String, workspaceId: Long):java.util.List[DSSProjectTaxonomy] ={ + projectTaxonomyService.listAllProjectTaxonomy(userName, workspaceId) } - private def createSortProjectData(projectTaxonomyID:Long,userName:String):java.util.List[DWSProjectTaxonomy] ={ + private def createSortProjectData(projectTaxonomyID:Long,userName:String):java.util.List[DSSProjectTaxonomy] ={ projectTaxonomyService.listProjectTaxonomy(projectTaxonomyID,userName) } - private def createProjectData(projectVersionID:Long,isRootFlow:Boolean):java.util.List[DWSFlowTaxonomy] = { + private def createProjectData(projectVersionID:Long,isRootFlow:Boolean):java.util.List[DSSFlowTaxonomy] = { flowTaxonomyService.listAllFlowTaxonomy(projectVersionID,isRootFlow) } - private def createSortFlowData(projectVersionID:Long,flowTaxonomyID:Long,isRootFlow:Boolean):java.util.List[DWSFlowTaxonomy] = { + private def createSortFlowData(projectVersionID:Long,flowTaxonomyID:Long,isRootFlow:Boolean):java.util.List[DSSFlowTaxonomy] = { flowTaxonomyService.listFlowTaxonomy(projectVersionID,flowTaxonomyID:Long,isRootFlow:Boolean) } } diff --git a/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/crumb/QuerParamsParser.scala 
b/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/crumb/QuerParamsParser.scala index 345a9a47cde5d2be635eb1de9947d210eefa19b9..5b979f85f700a89f65ddf99c03c8c5264a11c876 100644 --- a/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/crumb/QuerParamsParser.scala +++ b/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/crumb/QuerParamsParser.scala @@ -27,10 +27,11 @@ object QuerParamsParser { def getCrumbType(queryParams: String): CrumbType = { queryParams.split("&").size match { - case 1 => if ("".equals(queryParams)) CrumbType.All else CrumbType.SortProject - case 3 => CrumbType.Project - case 4 => CrumbType.SortFlow - case 5 => CrumbType.Flow + case 1 => CrumbType.All + case 2 => CrumbType.SortProject + case 4 => CrumbType.Project + case 5 => CrumbType.SortFlow + case 6 => CrumbType.Flow } } diff --git a/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/receiver/DSSServerReceiver.scala b/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/receiver/DSSServerReceiver.scala index 05fef8a515b9bbdf1047b63388f5cba469281cd4..951128083aec7f075b95387bcc6fcb0a109eb2a9 100644 --- a/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/receiver/DSSServerReceiver.scala +++ b/dss-server/src/main/scala/com/webank/wedatasphere/dss/server/receiver/DSSServerReceiver.scala @@ -19,8 +19,8 @@ package com.webank.wedatasphere.dss.server.receiver import com.webank.wedatasphere.dss.application.dao.ApplicationMapper import com.webank.wedatasphere.dss.common.exception.DSSErrorException -import com.webank.wedatasphere.dss.common.protocol.{RequestDSSApplication, RequestDWSProject} -import com.webank.wedatasphere.dss.server.service.DWSProjectService +import com.webank.wedatasphere.dss.common.protocol.{RequestDSSApplication, RequestDSSProject} +import com.webank.wedatasphere.dss.server.service.DSSProjectService import com.webank.wedatasphere.linkis.rpc.{Receiver, Sender} import org.springframework.beans.factory.annotation.Autowired 
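The revised `QuerParamsParser` above picks a crumb type purely from how many `&`-separated query parameters are present. A minimal Java sketch of that dispatch (the enum mirrors the Scala `CrumbType`; the class name and error handling are illustrative additions):

```java
public class CrumbTypeSketch {
    // Mirrors the CrumbType values referenced by QuerParamsParser
    public enum CrumbType { All, SortProject, Project, SortFlow, Flow }

    // One parameter -> All, two -> SortProject, etc.; unlike the Scala
    // match (which would throw MatchError), unexpected counts are rejected explicitly.
    public static CrumbType fromQueryParams(String queryParams) {
        int count = queryParams.split("&").length; // "".split("&") has length 1
        switch (count) {
            case 1: return CrumbType.All;
            case 2: return CrumbType.SortProject;
            case 4: return CrumbType.Project;
            case 5: return CrumbType.SortFlow;
            case 6: return CrumbType.Flow;
            default: throw new IllegalArgumentException("unexpected param count: " + count);
        }
    }
}
```

Note that `"".split("&")` yields a one-element array, so an empty query string still maps to `All`, matching the old `case 1` behavior.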
import org.springframework.stereotype.Component @@ -32,7 +32,7 @@ import scala.concurrent.duration.Duration class DSSServerReceiver extends Receiver{ @Autowired - var dwsProjectService:DWSProjectService = _ + var dwsProjectService:DSSProjectService = _ @Autowired var applicationMapper:ApplicationMapper = _ @@ -40,7 +40,7 @@ class DSSServerReceiver extends Receiver{ override def receive(message: Any, sender: Sender): Unit = {} override def receiveAndReply(message: Any, sender: Sender): Any = message match { - case f:RequestDWSProject => dwsProjectService.getExecutionDWSProject(f) + case f:RequestDSSProject => dwsProjectService.getExecutionDSSProject(f) case RequestDSSApplication(name) => applicationMapper.getApplication(name) case _ =>throw new DSSErrorException(90000,"") } diff --git a/dss-server/src/main/test/com/webank/TestUnit.java b/dss-server/src/main/test/com/webank/TestUnit.java new file mode 100644 index 0000000000000000000000000000000000000000..07c619ae98451b0374812728cfdb60f118eaaa26 --- /dev/null +++ b/dss-server/src/main/test/com/webank/TestUnit.java @@ -0,0 +1,14 @@ +package com.webank; + +import com.webank.wedatasphpere.dss.user.service.impl.UserAuthorizationClient; +import org.junit.Test; + +public class TestUnit { + + @Test + public void test() throws Exception { + + UserAuthorizationClient client = new UserAuthorizationClient(); + + } +} diff --git a/dss-user-manager/pom.xml b/dss-user-manager/pom.xml new file mode 100644 index 0000000000000000000000000000000000000000..ef03b3a667010b2f8ed9e133ce5b2bf836d6a312 --- /dev/null +++ b/dss-user-manager/pom.xml @@ -0,0 +1,87 @@ +<project xmlns="http://maven.apache.org/POM/4.0.0"> + <parent> + <artifactId>dss</artifactId> + <groupId>com.webank.wedatasphere.dss</groupId> + <version>0.9.1</version> + </parent> + <modelVersion>4.0.0</modelVersion> + + <artifactId>dss-user-manager</artifactId> + <packaging>jar</packaging> + <properties> + <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding> + </properties> + + <dependencies> + <dependency> + <groupId>com.webank.wedatasphere.linkis</groupId> + <artifactId>linkis-module</artifactId> + <version>${linkis.version}</version> + <scope>provided</scope> + <exclusions> + <exclusion> + <artifactId>scala-xml_2.11</artifactId> + <groupId>org.scala-lang.modules</groupId> + </exclusion> + <exclusion> + <artifactId>guava</artifactId> + <groupId>com.google.guava</groupId> + </exclusion> + </exclusions> + </dependency> + <dependency> + <groupId>com.webank.wedatasphere.linkis</groupId> + <artifactId>linkis-common</artifactId> + <version>0.9.4</version> + </dependency> + <dependency> + <groupId>com.github.rholder</groupId> + <artifactId>guava-retrying</artifactId> + <version>2.0.0</version> + <scope>provided</scope> + </dependency> + <dependency> + <groupId>org.junit.jupiter</groupId> + <artifactId>junit-jupiter</artifactId> + <version>RELEASE</version> + <scope>test</scope> + </dependency> + <dependency> + <groupId>dom4j</groupId> + <artifactId>dom4j</artifactId> + <version>1.6.1</version> + </dependency> + <dependency> + <groupId>cn.hutool</groupId> + <artifactId>hutool-all</artifactId> + <version>5.3.2</version> + </dependency> + <dependency> + <groupId>mysql</groupId> + <artifactId>mysql-connector-java</artifactId> + <version>5.1.49</version> + </dependency> + <dependency> + <groupId>com.typesafe</groupId> + <artifactId>config</artifactId> + <version>1.4.1</version> + </dependency> + </dependencies> + + <build> + <plugins> + <plugin> + <groupId>org.apache.maven.plugins</groupId> + <artifactId>maven-deploy-plugin</artifactId> + </plugin> + <plugin> + <groupId>org.apache.maven.plugins</groupId> + <artifactId>maven-jar-plugin</artifactId> + </plugin> + </plugins> + </build> +</project> \ No newline at end of file diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/conf/DSSUserManagerConfig.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/conf/DSSUserManagerConfig.java new file mode 100644 index 0000000000000000000000000000000000000000..731194cc49ec14cbf5f5cd5a912dbf47a2d83064 --- /dev/null +++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/conf/DSSUserManagerConfig.java @@ -0,0 +1,60 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + + +package com.webank.wedatasphpere.dss.user.conf; + +import com.webank.wedatasphere.linkis.common.conf.CommonVars; + +import java.util.ResourceBundle; + +/** + * @program: dss-appjoint-auth + * @description: configuration for the user-manager module + * + * @create: 2020-12-30 16:26 + **/ + + +public class DSSUserManagerConfig { +// private final static ResourceBundle resource = ResourceBundle.getBundle("linkis"); + public static final String LOCAL_USER_ROOT_PATH = CommonVars.apply("wds.dss.user.root.dir","null").getValue().trim(); + public static final String BDP_SERVER_MYBATIS_DATASOURCE_URL = CommonVars.apply("wds.linkis.server.mybatis.datasource.url", "null").getValue().trim(); + public static final String BDP_SERVER_MYBATIS_DATASOURCE_USERNAME = CommonVars.apply("wds.linkis.server.mybatis.datasource.username", "null").getValue().trim(); + public static final String BDP_SERVER_MYBATIS_DATASOURCE_PASSWORD = CommonVars.apply("wds.linkis.server.mybatis.datasource.password", "null").getValue().trim(); + public static final String SCHEDULER_ADDRESS = CommonVars.apply("wds.dss.appjoint.scheduler.azkaban.address", "null").getValue().trim(); + public static final String USER_ACCOUNT_COMMANDS = CommonVars.apply("wds.dss.user.account.command.class", "null").getValue().trim(); + + public static final String METASTORE_HDFS_PATH = CommonVars.apply("wds.linkis.metastore.hive.hdfs.base.path", "null").getValue().trim(); + public static final String METASTORE_SCRIPT_PAHT = CommonVars.apply("wds.linkis.metastore.script.path", "null").getValue().trim(); + public static final String METASTORE_DB_TAIL = CommonVars.apply("wds.linkis.metastore.db.tail", "_default").getValue().trim(); + + public static final String KERBEROS_REALM = CommonVars.apply("wds.linkis.kerberos.realm", "null").getValue().trim(); + public static final String KERBEROS_ADMIN = CommonVars.apply("wds.linkis.kerberos.admin", "null").getValue().trim(); + public static final String KERBEROS_SCRIPT_PATH = 
CommonVars.apply("wds.linkis.kerberos.script.path", "null").getValue().trim(); + public static final String KERBEROS_KEYTAB_PATH = CommonVars.apply("wds.linkis.kerberos.keytab.path", "null").getValue().trim(); + public static final String KERBEROS_SSH_PORT = CommonVars.apply("wds.linkis.kerberos.ssh.port", "22").getValue().trim(); + public static final String KERBEROS_KDC_NODE = CommonVars.apply("wds.linkis.kerberos.kdc.node", "null").getValue().trim(); + public static final String KERBEROS_KDC_USER_NAME = CommonVars.apply("wds.linkis.kerberos.kdc.user.name", "null").getValue().trim(); + public static final String KERBEROS_KDC_USER_PASSWORD = CommonVars.apply("wds.linkis.kerberos.kdc.user.password", "null").getValue().trim(); + public static final String KERBEROS_ENABLE_SWITCH = CommonVars.apply("wds.linkis.kerberos.enable.switch", "null").getValue().trim(); + public static final String DSS_DEPLOY_PATH = CommonVars.apply("wds.dss.deploy.path", "null").getValue().trim(); + public static final String DSS_SCHEDULER_URL = CommonVars.apply("wds.dss.scheduler.url", "/schedule/system").getValue().trim(); + + + +} diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/dto/request/AuthorizationBody.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/dto/request/AuthorizationBody.java new file mode 100644 index 0000000000000000000000000000000000000000..250f66f169645bcae4fa24fc1891161b7d620fe2 --- /dev/null +++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/dto/request/AuthorizationBody.java @@ -0,0 +1,99 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + + +package com.webank.wedatasphpere.dss.user.dto.request; + +import java.util.ArrayList; +import java.util.HashMap; +import java.util.List; + +/** + * @program: user-manager + * @description: provisioning-request (work order) data structure + * + * @create: 2020-08-12 14:29 + **/ +public class AuthorizationBody { + + private String username; + private String password; + private String dssInstallDir; + private String azkakanDir; + private ArrayList<LinuxServer> servers; + + + public ArrayList<LinuxServer> getServers() { + return servers; + } + + public void setServers(ArrayList<LinuxServer> servers) { + this.servers = servers; + } + + + + + public String getAzkakanDir() { + return azkakanDir; + } + + public void setAzkakanDir(String azkakanDir) { + this.azkakanDir = azkakanDir; + } + + public String getDssInstallDir() { + return dssInstallDir; + } + + public void setDssInstallDir(String installDir) { + this.dssInstallDir = installDir; + } + + public List<HashMap<String, String>> getPaths() { + return paths; + } + + public void setPaths(List<HashMap<String, String>> paths) { + this.paths = paths; + } + + private List<HashMap<String, String>> paths; + + public String getUsername() { + return username; + } + + public void setUsername(String username) { + this.username = username; + } + + public String getPassword() { + return password; + } + + public void setPassword(String password) { + this.password = password; + } + + public String getDatabaseName(){ + return this.username + "_default"; + } + + + +} diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/dto/request/LinuxServer.java 
b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/dto/request/LinuxServer.java new file mode 100644 index 0000000000000000000000000000000000000000..7569612f5cada8832c69fd98533c2bd373bef995 --- /dev/null +++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/dto/request/LinuxServer.java @@ -0,0 +1,53 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + + +package com.webank.wedatasphpere.dss.user.dto.request; + +public class LinuxServer { + + private String linuxHost; //master ip,Comma separated + private String linuxLoginUser; + private String linuxLoginPassword; + + + public String getLinuxHost() { + return linuxHost; + } + + public void setLinuxHost(String linuxHosts) { + this.linuxHost = linuxHosts; + } + + public String getLinuxLoginUser() { + return linuxLoginUser; + } + + public void setLinuxLoginUser(String linuxLoginUser) { + this.linuxLoginUser = linuxLoginUser; + } + + public String getLinuxLoginPassword() { + return linuxLoginPassword; + } + + public void setLinuxLoginPassword(String linuxLoginPassword) { + this.linuxLoginPassword = linuxLoginPassword; + } + + +} diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/AbsCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/AbsCommand.java new file mode 100644 index 0000000000000000000000000000000000000000..be4bf3332fdf7694dd9f6ebbc8305c37a701e7db --- /dev/null 
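`CreateLinuxUser.sh` above receives its first argument as a comma-separated list of `host#user#password` triples, which the `LinuxServer` DTO then models one entry of. A small Java sketch of parsing that wire format (the `Server` record is a stand-in for `LinuxServer`; names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class ServerListParser {
    // Minimal stand-in for the LinuxServer DTO
    public static final class Server {
        public final String host, user, password;
        Server(String host, String user, String password) {
            this.host = host; this.user = user; this.password = password;
        }
    }

    // Parses "ip1#user1#pw1,ip2#user2#pw2" exactly as CreateLinuxUser.sh splits it
    public static List<Server> parse(String spec) {
        List<Server> servers = new ArrayList<>();
        for (String entry : spec.split(",")) {
            String[] parts = entry.split("#");
            if (parts.length != 3) {
                throw new IllegalArgumentException("bad server entry: " + entry);
            }
            servers.add(new Server(parts[0], parts[1], parts[2]));
        }
        return servers;
    }
}
```

Validating the triple count up front mirrors the script's connectivity pre-check: reject a malformed list before any SSH work starts.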
+++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/AbsCommand.java @@ -0,0 +1,114 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + + +package com.webank.wedatasphpere.dss.user.service; + + +import com.webank.wedatasphere.linkis.server.Message; +import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody; +import org.dom4j.DocumentException; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import javax.ws.rs.core.Response; +import java.io.BufferedReader; +import java.io.IOException; +import java.io.InputStreamReader; +import java.net.URL; + +/** + * Authorization for each module extends this class and overrides only what it needs. + */ +public abstract class AbsCommand implements Command { + protected final Logger logger = LoggerFactory.getLogger(getClass()); + + @Override + public String capacity(AuthorizationBody body) { + return Command.SUCCESS; + } + + @Override + public String renew(AuthorizationBody body) { + return Command.SUCCESS; + } + + @Override + public String undoAuthorization(AuthorizationBody body) { return Command.SUCCESS; } + + @Override +// public String authorization(AuthorizationBody body) throws DocumentException { return Command.SUCCESS; } + public String authorization(AuthorizationBody body) throws Exception { return Command.SUCCESS; } + + public String toMessage(String msg) { + return this.getClass().getSimpleName() + " module starts execution: " + msg; + } + + protected String 
runShell(String scriptPath, String[] args){ + String bashCommand; + try { + bashCommand = "sh " + scriptPath + " " + String.join(" ", args); + Runtime runtime = Runtime.getRuntime(); + Process process = runtime.exec(bashCommand); + + return this.getString(process); + } + catch (Exception e){ + logger.error(scriptPath, e); + return e.getMessage(); + } + } + + protected String getString(Process process) throws IOException, InterruptedException { + BufferedReader br = new BufferedReader(new InputStreamReader(process.getInputStream())); + + String inline; + while ((inline = br.readLine()) != null) { + if (!inline.equals("")) { + inline = inline.replaceAll("<", "&lt;").replaceAll(">", "&gt;"); + logger.info(inline); + } else { + logger.info("\n"); + } + } + br.close(); + br = new BufferedReader(new InputStreamReader(process.getErrorStream())); // error stream + while ((inline = br.readLine()) != null) { + if (!inline.equals("")) + logger.warn(inline); + else + logger.warn("\n"); + } + + int status = process.waitFor(); + if (status != 0){ + logger.error("shell error: "+status); + } + br.close(); + return Command.SUCCESS; + } + + protected String getResource(String path){ + try { + URL url = this.getClass().getClassLoader().getResource(path); + return url.getPath(); + }catch (Exception e){ + logger.error("File does not exist " + path, e); + } + return null; + } +} diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/Command.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/Command.java new file mode 100644 index 0000000000000000000000000000000000000000..dfed254468d561b49e3967300d8921aa8028839f --- /dev/null +++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/Command.java @@ -0,0 +1,58 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
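`AbsCommand.runShell` above shells out with `Runtime.exec` and drains stdout and stderr one after the other. A self-contained sketch of the same pattern with `ProcessBuilder`, which merges the two streams so a single reader suffices and the process cannot block on a full stderr pipe (command strings here are illustrative, and a POSIX `sh` is assumed):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ShellRunner {
    // Runs a command, collects its output line by line, and reports
    // "success" on exit code 0, like Command.SUCCESS in the patch.
    public static String run(String... command) {
        try {
            ProcessBuilder pb = new ProcessBuilder(command);
            pb.redirectErrorStream(true); // fold stderr into stdout: one reader, no pipe deadlock
            Process process = pb.start();
            StringBuilder output = new StringBuilder();
            try (BufferedReader br = new BufferedReader(
                    new InputStreamReader(process.getInputStream()))) {
                String line;
                while ((line = br.readLine()) != null) {
                    output.append(line).append('\n');
                }
            }
            int status = process.waitFor(); // stream is fully drained before waiting
            return status == 0 ? "success" : "exit code " + status;
        } catch (IOException | InterruptedException e) {
            return e.getMessage();
        }
    }
}
```

Passing the command as separate arguments also sidesteps the word-splitting problem of building one `"sh path arg1 arg2"` string, which breaks as soon as an argument contains a space.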
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + + +package com.webank.wedatasphpere.dss.user.service; + + +import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody; +import org.dom4j.DocumentException; + +import java.io.IOException; + +public interface Command { + + public static final String SUCCESS = "success"; + public static final String ERROR = "error"; + /** + * Provision and authorize the service + * @param body + * @return "success" on success; any other value means failure + */ + public String authorization(AuthorizationBody body) throws DocumentException, IOException, Exception; + + /** + * Revoke the authorization + * @param body + * @return "success" on success; any other value means failure + */ + public String undoAuthorization(AuthorizationBody body) throws Exception; + + /** + * Expand capacity + * @param body + * @return "success" on success; any other value means failure + */ + public String capacity(AuthorizationBody body) throws Exception; + + /** + * Renew + * @param body + * @return "success" on success; any other value means failure + */ + public String renew(AuthorizationBody body) throws Exception; +} diff --git a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSUserService.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/MacroCommand.java similarity index 68% rename from dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSUserService.java rename to dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/MacroCommand.java index fe279df24cded94ea1fcea0cd455ca41d8eeddf1..0d2d3431cf6552b55ec8c703f3cd2b7f8068003b 100644 --- a/dss-server/src/main/java/com/webank/wedatasphere/dss/server/service/DWSUserService.java +++ 
b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/MacroCommand.java @@ -15,9 +15,19 @@ * */ -package com.webank.wedatasphere.dss.server.service; + +package com.webank.wedatasphpere.dss.user.service; + + +/** + * @program: user-authorization + * @description: provisioning command interface + * + * @create: 2020-08-10 14:24 + **/ +public interface MacroCommand extends Command { + + public void add(AbsCommand command) throws Exception; -public interface DWSUserService { - Long getUserID(String userName); } diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/AzkabanCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/AzkabanCommand.java new file mode 100644 index 0000000000000000000000000000000000000000..2f85ae38b3a8935bf9cc6e8fd45bd5d21eaa3f7f --- /dev/null +++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/AzkabanCommand.java @@ -0,0 +1,110 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
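`MacroCommand` is a composite over `Command` implementations: `wds.dss.user.account.command.class` lists the concrete commands (Linux, Kerberos, LDAP, workspace, metastore, Azkaban), and a macro runs them in order. A hedged sketch of how such a composite might short-circuit on the first failing module (the nested interface and class here are simplified stand-ins, not the actual DSS implementation):

```java
import java.util.ArrayList;
import java.util.List;

public class MacroSketch {
    // Simplified one-method stand-in for the Command interface
    public interface Command { String authorization(String user) throws Exception; }

    // Composite: runs each registered command until one does not return "success"
    public static class Macro implements Command {
        private final List<Command> commands = new ArrayList<>();

        public void add(Command c) { commands.add(c); }

        @Override
        public String authorization(String user) throws Exception {
            for (Command c : commands) {
                String rst = c.authorization(user);
                if (!"success".equals(rst)) {
                    return rst; // stop at the first failing module and surface its message
                }
            }
            return "success";
        }
    }
}
```

Note the `"success".equals(rst)` comparison: the status values are strings, so identity comparison with `==`/`!=` would be fragile.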
+ * + */ + + +package com.webank.wedatasphpere.dss.user.service.impl; + +import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody; +import com.webank.wedatasphpere.dss.user.service.AbsCommand; +import org.dom4j.Document; +import org.dom4j.DocumentException; +import org.dom4j.Element; +import org.dom4j.io.OutputFormat; +import org.dom4j.io.SAXReader; +import org.dom4j.io.XMLWriter; + +import java.io.File; +import java.io.FileInputStream; +import java.io.FileOutputStream; +import java.io.IOException; +import java.util.Iterator; +import java.util.List; + +/** + * @program: dss-appjoint-auth + * @description: provision an Azkaban account + * + * @create: 2021-01-08 15:53 + **/ + +public class AzkabanCommand extends AbsCommand { + @Override + public String authorization(AuthorizationBody body) { + + try{ + this.xmlHandler(body.getAzkakanDir()+"/conf/azkaban-users.xml", body); + String[] args = {body.getUsername(), body.getPassword(), body.getDssInstallDir()+"/conf/"}; + String path = getResource("default/AddschedulerUser.sh"); + return this.runShell(path, args); + }catch (Exception err){ + logger.error("AzkabanCommand auth error:", err); + return err.getMessage(); + } + } + + private void xmlHandler(String azkPath, AuthorizationBody body) throws DocumentException, IOException { + SAXReader reader = new SAXReader(); + + File file = new File(azkPath); + Document document; + FileInputStream fis = null; + try { + fis = new FileInputStream(file); + document = reader.read(fis); + }catch (DocumentException e){ + throw e; + }finally { + if (fis != null) { + fis.close(); + } + } + + Element root = document.getRootElement(); + + Iterator it = root.elementIterator("user"); + boolean userExists = false; + Element element = null; + while (it.hasNext()) { + element = (Element) it.next(); + + String v = element.attributeValue("username"); + if(v.equals(body.getUsername())){ // update the password + userExists = true; + element.attribute("password").setValue(body.getPassword()); + } + } + 
if(!userExists){ // add a new account + if (element == null) { + throw new IllegalStateException("azkaban-users.xml contains no user entry to clone"); + } + Element cloneEl = element.createCopy(); + cloneEl.attribute("username").setValue(body.getUsername()); + cloneEl.attribute("password").setValue(body.getPassword()); + + List elements = root.elements("user"); + elements.add(elements.size(), cloneEl); + } + + this.saveXml(document, file); + + } + + private void saveXml(Document document, File file) throws IOException { + FileOutputStream out =new FileOutputStream(file); + OutputFormat format=OutputFormat.createPrettyPrint(); + XMLWriter writer=new XMLWriter(out, format); + writer.write(document); + writer.close(); + } +} diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/KerberosCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/KerberosCommand.java new file mode 100644 index 0000000000000000000000000000000000000000..ef120f19e780ccd7e9a51e4d4a9e8b66543fc45d --- /dev/null +++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/KerberosCommand.java @@ -0,0 +1,69 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
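`AzkabanCommand.xmlHandler` above performs an upsert on `azkaban-users.xml` with dom4j: update the password if the user exists, otherwise append a new `user` entry. The same logic can be sketched with the JDK's built-in DOM, shown here on an in-memory document rather than the real config file (the XML shape mirrors azkaban-users.xml; this is an illustrative alternative, not the DSS code):

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class AzkabanUsersSketch {
    // Updates the user's password if present, otherwise appends a new user element
    public static String upsertUser(String xml, String username, String password) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        NodeList users = doc.getDocumentElement().getElementsByTagName("user");
        Element match = null;
        for (int i = 0; i < users.getLength(); i++) {
            Element e = (Element) users.item(i);
            if (username.equals(e.getAttribute("username"))) {
                match = e;
            }
        }
        if (match != null) {
            match.setAttribute("password", password); // existing account: rotate the password
        } else {
            Element e = doc.createElement("user");    // new account: append an entry
            e.setAttribute("username", username);
            e.setAttribute("password", password);
            doc.getDocumentElement().appendChild(e);
        }
        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }
}
```

Unlike the dom4j version, which clones an existing `user` element, this sketch creates the new element from scratch, so it also works on a file that has no users yet.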
+ *
+ */
+
+
+package com.webank.wedatasphpere.dss.user.service.impl;
+
+import com.webank.wedatasphpere.dss.user.conf.DSSUserManagerConfig;
+import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody;
+import com.webank.wedatasphpere.dss.user.service.AbsCommand;
+
+import java.net.InetAddress;
+
+/**
+ * @date 2021/1/5
+ */
+public class KerberosCommand extends AbsCommand {
+
+    @Override
+    public String authorization(AuthorizationBody body) throws Exception {
+        // createKt returns the shell output directly; the previous reference
+        // comparison against Command.SUCCESS was a no-op.
+        return createKt(body);
+    }
+
+    private String createKt(AuthorizationBody body) throws Exception {
+        String userName = body.getUsername();
+        String hostName = InetAddress.getLocalHost().getHostName();
+        String res = null;
+        if (userName != null) {
+            res = callShell(DSSUserManagerConfig.KERBEROS_SCRIPT_PATH, userName, hostName,
+                    DSSUserManagerConfig.KERBEROS_KEYTAB_PATH, DSSUserManagerConfig.KERBEROS_SSH_PORT,
+                    DSSUserManagerConfig.KERBEROS_KDC_NODE, DSSUserManagerConfig.KERBEROS_KDC_USER_NAME, DSSUserManagerConfig.KERBEROS_KDC_USER_PASSWORD, DSSUserManagerConfig.KERBEROS_REALM, DSSUserManagerConfig.KERBEROS_ENABLE_SWITCH);
+        }
+        return res;
+    }
+
+    private String callShell(String shellFile, String username, String hostName, String keytabPath,
+                             String sshPort, String kdcNode, String kdcUser, String password, String realm, String enableSwich) throws Exception {
+
+        String bashCommand = getResource(shellFile);
+        String scriptCmd;
+        if (null != hostName) {
+            scriptCmd = String.format("%s %s %s %s %s %s %s %s %s", username, hostName, keytabPath, sshPort, kdcNode, kdcUser, password, realm, enableSwich);
+        } else {
+            scriptCmd = String.format("%s %s %s %s %s %s %s %s",
username,keytabPath,sshPort,kdcNode,kdcUser,password,realm,enableSwich); + } + String[] args = scriptCmd.split(" "); + return this.runShell(bashCommand, args); + } + +} diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/LdapCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/LdapCommand.java new file mode 100644 index 0000000000000000000000000000000000000000..31c2ecdbbf676f543b850c62c07a61c9271afd5b --- /dev/null +++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/LdapCommand.java @@ -0,0 +1,58 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ *
+ */
+
+
+package com.webank.wedatasphpere.dss.user.service.impl;
+
+import com.webank.wedatasphpere.dss.user.conf.DSSUserManagerConfig;
+import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody;
+import com.webank.wedatasphpere.dss.user.service.AbsCommand;
+
+/**
+ * @program: user-authorization
+ * @description: create an LDAP account for the new user
+ *
+ * @create: 2020-08-13 13:39
+ **/
+
+public class LdapCommand extends AbsCommand {
+
+    @Override
+    public String authorization(AuthorizationBody body) throws Exception {
+
+        String userName = body.getUsername();
+        String userPassword = body.getPassword();
+        String dssDeployPath = DSSUserManagerConfig.DSS_DEPLOY_PATH;
+
+        String bashCommand = this.getClass().getClassLoader().getResource("default/CreateLdapAccount.sh").getPath();
+        String[] args = {
+                userName,
+                userPassword,
+                dssDeployPath
+        };
+
+        return this.runShell(bashCommand, args);
+    }
+
+}
diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/LinuxUserCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/LinuxUserCommand.java
new file mode 100644
index 0000000000000000000000000000000000000000..10a567e21659c889bd6f0bbc027e681af248a64f
--- /dev/null
+++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/LinuxUserCommand.java
@@ -0,0 +1,60 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+
+package com.webank.wedatasphpere.dss.user.service.impl;
+
+import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody;
+import com.webank.wedatasphpere.dss.user.dto.request.LinuxServer;
+import com.webank.wedatasphpere.dss.user.service.AbsCommand;
+
+import java.util.ArrayList;
+
+/**
+ * @program: user-authorization
+ * @description: create the Linux user on each configured server
+ *
+ * @create: 2020-08-13 13:39
+ **/
+
+public class LinuxUserCommand extends AbsCommand {
+
+    @Override
+    public String authorization(AuthorizationBody body) throws Exception {
+
+        ArrayList<LinuxServer> linuxServers = body.getServers();
+        logger.info("target servers: " + linuxServers.toString());
+        // pack each server as host#loginUser#loginPassword, comma separated, for the shell script
+        StringBuilder stringBuffer = new StringBuilder();
+        for (LinuxServer linuxServer : linuxServers) {
+            String hosts = linuxServer.getLinuxHost();
+            String linuxPassword = linuxServer.getLinuxLoginPassword();
+            String linuxUserName = linuxServer.getLinuxLoginUser();
+            stringBuffer.append(hosts).append("#").append(linuxUserName).append("#").append(linuxPassword).append(",");
+        }
+        stringBuffer.deleteCharAt(stringBuffer.lastIndexOf(","));
+        String addUserName = body.getUsername();
+        String addUserPassword = body.getPassword();
+        String bashCommand = this.getClass().getClassLoader().getResource("default/CreateLinuxUser.sh").getPath();
+        String[] args = {stringBuffer.toString(), addUserName, addUserPassword};
+
+        return this.runShell(bashCommand, args);
+    }
+
+}
diff --git
a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/MetastoreCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/MetastoreCommand.java
new file mode 100644
index 0000000000000000000000000000000000000000..607691919267d106cb2f389a990d7408bdf2617b
--- /dev/null
+++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/MetastoreCommand.java
@@ -0,0 +1,56 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+
+package com.webank.wedatasphpere.dss.user.service.impl;
+
+import com.webank.wedatasphpere.dss.user.conf.DSSUserManagerConfig;
+import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody;
+import com.webank.wedatasphpere.dss.user.service.AbsCommand;
+
+/**
+ * @date 2021/1/5
+ */
+public class MetastoreCommand extends AbsCommand {
+
+    @Override
+    public String authorization(AuthorizationBody body) throws Exception {
+        // createDb returns the shell output directly; the previous reference
+        // comparison against Command.SUCCESS was a no-op.
+        return createDb(body);
+    }
+
+    private String createDb(AuthorizationBody body) throws Exception {
+        String userName = body.getUsername();
+        if (userName == null) {
+            return "username is empty, metastore authorization skipped";
+        }
+        String dbName = userName + DSSUserManagerConfig.METASTORE_DB_TAIL;
+        String path = DSSUserManagerConfig.METASTORE_HDFS_PATH + "/" + dbName + ".db";
+        String bashCommand = getResource(DSSUserManagerConfig.METASTORE_SCRIPT_PAHT);
+        String[] args = new String[]{userName, dbName, path,
+                DSSUserManagerConfig.KERBEROS_REALM, DSSUserManagerConfig.KERBEROS_ADMIN, DSSUserManagerConfig.KERBEROS_KEYTAB_PATH, DSSUserManagerConfig.KERBEROS_ENABLE_SWITCH};
+        return this.runShell(bashCommand, args);
+    }
+}
diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/SchedulisCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/SchedulisCommand.java
new file mode 100644
index 0000000000000000000000000000000000000000..73351943b9dab71ff613a94bdf8c281c54c99a79
--- /dev/null
+++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/SchedulisCommand.java
@@ -0,0 +1,146 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+
+package com.webank.wedatasphpere.dss.user.service.impl;
+
+import com.fasterxml.jackson.databind.JsonNode;
+import com.webank.wedatasphpere.dss.user.conf.DSSUserManagerConfig;
+import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody;
+import com.webank.wedatasphpere.dss.user.service.AbsCommand;
+import com.webank.wedatasphpere.dss.user.service.Command;
+
+import java.net.URI;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.PreparedStatement;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.Map;
+
+import cn.hutool.crypto.SecureUtil;
+import cn.hutool.crypto.symmetric.DES;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.http.HttpStatus;
+import org.springframework.http.ResponseEntity;
+import org.springframework.util.StringUtils;
+import org.springframework.web.client.RestTemplate;
+import org.springframework.web.util.UriComponentsBuilder;
+
+/**
+ * @date 2021/1/11
+ */
+public class SchedulisCommand extends AbsCommand {
+
+    private static final Logger logger = LoggerFactory.getLogger(SchedulisCommand.class);
+
+    @Override
+    public String authorization(AuthorizationBody body) {
+        String username = body.getUsername();
+        logger.info("adding scheduler user: {}", username);
+        String password = body.getPassword();
+        String encryptionPwd = getEncryptionPwd(username, password);
+        Connection connection;
+        Statement stmt;
+        try {
+            logger.info("inserting user into ctyun_user table");
+            connection = getConnection();
+            stmt = connection.createStatement();
+            String sql = "INSERT INTO `ctyun_user` (`id`,`name`,`username`,`email`,`password`,`work_order_item_config`) VALUES (?,?,?,?,?,NULL) ON DUPLICATE KEY UPDATE `password` = ?";
+            PreparedStatement statement = connection.prepareStatement(sql);
+            statement.setString(1, username);
+            statement.setString(2, username);
+            statement.setString(3, username);
+            statement.setString(4, username);
+            statement.setString(5, encryptionPwd);
+            statement.setString(6, encryptionPwd);
+            statement.executeUpdate();
+            stmt.close();
+            connection.close();
+            logger.info("finished inserting into ctyun_user table");
+        } catch (SQLException e) {
+            logger.error("failed to insert into ctyun_user table", e);
+        }
+        logger.info("calling the schedulis API to add the user");
+        addSchedulisUser(username, password);
+        logger.info("finished calling the schedulis API");
+        return Command.SUCCESS;
+    }
+
+    private static Connection getConnection() {
+        try {
+            // register the JDBC driver
+            Class.forName("com.mysql.jdbc.Driver");
+            // open the database connection (url with host and port, user, password)
+            String url = DSSUserManagerConfig.BDP_SERVER_MYBATIS_DATASOURCE_URL;
+            String user = DSSUserManagerConfig.BDP_SERVER_MYBATIS_DATASOURCE_USERNAME;
+            String password = DSSUserManagerConfig.BDP_SERVER_MYBATIS_DATASOURCE_PASSWORD;
+            return DriverManager.getConnection(url, user, password);
+        } catch (Exception e) {
+            logger.error("failed to open database connection", e);
+        }
+        return null;
+    }
+
+    private boolean addSchedulisUser(String username, String password) {
+        String schedulerUrl = DSSUserManagerConfig.DSS_SCHEDULER_URL;
+        RestTemplate restTemplate = new RestTemplate();
+        String schedulisUrl = DSSUserManagerConfig.SCHEDULER_ADDRESS;
+        String url = new StringBuilder().append(schedulisUrl)
+                .append(schedulerUrl)
+                .toString();
+        Map<String, String> params = new HashMap<>(4);
+        params.put("userId", username);
+        params.put("password", password);
+        params.put("ajax", "addSystemUserViaFastTrackCtyun");
+        String fullUrl = addParams(url, params);
+        UriComponentsBuilder builder = UriComponentsBuilder.fromHttpUrl(fullUrl);
+        URI uri = builder.build().encode().toUri();
+        ResponseEntity<JsonNode> responseEntity = restTemplate.getForEntity(uri, JsonNode.class);
+        return HttpStatus.OK.equals(responseEntity.getStatusCode());
+    }
+
+    private static String addParams(String url, Map<String, String> params) {
+        if (params.isEmpty()) {
+            return url;
+        }
+        StringBuilder sb = new StringBuilder().append(url).append("?");
+        for (Map.Entry<String, String> entry : params.entrySet()) {
+            if (StringUtils.hasText(entry.getValue())) {
+
sb.append(entry.getKey()) + .append("=") + .append(entry.getValue()) + .append("&"); + } + } + return sb.deleteCharAt(sb.length()-1).toString(); + } + + private String getEncryptionPwd(String username, String password) { + int minSize = 8; + while (username.length() < minSize) { + username += username; + } + byte[] keyBytes = username.getBytes(); + Arrays.sort(keyBytes); + DES des = SecureUtil.des(keyBytes); + return des.encryptHex(password); + } +} diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/UserAuthorizationClient.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/UserAuthorizationClient.java new file mode 100644 index 0000000000000000000000000000000000000000..fd275a0e4b819fa6b6e8844a6bbe4c2f1721bda2 --- /dev/null +++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/UserAuthorizationClient.java @@ -0,0 +1,57 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ *
+ */
+
+
+package com.webank.wedatasphpere.dss.user.service.impl;
+
+
+import com.webank.wedatasphpere.dss.user.conf.DSSUserManagerConfig;
+import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody;
+import com.webank.wedatasphpere.dss.user.service.AbsCommand;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * @program: user-authorization
+ * @description: instantiates the configured user commands and dispatches provisioning requests to them
+ *
+ * @create: 2020-08-10 14:24
+ **/
+public class UserAuthorizationClient {
+
+    public UserMacroCommand userMacroCommand = new UserMacroCommand();
+    protected final Logger logger = LoggerFactory.getLogger(UserAuthorizationClient.class);
+
+    public UserAuthorizationClient() {
+
+        String[] commandPaths = DSSUserManagerConfig.USER_ACCOUNT_COMMANDS.split(",");
+        for (String classPath : commandPaths) {
+            try {
+                userMacroCommand.add((AbsCommand) Class.forName(classPath).newInstance());
+            } catch (Exception e) {
+                logger.error("failed to instantiate command class: " + classPath, e);
+            }
+        }
+    }
+
+    public String authorization(AuthorizationBody body) throws Exception {
+        return userMacroCommand.authorization(body);
+    }
+
+
+}
diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/UserMacroCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/UserMacroCommand.java
new file mode 100644
index 0000000000000000000000000000000000000000..c842e666c344cbbc4a0eebad1411da157795662d
--- /dev/null
+++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/UserMacroCommand.java
@@ -0,0 +1,118 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+
+package com.webank.wedatasphpere.dss.user.service.impl;
+
+
+import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody;
+import com.webank.wedatasphpere.dss.user.service.AbsCommand;
+import com.webank.wedatasphpere.dss.user.service.MacroCommand;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.util.ArrayList;
+import java.util.List;
+
+/**
+ * @program: user-authorization
+ * @description: fans each provisioning operation out to the registered commands
+ *
+ * @create: 2020-08-10 14:24
+ **/
+public class UserMacroCommand implements MacroCommand {
+
+    private List<AbsCommand> commandList = new ArrayList<>();
+
+    protected final Logger logger = LoggerFactory.getLogger(getClass());
+
+    @Override
+    public void add(AbsCommand command) {
+
+        commandList.add(command);
+    }
+
+    @Override
+    public String authorization(AuthorizationBody body) throws Exception {
+
+        return this.execute("authorization", body);
+
+    }
+
+    @Override
+    public String undoAuthorization(AuthorizationBody json) throws Exception {
+
+        return this.execute("undoAuthorization", json);
+
+    }
+
+    @Override
+    public String capacity(AuthorizationBody json) throws Exception {
+
+        return this.execute("capacity", json);
+
+    }
+
+    @Override
+    public String renew(AuthorizationBody json) throws Exception {
+
+        return
 this.execute("renew", json);
+
+    }
+
+
+    /**
+     * Base dispatch method for the provisioning operations.
+     * @param funName name of the operation to invoke on each command
+     * @param body request payload
+     * @return "success" once every command has been invoked
+     */
+    private String execute(String funName, AuthorizationBody body) throws Exception {
+
+        for (AbsCommand command : commandList) {
+
+            switch (funName) {
+                case "authorization":
+                    command.authorization(body);
+                    break; // without the breaks the original switch fell through into every later case
+                case "undoAuthorization":
+                    command.undoAuthorization(body);
+                    break;
+                case "capacity":
+                    command.capacity(body);
+                    break;
+                default:
+                    break;
+            }
+
+        }
+
+        return "success";
+    }
+
+}
diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/WorkspaceCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/WorkspaceCommand.java
new file mode 100644
index 0000000000000000000000000000000000000000..37b9f95972458d90e81d3088221049656e4455de
--- /dev/null
+++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/WorkspaceCommand.java
@@ -0,0 +1,70 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+
+package com.webank.wedatasphpere.dss.user.service.impl;
+
+import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody;
+import com.webank.wedatasphpere.dss.user.service.AbsCommand;
+import com.webank.wedatasphpere.dss.user.service.Command;
+
+import java.util.HashMap;
+import java.util.List;
+
+/**
+ * @program: user-authorization
+ * @description: create the user's workspace directories
+ *
+ * @create: 2020-08-13 13:39
+ **/
+
+public class WorkspaceCommand extends AbsCommand {
+
+    @Override
+    public String authorization(AuthorizationBody body) throws Exception {
+        List<HashMap<String, String>> paths = body.getPaths();
+        String result = "";
+        boolean isSuccess = true;
+        for (HashMap<String, String> map : paths) {
+            String path = map.get("value");
+            String rst = createDir(path, body);
+            result += rst;
+            if (!rst.equals(Command.SUCCESS)) {
+                isSuccess = false;
+            }
+        }
+        if (isSuccess) {
+            return Command.SUCCESS;
+        }
+        logger.error(result);
+        return result;
+    }
+
+    private String createDir(String path, AuthorizationBody body) throws Exception {
+        String bashCommand;
+
+        if (path.contains("hdfs:")) {
+            path = path.replace("hdfs://", "") + "/" + body.getUsername();
+            bashCommand = getResource("default/HdfsPath.sh");
+        } else {
+            path = path.replace("file://", "") + "/" + body.getUsername();
+            bashCommand = getResource("default/LinuxPath.sh");
+        }
+        String[] args = {body.getUsername(), path};
+        return this.runShell(bashCommand, args);
+    }
+}
diff --git a/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/YarnCommand.java b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/YarnCommand.java
new file mode 100644
index 0000000000000000000000000000000000000000..a4d854780a55bd1a7b904b0bd3b98077d070cb8b
--- /dev/null
+++ b/dss-user-manager/src/main/java/com/webank/wedatasphpere/dss/user/service/impl/YarnCommand.java
@@ -0,0 +1,39 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in
compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+
+package com.webank.wedatasphpere.dss.user.service.impl;
+
+
+import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody;
+import com.webank.wedatasphpere.dss.user.service.AbsCommand;
+import com.webank.wedatasphpere.dss.user.service.Command;
+
+/**
+ * @program: user-authorization
+ * @description: yarn permission provisioning (currently a no-op that always succeeds)
+ *
+ * @create: 2020-08-10 14:24
+ **/
+public class YarnCommand extends AbsCommand {
+
+    @Override
+    public String authorization(AuthorizationBody body) {
+
+        return Command.SUCCESS;
+    }
+}
diff --git a/dss-user-manager/src/main/resources/config/properties.conf b/dss-user-manager/src/main/resources/config/properties.conf
new file mode 100644
index 0000000000000000000000000000000000000000..ece97f39a35fd24ceb6ce619350133fc83a53d5b
--- /dev/null
+++ b/dss-user-manager/src/main/resources/config/properties.conf
@@ -0,0 +1,14 @@
+#-- Metastore settings
+base_path =     #-- base HDFS path under which the metastore databases live
+metastore_sh = default/Metastore.sh     #-- location of the script to execute
+db_tail =       #-- suffix of the database directories on HDFS
+realm =         #-- kerberos realm
+admin =         #-- user granted the admin role by the hive metastore service (e.g. hdfs; required, otherwise grants cannot be issued)
+
+#-- Kerberos settings
+shellFile = default/Kerberos.sh     #-- location of the script to execute
+keytabPath = /etc/security/keytabs      #-- where keytabs are stored on this node
+sshPort = 22        #-- port used for ssh operations
+kdcNode =       #-- KDC node of the kerberos service
+kdcUser =       #-- user on the KDC node used for ssh; its password must be known and it must be in that node's sudoers list
+password = "password"       #-- login password of the kdcUser above, used for ssh
diff --git a/dss-user-manager/src/main/resources/default/AddschedulerUser.sh b/dss-user-manager/src/main/resources/default/AddschedulerUser.sh
new file mode
100644 index 0000000000000000000000000000000000000000..d9a5a02e448ccc45fac9f169c0fe1fc1776521cf --- /dev/null +++ b/dss-user-manager/src/main/resources/default/AddschedulerUser.sh @@ -0,0 +1,18 @@ +#!/usr/bin/env bash + +user=$1 +password=$2 +installDir=$3 + + +if grep -i "^${user}=" $installDir/token.properties; + then + if [ "$(uname)" == "Darwin" ]; + then + sed -i '' "s/^$user=.*/$user=$password/" $installDir/token.properties + else + sed -i "s/^$user=.*/$user=$password/" $installDir/token.properties + fi +else + echo "$user=$password" >> $installDir/token.properties +fi diff --git a/dss-user-manager/src/main/resources/default/CreateLdapAccount.sh b/dss-user-manager/src/main/resources/default/CreateLdapAccount.sh new file mode 100644 index 0000000000000000000000000000000000000000..0f4c8cbe14d39a8d59d10f50e49d089d4d7323fe --- /dev/null +++ b/dss-user-manager/src/main/resources/default/CreateLdapAccount.sh @@ -0,0 +1,7 @@ +#!/bin/bash +source /etc/profile +ldap_user=$1 +ldap_password=$2 +ldap_script_path=$3 +source $ldap_script_path/tools/venv/bin/activate && sudo $ldap_script_path/tools/venv/bin/python $ldap_script_path/tools/bin/ldap_user.py add_with_pw $ldap_user -p $ldap_password +echo "******************LDAP USER CREATED***********************" diff --git a/dss-user-manager/src/main/resources/default/CreateLinuxUser.sh b/dss-user-manager/src/main/resources/default/CreateLinuxUser.sh new file mode 100644 index 0000000000000000000000000000000000000000..32e42bccc0e62ce1e2441cc5c162c1785b5c27b0 --- /dev/null +++ b/dss-user-manager/src/main/resources/default/CreateLinuxUser.sh @@ -0,0 +1,34 @@ +#!/bin/bash + +source /etc/profile +server_user_strs=$1 +echo $server_user_strs +add_user_name=$2 +add_user_password=$3 +server_user_array=(${server_user_strs//,/ }) +for server_user_str in ${server_user_array[@]} +do + server_user_info=(${server_user_str//#/ }) + server_host=${server_user_info[0]} + server_user_name=${server_user_info[1]} + 
server_user_password=${server_user_info[2]}
+  echo "${server_host},${server_user_name},${server_user_password}"
+
+  sudo sshpass -p $server_user_password ssh -o ConnectTimeout=1 $server_user_name@$server_host "echo success"
+  [ $? -ne 0 ] && echo "logon server:${server_host} failed" && exit 254
+done
+
+echo "************Start creating user*****************"
+
+for server_user_str in ${server_user_array[@]}
+do
+
+  server_user_info=(${server_user_str//#/ })
+  server_host=${server_user_info[0]}
+  server_user_name=${server_user_info[1]}
+  server_user_password=${server_user_info[2]}
+  sshpass -p $server_user_password ssh $server_user_name@$server_host "sudo useradd $add_user_name -s /sbin/nologin"
+
+  [ $? -ne 0 ] && echo "create user failed:${server_host}" && exit 254
+done
diff --git a/dss-user-manager/src/main/resources/default/HdfsPath.sh b/dss-user-manager/src/main/resources/default/HdfsPath.sh
new file mode 100644
index 0000000000000000000000000000000000000000..1f295650141107a28b5af8cdd31ca3a39fd1c6fc
--- /dev/null
+++ b/dss-user-manager/src/main/resources/default/HdfsPath.sh
@@ -0,0 +1,13 @@
+#!/bin/bash
+source /etc/profile
+user=$1
+dir=$2
+echo $1 $2
+id $user
+if [ $?
-ne 0 ]; then
+  sudo useradd $user -s /sbin/nologin
+  echo "create user successfully"
+fi
+
+hdfs dfs -mkdir -p $dir
+hdfs dfs -chown $user:$user $dir
diff --git a/dss-user-manager/src/main/resources/default/Kerberos.sh b/dss-user-manager/src/main/resources/default/Kerberos.sh
new file mode 100644
index 0000000000000000000000000000000000000000..80c3eb502c3c21d2ee4252d322c104de4fec1f8b
--- /dev/null
+++ b/dss-user-manager/src/main/resources/default/Kerberos.sh
@@ -0,0 +1,146 @@
+#!/bin/bash
+source /etc/profile
+# the user running this script (e.g. dss) must be in sudoers
+
+# functions
+check_principal_exist(){
+  all_principal=`timeout 30 sshpass -p $PASSWORD ssh -p $SSH_PORT $KDC_USER@$KDCSERVER "sudo /usr/sbin/kadmin.local -q \"list_principals\""`
+  #echo "all_principal:"$all_principal
+  principal=$1
+  if [[ $all_principal =~ $principal ]]
+  then
+    # principal already present
+    return 1
+  else
+    # principal missing
+    return 0
+  fi
+}
+
+add_principal(){
+  principalPrefix=$1
+  echo "add_principal func,principalPrefix:"$principalPrefix
+  check_principal_exist "$principalPrefix@$REALM"
+  ifexist=$?
+  if [ $ifexist -eq 1 ]
+  then
+    echo "principal already exists"
+  else
+    echo "principal not found, creating it"
+    timeout 30 sshpass -p $PASSWORD ssh -p $SSH_PORT $KDC_USER@$KDCSERVER "sudo /usr/sbin/kadmin.local -q \"addprinc -randkey $principalPrefix\""
+  fi
+}
+
+generate_user(){
+  username=$1
+  if id -u $username >/dev/null 2>&1; then
+    echo "user exists"
+  else
+    echo "user does not exist, so we will create!"
+    sudo useradd $username
+  fi
+}
+
+gen_keytab(){
+  user=$1
+  host=$2
+  principalPrefix="$user/$host"
+  principal="$user/$host@$REALM"
+  add_principal $principalPrefix
+  if [[ $? -ne 0 ]];then
+    echo "create keytab failed!!!"
+ exit 1 + fi + timeout 30 sshpass -p $PASSWORD ssh -p $SSH_PORT $KDC_USER@$KDCSERVER "sudo rm -rf /tmp/$host.$user.keytab" + timeout 30 sshpass -p $PASSWORD ssh -p $SSH_PORT $KDC_USER@$KDCSERVER "sudo /usr/sbin/kadmin.local -q \"xst -norandkey -k /tmp/$host.$user.keytab $user/$host\"" + timeout 30 sshpass -p $PASSWORD ssh -p $SSH_PORT $KDC_USER@$KDCSERVER "sudo chmod 755 /tmp/$host.$user.keytab" + timeout 30 sshpass -p $PASSWORD scp -P $SSH_PORT $KDC_USER@$KDCSERVER:/tmp/$host.$user.keytab ./ + if [[ -f "$host.$user.keytab" ]]; then + sudo mv ./$host.$user.keytab $CENTER_KEYTAB_PATH/$user.keytab + if [[ $? != 0 ]];then + echo "rename keytab failed!" + else + generate_user $user + sudo chown $user $CENTER_KEYTAB_PATH/$user.keytab + sudo su - $user -c "kinit -kt $CENTER_KEYTAB_PATH/$user.keytab $principal" + deployUser=`whoami` + sudo su - $deployUser -c "crontab -l > conf && echo '* */12 * * * sudo -u $user kinit -kt $CENTER_KEYTAB_PATH/$user.keytab $principal' >> conf && crontab conf && rm -f conf" + fi + else + echo "the $user.keytab does not exist, please check your previous steps!" 
+ fi +} + +gen_keytab_user(){ + user=$1 + principalPrefix="$user" + principal="$user@$REALM" + add_principal $principalPrefix + timeout 30 sshpass -p $PASSWORD ssh -p $SSH_PORT $KDC_USER@$KDCSERVER "sudo rm -rf /tmp/$user.keytab" + timeout 30 sshpass -p $PASSWORD ssh -p $SSH_PORT $KDC_USER@$KDCSERVER "sudo /usr/sbin/kadmin.local -q \"xst -norandkey -k /tmp/$user.keytab $user\"" + timeout 30 sshpass -p $PASSWORD ssh -p $SSH_PORT $KDC_USER@$KDCSERVER "sudo chmod 755 /tmp/$user.keytab" + timeout 30 sshpass -p $PASSWORD scp -P $SSH_PORT $KDC_USER@$KDCSERVER:/tmp/$user.keytab ./ + if [[ -f "$user.keytab" ]]; then + sudo mv ./$user.keytab $CENTER_KEYTAB_PATH/$user.keytab + generate_user $user + sudo chown $user $CENTER_KEYTAB_PATH/$user.keytab + sudo su - $user -c "kinit -kt $CENTER_KEYTAB_PATH/$user.keytab $principal" + sudo su - op -c "crontab -l > conf && echo '* */12 * * * sudo -u $user kinit -kt $CENTER_KEYTAB_PATH/$user.keytab $principal' >> conf && crontab conf && rm -f conf" + else + echo "the $user.keytab does not exist, please check your previous steps!" 
+ fi + +} + + + +#The first argument is the action (required), the second is the user (required), the third is the host (optional) +if [ $# -lt 3 ] || [ $# -gt 9 ]; then + echo -e "\033[31m \033[05mPlease check your input; the expected format is: action [user|user hostname]\033[0m" + echo "Usage: $0 generateKeytab {username|username hostname}" + echo `date '+%Y-%m-%d %H:%M:%S'`" parameters:"$* >>/tmp/deltaKerberos.log + exit 1 +else + if [ $# -eq 8 ]; then + user=$1 + CENTER_KEYTAB_PATH=$2 + SSH_PORT=$3 + KDCSERVER=$4 + KDC_USER=$5 + PASSWORD=$6 + REALM=$7 + KERBEROS_ENABLE=$8 + echo $user + echo $CENTER_KEYTAB_PATH + echo $SSH_PORT + echo $KDCSERVER $KDC_USER + echo $REALM + echo $KERBEROS_ENABLE + if [ $KERBEROS_ENABLE = "0" ]; then + echo "kerberos is disabled" + else + echo "kerberos is enabled" + echo `date '+%Y-%m-%d %H:%M:%S'`" in generate_keytab username:"$user >>/tmp/deltaKerberos.log + gen_keytab_user $user + fi + else + user=$1 + host=$2 + CENTER_KEYTAB_PATH=$3 + SSH_PORT=$4 + KDCSERVER=$5 + KDC_USER=$6 + PASSWORD=$7 + REALM=$8 + KERBEROS_ENABLE=$9 + echo $REALM + echo $KERBEROS_ENABLE + if [ $KERBEROS_ENABLE = "0" ]; then + echo "kerberos is disabled" + else + echo "kerberos is enabled" + echo `date '+%Y-%m-%d %H:%M:%S'`" in generate_keytab username:"$user" hostname:"$host >>/tmp/deltaKerberos.log + gen_keytab $user $host + fi + fi +fi +exit 0 diff --git a/dss-user-manager/src/main/resources/default/LinuxPath.sh b/dss-user-manager/src/main/resources/default/LinuxPath.sh new file mode 100644 index 0000000000000000000000000000000000000000..03db38d857a48e4c6b4b8dc2eeeeafa4f8f8754a --- /dev/null +++ b/dss-user-manager/src/main/resources/default/LinuxPath.sh @@ -0,0 +1,12 @@ +#!/bin/bash +source /etc/profile +user=$1 +dir=$2 +echo $1 $2; +id $user +if [ $? 
-ne 0 ]; then + useradd $user -s /sbin/nologin + echo "user created successfully" +fi +sudo mkdir -p $dir +sudo chown $user:$user $dir diff --git a/dss-user-manager/src/main/resources/default/Metastore.sh b/dss-user-manager/src/main/resources/default/Metastore.sh new file mode 100644 index 0000000000000000000000000000000000000000..3d72f665892e55d91fb4563d3418588472ed7aa6 --- /dev/null +++ b/dss-user-manager/src/main/resources/default/Metastore.sh @@ -0,0 +1,44 @@ +#!/bin/bash +source /etc/profile +user_name=$1 +db_name=$2 +path=$3 +realm=$4 +admin=$5 +ktPath=$6 +host_name=`hostname` +kerberos_enable=$7 +echo $kerberos_enable +if [ $kerberos_enable = "0" ]; then + echo "kerberos is disabled" +else + echo "kerberos is enabled" + kinit -kt $ktPath/$admin.keytab $admin/${host_name}@${realm} +fi + + #Step 2: metastore operations +hive -e "create database if not exists $db_name" +if [[ $? -ne 0 ]]; then + echo "create database failed!" +else + #Change database ownership; the principal user must be added to hive.users.in.admin.role in hive-site.xml on the metastore side + if [ $kerberos_enable = "0" ]; then + hive -e "grant all on database $db_name to user $user_name" + else + hive -e "set role admin; grant all on database $db_name to user $user_name" + fi +fi + + #Step 3: HDFS operations +if [[ $? -ne 0 ]]; then + #Roll back + hive -e "drop database $db_name" + echo "grant database failed, rollback finished!" +else + echo "grant database $db_name successfully!" + #Change the owner of the HDFS path + hdfs dfs -chown $user_name:$user_name $path + #Change permissions on the HDFS path + hdfs dfs -chmod -R 700 $path + echo "hdfs operation successfully!" 
+fi diff --git a/dss-user-manager/src/main/resources/read_anlexander.txt b/dss-user-manager/src/main/resources/read_anlexander.txt new file mode 100644 index 0000000000000000000000000000000000000000..fc60487e097064d055f2e62baf8bb85c5fd9b0e5 --- /dev/null +++ b/dss-user-manager/src/main/resources/read_anlexander.txt @@ -0,0 +1,20 @@ +One-click account provisioning + Service directory structure: + Entry point: + Entry point: dss-user-manager\src\main\java\com\webank\wedatasphpere\dss\user\service\impl\UserAuthorizationClient.java + +#See the configuration file resource/config/properties.conf +#Usage of the metastore feature + Parameters: base_path --base path of the metadata on HDFS + metastore_sh --location of the script to execute + db_tail --suffix of database files on HDFS + realm --Kerberos realm + admin --user granted the admin role of the Hive metastore service (e.g. hdfs; very important, otherwise authorization cannot be completed) + +#Usage of the kerberos feature + Parameters: keytabPath --where keytabs are stored on this node + shellFile --location of the script to execute + sshPort --port used for SSH operations + kdcNode --KDC node of the Kerberos service + kdcUser --user on the KDC node used for SSH KDC operations; the user's password must be known and the user must be in the sudoers list on the KDC node + password --login password of the KDC user mentioned above, used for SSH operations \ No newline at end of file diff --git a/dss-user-manager/src/test/java/com/webank/wedatasphpere/dss/user/service/impl/KerberosCommandTest.java b/dss-user-manager/src/test/java/com/webank/wedatasphpere/dss/user/service/impl/KerberosCommandTest.java new file mode 100644 index 0000000000000000000000000000000000000000..445b4889d4e7b411e992043e57b2a58da2af790e --- /dev/null +++ b/dss-user-manager/src/test/java/com/webank/wedatasphpere/dss/user/service/impl/KerberosCommandTest.java @@ -0,0 +1,43 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + + +package com.webank.wedatasphpere.dss.user.service.impl; + + +import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody; +import org.junit.jupiter.api.Test; + +import java.io.IOException; + +class KerberosCommandTest { + + @Test +// @MethodSource("body") + void authorization() { + AuthorizationBody body = new AuthorizationBody(); + body.setUsername("anlexander"); + body.setPassword("123321"); + KerberosCommand test = new KerberosCommand(); + try { + test.authorization(body); + } catch (Exception e) { + e.printStackTrace(); + } + System.out.println("test method finished"); + } +} diff --git a/dss-user-manager/src/test/java/com/webank/wedatasphpere/dss/user/service/impl/MetastoreCommandTest.java b/dss-user-manager/src/test/java/com/webank/wedatasphpere/dss/user/service/impl/MetastoreCommandTest.java new file mode 100644 index 0000000000000000000000000000000000000000..da1fcb91e28e5be16afaf8aa1b08c033fbbe3976 --- /dev/null +++ b/dss-user-manager/src/test/java/com/webank/wedatasphpere/dss/user/service/impl/MetastoreCommandTest.java @@ -0,0 +1,43 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + + +package com.webank.wedatasphpere.dss.user.service.impl; + + +import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody; +import org.junit.jupiter.api.Test; + +class MetastoreCommandTest { + + @Test +// @MethodSource("body") + void authorization() { + AuthorizationBody body = new AuthorizationBody(); + body.setUsername("anlexander"); + body.setPassword("123321"); + MetastoreCommand test = new MetastoreCommand(); + + try { + test.authorization(body); + } catch (Exception e) { + e.printStackTrace(); + } + + System.out.println("test method finished"); + } +} diff --git a/dss-user-manager/src/test/java/com/webank/wedatasphpere/dss/user/service/impl/WorkspaceCommandTest.java b/dss-user-manager/src/test/java/com/webank/wedatasphpere/dss/user/service/impl/WorkspaceCommandTest.java new file mode 100644 index 0000000000000000000000000000000000000000..68098eaaf4220e5f07a6c453dc56ea6746e17f86 --- /dev/null +++ b/dss-user-manager/src/test/java/com/webank/wedatasphpere/dss/user/service/impl/WorkspaceCommandTest.java @@ -0,0 +1,46 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * + */ + + +package com.webank.wedatasphpere.dss.user.service.impl; + + +import com.webank.wedatasphpere.dss.user.conf.DSSUserManagerConfig; +import com.webank.wedatasphpere.dss.user.dto.request.AuthorizationBody; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.params.provider.MethodSource; +import org.junit.jupiter.params.provider.ValueSource; + +class WorkspaceCommandTest { + + @Test +// @MethodSource("body") + void authorization() { + AuthorizationBody body = new AuthorizationBody(); + body.setUsername("luxl"); + body.setPassword("123321"); + WorkspaceCommand test = new WorkspaceCommand(); + try { + test.authorization(body); + } catch (Exception e) { + e.printStackTrace(); + } + System.out.println("test method finished"); + +// DSSUserManagerConfig.LOCAL_USER_ROOT_PATH.getValue(); + } +} diff --git a/eventchecker-appjoint/pom.xml b/eventchecker-appjoint/pom.xml index 6eab212ffa38ef6c35199195aed10fe8011175a5..bdf035e3db1ecb0c2038215e63b76a298059096a 100644 --- a/eventchecker-appjoint/pom.xml +++ b/eventchecker-appjoint/pom.xml @@ -22,7 +22,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 diff --git a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/AbstractEventCheckReceiver.java b/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/AbstractEventCheckReceiver.java index 4f0a85e07fac7aecf35ebad52e0703a4841a6e9b..fbd1c887ae5abab375894d00c232e7bdc70c5c64 100644 --- a/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/AbstractEventCheckReceiver.java +++ b/eventchecker-appjoint/src/main/java/com/webank/wedatasphere/dss/appjoint/schedulis/jobtype/service/AbstractEventCheckReceiver.java @@ -80,39 +80,62 @@ public class AbstractEventCheckReceiver extends AbstractEventCheck{ boolean result = false; String vNewMsgID = "-1"; PreparedStatement updatePstmt = null; + PreparedStatement pstmtForGetID = null; Connection 
msgConn = null; vNewMsgID = setConsumedMsg(props,log,consumedMsgInfo); try { if(StringUtils.isNotEmpty(vNewMsgID) && StringUtils.isNotBlank(vNewMsgID) && !"-1".equals(vNewMsgID)){ msgConn = getEventCheckerConnection(props,log); if(msgConn == null) return false; - int vProcessID = jobId; - String vReceiveTime = DateFormatUtils.format(new Date(), "yyyy-MM-dd HH:mm:ss");; - String sqlForUpdateMsg = "INSERT INTO event_status(receiver,topic,msg_name,receive_time,msg_id) VALUES(?,?,?,?,?) ON DUPLICATE KEY UPDATE receive_time=VALUES(receive_time),msg_id= CASE WHEN msg_id= " + lastMsgId + " THEN VALUES(msg_id) ELSE msg_id END"; - log.info("last message offset {} is:" + lastMsgId); - updatePstmt = msgConn.prepareCall(sqlForUpdateMsg); - updatePstmt.setString(1, receiver); - updatePstmt.setString(2, topic); - updatePstmt.setString(3, msgName); - updatePstmt.setString(4, vReceiveTime); - updatePstmt.setString(5, vNewMsgID); - int updaters = updatePstmt.executeUpdate(); - log.info("updateMsgOffset successful {} update result is:" + updaters); - if(updaters != 0){ - log.info("Received message successfully , update message status succeeded, consumed flow execution ID: " + vProcessID); - //return true after update success - result = true; + msgConn.setAutoCommit(false); + String sqlForReadMsgID = "SELECT msg_id FROM event_status WHERE receiver=? AND topic=? AND msg_name=? for update"; + pstmtForGetID = msgConn.prepareCall(sqlForReadMsgID); + pstmtForGetID.setString(1, receiver); + pstmtForGetID.setString(2, topic); + pstmtForGetID.setString(3, msgName); + ResultSet rs = pstmtForGetID.executeQuery(); + String nowLastMsgId = rs.last()==true ? 
rs.getString("msg_id"):"0"; + log.info("receive message successfully , Now check to see if the latest offset has changed ,nowLastMsgId is {} " + nowLastMsgId); + if("0".equals(nowLastMsgId) || nowLastMsgId.equals(lastMsgId)){ + + int vProcessID = jobId; + String vReceiveTime = DateFormatUtils.format(new Date(), "yyyy-MM-dd HH:mm:ss"); + String sqlForUpdateMsg = "INSERT INTO event_status(receiver,topic,msg_name,receive_time,msg_id) VALUES(?,?,?,?,?) ON DUPLICATE KEY UPDATE receive_time=VALUES(receive_time),msg_id= CASE WHEN msg_id= " + lastMsgId + " THEN VALUES(msg_id) ELSE msg_id END"; + log.info("last message offset {} is:" + lastMsgId); + updatePstmt = msgConn.prepareCall(sqlForUpdateMsg); + updatePstmt.setString(1, receiver); + updatePstmt.setString(2, topic); + updatePstmt.setString(3, msgName); + updatePstmt.setString(4, vReceiveTime); + updatePstmt.setString(5, vNewMsgID); + int updaters = updatePstmt.executeUpdate(); + log.info("updateMsgOffset successful {} update result is:" + updaters); + if(updaters != 0){ + log.info("Received message successfully , update message status succeeded, consumed flow execution ID: " + vProcessID); + //return true after update success + result = true; + }else{ + log.info("Received message successfully , update message status failed, consumed flow execution ID: " + vProcessID); + result = false; + } }else{ - log.info("Received message successfully , update message status failed, consumed flow execution ID: " + vProcessID); + log.info("the latest offset has changed , Keep waiting for the signal"); result = false; } + msgConn.commit(); }else{ result = false; } }catch (SQLException e){ log.error("Error update Msg Offset" + e); + try { + msgConn.rollback(); + } catch (SQLException ex) { + log.error("transaction rollback failed " + ex); + } return false; }finally { + closeQueryStmt(pstmtForGetID, log); closeQueryStmt(updatePstmt, log); closeConnection(msgConn, log); } diff --git a/examples/ch3/WordCount.jar 
b/examples/ch3/WordCount.jar new file mode 100644 index 0000000000000000000000000000000000000000..cbcf5803c46d88dc21fb7c149da4bae1f9107f08 Binary files /dev/null and b/examples/ch3/WordCount.jar differ diff --git a/examples/ch3/dept.txt b/examples/ch3/dept.txt new file mode 100644 index 0000000000000000000000000000000000000000..fe587aa7dc37474c96a308d5d03d3df9ec58c2d8 --- /dev/null +++ b/examples/ch3/dept.txt @@ -0,0 +1,4 @@ +10 ACCOUNTING 1700 +20 RESEARCH 1800 +30 SALES 1900 +40 OPERATIONS 1700 \ No newline at end of file diff --git a/examples/ch3/emp.txt b/examples/ch3/emp.txt new file mode 100644 index 0000000000000000000000000000000000000000..f8c77cd409e2f224f78af229d8188a16be1aedc5 --- /dev/null +++ b/examples/ch3/emp.txt @@ -0,0 +1,14 @@ +7369 SMITH CLERK 7902 1980-12-17 800.00 20 +7499 ALLEN SALESMAN 7698 1981-2-20 1600.00 300.00 30 +7521 WARD SALESMAN 7698 1981-2-22 1250.00 500.00 30 +7566 JONES MANAGER 7839 1981-4-2 2975.00 20 +7654 MARTIN SALESMAN 7698 1981-9-28 1250.00 1400.00 30 +7698 BLAKE MANAGER 7839 1981-5-1 2850.00 30 +7782 CLARK MANAGER 7839 1981-6-9 2450.00 10 +7788 SCOTT ANALYST 7566 1987-4-19 3000.00 20 +7839 KING PRESIDENT 1981-11-17 5000.00 10 +7844 TURNER SALESMAN 7698 1981-9-8 1500.00 0.00 30 +7876 ADAMS CLERK 7788 1987-5-23 1100.00 20 +7900 JAMES CLERK 7698 1981-12-3 950.00 30 +7902 FORD ANALYST 7566 1981-12-3 3000.00 20 +7934 MILLER CLERK 7782 1982-1-23 1300.00 10 \ No newline at end of file diff --git a/examples/ch3/emp1.txt b/examples/ch3/emp1.txt new file mode 100644 index 0000000000000000000000000000000000000000..92d107a178d1103724845fc9cfecfc353bf2a9ac --- /dev/null +++ b/examples/ch3/emp1.txt @@ -0,0 +1,5 @@ +7369 SMITH CLERK 7902 2020-1-17 800.00 20 +7499 ALLEN SALESMAN 7698 2020-1-20 1600.00 300.00 30 +7521 WARD SALESMAN 7698 2020-1-22 1250.00 500.00 30 +7566 JONES MANAGER 7839 2020-1-2 2975.00 20 +7654 MARTIN SALESMAN 7698 2020-1-28 1250.00 1400.00 30 \ No newline at end of file diff --git a/examples/ch3/emp2.txt 
b/examples/ch3/emp2.txt new file mode 100644 index 0000000000000000000000000000000000000000..d00a23876b75878771c46780240ae0afb0743497 --- /dev/null +++ b/examples/ch3/emp2.txt @@ -0,0 +1,5 @@ +7698 BLAKE MANAGER 7839 2020-2-1 2850.00 30 +7782 CLARK MANAGER 7839 2020-2-9 2450.00 10 +7788 SCOTT ANALYST 7566 2020-2-19 3000.00 20 +7839 KING PRESIDENT 2020-2-17 5000.00 10 +7844 TURNER SALESMAN 7698 2020-2-8 1500.00 0.00 30 \ No newline at end of file diff --git a/examples/ch3/emp3.txt b/examples/ch3/emp3.txt new file mode 100644 index 0000000000000000000000000000000000000000..8f62853f2729e1de1b4167c6639f62d199c5bcdb --- /dev/null +++ b/examples/ch3/emp3.txt @@ -0,0 +1,4 @@ +7876 ADAMS CLERK 7788 2020-3-23 1100.00 20 +7900 JAMES CLERK 7698 2020-3-3 950.00 30 +7902 FORD ANALYST 7566 2020-3-3 3000.00 20 +7934 MILLER CLERK 7782 2020-3-23 1300.00 10 \ No newline at end of file diff --git a/examples/ch3/rename.jar b/examples/ch3/rename.jar new file mode 100644 index 0000000000000000000000000000000000000000..738483a21ba68d16c44311765b3bf3644d4ccc35 Binary files /dev/null and b/examples/ch3/rename.jar differ diff --git a/images/zh_CN/chapter3/tests/hive-6.png b/images/zh_CN/chapter3/tests/hive-6.png new file mode 100644 index 0000000000000000000000000000000000000000..f1cb9d661504ba851a6c06339daf29b1a58f17e2 Binary files /dev/null and b/images/zh_CN/chapter3/tests/hive-6.png differ diff --git a/images/zh_CN/chapter3/tests/hive1.png b/images/zh_CN/chapter3/tests/hive1.png new file mode 100644 index 0000000000000000000000000000000000000000..19322869c0091824fe75f37480d92120cf35c746 Binary files /dev/null and b/images/zh_CN/chapter3/tests/hive1.png differ diff --git a/images/zh_CN/chapter3/tests/hive2.png b/images/zh_CN/chapter3/tests/hive2.png new file mode 100644 index 0000000000000000000000000000000000000000..8232e3b46dfb0103d9724e369454e045d24c2937 Binary files /dev/null and b/images/zh_CN/chapter3/tests/hive2.png differ diff --git a/images/zh_CN/chapter3/tests/hive3.png 
b/images/zh_CN/chapter3/tests/hive3.png new file mode 100644 index 0000000000000000000000000000000000000000..523f1e13cc8284eb6f6ee00a7f3bd9699581a63c Binary files /dev/null and b/images/zh_CN/chapter3/tests/hive3.png differ diff --git a/images/zh_CN/chapter3/tests/hive4.png b/images/zh_CN/chapter3/tests/hive4.png new file mode 100644 index 0000000000000000000000000000000000000000..77e26589e8b122ea3bd46e0c496f11cebba0a5b4 Binary files /dev/null and b/images/zh_CN/chapter3/tests/hive4.png differ diff --git a/images/zh_CN/chapter3/tests/hive5.png b/images/zh_CN/chapter3/tests/hive5.png new file mode 100644 index 0000000000000000000000000000000000000000..f2ecae30192fd0c6b3fa183d5bb4f4f3fa2cd8e5 Binary files /dev/null and b/images/zh_CN/chapter3/tests/hive5.png differ diff --git a/images/zh_CN/chapter3/tests/hive7.png b/images/zh_CN/chapter3/tests/hive7.png new file mode 100644 index 0000000000000000000000000000000000000000..efe7141071fe8330297a7a00af010a43e33e19a8 Binary files /dev/null and b/images/zh_CN/chapter3/tests/hive7.png differ diff --git a/images/zh_CN/chapter3/tests/home.png b/images/zh_CN/chapter3/tests/home.png new file mode 100644 index 0000000000000000000000000000000000000000..1564c5150faa8694485800bd52dd9839bcb1cc1e Binary files /dev/null and b/images/zh_CN/chapter3/tests/home.png differ diff --git a/images/zh_CN/chapter3/tests/udf-3.png b/images/zh_CN/chapter3/tests/udf-3.png new file mode 100644 index 0000000000000000000000000000000000000000..64bad9b7c486eff3de0e2f859b0b614b3ac76222 Binary files /dev/null and b/images/zh_CN/chapter3/tests/udf-3.png differ diff --git a/images/zh_CN/chapter3/tests/udf1.png b/images/zh_CN/chapter3/tests/udf1.png new file mode 100644 index 0000000000000000000000000000000000000000..0866f38457f1ec3fe393a56c0b1b1fd45b4ecc6d Binary files /dev/null and b/images/zh_CN/chapter3/tests/udf1.png differ diff --git a/images/zh_CN/chapter3/tests/udf2.png b/images/zh_CN/chapter3/tests/udf2.png new file mode 100644 index 
0000000000000000000000000000000000000000..8bb046f7b32c5b6f88d6681e8b03e3ec4c74c58d Binary files /dev/null and b/images/zh_CN/chapter3/tests/udf2.png differ diff --git a/plugins/azkaban/linkis-jobtype/pom.xml b/plugins/azkaban/linkis-jobtype/pom.xml index 1bd28b54c0d82aef1083004b013da25fcd309cc1..9d0d787b5a708a23686f0979b7861603a8236216 100644 --- a/plugins/azkaban/linkis-jobtype/pom.xml +++ b/plugins/azkaban/linkis-jobtype/pom.xml @@ -23,7 +23,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 com.webank.wedatasphere.dss linkis-jobtype diff --git a/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanAppJointLinkisJob.java b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanAppJointLinkisJob.java index e395567455dcd634cd6f41279ce5b9dbebd3cc1b..46064b6f64b78c9a38ac958187af1bbe17550e35 100644 --- a/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanAppJointLinkisJob.java +++ b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanAppJointLinkisJob.java @@ -19,6 +19,7 @@ package com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.job; import com.webank.wedatasphere.dss.linkis.node.execution.job.AbstractAppJointLinkisJob; import com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.conf.LinkisJobTypeConf; +import org.apache.commons.lang.StringUtils; /** @@ -29,6 +30,9 @@ public class AzkabanAppJointLinkisJob extends AbstractAppJointLinkisJob { @Override public String getSubmitUser() { + if (StringUtils.isEmpty(getJobProps().get(LinkisJobTypeConf.FLOW_SUBMIT_USER))){ + return getJobProps().get(LinkisJobTypeConf.PROXY_USER); + } return getJobProps().get(LinkisJobTypeConf.FLOW_SUBMIT_USER); } diff --git 
a/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanAppJointSignalSharedJob.java b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanAppJointSignalSharedJob.java index 01ebbf0b69a8b80ba02218031585366daf70293b..a24adcb69d5a1fdd6cabc9b7c11d69784e66124f 100644 --- a/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanAppJointSignalSharedJob.java +++ b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanAppJointSignalSharedJob.java @@ -17,17 +17,30 @@ package com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.job; +import com.webank.wedatasphere.dss.linkis.node.execution.job.JobSignalKeyCreator; import com.webank.wedatasphere.dss.linkis.node.execution.job.SignalSharedJob; import com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.conf.LinkisJobTypeConf; import java.util.Map; /** - * Created by peacewong on 2019/11/14. + * Created by johnnwang on 2019/11/14. 
*/ -public class AzkabanAppJointSignalSharedJob extends AzkabanAppJointLinkisSharedJob implements SignalSharedJob { +public class AzkabanAppJointSignalSharedJob extends AzkabanAppJointLinkisJob implements SignalSharedJob { + private JobSignalKeyCreator signalKeyCreator; + + @Override + public JobSignalKeyCreator getSignalKeyCreator() { + return this.signalKeyCreator; + } + + @Override + public void setSignalKeyCreator(JobSignalKeyCreator signalKeyCreator) { + this.signalKeyCreator = signalKeyCreator; + } + @Override public String getMsgSaveKey() { Map configuration = this.getConfiguration(); diff --git a/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanCommonLinkisJob.java b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanCommonLinkisJob.java index dd46a8bd2a904f47a39ed43653b1b611857fa9a7..cf398810c52ebeb556cbefef15f098a8a5a11a00 100644 --- a/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanCommonLinkisJob.java +++ b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanCommonLinkisJob.java @@ -19,6 +19,7 @@ package com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.job; import com.webank.wedatasphere.dss.linkis.node.execution.job.AbstractCommonLinkisJob; import com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.conf.LinkisJobTypeConf; +import org.apache.commons.lang.StringUtils; /** @@ -29,6 +30,9 @@ public class AzkabanCommonLinkisJob extends AbstractCommonLinkisJob { @Override public String getSubmitUser() { + if (StringUtils.isEmpty(getJobProps().get(LinkisJobTypeConf.FLOW_SUBMIT_USER))){ + return getJobProps().get(LinkisJobTypeConf.PROXY_USER); + } return getJobProps().get(LinkisJobTypeConf.FLOW_SUBMIT_USER); } diff --git 
a/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanJobSignalKeyCreator.java b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanJobSignalKeyCreator.java new file mode 100644 index 0000000000000000000000000000000000000000..30c6a6415f4dbae7c9799dac35710f7c7dcf6cbd --- /dev/null +++ b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkabanJobSignalKeyCreator.java @@ -0,0 +1,39 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ + +package com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.job; + +import com.webank.wedatasphere.dss.linkis.node.execution.job.Job; +import com.webank.wedatasphere.dss.linkis.node.execution.job.JobSignalKeyCreator; +import com.webank.wedatasphere.dss.linkis.node.execution.job.SignalSharedJob; +import com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.conf.LinkisJobTypeConf; + +public class AzkabanJobSignalKeyCreator implements JobSignalKeyCreator { + + @Override + public String getSignalKeyByJob(Job job) { + String projectId = job.getJobProps().get(LinkisJobTypeConf.PROJECT_ID); + String flowId = job.getJobProps().get(LinkisJobTypeConf.FLOW_NAME); + String flowExecId = job.getJobProps().get(LinkisJobTypeConf.FLOW_EXEC_ID); + return projectId + "." + flowId + "." 
+ flowExecId ; + } + + @Override + public String getSignalKeyBySignalSharedJob(SignalSharedJob job) { + return getSignalKeyByJob((Job)job); + } +} diff --git a/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkanbanBuilder.java b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkanbanBuilder.java index bcd88097bda42edab44b689f87b9f242e26c3e40..c38fac24404f89d5afa0cba596406e12f435c917 100644 --- a/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkanbanBuilder.java +++ b/plugins/azkaban/linkis-jobtype/src/main/java/com/webank/wedatasphere/dss/plugins/azkaban/linkis/jobtype/job/AzkanbanBuilder.java @@ -20,7 +20,10 @@ package com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.job; import com.webank.wedatasphere.dss.linkis.node.execution.conf.LinkisJobExecutionConfiguration; import com.webank.wedatasphere.dss.linkis.node.execution.entity.BMLResource; import com.webank.wedatasphere.dss.linkis.node.execution.exception.LinkisJobExecutionErrorException; +import com.webank.wedatasphere.dss.linkis.node.execution.execution.LinkisNodeExecution; +import com.webank.wedatasphere.dss.linkis.node.execution.execution.impl.LinkisNodeExecutionImpl; import com.webank.wedatasphere.dss.linkis.node.execution.job.*; +import com.webank.wedatasphere.dss.linkis.node.execution.parser.JobParamsParser; import com.webank.wedatasphere.dss.linkis.node.execution.utils.LinkisJobExecutionUtils; import com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.conf.LinkisJobTypeConf; import com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.utils.LinkisJobTypeUtils; @@ -29,17 +32,29 @@ import org.apache.commons.lang.StringUtils; import java.util.*; /** - * Created by peacewong on 2019/11/3. + * Created by johnnwang on 2019/11/3. 
*/ public class AzkanbanBuilder extends Builder{ private Map jobProps; + private JobSignalKeyCreator jobSignalKeyCreator = new AzkabanJobSignalKeyCreator(); + public AzkanbanBuilder setJobProps(Map jobProps) { this.jobProps = jobProps; return this; } + { + init(); + } + + private void init(){ + JobParamsParser jobParamsParser = new JobParamsParser(); + jobParamsParser.setSignalKeyCreator(jobSignalKeyCreator); + LinkisNodeExecutionImpl linkisNodeExecution = (LinkisNodeExecutionImpl)LinkisNodeExecutionImpl.getLinkisNodeExecution(); + linkisNodeExecution.registerJobParser(jobParamsParser); + } @Override protected String getJobType() { @@ -99,6 +114,7 @@ public class AzkanbanBuilder extends Builder{ return null; } else { AzkabanAppJointSignalSharedJob signalSharedJob = new AzkabanAppJointSignalSharedJob(); + signalSharedJob.setSignalKeyCreator(jobSignalKeyCreator); signalSharedJob.setJobProps(this.jobProps); return signalSharedJob; } diff --git a/plugins/azkaban/linkis-jobtype/src/main/resources/assembly.xml b/plugins/azkaban/linkis-jobtype/src/main/resources/assembly.xml index 72594e53c2ca1b222fcd628c653445598be9fd05..a5d8302db8cd2d1db3e5d900734eb345be07174b 100644 --- a/plugins/azkaban/linkis-jobtype/src/main/resources/assembly.xml +++ b/plugins/azkaban/linkis-jobtype/src/main/resources/assembly.xml @@ -59,15 +59,6 @@ unix - - ${basedir}/target - - *.jar - - 0777 - / - unix - ${basedir}/bin diff --git a/plugins/linkis/linkis-appjoint-entrance/bin/start-linkis-appjoint-entrance.sh b/plugins/linkis/linkis-appjoint-entrance/bin/start-linkis-appjoint-entrance.sh index 0128a4d1cde185fff60bb6239a2fc862d585f771..4436def35d0eeeb0cb448ef29744c7208f2cb6ad 100644 --- a/plugins/linkis/linkis-appjoint-entrance/bin/start-linkis-appjoint-entrance.sh +++ b/plugins/linkis/linkis-appjoint-entrance/bin/start-linkis-appjoint-entrance.sh @@ -1,36 +1,37 @@ #!/bin/bash - cd `dirname $0` cd .. 
-HOE=`pwd` - export DWS_ENTRANCE_HOE=$HOE +HOME=`pwd` -export DWS_ENTRANCE_PID=$HOE/bin/linkis-appjoint-entrance.pid +export SERVER_PID=$HOME/bin/linkis.pid +export SERVER_LOG_PATH=$HOME/logs +export SERVER_CLASS=com.webank.wedatasphere.linkis.DataWorkCloudApplication -if [[ -f "${DWS_ENTRANCE_PID}" ]]; then - pid=$(cat ${DWS_ENTRANCE_PID}) - if kill -0 ${pid} >/dev/null 2>&1; then - echo "Entrance is already running." - return 0; - fi +if test -z "$SERVER_HEAP_SIZE" +then + export SERVER_HEAP_SIZE="512M" fi -export DWS_ENTRANCE_LOG_PATH=$HOE/logs -export DWS_ENTRANCE_DEBUG="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=19959" -export DWS_ENTRANCE_HEAP_SIZE="2G" -export DWS_ENTRANCE_JAVA_OPTS="-Xms$DWS_ENTRANCE_HEAP_SIZE -Xmx$DWS_ENTRANCE_HEAP_SIZE -XX:+UseG1GC -XX:MaxPermSize=500m $DWS_ENTRANCE_DEBUG" - -cmd="nohup java $DWS_ENTRANCE_JAVA_OPTS -cp $HOE/conf:$HOE/lib/* com.webank.wedatasphere.linkis.DataWorkCloudApplication 2>&1 > $DWS_ENTRANCE_LOG_PATH/linkis.out &" -#echo "CMD IS $cmd" +if test -z "$SERVER_JAVA_OPTS" +then + export SERVER_JAVA_OPTS=" -Xmx$SERVER_HEAP_SIZE -XX:+UseG1GC -Xloggc:$HOME/logs/linkis-gc.log" +fi +if [[ -f "${SERVER_PID}" ]]; then + pid=$(cat ${SERVER_PID}) + if kill -0 ${pid} >/dev/null 2>&1; then + echo "Server is already running." + exit 1 + fi +fi -nohup java $DWS_ENTRANCE_JAVA_OPTS -cp $HOE/conf:$HOE/lib/* com.webank.wedatasphere.linkis.DataWorkCloudApplication 2>&1 > $DWS_ENTRANCE_LOG_PATH/linkis.out & +nohup java $SERVER_JAVA_OPTS -cp $HOME/conf:$HOME/lib/* $SERVER_CLASS 2>&1 > $SERVER_LOG_PATH/linkis.out & pid=$! if [[ -z "${pid}" ]]; then - echo "AppJoint Entrance start failed!" + echo "server linkis-appjoint-entrance start failed!" exit 1 else - echo "AppJoint Entrance start succeeded!" + echo "server linkis-appjoint-entrance start succeeded!" 
+ echo $pid > $SERVER_PID sleep 1 fi diff --git a/plugins/linkis/linkis-appjoint-entrance/bin/stop-linkis-appjoint-entrance.sh b/plugins/linkis/linkis-appjoint-entrance/bin/stop-linkis-appjoint-entrance.sh index f3aad1635df7b8a51705d8daf7242c83c0e77391..a1528811c78ecba3d46f1a9395d492d90f2ca3c7 100644 --- a/plugins/linkis/linkis-appjoint-entrance/bin/stop-linkis-appjoint-entrance.sh +++ b/plugins/linkis/linkis-appjoint-entrance/bin/stop-linkis-appjoint-entrance.sh @@ -4,9 +4,9 @@ cd `dirname $0` cd .. HOE=`pwd` -export DWS_ENTRANCE_PID=$HOE/bin/linkis-appjoint-entrance.pid +export DSS_ENTRANCE_PID=$HOE/bin/linkis.pid -function wait_for_DWS_ENGINE_MANAGER_to_die() { +function wait_for_DSS_ENGINE_MANAGER_to_die() { local pid local count pid=$1 @@ -33,15 +33,15 @@ function wait_for_DWS_ENGINE_MANAGER_to_die() { fi } -if [[ ! -f "${DWS_ENTRANCE_PID}" ]]; then +if [[ ! -f "${DSS_ENTRANCE_PID}" ]]; then echo "AppJoint Entrance is not running" else - pid=$(cat ${DWS_ENTRANCE_PID}) + pid=$(cat ${DSS_ENTRANCE_PID}) if [[ -z "${pid}" ]]; then echo "AppJoint Entrance is not running" else - wait_for_DWS_ENGINE_MANAGER_to_die $pid 40 - $(rm -f ${DWS_ENTRANCE_PID}) + wait_for_DSS_ENGINE_MANAGER_to_die $pid 40 + $(rm -f ${DSS_ENTRANCE_PID}) echo "AppJoint Entrance is stopped." 
fi fi diff --git a/plugins/linkis/linkis-appjoint-entrance/pom.xml b/plugins/linkis/linkis-appjoint-entrance/pom.xml index 8443216f9c125e6837cca2152df387de3114a128..bb7db1901845e97ed3e6a3eee220e3236296156b 100644 --- a/plugins/linkis/linkis-appjoint-entrance/pom.xml +++ b/plugins/linkis/linkis-appjoint-entrance/pom.xml @@ -22,7 +22,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 @@ -54,8 +54,21 @@ - - + + com.webank.wedatasphere.linkis + linkis-cloudRPC + ${linkis.version} + + + com.webank.wedatasphere.linkis + linkis-storage + ${linkis.version} + + + com.webank.wedatasphere.linkis + linkis-httpclient + ${linkis.version} + diff --git a/plugins/linkis/linkis-appjoint-entrance/src/main/assembly/distribution.xml b/plugins/linkis/linkis-appjoint-entrance/src/main/assembly/distribution.xml index d4a200502a0bf0aeb5f7bcb35f911d288cc016f7..b3ac1a0b59391151331fcee2fc29ad7032d0bafe 100644 --- a/plugins/linkis/linkis-appjoint-entrance/src/main/assembly/distribution.xml +++ b/plugins/linkis/linkis-appjoint-entrance/src/main/assembly/distribution.xml @@ -59,16 +59,16 @@ aopalliance:aopalliance:jar asm:asm:jar cglib:cglib:jar - com.amazonaws:aws-java-sdk-autoscaling:jar - com.amazonaws:aws-java-sdk-core:jar - com.amazonaws:aws-java-sdk-ec2:jar - com.amazonaws:aws-java-sdk-route53:jar - com.amazonaws:aws-java-sdk-sts:jar - com.amazonaws:jmespath-java:jar + + + + + + com.fasterxml.jackson.core:jackson-annotations:jar com.fasterxml.jackson.core:jackson-core:jar com.fasterxml.jackson.core:jackson-databind:jar - com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:jar + com.fasterxml.jackson.datatype:jackson-datatype-jdk8:jar com.fasterxml.jackson.datatype:jackson-datatype-jsr310:jar com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:jar @@ -84,7 +84,7 @@ com.google.code.gson:gson:jar com.google.guava:guava:jar com.google.inject:guice:jar - com.google.protobuf:protobuf-java:jar + com.netflix.archaius:archaius-core:jar com.netflix.eureka:eureka-client:jar 
com.netflix.eureka:eureka-core:jar @@ -100,7 +100,6 @@ com.netflix.ribbon:ribbon-loadbalancer:jar com.netflix.ribbon:ribbon-transport:jar com.netflix.servo:servo-core:jar - com.ning:async-http-client:jar com.sun.jersey.contribs:jersey-apache-client4:jar com.sun.jersey:jersey-client:jar com.sun.jersey:jersey-core:jar @@ -112,15 +111,15 @@ com.thoughtworks.xstream:xstream:jar com.webank.wedatasphere.linkis:linkis-common:jar com.webank.wedatasphere.linkis:linkis-module:jar - commons-beanutils:commons-beanutils:jar - commons-beanutils:commons-beanutils-core:jar - commons-cli:commons-cli:jar - commons-codec:commons-codec:jar - commons-collections:commons-collections:jar - commons-configuration:commons-configuration:jar - commons-daemon:commons-daemon:jar - commons-dbcp:commons-dbcp:jar - commons-digester:commons-digester:jar + + + + + + + + + commons-httpclient:commons-httpclient:jar commons-io:commons-io:jar commons-jxpath:commons-jxpath:jar @@ -155,11 +154,9 @@ joda-time:joda-time:jar log4j:log4j:jar mysql:mysql-connector-java:jar - net.databinder.dispatch:dispatch-core_2.11:jar - net.databinder.dispatch:dispatch-json4s-jackson_2.11:jar org.antlr:antlr-runtime:jar org.antlr:stringtemplate:jar - org.apache.commons:commons-compress:jar + org.apache.commons:commons-math:jar org.apache.commons:commons-math3:jar org.apache.curator:curator-client:jar @@ -169,13 +166,13 @@ org.apache.directory.api:api-util:jar org.apache.directory.server:apacheds-i18n:jar org.apache.directory.server:apacheds-kerberos-codec:jar - org.apache.hadoop:hadoop-annotations:jar - org.apache.hadoop:hadoop-auth:jar - org.apache.hadoop:hadoop-common:jar - org.apache.hadoop:hadoop-hdfs:jar - org.apache.htrace:htrace-core:jar - org.apache.httpcomponents:httpclient:jar - org.apache.httpcomponents:httpcore:jar + + + + + + + org.apache.logging.log4j:log4j-api:jar org.apache.logging.log4j:log4j-core:jar org.apache.logging.log4j:log4j-jul:jar @@ -194,7 +191,6 @@ org.eclipse.jetty:jetty-continuation:jar 
org.eclipse.jetty:jetty-http:jar org.eclipse.jetty:jetty-io:jar - org.eclipse.jetty:jetty-jndi:jar org.eclipse.jetty:jetty-plus:jar org.eclipse.jetty:jetty-security:jar org.eclipse.jetty:jetty-server:jar @@ -243,7 +239,6 @@ org.json4s:json4s-ast_2.11:jar org.json4s:json4s-core_2.11:jar org.json4s:json4s-jackson_2.11:jar - org.jsoup:jsoup:jar org.jvnet.mimepull:mimepull:jar org.jvnet:tiger-types:jar org.latencyutils:LatencyUtils:jar @@ -298,7 +293,7 @@ org.springframework:spring-webmvc:jar org.tukaani:xz:jar org.yaml:snakeyaml:jar - software.amazon.ion:ion-java:jar + xerces:xercesImpl:jar xmlenc:xmlenc:jar xmlpull:xmlpull:jar diff --git a/plugins/linkis/linkis-appjoint-entrance/src/main/scala/com/webank/wedatasphere/dss/linkis/appjoint/entrance/execute/AppJointEntranceEngine.scala b/plugins/linkis/linkis-appjoint-entrance/src/main/scala/com/webank/wedatasphere/dss/linkis/appjoint/entrance/execute/AppJointEntranceEngine.scala index 26e4305d494c4988542931645c46af694034ce6b..e16c9b93e574305f62c8f48d186ac77220cbb1d3 100644 --- a/plugins/linkis/linkis-appjoint-entrance/src/main/scala/com/webank/wedatasphere/dss/linkis/appjoint/entrance/execute/AppJointEntranceEngine.scala +++ b/plugins/linkis/linkis-appjoint-entrance/src/main/scala/com/webank/wedatasphere/dss/linkis/appjoint/entrance/execute/AppJointEntranceEngine.scala @@ -31,6 +31,7 @@ import com.webank.wedatasphere.dss.linkis.appjoint.entrance.job.AppJointExecuteR import com.webank.wedatasphere.linkis.common.exception.ErrorException import com.webank.wedatasphere.linkis.common.utils.{Logging, Utils} import com.webank.wedatasphere.linkis.entrance.execute.{EngineExecuteAsynReturn, EntranceEngine, EntranceJob} +import com.webank.wedatasphere.linkis.entrance.interceptor.impl.CustomVariableUtils import com.webank.wedatasphere.linkis.protocol.engine.{JobProgressInfo, RequestTask} import com.webank.wedatasphere.linkis.protocol.query.RequestPersistTask import com.webank.wedatasphere.linkis.scheduler.executer._ @@ -156,6 
+157,9 @@ class AppJointEntranceEngine(properties: util.Map[String, Any]) val nodeType = nodeContext.getAppJointNode.getNodeType val realAppJointType = if (nodeType.contains(".")) nodeType.substring(0, nodeType.indexOf(".")) else nodeType val appJoint = AppJointManager.getAppJoint(realAppJointType) + if((realAppJointType.toLowerCase()).contains("datacheck")){ + replaceCustomVariables(nodeContext.getRuntimeMap) + } val user = if (null != runTimeMap.get("user")) runTimeMap.get("user").toString else null val session = if (StringUtils.isNotEmpty(user)){ if (appJoint.getSecurityService != null) appJoint.getSecurityService.login(user) else null @@ -189,8 +193,19 @@ class AppJointEntranceEngine(properties: util.Map[String, Any]) ErrorExecuteResponse(s"cannot do this executeRequest $executeRequest", new ErrorException(80056, s"cannot do this executeRequest $executeRequest")) } + private def replaceCustomVariables(runTimeMap:java.util.Map[String, Object]):Unit = { + val key = "check.object" + val value:String = if (null != runTimeMap.get(key)) runTimeMap.get(key).toString else "" + val task = new RequestPersistTask + task.setExecutionCode(value) + task.setParams(new util.HashMap[String, Object]()) + val (result, code) = CustomVariableUtils.replaceCustomVar(task, "sql") + logger.info(s"after code replace code is $code") + if (result) runTimeMap(key) = code + } } + case class AppJointEntranceExecuteException(errMsg:String) extends ErrorException(70046, errMsg) class AppJointEntranceAsyncExecuteResponse extends AsynReturnExecuteResponse with Logging{ diff --git a/pom.xml b/pom.xml index 000a11f38ed86a2886d6821cb7b97d7c78af7817..53346c0ffcc63c6aaad3c09a8fa79815c1a1a53a 100644 --- a/pom.xml +++ b/pom.xml @@ -23,7 +23,7 @@ pom com.webank.wedatasphere.dss dss - 0.5.0 + 0.9.1 dss-common @@ -44,11 +44,12 @@ plugins/azkaban/linkis-jobtype plugins/linkis/linkis-appjoint-entrance assembly + dss-user-manager - 0.5.0 - 0.9.1 + 0.9.1 + 0.9.4 2.11.8 1.8 3.3.3 diff --git 
a/qualitis-appjoint/appjoint/pom.xml b/qualitis-appjoint/appjoint/pom.xml index 5082eafa6a49315e9e4ea56891b8509643ecba73..330b5cfd2f9e50dfba014ee14a5fc76d53c49cbf 100644 --- a/qualitis-appjoint/appjoint/pom.xml +++ b/qualitis-appjoint/appjoint/pom.xml @@ -5,7 +5,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 diff --git a/qualitis-appjoint/appjoint/src/main/java/com.webank/wedatasphere/appjoint/QualitisNodeExecution.java b/qualitis-appjoint/appjoint/src/main/java/com.webank/wedatasphere/appjoint/QualitisNodeExecution.java index dbace5346051f1d857f0fd4c9ad575c1117dea38..f2b0bbc38998bf114b7d9417cf3c4b49cb68fd92 100644 --- a/qualitis-appjoint/appjoint/src/main/java/com.webank/wedatasphere/appjoint/QualitisNodeExecution.java +++ b/qualitis-appjoint/appjoint/src/main/java/com.webank/wedatasphere/appjoint/QualitisNodeExecution.java @@ -58,7 +58,7 @@ public class QualitisNodeExecution extends LongTermNodeExecution { @Override public boolean canExecute(AppJointNode appJointNode, NodeContext context, Session session) { - return false; + return appJointNode.getNodeType().toLowerCase().contains("qualitis"); } @Override @@ -68,8 +68,10 @@ public class QualitisNodeExecution extends LongTermNodeExecution { String filter = (String) map.get("filter"); String executionUser = (String) map.get("executeUser"); String createUser = (String) map.get("user"); - Long groupId = Long.valueOf((Integer) map.get("ruleGroupId")); - + Map map1 = ((AppJointNode)appJointNode).getJobContent(); + String id = map1.get("ruleGroupId").toString(); + float f = Float.valueOf(id); + Long groupId = (long)f; QualitisSubmitRequest submitRequest = new QualitisSubmitRequest(); submitRequest.setCreateUser(createUser); submitRequest.setExecutionUser(executionUser); diff --git a/qualitis-appjoint/appjoint/src/main/java/com.webank/wedatasphere/project/QualitisProjectServiceImpl.java b/qualitis-appjoint/appjoint/src/main/java/com.webank/wedatasphere/project/QualitisProjectServiceImpl.java index 
c65273eb1a8188aca8f7742defb6e974252cba6e..0c3cde2b9139c147796449a42b11666b5379a2d5 100644 --- a/qualitis-appjoint/appjoint/src/main/java/com.webank/wedatasphere/project/QualitisProjectServiceImpl.java +++ b/qualitis-appjoint/appjoint/src/main/java/com.webank/wedatasphere/project/QualitisProjectServiceImpl.java @@ -7,7 +7,7 @@ import com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException; import com.webank.wedatasphere.dss.appjoint.service.AppJointUrlImpl; import com.webank.wedatasphere.dss.appjoint.service.ProjectService; import com.webank.wedatasphere.dss.appjoint.service.session.Session; -import com.webank.wedatasphere.dss.common.entity.project.DWSProject; +import com.webank.wedatasphere.dss.common.entity.project.DSSProject; import com.webank.wedatasphere.dss.common.entity.project.Project; import org.apache.commons.lang.RandomStringUtils; import org.slf4j.Logger; @@ -60,10 +60,10 @@ public class QualitisProjectServiceImpl extends AppJointUrlImpl implements Proje try { QualitisAddProjectRequest qualitisAddProjectRequest = new QualitisAddProjectRequest(); - DWSProject dwsProject = (DWSProject) project; + DSSProject dssProject = (DSSProject) project; qualitisAddProjectRequest.setProjectName(project.getName()); qualitisAddProjectRequest.setDescription(project.getDescription()); - qualitisAddProjectRequest.setUsername(dwsProject.getUserName()); + qualitisAddProjectRequest.setUsername(dssProject.getUserName()); RestTemplate restTemplate = new RestTemplate(); HttpEntity entity = generateEntity(qualitisAddProjectRequest); @@ -73,7 +73,7 @@ public class QualitisProjectServiceImpl extends AppJointUrlImpl implements Proje String responseStatus = (String) response.get("code"); if (FAILURE_CODE.equals(responseStatus)) { // Send request to auto create qualitis user - autoAddUser(restTemplate, dwsProject.getUserName()); + autoAddUser(restTemplate, dssProject.getUserName()); // restart to create project response = createProjectReal(restTemplate, entity); @@ -142,9 
+142,9 @@ public class QualitisProjectServiceImpl extends AppJointUrlImpl implements Proje try { QualitisDeleteProjectRequest qualitisDeleteProjectRequest = new QualitisDeleteProjectRequest(); - DWSProject dwsProject = (DWSProject) project; + DSSProject dssProject = (DSSProject) project; qualitisDeleteProjectRequest.setProjectId(project.getId()); - qualitisDeleteProjectRequest.setUsername(dwsProject.getUserName()); + qualitisDeleteProjectRequest.setUsername(dssProject.getUserName()); RestTemplate restTemplate = new RestTemplate(); HttpEntity entity = generateEntity(qualitisDeleteProjectRequest); @@ -183,11 +183,11 @@ public class QualitisProjectServiceImpl extends AppJointUrlImpl implements Proje try { QualitisUpdateProjectRequest qualitisUpdateProjectRequest = new QualitisUpdateProjectRequest(); - DWSProject dwsProject = (DWSProject) project; + DSSProject dssProject = (DSSProject) project; qualitisUpdateProjectRequest.setProjectId(project.getId()); qualitisUpdateProjectRequest.setProjectName(project.getName()); qualitisUpdateProjectRequest.setDescription(project.getDescription()); - qualitisUpdateProjectRequest.setUsername(dwsProject.getUserName()); + qualitisUpdateProjectRequest.setUsername(dssProject.getUserName()); RestTemplate restTemplate = new RestTemplate(); HttpEntity entity = generateEntity(qualitisUpdateProjectRequest); diff --git a/sendemail-appjoint/sendemail-core/pom.xml b/sendemail-appjoint/sendemail-core/pom.xml index 7a76cc235e2466918b0c78291f2c3819b5777de9..3739647f0a0d6b3fb3cd383e3b48604830874a4e 100644 --- a/sendemail-appjoint/sendemail-core/pom.xml +++ b/sendemail-appjoint/sendemail-core/pom.xml @@ -22,7 +22,7 @@ dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 diff --git a/visualis-appjoint/appjoint/pom.xml b/visualis-appjoint/appjoint/pom.xml index 4ac917e906a5ae9c0846311827b64a0ca96dcdf9..e572638daf35bb5e41039501cf0c61f7808f35bf 100644 --- a/visualis-appjoint/appjoint/pom.xml +++ b/visualis-appjoint/appjoint/pom.xml @@ -22,7 +22,7 @@ 
dss com.webank.wedatasphere.dss - 0.5.0 + 0.9.1 4.0.0 @@ -45,6 +45,20 @@ true + + com.webank.wedatasphere.linkis + linkis-httpclient + ${linkis.version} + provided + true + + + + net.databinder.dispatch + dispatch-core_2.11 + 0.12.3 + + diff --git a/visualis-appjoint/appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/visualis/execution/VisualisNodeExecution.scala b/visualis-appjoint/appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/visualis/execution/VisualisNodeExecution.scala index 9de32921cf5e8d1f5313ebe7ec1a5a804ea5921b..fa2575726313a6c8d565e992f48295064d72d89b 100644 --- a/visualis-appjoint/appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/visualis/execution/VisualisNodeExecution.scala +++ b/visualis-appjoint/appjoint/src/main/scala/com/webank/wedatasphere/dss/appjoint/visualis/execution/VisualisNodeExecution.scala @@ -17,7 +17,7 @@ package com.webank.wedatasphere.dss.appjoint.visualis.execution -import java.io.ByteArrayOutputStream +import java.io.{ByteArrayOutputStream, InputStream} import java.util import java.util.Base64 @@ -28,25 +28,34 @@ import com.webank.wedatasphere.dss.appjoint.service.session.Session import com.webank.wedatasphere.dss.appjoint.visualis.execution.VisualisNodeExecutionConfiguration._ import com.webank.wedatasphere.linkis.common.exception.ErrorException import com.webank.wedatasphere.linkis.common.log.LogUtils -import com.webank.wedatasphere.linkis.common.utils.{HttpClient, Logging, Utils} +import com.webank.wedatasphere.linkis.common.utils.{Logging, Utils} import com.webank.wedatasphere.linkis.storage.{LineMetaData, LineRecord} import org.apache.commons.io.IOUtils import scala.collection.JavaConversions.mapAsScalaMap -import scala.concurrent.ExecutionContext +import scala.concurrent.duration.Duration +import scala.concurrent.{Await, ExecutionContext} +import dispatch._ +import org.json4s.{DefaultFormats, Formats} /** * Created by enjoyyin on 2019/10/12. 
*/ -class VisualisNodeExecution extends NodeExecution with HttpClient with Logging { - - override protected implicit val executors: ExecutionContext = Utils.newCachedExecutionContext(VISUALIS_THREAD_MAX.getValue, getName + "-NodeExecution-Thread", true) +class VisualisNodeExecution extends NodeExecution with Logging { private val DISPLAY = "display" private val DASHBOARD = "dashboard" var basicUrl:String = _ + protected implicit val executors: ExecutionContext = Utils.newCachedExecutionContext(VISUALIS_THREAD_MAX.getValue, getName + "-NodeExecution-Thread", true) + protected implicit val formats: Formats = DefaultFormats + + private implicit def svc(url: String): Req = + dispatch.url(url) + + + override def getBaseUrl: String = this.basicUrl override def setBaseUrl(basicUrl: String): Unit = this.basicUrl = basicUrl @@ -95,6 +104,17 @@ class VisualisNodeExecution extends NodeExecution with HttpClient with Logging { appJointResponse } + def download(url: String, queryParams: Map[String, String], headerParams: Map[String, String], + write: InputStream => Unit, + paths: String*): Unit = { + var req = url.GET + if(headerParams != null && headerParams.nonEmpty) req = req <:< headerParams + if(queryParams != null) queryParams.foreach{ case (k, v) => req = req.addQueryParameter(k, v)} + if(paths != null) paths.filter(_ != null).foreach(p => req = req / p) + val response = Http(req OK as.Response(_.getResponseBodyAsStream)).map(write) + Await.result(response, Duration.Inf) + } + private def getRealId(displayId:String):Int = { Utils.tryCatch{ val f = java.lang.Float.parseFloat(displayId) diff --git a/web/config.sh b/web/config.sh index 0ebe48dbe606e0fe2720110a9739eaf591ff3c90..864e492715b00d9d9968835f5bdfc8b03ed56a5c 100644 --- a/web/config.sh +++ b/web/config.sh @@ -1,8 +1,8 @@ -#Configuring front-end ports -dss_port="8088" +#dss web port +dss_web_port="8088" -#URL of the backend linkis gateway -linkis_url="http://localhost:20401" +#dss web access linkis gateway address
+linkis_gateway_url="http://localhost:9001" -#dss ip address -dss_ipaddr=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}') +#dss nginx ip +dss_nginx_ip=$(ip addr | awk '/^[0-9]+: / {}; /inet.*global/ {print gensub(/(.*)\/(.*)/, "\\1", "g", $2)}'|awk 'NR==1') diff --git a/web/install.sh b/web/install.sh index 01a7b33223ba5ffba5fe3d979f76cb899928449c..9a5a410c6ac25c94d76964ebc9533b411da8193c 100644 --- a/web/install.sh +++ b/web/install.sh @@ -1,12 +1,22 @@ #!/bin/bash #current path -workDir=$(cd `dirname $0`; pwd) +shellDir=`dirname $0` +workDir=`cd ${shellDir}/..;pwd` + +echo "dss web install start" + +dss_web_port=$DSS_WEB_PORT +linkis_gateway_url=$LINKIS_GATEWAY_URL +dss_nginx_ip=$DSS_NGINX_IP +#linkis_eureka_url=$LINKIS_EUREKA_URL + +if [[ "$dss_web_port" == "" ]]; then +source ${workDir}/conf/config.sh +fi -echo "dss front-end deployment script" -source $workDir/config.sh # front-end files directory, defaults to the unpack directory dss_basepath=$workDir @@ -44,11 +54,11 @@ version=`cat /etc/redhat-release|sed -r 's/.* ([0-9]+)\..*/\1/'` echo "========================================================================Configuration info=======================================================================" -echo "front-end access port: ${dss_port}" -echo "backend Linkis address: ${linkis_url}" -echo "static files path: ${dss_basepath}/dist" -echo "current path: ${workDir}" -echo "local ip: ${dss_ipaddr}" +echo "DSS web access port: ${dss_web_port}" +echo "Linkis gateway address: ${linkis_gateway_url}" +echo "DSS web static files path: ${dss_basepath}/dist" +echo "DSS web install path: ${workDir}" +echo "DSS nginx ip: ${dss_nginx_ip}" echo "========================================================================Configuration info=======================================================================" echo "" @@ -56,19 +66,18 @@ echo "" # create the config file and configure nginx dssConf(){ - - s_host='$host' + s_host='$host' s_remote_addr='$remote_addr' s_proxy_add_x_forwarded_for='$proxy_add_x_forwarded_for' s_http_upgrade='$http_upgrade' - echo " + sudo sh -c "echo ' server { - listen $dss_port;# access port + listen
$dss_web_port;# access port server_name localhost; #charset koi8-r; #access_log /var/log/nginx/host.access.log main; location /dss/visualis { - root ${dss_basepath}/dss/visualis; # static files directory + root ${dss_basepath}; # static files directory autoindex on; } location / { @@ -76,14 +85,14 @@ dssConf(){ index index.html index.html; } location /ws { - proxy_pass $linkis_url;#backend Linkis address + proxy_pass $linkis_gateway_url;#backend Linkis address proxy_http_version 1.1; proxy_set_header Upgrade $s_http_upgrade; proxy_set_header Connection "upgrade"; } location /api { - proxy_pass $linkis_url; #backend Linkis address + proxy_pass $linkis_gateway_url; #backend Linkis address proxy_set_header Host $s_host; proxy_set_header X-Real-IP $s_remote_addr; proxy_set_header x_real_ipP $s_remote_addr; @@ -105,7 +114,7 @@ dssConf(){ root /usr/share/nginx/html; } } - " > /etc/nginx/conf.d/dss.conf + ' > /etc/nginx/conf.d/dss.conf" } @@ -114,28 +123,28 @@ centos7(){ # check whether nginx is installed #sudo rpm -Uvh http://nginx.org/packages/centos/7/noarch/RPMS/nginx-release-centos-7-0.el7.ngx.noarch.rpm sudo yum install -y nginx - echo "nginx installed successfully" + sudo echo "nginx installed successfully" # configure nginx dssConf # fix the 0.0.0.0:8888 issue - yum -y install policycoreutils-python - semanage port -a -t http_port_t -p tcp $dss_port + sudo yum -y install policycoreutils-python + sudo semanage port -a -t http_port_t -p tcp $dss_web_port # open the front-end access port - firewall-cmd --zone=public --add-port=$dss_port/tcp --permanent + sudo firewall-cmd --zone=public --add-port=$dss_web_port/tcp --permanent # restart the firewall - firewall-cmd --reload + sudo firewall-cmd --reload # start nginx - systemctl restart nginx + sudo systemctl restart nginx # adjust the SELinux settings - sed -i "s/SELINUX=enforcing/SELINUX=disabled/g" /etc/selinux/config + sudo sed -i "s/SELINUX=enforcing/SELINUX=disabled/g" /etc/selinux/config # apply temporarily - setenforce 0 + sudo setenforce 0 } @@ -144,7 +153,7 @@ centos6(){ # yum S_basearch='$basearch' S_releasever='$releasever' - echo " + sudo echo " [nginx] name=nginx repo
baseurl=http://nginx.org/packages/centos/$E_releasever/$S_basearch/ @@ -153,31 +162,31 @@ " >> /etc/yum.repos.d/nginx.repo # install nginx - yum install nginx -y + sudo yum install nginx -y # configure nginx dssConf # firewall - S_iptables=`lsof -i:$dss_port | wc -l` + S_iptables=`lsof -i:$dss_web_port | wc -l` if [ "$S_iptables" -gt "0" ];then # port already open, just restart the firewall - service iptables restart + sudo service iptables restart else # port not open yet, add it and restart the firewall - iptables -I INPUT 5 -i eth0 -p tcp --dport $dss_port -m state --state NEW,ESTABLISHED -j ACCEPT - service iptables save - service iptables restart + sudo iptables -I INPUT 5 -i eth0 -p tcp --dport $dss_web_port -m state --state NEW,ESTABLISHED -j ACCEPT + sudo service iptables save + sudo service iptables restart fi # start - /etc/init.d/nginx start + sudo /etc/init.d/nginx start # adjust the SELinux settings - sed -i "s/SELINUX=enforcing/SELINUX=disabled/g" /etc/selinux/config + sudo sed -i "s/SELINUX=enforcing/SELINUX=disabled/g" /etc/selinux/config # apply temporarily - setenforce 0 + sudo setenforce 0 } @@ -190,6 +199,11 @@ fi if [[ $version -eq 7 ]]; then centos7 fi -echo 'To install the visualis front-end: if you build the DSS front-end package yourself, place the visualis front-end package at '$dss_basepath/dss/visualis' for automated installation:' + +if !
test -e $dss_basepath/dss/visualis/build.zip; then +echo "Error*************: when building and installing DSS WEB yourself, place the visualis front-end package build.zip under $dss_basepath/dss/visualis for automated installation" +exit 1 +fi + cd $dss_basepath/dss/visualis;unzip -o build.zip > /dev/null -echo "Open in your browser: http://${dss_ipaddr}:${dss_port}" +#echo "Open in your browser: http://${dss_nginx_ip}:${dss_web_port}" diff --git a/web/package.json b/web/package.json index 09814a5fbab2d8b7a0729119e2e78b0c606de3d5..f0f6dd869558b05ca7280d07bd3b3152e1ca9d30 100644 --- a/web/package.json +++ b/web/package.json @@ -1,6 +1,6 @@ { "name": "dataspherestudio", - "version": "0.5.0", + "version": "0.9.1", "private": true, "scripts": { "serve": "vue-cli-service serve", @@ -35,7 +35,7 @@ "md5": "^2.2.1", "mitt": "^1.1.3", "moment": "^2.22.2", - "monaco-editor": "^0.15.1", + "monaco-editor": "^0.20.0", "pinyin": "^2.9.0", "qs": "^6.9.1", "reconnecting-websocket": "^4.1.10", diff --git a/web/src/assets/iconfont/font-dws-icon.svg b/web/src/assets/iconfont/font-dws-icon.svg old mode 100644 new mode 100755 index 776dd6f8c9f3b83e2b54f6f34c9072a14cc52051..2b1c8c606c9950863ee0fc708e40c55cdf70799e --- a/web/src/assets/iconfont/font-dws-icon.svg +++ b/web/src/assets/iconfont/font-dws-icon.svg @@ -1,85 +1,103 @@ -Generated by IcoMoon + + + +{ + "fontFamily": "font-dws-icon", + "description": "Font generated by IcoMoon.", + "majorVersion": 1, + "minorVersion": 0, + "version": "Version 1.0", + "fontId": "font-dws-icon", + "psName": "font-dws-icon", + "subFamily": "Regular", + "fullName": "font-dws-icon" +} + [remainder of SVG hunk omitted: the glyph markup did not survive extraction, leaving only bare +/- markers] \ No newline at end of file diff --git a/web/src/assets/iconfont/font-dws-icon.ttf b/web/src/assets/iconfont/font-dws-icon.ttf old mode 100644 new mode 100755
index ad3754489b4f4908f51cb9dcf7cbd796c826e206..ddfdc404d24a2ba5f5d5f49320bd116c45aecccf Binary files a/web/src/assets/iconfont/font-dws-icon.ttf and b/web/src/assets/iconfont/font-dws-icon.ttf differ diff --git a/web/src/assets/iconfont/font-dws-icon.woff b/web/src/assets/iconfont/font-dws-icon.woff old mode 100644 new mode 100755 index 978571427d8bf8f0e5f9b0c3bf62fa6516b907ea..efd6d59b70bbc7e4a718d21dc4967ad3a4a00eec Binary files a/web/src/assets/iconfont/font-dws-icon.woff and b/web/src/assets/iconfont/font-dws-icon.woff differ diff --git a/web/src/assets/styles/iconfonts.scss b/web/src/assets/styles/iconfonts.scss index 4fb16747f6680c9fb9c1d6dd230586e73382136d..551929385cabb3bb5e7b3e08efe05128d77f215a 100644 --- a/web/src/assets/styles/iconfonts.scss +++ b/web/src/assets/styles/iconfonts.scss @@ -14,21 +14,17 @@ * limitations under the License. * */ - @import 'variables.scss'; - @font-face { font-family: 'font-dws-icon'; - src: url('../iconfont/font-dws-icon.eot?fkqrnv'); - src: url('../iconfont/font-dws-icon.eot?fkqrnv#iefix') format('embedded-opentype'), - url('../iconfont/font-dws-icon.ttf?fkqrnv') format('truetype'), - url('../iconfont/font-dws-icon.woff?fkqrnv') format('woff'), - url('../iconfont/font-dws-icon.svg?fkqrnv#font-dws-icon') format('svg'); + src: url('../iconfont/font-dws-icon.eot?fkqrnv'); + src: url('../iconfont/font-dws-icon.eot?fkqrnv#iefix') format('embedded-opentype'), url('../iconfont/font-dws-icon.ttf?fkqrnv') format('truetype'), url('../iconfont/font-dws-icon.woff?fkqrnv') format('woff'), url('../iconfont/font-dws-icon.svg?fkqrnv#font-dws-icon') format('svg'); font-weight: normal; font-style: normal; } -[class^="fi-"], [class*=" fi-"] { +[class*=" fi-"], +[class^="fi-"] { /* use !important to prevent issues with browser extensions that change fonts */ font-family: 'font-dws-icon' !important; font-style: normal; @@ -45,293 +41,377 @@ .fi-export:before { content: "\e904"; } + .fi-download:before { content: "\e905"; } + .fi-quit:before { 
content: "\e906"; } + .fi-search:before { content: "\e907"; } + .fi-undo:before { content: "\e908"; } + .fi-redo:before { content: "\e909"; } + .fi-format:before { content: "\e90a"; } + .fi-play:before { content: "\e90b"; } + .fi-stop:before { content: "\e90c"; } + .fi-save:before { content: "\e90d"; } + .fi-disconnect:before { content: "\e90e"; } + .fi-warn:before { content: "\e90f"; } + .fi-expand-right:before { content: "\e910"; } + .fi-cross:before { content: "\e911"; } + .fi-tick:before { content: "\e912"; } + .fi-dir-fold:before { content: "\e913"; } + .fi-dir-unfold:before { content: "\e914"; } + .fi-more-things:before { content: "\e915"; } + .fi-radio-on2:before { content: "\e918"; } + .fi-ide:before { content: "\e91d"; } + .fi-hivedb.open:before { content: "\e91e"; } + .fi-hivedb:before { content: "\e91f"; } + .fi-disk-o:before { content: "\e920"; } + .fi-disk:before { content: "\e921"; } + .fi-project-o:before { content: "\e922"; } + .fi-project:before { content: "\e923"; } + .fi-caret-down:before { content: "\e924"; } + .fi-caret-right:before { content: "\e925"; } + .fi-folder:before { content: "\e926"; padding: 0 2px; font-size: 14px; } + .fi-folder-o:before { content: "\e927"; padding: 0 2px; font-size: 14px; } + .fi-file:before { content: "\e928"; font-size: 16px; color: gray; } + .fi-file-o:before { content: "\e929"; font-size: 16px; padding: 0 2px; } + .fi-logo:before { - content: "\e930" + content: "\e930"; } + .fi-table:before { content: "\e931"; } + .fi-table.open:before { content: "\e931"; color: gray; } + .fi-field:before { content: "\e932"; font-size: 14px; color: gray; } + .fi-field.open:before { content: "\e932"; font-size: 14px; color: gray; } + .fi-open-in:before { - content: "\e92d" + content: "\e92d"; } + .fi-dock-show:before { - content: "\e92e" + content: "\e92e"; } + .fi-dock-hide:before { - content: "\e92f" + content: "\e92f"; } + .fi-hive:before { content: "\e93a"; color: #f4cf2a; font-size: 16px; } + .fi-spark:before { content: 
"\e93b"; color: $warning-color; font-size: 16px; } + .fi-scala:before { content: "\e93c"; color: $error-color; font-size: 16px; } + .fi-jdbc:before { content: "\e93d"; font-size: 16px; } + .fi-python:before { content: "\e93e"; color: #3573a6; font-size: 16px; font-weight: bold; } + .fi-spark-python:before { color: #3573a6; content: "\e93f"; font-size: 18px; } + .fi-storage:before { content: "\e940"; color: #4db091; font-size: 12px; } + .fi-sas:before { content: "\e941"; color: #58c6a2; } + .fi-r:before { content: "\e942"; color: #2d8cf0; font-size: 14px; } + .fi-txt:before { content: "\e943"; color: gray; font-size: 16px; } + .fi-log:before { content: "\e944"; color: gray; font-size: 16px; } + .fi-xls:before { content: "\e945"; color: #36af47; font-size: 16px; } + .fi-xlsx:before { content: "\e946"; color: #36af47; font-size: 16px; } + .fi-csv:before { content: "\e947"; color: #36af47; font-size: 16px; } + .fi-jar:before { content: "\e948"; color: #e45f3d; font-size: 16px; } + .fi-fx-method:before { content: "\e94a"; } + .fi-fx-method-o:before { content: "\e94a"; } + .fi-fx-udf:before { content: "\e94b"; } + .fi-fx-udf-o:before { - content: "\e94b" + content: "\e94b"; } + .fi-data-develop:before { content: "\e600"; color: #4cbf4b; } + .fi-resource:before { content: "\e601"; color: #3293e3; } + .fi-data-exchange:before { content: "\e602"; } + .fi-algorithms:before { content: "\e603"; color: #ff3d3d; } + .fi-workflow:before { - content: "\e604" + content: "\e604"; } + .fi-bi:before { content: "\e605"; color: #9654f5; font-size: 14px; } + .fi-schedule:before { content: "\e903"; } + .fi-workflow1:before { content: "\e901"; } + .fi-exchange:before { content: "\e902"; } + .fi-application:before { content: "\e916"; } + .fi-newproject:before { content: "\e91a"; } + .fi-addproject:before { content: "\e919"; } + .fi-visualis:before { content: "\e91b"; } + .fi-qualitis:before { content: "\e91c"; } + .fi-scriptis:before { content: "\e92a"; } + .fi-system:before { content: 
"\e92b"; } +.fi-plus:before { + content: "\ea0a"; +} + +.fi-cross1:before { + content: "\ea0f"; +} + // refresh icon 单独设置padding -.ivu-icon-ios-refresh{ +.ivu-icon-ios-refresh { padding: 4px 3px; } - @keyframes we-icon-loading-spin { - from { - transform: rotate(0deg); - } - 50% { - transform: rotate(180deg); - } - to { - transform: rotate(360deg); - } + from { + transform: rotate(0deg); } + + 50% { + transform: rotate(180deg); + } + + to { + transform: rotate(360deg); + } +} + .we-icon-loading { - color: #3d3d3d; - animation: we-icon-loading-spin 1s linear infinite; + color: #3d3d3d; + animation: we-icon-loading-spin 1s linear infinite; } diff --git a/web/src/assets/styles/login.scss b/web/src/assets/styles/login.scss index d0ed8583e5bccc2a1ba30e3caef3621a1747bf91..43402abe39b8f7e58a53d9752eb56f9f4407da70 100644 --- a/web/src/assets/styles/login.scss +++ b/web/src/assets/styles/login.scss @@ -44,7 +44,6 @@ } .login-main { width: 380px; - height: 300px; background-color: $body-background; margin-right: 9.5%; padding: 20px; @@ -76,9 +75,15 @@ } } .remember-user-name { - margin: 5px 0 0 10px; + margin: 0px 0 10px 10px; } .ivu-form-item { margin-bottom: 20px; } + .captcha-wp { + display: flex; + img{ + height: 44px; + } + } } diff --git a/web/src/assets/styles/workspace.scss b/web/src/assets/styles/workspace.scss new file mode 100644 index 0000000000000000000000000000000000000000..da510ba56cb5fd417cd47da6ce2a90ff925f0c2a --- /dev/null +++ b/web/src/assets/styles/workspace.scss @@ -0,0 +1,291 @@ +/*! + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * + */ +@import './variables.scss'; + +.page-bgc { + background: rgb(245, 245, 245); + height: 100%; + margin: 0; + display: flex; + flex-direction: column; + // main axis runs vertically (column direction) + + .page-bgc-header { + padding: 10px 25px 0; + } + + .header-title { + font-size: 14px; + font-weight: bold; + padding-left: 5px; + border-left: 3px solid $primary-color; + } + + .header-info { + padding: 10px 0 10px 20px; + + p { + line-height: 24px; + } + } + +} + +.app-icon { + font-size: 14px; + height: 30px; + line-height: 30px; + color: #39f; +} + +.nodata-tips { + font-size: 12px; + margin: auto; + height: 70px; + line-height: 70px; + text-align: center; +} + +.workspace-main { + padding: 10px 25px; + display: -webkit-box; + display: flex; + -webkit-box-pack: start; + justify-content: flex-start; + -webkit-box-align: start; + align-items: flex-start; + + .item-header { + font-size: 14px; + font-weight: bold; + padding-left: 23px; + border-left: 2px solid $primary-color; + + } + + .left { + flex: 1; + min-height: 137px; + // box-shadow: 0 1px 6px rgba(0,0,0,.2); + border-color: rgba(0,0,0,0); + padding: 0; + } + + .right { + // flex: 1; + margin-left: 25px; + min-height: 137px; + width: 500px; + // box-shadow: 0 1px 6px rgba(0,0,0,.2); + border-color: rgba(0,0,0,0); + padding: 0; + } + + .setting-bt-wrap { + position: absolute; + right: 20px; + top: 10px; + font-size: 24px; + + &.ivu-btn-text { + outline: none; + border: none; + box-shadow: 0 0 0 2px transparent; + } + } + + .app-list { + display: flex; + flex-wrap: wrap; + align-items: center; + + .app-item-add { + display: 
flex; + margin: 20px 10px 10px 25px; + padding: 10px; + cursor: pointer; + font-size: 24px; + transition: color .2s linear,background-color .2s linear,border .2s linear,box-shadow .2s linear; + + &:hover { + transition: color .2s linear,background-color .2s linear,border .2s linear,box-shadow .2s linear; + color: #39f; + + box-shadow: 0 2px 12px 0 rgba(0,0,0,.2); + border-radius: 4px; + } + } + + .shadow { + box-shadow: 0 2px 12px 0 rgba(0,0,0,.2); + border-radius: 4px; + } + + .app-item-wrap { + display: flex; + position: relative; + margin: 20px 10px 10px 25px; + padding: 10px 30px; + cursor: pointer; + border: 1px solid rgb(245, 245, 245); + border-radius: 3px; + + .close-wrap { + position: absolute; + right: -5px; + top: -5px; + color: #39f; + } + + &:hover { + box-shadow: 0 2px 12px 0 rgba(0,0,0,.2); + border-radius: 4px; + } + + .label { + margin-left: 10px; + // width: 110px; + // font-weight: 700; + font-size: 12px; + overflow: hidden; + text-overflow: ellipsis; + white-space: nowrap; + line-height: 30px; + height: 30px; + } + } + } +} + +.app-list-main { + padding: 0 25px 10px; + // bottom: 0; + flex: auto; + + .app-list-tabs { + padding: 20px 20px 20px 22px; + + border-radius: 6px; + height: 100%; + position: relative; + -webkit-transition: all .2s ease-in-out; + transition: all .2s ease-in-out; + background-color: #fff; + + &:hover { + box-shadow: 0 2px 7px rgba(0,0,0,.15); + border-color: transparent; + position: relative; + } + + .pane-wrap { + display: flex; + flex-wrap: wrap; + + .pane-item { + float: left; + margin: 10px 100px 0 0; + width: 450px; + } + + .app-entrance { + display: flex; + + .app-title-wrap { + flex: 1; + font-size: 12px; + + .app-title { + display: flex; + align-items: center; + margin-top: -10px; + margin-right: 50px; + flex-wrap: wrap; + + .sub-margin { + margin-right: 10px; + } + + .sub-title { + font-weight: 700; + } + + .app-tag { + color: #07C1E0; + border: 1px solid transparent; + background: #07C1E0; + } + + } + + } + + 
.app-status-wrap-active { + position: absolute; + right: 0; + top: 6px; + width: 70px; + font-size: 12px; + + color: #0c6; + + span { + color: #515a6e; + } + } + + .app-status-wrap-disable { + + position: absolute; + right: 0; + top: 6px; + width: 70px; + font-size: 12px; + color: #ccc; + + span { + color: #515a6e; + } + } + + } + + .button-wrap { + display: flex; + margin-top: 20px; + // justify-content: center; + + .entrace-btn { + margin-right: 10px; + } + } + + } + } + +} + +.input-wrap { + position: absolute; + right: 20px; + top: 20px; + width: 200px; +} + +.radio-box .ivu-radio .ivu-radio-inner { + border: 1px solid #2d8cf0; +} diff --git a/web/src/commonData/i18n/common/en.json b/web/src/commonData/i18n/common/en.json index d89c3806323a613fc9666e01ba455527b25c2625..02a3bf13e748f7c72e94ade1918c446435d991ef 100644 --- a/web/src/commonData/i18n/common/en.json +++ b/web/src/commonData/i18n/common/en.json @@ -81,9 +81,10 @@ "publishSuccess": "Successfully published", "publishFailed": "Publish failed", "infoTitle": "Welcome", - "infoHeader": "Welcome to DataSphere Studio !", + "infoHeader": "Welcome to Workflow Development !", "infoBodyFirstRow": "DataSphere Studio is a one-stop portal focusing on development and management of data application, as a part of WeDataSphere -- the big data platform of WeBank.", "infoBodySecondRow": "Based on Linkis computation middleware, it's able to easily integrate various data application systems, making the development of data application easier and simpler.", + "workspace": "Workspace", "appTitle": "Application Navigation", "applicationStudio": "Business Application Development", "applicationMore": "More Information", @@ -103,7 +104,35 @@ "deleteWtss": "Delete the scheduler project at the same time?", "noData": "No data please add" }, + "workspace": { + "infoHeader": "Welcome to DataSphere Studio !", + "createWorkspace": "Create Workspace", + "searchWorkspace": "Search Workspace", + "workspaceList": "Workspace List", + 
"editWorkspace": "Edit", + "newWorkspace": "New Workspace", + "editor": "Edit Workspace", + "workName": "Name", + "department": "Department", + "selectDepartment": "Please Select Department", + "label": "Label", + "addLabel": "Add Label", + "description": "Description", + "pleaseInputWorkspaceDesc": "Please Input Workspace Description", + "createdSuccess": "Successfully created!", + "createdFailed": "Creation failed!", + "display": "Display", + "tableDisplay": "Table", + "cardDisplay": "Card", + "createTime": "Create Time" + }, + "GLY": { + "ALTY": "Demos", + "KSRM": "Quick Start", + "HYP": "Change Batch" + }, "workflow": { + "infoHeader": "Welcome to Workflow!", "workflow": "Workflow", "createWorkflow": "Create workflow", "gotoVisualis": "Enter Visualis", @@ -490,7 +519,7 @@ "cancel": "Cancel", "timeout": "Timeout on publishing project {name}!" }, - "tableDetails" : { + "tableDetails": { "BZBSX": "Table basic attributes", "BZDXX": "Table column information", "BTJXX": "Table statistical information", @@ -605,7 +634,8 @@ "navMune": { "FAQ": "FAQ", "clearCache": "Clear cache", - "logOut": "Log out" + "logOut": "Log out", + "userManager": "User Manager" }, "shape": { "dataDev": "Data development", @@ -771,7 +801,8 @@ "password": "Please enter password!", "loginSuccess": "Login Success", "haveLogin": "You have already logged in, please do not login repeatedly", - "vaildFaild": "Authentication failed!" + "vaildFaild": "Authentication failed!", + "captcha": "Please enter captcha" }, "userMenu": { "title": "Warning", @@ -779,6 +810,10 @@ "clearCacheSuccess": "Successfully cleared local cache!", "comingSoon": "Not yet open source, please stay tuned!" }, + "header": { + "home": "Home", + "console": "Console" + }, "headerNavBar": { "Workflow": "Workflow Development", "Exchangis": "Data Exchange Component", @@ -1018,6 +1053,22 @@ }, "error": { "fileExists": "Duplicated file!" 
+ }, + "home": { + "welcome": "Welcome to the {text} workspace!", + "setting": "Setting", + "exit": "Exit", + "enter": "Enter {text}", + "dlgTitle": "New Quick Entry", + "selectType": "Please select category", + "selectApp": "Please select Application", + "save": "Save", + "cancel": "Cancel", + "running": "Running", + "stop": "Unavailable", + "searchPlaceholder": "Search application", + "tips": "No data, or nothing added yet", + "repeat": "You cannot add the same application repeatedly" + } }, "database": { @@ -1612,6 +1663,40 @@ "success": { "update": "Successfully updated global variables!" } + }, + "userManager": { + "createUser": "Create user", + "username" : "username", + "password": "password", + "rootPath": "Root Path", + "spacePath": "Space root Path", + "hdfsPath": "HDFS root Path", + "resultPath": "Result root Path", + "schedulerPath": "Scheduler root Path", + "usernameNotNull": "Username cannot be empty", + "usernameFormat": "Username must begin with a letter and contain only letters, numbers, or underscores", + "passwordNotNull": "Password cannot be empty", + "passwordNotWeak": "Password must begin with a letter, contain letters, numbers, and special characters, and be at least 8 characters long", + "createSuccess": "User created successfully", + "createFail": "Failed to create user", + "dssInstallDir": "DSS install directory", + "azkabanInstallDir": "Azkaban install directory", + "linuxHost" : "Server IP", + "linuxLoginUser" : "Server username", + "linuxLoginPassword" : "Server password", + "linuxHostNotNull" : "Server IP cannot be empty", + "linuxLoginUserNotNull" : "Server username cannot be empty", + "linuxLoginPasswordNotNull" : "Server password cannot be empty", + "serverSettings": "Server settings", + "userSettings": "User settings", + "XYB": "Next step", + "SYB": "Previous step", + "addServer": "Add server", + "deleteTip": "Are you sure you want to delete this server?", + "serverSame": "Servers with the same IP and username are duplicates; please keep only one server per configuration", + 
"serverNull": "The server configuration option cannot be empty", + "pathNotNull": "Cannot be empty", + "createTip": "Creation takes a while, please wait patiently. On success you will be returned to the home page; on failure you will remain on this page." } } -} +} \ No newline at end of file diff --git a/web/src/commonData/i18n/common/zh.json b/web/src/commonData/i18n/common/zh.json index 93c250f1a90231629e8866124e29a97f819c9171..98d0946695ddc7a96872c74aef3454f1d37bca86 100644 --- a/web/src/commonData/i18n/common/zh.json +++ b/web/src/commonData/i18n/common/zh.json @@ -17,7 +17,8 @@ "nameLength": "名称长度不能大于", "validNameDesc": "必须以字母开头,且只支持字母、数字、下划线", "project": "工程", - "feedback": "反馈" + "feedback": "反馈", + "validNameExist": "名称不能重复" }, "project": { "projectName": "工程名", @@ -81,11 +82,11 @@ "publishSuccess": "发布成功", "publishFailed": "发布失败", "infoTitle": "欢迎信息", - "infoHeader": "欢迎来到 DataSphere Studio !", + "infoHeader": "欢迎来到工作流开发首页!", "infoBodyFirstRow": "DataSphere Studio是微众银行大数据平台——WeDataSphere,自研的一站式数据应用开发管理门户。", "infoBodySecondRow": "基于Linkis计算中间件构建,可轻松整合上层各数据应用系统,让数据应用开发变得简洁又易用。", "appTitle": "应用导航", - "applicationStudio": "业务应用开发", + "workspace": "工作空间", "applicationMore": "了解更多", "applicationTipsFirst": "工程 > 工作流,是应用开发的基本组织结构。", "applictaionTipsSecond": "在工作流拖拽编辑页面,DataSphere Studio已集成的所有数据应用系统,都将以工作流节点的形式出现,让您能够以业务的视角将其编排串连起来,快速实现全部业务。", @@ -103,7 +104,35 @@ "deleteWtss": "是否同时删除scheduler工程?", "noData": "暂无数据请添加" }, + "workspace": { + "infoHeader": "欢迎来到 DataSphere Studio !", + "createWorkspace": "创建工作空间", + "searchWorkspace": "搜索工作空间", + "workspaceList": "工作空间列表", + "editWorkspace": "编辑", + "newWorkspace": "新建工作空间", + "editor": "编辑工作空间", + "workName": "工作空间名", + "department": "归属部门", + "selectDepartment": "请选择部门", + "label": "标签", + "addLabel": "添加标签", + "description": "描述", + "pleaseInputWorkspaceDesc": "请输入工作空间描述", + "createdSuccess": "工作空间创建成功!", + "createdFailed": "工作空间创建失败!", + "display": "展示方式", + "tableDisplay": "列表展示", + "cardDisplay": "图标展示", + "createTime": 
"创建时间" + }, + "GLY": { + "ALTY": "案例体验", + "KSRM": "快速入门", + "HYP": "换一批" + }, "workflow": { + "infoHeader": "欢迎来到工作流!", "workflow": "工作流", "createWorkflow": "创建工作流", "gotoVisualis": "进入Visualis", @@ -293,7 +322,7 @@ "WJYCZ": "文件已存在,请选择其它文件或选择其它文件夹!", "WJMCBHF": "文件名称不合法,仅支持以字母、数字、中文、下划线、中短线且带后缀的命名!", "SCBCG100": "上传文件不超过100M!", - "SCCG": "文件 {name} 上传成功!", + "SCCG": "文件 {name} 上传成功!", "WJCCXE": "文件大小超出限额!", "WJBSC": "资源文件 {name} 已被成功删除!" } @@ -490,7 +519,7 @@ "cancel": "关闭", "timeout": "工程{name}发布超时!" }, - "tableDetails" : { + "tableDetails": { "BZBSX": "表基本属性", "BZDXX": "表字段信息", "BTJXX": "表统计信息", @@ -534,7 +563,6 @@ "YES": "对", "GSHJX": "进行格式化成", "ZJXYGE": "组件需要的格式" - }, "logView": { "taskId": "任务ID:", @@ -606,7 +634,8 @@ "navMune": { "FAQ": "常见问题", "clearCache": "清理缓存", - "logOut": "退出登录" + "logOut": "退出登录", + "userManager": "用户管理" }, "shape": { "dataDev": "数据开发", @@ -772,7 +801,8 @@ "password": "请输入密码!", "loginSuccess": "登录成功", "haveLogin": "您已经登录,请不要重复登录", - "vaildFaild": "验证未通过!" + "vaildFaild": "验证未通过!", + "captcha": "请输入验证码" }, "userMenu": { "title": "警告", @@ -780,6 +810,10 @@ "clearCacheSuccess": "清除本地数据缓存成功!", "comingSoon": "尚未开源,敬请期待!" }, + "header": { + "home": "首页", + "console": "控制台" + }, "headerNavBar": { "Workflow": "工作流开发", "Exchangis": "数据交换组件", @@ -1019,6 +1053,22 @@ }, "error": { "fileExists": "该文件已经存在!" + }, + "home": { + "welcome": "欢迎来到 {text} 的工作空间!", + "setting": "设置", + "exit": "退出设置", + "enter": "进入{text}", + "dlgTitle": "新增快速入口", + "selectType": "请选择分类", + "selectApp": "请选择系统", + "save": "保存", + "cancel": "取消", + "running": "运行中", + "stop": "不可用", + "searchPlaceholder": "搜索应用系统", + "tips": "没有数据或者未添加", + "repeat": "不能重复添加相同的应用" } }, "database": { @@ -1613,6 +1663,40 @@ "success": { "update": "全局变量更新成功!" 
} + }, + "userManager": { + "createUser": "创建用户", + "username" : "账号", + "password": "密码", + "rootPath": "用户根目录", + "spacePath": "工作空间目录", + "hdfsPath": "HDFS根目录", + "resultPath": "结果存储根目录", + "schedulerPath": "调度存储根目录", + "usernameNotNull": "用户名不能为空", + "usernameFormat": "账号以字母开头,仅能含有字母、数字或者下划线", + "passwordNotNull": "密码不能为空", + "passwordNotWeak": "密码以字母开头,含有字母、数字和特殊字符,不少于8位", + "createSuccess": "创建账号成功", + "createFail": "创建账号失败", + "dssInstallDir": "DSS 安装目录", + "azkabanInstallDir": "Azkaban 安装目录", + "linuxHost" : "服务器ip", + "linuxLoginUser" : "服务器用户名", + "linuxLoginPassword" : "服务器密码", + "linuxHostNotNull" : "服务器ip不能为空", + "linuxLoginUserNotNull" : "服务器用户名不能为空", + "linuxLoginPasswordNotNull" : "服务器密码不能为空", + "serverSettings": "服务器配置", + "userSettings": "用户配置", + "XYB": "下一步", + "SYB": "上一步", + "addServer": "添加服务器", + "deleteTip": "确认删除该服务器?", + "serverSame": " 服务器ip与用户名配置相同,相同配置的服务器只能保留一个", + "serverNull": "服务器配置选项不能为空", + "pathNotNull": "不能为空", + "createTip": "创建需要一段时间,请耐心等候,如果成功则返回首页,失败则停留本页" } } -} +} \ No newline at end of file diff --git a/web/src/js/component/editor/editor.vue b/web/src/js/component/editor/editor.vue index 1418ae01aa547e55c651e3e843103fcace5dd1c6..79ae4e638741318b365f5160a423730f8d5e66db 100644 --- a/web/src/js/component/editor/editor.vue +++ b/web/src/js/component/editor/editor.vue @@ -91,7 +91,7 @@ export default { if (newValue == this.getValue()) { return; } - let readOnly = this.editor.getConfiguration().readOnly; + let readOnly = this.currentConfig.readOnly; if (readOnly) { // editor.setValue 和 model.setValue 都会丢失撤销栈 this.editor.setValue(newValue); diff --git a/web/src/js/component/navMenu/index.vue b/web/src/js/component/navMenu/index.vue index b965857bd402e1ba2f87eb1feba19a1bdb49233a..4b16b6d633065bd2f57c6207577ea03d0c2d6877 100644 --- a/web/src/js/component/navMenu/index.vue +++ b/web/src/js/component/navMenu/index.vue @@ -5,15 +5,15 @@ @@ -22,121 +22,68 @@ v-if="current"> + diff --git a/web/src/js/component/table/index.js 
b/web/src/js/component/table/index.js index 350c6b1a732fde1564ebe283bd44a96cc59ca94d..5def7449c0e0db95f9bd52ced1cd8833c9157cf1 100644 --- a/web/src/js/component/table/index.js +++ b/web/src/js/component/table/index.js @@ -15,7 +15,7 @@ * */ -import WeTable from './table.vue'; +import WeTable from './resultTable/table.vue'; import historyTable from './historyTable/historyTable.vue'; export default { WeTable, diff --git a/web/src/js/component/table/resultTable/body.vue b/web/src/js/component/table/resultTable/body.vue new file mode 100644 index 0000000000000000000000000000000000000000..502a4aa3f27c251d04584c6a0f161521df7a53a3 --- /dev/null +++ b/web/src/js/component/table/resultTable/body.vue @@ -0,0 +1,285 @@ + + + diff --git a/web/src/js/component/table/resultTable/header.vue b/web/src/js/component/table/resultTable/header.vue new file mode 100644 index 0000000000000000000000000000000000000000..f8bfc2fec51b3e6a72463c38a884152eb9910f47 --- /dev/null +++ b/web/src/js/component/table/resultTable/header.vue @@ -0,0 +1,128 @@ + + diff --git a/web/src/js/component/table/resultTable/list.vue b/web/src/js/component/table/resultTable/list.vue new file mode 100644 index 0000000000000000000000000000000000000000..aab7cf4e1ee47579b48edcbc614ff95fb7902f45 --- /dev/null +++ b/web/src/js/component/table/resultTable/list.vue @@ -0,0 +1,102 @@ + + diff --git a/web/src/js/component/table/resultTable/table.vue b/web/src/js/component/table/resultTable/table.vue new file mode 100644 index 0000000000000000000000000000000000000000..faa5693b04140bf4818789654cc257a3dfe6f6dc --- /dev/null +++ b/web/src/js/component/table/resultTable/table.vue @@ -0,0 +1,188 @@ + + + + diff --git a/web/src/js/component/vue-process/style/index.scss b/web/src/js/component/vue-process/style/index.scss index bd2cbbc5003ffe420303fbe1aa5d6fbc25d3e674..bf7ea68d1536efa52a9a42e2f33a2186f4d2fac5 100644 --- a/web/src/js/component/vue-process/style/index.scss +++ b/web/src/js/component/vue-process/style/index.scss @@ 
-400,7 +400,7 @@ $mask-color: rgba(55, 55, 55, .6); .designer-control-header { background: #5c5c5c; - background-image: linear-gradient(top, #5c5c5c, #3e3e3e); + background-image: linear-gradient(to top, #5c5c5c, #3e3e3e); height: 8px; border-left: 1px solid #5c5c5c; border-right: 1px solid #5c5c5c; diff --git a/web/src/js/component/workflowContentItem/index.scss b/web/src/js/component/workflowContentItem/index.scss index e4c1fb83f3f19cccf35f1e28950080df96f94431..00e473c4f42dc2a5987d643962f613174e54dd23 100644 --- a/web/src/js/component/workflowContentItem/index.scss +++ b/web/src/js/component/workflowContentItem/index.scss @@ -31,6 +31,14 @@ margin: 15px; box-shadow: 0 2px 12px 0 rgba(0,0,0,.2); background: #fff; + .project-add { + display: flex; + flex-direction: column; + justify-content: center; + align-items: center; + height: 100%; + cursor: pointer; + } .project-main { position: relative; height: 130px; diff --git a/web/src/js/component/workflowContentItem/index.vue b/web/src/js/component/workflowContentItem/index.vue index 09bb49838ebde08b8da4a063a2a0dd6ac186be92..c1c795a7853f9af88e1db684f807211f1480fa21 100644 --- a/web/src/js/component/workflowContentItem/index.vue +++ b/web/src/js/component/workflowContentItem/index.vue @@ -5,8 +5,19 @@ + +
+ + {{$t(`message.${source}`)}} +
+
-
{{$t('message.workflowItem.nodata')}}
+ v-else>{{$t('message.workflowItem.nodata')}} --> { diff --git a/web/src/js/module/footer/index.scss b/web/src/js/module/footer/index.scss index 5df19daf3a294a450cc0af49bb51d15664cf73e0..ee8cf90c60bc25948580abc642dbcf8ef107c216 100644 --- a/web/src/js/module/footer/index.scss +++ b/web/src/js/module/footer/index.scss @@ -50,3 +50,12 @@ color: $text-color; } } +.footer-mask { + position: fixed; + top: -100vh; + left: -100vw; + width: 200vw; + height: 200vh; + background-color: #00000000; +} + \ No newline at end of file diff --git a/web/src/js/module/footer/index.vue b/web/src/js/module/footer/index.vue index 16d9fd64419b476acedfa6db2a96922b8414aa77..e47a22d73883bff276c9d0f9a71ed4ff7aa6d00c 100644 --- a/web/src/js/module/footer/index.vue +++ b/web/src/js/module/footer/index.vue @@ -9,6 +9,7 @@ ref="resourceSimple" @update-job="updateJob"> +