# hadoopService

**Repository Path**: ufo360/hadoop-service

## Basic Information

- **Project Name**: hadoopService
- **Description**: Hadoop network disk
- **Primary Language**: Java
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 3
- **Forks**: 0
- **Created**: 2021-01-27
- **Last Updated**: 2025-03-05

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

## Fresh Deployment of the Hadoop Network Disk

Versions used: Alibaba Cloud Ubuntu 18, Hadoop 2.8.3, JDK 1.8.0_241.

##### 1. Configure SSH

```
sudo apt-get update
sudo apt-get install openssh-server
ssh localhost
cd ~/.ssh/
ssh-keygen -t rsa
```

Press Enter at every prompt, then authorize the generated key:

```
cat ./id_rsa.pub >> ./authorized_keys
```

The next `ssh localhost` will log in without a password.

##### 2. Install Java

Create the target directory:

```
sudo mkdir -p /usr/local/java
```

Download the JDK from http://www.sousou88.com/download/platform.html, then extract it into that directory:

```
sudo tar xvzf jdk-8u241-linux-x64.tar.gz -C /usr/local/java
```

Configure system-wide environment variables:

```
sudo vi /etc/profile
```

Append the following at the end:

```
JAVA_HOME=/usr/local/java/jdk1.8.0_241
PATH=$PATH:$HOME/bin:$JAVA_HOME/bin
export JAVA_HOME
export PATH
```

Tell Ubuntu where the JDK lives:

```
sudo update-alternatives --install "/usr/bin/java" "java" "/usr/local/java/jdk1.8.0_241/bin/java" 1
sudo update-alternatives --install "/usr/bin/javac" "javac" "/usr/local/java/jdk1.8.0_241/bin/javac" 1
sudo update-alternatives --install "/usr/bin/javaws" "javaws" "/usr/local/java/jdk1.8.0_241/bin/javaws" 1
```

Then make this JDK the default:

```
sudo update-alternatives --set java /usr/local/java/jdk1.8.0_241/bin/java
sudo update-alternatives --set javac /usr/local/java/jdk1.8.0_241/bin/javac
sudo update-alternatives --set javaws /usr/local/java/jdk1.8.0_241/bin/javaws
```

Configure per-user environment variables:

```
cd ~
vim ~/.bashrc
```

Add at the top:

```
export JAVA_HOME=/usr/local/java/jdk1.8.0_241
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
```

Reload the configuration and verify:

```
source /etc/profile
java -version
```

![image-20210127104804767](https://gitee.com/ufo360/picgo2/raw/master/img/20210127104804.png)
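Once `java -version` works, a tiny standalone program (hypothetical, not part of this repo) can double-check which runtime the shell actually resolves to, which helps catch a stale `JAVA_HOME` or `PATH` entry:

```java
// Sanity check after installing the JDK: print what the running JVM reports.
// If java.home does not point under /usr/local/java/jdk1.8.0_241, the
// update-alternatives or PATH configuration above did not take effect.
public class CheckJava {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
        String javaHome = System.getenv("JAVA_HOME");
        System.out.println("JAVA_HOME    = " + (javaHome != null ? javaHome : "(not set)"));
    }
}
```

Compile with `javac CheckJava.java` and run with `java CheckJava`.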
##### 3. Install Hadoop on Linux

Download from the mirror https://mirrors.cnnic.cn/apache/hadoop/common/ — take the tarball in the stable directory, the one without "src" in its name. Be sure to use version 2.8.3; other versions run into all kinds of problems. If you cannot download it there, look in the network disk.

Move the archive to a directory you have permissions for:

```
sudo mv /home/ufo/hadoop-2.8.3.tar.gz /usr/local/
```

Extract the Hadoop package:

```
cd /usr/local
sudo tar xvzf hadoop-2.8.3.tar.gz
```

Rename it and fix ownership:

```
sudo mv ./hadoop-2.8.3/ ./hadoop
sudo chown -R ufo ./hadoop
```

Check the Hadoop version:

```
cd /usr/local/hadoop/bin
./hadoop version
```

![image-20210127104746043](https://gitee.com/ufo360/picgo2/raw/master/img/20210127104746.png)

##### 4. Pseudo-Distributed Deployment on Linux

Edit the configuration file core-site.xml:

```
cd /usr/local/hadoop/etc/hadoop/
vi core-site.xml
```

```xml
<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:/usr/local/hadoop/tmp</value>
        <description>Abase for other temporary directories.</description>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
```

Edit the configuration file hdfs-site.xml:

```
vi hdfs-site.xml
```

```xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:/usr/local/hadoop/tmp/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:/usr/local/hadoop/tmp/dfs/data</value>
    </property>
</configuration>
```

Format the NameNode:

```
cd ../../
./bin/hdfs namenode -format
```

Start the NameNode and DataNode daemons:

```
cd /usr/local/hadoop
./sbin/start-dfs.sh
```

Check that they started:

```
jps
```

![image-20210127105159770](https://gitee.com/ufo360/picgo2/raw/master/img/20210127105159.png)

Web UI from the machine itself: [http://localhost:50070](http://localhost:50070/)

For external access:

1. Open port 50070 in the Alibaba Cloud console.
2. Disable the firewall:

```
ufw disable
```

Then visit port 50070 on the server's public IP.

![image-20210127105131687](https://gitee.com/ufo360/picgo2/raw/master/img/20210127105131.png)

##### 5. Install Hadoop on Windows 10

Use the same package:

![image-20200920143451631](https://gitee.com/ufo360/typora/raw/master/image-20200920143451631.png)

Download hadooponwindows-master:

![image-20200920143528195](https://gitee.com/ufo360/typora/raw/master/image-20200920143528195.png)

Extract it and copy its bin directory over the one in the Hadoop install directory. The main addition is winutils:

![image-20200920143533832](https://gitee.com/ufo360/typora/raw/master/image-20200920143533832.png)

##### 6. Pseudo-Distributed Deployment on Windows 10

Edit the configuration file etc/hadoop/core-site.xml:

```xml
<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:/usr/local/hadoop/tmp</value>
        <description>Abase for other temporary directories.</description>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
```

Edit the configuration file hdfs-site.xml:

```xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:/usr/local/hadoop/tmp/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:/usr/local/hadoop/tmp/dfs/data</value>
    </property>
</configuration>
```

Format the NameNode — go to the bin directory and run:

```
hdfs namenode -format
```

Start the NameNode and DataNode daemons — go to the sbin directory and double-click start-all.cmd:

![image-20200920143713352](https://gitee.com/ufo360/typora/raw/master/image-20200920143713352.png)

Web UI from the machine itself: [http://localhost:50070](http://localhost:50070/)

##### 7. Add a User Directory

On Windows 10:

```
hdfs dfs -mkdir -p /user/ufo
```

On Linux:

```
cd /usr/local/hadoop
./bin/hdfs dfs -mkdir -p /user/ufo
```

##### 8. Package the Spring Boot Service

Project address:

Note: adjust the source code — this download address serves files through Tomcat's port 8080:

![image-20200920163837461](https://gitee.com/ufo360/typora/raw/master/image-20200920163837461.png)

Uploaded files are stored under this path:

```java
String filePath = "/usr/local/apache-tomcat-8.5.61/webapps/examples/hadooptmp";
```

![image-20200920163858953](https://gitee.com/ufo360/typora/raw/master/image-20200920163858953.png)

```
mvn clean
```

If this fails with "Process terminated", investigate it yourself — double-clicking the failing line shows the error message.

```
mvn install
```

Disable the tests when running install (for example with `-DskipTests`):

![image-20210127105759332](https://gitee.com/ufo360/picgo2/raw/master/img/20210127105759.png)

This produces a target folder containing a jar; copy it to an ordinary directory on Linux:

![image-20200920163640360](https://gitee.com/ufo360/typora/raw/master/image-20200920163640360.png)

```
sudo nohup java -jar service-0.0.1-SNAPSHOT.jar >out.out &
```

##### 9. Package the Vue Project

Note: adjust the source code:

![image-20200920163807864](https://gitee.com/ufo360/typora/raw/master/image-20200920163807864.png)

```
npm run build
```

This generates a dist folder; copy it to the nginx directory on Linux and open index.html. If the archive was zipped with Chinese file names, unzip it with the CP936 encoding:

```
unzip -O CP936 xxx
```

![image-20200920163657679](https://gitee.com/ufo360/typora/raw/master/image-20200920163657679.png)

##### 10. Access

![image-20210127113626543](https://gitee.com/ufo360/picgo2/raw/master/img/20210127113626.png)

![image-20210127113635979](https://gitee.com/ufo360/picgo2/raw/master/img/20210127113636.png)
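Section 8's hard-coded `filePath` means the service stages each uploaded file under Tomcat's webapps directory so it can later be downloaded over port 8080. A minimal, self-contained sketch of that staging step (the method name, URL layout, and demo temp directory are assumptions for illustration, not the repo's actual code; the eventual push to HDFS is elided):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class StageUpload {
    // Hard-coded staging path from the Spring Boot source (section 8).
    static final String STAGING_DIR =
            "/usr/local/apache-tomcat-8.5.61/webapps/examples/hadooptmp";

    // Writes the uploaded bytes into the staging directory and returns the
    // Tomcat URL (port 8080) a client could use to download the file.
    // The /examples/hadooptmp/ URL prefix mirrors the webapps layout above.
    static String stage(Path stagingDir, String fileName, byte[] data) throws IOException {
        Files.createDirectories(stagingDir);   // ensure the directory exists
        Files.write(stagingDir.resolve(fileName), data);
        return "http://localhost:8080/examples/hadooptmp/" + fileName;
    }

    public static void main(String[] args) throws IOException {
        // Demo run: use a temp directory instead of the real Tomcat path.
        Path dir = Files.createTempDirectory("hadooptmp");
        System.out.println(stage(dir, "demo.txt", "hello".getBytes()));
    }
}
```

In the deployed service the same idea applies with `STAGING_DIR` in place of the temp directory, which is why the jar must run on the machine where Tomcat 8.5.61 is installed.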