1. Following the guide at http://www.fogsvc.com/97.html, set up three virtual machines and assign them static IPs:

192.168.1.10   hadoop-master

192.168.1.11   hadoop-slave1

192.168.1.12   hadoop-slave2

1.1 Add a hadoop user

sudo adduser hadoop

Set a password when prompted.

1.2 Make /etc/sudoers writable (temporarily)

chmod 777 /etc/sudoers

1.3 Edit sudoers

vim /etc/sudoers

1.4 Add the line hadoop ALL=(ALL) ALL below the line root ALL=(ALL) ALL, then save and exit.

1.5 Restore the original permissions on /etc/sudoers

pkexec chmod 0440 /etc/sudoers
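After step 1.4, the relevant section of /etc/sudoers should look like this:

```
root    ALL=(ALL) ALL
hadoop  ALL=(ALL) ALL
```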

2. Configure hosts

vi /etc/hosts

Append the following entries:

 192.168.1.10   hadoop-master

 192.168.1.11   hadoop-slave1 

 192.168.1.12   hadoop-slave2 

The result should look like the figure below:
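The three entries can also be appended in one step, for example with tee (a sketch; run it on each of the three nodes):

```shell
# Append the cluster hostnames to /etc/hosts (requires sudo).
HOSTS_ENTRIES='192.168.1.10   hadoop-master
192.168.1.11   hadoop-slave1
192.168.1.12   hadoop-slave2'
echo "$HOSTS_ENTRIES" | sudo tee -a /etc/hosts
```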

3. Install the JDK

3.1 Update the package lists

sudo apt-get update

3.2 Install OpenJDK 8

sudo apt-get install openjdk-8-jdk

Note: the installation may fail with the error shown in the figure below.

Workaround: run apt-get remove libx11-6, then install again.

3.3 Check the version

java -version

4. Configure passwordless SSH login

Switch to the hadoop user: su hadoop

4.1 Install openssh-server

 sudo apt install openssh-server

4.2 Generate an RSA key pair (on the master)

 ssh-keygen -t rsa

4.3 Append the public key to authorized_keys (on the master)

 cat .ssh/id_rsa.pub >> .ssh/authorized_keys

4.4 Verify by connecting to the local host (on the master)

ssh hadoop-master
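Steps 4.2-4.4 can also be run non-interactively as one sketch (the empty passphrase set via -N "" and the default key path are assumptions; adjust them as needed):

```shell
# Consolidated sketch of steps 4.2-4.4; run on the master as the hadoop user.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa      # -N "" = empty passphrase
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys              # sshd rejects overly open key files
ssh hadoop-master hostname                    # should not prompt for a password
```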

4.5 Copy the public keys generated on the other nodes to the master

4.5.1 On slave1, run:

 scp .ssh/id_rsa.pub hadoop@hadoop-master:/home/hadoop/id_rsa_1.pub 

4.5.2 Enter the hadoop user's password when prompted.

4.5.3 On slave2, run:

 scp .ssh/id_rsa.pub hadoop@hadoop-master:/home/hadoop/id_rsa_2.pub 

4.5.4 Enter the password when prompted.

4.5.5 Append the public keys copied from slave1 and slave2 (on the master):

 cat /home/hadoop/id_rsa_1.pub >> .ssh/authorized_keys 

 cat /home/hadoop/id_rsa_2.pub >> .ssh/authorized_keys 

4.5.6 Verify

Passwordless login to slave1:

ssh  hadoop@hadoop-slave1 

Passwordless login to slave2:

ssh hadoop@hadoop-slave2

4.5.7 Copy authorized_keys from the master to slave1 and slave2, then verify that passwordless login works on each:

 scp .ssh/authorized_keys hadoop@hadoop-slave1:/home/hadoop/.ssh/authorized_keys

 scp .ssh/authorized_keys hadoop@hadoop-slave2:/home/hadoop/.ssh/authorized_keys
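Step 4.5.7 can be written as a loop, a sketch assuming the hostnames added to /etc/hosts earlier (run on the master as the hadoop user):

```shell
# Push the merged authorized_keys to both slaves and verify the login.
for host in hadoop-slave1 hadoop-slave2; do
  scp ~/.ssh/authorized_keys "hadoop@${host}:/home/hadoop/.ssh/authorized_keys"
  # Should print the slave's hostname without asking for a password.
  ssh "hadoop@${host}" hostname
done
```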