Hadoop/Spark Installation

Single-Node Hadoop Installation

1. Install Java

$ sudo apt-get install default-jdk
$ java -version
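
Hadoop needs to know where the JDK lives, so it is worth exporting JAVA_HOME right away. A minimal sketch, assuming Ubuntu's default-jdk package (the /usr/lib/jvm/default-java path is an assumption; confirm it with readlink first):

$ readlink -f $(which javac)    # shows where the JDK actually is
$ echo 'export JAVA_HOME=/usr/lib/jvm/default-java' >> ~/.bashrc
$ source ~/.bashrc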

2. Set Up the Hadoop User and Group

$ sudo addgroup hadoop
$ sudo adduser --ingroup hadoop hduser
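
Optionally, hduser can be added to the sudo group so it can run the administrative commands later in this guide itself; this is an assumption about how you want to manage the node, not a required step:

$ sudo adduser hduser sudo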

3. Install and Configure SSH

$ sudo apt-get install ssh
$ su hduser
$ ssh-keygen -t rsa -P ""
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
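
Hadoop's start-up scripts rely on passwordless SSH to localhost, so it is worth verifying the key setup while still logged in as hduser:

$ ssh localhost    # should log in without prompting for a password
$ exit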

4. Install Hadoop

First, download the Hadoop archive:

$ wget https://www.apache.org/dyn/closer.cgi/hadoop/common/hadoop-2.8.5/hadoop-2.8.5.tar.gz

Once the download completes, extract it:

$ tar xvzf hadoop-2.8.5.tar.gz

Then move the extracted Hadoop folder to its final location and give the hduser account ownership of it:

$ sudo mv hadoop-2.8.5 /usr/local/hadoop

$ cd /usr/local

$ sudo chown -R hduser:hadoop hadoop
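
To run Hadoop conveniently as hduser, the usual next step is to export HADOOP_HOME and point Hadoop at the JDK. A minimal sketch, assuming the paths used above (/usr/local/hadoop and /usr/lib/jvm/default-java); adjust them if yours differ:

$ echo 'export HADOOP_HOME=/usr/local/hadoop' >> ~/.bashrc
$ echo 'export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin' >> ~/.bashrc
$ source ~/.bashrc
# point Hadoop itself at the JDK as well
$ echo 'export JAVA_HOME=/usr/lib/jvm/default-java' >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
$ hadoop version    # should print the installed Hadoop version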

Finally, install the remaining packages; Scala is required for Spark:

$ sudo apt install postgresql postgresql-contrib
$ sudo apt-get install scala
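
With Scala in place, Spark itself can be fetched and unpacked the same way Hadoop was. The release and URL below are only illustrative assumptions; pick the Spark build that matches your Hadoop version:

$ scala -version
$ wget https://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz
$ tar xvzf spark-2.4.0-bin-hadoop2.7.tgz
$ sudo mv spark-2.4.0-bin-hadoop2.7 /usr/local/spark
$ sudo chown -R hduser:hadoop /usr/local/spark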
