A step-by-step record of quickly building a hadoop+spark+hive+hbase cluster on Docker containers

1. Install Docker

 Reference: https://blog.csdn.net/as4589sd/article/details/104140244

1.1. Install Docker

yum update -y
yum -y install docker
systemctl start docker

1.2. Switch to the company's local registry mirror, or the Alibaba Cloud mirror

vi /etc/docker/daemon.json
{
  "registry-mirrors": ["http://192.168.6.196:8099"],
  "insecure-registries": ["http://192.168.6.196:8098"]
}

To use the Alibaba Cloud accelerator instead of the in-house mirror, set "registry-mirrors" to ["https://aj2rgad5.mirror.aliyuncs.com"]. Note that daemon.json is plain JSON, which does not allow # comment lines.

1.3. Enable Docker at boot

chkconfig docker on    # on systemd hosts (CentOS 7+), the equivalent is: systemctl enable docker

1.4. Restart Docker and test with hello-world

service docker restart
docker run hello-world

1.5. Install docker-compose

Reference:
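The reference link for this step did not survive in the source. A common way to install docker-compose on a CentOS-era host is to download a release binary from GitHub; the version number below is an assumption, so check the docker/compose releases page for the one you want:

```shell
# Download a docker-compose release binary (COMPOSE_VERSION is an assumption;
# pick the latest from https://github.com/docker/compose/releases)
COMPOSE_VERSION=1.29.2
curl -L "https://github.com/docker/compose/releases/download/${COMPOSE_VERSION}/docker-compose-$(uname -s)-$(uname -m)" \
  -o /usr/local/bin/docker-compose
chmod +x /usr/local/bin/docker-compose
docker-compose --version   # verify the install
```

Installing into /usr/local/bin matches the rest of this walkthrough, which runs as root (yum, systemctl).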
