Notes on "Installing Spark 3.4.1 in an Existing Docker Container"

The base Docker environment already contains Hadoop 3 -- noting the layout here for easy reference later.

Reference:

[Screenshot 1]

[Screenshot 2]

Practice:

[Screenshot 3]

[Screenshot 4]
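
The screenshots walk through pulling the Spark tarball into the container and unpacking it next to the existing Hadoop install. A minimal sketch of those steps, assuming a container named hadoop220 (a hypothetical name) and the /opt/apache prefix used by the configs below:

# enter the running container (the container name here is an assumption)
docker exec -it hadoop220 bash
cd /opt/apache
# Spark 3.4.1 binary build bundled with Hadoop 3 client libraries
wget https://archive.apache.org/dist/spark/spark-3.4.1/spark-3.4.1-bin-hadoop3.tgz
tar -xzf spark-3.4.1-bin-hadoop3.tgz
export SPARK_HOME=/opt/apache/spark-3.4.1-bin-hadoop3
# create spark-env.sh from the shipped template
cp $SPARK_HOME/conf/spark-env.sh.template $SPARK_HOME/conf/spark-env.sh

conf/spark-env.sh then gets the following exports: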

 

export JAVA_HOME=/opt/apache/jdk1.8.0_333
# SPARK_MASTER_IP is the legacy name (SPARK_MASTER_HOST since Spark 2.0); 3.x still honors it with a deprecation warning
export SPARK_MASTER_IP=192.168.0.220
# resources each Worker may hand out to executors
export SPARK_WORKER_MEMORY=4g
export SPARK_WORKER_CORES=2
# default memory per executor
export SPARK_EXECUTOR_MEMORY=4g
# point Spark at the existing Hadoop install so it can reach HDFS/YARN
export HADOOP_HOME=/opt/apache/hadoop-3.2.2/
export HADOOP_CONF_DIR=/opt/apache/hadoop-3.2.2/etc/hadoop
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/apache/jdk1.8.0_333/jre/lib/amd64
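
The worker hosts then go in $SPARK_HOME/conf/workers (Spark 3.x renamed the old conf/slaves; cp workers.template workers and list one hostname per line):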

# A Spark Worker will be started on each of the machines listed below.
hadoop220
hadoop221
hadoop222
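
With both files in place, the configured Spark directory has to exist on every listed host, and the master can then bring up the whole standalone cluster over SSH. A sketch, assuming passwordless SSH between the three containers (start-all.sh needs it regardless):

# push the configured install to the other nodes (paths as above)
for host in hadoop221 hadoop222; do
  rsync -a /opt/apache/spark-3.4.1-bin-hadoop3 $host:/opt/apache/
done
# start the Master here plus one Worker per entry in conf/workers
/opt/apache/spark-3.4.1-bin-hadoop3/sbin/start-all.sh
# each node should now show a Worker (and hadoop220 also a Master) in jps
jps

The master web UI should come up on http://192.168.0.220:8080 by default.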

[Screenshot 5]

[Screenshot 6]

[Screenshot 7]
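
As a final smoke test beyond the screenshots, the bundled SparkPi example can be submitted against the standalone master (7077 is the default master port; the jar name matches the Scala 2.12 build that 3.4.1 ships):

/opt/apache/spark-3.4.1-bin-hadoop3/bin/spark-submit \
  --master spark://192.168.0.220:7077 \
  --class org.apache.spark.examples.SparkPi \
  /opt/apache/spark-3.4.1-bin-hadoop3/examples/jars/spark-examples_2.12-3.4.1.jar 100

A successful run prints a line like "Pi is roughly 3.14..." and shows up as a completed application in the web UI.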
