Analysis of spark-shell startup

spark-shell script source

#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

#
# Shell script for starting the Spark Shell REPL

cygwin=false
case "$(uname)" in
  CYGWIN*) cygwin=true;;
esac

# Enter posix mode for bash
set -o posix

if [ -z "${SPARK_HOME}" ]; then
  source "$(dirname "$0")"/find-spark-home
fi

export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]

Scala REPL options:
  -I <file>                  preload <file>, enforcing line-by-line interpretation"

# SPARK-4161: scala does not assume use of the java classpath,
# so we need to add the "-Dscala.usejavacp=true" flag manually. We
# do this specifically for the Spark shell because the scala REPL
# has its own class loader, and any additional classpath specified
# through spark.driver.extraClassPath is not automatically propagated.
SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Dscala.usejavacp=true"

function main() {
  if $cygwin; then
    # Workaround for issue involving JLine and Cygwin
    # (see http://sourceforge.net/p/jline/bugs/40/).
    # If you're using the Mintty terminal emulator in Cygwin, may need to set the
    # "Backspace sends ^H" setting in "Keys" section of the Mintty options
    # (see https://github.com/sbt/sbt/issues/562).
    stty -icanon min 1 -echo > /dev/null 2>&1
    export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
    "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
    stty icanon echo > /dev/null 2>&1
  else
    export SPARK_SUBMIT_OPTS
    "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
  fi
}

# Copy restore-TTY-on-exit functions from Scala script so spark-shell exits properly even in
# binary distribution of Spark where Scala is not installed
exit_status=127
saved_stty=""

# restore stty settings (echo in particular)
function restoreSttySettings() {
  stty $saved_stty
  saved_stty=""
}

function onExit() {
  if [[ "$saved_stty" != "" ]]; then
    restoreSttySettings
  fi
  exit $exit_status
}

# to reenable echo if we are interrupted before completing.
trap onExit INT

# save terminal settings
saved_stty=$(stty -g 2>/dev/null)
# clear on error so we don't later try to restore them
if [[ ! $? ]]; then
  saved_stty=""
fi

main "$@"

# record the exit status lest it be overwritten:
# then reenable echo and propagate the code.
exit_status=$?
onExit

  1. cygwin=false
    Cygwin is what you install to run a Linux-like environment on Windows; the flag starts out as false and is set to true below when uname reports a Cygwin system.

  2. The shell's uname command prints information about the current system; with no options it behaves like uname -s (see the example after the option list).

    uname usage:
    -a or --all: print all information;
    -m or --machine: print the machine (hardware) type;
    -n or --nodename: print the network node hostname;
    -r or --release: print the operating system release;
    -s or --sysname: print the operating system name;
    -v: print the operating system version;
    -p or --processor: print the processor type, or "unknown";
    -i or --hardware-platform: print the hardware platform, or "unknown";
    -o or --operating-system: print the operating system name;
    --help: display help;
    --version: display version information.
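
    For example (output is illustrative and varies by machine):
    $ uname -s
    Linux               # a typical Linux box
    $ uname -s
    CYGWIN_NT-10.0      # Cygwin on Windows, matched by the script's CYGWIN* pattern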

  3. The shell's case syntax:

    case $VARIABLE in
    	pattern-1)
    		command
    	;;
    	pattern-2)
    		command
    	;;
    	*)
    		command
    	;;
    esac
    
    Example:
    read -p 'press key:' KEY
    case $KEY in 
    	[a-zA-Z])
    	echo 'a letter';;
    	[0-9])
    	echo 'a digit';;
    	*)
    	echo 'other';;
    esac
    
  4. if [ -z "${SPARK_HOME}" ];
    is true when ${SPARK_HOME} has zero length, i.e. the variable is unset or empty.
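
    A minimal sketch of the -z test (the value /opt/spark is illustrative):
    unset SPARK_HOME
    [ -z "${SPARK_HOME}" ] && echo "empty"    # prints: empty
    SPARK_HOME=/opt/spark
    [ -z "${SPARK_HOME}" ] && echo "empty"    # prints nothing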

  5. dirname "$0"
    yields the directory of the currently running command, where $0 expands to the command as it was invoked.
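
    A quick illustration, assuming the script was invoked as /opt/spark/bin/spark-shell (path is hypothetical):
    echo "$0"                # /opt/spark/bin/spark-shell
    echo "$(dirname "$0")"   # /opt/spark/bin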

  6. The execution/call chain visible from the script:
    spark-shell(main) → spark-submit → spark-class → org.apache.spark.launcher.Main "$@"
    In other words, spark-shell still delegates to spark-submit; keep tracing down and you will find that the bottom layer uses java to launch everything.
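
    One way to confirm this is to inspect the process list while spark-shell is running: org.apache.spark.launcher.Main only builds a java command line, which spark-class then executes, so what survives is a plain java process running SparkSubmit (paths below are illustrative and output is abbreviated):
    $ ps -ef | grep -i spark
    ... java -cp /opt/spark/conf/:/opt/spark/jars/* -Dscala.usejavacp=true \
          org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main \
          --name "Spark shell" spark-shell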

  7. REPL: Read-Eval-Print Loop, the interactive cycle of reading a line, evaluating it, printing the result, and looping.
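
    A short spark-shell session makes the cycle concrete (sc is the SparkContext the shell creates for you; output is abbreviated):
    scala> val nums = sc.parallelize(1 to 10)   // read: the line is parsed
    nums: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at parallelize ...
    
    scala> nums.sum()                           // eval: the expression is evaluated
    res0: Double = 55.0                         // print: the result is echoed, then the loop repeats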
