Spark from Shallow to Deep (1): Installation, Testing, and Troubleshooting

Installation & Deployment

// Pick the version you need.
Official download page: http://spark.apache.org/downloads.html

// Deploy: unpack the tarball and enter the directory.
tar -zxf spark-1.4.0-bin-hadoop2.6.tgz
cd spark-1.4.0-bin-hadoop2.6

// Launch the Spark shell; here we use the Python one (PySpark).
bin/pyspark

Problems & Troubleshooting

Error 1: “…….Name or service not known”
[GCC 4.4.7 20120313 (Red Hat 4.4.7-18)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/20 15:13:11 INFO SparkContext: Running Spark version 1.4.0
17/10/20 15:13:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/10/20 15:13:12 ERROR SparkContext: Error initializing SparkContext.
java.net.UnknownHostException: Gee01: Gee01: Name or service not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:821)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:814)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:814)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:871)
	at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:871)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:871)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:387)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.UnknownHostException: Gee01: Name or service not known
    at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
    ... 20 more
17/10/20 15:13:12 INFO SparkContext: Successfully stopped SparkContext
Traceback (most recent call last):
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/pyspark/shell.py", line 43, in 
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/pyspark/context.py", line 113, in __init__
    conf, jsc, profiler_cls)
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/pyspark/context.py", line 165, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/pyspark/context.py", line 219, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 701, in __call__
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.UnknownHostException: Gee01: Gee01: Name or service not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:821)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:814)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:814)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:871)
	at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:871)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:871)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:387)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.UnknownHostException: Gee01: Name or service not known
    at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
    ... 20 more

Cause: the machine's hostname cannot be resolved. As the stack trace shows, SparkContext calls InetAddress.getLocalHost() during startup; if the local hostname has no matching entry in /etc/hosts (or DNS), the lookup throws UnknownHostException and the context never initializes.
Fix:

// Check your machine's hostname; mine is Gee01.
hostname

// Edit the hosts file and save it.
vim /etc/hosts

// Make sure this line exists; the last field must match your hostname.
127.0.0.1 localhost Gee01
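
Before relaunching Spark, you can verify the fix from any Python prompt. This is just a quick sanity check using the standard library, not something Spark itself requires:

# Resolve the local hostname, roughly what Java's
# InetAddress.getLocalHost() does; expect 127.0.0.1 after the fix.
import socket
print(socket.gethostbyname(socket.gethostname()))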

Run the shell again and it will start normally this time.

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Python version 2.7.13 (default, Aug 30 2017 11:49:17)
SparkContext available as sc, HiveContext available as sqlContext.
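
Since sc is already available in the shell, a short smoke test confirms the installation actually computes something. This snippet is only an illustration, not part of the official quick start:

# Distribute a small range and count the even numbers; expect 50.
nums = sc.parallelize(range(100))
print(nums.filter(lambda x: x % 2 == 0).count())
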
Problem 2: The log output is too verbose. Can it be reduced?

Description: after entering a single command, far too many log lines are printed.
Cause: the root log level is set to INFO by default.
Fix:

// Enter the configuration directory.
cd spark-1.4.0-bin-hadoop2.6/conf

// Copy the log4j config template to log4j.properties.
cp log4j.properties.template log4j.properties

// Edit the log configuration file.
vim log4j.properties

// Change the first line
log4j.rootCategory=INFO, console # before
log4j.rootCategory=WARN, console # after

Re-enter the shell and you will see much less log output.
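
If you prefer not to edit the config file, the log level can also be changed per session from inside the shell; as far as I recall, SparkContext.setLogLevel() was introduced in this very release (1.4.0):

# Runtime alternative: affects only the current shell session.
sc.setLogLevel("WARN")  # revert with sc.setLogLevel("INFO")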

To exit the shell, type quit() or press Ctrl+D.

Problem 3: The PySpark shell has no tab completion, which is painful. Is there a workaround?

Fix:

// Run this from the command line to re-enter the shell with IPython.
IPYTHON=1 ./bin/pyspark
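
With IPython driving the shell you get tab completion and introspection. For example (IPython syntax, shown purely as an illustration):

# Type sc.<Tab> to list SparkContext methods, or append ? for help:
sc.parallelize?

Note that later Spark releases drop the IPYTHON variable in favor of PYSPARK_DRIVER_PYTHON=ipython, so prefer that form if you upgrade.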
