Spark spark-shell: java.lang.NoClassDefFoundError: parquet/hadoop/ParquetOutputCommitter



Spark version:


(screenshot showing the Spark version; image not preserved)


The error:




Please instead use:
 - ./spark-submit with --driver-class-path to augment the driver classpath
 - spark.executor.extraClassPath to augment the executor classpath
        
18/03/01 11:36:50 WARN spark.SparkConf: Setting 'spark.executor.extraClassPath' to ':/home/hadoop/spark/lib/jackson-annotations-2.2.3.jar:/home/hadoop/spark/lib/jackson-core-2.2.3.jar:/home/hadoop/spark/lib/jackson-databind-2.2.3.jar:/home/hadoop/spark/lib/jackson-jaxrs-1.9.2.jar:/home/hadoop/spark/lib/jackson-xc-1.9.2.jar' as a work-around.
18/03/01 11:36:50 WARN spark.SparkConf: Setting 'spark.driver.extraClassPath' to ':/home/hadoop/spark/lib/jackson-annotations-2.2.3.jar:/home/hadoop/spark/lib/jackson-core-2.2.3.jar:/home/hadoop/spark/lib/jackson-databind-2.2.3.jar:/home/hadoop/spark/lib/jackson-jaxrs-1.9.2.jar:/home/hadoop/spark/lib/jackson-xc-1.9.2.jar' as a work-around.
Spark context available as sc (master = yarn-client, app id = application_1519812925694_0007).
java.lang.NoClassDefFoundError: parquet/hadoop/ParquetOutputCommitter
at org.apache.spark.sql.SQLConf$.<init>(SQLConf.scala:319)
at org.apache.spark.sql.SQLConf$.<clinit>(SQLConf.scala)
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:85)
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1038)
at $iwC$$iwC.<init>(<console>:15)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:133)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:305)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:160)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1064)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: parquet.hadoop.ParquetOutputCommitter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 52 more





The fix:


Download parquet-hadoop-1.4.3.jar and add it to Spark's classpath.
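Before (or after) deploying the jar, it can help to confirm which jar actually contains the missing class. A `NoClassDefFoundError` names the class with slashes (`parquet/hadoop/ParquetOutputCommitter`), and inside a jar it is stored as a `.class` entry under that same path. This is a small diagnostic sketch, not part of the original post; the directory path matches the one used in this post and should be adjusted for your install:

```python
# Scan the jars on Spark's classpath for the class the JVM reported missing.
import zipfile
from pathlib import Path

MISSING = "parquet/hadoop/ParquetOutputCommitter.class"

def find_class(lib_dir: str, entry: str = MISSING) -> list[str]:
    """Return the names of jars under lib_dir that contain the given class entry."""
    hits = []
    for jar in Path(lib_dir).glob("*.jar"):
        with zipfile.ZipFile(jar) as zf:
            if entry in zf.namelist():
                hits.append(jar.name)
    return hits

# Example (path as used in this post; adjust for your install):
# print(find_class("/home/hadoop/spark/lib"))
```

If the scan returns an empty list, no jar in that directory provides the class, which is exactly the situation the error above describes.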


In Spark's conf directory, add the following line to spark-defaults.conf:




spark.driver.extraClassPath /home/hadoop/spark/lib/jackson-annotations-2.2.3.jar:/home/hadoop/spark/lib/jackson-core-2.2.3.jar:/home/hadoop/spark/lib/jackson-databind-2.2.3.jar:/home/hadoop/spark/lib/jackson-jaxrs-1.9.2.jar:/home/hadoop/spark/lib/jackson-xc-1.9.2.jar:/home/hadoop/spark/lib/parquet-hadoop-1.4.3.jar
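The line above only extends the driver classpath. The deprecation warning in the log also sets `spark.executor.extraClassPath`, so if executor tasks need the Parquet classes as well (for example when jobs write Parquet output on YARN), a matching executor entry can be added. This is a sketch assuming the jars live in the same locations:

```
spark.executor.extraClassPath /home/hadoop/spark/lib/jackson-annotations-2.2.3.jar:/home/hadoop/spark/lib/jackson-core-2.2.3.jar:/home/hadoop/spark/lib/jackson-databind-2.2.3.jar:/home/hadoop/spark/lib/jackson-jaxrs-1.9.2.jar:/home/hadoop/spark/lib/jackson-xc-1.9.2.jar:/home/hadoop/spark/lib/parquet-hadoop-1.4.3.jar
```

Restart spark-shell after editing spark-defaults.conf for the change to take effect.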
