Exception Notes

1. Spark startup exception: Caused by: java.net.NoRouteToHostException: No route to host (Host unreachable)

Caused by: java.net.NoRouteToHostException: No route to host (Host unreachable)
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:589)
        at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:214)
        at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:298)
        ... 153 more

<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^

Cause: the address of the MySQL server had changed, but the MySQL address in the Hive configuration file under Spark's conf directory was not updated in time, so MySQL could not be reached at startup.
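For reference, the metastore address lives in the hive-site.xml copied into Spark's conf directory. A minimal sketch of the relevant properties, assuming the MySQL instance now runs on hadoop102 and the metastore database is named metastore (host, port, and database name are illustrative; adjust to your environment):

<!-- hive-site.xml under Spark's conf: point the metastore at the new MySQL host -->
<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://hadoop102:3306/metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
</property>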



2. Spark fails to start when using an external Hive. The cause is that Hive is configured with the Tez execution engine, whose classes are not on Spark's classpath; switching the engine to MR fixes it (see the sketch after the error below).
java.lang.NoClassDefFoundError: org/apache/tez/dag/api/SessionNotRunning
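A minimal sketch of the change in hive-site.xml, assuming the standard Hive property (Spark's bundled Hive client ships no Tez jars, so fall back to the MapReduce engine):

<!-- hive-site.xml: fall back from tez to the MapReduce engine -->
<property>
    <name>hive.execution.engine</name>
    <value>mr</value>
</property>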



3. java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec

When connecting to Hive from spark-shell and executing SQL, the error above is thrown: the LZO compression jar cannot be found. Because core-site.xml configures LZO compression, the LZO classes are required, and missing, whenever files on HDFS are accessed.

Solution (either option works):
1. Comment out the LZO configuration in the core-site.xml under Spark's conf directory.
2. Configure the LZO path in spark-env.sh (sketch below); reference: https://blog.csdn.net/stark_summer/article/details/48375999
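A sketch of option 2 in spark-env.sh, assuming hadoop-lzo is installed under /opt/module/hadoop (jar name, version, and paths are illustrative assumptions):

# spark-env.sh: put the hadoop-lzo jar on the classpath and expose the native lzo libs
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/opt/module/hadoop/share/hadoop/common/hadoop-lzo-0.4.20.jar
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/module/hadoop/lib/native

Note that SPARK_CLASSPATH is deprecated on Spark 2.x and later; the equivalent there is setting spark.driver.extraClassPath and spark.executor.extraClassPath in spark-defaults.conf.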



4. The following exception occurred while writing Spark SQL against Hive data. The MySQL connection should only touch the metastore database, yet the message below shows it probing MySQL's built-in system database performance_schema ('./performance_schema/DBS.frm') for Hive metadata. Inspection finally revealed that the performance_schema directory was owned by root; changing its owner to mysql solved the problem.
	at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getRDBMSTableIndexInfoForTable(RDBMSSchemaHandler.java:644)
	at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getRDBMSTableIndexInfoForTable(RDBMSSchemaHandler.java:584)
	at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getSchemaData(RDBMSSchemaHandler.java:201)
	at org.datanucleus.store.rdbms.table.TableImpl.getExistingIndices(TableImpl.java:1141)
	at org.datanucleus.store.rdbms.table.TableImpl.validateIndices(TableImpl.java:572)
	at org.datanucleus.store.rdbms.table.TableImpl.validateConstraints(TableImpl.java:390)
	at org.datanucleus.store.rdbms.table.ClassTable.validateConstraints(ClassTable.java:3463)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3464)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
	at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
	at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
	at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
	at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
	at org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getStatementForCandidates(RDBMSQueryUtils.java:408)
	at org.datanucleus.store.rdbms.query.JDOQLQuery.compileQueryFull(JDOQLQuery.java:947)
	at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:370)
	at org.datanucleus.store.query.Query.executeQuery(Query.java:1744)
	at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
	at org.datanucleus.store.query.Query.execute(Query.java:1654)
	at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:183)
	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137)
	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295)
	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
	at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
	at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
	at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
	at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
	at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
	at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
	at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
	at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
	at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
	at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
	at com.cjy.sparkSQL.T3_datasource.T4_hiveLoadData$.main(T4_hiveLoadData.scala:13)
	at com.cjy.sparkSQL.T3_datasource.T4_hiveLoadData.main(T4_hiveLoadData.scala)
Caused by: java.sql.SQLException: Can't find file: './performance_schema/DBS.frm' (errno: 13 - Permission denied)
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:129)
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
	at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
	at com.mysql.cj.jdbc.StatementImpl.executeQuery(StatementImpl.java:1218)
	at com.mysql.cj.jdbc.DatabaseMetaData$6.forEach(DatabaseMetaData.java:2707)
	at com.mysql.cj.jdbc.DatabaseMetaData$6.forEach(DatabaseMetaData.java:2694)
	at com.mysql.cj.jdbc.IterateBlock.doForAll(IterateBlock.java:56)
	at com.mysql.cj.jdbc.DatabaseMetaData.getIndexInfo(DatabaseMetaData.java:2766)
	at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getRDBMSTableIndexInfoForTable(RDBMSSchemaHandler.java:615)
	... 90 more

Solution:

  1. As the ll listing below shows, performance_schema belongs to root; change its owner and group to mysql.
[root@hadoop102 mysql]# ll
total 176184
-rwxrwx---. 1 mysql mysql       56 May 31 2019 auto.cnf
drwxrwx---. 2 mysql mysql     4096 Jul 29 00:03 call
-rwxrwx---. 1 mysql mysql     4799 Jun  1 2019 hadoop101.err
-rwxrwx---. 1 mysql mysql        5 Jul 29 19:13 hadoop102.pid
drwxrwx---. 2 mysql mysql     4096 Jun  2 2019 hymc
-rwxrwx---. 1 mysql mysql 79691776 Jul 29 19:21 ibdata1
-rwxrwx---. 1 mysql mysql 50331648 Jul 29 19:21 ib_logfile0
-rwxrwx---. 1 mysql mysql 50331648 May 31 2019 ib_logfile1
drwxrwx---. 2 mysql mysql     4096 Jun  1 22:59 ke
drwxrwx---. 2 mysql mysql     4096 Jul 29 19:21 metastore
drwxrwx---. 2 mysql mysql     4096 Jul 22 2019 my@002dstudy
drwxrwx--x. 2 mysql mysql     4096 Jun  1 2019 mysql
srwxrwxrwx. 1 mysql mysql        0 Jul 29 19:13 mysql.sock
drwxrwx---. 2 root  root      4096 Jun  1 2019 performance_schema
-rwxrwxr--. 1 mysql mysql     1772 Jun  1 2019 RPM_UPGRADE_HISTORY
-rwxrwxr--. 1 mysql mysql      431 Jun  1 2019 RPM_UPGRADE_MARKER-LAST
drwxrwxr-x. 2 mysql mysql     4096 Jun  2 20:51 test

  2. Fix the ownership and permissions:
[root@hadoop102 mysql]# chown mysql performance_schema/
[root@hadoop102 mysql]# chgrp mysql performance_schema/
[root@hadoop102 mysql]# chmod ug+rwx performance_schema/
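For context, a minimal sketch of the kind of program that hit this error. Per the trace, the failure surfaces inside SparkSession.Builder.getOrCreate at T4_hiveLoadData.scala:13; the object name below mirrors the trace, while the query shown is a hypothetical example:

import org.apache.spark.sql.SparkSession

object T4_hiveLoadData {
  def main(args: Array[String]): Unit = {
    // getOrCreate builds the HiveExternalCatalog, which opens the metastore's
    // MySQL connection; this is where the permission error above surfaced
    val spark = SparkSession.builder()
      .appName("hiveLoadData")
      .enableHiveSupport()
      .getOrCreate()

    spark.sql("show databases").show()
    spark.stop()
  }
}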
