Connecting Spark to Hive Locally from IDEA

Steps


  1. Copy the hive-site.xml and hdfs-site.xml files into the project's resources folder.
  2. In Maven, add the spark-hive_2.11 dependency, plus the JDBC driver needed to reach the metastore database (here mysql-connector-java):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>

    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>5.1.47</version>
    </dependency>
  3. Enable Hive support when building the SparkSession in code (a complete runnable sketch follows this list):

val spark = SparkSession.builder().master("local[*]").appName("Demo").enableHiveSupport().getOrCreate()
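
Putting the three steps together, here is a minimal end-to-end sketch. The names demo_db and demo_table are placeholders (not from the original post); replace them with a database and table that actually exist in your metastore. spark-hive_2.11 implies Scala 2.11 and a matching Spark 2.x build.

import org.apache.spark.sql.SparkSession

object HiveDemo {
  def main(args: Array[String]): Unit = {
    // Needs hive-site.xml and hdfs-site.xml on the classpath (src/main/resources),
    // plus the spark-hive_2.11 and mysql-connector-java dependencies from step 2.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("Demo")
      .enableHiveSupport()
      .getOrCreate()

    // Lists the databases registered in the Hive metastore; seeing only
    // "default" usually means Spark is not actually talking to your metastore.
    spark.sql("show databases").show()

    // demo_db.demo_table is a hypothetical name; replace it with your own table.
    spark.sql("select * from demo_db.demo_table limit 10").show()

    spark.stop()
  }
}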

Possible problems


  1. The following exception is thrown:
javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
	at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)

Cause: the hdfs-site.xml file is missing from the resources folder (a quick check follows below).
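
A quick way to confirm which file is missing is a plain classpath lookup, no Spark involved; this is just a debugging sketch you can drop anywhere in the driver code:

// Each call returns the file's URL if it is on the classpath, or null if not.
// A null for hdfs-site.xml means it is absent from src/main/resources
// (or the build did not copy it into target/classes).
println(getClass.getClassLoader.getResource("hive-site.xml"))
println(getClass.getClassLoader.getResource("hdfs-site.xml"))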

  2. Hive tables cannot be found.
    Cause: the spark-hive_2.11 dependency was not added in Maven (see the sketch below for a way to check).
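
One way to check this from code, a sketch assuming Spark 2.x: when Hive support is not active, Spark uses its built-in in-memory catalog instead of the Hive metastore, so Hive tables are simply invisible. The catalog implementation can be read from the session configuration:

// Prints "hive" when Hive support is active, "in-memory" when Spark
// fell back to the built-in catalog (e.g. spark-hive missing from the POM).
println(spark.conf.get("spark.sql.catalogImplementation"))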

Reposted from: https://my.oschina.net/dreamness/blog/3083167
