A Collection of Hive Errors

1.Terminal initialization failed; falling back to unsupported

Error when starting Hive:

Terminal initialization failed; falling back to unsupported

Solution:

Some blog posts I read say that Hadoop's share/hadoop/yarn/lib/jline-0.9.94.jar is too old, and that you should replace jline-0.9.94.jar with Hive's lib/jline-2.12.jar. Personally I don't think that is advisable: the error really means the Hadoop and Hive versions don't match, so the clean fix is to change the Hadoop or Hive version so the two match; otherwise the mismatch is likely to cause trouble later.
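For completeness, the jar swap those posts describe looks roughly like this (a sketch only; $HADOOP_HOME and $HIVE_HOME are assumed to point at your installations, and the jar versions are the ones those posts mention):

```shell
# Workaround circulated in other posts (not my recommendation):
# back up Hadoop's old jline and drop in the newer one shipped with Hive.
mv "$HADOOP_HOME"/share/hadoop/yarn/lib/jline-0.9.94.jar \
   "$HADOOP_HOME"/share/hadoop/yarn/lib/jline-0.9.94.jar.bak
cp "$HIVE_HOME"/lib/jline-2.12.jar "$HADOOP_HOME"/share/hadoop/yarn/lib/

# Alternatively, Hive's Getting Started documentation suggests letting the
# user classpath win, so Hive's own jline is picked up first:
export HADOOP_USER_CLASSPATH_FIRST=true
```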


2.check the manual that corresponds to your MySQL server version for the right syntax to use near

Error when dropping a table:

hive> drop table aaa;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:
javax.jdo.JDODataStoreException: You have an error in your SQL syntax; check the manual that corresponds to
 your MySQL server version for the right syntax to use near 'OPTION SQL_SELECT_LIMIT=DEFAULT' at line 1

Solution:

This is caused by a mismatch between the MySQL server version and the JDBC driver version. My MySQL server is 5.6; I initially used driver 5.1.18, but MySQL 5.6 has dropped the SQL_SELECT_LIMIT option that old drivers still send, hence the error above. Replacing the driver with mysql-connector-java-5.1.31-bin.jar solved it.
Driver download: http://dev.mysql.com/downloads/connector/j/
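The driver swap itself is just replacing the jar on Hive's classpath (a sketch; $HIVE_HOME and the exact jar file names are assumptions based on my setup):

```shell
# Remove the old Connector/J and install the newer one into Hive's lib dir.
rm "$HIVE_HOME"/lib/mysql-connector-java-5.1.18-bin.jar
cp mysql-connector-java-5.1.31-bin.jar "$HIVE_HOME"/lib/
# Restart the Hive CLI / metastore service afterwards so the new jar is loaded.
```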


3.For direct MetaStore DB connections, we don't support retries at the client level.

Error when creating a table:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. 
MetaException(message:For direct MetaStore DB connections, we don't support retries at the client level.)

Solution:

This is a character-set problem; set the character set of the metastore database in MySQL as follows:

mysql> alter database hive character set latin1;

Here "hive" is the database you configured as the Hive metastore.
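You can confirm the change took effect with a query against information_schema (assuming, as above, that the metastore database is named hive):

```shell
# Show the default character set of the metastore database.
mysql -u root -p -e "SELECT default_character_set_name \
  FROM information_schema.SCHEMATA WHERE schema_name = 'hive';"
```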


4.extraneous input 'employee' expecting EOF near '<EOF>'

Error when dropping an index:

hive (mydb)> DROP INDEX IF EXISTS employee_index ON TABLE employee;  
FAILED: ParseException line 1:45 extraneous input 'employee' expecting EOF near '<EOF>'

Solution:

DROP INDEX IF EXISTS employee_index ON TABLE employee; is the example from section 8.4 of "Programming Hive". That book targets Hive 0.7.0, 0.8.0, and 0.9.0, while I am on Hive 1.0.1; I'm not sure whether it is a version issue, but the statement works after changing it to the following:

hive> DROP INDEX IF EXISTS employee_index ON employee;

Also, while looking into this I found similar issues reported at https://issues.apache.org/jira/browse/HIVE-8182 and https://issues.apache.org/jira/browse/HIVE-9336.

5.Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.lazy.LazyString cannot be cast to java.lang.String

When implementing a GenericUDF, the following code:

List<String> list = (List<String>) this.listOI.getList(arguments[0].get());

failed with:

Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.lazy.LazyString cannot be cast to java.lang.String
    at edu.wzm.hive.ComplexUDFExample.evaluate(ComplexUDFExample.java:60)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:185)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluatorHead._evaluate(ExprNodeEvaluatorHead.java:44)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:65)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:77)
    ... 17 more

Solution:

When implementing a GenericUDF on Hive 1.0.1, the objects handed to evaluate() are the Lazy types from the org.apache.hadoop.hive.serde2.lazy package, so they must be handled as such rather than cast directly to plain Java types. I haven't read Hive's source code myself, but this issue traces the cause in the source: https://issues.apache.org/jira/browse/HIVE-11532. So the code was changed to:

// listOI (ListObjectInspector) and elementsOI (StringObjectInspector)
// were obtained in initialize(); convert each element through the
// ObjectInspector instead of casting the list to List<String>.
int elemNum = this.listOI.getListLength(arguments[0].get());
for (int i = 0; i < elemNum; i++) {
    LazyString lelement = (LazyString) this.listOI.getListElement(arguments[0].get(), i);
    String element = elementsOI.getPrimitiveJavaObject(lelement);
}


6.FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V

Error when integrating Hive with HBase:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V

Solution:

This is caused by a version mismatch between Hive and HBase. For the concrete steps, see my post Hive集成HBase(一).
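A quick way to see whether the versions line up is to compare the HBase jars Hive has on its classpath with the HBase you are actually running (a sketch; $HIVE_HOME is an assumed environment variable, and the jar layout may differ between versions):

```shell
# HBase-related jars visible to Hive:
ls "$HIVE_HOME"/lib/hive-hbase-handler-*.jar "$HIVE_HOME"/lib/hbase-*.jar
# Version of the HBase cluster itself:
hbase version
```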


7.java.lang.NoClassDefFoundError: org/cliffc/high_scale_lib/Counter

Error when integrating Hive with HBase:

java.lang.NoClassDefFoundError: org/cliffc/high_scale_lib/Counter

Solution:

This comes from the hbase-handler communication code in the Hive source; the root cause is that the integrated HBase version is greater than 1.0. For the concrete steps, see my post Hive集成HBase(一).
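To check whether any jar on the HBase side still provides that class, you can scan the lib directory (a sketch; $HBASE_HOME is an assumed environment variable — if nothing is printed, the class is absent, which matches the version analysis above):

```shell
# Search every HBase jar for the missing org.cliffc.high_scale_lib.Counter class.
for j in "$HBASE_HOME"/lib/*.jar; do
  unzip -l "$j" 2>/dev/null | grep -q 'org/cliffc/high_scale_lib/Counter' && echo "$j"
done
```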



PS: This error collection will keep being updated with the various errors I run into.


