Pitfalls from Upgrading Hive 2.3.6 to Hive 3.1.3

1. coalesce error

FAILED: SemanticException [Error 10014]: Line 197:4 Wrong arguments ''10'': Unsafe compares
between different types are disabled for safety reasons. If you know what you are doing,
please set hive.strict.checks.type.safety to false and make sure that hive.mapred.mode is
not set to 'strict' to proceed. Note that you may get errors or incorrect results if you
make a mistake while using some of the unsafe features.

Fix: set hive.strict.checks.type.safety=false (the default is true).
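There are two ways to get past this check. A minimal sketch (the table `t` and column `col` below are hypothetical, standing in for whatever query triggered the error):

```sql
-- Option 1: relax the strict type-safety check for the current session
SET hive.strict.checks.type.safety=false;

-- Option 2 (arguably safer): make the comparison type-safe with an explicit cast,
-- e.g. when a string column is compared against a numeric range like '10'
SELECT * FROM t WHERE CAST(col AS INT) BETWEEN 1 AND 10;
```

Option 2 avoids relying on Hive's implicit (and now disallowed) cross-type comparison, at the cost of touching each offending query.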

2. MR task failure

Task with the most failures(4):
-----
Task ID:
  task_1566481621886_15450555_m_000004

URL:
  http://TXIDC65-bigdata-resourcemanager1:8088/taskdetails.jsp?jobid=job_1566481621886_15450555&tipid=task_1566481621886_15450555_m_000004
-----
Diagnostic Messages for this Task:
Task KILL is received. Killing attempt!

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: Map: 65  Reduce: 11   Cumulative CPU: 804.45 sec   HDFS Read: 1354997759 HDFS Write: 10808642 SUCCESS
Stage-Stage-6: Map: 6  Reduce: 3   Cumulative CPU: 64.73 sec   HDFS Read: 212028947 HDFS Write: 19101152 SUCCESS
Stage-Stage-7: Map: 5   FAIL
Total MapReduce CPU Time Spent: 14 minutes 29 seconds 179 msec

Looking at the MR logs:

java.lang.ClassCastException: org.apache.hadoop.hive.ql.exec.vector.LongColumnVector cannot be cast to org.apache.hadoop.hive.ql.exec.vector.DecimalColumnVector

Fix: disabling vectorized execution works around the column-vector cast error: hive.vectorized.execution.enabled=false.
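This can be applied per session rather than cluster-wide, so only the affected jobs pay the performance cost of falling back to row-mode execution:

```sql
-- Work around the LongColumnVector -> DecimalColumnVector ClassCastException
-- by disabling vectorization for this session only, then re-running the query:
SET hive.vectorized.execution.enabled=false;
SET hive.vectorized.execution.reduce.enabled=false;
```

A session-level SET keeps vectorization enabled for all other workloads; putting the property in hive-site.xml would disable it globally.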

3. Reserved-keyword conflict

FAILED: ParseException line 7:1 cannot recognize input near 'application' 'string' 'comment' IN column name OR constraint

Fix: the column name application conflicts with a Hive reserved keyword, so it must be wrapped in backticks: `application`. Other commonly affected names include date, user, etc.
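A minimal sketch of the quoting (the table `demo` and comments are illustrative, not from the original job):

```sql
-- Backtick-quote column names that collide with Hive 3 reserved keywords:
CREATE TABLE demo (
  `application` STRING COMMENT 'app name',
  `date`        STRING,
  `user`        STRING
);

-- The quoting is also needed when referencing the columns:
SELECT `application`, `date` FROM demo WHERE `user` = 'alice';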

4. from_unixtime default timezone changed

A huge pitfall! Luckily a data-warehouse teammate caught this before go-live.
Starting with Hive 3.1.0, all time handling was switched to UTC, so the results of time UDFs such as from_unixtime come out 8 hours earlier (in a UTC+8 environment).
Code change: (screenshot in the original post)
Fix: roll back the relevant code from HIVE-12192.
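The behavior difference can be illustrated with a trivial query (shown for a server in the Asia/Shanghai, UTC+8 timezone; the exact outputs depend on your server's timezone):

```sql
-- Hive 2.3.6: from_unixtime renders in the server's local timezone (UTC+8 here)
SELECT from_unixtime(0);  -- 1970-01-01 08:00:00

-- Hive 3.1.3 (after HIVE-12192): the result is rendered in UTC
SELECT from_unixtime(0);  -- 1970-01-01 00:00:00, i.e. 8 hours earlier
```

Any downstream logic that assumed local-time strings (partitions, date filters, reports) will silently shift by 8 hours, which is why this is worth a dedicated regression check before go-live.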

5. Add the following to mapred-site.xml

<property>
  <name>yarn.app.mapreduce.am.env</name>
  <value>HADOOP_MAPRED_HOME=/usr/local/yunji/hadoop</value>
</property>
<property>
  <name>mapreduce.map.env</name>
  <value>HADOOP_MAPRED_HOME=/usr/local/yunji/hadoop</value>
</property>
<property>
  <name>mapreduce.reduce.env</name>
  <value>HADOOP_MAPRED_HOME=/usr/local/yunji/hadoop</value>
</property>
