Hive 2.3.6 startup error: Relative path in absolute URI

Problem description: starting Hive fails with the following error:

Logging initialized using configuration in file:/opt/module/hive-2.3.6/conf/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bhive.session.id%7D_resources
        at org.apache.hadoop.fs.Path.initialize(Path.java:205)
        at org.apache.hadoop.fs.Path.<init>(Path.java:171)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:666)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:586)
        at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:553)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:750)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bhive.session.id%7D_resources
        at java.net.URI.checkPath(URI.java:1804)
        at java.net.URI.<init>(URI.java:752)
        at org.apache.hadoop.fs.Path.initialize(Path.java:202)
        ... 12 more

Solution: the error occurs because hive-site.xml (typically copied from hive-default.xml.template) still contains unresolved `${system:java.io.tmpdir}` and `${hive.session.id}` placeholders, which Hive then tries to use as a literal path. Replace them with absolute local directories by editing the following properties in hive-site.xml:

  <!-- Scratch files produced by Hive jobs go under /opt/module/hive-2.3.6/data/tmp; create this directory yourself -->
  <property>
    <name>hive.exec.local.scratchdir</name>
    <value>/opt/module/hive-2.3.6/data/tmp</value>
    <description>Local scratch space for Hive jobs</description>
  </property>

  <!-- Temporary download directory, placed under /opt/module/hive-2.3.6/download -->
  <property>
    <name>hive.downloaded.resources.dir</name>
    <value>/opt/module/hive-2.3.6/download</value>
    <description>Temporary local directory for added resources in the remote file system.</description>
  </property>

  <property>
    <name>hive.exec.scratchdir</name>
    <value>/tmp/hive</value>
    <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
  </property>

Notes: the directories above are ones I created myself; pick locations that suit your own setup. After making these changes, restarting bin/hive succeeds.
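
For reference, a minimal command sketch that creates the local directories used above and restarts Hive; the paths follow this post's layout under /opt/module/hive-2.3.6 and should be adjusted to your own installation:

    # create the local scratch and download directories referenced in hive-site.xml
    mkdir -p /opt/module/hive-2.3.6/data/tmp
    mkdir -p /opt/module/hive-2.3.6/download

    # restart the Hive CLI from the installation directory
    cd /opt/module/hive-2.3.6
    bin/hive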
