Hive 1.2.2 Installation (Local Mode)

  • Prerequisites
    • Java 1.7 (Hive 1.2 and later require Java 1.7 or newer)
    • Hadoop 2.x (preferred); 1.x (not supported by Hive 2.0.0 and later)
    • Set the HADOOP_HOME (or HADOOP_PREFIX) environment variable to the Hadoop installation directory, or add Hadoop's bin directory to PATH, as sketched below
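
    A minimal sketch of the Hadoop prerequisite; the path /opt/hadoop-2.7.3 is only an assumed example and should be replaced with the actual Hadoop installation directory:

      # /etc/profile -- make the Hadoop installation visible to Hive (example path, adjust to your environment)
      export HADOOP_HOME=/opt/hadoop-2.7.3
      export PATH=$HADOOP_HOME/bin:$PATH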

Installing Hive from a Stable Release

  1. Install and configure a MySQL server to store the Hive metadata

    1. Install mysql-server (steps omitted here)
    2. Create the database hive_meta and the user hive with password hive
      create database hive_meta;
      create user hive;
      grant all on hive_meta.* to hive@'%' identified by 'hive';
      flush privileges;
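
    A quick connectivity check before wiring the account into Hive; 172.16.117.61:3306 is the MySQL address used later in hive-site.xml, so adjust it to your server:

      # verify that the hive account can reach MySQL and see the new database
      mysql -h 172.16.117.61 -P 3306 -u hive -phive -e "show databases like 'hive_meta';"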
      
  2. Download the hive-1.2.2 release package from the Apache download site

    wget http://apache.fayea.com/hive/hive-1.2.2/apache-hive-1.2.2-bin.tar.gz
    
  3. Extract the archive

    tar -zxvf apache-hive-1.2.2-bin.tar.gz
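
    Optionally, move the extracted directory to a permanent location before configuring HIVE_HOME; the target path below is only an assumed example:

    # example only -- pick whatever installation directory suits your layout
    mv apache-hive-1.2.2-bin /usr/local/apache-hive-1.2.2-bin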
    
  4. Set the HIVE_HOME environment variable to the Hive installation directory and add $HIVE_HOME/bin to PATH

    vi /etc/profile
    # add the following two lines
    export HIVE_HOME=
    export PATH=$HIVE_HOME/bin:$PATH
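
    After editing /etc/profile, reload it so the new variables take effect in the current shell, then confirm that the hive command resolves:

    # apply the new environment variables and verify the result
    source /etc/profile
    echo $HIVE_HOME
    which hive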
    
  5. Configure hive-site.xml

    # change to the Hive installation directory
    cd $HIVE_HOME
    # conf/hive-default.xml.template is for documentation only; to override Hive's defaults, create a new file conf/hive-site.xml
    vi conf/hive-site.xml
    # add the following content (adjust the values to match your environment)
    <configuration>
      <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>
        <description>location of default database for the warehouse</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>Driver class name for a JDBC metastore</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://172.16.117.61:3306/hive_meta?createDatabaseIfNotExist=true</value>
        <description>JDBC connect string for a JDBC metastore</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
        <description>Username to use against metastore database</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive</value>
        <description>password to use against metastore database</description>
      </property>
    </configuration>
    
  6. Add the MySQL JDBC driver to the $HIVE_HOME/lib directory

    1. Download the MySQL JDBC driver
      wget https://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.42.tar.gz
      
    2. Extract it
      tar -zxvf mysql-connector-java-5.1.42.tar.gz
      
    3. Copy the driver jar into $HIVE_HOME/lib
      cp mysql-connector-java-5.1.42/mysql-connector-java-5.1.42-bin.jar $HIVE_HOME/lib
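
    Two preparations (not part of the original steps) are commonly done before the first launch: create the warehouse directory configured in hive-site.xml on HDFS, and optionally initialize the metastore schema with the schematool utility bundled with Hive 1.2 (Hive can also create the schema on first use):

      # create the HDFS warehouse directory referenced by hive.metastore.warehouse.dir
      hdfs dfs -mkdir -p /user/hive/warehouse
      hdfs dfs -chmod g+w /user/hive/warehouse
      # optional: create the metastore tables in MySQL up front
      schematool -dbType mysql -initSchema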
      
  7. Start the Hive CLI

    hive
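
    As a quick smoke test of the metastore connection, a few statements can be run non-interactively (the table name smoke_test is arbitrary):

    # create and drop a throwaway table to confirm that metadata lands in MySQL
    hive -e "show databases; create table smoke_test (id int); show tables; drop table smoke_test;"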
    

FAQ

  1. Error message:
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:101)
at jline.TerminalFactory.get(TerminalFactory.java:158)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.setupConsoleReader(CliDriver.java:787)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:721)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
  • Cause: $HIVE_HOME/lib/jline-*.jar conflicts with $HADOOP_HOME/share/hadoop/yarn/lib/jline-*.jar
  • Fix: replace the lower-version jline jar with the higher-version one
    rm -f $HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar
    cp $HIVE_HOME/lib/jline-2.12.jar $HADOOP_HOME/share/hadoop/yarn/lib/
    
  2. Error message:
ERROR [main]: exec.DDLTask (DDLTask.java:failed(520)) - org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:file:/user/hive/warehouse/test is not a directory or unable to create one)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:720)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4135)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:306)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1676)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1435)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1218)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1082)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1072)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
  • Fix: set the environment variable HADOOP_DEV_HOME= (see the sketch below)
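
    A minimal sketch of this fix, assuming Hadoop is installed under /opt/hadoop-2.7.3 (replace with the actual path):

    # /etc/profile -- point HADOOP_DEV_HOME at the Hadoop installation directory (example path)
    export HADOOP_DEV_HOME=/opt/hadoop-2.7.3
    # reload the profile and retry the failing statement
    source /etc/profile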

Reposted from: https://my.oschina.net/u/3446722/blog/956639
