Compiling the spark-project fork of Hive

Build with the hadoop-2 profile and skip the tests; the trailing install goal publishes the artifacts to the local Maven repository (~/.m2/repository):

mvn clean compile package install -Phadoop-2 -DskipTests

 

main:
   [delete] Deleting directory /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp
   [delete] Deleting directory /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/warehouse
    [mkdir] Created dir: /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp
    [mkdir] Created dir: /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/warehouse
    [mkdir] Created dir: /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp/conf
     [copy] Copying 11 files to /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-packaging ---
[INFO] 
[INFO] --- maven-gpg-plugin:1.4:sign (sign-artifacts) @ hive-packaging ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-packaging ---
[INFO] Installing /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/pom.xml to /root/.m2/repository/org/spark-project/hive/hive-packaging/1.2.1.spark/hive-packaging-1.2.1.spark.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive ............................................... SUCCESS [  2.563 s]
[INFO] Hive Shims Common .................................. SUCCESS [  3.779 s]
[INFO] Hive Shims 0.20S ................................... SUCCESS [  1.568 s]
[INFO] Hive Shims 0.23 .................................... SUCCESS [  5.433 s]
[INFO] Hive Shims Scheduler ............................... SUCCESS [  2.011 s]
[INFO] Hive Shims ......................................... SUCCESS [  1.557 s]
[INFO] Hive Common ........................................ SUCCESS [  5.571 s]
[INFO] Hive Serde ......................................... SUCCESS [  5.134 s]
[INFO] Hive Metastore ..................................... SUCCESS [ 15.928 s]
[INFO] Hive Ant Utilities ................................. SUCCESS [  0.552 s]
[INFO] Spark Remote Client ................................ SUCCESS [  6.468 s]
[INFO] Hive Query Language ................................ SUCCESS [ 48.084 s]
[INFO] Hive Service ....................................... SUCCESS [  5.605 s]
[INFO] Hive Accumulo Handler .............................. SUCCESS [  4.734 s]
[INFO] Hive JDBC .......................................... SUCCESS [ 13.971 s]
[INFO] Hive Beeline ....................................... SUCCESS [  3.101 s]
[INFO] Hive CLI ........................................... SUCCESS [  2.993 s]
[INFO] Hive Contrib ....................................... SUCCESS [  2.797 s]
[INFO] Hive HBase Handler ................................. SUCCESS [  5.414 s]
[INFO] Hive HCatalog ...................................... SUCCESS [  0.950 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [  4.163 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [  3.001 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [  3.124 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [  3.362 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [ 12.030 s]
[INFO] Hive HCatalog Streaming ............................ SUCCESS [  3.114 s]
[INFO] Hive HWI ........................................... SUCCESS [  3.020 s]
[INFO] Hive ODBC .......................................... SUCCESS [  2.443 s]
[INFO] Hive Shims Aggregator .............................. SUCCESS [  0.211 s]
[INFO] Hive TestUtils ..................................... SUCCESS [  0.227 s]
[INFO] Hive Packaging ..................................... SUCCESS [  3.342 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:57 min
[INFO] Finished at: 2015-12-17T17:02:52+08:00
[INFO] Final Memory: 393M/11096M
[INFO] ------------------------------------------------------------------------
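As an aside, the reactor summary above is easy to mine when a build feels slow. A small illustrative Python sketch (the sample log lines are copied from the output above; nothing here is part of the Hive build itself):

```python
# Rank Maven reactor modules by build time, parsed from summary lines of the
# form: "[INFO] <module> ..... SUCCESS [ 12.345 s]".
import re

REACTOR_LOG = """\
[INFO] Hive Metastore ..................................... SUCCESS [ 15.928 s]
[INFO] Hive Query Language ................................ SUCCESS [ 48.084 s]
[INFO] Hive JDBC .......................................... SUCCESS [ 13.971 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [ 12.030 s]
"""

LINE_RE = re.compile(r"\[INFO\] (.+?) \.+ SUCCESS \[\s*([\d.]+) s\]")


def slowest_modules(log, n=3):
    """Return the n modules with the longest build times, slowest first."""
    timings = [(m.group(1), float(m.group(2)))
               for m in map(LINE_RE.match, log.splitlines()) if m]
    return sorted(timings, key=lambda t: t[1], reverse=True)[:n]


for name, secs in slowest_modules(REACTOR_LOG):
    print(f"{name}: {secs:.1f} s")
```

On the full summary above, Hive Query Language (48 s) dominates the ~3-minute build, so that is where incremental builds (`mvn install -pl ql`) would save the most time.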

 
