The error:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
This warning shows up because the native libraries bundled with the Hadoop binary release are built for Linux, so on macOS you have to compile them yourself.
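Before starting, you can see exactly which native components are missing with Hadoop's checknative command (assuming some existing hadoop is already on your PATH; each component is reported as true or false):

# list the native components Hadoop can find; on a stock macOS setup
# the hadoop native library typically shows up as false
hadoop checknative -a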
At first I took the easy route and installed protobuf with brew, but here's the catch: Hadoop needs protobuf 2.5.0, and it must be exactly 2.5.0, while brew installs the latest 3.7. So the build failed with the following error:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:37 min
[INFO] Finished at: 2019-04-18T03:29:07+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.1.1:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 3.7.1', expected version is '2.5.0' -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common
With no other option, I downloaded version 2.5.0 from GitHub and built it myself.
protobuf download link
Building it requires autoconf and cmake to be installed first; both can be installed with brew.
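A quick sketch of the prerequisite install, using the current Homebrew package names (automake is thrown in as well, since protobuf's autogen.sh leans on the automake/aclocal toolchain):

# build prerequisites for protobuf 2.5.0 and the later Hadoop native build
brew install autoconf automake cmake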
Following the official installation instructions, extract the archive, cd into the directory, and build.
Step 1: run autoconf to generate the configure script, but it errored out:
configure.ac:17: error: possibly undefined macro: AM_MAINTAINER_MODE
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
configure.ac:32: error: possibly undefined macro: AM_INIT_AUTOMAKE
configure.ac:49: error: possibly undefined macro: AM_CONDITIONAL
configure.ac:75: error: possibly undefined macro: AC_PROG_LIBTOOL
configure.ac:140: error: possibly undefined macro: AC_CXX_STL_HASH
Fix: install the libtool package
brew install libtool
Step 2: ./configure --prefix=/User/xxx/protobuf, which hit another problem, ugh:
bogon:protobuf-2.5.02 xxx$ ./configure --prefix=/User/weilanzhuan/protobuf
./configure: line 2166: syntax error near unexpected token `enable'
./configure: line 2166: `AM_MAINTAINER_MODE(enable)'
So I looked at the configure.ac file, which starts with this note:
## Process this file with autoconf to produce configure.
## In general, the safest way to proceed is to run ./autogen.sh
I think we all know what that means.
So I used ./autogen.sh instead of autoconf to regenerate the configure script.
It ran like this:
Google Test not present. Fetching gtest-1.5.0 from the web...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- 0:01:14 --:--:-- 0curl: (7) Failed to connect to googletest.googlecode.com port 80: Operation timed out
mv: rename gtest-1.5.0 to gtest: No such file or directory
But the gtest-1.5.0 archive would never download.
The fix is to find the following shell snippet in autogen.sh:
if test ! -e gtest; then
  echo "Google Test not present. Fetching gtest-1.5.0 from the web..."
  curl http://googletest.googlecode.com/files/gtest-1.5.0.tar.bz2 | tar jx
  mv gtest-1.5.0 gtest
fi
and replace it with:
if test ! -e gtest; then
  echo "Google Test not present. Fetching gtest-1.5.0 from the web..."
  # Fetch gtest from GitHub instead; -L follows the redirect GitHub issues
  curl -L https://github.com/google/googletest/archive/release-1.5.0.tar.gz | tar zx
  mv googletest-release-1.5.0 gtest
  #curl http://googletest.googlecode.com/files/gtest-1.5.0.tar.bz2 | tar jx
  #mv gtest-1.5.0 gtest
fi
Then rerunning ./autogen.sh went smoothly.
After that it's the usual:
./configure --prefix=<install dir>
make
make install
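Optionally, protobuf's README also suggests a make check step between make and make install, to run its own test suite (it reuses the gtest fetched by autogen.sh):

# optional sanity check before installing; takes a while
make check
# make install may need sudo if the prefix is not writable by your user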
Then add protobuf to the environment variables:
vi ~/.bash_profile
export PROTOBUF=<install dir>
export PATH=$PROTOBUF/bin:$PATH
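Reload the profile and confirm that the freshly built protoc is the one the shell now picks up (the expected version string comes from the build above):

source ~/.bash_profile
which protoc        # should point into the install dir chosen above
protoc --version    # should print: libprotoc 2.5.0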
With that done, we can finally build Hadoop.
Download the hadoop-3.1.1-src package, extract it, and cd into the directory.
Run the build command (mvn can be installed with brew install maven):
mvn package -Pdist,native -DskipTests -Dtar
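Side note: if the earlier half-finished build is still around, the hint Maven printed after the first failure also works, resuming from the module that failed instead of starting over (just a sketch; the walkthrough here does a fresh full build):

# resume the build from hadoop-common, keeping the same profiles and flags
mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-common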
At this point I thought I was home free, but one more big pitfall was waiting for me: the build failed while compiling hadoop-yarn-server-nodemanager:
/Users/weilanzhuan/Downloads/hadoop-3.1.1-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c:63:13: warning: comparison of array 'args->data' not equal to a null pointer is always true [-Wtautological-pointer-compare]
[WARNING] if (args->data != NULL) {
[WARNING] ~~~~~~^~~~ ~~~~
[WARNING] /Users/xxx/Downloads/hadoop-3.1.1-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c:1227:12: error: no matching function for call to 'getgrouplist'
[WARNING] int rc = getgrouplist(user, pw->pw_gid, groups, &ngroups);
[WARNING] ^~~~~~~~~~~~
[WARNING] /Library/Developer/CommandLineTools/SDKs/MacOSX10.14.sdk/usr/include/unistd.h:653:6: note: candidate function not viable: no known conversion from 'gid_t *' (aka 'unsigned int *') to 'int *' for 3rd argument
[WARNING] int getgrouplist(const char *, int, int *, int *);
[WARNING] ^
[WARNING] In file included from /Users/xxx/Downloads/hadoop-3.1.1-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/test/utils/test_docker_util.cc:24:
[WARNING] /Users/xxx/Downloads/hadoop-3.1.1-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c:1234:9: error: no matching function for call to 'getgrouplist'
[WARNING] if (getgrouplist(user, pw->pw_gid, groups, &ngroups) == -1) {
[WARNING] ^~~~~~~~~~~~
[WARNING] /Library/Developer/CommandLineTools/SDKs/MacOSX10.14.sdk/usr/include/unistd.h:653:6: note: candidate function not viable: no known conversion from 'gid_t *' (aka 'unsigned int *') to 'int *' for 3rd argument
[WARNING] int getgrouplist(const char *, int, int *, int *);
[WARNING] ^
[WARNING] 2 warnings and 2 errors generated.
[WARNING] make[2]: *** [CMakeFiles/cetest.dir/main/native/container-executor/test/utils/test_docker_util.cc.o] Error 1
[WARNING] make[1]: *** [CMakeFiles/cetest.dir/all] Error 2
[WARNING] make: *** [all] Error 2
Open the file /Users/xxx/Downloads/hadoop-3.1.1-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c and find line 1227:
int rc = getgrouplist(user, pw->pw_gid, groups, &ngroups);
and replace it with:
int rc = getgrouplist(user, pw->pw_gid, (int *)groups, &ngroups);
Likewise line 1234:
if (getgrouplist(user, pw->pw_gid, groups, &ngroups) == -1) {
replace it with:
if (getgrouplist(user, pw->pw_gid, (int *)groups, &ngroups) == -1) {
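Editing the file by hand is fine; for the record, the same two-line change can also be applied with BSD sed from the source root (the pattern simply mirrors the lines above):

# add the (int *) cast to both getgrouplist() calls in docker-util.c
sed -i '' 's/getgrouplist(user, pw->pw_gid, groups, \&ngroups)/getgrouplist(user, pw->pw_gid, (int *)groups, \&ngroups)/' \
  hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c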
The cast is safe because on macOS getgrouplist() declares its third parameter as int * rather than the Linux gid_t *, and gid_t there is an unsigned int of the same width. Finally, clean and rebuild; all that's left is to sit back with a cup of tea and wait.
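The rebuild is just the same command with clean in front (a sketch):

# clean and rebuild the native distribution
mvn clean package -Pdist,native -DskipTests -Dtar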
[INFO] No site descriptor found: nothing to attach.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Main 3.1.1:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.766 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 1.987 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 1.680 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 4.408 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.817 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 2.256 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 6.854 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 2.993 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 8.759 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 3.965 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:18 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 6.936 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 6.824 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.378 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 30.847 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [01:01 min]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [ 8.220 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 9.333 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 4.554 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [ 20.519 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.295 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [ 0.317 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 16.675 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [ 41.142 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [ 6.666 s]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [ 0.325 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 12.778 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 39.962 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [ 4.483 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [ 8.636 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [ 6.309 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 26.351 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [ 2.365 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [ 6.413 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [ 4.396 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [ 4.548 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [ 0.318 s]
[INFO] Apache Hadoop YARN TimelineService HBase Common .... SUCCESS [ 10.618 s]
[INFO] Apache Hadoop YARN TimelineService HBase Client .... SUCCESS [ 12.013 s]
[INFO] Apache Hadoop YARN TimelineService HBase Servers ... SUCCESS [ 0.310 s]
[INFO] Apache Hadoop YARN TimelineService HBase Server 1.2 SUCCESS [ 6.747 s]
[INFO] Apache Hadoop YARN TimelineService HBase tests ..... SUCCESS [ 16.156 s]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [ 6.339 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [ 0.290 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [ 4.597 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [ 3.358 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [ 0.532 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 20.702 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 15.689 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [ 5.359 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [ 9.356 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [ 6.231 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [ 6.933 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 1.874 s]
[INFO] Apache Hadoop YARN Services ........................ SUCCESS [ 0.296 s]
[INFO] Apache Hadoop YARN Services Core ................... SUCCESS [ 16.257 s]
[INFO] Apache Hadoop YARN Services API .................... SUCCESS [ 2.481 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [ 0.285 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [ 0.286 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [ 12.881 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [ 3.052 s]
[INFO] Apache Hadoop MapReduce NativeTask ................. SUCCESS [ 47.014 s]
[INFO] Apache Hadoop MapReduce Uploader ................... SUCCESS [ 2.910 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 5.827 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [ 5.833 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 15.792 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 5.396 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 2.943 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [ 3.157 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 5.802 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 4.605 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 3.613 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 3.049 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 7.171 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 5.021 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [02:41 min]
[INFO] Apache Hadoop Kafka Library support ................ SUCCESS [ 11.710 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 17.688 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [ 9.613 s]
[INFO] Apache Hadoop Client Aggregator .................... SUCCESS [ 4.044 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 6.585 s]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [ 11.367 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 18.783 s]
[INFO] Apache Hadoop Image Generation Tool ................ SUCCESS [ 3.853 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 10.500 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.273 s]
[INFO] Apache Hadoop Client API ........................... SUCCESS [01:26 min]
[INFO] Apache Hadoop Client Runtime ....................... SUCCESS [01:04 min]
[INFO] Apache Hadoop Client Packaging Invariants .......... SUCCESS [ 2.025 s]
[INFO] Apache Hadoop Client Test Minicluster .............. SUCCESS [02:01 min]
[INFO] Apache Hadoop Client Packaging Invariants for Test . SUCCESS [ 0.477 s]
[INFO] Apache Hadoop Client Packaging Integration Tests ... SUCCESS [ 0.590 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 27.949 s]
[INFO] Apache Hadoop Client Modules ....................... SUCCESS [ 0.309 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [ 1.601 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [ 0.401 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 21:32 min
[INFO] Finished at: 2019-04-18T18:39:40+08:00
[INFO] ------------------------------------------------------------------------
A perfect finish.
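If everything went well, the packaged distribution should land under hadoop-dist/target (that location is what I'd expect from a -Pdist -Dtar build; adjust if your layout differs), and running checknative from the fresh build should now report the hadoop native library as available:

# the built tarball and the unpacked distribution
ls hadoop-dist/target/hadoop-3.1.1.tar.gz hadoop-dist/target/hadoop-3.1.1
# should now list the hadoop native library as true
hadoop-dist/target/hadoop-3.1.1/bin/hadoop checknative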