Notes on building and installing FUSE (fuse-dfs) for Hadoop

Yesterday I came across the FUSE-based approach, and there are plenty of write-ups online to refer to, so I felt somewhat reassured.

In the Hadoop that ships with CDH 5.13.0, the README under /opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/doc also has a few notes on it.

To verify it I decided to run through an install myself, and after trying I was thoroughly defeated. There are really only three or four approaches floating around online, and every one of them fell over for me right at the start.

Some people install the package directly with yum, others compile it themselves. Every method looks workable on paper, but none of them succeeded for me.

The specific errors are pasted below.

1. Installing directly with yum (none of the posts say which yum repo the package actually comes from, sigh)

yum -y  install hadoop-hdfs-fuse

The result:
[root@master opt]# yum -y  install hadoop-hdfs-fuse
Loaded plugins: fastestmirror
base                                                                                   | 3.6 kB  00:00:00     
epel/x86_64/metalink                                                                   | 6.6 kB  00:00:00     
epel                                                                                   | 4.7 kB  00:00:00     
extras                                                                                 | 3.4 kB  00:00:00     
updates                                                                                | 3.4 kB  00:00:00     
Loading mirror speeds from cached hostfile
* base: mirrors.aliyun.com
* epel: ftp.cuhk.edu.hk
* extras: mirrors.aliyun.com
* updates: mirrors.163.com
No package hadoop-hdfs-fuse available.
Error: Nothing to do

So this method is out; don't waste time on it (I'll go hunting for the right repo later).
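For reference, the package almost certainly lives in Cloudera's own CDH repo rather than in base/epel. Something along these lines should make it resolvable; the repo URL below is an assumption on my part (Cloudera has shuffled its CDH download locations over time), so treat it as a sketch rather than a verified recipe:

# assumed location of the CDH 5 repo file for CentOS/RHEL 7 -- not verified
wget -O /etc/yum.repos.d/cloudera-cdh5.repo \
     https://archive.cloudera.com/cdh5/redhat/7/x86_64/cdh/cloudera-cdh5.repo
yum clean all
yum -y install hadoop-hdfs-fuse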

https://www.cnblogs.com/zwgblog/p/6020587.html

2. The second way: compile and install it yourself

There are plenty of write-ups for this approach as well, but they are fairly old and target older versions. I'm on CDH 5.13.0, so they can only serve as references; the steps can't be copied verbatim.

Since the JDK I use is 1.8, running ant in the target directory immediately complains about the JVM version:

clover.info:
     [echo]
     [echo]      Clover not found. Code coverage reports disabled.
     [echo]   
clover:
jvm-check:
BUILD FAILED
/opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/build.xml:387: Incorrect JVM, current = 1.8.0_144, required 1.7.


Following the hint, I changed the required JVM version in build.xml from 1.7 to 1.8 (the javaVersion value that the jvm-check target compares against). The relevant section ends up looking roughly like this; the exact markup differs a bit between releases, so take it as the shape rather than a verbatim copy:

  <property name="javaVersion" value="1.8"/>   <!-- was 1.7 -->

  <target name="jvm-check" depends="clover">
    <fail>
      <condition>
        <not>
          <equals arg1="${ant.java.version}" arg2="${javaVersion}"/>
        </not>
      </condition>
      Incorrect JVM, current = ${java.version}, required ${javaVersion}.
    </fail>
  </target>

Note: there is a directory-naming pitfall here. If the Hadoop directory has been renamed or moved, the build will complain that the configuration doesn't match. To save yourself the trouble,

create a symlink:

ln -s <your full hadoop directory> <parent dir>/hadoop-2.6.0-cdh5.13.0

In my case: ln -s /opt/hadoop /opt/hadoop-2.6.0-cdh5.13.0

With that in place I rebuilt. Unfortunately, there was still an error:

    [javac] warning: [options] bootstrap class path not set in conjunction with -source 1.7
    [javac] /opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/src/mapred/org/apache/hadoop/mapreduce/lib/partition/InputSampler.java:316: error: incompatible types: Object[] cannot be converted to K[]
    [javac]     K[] samples = sampler.getSample(inf, job);
    [javac]                                    ^
    [javac]   where K,V are type-variables:
    [javac]     K extends Object declared in method writePartitionFile(Job,Sampler)
    [javac]     V extends Object declared in method writePartitionFile(Job,Sampler)
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 1 error
    [javac] 1 warning
BUILD FAILED
/opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/build.xml:468: Compile failed; see the compiler error output for details.

So I retested on a virtual machine and found that the error above does seem to be a version issue: I had been on JDK 1.8, and after switching to 1.7 this compile error went away.

Then Ant started complaining. Going by the error message it was again a version problem, and downgrading Ant by one release fixed it.

I had been using apache-ant-1.10.2-bin.tar.gz and switched to apache-ant-1.9.10-bin.tar.gz.
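A quick way to make sure the build really picks up the older toolchain is to point JAVA_HOME and ANT_HOME at it before running ant; the install paths below are just placeholders for wherever the 1.7 JDK and 1.9.10 Ant were unpacked:

export JAVA_HOME=/opt/jdk1.7.0_80        # placeholder path
export ANT_HOME=/opt/apache-ant-1.9.10   # placeholder path
export PATH=$JAVA_HOME/bin:$ANT_HOME/bin:$PATH
java -version && ant -version            # should now report 1.7.x and 1.9.10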

https://blog.csdn.net/coolfeiweb/article/details/22746675

The earlier libhdfs build is the same as the first half of that post. (Along the way I also hit a build.xml/ivy configuration error; adding the bit below, as described at the link above, sorted it out.)

The reactor.repo URL needs to be defined. In /usr/hadoop/hadoop-2.0.0-cdh4.3.0/src/hadoop-mapreduce1-project/ivy/ivysettings.xml (adjust the prefix to your own tree), add:

  <property name="reactor.repo"
            value="http://repo1.maven.org/maven2/"
            override="false"/>
But no .so was produced; all I got was two static libraries. Awkward.

Cutting corners, I used wget to grab that file from GitHub so I could skip straight to building the second half (never mind the sizeable version gap):

wget https://github.com/cloudera/Impala/blob/master/thirdparty/hadoop-2.0.0-cdh4.5.0/lib/native/libhdfs.so

The result was a file-format error (in hindsight no surprise, since a GitHub /blob/ URL downloads an HTML page rather than the binary), so I had to go and compile the library myself:

     [exec] make[1]: Leaving directory `/opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/src/contrib/fuse-dfs/src'
     [exec] /opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/build/c++/Linux-amd64-64/lib/libhdfs.so: file not recognized: File format not recognized
     [exec] collect2: error: ld returned 1 exit status
     [exec] make[1]: *** [fuse_dfs] Error 1
     [exec] make: *** [all-recursive] Error 1


  https://blog.csdn.net/aquester/article/details/49814853

Referring to that post, I installed cmake and rebuilt the library myself; after replacing the bad file with the result, that part was fine.
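Roughly what that boils down to, as a sketch rather than the exact commands from that post (the native profile needs cmake, gcc, zlib-devel, openssl-devel and protobuf 2.5 available, and the output path can differ between versions):

cd /opt/hadoop-2.6.0-cdh5.13.0/src
mvn package -Pdist,native -DskipTests -Dmaven.javadoc.skip=true

# copy the freshly built shared library to where the fuse-dfs link step looks for it
cp hadoop-hdfs-project/hadoop-hdfs/target/native/target/usr/local/lib/libhdfs.so* \
   hadoop-mapreduce1-project/build/c++/Linux-amd64-64/lib/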

Then I went back to build the second half (fuse-dfs itself) and it failed again, with yet another messy error:

     [exec] make[1]: Entering directory `/opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/src/contrib/fuse-dfs/src'
     [exec] gcc -Wall -g -Wall -O3 -L/opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/build/c++/Linux-amd64-64/lib -lhdfs -L/lib -lfuse -L/opt/jdk1.8.0_144/jre/lib/amd64/server -ljvm  -o fuse_dfs fuse_dfs.o fuse_options.o fuse_trash.o fuse_stat_struct.o fuse_users.o fuse_init.o fuse_connect.o fuse_impls_access.o fuse_impls_chmod.o fuse_impls_chown.o fuse_impls_create.o fuse_impls_flush.o fuse_impls_getattr.o fuse_impls_mkdir.o fuse_impls_mknod.o fuse_impls_open.o fuse_impls_read.o fuse_impls_release.o fuse_impls_readdir.o fuse_impls_rename.o fuse_impls_rmdir.o fuse_impls_statfs.o fuse_impls_symlink.o fuse_impls_truncate.o fuse_impls_utimens.o fuse_impls_unlink.o fuse_impls_write.o  
     [exec] make[1]: Leaving directory `/opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/src/contrib/fuse-dfs/src'
     [exec] /usr/bin/ld: fuse_stat_struct.o: undefined reference to symbol 'ceil@@GLIBC_2.2.5'
     [exec] /usr/lib64/libm.so.6: error adding symbols: DSO missing from command line
     [exec] collect2: error: ld returned 1 exit status
     [exec] make[1]: *** [fuse_dfs] Error 1
     [exec] make: *** [all-recursive] Error 1


http://tieba.baidu.com/p/1602881253?red_tag=m0193289775

My problem is similar to the one described there.

I went and edited the Makefile, which conveniently has a LIBS variable, but it was no use: the Makefile gets regenerated on every build, so the change is wiped out each time. Setting the variable in the environment instead:

export LIBS='-lm'

Then I rebuilt, and it magically worked.

I'll run through it again on the server tomorrow.


Alright. I redid the steps above on the server, and after the export I expected it to succeed. No such luck:

compile:
     [echo] contrib: fuse-dfs
     [exec] autoreconf: Entering directory `.'
     [exec] autoreconf: configure.ac: not using Gettext
     [exec] autoreconf: running: aclocal --force
     [exec] autoreconf: configure.ac: tracing
     [exec] autoreconf: configure.ac: not using Libtool
     [exec] autoreconf: running: /usr/bin/autoconf --force
     [exec] autoreconf: configure.ac: not using Autoheader
     [exec] autoreconf: running: automake --add-missing --copy --force-missing
     [exec] autoreconf: Leaving directory `.'
     [exec] checking build system type... x86_64-unknown-linux-gnu
     [exec] checking host system type... x86_64-unknown-linux-gnu
     [exec] configure: error: in `/opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/src/contrib/fuse-dfs':
     [exec] configure: error: C compiler cannot create executables
     [exec] See `config.log' for more detailschecking target system type... x86_64-unknown-linux-gnu
     [exec]
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking whether build environment is sane... yes
     [exec] checking for a thread-safe mkdir -p... /usr/bin/mkdir -p
     [exec] checking for gawk... gawk
     [exec] checking whether make sets $(MAKE)... yes
     [exec] checking whether make supports nested variables... yes
     [exec] checking for style of include used by make... GNU
     [exec]
     [exec] checking for gcc... gcc
     [exec] checking whether the C compiler works... no


Nothing for it but to keep digging.

For this problem, see:

https://blog.csdn.net/sxhong/article/details/740572

I simply switched to a fresh terminal and rebuilt, and that was enough. The export command is best typed by hand rather than pasted; my pasted version had picked up full-width quotes around -lm, which is most likely what broke configure's test compile ("C compiler cannot create executables").
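For the record, the sequence that worked in the fresh shell looked roughly like this. The ant invocation is the one commonly quoted for the MR1 fuse-dfs contrib build; double-check it against the README in your own tree:

export LIBS='-lm'                       # plain ASCII quotes, typed by hand
cd /opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1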



Sigh...


The mount won't come up; it keeps reporting: cannot access dfs: Transport endpoint is not connected

I'll have to think of something for this one and try again.
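A note to self for that attempt: "Transport endpoint is not connected" usually means the fuse_dfs process behind the mount has died and left a stale mount point. The plan is to unmount, then re-mount in the foreground with debug output so the real error is visible. The mount point, NameNode address, and wrapper-script path below are placeholders, not verified against my setup:

# clean up the stale mount point first
umount -l /mnt/hdfs 2>/dev/null || fusermount -u /mnt/hdfs

# re-mount in debug mode so the underlying error gets printed
cd /opt/hadoop-2.6.0-cdh5.13.0/src/hadoop-mapreduce1-project/src/contrib/fuse-dfs/src
./fuse_dfs_wrapper.sh dfs://master:8020 /mnt/hdfs -d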








