fuse_trash.c:119: error: too few arguments to function 'hdfsDelete'

While compiling fuse_dfs for Hadoop 1.1.2 with the command ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1, the build failed with the following error:

compile:
     [echo] contrib: fairscheduler
    [javac] /root/hadoop-1.1.2/hadoop-1.1.2/src/contrib/build-contrib.xml:188: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

check-libhdfs-fuse:

check-libhdfs-exists:

compile:
     [echo] contrib: fuse-dfs
     [exec] checking build system type... x86_64-unknown-linux-gnu
     [exec] checking host system type... x86_64-unknown-linux-gnu
     [exec] checking target system type... x86_64-unknown-linux-gnu
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking whether build environment is sane... yes
     [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
     [exec] checking for gawk... gawk
     [exec] checking whether make sets $(MAKE)... yes
     [exec] checking for style of include used by make... GNU
     [exec] 
     [exec] checking for gcc... gcc
     [exec] checking for C compiler default output file name... a.out
     [exec] checking whether the C compiler works... yes
     [exec] checking whether we are cross compiling... no
     [exec] checking for suffix of executables... 
     [exec] checking for suffix of object files... o
     [exec] checking whether we are using the GNU C compiler... yes
     [exec] checking whether gcc accepts -g... yes
     [exec] checking for gcc option to accept ISO C89... none needed
     [exec] checking dependency style of gcc... gcc3
     [exec] checking for g++... g++
     [exec] checking whether we are using the GNU C++ compiler... yes
     [exec] checking whether g++ accepts -g... yes
     [exec] checking dependency style of g++... gcc3
     [exec] checking for ranlib... ranlib
     [exec] checking for bash... /bin/sh
     [exec] checking for perl... /usr/bin/perl
     [exec] checking for python... /usr/bin/python
     [exec] checking for ar... /usr/bin/ar
     [exec] checking for ant... /root/apache-ant-1.9.2/bin/ant
     [exec] checking how to run the C preprocessor... gcc -E
     [exec] checking for grep that handles long lines and -e... /bin/grep
     [exec] checking for egrep... /bin/grep -E
     [exec] checking for uid_t in sys/types.h... yes
     [exec] checking for ANSI C header files... yes
     [exec] checking for sys/types.h... yes
     [exec] checking for sys/stat.h... yes
     [exec] checking for stdlib.h... yes
     [exec] checking for string.h... yes
     [exec] checking for memory.h... yes
     [exec] checking for strings.h... yes
     [exec] checking for inttypes.h... yes
     [exec] checking for stdint.h... yes
     [exec] checking for unistd.h... yes
     [exec] checking type of array argument to getgroups... gid_t
     [exec] checking for size_t... yes
     [exec] checking for getgroups... yes
     [exec] checking for working getgroups... yes
     [exec] checking type of array argument to getgroups... (cached) gid_t
     [exec] checking Checking EXTERNAL_PATH set to... /root/hadoop-1.1.2/hadoop-1.1.2/src/contrib/fuse-dfs
     [exec] checking whether to enable optimized build... yes
     [exec] checking whether to enable static mode... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating src/Makefile
     [exec] config.status: executing depfiles commands
     [exec] Making all in .
     [exec] fuse_trash.c: In function 'hdfsDeleteWithTrash':
     [exec] make[1]: Entering directory `/root/hadoop-1.1.2/hadoop-1.1.2/src/contrib/fuse-dfs'
     [exec] fuse_trash.c:119: error: too few arguments to function 'hdfsDelete'
     [exec] 
     [exec] make[1]: *** [fuse_trash.o] Error 1
     [exec] make[1]: Nothing to be done for `all-am'.
     [exec] 
     [exec] make: *** [all-recursive] Error 1
     [exec] make[1]: Leaving directory `/root/hadoop-1.1.2/hadoop-1.1.2/src/contrib/fuse-dfs'
     [exec] Making all in src
     [exec] make[1]: Entering directory `/root/hadoop-1.1.2/hadoop-1.1.2/src/contrib/fuse-dfs/src'
     [exec] gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DGETGROUPS_T=gid_t -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t -I.  -DPERMS=1 -D_FILE_OFFSET_BITS=64 -I/home/yjx/JDK/jdk1.6.0_24/include -I/root/hadoop-1.1.2/hadoop-1.1.2/src/c++/libhdfs/ -I/home/yjx/JDK/jdk1.6.0_24/include/linux/ -D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include   -Wall -O3 -MT fuse_trash.o -MD -MP -MF .deps/fuse_trash.Tpo -c -o fuse_trash.o fuse_trash.c
     [exec] make[1]: Leaving directory `/root/hadoop-1.1.2/hadoop-1.1.2/src/contrib/fuse-dfs/src'

BUILD FAILED
/root/hadoop-1.1.2/hadoop-1.1.2/build.xml:706: The following error occurred while executing this line:
/root/hadoop-1.1.2/hadoop-1.1.2/src/contrib/build.xml:30: The following error occurred while executing this line:
/root/hadoop-1.1.2/hadoop-1.1.2/src/contrib/fuse-dfs/build.xml:64: exec returned: 2

Total time: 49 seconds

The message "fuse_trash.c:119: error: too few arguments to function 'hdfsDelete'" points at the following code in fuse_trash.c:

if (hdfsDelete(userFS, path)) {
  syslog(LOG_ERR, "ERROR: hdfs trying to delete the file %s", path);
  return -EIO;
}
Compare this call with the declaration of hdfsDelete in hdfs.h:

int hdfsDelete(hdfsFS fs, const char* path, int recursive);

Fix:

Modify line 119 of fuse_trash.c to pass the third (recursive) argument — the old two-argument hdfsDelete always deleted recursively, so passing 1 preserves that behaviour:

if (hdfsDelete(userFS, path, 1)) {
  syslog(LOG_ERR, "ERROR: hdfs trying to delete the file %s", path);
  return -EIO;
}

With that change, the build succeeds.
