I installed Hadoop 0.20.2 under /opt/hadoop, so HADOOP_HOME=/opt/hadoop.
My system has openjdk-1.6.0 installed, whose home directory is JAVA_HOME=/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64.
$cd /opt/hadoop
$ant compile-c++-libhdfs -Dislibhdfs=true
HADOOP_HOME=/opt/hadoop
JAVA_HOME=/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64
# This entry can also be set via the Makefile
C_INCLUDE_PATH=$HADOOP_HOME/src/c++/libhdfs:$JAVA_HOME/include:$JAVA_HOME/include/linux
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
$export HADOOP_HOME=/opt/hadoop
Set the other environment variables the same way, following the command above.
$gedit /etc/profile
In this file, find the line
export PATH MAIL USER ...
and add the following before it:
JAVA_HOME=<your Java home directory>
HADOOP_HOME=<your Hadoop home directory>
# This entry can also be set via the Makefile
C_INCLUDE_PATH=<directories containing the required headers, colon-separated>
# Append $HADOOP_HOME/*.jar to CLASSPATH
for i in $HADOOP_HOME/*.jar ; do
    CLASSPATH=$CLASSPATH:$i
done
# Append $HADOOP_HOME/lib/*.jar to CLASSPATH
for i in $HADOOP_HOME/lib/*.jar ; do
    CLASSPATH=$CLASSPATH:$i
done
After that line, add:
export PATH MAIL USER ... JAVA_HOME HADOOP_HOME C_INCLUDE_PATH CLASSPATH
Note: the ... stands for the omitted variables; there are many of them and they differ from system to system, so there is no need to list them all here, and they are not the point.
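The jar-collecting loop above can be tried on its own. The sketch below uses a throwaway directory with two dummy jar files as a stand-in for $HADOOP_HOME (purely for illustration; the names are not real Hadoop jars) to show how the loop accumulates a colon-separated list:

```shell
# Stand-in for $HADOOP_HOME: a scratch directory with two dummy jars.
dir=$(mktemp -d)
touch "$dir/hadoop-core.jar" "$dir/hadoop-tools.jar"

CLASSPATH=.
# Same loop shape as in /etc/profile: append every jar, colon-separated.
for i in "$dir"/*.jar ; do
    CLASSPATH=$CLASSPATH:$i
done
# Prints something like .:/tmp/xxxx/hadoop-core.jar:/tmp/xxxx/hadoop-tools.jar
echo "$CLASSPATH"
rm -rf "$dir"
```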
$gedit /etc/ld.so.conf.d/hdfs.conf
Put the following directories in this file:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/amd64
/opt/hadoop/build/c++/Linux-amd64-64/lib
Save, exit, then run this command to make the configuration take effect:
$ldconfig
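If you cannot (or prefer not to) edit /etc/ld.so.conf.d, a per-session alternative is to put the same two directories on LD_LIBRARY_PATH, which the dynamic linker also searches. This sketch assumes the paths used earlier in this article:

```shell
# Per-session alternative to the ld.so.conf.d entry: the dynamic linker
# also searches LD_LIBRARY_PATH. Paths are the ones from this article.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/amd64:/opt/hadoop/build/c++/Linux-amd64-64/lib
echo "$LD_LIBRARY_PATH"
```

This only affects the current shell session, whereas the ld.so.conf.d entry is system-wide and persistent.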
Writing the Makefile
# Hadoop install directory
HADOOP_INSTALL=/opt/hadoop
# Platform type: Linux-i386-32 for 32-bit, Linux-amd64-64 for 64-bit
PLATFORM=Linux-i386-32
# Java install directory
JAVA_HOME=/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64
# Directories to search for header files
CPPFLAGS= -I$(HADOOP_INSTALL)/src/c++/libhdfs -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/linux
# Directories to search for shared libraries
LIB = -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib/
libjvm=/usr/lib/jvm/java-6-openjdk/jre/lib/i386/client/libjvm.so
LDFLAGS += -lhdfs

testHdfs: testHdfs.c
	gcc testHdfs.c $(CPPFLAGS) $(LIB) $(LDFLAGS) $(libjvm) -o testHdfs

clean:
	rm testHdfs
Writing Makefile.am
SUBDIRS=
HADOOP_INSTALL=/opt/hadoop
PLATFORM=Linux-amd64-64
JAVA_HOME=/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64
INCLUDES=-I$(HADOOP_INSTALL)/src/c++/libhdfs -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/linux
LDADD=-L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhdfs
export INCLUDES LDADD
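With the variables above, the per-directory Makefile.am declaring the test program could look like the sketch below. The target name testHdfs is assumed from the Makefile section, not taken from an actual project; note that automake uses a global LDADD automatically when linking the programs listed in bin_PROGRAMS, and older automake versions likewise pick up INCLUDES when compiling.

```makefile
# Sketch only: assumes the exported INCLUDES and LDADD from the top-level file.
bin_PROGRAMS = testHdfs
testHdfs_SOURCES = testHdfs.c
```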
Test code
/* testLibhdfs.c */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include "hdfs.h"

int main(int argc, char **argv) {
    hdfsFS fs = hdfsConnect("default", 0);
    const char* writePath = "/tmp/testfile.txt";
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0);
    if (!writeFile) {
        fprintf(stderr, "Failed to open %s for writing!\n", writePath);
        exit(-1);
    }
    char* buffer = "Hello, World!";
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer)+1);
    if (hdfsFlush(fs, writeFile)) {
        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
        exit(-1);
    }
    hdfsCloseFile(fs, writeFile);
    hdfsDisconnect(fs);
    return 0;
}
Compile and run:
$gcc -o testLibhdfs testLibhdfs.c -lhdfs
$./testLibhdfs