Hoop is an improved rewrite of the Hadoop HDFS Proxy that exposes Hadoop HDFS over an HTTP(S) interface. With Hoop you can read and write HDFS data from any language or tool that speaks HTTP, even when the cluster sits behind a firewall, with the Hoop server acting as the gateway.
Hoop consists of two parts, the Hoop Server and the Hoop Client. The Hoop Server is the REST HTTP gateway that provides full read and write access to HDFS; the Hoop Client is an implementation of the Hadoop FileSystem API built on top of that REST interface, so existing code and tools written against FileSystem can work through Hoop unchanged.
Below are a few examples of working with HDFS through Hoop using the standard curl tool:
1. Get the home directory
$ curl -i "http://hoopbar:14000?op=homedir&user.name=babu" HTTP/1.1 200 OK Content-Type: application/json Transfer-Encoding: chunked {"homeDir":"http:\/\/hoopbar:14000\/user\/babu"} $
2. Read the contents of a file
$ curl -i "http://hoopbar:14000?/user/babu/hello.txt&user.name=babu" HTTP/1.1 200 OK Content-Type: application/octet-stream Transfer-Encoding: chunked Hello World! $
3. Write a file
$ curl -i -X POST "http://hoopbar:14000/user/babu/data.txt?op=create" \
       --data-binary @mydata.txt --header "content-type: application/octet-stream"
HTTP/1.1 200 OK
Location: http://hoopbar:14000/user/babu/data.txt
Content-Type: application/json
Content-Length: 0
$
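The create operation can be scripted the same way. This sketch POSTs a local mydata.txt exactly as the curl command does; whether user.name must also be appended here depends on your setup, as in the other examples:

import urllib.request

# Create /user/babu/data.txt from the local mydata.txt, mirroring the curl example.
with open("mydata.txt", "rb") as f:
    body = f.read()

req = urllib.request.Request(
    "http://hoopbar:14000/user/babu/data.txt?op=create",  # add &user.name=... if your setup requires it
    data=body,
    headers={"Content-Type": "application/octet-stream"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.headers.get("Location"))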
4. List the contents of a directory
$ curl -i "http://hoopbar:14000?/user/babu?op=list&user.name=babu" HTTP/1.1 200 OK Content-Type: application/json Transfer-Encoding: chunked [ { "path" : "http:\/\/hoopbar:14000\/user\/babu\/data.txt" "isDir" : false, "len" : 966, "owner" : "babu", "group" : "supergroup", "permission" : "-rw-r--r--", "accessTime" : 1310671662423, "modificationTime" : 1310671662423, "blockSize" : 67108864, "replication" : 3 } ] $
More operations are documented in the Hoop HTTP REST API reference.
Hoop is released under the Apache License 2.0. The source code is available on GitHub (http://github.com/cloudera/hoop), and installation and usage guides can be found here (http://cloudera.github.com/hoop).
Source: http://www.nosqlwiki.com/7-hoophadoop-hdfs-over-http