Accessing HDFS from Python (NameNode in HA mode)


Background

This post records how to access HDFS from Python when the cluster has two NameNodes in HA mode. The two main approaches are the hdfs and pyhdfs packages; both are covered below.

1. Using the hdfs package

There are many Python packages for HDFS; I used hdfs, which, according to the official documentation, supports both HDFS and WebHDFS. By default the client appears to be created in WebHDFS mode, which performs file operations through the datanodes, whereas the HDFS mode goes through the namenode. I was stuck here for a long time and also tried other packages such as snakebite; the connection only succeeded after I switched the port to the datanode's.

➜  learn-python3 git:(master) ✗ pip install hdfs
/Users/zzy/anaconda2/lib/python2.7/site-packages/cryptography/hazmat/primitives/constant_time.py:26: CryptographyDeprecationWarning: Support for your Python version is deprecated. The next version of cryptography will remove support. Please upgrade to a 2.7.x release that supports hmac.compare_digest as soon as possible.
  utils.PersistentlyDeprecated2018,
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting hdfs
  Downloading https://files.pythonhosted.org/packages/de/07/9d4af32b643650a368ec29899bcb2a06343810b6ed5c2382d670eb35cc5b/hdfs-2.5.0.tar.gz
Collecting docopt (from hdfs)
  Downloading https://files.pythonhosted.org/packages/a2/55/8f8cab2afd404cf578136ef2cc5dfb50baa1761b68c9da1fb1e4eed343c9/docopt-0.6.2.tar.gz
Requirement already satisfied: requests>=2.7.0 in /Users/zzy/anaconda2/lib/python2.7/site-packages (from hdfs) (2.18.4)
Requirement already satisfied: six>=1.9.0 in /Users/zzy/anaconda2/lib/python2.7/site-packages (from hdfs) (1.11.0)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /Users/zzy/anaconda2/lib/python2.7/site-packages (from requests>=2.7.0->hdfs) (3.0.4)
Requirement already satisfied: idna<2.7,>=2.5 in /Users/zzy/anaconda2/lib/python2.7/site-packages (from requests>=2.7.0->hdfs) (2.6)
Requirement already satisfied: urllib3<1.23,>=1.21.1 in /Users/zzy/anaconda2/lib/python2.7/site-packages (from requests>=2.7.0->hdfs) (1.22)
Requirement already satisfied: certifi>=2017.4.17 in /Users/zzy/anaconda2/lib/python2.7/site-packages (from requests>=2.7.0->hdfs) (2019.3.9)
Building wheels for collected packages: hdfs, docopt
  Building wheel for hdfs (setup.py) ... done
  Stored in directory: /Users/zzy/Library/Caches/pip/wheels/e4/43/cb/5cfa9f13f579e92b6968e1f65624a6fd0e684e03deb163ce20
  Building wheel for docopt (setup.py) ... done
  Stored in directory: /Users/zzy/Library/Caches/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e
Successfully built hdfs docopt
Installing collected packages: docopt, hdfs
Successfully installed docopt-0.6.2 hdfs-2.5.0

Run the following code:

from hdfs import Client

client = Client("http://namenode1:50070", root="/", proxy='iknow', timeout=100, session=False)
dir(client)

This produced the following warning (from running under Python 2.7):

CryptographyDeprecationWarning: Support for your Python version is deprecated. The next version of cryptography will remove support. Please upgrade to a 2.7.x release that supports hmac.compare_digest as soon as possible.
  utils.PersistentlyDeprecated2018,
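Although the connection attempt above targets a single namenode, the hdfs package can also cover the HA setup in this post's title: its Client accepts several NameNode URLs joined into one string with semicolons, and fails over between them. A minimal sketch, assuming the Hadoop 2 WebHDFS default port; the hostnames and the "iknow" user are this cluster's placeholders:

```python
# Build the semicolon-separated URL string that hdfs.Client accepts
# for NameNode High Availability (hostnames here are placeholders).
def ha_webhdfs_url(namenodes, port=50070):
    """Join NameNode hostnames into one semicolon-separated WebHDFS URL."""
    return ";".join("http://%s:%d" % (host, port) for host in namenodes)

url = ha_webhdfs_url(["namenode1", "namenode2"])
print(url)  # http://namenode1:50070;http://namenode2:50070

# With a reachable cluster, one would then create the client:
# from hdfs import InsecureClient
# client = InsecureClient(url, user="iknow")
# print(client.list("/"))
```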

2. Using pyhdfs

Per the pyhdfs documentation, the hosts argument is passed as follows: a list of NameNode HTTP host:port strings, or a single comma-separated string.

•   hosts (list or str) – List of NameNode HTTP host:port strings, either as list or a comma separated string. Port defaults to 50070 if left unspecified. Note that in Hadoop 3, the default NameNode HTTP port changed to 9870; the old default of 50070 is left as-is for backwards compatibility.
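The two accepted forms and the port defaulting described above can be sketched as a small normalization step (a stdlib-only illustration, not pyhdfs's actual internal code; hostnames are placeholders):

```python
# Normalize the two forms pyhdfs accepts for hosts: a list or a
# comma-separated string of "host:port" entries, with the Hadoop 2
# default HTTP port 50070 applied when a port is missing.
def normalize_hosts(hosts, default_port=50070):
    """Return a list of "host:port" strings from either accepted form."""
    if isinstance(hosts, str):
        hosts = hosts.split(",")
    return [h if ":" in h else "%s:%d" % (h, default_port) for h in hosts]

print(normalize_hosts("namenode1,namenode2:9870"))
# ['namenode1:50070', 'namenode2:9870']
```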

The final Python script:

iknow@6-34:~/zzy $ cat  HdfsUtil.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pyhdfs import HdfsClient

## ns1
def readFromHdfs():
    #client = HdfsClient(hosts="http://namenode1:50070",user_name="iknow")
    client = HdfsClient(hosts=["namenode1:50070","namenode2:50070"],user_name="iknow")
    #client = HdfsClient(hosts="http://ns1:8020",user_name="iknow")
    print("client:",client.list_status('/'))

    response = client.open("/flink/test_data/input/iris.csv")
    for line in response.read().splitlines():
        print("line:", line)

if __name__ == '__main__':
    readFromHdfs()

Result:
You can see that the iris.csv file was generated under the current path.


Copying a file from HDFS to the local filesystem:

client.copy_to_local('/home/data/2018-06-28.out', '/local_path/2018-06-28.out')
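Since pyhdfs is a pure WebHDFS client, such a download boils down to an HTTP GET with op=OPEN against the NameNode. As a stdlib-only sketch of the request being made (hostname, user, and paths are this post's placeholders):

```python
# Build the WebHDFS URL that streams an HDFS file (GET, op=OPEN).
# Hostname, user, and path below are placeholders from this post.
from urllib.parse import urlencode

def webhdfs_open_url(host, path, user, port=50070):
    """Return the WebHDFS OPEN URL for an absolute HDFS path."""
    query = urlencode({"op": "OPEN", "user.name": user})
    return "http://%s:%d/webhdfs/v1%s?%s" % (host, port, path, query)

url = webhdfs_open_url("namenode1", "/home/data/2018-06-28.out", "iknow")
print(url)

# Downloading it requires a reachable cluster:
# import urllib.request, shutil
# with urllib.request.urlopen(url) as resp, open("2018-06-28.out", "wb") as f:
#     shutil.copyfileobj(resp, f)
```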

References:

  • Accessing HDFS with Python

  • pyhdfs documentation

  • Reading Hadoop files with Python
