Integrating the Anaconda libraries with PySpark, and calling Hive from PySpark

    • CDH Oozie invoking PySpark to query Hive
        • 1. An Oozie-invoked PySpark (Python) script that operates on Hive
    • Integrating the Anaconda libraries into CDH and using PySpark
      • Installation steps:
        • 1. Download the Anaconda parcel packages
        • 2. Upload the three downloaded files to /var/www/html/anaconda3 (create the anaconda3 directory yourself)
        • 3. Open the CDH admin console: Hosts → Parcel → Configuration
        • 4. Find the 4.0.0 parcel, then Download, Distribute, Activate
        • 5. Configure environment variables in /etc/profile (this also sets the Spark variables)
        • 6. Configure Jupyter: vim ~/.jupyter/jupyter_notebook_config.py
        • 7. ipython profile create pyspark
        • 8. Replace /usr/bin/python with $ANACONDA_HOME/bin/python

CDH Oozie invoking PySpark to query Hive

1. An Oozie-invoked PySpark (Python) script that operates on Hive

Problem: the Hive databases/tables cannot be found (the job cannot locate the Hive metastore).
Solution: put hive-site.xml into the oozie/share/lib/lib_<date> directory (the Oozie sharelib in HDFS).
Working code:

from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext

# Build a SparkContext, then a HiveContext; the HiveContext picks up
# hive-site.xml from the classpath (the Oozie sharelib, per the fix above).
conf = SparkConf().setAppName("test11")
sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)

# Query a Hive table and print the first rows.
tables = sqlContext.sql("select * from test")
tables.show()
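
If the cluster runs Spark 2.x, the same query can go through a SparkSession instead of a HiveContext. A minimal sketch, assuming Spark 2.x is available and hive-site.xml is placed as described above:

from pyspark.sql import SparkSession

# Spark 2.x replacement for SparkContext + HiveContext.
spark = SparkSession.builder \
    .appName("test11") \
    .enableHiveSupport() \
    .getOrCreate()

spark.sql("select * from test").show()
spark.stop()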

Integrating the Anaconda libraries into CDH and using PySpark

Installation steps:

1. Download the Anaconda parcel packages

        Anaconda-4.0.0-el6.parcel
        Anaconda-4.0.0-el6.parcel.sha
        manifest.json
        Download address: http://archive.cloudera.com/cdh5/parcels/5.14.0/

2. Upload the three downloaded files to /var/www/html/anaconda3 (create the anaconda3 directory yourself)

      Problem: http://ip/anaconda3 does not load in the browser.
      Fix: start and enable Apache:
            systemctl start httpd.service
            systemctl enable httpd.service

3. Open the CDH admin console: Hosts → Parcel → Configuration

Add http://ip/anaconda3 to the Remote Parcel Repository URLs; after saving, refresh the parcel list.

4. Locate the Anaconda 4.0.0 parcel, then Download, Distribute, and Activate it.

5. Configure environment variables in /etc/profile (this also sets the Spark variables):

	export SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark
	export ANACONDA_HOME=/opt/cloudera/parcels/Anaconda
	export PYTHONPATH=$SPARK_HOME/python:$ANACONDA_HOME/bin/python
	export PATH=$SPARK_HOME/bin:$ANACONDA_HOME/bin:$PATH
	export PYSPARK_PYTHON=$ANACONDA_HOME/bin/python
	# with PYSPARK_DRIVER_PYTHON_OPTS="notebook", the driver must be
	# ipython/jupyter, not the bare python binary
	export PYSPARK_DRIVER_PYTHON=$ANACONDA_HOME/bin/ipython
	export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
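
After sourcing /etc/profile, a quick way to confirm that both the driver and the executors picked up the Anaconda interpreter is a small throwaway script, run with spark-submit (the file name check_python.py and the app name are just examples):

import sys
from pyspark import SparkConf, SparkContext

sc = SparkContext(conf=SparkConf().setAppName("python-check"))
print("driver python:   ", sys.executable)

# Run one task on an executor and report its interpreter path; both lines
# should point under /opt/cloudera/parcels/Anaconda.
executor_python = sc.parallelize([0], 1) \
    .map(lambda _: __import__("sys").executable) \
    .collect()[0]
print("executor python: ", executor_python)
sc.stop()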

6. Configure Jupyter Notebook:

vim ~/.jupyter/jupyter_notebook_config.py

        c = get_config()
        c.IPKernelApp.pylab = 'inline'
        c.NotebookApp.ip = '*'
        c.NotebookApp.open_browser = False
        #c.NotebookApp.password = u''
        c.NotebookApp.port = 8888
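
With the exports from step 5 in place, running pyspark now starts the notebook server on port 8888 instead of the plain shell. Whether sc is pre-defined in a new notebook varies by Spark version, so a defensive first cell can look like this (a sketch; the app name is an arbitrary example):

from pyspark import SparkContext

try:
    sc  # some launcher versions pre-define sc in the notebook kernel
except NameError:
    sc = SparkContext(appName="notebook-test")

print(sc.parallelize(range(10)).sum())  # expect 45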

7. Create an IPython profile for PySpark:

	ipython profile create pyspark
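
The profile by itself does nothing until it is wired to Spark. A common pattern is a startup file that the profile runs on launch; this is a sketch, and both the file name and the py4j zip version are assumptions that must match your Spark release:

# ~/.ipython/profile_pyspark/startup/00-pyspark-setup.py  (hypothetical name)
import os
import sys

# Point at the CDH Spark parcel configured in /etc/profile above.
spark_home = os.environ.get("SPARK_HOME", "/opt/cloudera/parcels/CDH/lib/spark")
sys.path.insert(0, os.path.join(spark_home, "python"))
# Adjust the py4j version to whatever ships with your Spark (assumption).
sys.path.insert(0, os.path.join(spark_home, "python/lib/py4j-0.9-src.zip"))

# pyspark's shell.py defines sc (and sqlContext) in this session.
exec(open(os.path.join(spark_home, "python/pyspark/shell.py")).read())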

8. Replace /usr/bin/python with $ANACONDA_HOME/bin/python.
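
One caution, from general CentOS/RHEL practice rather than this setup specifically: system tools such as yum expect the stock Python at /usr/bin/python, so replacing the symlink globally can break them; exporting PYSPARK_PYTHON (step 5) is often sufficient on its own.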

## Success!
