Connecting to a remote Linux Spark cluster: setting up a PyCharm development environment on Windows

For those who are not used to working directly on Linux and would rather edit code on Windows, then submit jobs to the Spark cluster on a remote Linux server, here is a workable setup.

1. Set environment variables

vim /etc/profile
Add:
export PYTHONPATH=$SPARK_HOME/python/:$SPARK_HOME/python/lib/py4j-0.10.4-src.zip

Run source /etc/profile for the change to take effect.
(Note: py4j-0.10.4-src.zip must match the file that actually ships with your Spark installation. If you are not the root user, set the variable for your own account instead, e.g. vim ~/.bashrc.)
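The exact py4j archive name varies with the Spark version. As a quick sanity check, the small Python sketch below lists the py4j zip bundled with your Spark installation so the PYTHONPATH entry above matches the real file name (the fallback path is only an example):

import glob
import os

# List the py4j source zip shipped with Spark; use its exact name in PYTHONPATH.
# The fallback path is just an example, adjust it to your installation.
spark_home = os.environ.get("SPARK_HOME", "/opt/spark-2.2.0-bin-hadoop2.7")
print(glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip")))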

2. Install the py4j package for Python on the Linux server

Download: https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/linux-64/
Install it into the Python on the server (for example, pip install py4j also works).
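To confirm the package is visible to the interpreter PyCharm will use, here is a minimal check to run with the server's Python:

# Verify that py4j can be imported and see where it was installed.
import py4j
print("py4j found at:", py4j.__file__)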

3. Configure Deployment

Tools > Deployment > Configuration > Connection
Fill in the fields as required.
Tools > Deployment > Configuration > Mappings
Local path, e.g. E:\pc_workplace\spark
Deployment path on the Linux server: /root/yinxiong/spark
Tools > Deployment > Automatic Upload (check it)

4. Configure the Python interpreter

Settings > Project > Project Interpreter
Click the gear icon at the top right and choose Add Remote.
Select SSH, then fill in the details for your server.

5. Run configuration

Script: E:\pc_workplace\spark
Path mappings: E:\pc_workplace\spark=/root/yinxiong/spark

6. Test:

import os
import sys
os.environ['SPARK_HOME'] = "/opt/spark-2.2.0-bin-hadoop2.7"
sys.path.append("/opt/spark-2.2.0-bin-hadoop2.7/python")

try:
    from pyspark import SparkContext
    from pyspark import SparkConf

    print ("Successfully imported Spark Modules")

except ImportError as e:
    print ("Can not import Spark Modules", e)
    sys.exit(1)

conf = SparkConf().setAppName('myFirstAPP').setMaster('local')  # connect to Spark
sc = SparkContext(conf=conf)  # create the SparkContext object

rdd = sc.textFile('/spark/wc.txt')
rdd_result = rdd.flatMap(lambda x: x.split()).map(lambda x: (x, 1)).reduceByKey(lambda x, y: x + y)
print(rdd_result.collect())
# rdd_result.cache()
Execution result:

ssh://[email protected]:22/opt/anaconda3/bin/python3 -u /root/yinxiong/spark/demo1.py
Successfully imported Spark Modules
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/01/03 00:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/03 00:13:06 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
['hello inx', 'hello inx', 'hello crystal', 'hello crystal', 'hello crystal', 'hello spark']
This shows the configuration works.
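The test above uses a local master on the server. To submit to the remote Spark cluster itself, point the master at the cluster instead; a minimal sketch assuming a standalone cluster, where spark://master:7077 is a placeholder for your real master URL (use "yarn" if the cluster runs on YARN):

from pyspark import SparkConf, SparkContext

# Placeholder master URL for a standalone cluster; replace with your own.
conf = SparkConf().setAppName('myFirstAPP').setMaster('spark://master:7077')
sc = SparkContext(conf=conf)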
