Converting between pyspark.sql.DataFrame and pandas.DataFrame


The code is as follows:

# -*- coding: utf-8 -*-
import pandas as pd
from pyspark.sql import SparkSession

# Initialize a pandas DataFrame
df = pd.DataFrame([[1, 2, 3], [4, 5, 6]],
                  index=['row1', 'row2'], columns=['c1', 'c2', 'c3'])

# Print the data
print(df)

# Initialize a SparkSession (since Spark 2.x this is the single entry
# point for the DataFrame API; no separate SparkContext/SQLContext needed)
spark = SparkSession \
    .builder \
    .appName("testDataFrame") \
    .getOrCreate()

# Initialize a Spark DataFrame
sentenceData = spark.createDataFrame([
    (0.0, "I like Spark"),
    (1.0, "Pandas is useful"),
    (2.0, "They are coded by Python ")
], ["label", "sentence"])

# Show the data
sentenceData.select("label").show()

# pandas.DataFrame -> Spark DataFrame
spark_df = spark.createDataFrame(df)

# Show the data
spark_df.select("c1").show()

spark_df.select("*").show()


# Spark DataFrame -> pandas.DataFrame
pandas_df = sentenceData.toPandas()

# Print the data
print(pandas_df)
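One caveat worth noting (an addition, not from the original post): `createDataFrame` drops the pandas index, so the `row1`/`row2` labels above do not survive the conversion to Spark. A minimal sketch of preserving them with `reset_index`, shown on the pandas side only:

```python
import pandas as pd

df = pd.DataFrame([[1, 2, 3], [4, 5, 6]],
                  index=['row1', 'row2'], columns=['c1', 'c2', 'c3'])

# reset_index() turns the row labels into an ordinary column named
# 'index', which spark.createDataFrame() will then keep as regular data
df_with_index = df.reset_index()
print(df_with_index.columns.tolist())  # ['index', 'c1', 'c2', 'c3']
```

Passing `df_with_index` (instead of `df`) to `spark.createDataFrame` carries the row labels along as a normal string column.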

Output:

      c1  c2  c3
row1   1   2   3
row2   4   5   6
+-----+
|label|
+-----+
|  0.0|
|  1.0|
|  2.0|
+-----+

+---+
| c1|
+---+
|  1|
|  4|
+---+

+---+---+---+
| c1| c2| c3|
+---+---+---+
|  1|  2|  3|
|  4|  5|  6|
+---+---+---+

   label                   sentence
0    0.0               I like Spark
1    1.0           Pandas is useful
2    2.0  They are coded by Python 

Process finished with exit code 0
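Note that `toPandas()` collects the entire Spark DataFrame onto the driver, so it only suits data that fits in driver memory. For larger frames, Apache Arrow can speed up both conversion directions. A sketch, assuming Spark 2.3+ with the `pyarrow` package installed (the config key name is version-dependent):

```python
# Assumes a live SparkSession `spark` and a pandas DataFrame `df`.
# "spark.sql.execution.arrow.enabled" is the Spark 2.x key;
# Spark 3.x renamed it to "spark.sql.execution.arrow.pyspark.enabled".
spark.conf.set("spark.sql.execution.arrow.enabled", "true")

# Subsequent conversions use Arrow's columnar format under the hood
spark_df = spark.createDataFrame(df)   # pandas -> Spark
pandas_df = spark_df.toPandas()        # Spark -> pandas
```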
Reference: http://blog.csdn.net/zhurui_idea/article/details/72981715
