PySpark data type conversion: Apache Spark - converting a PySpark string to date

Apache Spark - converting a PySpark string to date

I have a PySpark dataframe whose string column holds dates in the format MM-dd-yyyy, and I am attempting to convert it into a date column.

I tried:

df.select(to_date(df.STRING_COLUMN).alias('new_date')).show()

and I get a column of nulls. Can anyone help?

6 Answers

69 votes

It is possible (preferable, even?) to do this without a udf:

from pyspark.sql.functions import unix_timestamp, from_unixtime

df = spark.createDataFrame(
    [("11/25/1991",), ("11/24/1991",), ("11/30/1991",)],
    ['date_str']
)

df2 = df.select(
    'date_str',
    from_unixtime(unix_timestamp('date_str', 'MM/dd/yyyy')).alias('date')
)

print(df2)
#DataFrame[date_str: string, date: timestamp]

df2.show(truncate=False)
#+----------+-------------------+
#|date_str  |date               |
#+----------+-------------------+
#|11/25/1991|1991-11-25 00:00:00|
#|11/24/1991|1991-11-24 00:00:00|
#|11/30/1991|1991-11-30 00:00:00|
#+----------+-------------------+
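A side note, not part of the original answer: from_unixtime returns a timestamp, so if a true DateType column is needed, the result can additionally be cast, along these lines:

df2 = df.select(
    'date_str',
    # cast('date') drops the time component and yields a DateType column
    from_unixtime(unix_timestamp('date_str', 'MM/dd/yyyy')).cast('date').alias('date')
)
#DataFrame[date_str: string, date: date]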

Update (1/10/2018):

For Spark 2.2+, the best way to do this is to use the to_date or to_timestamp functions, which both support the format argument. From the docs:

>>> df = spark.createDataFrame([('1997-02-28 10:30:00',)], ['t'])
>>> df.select(to_timestamp(df.t, 'yyyy-MM-dd HH:mm:ss').alias('dt')).collect()
[Row(dt=datetime.datetime(1997, 2, 28, 10, 30))]
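Applied to the question's MM-dd-yyyy strings, a minimal sketch (assuming Spark 2.2+ and the question's STRING_COLUMN name) would be:

from pyspark.sql.functions import to_date

# With an explicit format, to_date parses MM-dd-yyyy strings into a
# DateType column instead of returning nulls.
df = df.withColumn('new_date', to_date(df.STRING_COLUMN, 'MM-dd-yyyy'))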

santon answered 2020-01-02T13:17:32Z

37 votes

from datetime import datetime
from pyspark.sql.functions import col, udf
from pyspark.sql.types import DateType

# Creation of a dummy dataframe:
df1 = sqlContext.createDataFrame([("11/25/1991", "11/24/1991", "11/30/1991"),
                                  ("11/25/1391", "11/24/1992", "11/30/1992")],
                                 schema=['first', 'second', 'third'])

# Define a user-defined function that converts the string cell into a date:
func = udf(lambda x: datetime.strptime(x, '%m/%d/%Y'), DateType())

df = df1.withColumn('test', func(col('first')))
df.show()
df.printSchema()

This is the output:

+----------+----------+----------+----------+
|     first|    second|     third|      test|
+----------+----------+----------+----------+
|11/25/1991|11/24/1991|11/30/1991|1991-11-25|
|11/25/1391|11/24/1992|11/30/1992|1391-11-17|
+----------+----------+----------+----------+

root
 |-- first: string (nullable = true)
 |-- second: string (nullable = true)
 |-- third: string (nullable = true)
 |-- test: date (nullable = true)
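Two side notes, not part of the original answer: the second row prints as 1391-11-17 rather than 1391-11-25, most likely because Spark 2.x stores dates on a hybrid Julian/Gregorian calendar while Python's datetime is proleptic Gregorian, which shifts pre-1582 dates. Also, strptime raises a TypeError on null cells, so a defensive sketch of the same UDF could be:

func = udf(lambda x: datetime.strptime(x, '%m/%d/%Y') if x else None, DateType())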

Hugo Reyes answered 2020-01-02T13:17:54Z

22 votes

The strptime() approach did not work for me. I have another, cleaner solution using cast:

from pyspark.sql.types import DateType

spark_df1 = spark_df.withColumn("record_date", spark_df['order_submitted_date'].cast(DateType()))

# Below is the result:
spark_df1.select('order_submitted_date', 'record_date').show(10, False)

+---------------------+-----------+
|order_submitted_date |record_date|
+---------------------+-----------+
|2015-08-19 12:54:16.0|2015-08-19 |
|2016-04-14 13:55:50.0|2016-04-14 |
|2013-10-11 18:23:36.0|2013-10-11 |
|2015-08-19 20:18:55.0|2015-08-19 |
|2015-08-20 12:07:40.0|2015-08-20 |
|2013-10-11 21:24:12.0|2013-10-11 |
|2013-10-11 23:29:28.0|2013-10-11 |
|2015-08-20 16:59:35.0|2015-08-20 |
|2015-08-20 17:32:03.0|2015-08-20 |
|2016-04-13 16:56:21.0|2016-04-13 |
+---------------------+-----------+
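Equivalently (a sketch, not from the original answer), the cast can be written with the SQL type-name string, avoiding the DateType import:

spark_df1 = spark_df.withColumn("record_date", spark_df['order_submitted_date'].cast('date'))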

Frank answered 2020-01-02T13:18:14Z

7 votes

In the update to the accepted answer you do not see an example using the to_date function, so another solution using it would be:

from pyspark.sql import functions as F

df = df.withColumn(
    'new_date',
    F.to_date(
        F.unix_timestamp('STRINGCOLUMN', 'MM-dd-yyyy').cast('timestamp')))
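On Spark 2.2+ the unix_timestamp detour is unnecessary, since to_date itself accepts the format (a sketch using the same column name as above):

df = df.withColumn('new_date', F.to_date('STRINGCOLUMN', 'MM-dd-yyyy'))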

Manrique answered 2020-01-02T13:18:35Z

1 vote

Try this:

from pyspark.sql.functions import from_unixtime, unix_timestamp

df = spark.createDataFrame([('2018-07-27 10:30:00',)], ['Date_col'])
df2 = df.select(from_unixtime(unix_timestamp(df.Date_col, 'yyyy-MM-dd HH:mm:ss')).alias('dt_col'))
df2.show()

+-------------------+
|             dt_col|
+-------------------+
|2018-07-27 10:30:00|
+-------------------+

Vishwajeet Pol answered 2020-01-02T13:18:55Z

1 vote

There may not be many answers like this, so I thought of sharing my code, which may help someone.

from pyspark.sql import SparkSession
from pyspark.sql.functions import to_date

spark = SparkSession.builder.appName("Python Spark SQL basic example")\
    .config("spark.some.config.option", "some-value").getOrCreate()

df = spark.createDataFrame([('2019-06-22',)], ['t'])
df1 = df.select(to_date(df.t, 'yyyy-MM-dd').alias('dt'))
print(df1)
df1.show()

Output:

DataFrame[dt: date]

+----------+
|        dt|
+----------+
|2019-06-22|
+----------+

The code above converts the string to a date; if you want to convert to a datetime, use to_timestamp instead. Let me know if you have any doubts.
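For example, a minimal to_timestamp sketch (assuming the string carries a time component as well):

from pyspark.sql.functions import to_timestamp

df = spark.createDataFrame([('2019-06-22 10:30:00',)], ['t'])
df.select(to_timestamp(df.t, 'yyyy-MM-dd HH:mm:ss').alias('dt')).show()
#+-------------------+
#|                 dt|
#+-------------------+
#|2019-06-22 10:30:00|
#+-------------------+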

Santosh kumar Manda answered 2020-01-02T13:19:23Z
