pyspark.sql.functions.from_unixtime

pyspark.sql.functions.from_unixtime(timestamp, format='yyyy-MM-dd HH:mm:ss')

Converts the number of seconds since the Unix epoch (1970-01-01 00:00:00 UTC) to a string representing that moment in the current session time zone, in the given format.

New in version 1.5.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
timestamp : Column or str

column of unix time values.

format : str, optional

format to convert the timestamp values to (default: yyyy-MM-dd HH:mm:ss).

Returns
Column

formatted timestamp as a string.

Examples

>>> from pyspark.sql.functions import from_unixtime
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> time_df = spark.createDataFrame([(1428476400,)], ['unix_time'])
>>> time_df.select(from_unixtime('unix_time').alias('ts')).collect()
[Row(ts='2015-04-08 00:00:00')]
>>> spark.conf.unset("spark.sql.session.timeZone")
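
A minimal follow-up sketch, assuming the time_df DataFrame and session time-zone setting from the example above, showing the optional format parameter with a date-only pattern:

>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> # Custom pattern keeps only the date portion of the converted timestamp
>>> time_df.select(from_unixtime('unix_time', 'yyyy-MM-dd').alias('date')).collect()
[Row(date='2015-04-08')]
>>> spark.conf.unset("spark.sql.session.timeZone")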