pyspark.sql.functions.unix_date
pyspark.sql.functions.unix_date(col)
Returns the number of days between the given date and the Unix epoch (1970-01-01).
New in version 3.5.0.
Examples
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> df = spark.createDataFrame([('1970-01-02',)], ['t'])
>>> df.select(unix_date(to_date(df.t)).alias('n')).collect()
[Row(n=1)]
>>> spark.conf.unset("spark.sql.session.timeZone")
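The value unix_date produces can be sketched in plain Python as the whole-day difference between a date and the epoch. This is an illustration of the semantics, not Spark's implementation; the helper name days_since_epoch is made up for this sketch.

```python
from datetime import date

def days_since_epoch(d: date) -> int:
    # Whole days between d and the Unix epoch (1970-01-01).
    # Dates before the epoch yield negative values.
    return (d - date(1970, 1, 1)).days

print(days_since_epoch(date(1970, 1, 2)))    # 1, matching the doctest above
print(days_since_epoch(date(1969, 12, 31)))  # -1
```

Note that unlike timestamp-based functions, this computation is purely date arithmetic, so the session time zone does not affect the result once the input is already a date.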