pyspark.sql.functions.make_timestamp#

pyspark.sql.functions.make_timestamp(years, months, days, hours, mins, secs, timezone=None)[source]#

Create a timestamp from the years, months, days, hours, mins, secs and timezone fields. The result data type is consistent with the value of the configuration spark.sql.timestampType. If the configuration spark.sql.ansi.enabled is false, the function returns NULL on invalid inputs. Otherwise, it throws an error.

New in version 3.5.0.

Parameters
years : Column or column name

The year to represent, from 1 to 9999

months : Column or column name

The month-of-year to represent, from 1 (January) to 12 (December)

days : Column or column name

The day-of-month to represent, from 1 to 31

hours : Column or column name

The hour-of-day to represent, from 0 to 23

mins : Column or column name

The minute-of-hour to represent, from 0 to 59

secs : Column or column name

The second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be either an integer like 13, or a fraction like 13.123. If the secs argument equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp.

timezone : Column or column name, optional

The time zone identifier. For example, CET or UTC.

Returns
Column

A new column that contains a timestamp.

Examples

>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")

Example 1: Make timestamp from years, months, days, hours, mins and secs.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
...     ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
>>> df.select(
...     sf.make_timestamp(df.year, df.month, df.day, 'hour', df.min, df.sec, 'tz')
... ).show(truncate=False)
+----------------------------------------------------+
|make_timestamp(year, month, day, hour, min, sec, tz)|
+----------------------------------------------------+
|2014-12-27 21:30:45.887                             |
+----------------------------------------------------+

Example 2: Make timestamp without timezone.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
...     ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
>>> df.select(
...     sf.make_timestamp(df.year, df.month, df.day, 'hour', df.min, df.sec)
... ).show(truncate=False)
+------------------------------------------------+
|make_timestamp(year, month, day, hour, min, sec)|
+------------------------------------------------+
|2014-12-28 06:30:45.887                         |
+------------------------------------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")