pyspark.sql.functions.make_timestamp_ntz

pyspark.sql.functions.make_timestamp_ntz(years, months, days, hours, mins, secs)

Create a local date-time from the years, months, days, hours, mins, and secs fields. If the configuration spark.sql.ansi.enabled is false, the function returns NULL on invalid inputs; otherwise, it throws an error (see the examples below).

New in version 3.5.0.

Parameters
years : Column or column name

The year to represent, from 1 to 9999

months : Column or column name

The month-of-year to represent, from 1 (January) to 12 (December)

days : Column or column name

The day-of-month to represent, from 1 to 31

hours : Column or column name

The hour-of-day to represent, from 0 to 23

mins : Column or column name

The minute-of-hour to represent, from 0 to 59

secs : Column or column name

The second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be either an integer like 13, or a fraction like 13.123. If secs equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp, as illustrated in the examples below.

Returns
Column

A new column that contains a local date-time.

Examples

>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887]],
...     ['year', 'month', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.make_timestamp_ntz('year', 'month', df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|2014-12-28 06:30:45.887                             |
+----------------------------------------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")
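
If secs equals 60, the seconds field rolls over as documented: it is set to 0 and one minute is added. A minimal sketch of that behavior (output assumes the default session configuration):

>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 60]],
...     ['year', 'month', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.make_timestamp_ntz('year', 'month', 'day', 'hour', 'min', 'sec')
... ).show(truncate=False)
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|2014-12-28 06:31:00                                 |
+----------------------------------------------------+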
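
With spark.sql.ansi.enabled set to false, invalid field values yield NULL rather than an error. A minimal sketch (months=13 is out of range; the NULL rendering shown assumes Spark's default display):

>>> spark.conf.set("spark.sql.ansi.enabled", False)
>>> df = spark.createDataFrame([[2014, 13, 28, 6, 30, 45.887]],
...     ['year', 'month', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.make_timestamp_ntz('year', 'month', 'day', 'hour', 'min', 'sec')
... ).show(truncate=False)
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|NULL                                                |
+----------------------------------------------------+
>>> spark.conf.unset("spark.sql.ansi.enabled")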