pyspark.sql.functions.make_timestamp_ntz
- pyspark.sql.functions.make_timestamp_ntz(years, months, days, hours, mins, secs)
Create a local date-time from years, months, days, hours, mins, and secs fields. If the configuration spark.sql.ansi.enabled is false, the function returns NULL on invalid inputs; otherwise, it throws an error.
New in version 3.5.0.
- Parameters
- years
Column or column name. The year to represent, from 1 to 9999.
- months
Column or column name. The month-of-year to represent, from 1 (January) to 12 (December).
- days
Column or column name. The day-of-month to represent, from 1 to 31.
- hours
Column or column name. The hour-of-day to represent, from 0 to 23.
- mins
Column or column name. The minute-of-hour to represent, from 0 to 59.
- secs
Column or column name. The second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be either an integer like 13, or a fraction like 13.123. If the secs argument equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp (see the rollover sketch after the examples).
- Returns
Column. A new column that contains a local date-time.
See also
pyspark.sql.functions.make_timestamp()
pyspark.sql.functions.make_timestamp_ltz()
pyspark.sql.functions.try_make_timestamp()
pyspark.sql.functions.try_make_timestamp_ltz()
pyspark.sql.functions.try_make_timestamp_ntz()
pyspark.sql.functions.make_time()
pyspark.sql.functions.make_interval()
pyspark.sql.functions.try_make_interval()
Examples
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887]],
...     ['year', 'month', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.make_timestamp_ntz('year', 'month', df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|2014-12-28 06:30:45.887                             |
+----------------------------------------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")
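The rollover behavior for secs=60 and the NULL-on-invalid-input behavior described above can be sketched as follows. This is a minimal illustration, not part of the official examples: it assumes a running SparkSession named spark, uses spark.range(1) purely as a one-row driver DataFrame, and explicitly disables spark.sql.ansi.enabled so that invalid fields produce NULL rather than an error. Expected results are noted in comments rather than shown as output.
>>> import pyspark.sql.functions as sf
>>> spark.conf.set("spark.sql.ansi.enabled", "false")
>>> # secs equal to 60: the seconds field is set to 0 and one minute is added,
>>> # so the resulting local date-time should be 2014-12-28 06:31:00.
>>> spark.range(1).select(
...     sf.make_timestamp_ntz(
...         sf.lit(2014), sf.lit(12), sf.lit(28), sf.lit(6), sf.lit(30), sf.lit(60)
...     )
... ).show(truncate=False)
>>> # An out-of-range field such as month=13 returns NULL because ANSI mode is
>>> # disabled; with spark.sql.ansi.enabled set to true it would raise an error.
>>> spark.range(1).select(
...     sf.make_timestamp_ntz(
...         sf.lit(2014), sf.lit(13), sf.lit(28), sf.lit(6), sf.lit(30), sf.lit(45.887)
...     )
... ).show(truncate=False)
>>> spark.conf.unset("spark.sql.ansi.enabled")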