pyspark.sql.functions.make_timestamp
pyspark.sql.functions.make_timestamp(years: ColumnOrName, months: ColumnOrName, days: ColumnOrName, hours: ColumnOrName, mins: ColumnOrName, secs: ColumnOrName, timezone: Optional[ColumnOrName] = None) → pyspark.sql.column.Column
- Create timestamp from years, months, days, hours, mins, secs and timezone fields. The result data type is consistent with the value of the configuration spark.sql.timestampType. If the configuration spark.sql.ansi.enabled is false, the function returns NULL on invalid inputs. Otherwise, it will throw an error instead.

- New in version 3.5.0.

- Parameters
- years : Column or str
- the year to represent, from 1 to 9999
- months : Column or str
- the month-of-year to represent, from 1 (January) to 12 (December)
- days : Column or str
- the day-of-month to represent, from 1 to 31
- hours : Column or str
- the hour-of-day to represent, from 0 to 23
- mins : Column or str
- the minute-of-hour to represent, from 0 to 59
- secs : Column or str
- the second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be either an integer like 13, or a fraction like 13.123. If the sec argument equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp.
- timezone : Column or str
- the time zone identifier. For example, CET, UTC, etc.
 
- Examples

>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
...     ["year", "month", "day", "hour", "min", "sec", "timezone"])
>>> df.select(make_timestamp(
...     df.year, df.month, df.day, df.hour, df.min, df.sec, df.timezone).alias('r')
... ).show(truncate=False)
+-----------------------+
|r                      |
+-----------------------+
|2014-12-27 21:30:45.887|
+-----------------------+

>>> df.select(make_timestamp(
...     df.year, df.month, df.day, df.hour, df.min, df.sec).alias('r')
... ).show(truncate=False)
+-----------------------+
|r                      |
+-----------------------+
|2014-12-28 06:30:45.887|
+-----------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")
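
As an additional sketch (not part of the upstream example set), the snippet below illustrates the seconds rollover described for secs: a value of 60 sets the seconds field to 0 and adds one minute to the result. The show() output assumes default settings and may render slightly differently depending on your Spark version and session time zone configuration.

>>> from pyspark.sql.functions import make_timestamp
>>> df2 = spark.createDataFrame([[2014, 12, 28, 6, 59, 60]],
...     ["year", "month", "day", "hour", "min", "sec"])
>>> df2.select(make_timestamp(
...     df2.year, df2.month, df2.day, df2.hour, df2.min, df2.sec).alias('r')
... ).show(truncate=False)
+-------------------+
|r                  |
+-------------------+
|2014-12-28 07:00:00|
+-------------------+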