PySpark : Determining the Last Day of the Month and Year from a Timestamp


Working with dates and times is a common operation in data processing. Sometimes, it’s necessary to compute the last day of a month or year based on a given date or timestamp. This article will guide you through how to accomplish these tasks using PySpark, the Python library for Apache Spark, with examples to enhance your understanding.

Setting up the Environment

Firstly, it’s important to set up our PySpark environment. Assuming you’ve installed Spark and PySpark correctly, you can initialize a SparkSession as follows:

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import TimestampType

spark = SparkSession.builder \
    .appName("Learning: Date and Time Operations") \
    .getOrCreate()

Understanding last_day and year Functions

The functions we’ll be utilizing in this tutorial are PySpark’s built-in last_day and year functions. The last_day function takes a date column and returns the last day of the month. The year function returns the year of a date as a number.

Getting the Last Day of the Month

To demonstrate, let’s create a DataFrame with some sample timestamps:

data = [("2023-01-15 13:45:30",), ("2023-02-22 08:20:00",), ("2023-07-07 22:15:00",)]
df = spark.createDataFrame(data, ["Timestamp"])
df = df.withColumn("Timestamp", F.col("Timestamp").cast(TimestampType()))
df.show(truncate=False)

+-------------------+
|Timestamp          |
+-------------------+
|2023-01-15 13:45:30|
|2023-02-22 08:20:00|
|2023-07-07 22:15:00|
+-------------------+

Now, we can use the last_day function to get the last day of the month for each timestamp:

df.withColumn("Last_Day_of_Month", F.last_day(F.col("Timestamp"))).show(truncate=False)
+-------------------+-----------------+
|Timestamp          |Last_Day_of_Month|
+-------------------+-----------------+
|2023-01-15 13:45:30|2023-01-31       |
|2023-02-22 08:20:00|2023-02-28       |
|2023-07-07 22:15:00|2023-07-31       |
+-------------------+-----------------+

The new Last_Day_of_Month column shows the last day of the month for each corresponding timestamp.

Getting the Last Day of the Year

Determining the last day of the year is slightly more complex, as there isn’t a built-in function for this in PySpark. However, we can accomplish it by combining the year function with some string manipulation. Here’s how:

df.withColumn("Year", F.year(F.col("Timestamp")))\
    .withColumn("Last_Day_of_Year", F.expr("make_date(Year, 12, 31)"))\
    .show(truncate=False)

In the code above, we first extract the year from the timestamp using the year function. Then, we construct a new date representing the last day of that year using the make_date function. The make_date function creates a date from the year, month, and day values.

While PySpark’s last_day function makes it straightforward to determine the last day of the month for a given date or timestamp, finding the last day of the year requires a bit more creativity. By combining the year and make_date functions, however, you can achieve this with relative ease.

+-------------------+----+----------------+
|Timestamp          |Year|Last_Day_of_Year|
+-------------------+----+----------------+
|2023-01-15 13:45:30|2023|2023-12-31      |
|2023-02-22 08:20:00|2023|2023-12-31      |
|2023-07-07 22:15:00|2023|2023-12-31      |
+-------------------+----+----------------+
