
I have a SQL Server query that calculates the previous quarter's end date. Now I need to convert this query into a Spark SQL query.

The SQL Server query, with some sample outputs:

select dateadd(dd, -1, dateadd(qq, datediff(qq, 0, cast('2020-09-09' AS DATE)), 0)) as end_dt

Output: 2020-06-30

To calculate the same for another date (2021-01-10):

select dateadd(dd, -1, dateadd(qq, datediff(qq, 0, cast('2021-01-10' AS DATE)), 0)) as end_dt

Output: 2020-12-31

Question: How can I implement the same in a Spark SQL (v2.4) query? That is, how do I calculate the last day of the previous quarter, matching the SQL Server examples above and the expected output below?

input_date | end_date
------------------------
2020-01-21 | 2019-12-31
2020-02-06 | 2019-12-31
2020-04-15 | 2020-03-31
2020-07-10 | 2020-06-30
2020-10-20 | 2020-09-30
2021-02-04 | 2020-12-31

Any help is appreciated. Thanks.
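For reference, the date arithmetic being asked for (truncate to the start of the current quarter, then step back one day) can be sketched in plain Python. This is only an illustration of the logic, not Spark code, and the function name is mine:

```python
from datetime import date, timedelta

def prev_quarter_end(d: date) -> date:
    # First month of d's quarter: 1, 4, 7, or 10
    first_month = 3 * ((d.month - 1) // 3) + 1
    quarter_start = date(d.year, first_month, 1)
    # The day before the quarter start is the last day of the previous quarter
    return quarter_start - timedelta(days=1)

print(prev_quarter_end(date(2020, 9, 9)))   # 2020-06-30
print(prev_quarter_end(date(2021, 1, 10)))  # 2020-12-31
```

This reproduces every row of the expected-output table above.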


2 Answers

1 vote

You can use date_trunc():

select date_add(
    date_trunc('quarter', cast('2020-09-09' as date)),
    -1
) as end_dt

Note that in Spark, date_trunc() takes the format string as its first argument: date_trunc('quarter', ...). Truncating to the start of the quarter and subtracting one day gives the last day of the previous quarter.
0 votes

To find the immediate (current) quarter-end date for a given date:

select add_months(last_day(date_trunc('quarter',to_timestamp('2020-12-03','yyyy-MM-dd'))),2) as end_dt
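The same "end of the current quarter" arithmetic in plain Python, for comparison (an illustration only; the function name is mine):

```python
from datetime import date
import calendar

def current_quarter_end(d: date) -> date:
    # Last month of d's quarter: 3, 6, 9, or 12
    last_month = 3 * ((d.month - 1) // 3 + 1)
    # Last calendar day of that month
    last_dom = calendar.monthrange(d.year, last_month)[1]
    return date(d.year, last_month, last_dom)

print(current_quarter_end(date(2020, 12, 3)))  # 2020-12-31
```

The Spark version above relies on add_months() preserving "last day of month": last_day() of the quarter's first month, shifted two months forward, lands on the quarter's last day.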

To find the immediate year-end date for a given date:

select add_months(last_day(date_trunc('year',to_timestamp('2020-06-03','yyyy-MM-dd'))),11) as end_dt

To find the immediate half-year (bi-annual) end date for a given date:

select case
         when quarter(to_timestamp('2020-11-03','yyyy-MM-dd')) >= 3
           then add_months(last_day(date_trunc('year',to_timestamp('2020-11-03','yyyy-MM-dd'))),11)
         else add_months(last_day(date_trunc('year',to_timestamp('2020-11-03','yyyy-MM-dd'))),5)
       end as end_dt
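The half-year rule above (Q1-Q2 map to June 30, Q3-Q4 to December 31) can be sketched in plain Python as well (illustration only; the function name is mine):

```python
from datetime import date

def half_year_end(d: date) -> date:
    # Quarter number 1-4; H1 (Q1-Q2) ends Jun 30, H2 (Q3-Q4) ends Dec 31
    quarter = (d.month - 1) // 3 + 1
    return date(d.year, 6, 30) if quarter < 3 else date(d.year, 12, 31)

print(half_year_end(date(2020, 11, 3)))  # 2020-12-31
```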