I have a SQL Server query that calculates the last day of the previous quarter, and I need to convert it into a Spark SQL query. The SQL Server query (it truncates the input date to the start of its quarter, then subtracts one day) and some sample results are:
select dateadd(dd, -1, dateadd(qq, datediff(qq, 0, cast('2020-09-09' AS DATE)), 0)) as end_dt
Output: 2020-06-30
The same query for another date (2021-01-10):
select dateadd(dd, -1, dateadd(qq, datediff(qq, 0, cast('2021-01-10' AS DATE)), 0)) as end_dt
Output: 2020-12-31
Question: How can I implement the same logic in a Spark SQL query (v2.4)? That is, how do I calculate the last day of the previous quarter, matching the SQL Server examples above and the expected output below?
input_date | end_date
------------------------
2020-01-21 | 2019-12-31
2020-02-06 | 2019-12-31
2020-04-15 | 2020-03-31
2020-07-10 | 2020-06-30
2020-10-20 | 2020-09-30
2021-02-04 | 2020-12-31
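Based on the Spark documentation, something like the following might work as a direct translation (an untested sketch; it assumes date_trunc in Spark 2.4 accepts 'QUARTER' as a truncation unit, and my_table / input_date are placeholder names):
-- truncate to the start of the current quarter, then step back one day
select input_date,
       date_sub(to_date(date_trunc('QUARTER', input_date)), 1) as end_date
from my_table
Is this the right approach, or is there a cleaner built-in way?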
Any help is appreciated. Thanks.