
I am facing a problem converting the below query to Spark SQL in PySpark. The SQL Server query is:

coalesce((Select top 1 f2.ChargeAmt from Fact_CMCharges f2
        where f2.BldgID = f.BldgID 
        and f2.LeaseID = f.LeaseID
        and f2.IncomeCat = f.IncomeCat
        and f2.Period < f.Period
        and f2.ActualProjected = 'Lease'
        order by f2.Period desc),0) as Charge

I did not find a replacement keyword for TOP in PySpark SQL. Kindly help me with how I could convert this query to PySpark SQL.
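
A minimal sketch of the direct rewrite I have been trying, with TOP 1 swapped for LIMIT 1 and Fact_CMCharges assumed to be registered as a temp view (the outer query here is only illustrative, since I have not shown mine above). I am not sure Spark SQL accepts a correlated subquery written this way, which is why I am asking:

    # Assumes fact_df.createOrReplaceTempView("Fact_CMCharges") has been run.
    attempt = spark.sql("""
        SELECT f.*,
               coalesce((SELECT f2.ChargeAmt
                         FROM Fact_CMCharges f2
                         WHERE f2.BldgID = f.BldgID
                           AND f2.LeaseID = f.LeaseID
                           AND f2.IncomeCat = f.IncomeCat
                           AND f2.Period < f.Period
                           AND f2.ActualProjected = 'Lease'
                         ORDER BY f2.Period DESC
                         LIMIT 1), 0) AS Charge
        FROM Fact_CMCharges f
    """)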

I'm not familiar with SQL Server. Would you care to explain what it does and provide a link to its documentation? - eliasah
Well, 'TOP' works like the LIMIT keyword in MySQL, as I have understood it. Here this command actually picks the top 1 row for the given condition in the WHERE clause. The COALESCE returns the result of that subquery, or 0 by default when it is null. - Kalyan
Can you give an example of the input data and the expected output, because it's still not very clear. - eliasah
w3schools.com/sql/… this link shows an example of the SQL Server TOP statement. - Kalyan

1 Answer


Since you said Spark SQL, if you have a DataFrame `df`, then you can use something like this:

df.limit(1).show()
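
Note that limit(1) on its own takes one row from the whole DataFrame. If what you need is the per-group "most recent earlier charge" behaviour of the correlated TOP 1 subquery, a window-function sketch could look like the one below. It assumes the outer alias f in your query also reads from Fact_CMCharges (the outer query is not shown), that the table is loaded into a DataFrame named fact, and that ties in Period can be ignored:

    from pyspark.sql import functions as F
    from pyspark.sql.window import Window

    # For each row, look only at strictly earlier rows in the same
    # BldgID / LeaseID / IncomeCat group.
    w = (Window.partitionBy("BldgID", "LeaseID", "IncomeCat")
               .orderBy("Period")
               .rowsBetween(Window.unboundedPreceding, -1))

    # Take the last ChargeAmt among those earlier rows where
    # ActualProjected = 'Lease'; fall back to 0 when there is none,
    # mirroring the COALESCE(..., 0) in the original query.
    result = fact.withColumn(
        "Charge",
        F.coalesce(
            F.last(
                F.when(F.col("ActualProjected") == "Lease", F.col("ChargeAmt")),
                ignorenulls=True,
            ).over(w),
            F.lit(0),
        ),
    )
    result.show()

Keeping the ActualProjected = 'Lease' test inside the when() applies it only to the looked-up row, which matches the predicate on f2 in the original subquery.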