0 votes

This is the snapshot taken after adding a column, but it does not contain the sum of all the values of that column.

I am trying to add a column to a DataFrame that contains the sum of all the values of another column in the same DataFrame.

For example: in the picture there are the columns UserID, MovieID, Rating, and UnixTimeStamp. Now I want to add a column named Sum that will contain the sum of all the values of the Rating column.

I have a Ratings DataFrame.

Ratings DataFrame column names: UserID, MovieID, Rating, UnixTimeStamp.

+------+-------+------+-------------+
|UserID|MovieID|Rating|UnixTimeStamp|
+------+-------+------+-------------+
|   196|    242|     3|    881250949|
|   186|    302|     3|    891717742|
|    22|    377|     1|    878887116|
|   244|     51|     2|    880606923|
|   166|    346|     1|    886397596|
+------+-------+------+-------------+

only showing top 5 rows

I have to calculate the wa_rating and store it in a DataFrame.

wa_rating = (rating > 3) / total ratings
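
(To make the denominator concrete: for the five sample rows above, the Rating values sum to 3 + 3 + 1 + 2 + 1 = 10, so total ratings would be 10 for every row.)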

Please help me to build the wa_rating DataFrame, which contains this as a new column, using Scala with Spark.

1
Can you provide an expected input and expected output example? That way it would be easier to understand your requirements. - Constantine
Yes sure, I will provide the same. I have a Ratings DataFrame with the columns UserID, MovieID, Rating, and UnixTimeStamp (the sample rows are shown above). I have to calculate wa_rating and store it in a DataFrame: wa_rating = (rating > 3) / total ratings. - vikash patwa

1 Answer

3 votes

Check this out:

scala> val df = Seq((196,242,3,881250949),(186,302,3,891717742),(22,377,1,878887116),(244,51,2,880606923),(166,346,1,886397596)).toDF("userid","movieid","rating","unixtimestamp")
df: org.apache.spark.sql.DataFrame = [userid: int, movieid: int ... 2 more fields]

scala> df.show(false)
+------+-------+------+-------------+
|userid|movieid|rating|unixtimestamp|
+------+-------+------+-------------+
|196   |242    |3     |881250949    |
|186   |302    |3     |891717742    |
|22    |377    |1     |878887116    |
|244   |51     |2     |880606923    |
|166   |346    |1     |886397596    |
+------+-------+------+-------------+


scala> import org.apache.spark.sql.expressions._
import org.apache.spark.sql.expressions._

scala> val df2 = df.withColumn("total_rating",sum('rating).over())
df2: org.apache.spark.sql.DataFrame = [userid: int, movieid: int ... 3 more fields]

scala> df2.show(false)
19/01/23 08:38:46 WARN window.WindowExec: No Partition Defined for Window operation! Moving all data to a single partition, this can cause serious performance degradation.
+------+-------+------+-------------+------------+
|userid|movieid|rating|unixtimestamp|total_rating|
+------+-------+------+-------------+------------+
|22    |377    |1     |878887116    |10          |
|244   |51     |2     |880606923    |10          |
|166   |346    |1     |886397596    |10          |
|196   |242    |3     |881250949    |10          |
|186   |302    |3     |891717742    |10          |
+------+-------+------+-------------+------------+


scala> df2.withColumn("wa_rating",coalesce( when('rating >= 3,'rating),lit(0))/'total_rating).show(false)
19/01/23 08:47:49 WARN window.WindowExec: No Partition Defined for Window operation! Moving all data to a single partition, this can cause serious performance degradation.
+------+-------+------+-------------+------------+---------+
|userid|movieid|rating|unixtimestamp|total_rating|wa_rating|
+------+-------+------+-------------+------------+---------+
|22    |377    |1     |878887116    |10          |0.0      |
|244   |51     |2     |880606923    |10          |0.0      |
|166   |346    |1     |886397596    |10          |0.0      |
|196   |242    |3     |881250949    |10          |0.3      |
|186   |302    |3     |891717742    |10          |0.3      |
+------+-------+------+-------------+------------+---------+


scala>
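
As the warnings in the output above note, sum('rating).over() with an empty window moves all rows to a single partition. A minimal alternative sketch (not part of the original answer; df is the DataFrame built at the top of this answer, and the variable names totalRating and waRating are only illustrative) computes the grand total once with an aggregation and attaches it as a literal column, using the same >= 3 threshold as the answer (the question writes rating > 3):

// Alternative sketch: aggregate the total once, then attach it with lit(),
// avoiding the "No Partition Defined for Window operation" warning shown above.
import org.apache.spark.sql.functions._

val totalRating = df.agg(sum("rating")).first().getLong(0)   // 10 for the sample data

val waRating = df
  .withColumn("total_rating", lit(totalRating))
  .withColumn("wa_rating",
    coalesce(when(col("rating") >= 3, col("rating")), lit(0)) / col("total_rating"))

waRating.show(false)

For the sample data this gives the same total_rating and wa_rating values as the window-based version, but the total is computed in an ordinary aggregation instead of a single-partition window.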