I am trying to round off a double value in a Spark DataFrame so that no decimal point is shown, but the output still contains the same value.
Below are the DataFrame column values.
+-----+-----+
| SIG1| SIG2|
+-----+-----+
| 46.0| 46.0|
| 94.0| 46.0|
+-----+-----+
The schema of the DataFrame columns is as follows.
scala> df.printSchema
root
|-- SIG1: double (nullable = true)
|-- SIG2: double (nullable = true)
The expected output is as follows.
+-----+-----+
| SIG1| SIG2|
+-----+-----+
|   46|   46|
|   94|   46|
+-----+-----+
I have tried rounding the column as shown below, following the documentation:
+----------+---------------+---------------------------------------+
|ReturnType|Signature      |Description                            |
+----------+---------------+---------------------------------------+
|DOUBLE    |round(DOUBLE a)|Returns the rounded BIGINT value of a. |
+----------+---------------+---------------------------------------+
The code used is:
val df1 = df.withColumn("SIG1", round(col("SIG1"))).withColumn("SIG2", round(col("SIG2")))
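For context on why the decimal point remains: `round` applied to a `DoubleType` column still returns a `DoubleType` column, so `show` keeps printing `46.0`. A minimal sketch of one possible workaround (assuming `df` is the DataFrame from the question, with an active `SparkSession`) is to cast after rounding:

```scala
import org.apache.spark.sql.functions.{col, round}
import org.apache.spark.sql.types.IntegerType

// round() keeps the column as double, so cast the rounded
// result to an integral type to drop the ".0" in the output.
val df1 = df
  .withColumn("SIG1", round(col("SIG1")).cast(IntegerType))
  .withColumn("SIG2", round(col("SIG2")).cast(IntegerType))

df1.printSchema  // SIG1 and SIG2 are now integer (nullable = true)
```

Casting to `"bigint"` (i.e. `LongType`) instead of `IntegerType` would also work if the values can exceed the int range.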
Do we need to cast the column to int/bigint, or is it possible with the round function itself?
Thanks in advance!