I am trying to add a new column to my Spark DataFrame. The new column will be an array whose size is based on a variable (say salt), after which I will explode on that column to perform a salted join.
Currently, I am using consecutive lit calls inside an array function, but that approach cannot be parameterized and is poor coding practice. My current implementation looks something like this:
int salt = 3;
Dataset<Row> Reference_with_Salt_Col = Reference.withColumn("salt_array", array(lit(0), lit(1), lit(2)));
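One idea I had was to build the Column[] programmatically and pass it to array(Column...), roughly along these lines (an untested sketch of what I am aiming for, not something I am sure is idiomatic):

import org.apache.spark.sql.Column;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import java.util.stream.IntStream;
import static org.apache.spark.sql.functions.array;
import static org.apache.spark.sql.functions.lit;

int salt = 3;
// Build lit(0), lit(1), ..., lit(salt - 1) and collect them into a Column[]
Column[] saltColumns = IntStream.range(0, salt)
        .mapToObj(i -> lit(i))
        .toArray(Column[]::new);
// Pass the whole array to functions.array(Column...) so the size follows salt
Dataset<Row> Reference_with_Salt_Col =
        Reference.withColumn("salt_array", array(saltColumns));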
I have looked at various approaches, but none of them seems to solve the problem in Java. The functions.typedLit approach, though it works in Python/Scala, doesn't seem to work in Java. Passing an array or a list also doesn't help; Spark throws an error either way.
I am using Spark 2.2.0 and Java 1.8.
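For context, the way I plan to use the new column afterwards is roughly like this (simplified sketch; column names other than salt_array are placeholders), exploding the array so each reference row is duplicated salt times before joining on the original key plus the salt value:

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.explode;

// Explode the salt array: one row per (original row, salt value) pair
Dataset<Row> explodedReference = Reference_with_Salt_Col
        .withColumn("salt", explode(col("salt_array")));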