Can we write a Hive query inside a Spark UDF? For example, I have two tables, A and B, where column b1 of B holds a column name of A, b2 holds a value that column can take in A, and b3 holds the replacement value. I want to query the tables so that I get the output shown as Result: basically, replace each value in a column of A with the matching b3 from B, looked up by column name (b1) and value (b2). To achieve that I wrote a Spark UDF, convert, as below:
def convert(colname: String, colvalue: String): String = {
  // Look up the replacement value in B for this column name / value pair.
  sqlContext.sql(s"SELECT b3 FROM B WHERE b1 = '$colname' AND b2 = '$colvalue'")
    .first().getString(0)
}
I registered it as:
sqlContext.udf.register("conv", convert _)
My main query is:
val result = sqlContext.sql("SELECT a1, conv('a2', a2), conv('a3', a3) FROM A")
result.take(2)
This fails with a java.lang.NullPointerException.
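My suspicion is that sqlContext only exists on the driver, so calling it from inside a UDF (which runs on the executors) is what triggers the NPE. If that is the case, would a workaround along these lines be reasonable? This is only a sketch: collect B to the driver, broadcast it, and make the UDF a plain map lookup (here sc is the SparkContext, and I fall back to the original value when there is no mapping):

val bMap = sqlContext.sql("SELECT b1, b2, b3 FROM B")
  .collect()
  .map(r => (r.getString(0), r.getString(1)) -> r.getString(2))
  .toMap
// Broadcast the lookup map so the executors get it without touching sqlContext.
val bMapBc = sc.broadcast(bMap)

sqlContext.udf.register("conv",
  (colname: String, colvalue: String) =>
    bMapBc.value.getOrElse((colname, colvalue), colvalue))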
Can someone please tell me whether this kind of lookup is supported in Spark/Hive? Any other approach is also welcome; the only alternative I can think of myself is a join rewrite, sketched below.
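For completeness, the join-based rewrite I have in mind would avoid the UDF entirely: join B once per column of A, matching on the column-name literal (b1) and the column's current value (b2), and keep the original value when no mapping exists. Again just a sketch, assuming the schemas described above:

// One LEFT JOIN against B per column of A that needs translating.
val result = sqlContext.sql("""
  SELECT a.a1,
         COALESCE(ba2.b3, a.a2) AS a2,
         COALESCE(ba3.b3, a.a3) AS a3
  FROM A a
  LEFT JOIN B ba2 ON ba2.b1 = 'a2' AND ba2.b2 = a.a2
  LEFT JOIN B ba3 ON ba3.b1 = 'a3' AND ba3.b2 = a.a3
""")
result.take(2)

Thanks!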