2 votes

I'm looking to apply the 'rlike' function on a column, but instead of a literal regular-expression string I want to pass in another column (whose values are regular-expression strings).

i.e. $col1.rlike($col2), where $col2 holds the regular expression for each row of the dataframe

I have tried applying a UDF: def rLike = udf((s: String, col: Column) => col.rlike(s))

This keeps giving me the error:

java.lang.UnsupportedOperationException: Schema for type org.apache.spark.sql.Column is not supported

Could anyone please shed some light on how I can fix this?

Changing col: Column to col: String should work. The column's values are strings, so String is the correct input type for a UDF. - Shaido

1 Answer

1 vote

The rlike method does not support a matching pattern stored in a Column. One alternative is to use regexp_replace, as shown below; note that this trick tests whether the pattern matches the entire string:

import org.apache.spark.sql.functions._
import spark.implicits._

val df = Seq(
  ("a123", "[a-z]\\d+"),
  ("b456", "[a-z]+")
).toDF("text", "pattern")

val matched = "Matched!"  // can be any value non-existent in column `text`

df.where(regexp_replace($"text", $"pattern", lit(matched)) === matched).show
// +----+--------+
// |text| pattern|
// +----+--------+
// |a123|[a-z]\d+|
// +----+--------+
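As an aside, Spark's SQL parser itself accepts a column on the right-hand side of RLIKE, so expr can express this without a UDF. A sketch assuming the same sample data (the local SparkSession is included only to make it self-contained):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.expr

val spark = SparkSession.builder.master("local[1]").appName("rlike-column").getOrCreate()
import spark.implicits._

val df = Seq(
  ("a123", "[a-z]\\d+"),
  ("b456", "[a-z]+")
).toDF("text", "pattern")

// In SQL form, RLIKE reads the pattern from the `pattern` column row by row
df.where(expr("text rlike pattern")).show
```

Since rlike performs a substring search rather than a full-string match, this keeps both sample rows here ([a-z]+ already matches the b in b456), unlike the regexp_replace trick above.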

In case you would like to implement a custom rlike as a UDF (which generally doesn't scale as well as native Spark API functions), here's one way. Matching with case pattern.r() would anchor to the whole string and silently fail for patterns containing capture groups, so findFirstIn is used instead to mirror rlike's substring-search semantics:

def rlike = udf { (text: String, pattern: String) =>
  // true if the pattern matches anywhere within the text, as rlike does
  pattern.r.findFirstIn(text).isDefined
}

// Applying the UDF
df.where(rlike($"text", $"pattern"))
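One caveat: like most UDFs, this will throw a NullPointerException if either column contains nulls. A null-safe sketch, assuming a null text or null pattern should simply not match:

```scala
import org.apache.spark.sql.functions.udf

// Returns false for null inputs instead of throwing
def rlikeNullSafe = udf { (text: String, pattern: String) =>
  text != null && pattern != null && pattern.r.findFirstIn(text).isDefined
}
```

Usage is the same: df.where(rlikeNullSafe($"text", $"pattern")).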