33 votes

What is the equivalent of the SQL LIKE operator in PySpark? For example, I would like to do:

SELECT * FROM table WHERE column LIKE "*somestring*";

I'm looking for something simple like this (but it is not working):

df.select('column').where(col('column').like("*s*")).show()
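A note on the patterns above: SQL LIKE uses % (any sequence of characters) and _ (any single character) as wildcards, not shell-style *, so a pattern like "*s*" only matches literal asterisks. A minimal pure-Python sketch of those semantics (like_to_regex is a hypothetical illustrative helper, not part of PySpark):

```python
import re

def like_to_regex(pattern: str) -> str:
    # Hypothetical helper: translate a SQL LIKE pattern into a regex.
    # '%' matches any sequence of characters, '_' matches exactly one;
    # everything else is escaped and matched literally.
    parts = []
    for ch in pattern:
        if ch == '%':
            parts.append('.*')
        elif ch == '_':
            parts.append('.')
        else:
            parts.append(re.escape(ch))
    return '^' + ''.join(parts) + '$'

# '%somestring%' matches any value containing 'somestring'...
print(bool(re.match(like_to_regex('%somestring%'), 'xx somestring yy')))  # True
# ...while '*somestring*' only matches literal asterisks, hence "not working":
print(bool(re.match(like_to_regex('*somestring*'), 'xx somestring yy')))  # False
```

This is why like("%s%") behaves as a contains-"s" filter, while like("*s*") matches almost nothing.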
9 votes

This is Scala, but PySpark will be essentially identical to this answer: stackoverflow.com/questions/35759099/… - Jeff

9 Answers

51 votes

You can use the where and col functions together: where filters rows based on a condition (here, whether the column matches the pattern '%string%'), col('col_name') references the column, and like applies the SQL LIKE pattern, with % as the wildcard:

from pyspark.sql.functions import col

df.where(col('col1').like("%string%")).show()
13 votes

From Spark 2.0.0 onwards, the following also works, passing the condition as a SQL expression string:

df.select('column').where("column like '%s%'").show()

5 votes

The like method on a column accepts SQL-style patterns directly, just as in plain SQL:

from pyspark.sql.functions import col

df.select('column').where(col('column').like("%s%")).show()
3 votes

To replicate the case-insensitive ILIKE, you can use lower in conjunction with like (from Spark 3.3 onwards there is also a built-in ilike method on Column):

from pyspark.sql.functions import col, lower

df.where(lower(col('col1')).like("%string%")).show()
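The lower-plus-like combination above is equivalent to a case-insensitive substring test whenever the pattern itself is lowercase. A plain-Python sketch of that semantics (ilike_contains is a hypothetical helper for illustration only):

```python
def ilike_contains(value: str, needle: str) -> bool:
    # Hypothetical helper mirroring lower(col).like('%needle%'):
    # lowercase both sides, then test for the substring.
    return needle.lower() in value.lower()

print(ilike_contains('SomeString', 'string'))  # True
print(ilike_contains('other', 'string'))       # False
```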
2 votes

In PySpark you can always register the DataFrame as a temporary table and query it with SQL. Note that LIKE uses % wildcards, not *, and that on Spark 2.0+ createOrReplaceTempView is preferred over the deprecated registerTempTable:

df.registerTempTable('my_table')
query = """SELECT * FROM my_table WHERE column LIKE '%somestring%'"""
sqlContext.sql(query).show()
2 votes

Using Spark 2.4, to negate the match you can simply do:

df = df.filter("column not like '%bla%'")
1 vote

This worked for me:

import pyspark.sql.functions as f
df.where(f.col('column').like("%x%")).show()
-3 votes

I always use a UDF to implement such functionality (note, though, that the built-in like is usually faster, since a Python UDF has per-row serialization overhead):

from pyspark.sql import functions as F
from pyspark.sql.types import BooleanType

like_f = F.udf(lambda s: 's' in s, BooleanType())
df.filter(like_f('column')).select('column')
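The UDF above simply wraps a plain Python predicate that Spark then applies row by row. A sketch of that logic on hypothetical sample data, which is why the built-in like('%s%') filter is equivalent here:

```python
# The predicate the UDF wraps, applied per row (sample rows are made up).
like_f = lambda s: 's' in s

rows = ['somestring', 'other', 'base']
matches = [r for r in rows if like_f(r)]
print(matches)  # ['somestring', 'base']
```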