I am converting a SQL Server query to Spark and am having trouble with this clause:
and not exists (Select 1 from @TT t2 where t2.TID = f.ID)
I have worked on it and understood that Spark does not support the NOT EXISTS construct, so I used EXCEPT instead, but I get this error:
pyspark.sql.utils.AnalysisException: u'Except can only be performed on tables with the same number of columns, but the left table has 7 columns and the right has 31;'
I tried an inner join as well. Since EXCEPT does not work when the two tables have different numbers of columns, what would be a compatible alternative to this query in Spark? Kindly help me; I am using PySpark 2.0.