I have a problem running a Spark SQL query that uses a nested select with a "where in" clause. In the query below, table1 represents a temporary table that comes from a more complicated query; in the end I want to substitute that query for table1.
select * from (select * from table1) as table2
where (product, price)
in (select product, min(price) from table2 group by product)
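For context, here is a minimal PySpark sketch of how I run it. The data and column values are just placeholders; my real table1 is produced by the more complicated query mentioned above:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("repro").getOrCreate()

# Placeholder rows standing in for the real, more complicated query
df = spark.createDataFrame(
    [("a", 10.0), ("a", 5.0), ("b", 7.0)],
    ["product", "price"],
)
df.createOrReplaceTempView("table1")

# This is where the AnalysisException about table2 is raised
spark.sql("""
    select * from (select * from table1) as table2
    where (product, price)
    in (select product, min(price) from table2 group by product)
""").show()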
The Spark error I get says:
AnalysisException: 'Table or view not found: table2;
How can I change the query to make it work as intended?