I have two PySpark DataFrames.

df1 has columns a, b, c, d, e, f; df2 has columns c, d, e (the column names change dynamically).

I want a DataFrame df3 extracted from df1 based on the column names of df2. So basically I want to select columns from df1 based on the columns in df2 (whose columns keep changing).

In the above example, the result DataFrame should have columns c, d, e (taken from df1).

I am unable to find any method which can achieve this. Please help.
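A minimal sketch of one way to do this: since `DataFrame.columns` returns a plain Python list of column names, you can pass df2's columns to `df1.select`. If df2 might contain columns that don't exist in df1, intersect the two lists first. The PySpark calls are shown as comments; the hard-coded lists below are hypothetical stand-ins for `df1.columns` and `df2.columns`:

```python
# Stand-ins for df1.columns and df2.columns (assumed example values)
df1_cols = ["a", "b", "c", "d", "e", "f"]
df2_cols = ["c", "d", "e"]

# Keep only the names present in both, preserving df1's column order
common = [c for c in df1_cols if c in df2_cols]
print(common)  # ['c', 'd', 'e']

# In PySpark this would be:
# df3 = df1.select(*common)
# or, if df2's columns are guaranteed to be a subset of df1's:
# df3 = df1.select(*df2.columns)
```

The intersection step is only needed as a safety net; when df2's columns are always a subset of df1's, `df1.select(*df2.columns)` alone is enough.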