Do we need to install Hadoop and Spark separately, place winutils.exe under HADOOP_HOME, and configure the environment variables, even when we create a Maven project and declare all the dependencies in the POM file?
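For context, this is the kind of POM setup I mean, a minimal sketch where the Spark and Scala versions are just example placeholders I picked, not a recommendation:

```xml
<!-- Minimal sketch; artifact versions below are example placeholders -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.3.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.3.0</version>
  </dependency>
</dependencies>
```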
I am using the IntelliJ Scala plugin and Spark.
In short: how can I create a Maven project in IntelliJ using Spark and Scala? Any suggestions would be very helpful. I checked the YouTube link below: https://youtu.be/cU3FshbeeFo, but I am still in doubt whether I have to put winutils in HADOOP_HOME and set SPARK_HOME.
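For reference, this is the kind of minimal test program I am trying to run in local mode. The `hadoop.home.dir` line is a workaround I have seen suggested for the winutils issue on Windows; the `C:\hadoop` path is just a placeholder for wherever winutils.exe would live:

```scala
import org.apache.spark.sql.SparkSession

object SparkTest {
  def main(args: Array[String]): Unit = {
    // Assumption: winutils.exe is under C:\hadoop\bin; adjust to your path.
    // This is the programmatic alternative to setting HADOOP_HOME.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")

    val spark = SparkSession.builder()
      .appName("SparkTest")
      .master("local[*]") // local mode, no separate Spark installation
      .getOrCreate()

    // Simple sanity check that the session works
    spark.range(10).show()

    spark.stop()
  }
}
```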
Please help me with your suggestions.