0 votes

Do we need to install Hadoop and Spark separately, put winutils under HADOOP_HOME, and configure the environment variables, even when we create a Maven project and declare all the dependencies in the POM file?

I am using the IntelliJ Scala plug-in and Spark.

In short: how can I create a Maven project in IntelliJ using Spark and Scala? Any suggestions would be very helpful. I checked the YouTube video below (Link: https://youtu.be/cU3FshbeeFo), but I am not sure whether I still have to put winutils in HADOOP_HOME and set SPARK_HOME.

Please help me with your suggestions.

Comment: Please check this article: knowdimension.com/en/data/… – y.bedrov

2 Answers

0 votes

In order to build your package with Scala through Maven, you don't need to install Spark or winutils.exe: the Scala compiler plus the dependencies declared in the POM are enough to build an executable JAR. But if you want to run and test your code on your local machine, then they are needed (on Windows in particular, winutils.exe under HADOOP_HOME).
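For example, a minimal local-mode check could look like the sketch below (the object name and the Windows path are placeholders, and it assumes spark-sql is declared as a Maven dependency):

```scala
import org.apache.spark.sql.SparkSession

object LocalSmokeTest {
  def main(args: Array[String]): Unit = {
    // On Windows, Spark's local file handling goes through Hadoop's winutils.exe,
    // so HADOOP_HOME (or hadoop.home.dir) should point at a folder containing bin\winutils.exe.
    // System.setProperty("hadoop.home.dir", "C:\\hadoop")  // example path; not needed on Linux/macOS

    val spark = SparkSession.builder()
      .appName("local-smoke-test")
      .master("local[*]")          // run inside the IDE, no cluster or Spark install needed
      .getOrCreate()

    import spark.implicits._
    val df = Seq(("spark", 1), ("scala", 2)).toDF("word", "n")
    df.show()

    spark.stop()
  }
}
```

You can run this directly from IntelliJ; the Spark classes come from the Maven dependencies, so no separate Spark installation is required for the build itself.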

-1 votes

You can open IntelliJ and go to:

1) File > New > Project

2) Then click the checkbox "create from archetype"

3) Select the option "org.apache.maven.archetypes:maven-archetype-webapp" and click Next

4) Give the groupId and artifactId. Example: groupId = com.sample.project, artifactId = sampleproject,

and click Next

5) The next screen is the Maven settings screen; you can just select Next

6) Next it will ask for the project name: give any name and select Finish
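Once the project is generated, you would still add the Spark dependencies (for example org.apache.spark:spark-sql_2.12) and a Scala build plugin such as scala-maven-plugin to the generated pom.xml; those coordinates are my assumption, not part of the original answer. A tiny entry point like the sketch below (package and object names are placeholders) can then go into src/main/scala to verify that the build and a local run work:

```scala
package com.sample.project

import org.apache.spark.sql.SparkSession

object SampleJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sample-job")
      .master("local[*]")   // local master just to verify the setup; remove for cluster submits
      .getOrCreate()

    // Simple word count over an in-memory collection, so no input file is needed.
    val counts = spark.sparkContext
      .parallelize(Seq("maven scala spark", "scala spark"))
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    spark.stop()
  }
}
```

Building with `mvn package` then produces the JAR mentioned in the other answer, without any local Spark or Hadoop installation.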