I have created a Maven project to run a word-count Spark Scala program. When I create my SparkConf, it gives me the error "org.apache.spark.SparkConf does not have constructor", and similarly for SparkContext ("org.apache.spark.SparkContext has no constructor").
I have imported both SparkContext and SparkConf and am using the proper constructor form. This could be a Maven issue, but no error related to Maven appears.
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]): Unit = {
    // Build the Spark configuration and context for a local run
    val cf = new SparkConf().setAppName("WordCount").setMaster("local")
    val sc = new SparkContext(cf)

    // Read the input file, split lines into words, and count each word
    val rawData = sc.textFile("C:/Users/siddharth.shankar/Documents/input.txt")
    val words = rawData.flatMap(line => line.split(" "))
    val wordCount = words.map(word => (word, 1)).reduceByKey(_ + _)
    wordCount.foreach(println)
  }
}
Here is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.devinline.spark</groupId>
  <artifactId>SparkSample2</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>SparkSample Maven Webapp</name>
  <url>http://maven.apache.org</url>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.4.0</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-winutils</artifactId>
      <version>2.7.1</version>
    </dependency>
  </dependencies>

  <build>
    <finalName>SparkSample2</finalName>
  </build>
</project>
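In case it's relevant: the pom has no Scala dependency or Scala compiler plugin at all. Below is a minimal sketch of what I understand that setup usually looks like (a scala-library dependency matching the _2.11 suffix of spark-core, plus scala-maven-plugin; the exact versions here are my assumption, not something from my current pom):

<!-- Assumed addition, not in my current pom -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.11.12</version>
</dependency>

<build>
  <plugins>
    <plugin>
      <!-- Compiles Scala sources during the Maven build -->
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.4.4</version>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>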
I don't know what the issue is here, because if I run the same program as a regular Spark Scala (non-Maven) application, it runs without errors.
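For what it's worth, the word-count logic itself appears sound; for example, the equivalent snippet (my assumption of what the non-Maven run looks like) works when pasted into spark-shell, which supplies its own sc, so SparkConf and SparkContext are never constructed by hand there:

// Pasted into spark-shell; `sc` is provided by the shell itself
val rawData = sc.textFile("C:/Users/siddharth.shankar/Documents/input.txt")
val words = rawData.flatMap(line => line.split(" "))
val wordCount = words.map(word => (word, 1)).reduceByKey(_ + _)
wordCount.foreach(println)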