I have installed Spark on Windows and I am trying to load a text file from the D: drive. The RDD gets created normally, but as soon as I perform any action on it I receive an error. I have tried every combination of slashes in the path, without success (illustrative variants of what I tried are listed at the end of this post).
scala> val file = sc.textFile("D:\\file\\file1.txt")
15/12/16 07:53:51 INFO MemoryStore: ensureFreeSpace(175321) called with curMem=401474, maxMem=280248975
15/12/16 07:53:51 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 171.2 KB, free 266.7 MB)
15/12/16 07:53:51 INFO MemoryStore: ensureFreeSpace(25432) called with curMem=576795, maxMem=280248975
15/12/16 07:53:51 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 24.8 KB, free 266.7 MB)
15/12/16 07:53:51 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:51963 (size: 24.8 KB, free: 267.2 MB)
15/12/16 07:53:51 INFO BlockManagerMaster: Updated info of block broadcast_2_piece0
15/12/16 07:53:51 INFO SparkContext: Created broadcast 2 from textFile at <console>:21
file: org.apache.spark.rdd.RDD[String] = D:\file\file1.txt MapPartitionsRDD[5] at textFile at <console>:21
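To rule out a simple typo in the path, a quick sanity check outside Spark (a minimal sketch using plain java.io.File, runnable in the same shell) is:

scala> new java.io.File("D:\\file\\file1.txt").exists

which returns true whenever the JVM itself can see the file at that path.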
So the RDD is created normally (textFile is lazy, so the path is only validated once an action runs), but when I call any action on it I receive the error below:
scala> file.count()
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/D:/file/file1.txt
        at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:285)
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
        at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:203)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1512)
        at org.apache.spark.rdd.RDD.count(RDD.scala:1006)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
        at $iwC$$iwC$$iwC.<init>(<console>:37)
        at $iwC$$iwC.<init>(<console>:39)
        at $iwC.<init>(<console>:41)
        at <init>(<console>:43)
        at .<init>(<console>:47)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
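For completeness, these are the kinds of slash variants I have tried (an illustrative sketch of my attempts; every one fails with the same InvalidInputException as soon as an action runs):

scala> sc.textFile("D:\\file\\file1.txt").count()   // escaped backslashes
scala> sc.textFile("""D:\file\file1.txt""").count() // raw triple-quoted string, same path at runtime
scala> sc.textFile("D:/file/file1.txt").count()     // forward slashes

Note that the first two evaluate to the same string, so the forward-slash form is the only genuinely different spelling. Is there a different path format that Spark on Windows expects?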