I'm trying to pass an argument to spark-shell. For example, I want today's date as a variable inside the Scala code.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("test").setMaster("local[*]")
val sc = new SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
// 'date' should come from outside the script, e.g. "2019-11-30"
val df = sqlContext.read.format("csv").load("./" + date + ".csv")
My test.scala looks like the above, and I need to get the variable 'date' from the terminal. The solution I've found is:
$ spark-shell -i <(echo val date = 2019-11-30 ; cat test.scala)
However, this doesn't work: spark-shell starts, but nothing gets executed after it launches. I'm new to Scala and have only used Python before. In Python I would do this with the argparse library, and I'd like the Scala code to take arguments in a similar way.
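For reference, this is the kind of argument handling I mean in Python (a minimal argparse sketch; the --date flag name is just for illustration):

```python
import argparse

# Define a required --date argument, as I would for any script parameter.
parser = argparse.ArgumentParser()
parser.add_argument("--date", required=True, help="date in YYYY-MM-DD form")

# Normally parse_args() reads sys.argv; an explicit list is shown here for clarity.
args = parser.parse_args(["--date", "2019-11-30"])
print(args.date)  # the script then uses args.date wherever needed
```

I'm looking for the spark-shell equivalent of getting that value into my Scala script.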
Thanks in advance. Also, I don't want to use sbt; I just want to use spark-shell.