0
votes

I have successfully configured the PySpark kernel in a Jupyter notebook, and I have also installed SparkMagic. When I try to run the command below:

%%sql
SELECT DepDelay, ArrDelay FROM flightData

it starts working, and then Spark suddenly stops with the error below:

An error was encountered: Invalid status code '400' from http://localhost:8998/sessions/0/statements/4 with error payload: {"msg":"requirement failed: Session isn't active."}

You can download the full log file and take a look here: https://drive.google.com/open?id=1lvYqQBUCiIFp4lz3aVnzMgBNd9fzqJiz
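For what it's worth, the URL in the error message points at the Livy server that SparkMagic talks to, so I believe the session state can be checked by querying Livy's REST API directly. A minimal sketch, assuming the default localhost:8998 endpoint shown in the error message (a state of dead or error would match the "Session isn't active" message):

import requests

# Livy endpoint taken from the error message above (SparkMagic's default)
LIVY_URL = "http://localhost:8998"

# List all Livy sessions and print their states (e.g. starting, idle, busy, dead, error)
resp = requests.get(f"{LIVY_URL}/sessions")
resp.raise_for_status()

for session in resp.json().get("sessions", []):
    print(f"session {session['id']}: state={session['state']}")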

I appreciate your help. Many thanks in advance.

1
Before running this line, are you sure that a connection is established? If not, try spark.applicationID. - Sarath Chandra Vema
Everything else works, for example the next lines, so I guess the application is running: data.createOrReplaceTempView("flightData") and spark.sql("SELECT DayOfWeek, AVG(ArrDelay) AS AvgDelay FROM flightData GROUP BY DayOfWeek ORDER BY DayOfWeek").show() - M. Wadi

1 Answer

0
votes

Try it on a single line, like this:

%sql SELECT DepDelay, ArrDelay FROM flightData
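The "Session isn't active" message usually means the Livy session behind the notebook has timed out or died, so the magic will keep failing until the session is recreated. With the SparkMagic wrapper kernels the session can be reset from the notebook itself; a sketch using SparkMagic's standard %%cleanup and %%configure magics (the configuration values are placeholders, not recommendations):

%%cleanup -f

Then re-running a cell should start a new session, or the session can be restarted with an explicit configuration:

%%configure -f
{"driverMemory": "2g", "executorCores": 2}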