I am trying to add an external package in Jupyter on Azure Spark:
%%configure -f
{ "packages" : [ "com.microsoft.azure:spark-streaming-eventhubs_2.11:2.0.4" ] }
Its output:
Current session configs: {u'kind': 'spark', u'packages': [u'com.microsoft.azure:spark-streaming-eventhubs_2.11:2.0.4']}
But when I tried to import the package:
import org.apache.spark.streaming.eventhubs.EventHubsUtils
I got an error:
The code failed because of a fatal error: Invalid status code '400' from http://an0-o365au.zdziktedd3sexguo45qd4z4qhg.xx.internal.cloudapp.net:8998/sessions with error payload: "Unrecognized field \"packages\" (class com.cloudera.livy.server.interactive.CreateInteractiveRequest), not marked as ignorable (15 known properties: \"executorCores\", \"conf\", \"driverMemory\", \"name\", \"driverCores\", \"pyFiles\", \"archives\", \"queue\", \"kind\", \"executorMemory\", \"files\", \"jars\", \"proxyUser\", \"numExecutors\", \"heartbeatTimeoutInSecond\" [truncated]])\n at [Source: HttpInputOverHTTP@5bea54d; line: 1, column: 32] (through reference chain: com.cloudera.livy.server.interactive.CreateInteractiveRequest[\"packages\"])".
Some things to try: a) Make sure Spark has enough available resources for Jupyter to create a Spark context. For instructions on how to assign resources see http://go.microsoft.com/fwlink/?LinkId=717038 b) Contact your cluster administrator to make sure the Spark magics library is configured correctly.
I also tried:
%%configure
{ "conf": {"spark.jars.packages": "com.microsoft.azure:spark-streaming-eventhubs_2.11:2.0.4" }}
I got the same error.
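Since the error above does list "jars" among Livy's recognized fields, one workaround I am considering is uploading the jar to the cluster's default storage and pointing the session at it directly. This is only a sketch; the wasb:// path is a placeholder for wherever the jar would actually be uploaded:

%%configure -f
{ "jars": ["wasb:///example/jars/spark-streaming-eventhubs_2.11-2.0.4.jar"] }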
Could someone point me to the correct way to use an external package in Jupyter on Azure Spark?
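For reference, here is roughly what I want to run once the package loads. It is only a sketch: every value in the parameter map is a placeholder, and I am assuming the createUnionStream entry point shown in the library's samples.

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.eventhubs.EventHubsUtils

// Placeholder Event Hubs connection settings -- none of these are real values.
val eventhubsParams = Map[String, String](
  "eventhubs.namespace"       -> "<namespace>",
  "eventhubs.name"            -> "<eventhub-name>",
  "eventhubs.policyname"      -> "<policy-name>",
  "eventhubs.policykey"       -> "<policy-key>",
  "eventhubs.partition.count" -> "4",
  "eventhubs.consumergroup"   -> "$Default"
)

// Build a streaming context from the notebook's SparkContext (sc) and
// read from all Event Hubs partitions as one union stream.
val ssc = new StreamingContext(sc, Seconds(10))
val stream = EventHubsUtils.createUnionStream(ssc, eventhubsParams)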