
I am trying to find a way to submit a Python script to the Spark server through the Livy API (or client). I have tried the following.

  1. curl -X POST --data '{"file": "/user/test/pi.py"}' -H "Content-Type: application/json" localhost:8998/batches

However, the logs show a "file not found" error, because Livy resolves the path on the server, not on the machine issuing the request.
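For reference, the same POST can be issued from Python with only the standard library. This is a minimal sketch, assuming Livy listens on `localhost:8998`; the key point is that the `"file"` path is resolved by the Livy server, so it should be an `hdfs://` URI or a path that exists on the server's own filesystem:

```python
import json
from urllib import request

LIVY_URL = "http://localhost:8998"  # assumed Livy endpoint


def build_batch_payload(file_path, args=None):
    """Build the JSON body for POST /batches.

    file_path is resolved by the Livy SERVER, so it should be an
    hdfs:// URI or a path present on the server's filesystem.
    """
    payload = {"file": file_path}
    if args:
        payload["args"] = args
    return payload


def submit_batch(file_path, args=None):
    """POST the payload to /batches and return Livy's JSON response."""
    body = json.dumps(build_batch_payload(file_path, args)).encode("utf-8")
    req = request.Request(
        f"{LIVY_URL}/batches",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)  # includes the batch id and state
```

Calling `submit_batch("hdfs:///user/test/pi.py")` (a hypothetical HDFS location) should return a JSON object whose `id` field can be used to query the batch afterwards.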

  2. Using the Livy Python client:

r = client.upload_pyfile("/tmp/code/test.py")

This returns a future object, but no batch is created (I am not even sure which path the file is uploaded to).

Basically, what I want is:

  1. To upload the file to the Spark server through the Livy API.

  2. To submit a batch / trigger a run using Livy.

Did you find a solution? :) – Roelant
No luck so far. – shubham

1 Answer


Using the Livy Java client, I was able to do this: I uploaded the jar to the Livy server and submitted Spark jobs.

You can refer to the programmatic API documentation: https://livy.incubator.apache.org/docs/latest/programmatic-api.html

Points to note:

  1. Wait for the Livy client to finish uploading the jar/file successfully before submitting.

  2. The jar/file must be available on the Livy server itself, or in an HDFS location that the Livy server can access.
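Following the two points above, the batch flow can also be driven purely over the REST API: once the file is in HDFS, submit the batch and then poll GET /batches/{id} until it reaches a terminal state. This is a hedged sketch (not the Java client code from this answer); the endpoint URL and the terminal state names are assumptions based on the Livy REST API and may need adjusting for your Livy version:

```python
import json
import time
from urllib import request

LIVY_URL = "http://localhost:8998"  # assumed Livy endpoint


def batch_url(batch_id):
    """URL for inspecting one batch: GET /batches/{id}."""
    return f"{LIVY_URL}/batches/{batch_id}"


def wait_for_batch(batch_id, poll_seconds=5):
    """Poll the batch until Livy reports a terminal state.

    The terminal state names below are assumptions taken from the
    Livy REST API docs; adjust them for your Livy version.
    """
    terminal = {"success", "dead", "killed"}
    while True:
        with request.urlopen(batch_url(batch_id)) as resp:
            state = json.load(resp)["state"]
        if state in terminal:
            return state
        time.sleep(poll_seconds)
```

After submitting a batch and receiving its `id`, `wait_for_batch(id)` blocks until the job finishes and returns the final state, so you can check for `"success"` before moving on.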