I packaged my Python code in .egg format using setuptools as the build tool. I want to run this package through a job in Azure Databricks.
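For context, the packaging is a plain setuptools build, run with `python setup.py bdist_egg`. A minimal sketch of the setup.py I use (the package name `hello` and version `1.0` match the egg file name below; everything else is simplified):

```python
# setup.py - minimal setuptools configuration used to build the .egg
from setuptools import setup, find_packages

setup(
    name="hello",
    version="1.0",
    packages=find_packages(),  # picks up the hello/ package containing pi.py
)
```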
I am able to execute the package on my local machine with the following command:
spark-submit --py-files ./dist/hello-1.0-py3.6.egg hello/pi.py
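The command is run from the project root, where dist/ and the hello/ package live. hello/pi.py itself is just the usual Pi-estimation entry point, roughly like this (exact contents simplified):

```python
# hello/pi.py - entry point passed to spark-submit; estimates Pi by random sampling
import random
from operator import add

from pyspark.sql import SparkSession

if __name__ == "__main__":
    spark = SparkSession.builder.appName("PythonPi").getOrCreate()
    n = 100000

    def inside(_):
        # Draw a random point in the unit square and check whether it
        # falls inside the quarter circle
        x, y = random.random(), random.random()
        return 1 if x * x + y * y <= 1 else 0

    count = spark.sparkContext.parallelize(range(n)).map(inside).reduce(add)
    print("Pi is roughly %f" % (4.0 * count / n))
    spark.stop()
```

To run this on Databricks, I did the following: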
1) Copied the package to DBFS as follows:
Workspace -> User -> Create -> Library -> Library Source (DBFS) -> Library Type (Python Egg) -> uploaded the egg
2) Created a job with a spark-submit task running on a new cluster
3) Configured the following parameters for the task (a rough Jobs API equivalent of the same job is sketched below):
["--py-files","dbfs:/FileStore/jars/8c1231610de06d96-hello_1_0_py3_6-70b16.egg","hello/pi.py"]
Actual: /databricks/python/bin/python: can't open file '/databricks/driver/hello/hello.py': [Errno 2] No such file or directory
Expected: Job should execute successfully.