
I have a class named some_class in a Python file here:

/some-folder/app/bin/file.py

I am importing it into my code here:

/some-folder2/app/code/file2.py

like this:

import sys
sys.path.append('/some-folder/app/bin')  # make the directory importable on the driver
from file import some_class

clss = some_class()

I want to use this class's method some_function inside Spark's map:

sc.parallelize(some_data_iterator).map(lambda x: clss.some_function(x))

This is giving me an error:

No module named file

clss.some_function works when I call it outside PySpark's map function, i.e. normally, but not inside PySpark's RDD operations. I think this has something to do with PySpark. I have no idea where I am going wrong.

I tried broadcasting this class and it still didn't work.


1 Answer


All Python dependencies have to be either present on the search path of the worker nodes or distributed manually using the SparkContext.addPyFile method, so something like this should do the trick:

sc.addPyFile("/some-folder/app/bin/file.py")

It will copy the file to all the workers and place it in their working directory, which is on the workers' Python search path.
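Here is a minimal end-to-end sketch, assuming the paths from the question and that file.py defines some_class with a some_function method taking a single argument (the sample data is a placeholder):

from pyspark import SparkContext

sc = SparkContext(appName="addPyFile-example")  # or reuse an existing context

# Ship the module to every worker; Spark places it in each worker's
# working directory, which is on the workers' Python search path.
sc.addPyFile("/some-folder/app/bin/file.py")

from file import some_class  # the driver-side import works as before

clss = some_class()
some_data_iterator = range(10)  # placeholder data for illustration

# The lambda (and clss inside it) is pickled and shipped to the workers;
# unpickling now succeeds because the module can be imported there.
result = (sc.parallelize(some_data_iterator)
            .map(lambda x: clss.some_function(x))
            .collect())

As a side benefit, addPyFile also accepts .zip and .egg archives, which is handy when the dependency is a package rather than a single module.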

On a side note, please don't use file as a module name, even if it is only an example. Shadowing built-in names in Python is not a very good idea.