I'm trying to run a Java application on Spark (2.3.1). The problem is that every time I try to run it, Spark throws the message "Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources" after a few tries (on each of those tries, Spark adds and then removes an executor on the same worker, always on the same port). Does anyone have an idea of how to solve this?
I'm using the master on computer A and the worker on computer B. I set the driver on computer A with 3g of memory and the worker with 2g (the app doesn't require much memory), and 4 cores for the executor to use.
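For context, this is roughly how the job is configured (a minimal sketch; the app name and master URL are placeholders for my real values, and note that `spark.driver.memory` only takes effect if set before the driver JVM starts, e.g. via `spark-submit --driver-memory 3g`):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MyApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("MyApp")                  // placeholder app name
                .setMaster("spark://computerA:7077")  // standalone master on computer A
                .set("spark.executor.memory", "2g")   // must fit within the worker's 2g
                .set("spark.executor.cores", "4");    // 4 cores for the executor

        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... job logic ...
        sc.stop();
    }
}
```

With these settings, the executor request (2g, 4 cores) has to fit entirely within what the worker on computer B advertises to the master.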
I checked other similar questions and most of them turned out to be network or memory issues. I've ruled out a network issue because I can run other applications on the same worker.