
I'm looking to get more details on whether these properties apply to the whole cluster, to each node, or a combination of the two. For example, is spark:spark.executor.instances=2 a property for the whole cluster or for a specific node?


The properties are configured on every node of the cluster. You can find them in /etc/spark. - Dagang
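If you want to confirm a property's value on a running node, one way is to SSH in and grep the Spark config file. This is a sketch, assuming a cluster whose master node is named my-cluster-m in zone us-central1-a (both placeholders):

```shell
# Check the effective value of a spark: prefixed property on the master node.
# Cluster name and zone below are hypothetical; substitute your own.
gcloud compute ssh my-cluster-m --zone=us-central1-a \
  --command="grep spark.executor.instances /etc/spark/conf/spark-defaults.conf"
```

Since the properties are set identically on every node, checking any single node is representative.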

1 Answer


As Dagang says, the various Dataproc properties are set identically on every node of the cluster. The main config files you can find on each node, corresponding to the spark:, core:, distcp:, hdfs:, mapred:, and yarn: prefixes, are (in addition to others for optional components):

/etc/spark/conf/spark-defaults.conf
/etc/hadoop/conf/core-site.xml
/etc/hadoop/conf/distcp-default.xml
/etc/hadoop/conf/hdfs-site.xml
/etc/hadoop/conf/mapred-site.xml
/etc/hadoop/conf/yarn-site.xml

Other files can be found in similar directories.
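These prefixed properties are typically supplied when the cluster is created, and Dataproc then writes them into the corresponding config file on every node. A minimal sketch, assuming a hypothetical cluster name and region:

```shell
# Set a Spark property cluster-wide at creation time.
# "my-cluster" and the region are placeholders; adjust for your project.
gcloud dataproc clusters create my-cluster \
  --region=us-central1 \
  --properties=spark:spark.executor.instances=2
```

Here the spark: prefix routes the setting into /etc/spark/conf/spark-defaults.conf; a core:, hdfs:, mapred:, or yarn: prefix would route it into the matching Hadoop file listed above instead.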