I am trying to understand what happens when we run the collectAsMap() function in Spark. The PySpark docs say:
collectAsMap(self) Return the key-value pairs in this RDD to the master as a dictionary.
and the core Spark docs say:
def collectAsMap(): Map[K, V] Return the key-value pairs in this RDD to the master as a Map.
When I try to run sample code in PySpark on a list of pairs, the dictionary that comes back is missing some of them.
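For example, a minimal sketch with made-up data that reproduces what I am seeing (note the duplicate key 1):

```python
from pyspark import SparkContext

sc = SparkContext("local", "collectAsMap-demo")

# Hypothetical pairs; two of them share the key 1.
pairs = [(1, 'a'), (1, 'b'), (2, 'c')]
rdd = sc.parallelize(pairs)

# A Python dict can hold only one value per key, so one of the
# duplicate-key pairs is silently dropped, e.g. {1: 'b', 2: 'c'}.
print(rdd.collectAsMap())
```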
and in Scala I see the same behavior.
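A comparable sketch in spark-shell (again with hypothetical pairs, using the `sc` the shell provides):

```scala
// Two of the three pairs share the key 1.
val rdd = sc.parallelize(Seq((1, "a"), (1, "b"), (2, "c")))

// Only one value per duplicate key survives, e.g. Map(2 -> c, 1 -> b);
// which of the duplicate values is kept is not guaranteed.
rdd.collectAsMap()
```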
I am a little confused about why it is not returning all the elements in the list. Can somebody help me understand what is happening here and why I am getting only some of the pairs back?
Thanks.
Unlike collectAsMap, you'll see all of the original pairs are preserved in an array or list if you simply use collect. – obataku
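A minimal sketch of the contrast the comment describes (same made-up pairs as above, reusing the SparkContext `sc` from the earlier sketch):

```python
rdd = sc.parallelize([(1, 'a'), (1, 'b'), (2, 'c')])

print(rdd.collect())       # [(1, 'a'), (1, 'b'), (2, 'c')] -- every pair preserved
print(rdd.collectAsMap())  # e.g. {1: 'b', 2: 'c'}          -- duplicate keys collapsed
```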