Scala is not binary compatible between major releases (for example, 2.10 and 2.11). If you have Scala code that you will use with Spark, and that code is compiled against a particular major version of Scala (say 2.10), then you will need a build of Spark compiled against the same Scala version. For example, if you are writing Spark 1.4.1 code in Scala and compiling it with Scala 2.11.4, then you should use Spark 1.4.1_2.11.
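As a sketch of how this pairing is typically expressed in an sbt build, the `%%` operator appends the Scala binary version suffix (here `_2.11`) to the artifact name, so the Spark dependency resolves to the build compiled for your Scala version; the exact versions below follow the example in the text:

```scala
// build.sbt (sketch): pin the Scala major version, and let sbt's %%
// operator select the matching Spark artifact (spark-core_2.11).
scalaVersion := "2.11.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
```

Using `%%` rather than hard-coding `spark-core_2.11` means the suffix stays in sync automatically if `scalaVersion` changes.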
If you are not using Scala code, then there should be no functional difference between Spark 1.4.1_2.10 and Spark 1.4.1_2.11 (if there is, it is most likely a bug). The only difference should be the version of the Scala compiler used to build Spark and its libraries.