spark.speculation defaults to false. If you set it to "true", Spark performs speculative execution of tasks: when one or more tasks in a stage are running slowly, duplicate copies of those tasks are re-launched on other executors.

http://spark.apache.org/docs/latest/configuration.html
You can add these flags to your spark-submit call, passing them with --conf, e.g.:
spark-submit \
--conf "spark.speculation=true" \
--conf "spark.speculation.multiplier=5" \
--conf "spark.speculation.quantile=0.90" \
--class "org.asyncified.myClass" "path/to/Vaquarkhanjar.jar"
Note:
When managing a very large number of tasks, the driver can spend a lot of time on speculation, so enable it only if needed.