
I have a logistic regression model in Spark.
I want to extract the probability for label=1 from the output vector and calculate the areaUnderROC.

val assembler = new VectorAssembler()
.setInputCols(Array("A","B","C","D","E")) // for example
.setOutputCol("features")

val data = assembler.transform(logregdata)

val Array(training,test) = data.randomSplit(Array(0.7,0.3),seed=12345)
val training1 = training.select("label", "features")
val test1 = test.select("label", "features")

val lr = new LogisticRegression()
val model = lr.fit(training1)
val results = model.transform(test1)
results.show()

+-----+--------------------+--------------------+--------------------+----------+
|label|            features|       rawPrediction|         probability|prediction|
+-----+--------------------+--------------------+--------------------+----------+
|  0.0|(54,[13,31,34,35,...|[2.44227333947447...|[0.91999457581425...|       0.0|
+-----+--------------------+--------------------+--------------------+----------+

import org.apache.spark.mllib.evaluation.MulticlassMetrics

val predictionAndLabels = results.select($"probability", $"label").as[(Double, Double)].rdd
val metrics = new MulticlassMetrics(predictionAndLabels)
val auROC= metrics.areaUnderROC()

The probability looks like that: [0.9199945758142595,0.0800054241857405]
How can I extract the probability for label=1 from the vector and calculate the AUC?

I don't understand the question. Isn't that what areaUnderROC would calculate by default? – jamborta
It's supposed to. In Python the same model returns AUC = 91%, but in Spark AUC = 73%. I want to check it manually. How can I extract the probability value from the vector? – Lili

1 Answer


You could get the value from the underlying RDD. This returns a tuple with your original label and the predicted value for P(label=1), which is the second element of the probability vector (index 0 holds P(label=0)):

import org.apache.spark.ml.linalg.Vector

val predictions = results.select("label", "probability").rdd
  .map(row => (row.getAs[Double]("label"), row.getAs[Vector]("probability")(1)))
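From there you can feed the (score, label) pairs into BinaryClassificationMetrics, which (unlike MulticlassMetrics) exposes areaUnderROC. A minimal self-contained sketch in local mode, using toy hand-written scores rather than the model output above:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics

val spark = SparkSession.builder()
  .master("local[1]")
  .appName("auc-sketch")
  .getOrCreate()

// (score, label) pairs: score plays the role of the extracted P(label=1),
// label is the true label. Toy data for illustration only.
val scoreAndLabels = spark.sparkContext.parallelize(Seq(
  (0.9, 1.0), (0.8, 1.0), (0.4, 0.0), (0.2, 0.0)
))

val metrics = new BinaryClassificationMetrics(scoreAndLabels)
val auROC = metrics.areaUnderROC()
println(s"areaUnderROC = $auROC")

spark.stop()
```

With your actual data you would pass `predictions` (the RDD built above) to the BinaryClassificationMetrics constructor instead of the toy RDD.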