
I read in this thread - Get most accurate image using OpenCV - that I can use variance to measure which of the input images is the sharpest. I can't seem to find a tutorial for this, and I am very new to OpenCV. Right now, my code scans images from a folder and stores them in a vector<Mat>:

for (size_t ct = 0; ct < images.size(); ct++) {
    // should I put the cvAvgSdv function here?
    waitKey(0);
}

Thank you for any help!

Update: I called this function: cvAvgSdv(images[ct], &scalar_mean, &std_dev); and it gave me an error: No suitable conversion function from "cv::Mat" to "const CvArr *" exists.

Can I use the function without converting the Mat to an IplImage? If not, what's the easiest way to convert the Mat?

Comments:

you could try cvAvgSdv(images[ct]->imageData, &scalar_mean, &std_dev); – slggamer

you could try cvCreateImage; this will help you convert the Mat to an IplImage. Sorry for the last answer. – slggamer

1 Answer


Yes, it is. You should calculate it like this:

CvScalar mean, std_dev;
cvAvgSdv(img, &mean, &std_dev, NULL);  // NULL = no mask; the variance is std_dev.val[0] squared
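
As for your error: cvAvgSdv belongs to the old C API and expects a CvArr* (such as an IplImage*), which is why passing a cv::Mat directly fails. In OpenCV 2.x a Mat converts to an IplImage header without copying the pixel data:

IplImage ipl = images[ct];              // shares data with the Mat, no copy
cvAvgSdv(&ipl, &mean, &std_dev, NULL);

If you can use the C++ API instead, cv::meanStdDev works on cv::Mat directly and avoids the conversion altogether. Here is a sketch of the whole sharpness comparison; the file names are placeholders for your own folder scan, and the cv::COLOR_BGR2GRAY constant assumes OpenCV 3.x naming (use CV_BGR2GRAY on 2.x):

#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>
#include <vector>

// Rough sharpness score: variance of the grayscale intensities.
// A sharper image tends to have more contrast, hence higher variance.
static double sharpness(const cv::Mat& img)
{
    cv::Mat gray;
    if (img.channels() > 1)
        cv::cvtColor(img, gray, cv::COLOR_BGR2GRAY);
    else
        gray = img;

    cv::Scalar mean, stdDev;
    cv::meanStdDev(gray, mean, stdDev);
    return stdDev[0] * stdDev[0];       // variance = stddev squared
}

int main()
{
    // Placeholder file names -- replace with your own folder scan.
    std::vector<std::string> files = { "img0.png", "img1.png", "img2.png" };

    int best = -1;
    double bestScore = -1.0;
    for (size_t ct = 0; ct < files.size(); ct++) {
        cv::Mat img = cv::imread(files[ct]);
        if (img.empty())
            continue;                   // skip unreadable files

        double score = sharpness(img);
        std::cout << files[ct] << ": " << score << std::endl;
        if (score > bestScore) {
            bestScore = score;
            best = (int)ct;
        }
    }

    if (best >= 0)
        std::cout << "sharpest: " << files[best] << std::endl;
    return 0;
}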