I am using OpenCV for Python (the cv2 library). I use the following function to compute the histogram of an image im_converted:
hist = cv2.calcHist([im_converted], channels, None, histSize, ranges)
where im_converted is loaded as a numpy array of type uint8.
hist seems to be forced to be a numpy array of type float32. A problem arises when I use the back-projection function (note: I normalize the histogram so that np.sum(hist) == 1):
backProj = cv2.calcBackProject([im_converted], channels, hist, ranges, scale)
backProj is forced to be a uint8 numpy array, and I observe:
- if scale=1, then backProj is all zeros;
- if scale=255, then backProj is non-zero, but the values are very small.
My question is: given the difference between the types, what scale factor should be applied? Is there a way to change the types? (Note: I tried hist = np.zeros(histSize, dtype=np.uint8), but this was unsuccessful; I still got a float32 histogram in the end.)