I'm implementing a flood fill algorithm for the project I'm currently working on. I'm using it for the usual purpose: image editing. I have no problem with the basic algorithm, but I'd like a better-looking fill.
In many cases, areas of my image will have regions that are mostly one color but are bordered by pixels which are slightly lighter or darker. I'd like to know an algorithm for a "fuzzy" flood fill that won't leave these border pixels behind. I've tried filling all pixels within a threshold of the origin pixel, using two different, simple distance metrics:
- the Manhattan distance over the three color components (red, green, and blue)
- the maximum of the per-component distances, i.e. the largest absolute difference in any single component.
Neither of these does the trick, often leaving borders and occasionally filling adjacent regions of a visually distinct but "close" color.
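For concreteness, what I have now looks roughly like the sketch below (Python, assuming the image is a mutable 2D list of (r, g, b) tuples; the function and variable names are just for illustration):

```python
from collections import deque

def manhattan(c1, c2):
    # Sum of absolute differences over the R, G, B components.
    return sum(abs(a - b) for a, b in zip(c1, c2))

def max_component(c1, c2):
    # Largest absolute difference in any single R, G, or B component.
    return max(abs(a - b) for a, b in zip(c1, c2))

def fuzzy_flood_fill(pixels, width, height, x0, y0, fill_color, metric, tolerance):
    """Fill every 4-connected pixel whose color is within `tolerance`
    of the origin pixel's color under `metric`.

    `pixels` is assumed to be a 2D list of (r, g, b) tuples, indexed as
    pixels[y][x]."""
    origin = pixels[y0][x0]
    seen = {(x0, y0)}
    queue = deque([(x0, y0)])
    while queue:
        x, y = queue.popleft()
        pixels[y][x] = fill_color
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height and (nx, ny) not in seen:
                if metric(pixels[ny][nx], origin) <= tolerance:
                    seen.add((nx, ny))
                    queue.append((nx, ny))

# Example call: fill with red using the Manhattan metric and a tolerance of 30.
# fuzzy_flood_fill(img, w, h, 10, 10, (255, 0, 0), manhattan, 30)
```

Both versions compare each candidate pixel only against the origin pixel's color, with a single global tolerance.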
I don't think there is a magic bullet to solve my problem, but I'd be interested in knowing any algorithms I might try to get a better result, or even where I might usefully look to find such algorithms. Looking around the net I've found a reference to something called the "fuzzy flood fill mean shift algorithm", but I'm not sure that's even the same thing.