Assume I have a UISlider representing a continuous range of values from -1 to 2. If I leave the defaults min=0, max=1, then my value of 0 corresponds to the thumb at 1/3 of travel and a float value of 1/3 (0.33333333…). I am particularly interested in the special value 0, and representing it as 0.333333, which has to be (slightly) rounded, feels wrong. If I instead set the minimum to -1 and the maximum to 2, then my value of 0 is the float value 0.0 exactly.
Would setting the min/max this way give me more accuracy? Does the thumb move on pixel boundaries? Maybe I could use that information to try some rounding approaches.
Comments:

Did you try slider.minimumValue = -1.0; and slider.maximumValue = 2.0;? Then set slider.value = 0;?

slider.value should still be a float – mkral