Does anyone have a decent algorithm for calculating axis minima and maxima?
When creating a chart for a given set of data items, I'd like to be able to give the algorithm:
- the maximum (y) value in the set
- the minimum (y) value in the set
- the number of tick marks to appear on the axis
- an optional value that must appear as a tick (e.g. zero when showing +ve and -ve values)
The algorithm should return (see the signature sketch below):
- the largest axis value
- the smallest axis value (although that could be inferred from the largest value, the interval size, and the number of ticks)
- the interval size
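To make the inputs and outputs above concrete, the call I'm imagining is shaped roughly like this (all names are purely illustrative, not an existing API):

```csharp
// Illustrative only: the shape of the routine I'm after, not an existing API.
using System;

public record AxisScale(double AxisMin, double AxisMax, double TickInterval);

public static class AxisScaler
{
    public static AxisScale Calculate(
        double dataMin,              // smallest y value in the data set
        double dataMax,              // largest y value in the data set
        int tickCount,               // number of tick marks to appear on the axis
        double? requiredTick = null) // optional value that must fall on a tick, e.g. 0
    {
        // ...the algorithm I'm asking for goes here...
        throw new NotImplementedException();
    }
}
```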
The ticks should be at a regular interval of a "reasonable" size (e.g. 1, 3, 5, possibly even 2.5, but nothing with more significant figures than that).
The presence of the optional value will skew this, but without that value the largest item should appear between the top two tick marks, and the lowest item between the bottom two.
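For the "reasonable" interval part, I picture snapping the raw interval (range divided by number of ticks) up to a small set of round multipliers times a power of ten. A rough, untested sketch of what I mean, not taken from any library:

```csharp
using System;

static class NiceTicks
{
    // Round a raw interval up to a "nice" value: a round multiplier times a power of ten.
    // The multiplier set is a matter of taste (it could include 3, drop 2.5, etc.).
    public static double NiceInterval(double rawInterval)
    {
        double[] niceMultipliers = { 1, 2, 2.5, 5, 10 };

        double magnitude = Math.Pow(10, Math.Floor(Math.Log10(rawInterval)));
        double residual = rawInterval / magnitude; // now in the range [1, 10)

        foreach (double m in niceMultipliers)
            if (residual <= m)
                return m * magnitude;

        return 10 * magnitude; // not reached with the multipliers above
    }
}
// e.g. NiceInterval(0.7) -> 1, NiceInterval(17) -> 20, NiceInterval(130) -> 200
```

The axis min and max would then presumably be multiples of that interval that bracket the data (and pass through the required tick value when one is given).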
This is a language-agnostic question, but if there's a C#/.NET library around, that would be smashing ;)