Is there a standard or accepted best practice for how times should be displayed as `HH:MM` when the source time has `HH:MM:SS` precision? Should the seconds be truncated or rounded to the nearest minute?
Socially, if I looked at a digital clock and saw that it was 4:00:45, I would never tell someone it was 4:01. But I don't know whether that convention is universal, or whether it applies in computing too.
Also, rounding to the nearest minute can produce unexpected behavior, e.g. when the rounding pushes the value into the next hour or the next date. That doesn't necessarily apply to the particular use case we're dealing with today, but I can easily imagine another one: a list of "Sales in January" that includes a sale at 31-Jan 23:59:59, which would be displayed as 1-Feb 00:00 (see the sketch below).
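To make that edge case concrete, here is a minimal C# sketch contrasting the two behaviors at the month boundary. The class, the sample date, and the plain 30-second rounding cutoff are my own illustration, not our actual code, and the cutoff may not exactly match how SQL Server rounds:

```csharp
using System;

class RoundVsTruncate
{
    // Round to the nearest minute (illustrative 30-second cutoff).
    static DateTime RoundToMinute(DateTime value)
    {
        var truncated = new DateTime(value.Year, value.Month, value.Day,
                                     value.Hour, value.Minute, 0, value.Kind);
        return value.Second >= 30 ? truncated.AddMinutes(1) : truncated;
    }

    // Truncate to the minute: simply drop the seconds.
    static DateTime TruncateToMinute(DateTime value) =>
        new DateTime(value.Year, value.Month, value.Day,
                     value.Hour, value.Minute, 0, value.Kind);

    static void Main()
    {
        var sale = new DateTime(2024, 1, 31, 23, 59, 59);

        Console.WriteLine(RoundToMinute(sale).ToString("dd-MMM HH:mm"));    // 01-Feb 00:00
        Console.WriteLine(TruncateToMinute(sale).ToString("dd-MMM HH:mm")); // 31-Jan 23:59
    }
}
```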
If context is relevant to the answer: this use case is a SQL Server app that converts a `datetime` to a `smalldatetime`, which SQL Server rounds to the nearest minute. The result is displayed as `HH:MM` in a C# web application. The conversion happens in legacy code that we can't change right now, but we can force truncation of the seconds instead of rounding, along the lines of the sketch below. I'm just not sure whether we should.
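For what it's worth, the change we're considering amounts to something like this sketch. `SaveSale` is a hypothetical placeholder for the legacy code path we can't touch, and the names and sample date are assumptions for illustration:

```csharp
using System;

static class SecondsTruncation
{
    // Drop seconds (and sub-seconds) so the later datetime -> smalldatetime
    // conversion has nothing left to round.
    static DateTime TruncateToMinute(DateTime value) =>
        new DateTime(value.Year, value.Month, value.Day,
                     value.Hour, value.Minute, 0, value.Kind);

    // Hypothetical stand-in for the legacy call that stores the value in a
    // smalldatetime column; we can't change its internals, only what we pass in.
    static void SaveSale(DateTime soldAt) { /* legacy persistence */ }

    static void Main()
    {
        var soldAt = new DateTime(2024, 1, 31, 23, 59, 59);

        // Pass a value already at minute precision into the legacy path.
        SaveSale(TruncateToMinute(soldAt));

        // Display later as HH:MM only.
        Console.WriteLine(TruncateToMinute(soldAt).ToString("dd-MMM HH:mm")); // 31-Jan 23:59
    }
}
```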