Background
I'm developing an Android game to which I'm adding a 'speed' mode. Here, instead of chasing a high score, players try to complete a task in the fastest time possible - the shorter the time taken, the higher up the leaderboard they place.
The issue
Now, this is a game written with OpenGL ES 2.0. It runs at 60 ticks per second (game logic updates) and renders as fast as the device can handle, using a fixed delta time (1f/Ticks_Per_Second).
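To make the setup concrete, here is a simplified, standalone sketch of the loop structure I mean (this is not my actual Android/OpenGL code; GameLoop, update() and render() are just placeholder names):

    public class GameLoop {

        public static final int TICKS_PER_SECOND = 60;
        public static final float DELTA_TIME = 1f / TICKS_PER_SECOND; // fixed delta

        private volatile boolean running = true;

        public void run() {
            final long tickIntervalNanos = 1_000_000_000L / TICKS_PER_SECOND;
            long nextTick = System.nanoTime();

            while (running) {
                // Catch up on any logic ticks that are due (60 per second)
                while (System.nanoTime() >= nextTick) {
                    update(DELTA_TIME);
                    nextTick += tickIntervalNanos;
                }
                render(); // runs as fast as the device allows
            }
        }

        private void update(float deltaTime) { /* game logic goes here */ }

        private void render() { /* draw the current state */ }
    }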
Basically, each game tick happens every ~16.67 ms, but I need to count and display centiseconds (10 ms) - how?
If I hold a variable and say something like:
    timePassed++; // increase this each tick
Then after 1 second (60 ticks) I would only have counted to 60. I can get around this like so:
    timePassed += 1.666666666666667;
Now after 1 second, timePassed = 100 (which is ultimately what I want). I can then get the actual time in seconds by dividing by 100 and keeping two decimal places (or just treat the value as hundredths of a second, with separate variables for seconds and minutes). Every tick, I would also render this change.
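Put together, the counting approach would look roughly like this (just a sketch; SpeedTimer and formatTime() are made-up names, and mm:ss.cc is only one way to display the value):

    public class SpeedTimer {

        // 100 centiseconds per second spread over 60 ticks = ~1.6667 per tick
        private static final float CENTISECONDS_PER_TICK = 100f / 60f;

        private float timePassed = 0f;

        // Call once per game tick (60 times per second)
        public void tick() {
            timePassed += CENTISECONDS_PER_TICK;
        }

        // Build an mm:ss.cc string; call from the render pass
        public String formatTime() {
            int totalCentis = (int) timePassed;
            int minutes = totalCentis / 6000;
            int seconds = (totalCentis / 100) % 60;
            int centis = totalCentis % 100;
            return String.format("%02d:%02d.%02d", minutes, seconds, centis);
        }
    }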
The issue with this is that I would always be 'skipping' some values, because I'm only updating 60 times per second - not 100.
System timer
I could also grab the time in milliseconds or nanoseconds when the run starts, grab it again when the game ends, and work out the difference. However, I foresee a similar issue, because I'm only querying the difference every 60th of a second.
Similarly, I could run the timer on another thread, but again, I would only ever be querying it at multiples of 1/60th of a second.
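For completeness, the system-timer idea would look something like this (again just a sketch; ElapsedTimer and its method names are invented):

    public class ElapsedTimer {

        private long startNanos;

        // Call when the speed run begins
        public void start() {
            startNanos = System.nanoTime();
        }

        // The elapsed value is exact whenever it's read, even though it's
        // only read (and drawn) once per frame, i.e. every ~16.7 ms
        public long elapsedCentiseconds() {
            return (System.nanoTime() - startNanos) / 10_000_000L;
        }
    }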
I hope I've explained the issue well enough; if any more info is required, please ask.