
Background

I'm developing an Android game to which I'm adding a 'speed' mode: instead of a high score, players will be trying to complete a task in the fastest time possible - the shorter the time taken, the higher up the leaderboard they will be.

The issue

Now, this is a game using OpenGL ES 2.0 - it runs at 60 ticks per second (game logic updates) and renders as fast as the device can handle. It uses a fixed delta time (1f / Ticks_Per_Second).

Basically, each game tick happens every 16.66ms but I need to count and display centiseconds (10ms) - how?

If I hold a variable and say something like:

timePassed++; //Increase this each tick

Then after 1 second (60 ticks) I would only have counted to 60. I can get around this like so:

timePassed += 1.666666666666667; // 100.0 / 60 centiseconds per tick

Now after 1 second, timePassed = 100 (which is ultimately what I want). I can then get the actual time in seconds by dividing by 100 and truncating to 2 decimal places (or just treat this as hundredths of a second and use separate variables for seconds and minutes). Every tick, I would also render this change.
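To make the skipping concrete, here is a minimal sketch of the fixed-increment approach described above (class and method names are mine, not from any real project). After two ticks the truncated display jumps from 1 straight to 3, so the value 2 is never shown:

```java
// Sketch of the fixed-increment tick counter (names are mine).
class TickTimer {
    // 100 centiseconds per second, spread over 60 ticks.
    private static final double INCREMENT = 100.0 / 60.0;
    private double timePassed = 0.0; // accumulated centiseconds

    void tick() { timePassed += INCREMENT; } // call once per game tick

    int centiseconds() { return (int) timePassed; } // truncated for display
}

public class Main {
    public static void main(String[] args) {
        TickTimer t = new TickTimer();
        t.tick(); // timePassed = 1.666..., displays 1
        t.tick(); // timePassed = 3.333..., displays 3 - the value 2 is skipped
        System.out.println(t.centiseconds()); // prints 3
    }
}
```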

The issue with this is that I would always be 'skipping' some values, because I'm only updating 60 times per second - not 100.

System timer

I could also grab the time in milliseconds or nanoseconds, then when the game ends, grab it again and work out the difference. However, I foresee a similar issue, because I'm only querying the difference every 1/60th of a second.

Similarly, I could run the timer on another thread, but again, I would only ever be querying it at multiples of 1/60th of a second.
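One way around the quantisation is to keep the displayed value and the recorded score separate: the on-screen timer can only change at tick boundaries, but the leaderboard time can be computed exactly from start/stop timestamps. A minimal sketch under that assumption (names are mine; timestamps are passed in so the class is easy to test, e.g. from System.nanoTime()):

```java
// Sketch: timestamp start and finish so the score is exact,
// regardless of the 60 Hz tick rate. Names are mine.
class Stopwatch {
    private long startNanos;
    private long stopNanos;

    void start(long nowNanos) { startNanos = nowNanos; } // e.g. System.nanoTime()
    void stop(long nowNanos)  { stopNanos  = nowNanos; }

    // Exact elapsed time in centiseconds (1 cs = 10 ms = 1e7 ns).
    long elapsedCentiseconds() { return (stopNanos - startNanos) / 10_000_000L; }
}

public class Main {
    public static void main(String[] args) {
        Stopwatch s = new Stopwatch();
        s.start(0L);
        s.stop(1_345_645_343L); // 1.345645343 s, as in the comments below
        System.out.println(s.elapsedCentiseconds()); // prints 134
    }
}
```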

Hope I've explained the issue well enough, if any more info is required, please ask.

I don't quite understand - if the game duration was 1.345645343 seconds, isn't that the time for the leaderboard? Why does it have to consider 'ticks'? Apologies if I've overlooked something you've said. - brandall
@brandall Hi - let's say (just to keep things super-simple) someone takes 2/100ths of a second to complete a task. How could I possibly count and display 00:00.02 when I'm increasing the time by 1.666667 every tick? It would go from 1.6 to 3.2 - hope that makes sense :-) - Zippy
I might have to duck out of this question - I still don't understand why you don't just abandon ticks from your equation altogether and have a timer running based on actual elapsed time? The moment the game completes, if you stop this timer (based on system nano time), isn't that the time it took?! I apologise again if I'm missing the point, but I assume others reading your question might puzzle over the same? - brandall

1 Answer


IMO the first approach isn't reliable, because you're counting ticks, not time. I would use the system timer, and try decoupling the game logic from the rendering. So you would have:
Core logic --> update the leaderboard timer every 10 ms
Rendering --> every ~16 ms
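Whichever clock drives it, the leaderboard value ends up as a centisecond count, which then just needs formatting for display. A sketch of turning centiseconds into the mm:ss.cc form the question mentions (the helper name is mine):

```java
// Sketch: format a centisecond count as mm:ss.cc for the on-screen timer.
// Helper name is mine.
class TimeFormat {
    static String mmSsCc(long centis) {
        long minutes = centis / 6000;         // 6000 cs per minute
        long seconds = (centis % 6000) / 100; // 100 cs per second
        long rest    = centis % 100;
        return String.format("%02d:%02d.%02d", minutes, seconds, rest);
    }
}

public class Main {
    public static void main(String[] args) {
        System.out.println(TimeFormat.mmSsCc(2));    // prints 00:00.02 - the "2/100th of a second" case
        System.out.println(TimeFormat.mmSsCc(6134)); // prints 01:01.34
    }
}
```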