0 votes

This is regarding this tutorial page from lazyfoo's set of SDL tutorials. There he first starts a timer to work out how much time each frame should stay alive before it is refreshed. He does this using the following:

if( ( cap == true ) && ( fps.get_ticks() < 1000 / FRAMES_PER_SECOND ) ) { 
 //Sleep the remaining frame time 
 SDL_Delay( ( 1000 / FRAMES_PER_SECOND ) - fps.get_ticks() ); 
} 

However, I've found that fps.get_ticks() always returns 0, so isn't the check above unnecessary? Can't we just leave out the timer completely and delay for 1000/FPS?

I've tried both ways below and both give me the same result. What am I missing here? Why do we need a timer?

#include "SDL/SDL.h"
#include "SDL/SDL_image.h"
#include <iostream>
SDL_Surface *background = NULL;
SDL_Surface *screen = NULL;
SDL_Surface *msg = NULL;
const int FPS = 20;

void initialize(void){
    if (SDL_Init(SDL_INIT_EVERYTHING) == -1 ){
        std::cout<<"could not start sdl"<<std::endl;
    }

    screen = SDL_SetVideoMode(640,480,32,SDL_SWSURFACE);
    if (screen == NULL){
        std::cout<<"could not make screen"<<std::endl;
    }

}
void cleanUp(void){
    // free the surfaces before shutting SDL down
    SDL_FreeSurface(background);
    SDL_FreeSurface(msg);
    SDL_Quit();
}
void loadFiles(void){
    background = IMG_Load("background.bmp");
    msg = IMG_Load("msg.bmp");
    if (background == NULL){
        std::cout<<"could not load background"<<std::endl;
    }
    if (msg == NULL){
        std::cout<<"could not load msg"<<std::endl;
    }
}
void blitSurf(int x,int y,SDL_Surface *source,SDL_Surface *dest){
    SDL_Rect dest_pos;
    dest_pos.x = x;
    dest_pos.y = y;

    if (SDL_BlitSurface(source,NULL,dest,&dest_pos) == -1){
        std::cout<<"could not blit surface"<<std::endl;
    }
}
void update(void){
    if (SDL_Flip(screen) == -1 ){
        std::cout<<"could not update screen"<<std::endl;
    }
}

int main(int argc,char *argv[]){
    initialize();
    loadFiles();

    bool running = true;
    bool cap = false;
    int msg_pos_y = 0;
    int start = 0;
    int temp = 0;
    SDL_Event event;
    while (running == true){
        start = SDL_GetTicks(); // timestamp at the start of this frame


        while (SDL_PollEvent(&event)){
            if (event.type == SDL_KEYDOWN){
                if (event.key.keysym.sym == SDLK_c){
                    if(cap == false){
                        cap = true;
                        std::cout<<"cap set to true"<<std::endl;
                    }else{
                        cap = false;
                        std::cout<<"cap set to false"<<std::endl;
                    }
                }
            }   
            if (event.type == SDL_QUIT){
                    running = false;
                    std::cout<<"Quit was pressed"<<std::endl;
            }
        }

        blitSurf(0,0,background,screen);
        if (msg_pos_y < 640){
            blitSurf(200,msg_pos_y,msg,screen);
            msg_pos_y++;
        }else{
            msg_pos_y = 0;
            blitSurf(200,msg_pos_y,msg,screen);
        }
        update();


        // cap the frame rate: sleep only for the part of the
        // 1000/FPS ms frame budget the work above did not use
        if ( (cap == true) && ( (SDL_GetTicks()-start) < (1000/FPS) ) ){
            SDL_Delay( (1000/FPS) - (SDL_GetTicks()-start) );
        }

        /* this works as well ??
        if ( cap == true ){
            SDL_Delay(1000/FPS);
        }
        */
    }

    cleanUp();
    return 0;
}

2 Answers

1 vote

Let's say you want 50 fps.

1000 milliseconds / 50 = a 20 millisecond delay.

But it takes time to render, compute physics, run AI, whatever you are doing. Let's say all of that takes 10 milliseconds. You then get 1000 milliseconds / (20 ms delay + 10 ms everything else) = 33.3 frames per second. You need to subtract those 10 ms from the delay.
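To make the arithmetic concrete, here is a minimal sketch of that idea in plain SDL 1.2 (my own illustration, not the answerer's code; FPS, FRAME_MS and the five-frame loop are made up for the example, and the 10 ms of "work" is simulated with SDL_Delay):

#include "SDL/SDL.h"

const int FPS = 50;                  // target frame rate
const Uint32 FRAME_MS = 1000 / FPS;  // 20 ms frame budget at 50 fps

int main(int argc, char *argv[]){
    SDL_Init(SDL_INIT_TIMER);

    for (int frame = 0; frame < 5; frame++){
        Uint32 start = SDL_GetTicks();   // timestamp at the start of the frame

        SDL_Delay(10);                   // stand-in for ~10 ms of rendering/physics/AI

        Uint32 elapsed = SDL_GetTicks() - start;
        if (elapsed < FRAME_MS){
            // sleep only for the unused part of the budget:
            // 20 ms - 10 ms = 10 ms, so the whole frame takes ~20 ms (50 fps)
            SDL_Delay(FRAME_MS - elapsed);
        }
        // a fixed SDL_Delay(FRAME_MS) here would make each frame take
        // about 10 + 20 = 30 ms, i.e. ~33 fps instead of 50
    }

    SDL_Quit();
    return 0;
}

Each iteration should take roughly 20 ms; swap in the fixed delay and it climbs to roughly 30 ms.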

0 votes

I'm no expert, but what he does is this: by the time your main loop has finished its work for the frame (drawing, blitting and all that stuff), some time has already passed. For example, if it took 10 ms to draw your frame, you only have to wait (1000/FPS) - 10 ms until the next frame; otherwise your frame will last too long.
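
One quick way to check this against the code in the question (a sketch; work is a hypothetical variable, start is the one already set at the top of the loop) is to print how long the frame's work actually took, just after update():

        Uint32 work = SDL_GetTicks() - start;  // time spent polling events and blitting
        std::cout << "frame work took " << work << " ms" << std::endl;

If this prints 0 (a couple of software blits can easily finish in well under a millisecond), then (1000/FPS) - work equals 1000/FPS, which would explain why both delay variants behave identically in this particular program.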