0 votes

I'm running into some strange problems when trying to get the screen size in my universal iPhone/iPad app.

I was first using

[[UIScreen mainScreen] bounds]

But it does not return the correct size for the iPhone 4 (at least not in the simulator); it just returns 320x480 for all iPhones.

Then I changed to

UIScreen *mainscr = [UIScreen mainScreen];
CGSize screenSize = mainscr.currentMode.size;

And it works in the simulator for all Apple devices, but when this line runs on an iPhone 3GS device the program exits with a SIGABRT.

The device is running iOS 3.1.2.

Any idea how to get the pixel dimensions of the display in a device-safe way?


3 Answers

3 votes

UIScreen.currentMode is not available on iOS versions earlier than 3.2, so you need to check with -respondsToSelector: first:

UIScreen *mainscr = [UIScreen mainScreen];
CGSize screenSize;
if ([mainscr respondsToSelector:@selector(currentMode)])
    screenSize = mainscr.currentMode.size;
else
    screenSize = mainscr.bounds.size;

Similarly, UIScreen.scale is not available before iOS 4.0; if you use it, check with -respondsToSelector: as well.

CGFloat scale = [mainscr respondsToSelector:@selector(scale)] ? mainscr.scale : 1.0f;
3 votes

[[UIScreen mainScreen] bounds] returns a value in points, not pixels, but you can multiply by the scale property to convert the resolution to pixels.
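A minimal sketch of that conversion, falling back to a scale of 1.0 on pre-4.0 devices where the scale property does not exist:

UIScreen *screen = [UIScreen mainScreen];
// scale is only available on iOS 4.0+, so probe for it first
CGFloat scale = [screen respondsToSelector:@selector(scale)] ? screen.scale : 1.0f;
CGSize pointSize = screen.bounds.size;          // e.g. 320x480 points on iPhone
CGSize pixelSize = CGSizeMake(pointSize.width * scale,
                              pointSize.height * scale);

On an iPhone 4 this yields 640x960 pixels from the same 320x480-point bounds.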

0 votes

The proper way to think about it is that the screen size IS 320x480 points, but with a display scale of 2.0. Realize that it is very likely that other Apple devices in the future will have other display scales; imagine, for example, a new iPad someday that has a scale of 1.5.

CGFloat displayScale = 1.0f;
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
    displayScale = [[UIScreen mainScreen] scale];
}

The reason they did this is to make it easy to write apps that work on any device. You can put an object on screen at 100,100 and it will be in the same place on both devices. Use the @2x naming convention to provide two sets of images, one at 1x scale and one at 2x scale.
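For example, +[UIImage imageNamed:] honors that naming convention automatically. A sketch, assuming hypothetical files icon.png (1x) and icon@2x.png (2x) are in the app bundle:

// On a scale-2.0 device this loads icon@2x.png; on a scale-1.0
// device it loads icon.png. Either way, icon.size is reported in
// points, so layout code stays identical on both devices.
UIImage *icon = [UIImage imageNamed:@"icon"];

No branching on device type is needed; the framework selects the right asset for the current screen scale.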