[EDIT1 - SPECIAL CASE]
There is one SPECIAL case to this problem, however. Since Apple's current 2x algorithm is essentially pixel doubling (in each direction), if your images are made up of only vertical and horizontal lines, then I would provide those images as 1x and let the API upscale them automatically. An example would be an image for a button that looks like this
Such an image will upscale with zero problems and still look very crisp!!! (Tested on both iPhones, as well as both iPads.)
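For what it's worth, here is a minimal sketch of that special case, assuming the bundle ships only a 1x asset (the "stripedButton" name and the frame are just illustrative):

```swift
import UIKit

// Only a 1x "stripedButton.png" ships in the bundle (no @2x variant), so
// UIImage(named:) loads it at scale 1 and its point size equals its pixel size.
let image = UIImage(named: "stripedButton")

// On a 2x screen UIKit doubles each pixel when the button draws; for purely
// horizontal/vertical line art the result stays perfectly crisp.
let button = UIButton(frame: CGRect(x: 20, y: 40, width: 120, height: 44))
button.setBackgroundImage(image, for: .normal)
```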
[ORIGINAL]
If you are simply doing it for display, then one approach is to provide only the retina version of the image and allow the UIImageView to scale it down automatically for non-retina displays. It WILL do this and will NOT be rejected by Apple.
On a retina display, the UIImageView naturally covers 2x the pixels in each direction, so when the image is read in for display on a retina screen, the API recognizes that no scaling needs to occur!!!
Be careful though!!! Downscaling images, say for buttons, can make them come out looking unclean / pixelated. This is why it is "recommended" to provide both: starting from, say, a vector-based source, you can generate the highest-quality bitmap for each resolution. It really depends on how you want to use the images.
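A minimal sketch of the retina-only approach, assuming the bundle contains only the @2x file (the "photo" name and point sizes are illustrative):

```swift
import UIKit

// Only the retina-sized bitmap ships ("photo@2x.png"); UIImage(named:) still
// finds it under "photo" and reports scale 2, so its point size is halved.
let image = UIImage(named: "photo")

// A 100x100 pt image view: on a 2x screen the 200x200 px bitmap maps
// pixel-for-pixel; on a 1x screen UIKit downsamples it at draw time.
let imageView = UIImageView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))
imageView.contentMode = .scaleAspectFit
imageView.image = image
```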
As a side note: if you do choose the resizing option, I would recommend detecting the device's screen scale up front and resizing the images (in my case, down to the SMALLER resolution when necessary) only once, at initialization of the button or UIImageView etc... so that you don't suffer the slower performance every time it needs to be rendered to screen.
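A rough sketch of that resize-once idea, using UIScreen.main.scale and UIGraphicsImageRenderer (the helper name is mine, not an API):

```swift
import UIKit

// Downscale a retina-sized source image once, for the current device, instead
// of letting every draw pass resample it. The helper name is illustrative.
func imageResizedForCurrentScreen(_ source: UIImage, pointSize: CGSize) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = UIScreen.main.scale      // 1.0 on non-retina, 2.0 on retina
    let renderer = UIGraphicsImageRenderer(size: pointSize, format: format)
    return renderer.image { _ in
        source.draw(in: CGRect(origin: .zero, size: pointSize))
    }
}

// Usage: resize once when the button is configured, then reuse the result
// rather than handing the oversized original to the button on every render.
// let buttonImage = imageResizedForCurrentScreen(bigImage, pointSize: CGSize(width: 44, height: 44))
// button.setImage(buttonImage, for: .normal)
```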