I can't find any API to capture Live Photos. Did I miss something?
Live Photos
Live Photos is a new feature of iOS 9 that allows users to capture and relive their favorite moments with richer context than traditional photos. When the user presses the shutter button, the Camera app captures much more content along with the regular photo, including audio and additional frames before and after the photo. When browsing through these photos, users can interact with them and play back all the captured content, making the photos come to life.
iOS 9.1 introduces APIs that allow apps to incorporate playback of Live Photos, as well as export the data for sharing. The Photos framework adds support for fetching a PHLivePhoto object, which represents all the data that comprises a Live Photo, from the PHImageManager object. You can use a PHLivePhotoView object (defined in the PhotosUI framework) to display the contents of a Live Photo. The PHLivePhotoView takes care of displaying the image, handling all user interaction, and applying the visual treatments needed to play back the content.
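Putting those pieces together, a minimal sketch of the playback path might look like the following. It assumes photo-library authorization has already been granted; the function name showLatestLivePhoto and its wiring are illustrative, not part of the API:

```swift
import Photos
import PhotosUI
import UIKit

// Fetch the most recent Live Photo from the library and play it back
// in a PHLivePhotoView. Only assets whose mediaSubtypes contain
// .photoLive are Live Photos.
func showLatestLivePhoto(in livePhotoView: PHLivePhotoView) {
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

    var liveAsset: PHAsset?
    PHAsset.fetchAssets(with: .image, options: options).enumerateObjects { asset, _, stop in
        if asset.mediaSubtypes.contains(.photoLive) {
            liveAsset = asset
            stop.pointee = true
        }
    }
    guard let asset = liveAsset else { return }

    _ = PHImageManager.default().requestLivePhoto(for: asset,
                                                  targetSize: livePhotoView.bounds.size,
                                                  contentMode: .aspectFill,
                                                  options: nil) { livePhoto, _ in
        // The handler can fire more than once, e.g. with a degraded
        // version first and the full-quality version later.
        if let livePhoto = livePhoto {
            livePhotoView.livePhoto = livePhoto
            livePhotoView.startPlayback(with: .full)
        }
    }
}
```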
You can also use PHAssetResource to access the data of a PHLivePhoto object for sharing purposes. You can request a PHLivePhoto object for an asset in the user’s photo library by using PHImageManager or UIImagePickerController. If you have a sharing extension, you can also get PHLivePhoto objects by using NSItemProvider. On the receiving side of a share, you can recreate a PHLivePhoto object from the set of files originally exported by the sender.
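For the export/recreate round trip, a sketch along these lines should work; exportResources(of:to:) and recreateLivePhoto(from:targetSize:) are made-up helper names, and the target directory is assumed to be writable:

```swift
import Photos
import UIKit

// Export every resource backing a Live Photo asset to a directory.
// A Live Photo typically has a .photo resource (the still image) and a
// .pairedVideo resource (the motion content); both must travel together.
func exportResources(of asset: PHAsset, to directory: URL) {
    for resource in PHAssetResource.assetResources(for: asset) {
        let fileURL = directory.appendingPathComponent(resource.originalFilename)
        PHAssetResourceManager.default().writeData(for: resource,
                                                   toFile: fileURL,
                                                   options: nil) { error in
            if let error = error {
                print("Export of \(resource.originalFilename) failed: \(error)")
            }
        }
    }
}

// On the receiving side, rebuild a PHLivePhoto from the exported files
// without importing them into the photo library.
func recreateLivePhoto(from fileURLs: [URL], targetSize: CGSize) {
    _ = PHLivePhoto.request(withResourceFileURLs: fileURLs,
                            placeholderImage: nil,
                            targetSize: targetSize,
                            contentMode: .aspectFit) { livePhoto, _ in
        if let livePhoto = livePhoto {
            // Hand the result to a PHLivePhotoView for playback.
            print("Recreated Live Photo: \(livePhoto)")
        }
    }
}
```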
During the keynote, Apple mentioned that Facebook will support Live Photos, so I suspect there must be a way to capture them.