Apple Live Photo file format

给你一囗甜甜゛ submitted on 2019-11-26 19:40:57

Question


Apple will introduce Live Photos in iOS 9 / the iPhone 6s. Where is the file format documented?


Answer 1:


Here's the link (to Apple's Live Photos developer documentation); otherwise, here's the text:

Live Photos

Live Photos is a new feature of iOS 9 that allows users to capture and relive their favorite moments with richer context than traditional photos. When the user presses the shutter button, the Camera app captures much more content along with the regular photo, including audio and additional frames before and after the photo. When browsing through these photos, users can interact with them and play back all the captured content, making the photos come to life.

iOS 9.1 introduces APIs that allow apps to incorporate playback of Live Photos, as well as export the data for sharing. There is new support in the Photos framework to fetch a PHLivePhoto object from the PHImageManager object, which is used to represent all the data that comprises a Live Photo. You can use a PHLivePhotoView object (defined in the PhotosUI framework) to display the contents of a Live Photo. The PHLivePhotoView view takes care of displaying the image, handling all user interaction, and applying the visual treatments to play back the content.
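As a minimal sketch (Swift), assuming asset is a PHAsset previously fetched from the library and view belongs to your view controller:

import Photos
import PhotosUI

let livePhotoView = PHLivePhotoView(frame: view.bounds)
view.addSubview(livePhotoView)

let options = PHLivePhotoRequestOptions()
options.deliveryMode = .highQualityFormat

PHImageManager.default().requestLivePhoto(for: asset,
                                          targetSize: livePhotoView.bounds.size,
                                          contentMode: .aspectFit,
                                          options: options) { livePhoto, _ in
    // The handler may run more than once (a degraded version can arrive first).
    livePhotoView.livePhoto = livePhoto
}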

You can also use PHAssetResource to access the data of a PHLivePhoto object for sharing purposes. You can request a PHLivePhoto object for an asset in the user’s photo library by using PHImageManager or UIImagePickerController. If you have a sharing extension, you can also get PHLivePhoto objects by using NSItemProvider. On the receiving side of a share, you can recreate a PHLivePhoto object from the set of files originally exported by the sender.
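As a minimal sketch of the export side (Swift), assuming asset is the Live Photo's PHAsset and exportDirectory is a hypothetical writable directory URL:

import Photos

// A Live Photo asset typically exposes a .photo resource and a .pairedVideo resource.
let resources = PHAssetResource.assetResources(for: asset)
for resource in resources {
    let fileURL = exportDirectory.appendingPathComponent(resource.originalFilename)
    PHAssetResourceManager.default().writeData(for: resource,
                                               toFile: fileURL,
                                               options: nil) { error in
        // Keep and upload the resulting files as a unit; they are only valid together.
    }
}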

Guidelines for Displaying Live Photos

It’s important to remember that a Live Photo is still a photo. If you have to display a Live Photo in an environment that doesn’t support PHLivePhotoView, it’s recommended that you present it as a regular photo.

Don’t display the extra frames and audio of a Live Photo separately. It's important that the content of the Live Photo be presented in a consistent way that uses the same visual treatment and interaction model in all apps.

It’s recommended that you identify a photo as a Live Photo by placing the badge provided by the PHLivePhotoView class method livePhotoBadgeImageWithOptions: (with the PHLivePhotoBadgeOptionsOverContent option) in the top-left corner of the photo.
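A minimal sketch (Swift), assuming photoView is the hypothetical view displaying the photo:

import UIKit
import PhotosUI

let badge = PHLivePhotoView.livePhotoBadgeImage(options: .overContent)
let badgeView = UIImageView(image: badge)
badgeView.frame.origin = CGPoint(x: 8, y: 8)   // top-left corner of the photo
photoView.addSubview(badgeView)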

Note that there is no support for providing the visual effect that users experience as they swipe through photos in the Photos app.

Guidelines for Sharing Live Photos

The data of a Live Photo is exported as a set of files in a PHAssetResource object. The set of files must be preserved as a unit when you upload them to a server. When you rebuild a PHLivePhoto with these files on the receiver side, the files are validated; loading fails if the files don’t come from the same asset.
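A minimal sketch of the receiving side (Swift), assuming fileURLs holds the URLs of the files the sender exported:

import Photos

PHLivePhoto.request(withResourceFileURLs: fileURLs,
                    placeholderImage: nil,
                    targetSize: .zero,
                    contentMode: .aspectFit) { livePhoto, info in
    guard let livePhoto = livePhoto else {
        // Validation failed: the files are incomplete or from different assets.
        return
    }
    // e.g. hand livePhoto to a PHLivePhotoView for display
}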

If your app lets users apply effects or adjustments to a photo before sharing it, be sure to apply the same adjustments to all frames of the Live Photo. Alternatively, if you don’t support adjusting the entire contents of a Live Photo, share it as a regular photo and show an appropriate indication to the user.

If your app has UI for picking photos to share, you should let users play back the entire contents so they know exactly what they are sharing. When selecting photos to share in your app, users should also be able to turn a Live Photo off, so they can post it as a traditional photo.




Answer 2:


A live photo has two resources. They are tied together with an asset identifier (a UUID as a string).

  1. A JPEG; this must have a metadata entry for kCGImagePropertyMakerAppleDictionary with [17 : assetIdentifier] (17 is the Apple Maker Note Asset Identifier key).
  2. A QuickTime MOV encoded with H.264 at the appropriate frame rate (12-15 fps) and size (1080p). This MOV must have:
    • A top-level QuickTime metadata entry for ["com.apple.quicktime.content.identifier" : assetIdentifier]. If using AVAsset, you can get this from asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata).
    • A timed metadata track with ["com.apple.quicktime.still-image-time" : 0xFF]. The actual still-image time matches the presentation timestamp of this metadata item; the payload seems to be just a single 0xFF byte (i.e. -1) and can be ignored. If using an AVAssetReader, you can use CMSampleBufferGetOutputPresentationTimeStamp to get this time.

The assetIdentifier is what ties the two items together, and the timed metadata track is what tells the system where the still image sits in the movie timeline (see the sketch below).
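A minimal sketch (Swift/AVFoundation) of reading both pieces of metadata, assuming movURL points at the Live Photo's MOV file:

import AVFoundation

let movieAsset = AVURLAsset(url: movURL)

// 1. Top-level pairing identifier (matches the JPEG's Maker Note entry).
let identifierItems = AVMetadataItem.metadataItems(
    from: movieAsset.metadata(forFormat: .quickTimeMetadata),
    filteredByIdentifier: .quickTimeMetadataContentIdentifier)
let assetIdentifier = identifierItems.first?.stringValue

// 2. Still-image time: read the timed metadata track with AVAssetReader.
if let metadataTrack = movieAsset.tracks(withMediaType: .metadata).first,
   let reader = try? AVAssetReader(asset: movieAsset) {
    let output = AVAssetReaderTrackOutput(track: metadataTrack, outputSettings: nil)
    reader.add(output)
    reader.startReading()
    while let buffer = output.copyNextSampleBuffer() {
        // In a Live Photo MOV this track carries the
        // com.apple.quicktime.still-image-time item; its timestamp marks the
        // still image's position, and the 0xFF payload is ignored.
        let stillImageTime = CMSampleBufferGetOutputPresentationTimeStamp(buffer)
        print("still image at \(stillImageTime.seconds)s, identifier: \(assetIdentifier ?? "?")")
    }
}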




Answer 3:


Outside of the documentation: Live Photos are made up of two resources, an image and a MOV (QuickTime movie file). So every Live Photo has two 'actual' files, connected by the wrapper of the Live Photo type.




Answer 4:


A Live Photo is actually two files: the original JPEG image and a Full HD video.




Answer 5:


The Uniform Type Identifier (UTI) for the format is kUTTypeLivePhoto / com.apple.live-photo:

@available(OSX 10.12, *)
public let kUTTypeLivePhoto: CFString

/*
 *
 *  kUTTypeLivePhoto
 *
 *    Live Photo
 *
 *    UTI: com.apple.live-photo
 *
 *
 */
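A minimal usage sketch (Swift), assuming uti is a UTI string obtained elsewhere:

import MobileCoreServices

let isLivePhoto = UTTypeConformsTo(uti as CFString, kUTTypeLivePhoto)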



Answer 6:


Some additional info about Live Photos:

  1. Agreed, the video part has a .mov file extension.
  2. It is saved in the directory /var/mobile/media/DCIM/100apple/ alongside the JPG version of the photo.
  3. Live Photos can be played even on devices without 3D Touch (I can play them on my iPad (2017) by long-pressing the photo).
  4. They can be played even on old phones (such as the iPhone 5), even on iOS 8, if you install the PhotosLive tweak.


Source: https://stackoverflow.com/questions/32508375/apple-live-photo-file-format
