Apple introduced Live Photos with iOS 9 and the iPhone 6s. Where is the file format documented?
A Live Photo consists of two resources, tied together by an asset identifier (a UUID stored as a string):

1. A JPEG. Its metadata must contain a kCGImagePropertyMakerAppleDictionary entry of [17 : assetIdentifier] (17 is the Apple Maker Note "Asset Identifier" key).
2. A QuickTime MOV encoded with H.264 at the appropriate frame rate (12-15 fps) and size (1080p). This MOV must have:
   - A top-level QuickTime metadata entry ["com.apple.quicktime.content.identifier" : assetIdentifier]. If using AVAsset, you can read this via asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata).
   - A timed metadata track with ["com.apple.quicktime.still-image-time" : 0xFF]. The actual still image time matches the presentation timestamp of this metadata item; the payload appears to be just a single 0xFF byte (i.e., -1 as a signed byte) and can be ignored. If using an AVAssetReader, you can use CMSampleBufferGetOutputPresentationTimeStamp to get this time.

The assetIdentifier is what ties the two items together, and the timed metadata track is what tells the system where the still image sits on the movie's timeline.
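The content-identifier check described above can be sketched in Swift. This is a minimal illustration, not a complete Live Photo parser; the file path is a placeholder, and the identifier constant is AVFoundation's wrapper for com.apple.quicktime.content.identifier:

```swift
import AVFoundation

// Placeholder path: point this at a Live Photo's .mov component.
let movieURL = URL(fileURLWithPath: "/path/to/IMG_0001.mov")
let asset = AVAsset(url: movieURL)

// The pairing identifier lives in the top-level QuickTime metadata.
let items = asset.metadata(forFormat: .quickTimeMetadata)
let idItems = AVMetadataItem.metadataItems(
    from: items,
    filteredByIdentifier: .quickTimeMetadataContentIdentifier)

if let identifier = idItems.first?.stringValue {
    // This UUID string should match the JPEG's Maker Note entry 17.
    print("assetIdentifier:", identifier)
}
```

The same string must appear in the JPEG's Apple Maker Note for the system to treat the pair as one Live Photo.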
From Apple's documentation:
Live Photos
Live Photos is a new feature of iOS 9 that allows users to capture and
relive their favorite moments with richer context than traditional
photos. When the user presses the shutter button, the Camera app
captures much more content along with the regular photo, including
audio and additional frames before and after the photo. When browsing
through these photos, users can interact with them and play back all
the captured content, making the photos come to life.
iOS 9.1 introduces APIs that allow apps to incorporate playback of
Live Photos, as well as export the data for sharing. There is new
support in the Photos framework to fetch a PHLivePhoto object from the
PHImageManager object, which is used to represent all the data that
comprises a Live Photo. You can use a PHLivePhotoView object (defined
in the PhotosUI framework) to display the contents of a Live Photo.
The PHLivePhotoView view takes care of displaying the image, handling
all user interaction, and applying the visual treatments to play back
the content.
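The fetch-and-display flow described above can be sketched as follows. This assumes `asset` is a PHAsset already fetched from the library and `livePhotoView` is a PHLivePhotoView in your view hierarchy; both names are placeholders:

```swift
import Photos
import PhotosUI

func showLivePhoto(for asset: PHAsset, in livePhotoView: PHLivePhotoView) {
    let options = PHLivePhotoRequestOptions()
    options.deliveryMode = .highQualityFormat

    // PHImageManager hands back a PHLivePhoto; PHLivePhotoView then
    // handles display, interaction, and playback treatment itself.
    PHImageManager.default().requestLivePhoto(
        for: asset,
        targetSize: livePhotoView.bounds.size,
        contentMode: .aspectFit,
        options: options
    ) { livePhoto, _ in
        livePhotoView.livePhoto = livePhoto
    }
}
```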
You can also use PHAssetResource to access the data of a PHLivePhoto
object for sharing purposes. You can request a PHLivePhoto object for
an asset in the user’s photo library by using PHImageManager or
UIImagePickerController. If you have a sharing extension, you can also
get PHLivePhoto objects by using NSItemProvider. On the receiving side
of a share, you can recreate a PHLivePhoto object from the set of
files originally exported by the sender.
Guidelines for Displaying Live Photos
It’s important to remember that a Live Photo is still a photo. If you have to display a Live Photo in
an environment that doesn’t support PHLivePhotoView, it’s recommended
that you present it as a regular photo.
Don’t display the extra frames and audio of a Live Photo separately.
It's important that the content of the Live Photo be presented in a
consistent way that uses the same visual treatment and interaction
model in all apps.
It’s recommended that you identify a photo as a Live Photo by placing
the badge provided by the PHLivePhotoView class method
livePhotoBadgeImageWithOptions:PHLivePhotoBadgeOptionsOverContent in
the top-left corner of the photo.
Note that there is no support for providing the visual effect that
users experience as they swipe through photos in the Photos app.
Guidelines for Sharing Live Photos
The data of a Live Photo is
exported as a set of files in a PHAssetResource object. The set of
files must be preserved as a unit when you upload them to a server.
When you rebuild a PHLivePhoto with these files on the receiver side,
the files are validated; loading fails if the files don’t come from
the same asset.
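The export/rebuild round trip above can be sketched in Swift. This is an illustrative outline, assuming `livePhoto` is a PHLivePhoto and `directory` is a writable directory URL; error handling is elided:

```swift
import Photos

// Sender side: write each resource file out so the set can be uploaded
// as a unit.
func export(_ livePhoto: PHLivePhoto, to directory: URL,
            completion: @escaping ([URL]) -> Void) {
    let resources = PHAssetResource.assetResources(for: livePhoto)
    var urls: [URL] = []
    let group = DispatchGroup()
    for resource in resources {
        let fileURL = directory.appendingPathComponent(resource.originalFilename)
        group.enter()
        PHAssetResourceManager.default().writeData(
            for: resource, toFile: fileURL, options: nil
        ) { _ in
            urls.append(fileURL)
            group.leave()
        }
    }
    group.notify(queue: .main) { completion(urls) }
}

// Receiver side: rebuild from the same set of files. Validation fails
// (livePhoto is nil) if the files don't come from the same asset.
func rebuild(from fileURLs: [URL], size: CGSize,
             handler: @escaping (PHLivePhoto?) -> Void) {
    PHLivePhoto.request(withResourceFileURLs: fileURLs,
                        placeholderImage: nil,
                        targetSize: size,
                        contentMode: .aspectFit) { livePhoto, _ in
        handler(livePhoto)
    }
}
```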
If your app lets users apply effects or adjustments to a photo before
sharing it, be sure to apply the same adjustments to all frames of the
Live Photo. Alternatively, if you don’t support adjusting the entire
contents of a Live Photo, share it as a regular photo and show an
appropriate indication to the user.
If your app has UI for picking photos to share, you should let users play back the entire contents so they know exactly what they are sharing. When selecting photos to share in your app, users should also be able to turn a Live Photo off, so they can post it as a traditional photo.
Outside of the documentation: a Live Photo is made up of two resources, an image and a QuickTime movie (.mov). So every Live Photo has two 'actual' files, the original JPEG image and a Full HD video, connected by the wrapper of the Live Photo type.
The Uniform Type Identifier (UTI) for the format is kUTTypeLivePhoto / com.apple.live-photo:

@available(OSX 10.12, *)
public let kUTTypeLivePhoto: CFString

/*
 *  kUTTypeLivePhoto
 *
 *  Live Photo
 *
 *  UTI: com.apple.live-photo
 */
Some additional info about Live Photos:

- Agreed, it has the .mov file extension.
- It is saved in /var/mobile/Media/DCIM/100APPLE/ alongside the JPG version of the photo.
- Live Photos can be played even on devices without 3D Touch (for example, on an iPad (2017) via a long press on the photo).
- They can even be played on old phones (such as the iPhone 5) on iOS 8 if you install the PhotosLive tweak.
Related
I'm trying to build a hybrid iPhone app that uses the HTML file-input tag for video uploads. The camera opens when the input is clicked, users can record video, and the upload works fine. I'm quite new to Swift and iOS development, hence my question: how can I get at the video inside the native part of the app, so I can additionally store it in the phone's gallery? Is this possible?
You should use accept="video/mp4,video/x-m4v,video/*"
How can I access a Game Center player's photo URL? I know about loadPhotoForSize, but it returns a UIImage. I need the URL because I'd like to send it to my backend and show the photo to users on non-iOS devices.
There is no publicly available URL for a Game Center player's photo. One may exist internally, but Apple can change it at any moment, and it is not openly accessible.

The correct way to show a player's photo on other, non-iOS platforms is to upload the photo to your own game servers: use loadPhotoForSize to obtain a UIImage, then upload it to your server. For example, you can use the answers here: ios Upload Image and Text using HTTP POST
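That fetch-then-upload approach can be sketched as below. The endpoint is a placeholder for your own backend, and the request body is a bare JPEG POST; a real server would likely expect multipart/form-data:

```swift
import GameKit
import UIKit

// Hypothetical upload endpoint on your game server.
func uploadAvatar(of player: GKPlayer, to uploadURL: URL) {
    // GameKit only exposes the avatar as a UIImage, never as a URL.
    player.loadPhoto(for: .normal) { image, error in
        guard let data = image?.jpegData(compressionQuality: 0.8) else { return }
        var request = URLRequest(url: uploadURL)
        request.httpMethod = "POST"
        request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")
        URLSession.shared.uploadTask(with: request, from: data).resume()
    }
}
```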
If I have a UIWebView which has links to files then is it possible to save those files to disk outside of the application sandbox?
Yes, I know an app can't access the filesystem outside of its sandbox, but could the saving be done by whatever app can handle the file type? For example, if it's an audio file, could my app launch QuickTime, passing it the file (or file URL), so the user can then save the file via the QuickTime app in the appropriate location for audio files?
Apps are limited to saving data within their own sandbox. Which you seem to acknowledge already.
You can make one app launch another, which in theory could allow a second app to save data, but within its own sandbox. You also mention this.
In effect, you've answered your own question.
It is possible to write images to the photo album, which is outside the sandbox:

UIImage *image = [UIImage imageNamed:@"picture_of_my_cat.png"];
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
In order to get data out from a UIWebView (javascript to objective C) you can subclass NSURLProtocol and use jQuery.ajax.
I have an app that records a video and saves it straight to the photo library on the iPhone. I want to, as soon as the user finishes recording, have the app open up in mail, with the video attached. How should I go about doing this?
I am using this basic idea: http://blog.mugunthkumar.com/coding/iphone-tutorial-in-app-email/
How do I get the file path for the video?
You could either store the video in your app and attach it to the email before copying it to the photo library, or show the user their photo library in the app via UIImagePickerController. This sample may help:
http://developer.apple.com/library/ios/#samplecode/MyImagePicker/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010135
In order to send a video taken on the iPhone to a server, I assume one needs its data. How can one get a video's data? Is there something similar to UIImageJPEGRepresentation(UIImage *, ) for images?
Yes, see the 3.0 documentation for UIImagePickerController and UIImagePickerControllerDelegate, specifically the new imagePickerController:didFinishPickingMediaWithInfo: delegate method, which is called when the user chooses a movie from the camera roll.

That delegate method gives you a dictionary containing information about the movie, including a URL to its file; see the UIImagePickerControllerMediaURL key.

Once you have the (filesystem) URL, you can read the movie and send it to a server. Movies are just MPEG-4 files that you can send anywhere; it's not like picking a still image, where you only get an in-memory image.
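In modern Swift, the UIImagePickerControllerMediaURL key is spelled `.mediaURL`. A minimal delegate sketch (loading the whole file into memory for brevity; large movies should be streamed instead):

```swift
import UIKit

class PickerDelegate: NSObject, UIImagePickerControllerDelegate,
                      UINavigationControllerDelegate {

    func imagePickerController(
        _ picker: UIImagePickerController,
        didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]
    ) {
        // .mediaURL is the filesystem URL of the chosen movie.
        if let movieURL = info[.mediaURL] as? URL,
           let data = try? Data(contentsOf: movieURL) {
            print("movie bytes:", data.count)  // `data` is what you upload
        }
        picker.dismiss(animated: true)
    }
}
```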