Export Video PHAsset to NSData with GPS metadata (iOS)

I am trying to convert a video PHAsset to NSData for upload, but each attempt to save the data of the asset seems to lose the location metadata. Here is what I have tried:
Get the AVAsset from PHAsset, and use its URL:
The AVAsset itself has metadata in both the retrievedAsset.metadata and retrievedAsset.commonMetadata fields. I can see that GPS coordinates are part of that data.
[[PHImageManager defaultManager] requestAVAssetForVideo:asset
                                                options:options
                                          resultHandler:^(AVAsset *retrievedAsset, AVAudioMix *audioMix, NSDictionary *info) {
    // handle the retrieved asset here
}];
In the completion block above, retrievedAsset is actually an AVURLAsset, which has a local URL for the media data. That URL has a path extension of .MOV. If I copy the data at that location into my temp directory for upload and then view the .MOV file inside my app's container, the resulting file shows no location metadata.
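The copy itself is straightforward (a sketch; tempURL is a file URL in my temp directory):
AVURLAsset *urlAsset = (AVURLAsset *)retrievedAsset;
NSError *copyError = nil;
[[NSFileManager defaultManager] copyItemAtURL:urlAsset.URL toURL:tempURL error:&copyError];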
I have also tried using AVAssetExportSession, setting the metadata explicitly on the export.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:retrievedAsset presetName:AVAssetExportPresetHighestQuality];
// My output URL has a path extension of .MOV, and I set the file type to AVFileTypeQuickTimeMovie
// because the original AVURLAsset had a .MOV extension, but I am not sure that is correct.
// I have also tried exporting as MP4, with no luck.
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
// Here I have tried retrievedAsset.metadata and retrievedAsset.commonMetadata.
// I have also tried creating a custom AVMetadataItem with GPS coordinates.
[exporter setMetadata:customMetadata];
[exporter exportAsynchronouslyWithCompletionHandler:^{
// The file is successfully exported to a .MOV at my specified URL, but it never contains any location metadata.
}];
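For reference, the custom location item I build looks roughly like this (a sketch; the coordinate string is a placeholder):
AVMutableMetadataItem *locationItem = [AVMutableMetadataItem metadataItem];
locationItem.identifier = AVMetadataCommonIdentifierLocation;
locationItem.keySpace = AVMetadataKeySpaceCommon;
locationItem.key = AVMetadataCommonKeyLocation;
// QuickTime stores location as an ISO 6709 string, e.g. +DD.DDDD+DDD.DDDD/
locationItem.value = @"+37.3318-122.0312/";
NSArray *customMetadata = @[locationItem];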
Is there any way I can get the original PHAsset metadata into the final movie file?

Related

AVAsset tracks is empty

Essentially I am looking to concatenate AVAsset files. I've got a rough idea of what to do, but I'm struggling with loading the audio files.
I can play the files with an AVAudioPlayer, and I can see them in the directory via my terminal, but when I attempt to load them with AVURLAsset it always returns an empty array for tracks.
The URL I am using:
NSURL *firstAudioFileLocation = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", workingDirectory, @"/temp.pcm"]];
Which results in:
file:///Users/evolve/Library/Developer/CoreSimulator/Devices/8BF465E8-321C-47E6-BF2E-049C5E900F3C/data/Containers/Data/Application/4A2D29B2-E5B4-4D07-AE6B-1DD15F5E59A3/Documents/temp.pcm
The asset being loaded:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
However when calling this:
NSLog(#" total tracks %#", test.tracks);
My output is always total tracks ().
My subsequent calls to add them to my AVMutableCompositionTrack end up crashing the app as the AVAsset seems to not have loaded correctly.
I have played with other variations for loading the asset including:
NSURL *alternativeLocation = [[NSBundle mainBundle] URLForResource:@"temp" withExtension:@"pcm"];
As well as trying to load AVAsset with the options from the documentation:
NSDictionary *assetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
How do I load the tracks from a local resource, recently created by the AVAudioRecorder?
EDIT
I had a poke around and found I can record and load a .CAF file instead.
It seems .PCM is unsupported by AVAsset; this page was also a great help: https://developer.apple.com/documentation/avfoundation/avfiletype
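For anyone hitting the same thing, here is a sketch of recording to .caf instead (these settings are illustrative, not the only valid combination):
NSURL *recordURL = [NSURL fileURLWithPath:[workingDirectory stringByAppendingPathComponent:@"temp.caf"]];
NSDictionary *settings = @{AVFormatIDKey: @(kAudioFormatLinearPCM),
                           AVSampleRateKey: @44100.0,
                           AVNumberOfChannelsKey: @1};
NSError *recordError = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:recordURL settings:settings error:&recordError];
[recorder record];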
An AVAsset load is not instantaneous. You need to wait for the data to be available. Example:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
[test loadValuesAsynchronouslyForKeys:@[@"playable", @"tracks"] completionHandler:^{
    // Now tracks is available
    NSLog(@" total tracks %@", test.tracks);
}];
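Inside the handler it is also worth checking the key's load status before reading it; a minimal sketch:
NSError *loadError = nil;
AVKeyValueStatus status = [test statusOfValueForKey:@"tracks" error:&loadError];
if (status == AVKeyValueStatusLoaded) {
    // tracks loaded; safe to build the composition from test.tracks
} else {
    NSLog(@"tracks failed to load: %@", loadError);
}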
A more detailed example can be found in the documentation.

How can I retrieve the metadata of PHAsset?

I am getting the user-selected image in PHAsset format, and I want to get the image path.
My goal is to upload the image to Firebase, and based on their docs I need to have the image path.
From my research it seems I need to get the metadata of the image first, store it in a local file, and then I can retrieve the URL. Is that correct?
My question (if the above is correct) is: how can I get the metadata of an image in PHAsset format?
If you want to retrieve the image file path from your PHAsset, use this:
[asset requestContentEditingInputWithOptions:[PHContentEditingInputRequestOptions new] completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
NSURL *imageURL = contentEditingInput.fullSizeImageURL;
}];
Here asset is the PHAsset * you are working with.
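If what you actually need is the image's metadata dictionaries (EXIF, GPS, and so on), one option, sketched here assuming Core Image is acceptable, is to read them off a CIImage created from that URL:
CIImage *ciImage = [CIImage imageWithContentsOfURL:imageURL];
NSDictionary *metadata = ciImage.properties; // contains the EXIF/TIFF/GPS dictionaries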

Play a video stored using NSDataAsset (Xcode xcassets)

I am trying to use Apple's App Thinning feature (available since iOS 9), which lets you differentiate resources based on device architecture and features. In my case I would like to ship a different video file in the application bundle (in .mp4 format), one for iPhone and one for iPad, using an Xcode .xcassets Data Set.
To retrieve a file from a .xcassets Data Set, Apple provides the NSDataAsset class, but since AVPlayer needs a URL to play a video and NSDataAsset only provides its contents as Data, I'm unable to play the video.
What I would like to do is retrieve a URL for the NSDataAsset's .data. Is that possible?
You can try writing the data out to a file and playing from that URL:
NSDataAsset *videosDataAsset = [[NSDataAsset alloc] initWithName:@"AssetName"];
NSData *data = videosDataAsset.data;
NSString *filename = @"FileToSaveInto.mp4";
NSURL *URL = [[[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject] URLByAppendingPathComponent:filename];
if ([data writeToURL:URL atomically:YES]) {
// run player
}
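From there, playback is the usual AVPlayer route; a minimal sketch of the "run player" step (assuming self is a view controller and AVKit is available):
AVPlayer *player = [AVPlayer playerWithURL:URL];
AVPlayerViewController *playerVC = [[AVPlayerViewController alloc] init];
playerVC.player = player;
[self presentViewController:playerVC animated:YES completion:^{
    [player play];
}];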

UIImagePickerController returns H.265 instead of H.264

I have an app that uses a UIImagePickerController object to get a video file for sharing.
I'm setting the media types (picker.mediaTypes = [kUTTypeMovie as String]) and using the UIImagePickerControllerMediaURL field to fetch the video details. This works properly, and there are no issues as long as every device either records H.264 or can decode H.265.
The issue is that I need to support devices which cannot playback H.265 content. I would like to take the "create a single compatible file" route. How can I tell UIImagePickerController to give me a H.264 video regardless of recording device's capabilities?
This is what worked for me. I used an AVAssetExportSession to export an H.265 video in H.264 format.
Maybe the above would also work by setting AVAssetExportPresetHighestQuality as the videoExportPreset property of UIImagePickerController. The bonus of my approach is iOS 9/10 compatibility, and maybe a snappier UI, because you can do the export on a background thread.
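For reference, a sketch of that picker-based alternative (iOS 11+, untested here):
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.mediaTypes = @[(NSString *)kUTTypeMovie];
if (@available(iOS 11.0, *)) {
    // ask the picker to transcode delivered video with this preset
    picker.videoExportPreset = AVAssetExportPresetHighestQuality;
}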
I can't use UIImagePickerController because the same picker workflow in my app allows the user to select multiples, so I'm using CTAssetsPickerController, which requires the use of PHAsset for the returned media objects.
Ezekiel and nathan's discussion led me to this solution, so sharing it here.
PHAsset *phasset = <fetched-asset>;
NSURL *assetURL = <where-to-store-exported-asset>;
if (PHAssetMediaTypeVideo == phasset.mediaType) {
    [[PHImageManager defaultManager] requestAVAssetForVideo:phasset
                                                    options:nil
                                              resultHandler:^(AVAsset * _Nullable avasset, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
        AVAssetExportSession *exportSession =
            [AVAssetExportSession exportSessionWithAsset:avasset presetName:AVAssetExportPresetHighestQuality];
        exportSession.outputURL = assetURL;
        exportSession.outputFileType = AVFileTypeMPEG4;
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            if (exportSession.status == AVAssetExportSessionStatusCompleted) {
                // success!
            }
        }];
    }];
}

iOS, how to keep Photo Sphere XMP Metadata while saving it to Camera Roll?

I am able to add the Photo Sphere XMP metadata to an equirectangular photo.
I uploaded the photo with the XMP metadata to Google Photos, and Google Photos recognized it as a sphere photo.
Then I tried to save the photo with the XMP metadata to the camera roll.
I shared the photo to Google Photos from the camera roll, but this time Google Photos did not recognize it as a sphere photo.
I downloaded the photo and analyzed it, and found the XMP metadata was all gone.
It seems iOS edits the metadata while the photo is being saved to the camera roll.
Is there any way to preserve the photo's XMP metadata while saving it to the camera roll?
// raw jpg data to NSData
NSString *img = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"equirectangular_orig.jpg"];
NSData *imgData = [[NSFileManager defaultManager] contentsAtPath:img];
NSMutableData *newImgData = [NSMutableData dataWithData:imgData];
// writing XMP metadata to newImgData
// ...
// ...
// saving the newImgData to new jpg file
NSString *path = [[NSHomeDirectory() stringByAppendingPathComponent:@"Documents"] stringByAppendingPathComponent:@"equirectangular_xmp.jpg"];
NSURL *url = [NSURL fileURLWithPath:path];
[[NSFileManager defaultManager] removeItemAtPath:path error:nil]; // removeItemAtPath: expects a path, not a URL string
[newImgData writeToURL:url atomically:YES];
// saving the new jpg file to camera roll
UIImage *newImg = [UIImage imageWithData:newImgData];
UIImageWriteToSavedPhotosAlbum(newImg, self, nil, nil);
Key point: use
[PHAssetChangeRequest creationRequestForAssetFromImageAtFileURL:url]
instead of
[PHAssetChangeRequest creationRequestForAssetFromImage:image]
Going through UIImage re-encodes the image and drops the original file's XMP data, so write your raw image data to a temporary URL and use the creationRequestForAssetFromImageAtFileURL: API instead.
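A minimal sketch of that save path, with url being the file URL the raw data was written to above:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    [PHAssetChangeRequest creationRequestForAssetFromImageAtFileURL:url];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    if (!success) {
        NSLog(@"Saving to camera roll failed: %@", error);
    }
}];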
