How to access NSData/NSURL of slow motion videos using PhotoKit - iOS

Working with the new Photos framework, I can access the NSData of PHAssets using requestImageDataForAsset. I can also access the file URL via the PHImageFileURLKey of the returned info NSDictionary.
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // imageData contains the correct data for images and videos
    NSLog(@"info - %@", info);
    NSURL *fileURL = [info objectForKey:@"PHImageFileURLKey"];
}];
This works fine for images and normal videos.
However, when the asset is a PHAssetMediaSubtypeVideoHighFrameRate (slow motion video), the returned data corresponds to a JPG file containing the first frame of the video (the NSData, the dataUTI, and the info dictionary all point to the same JPG file). As an example, this is the URL and the dataUTI returned for a slow motion video:
PHImageFileURLKey =
"file:///var/mobile/Media/PhotoData/Metadata/DCIM/100APPLE/IMG_0642.JPG";
PHImageFileUTIKey = "public.jpeg";
Why is this happening?
How can I access the NSData/NSURL of the slow motion video instead of this JPG preview?

After going nuts and testing every single option, I found the problem.
The culprit returning JPG images for slow motion videos is the default PHImageRequestOptionsVersionCurrent value of the PHImageRequestOptions.version property.
Simply assigning the version to PHImageRequestOptionsVersionUnadjusted or PHImageRequestOptionsVersionOriginal returns the original slow motion video.
PHImageRequestOptions * imageRequestOptions = [[PHImageRequestOptions alloc] init];
imageRequestOptions.version = PHImageRequestOptionsVersionUnadjusted;
// or
imageRequestOptions.version = PHImageRequestOptionsVersionOriginal;
I consider this unexpected behaviour, since I would not expect the "current" version of a slow motion video to be a still image (a video with the slow motion effect applied, maybe, but not a photo).
Hope this is useful to someone.
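For completeness, a minimal sketch combining the fix with the request from the question (assuming asset is the slow motion PHAsset):
PHImageRequestOptions *imageRequestOptions = [[PHImageRequestOptions alloc] init];
imageRequestOptions.version = PHImageRequestOptionsVersionOriginal; // or PHImageRequestOptionsVersionUnadjusted
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:imageRequestOptions resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // With the original version requested, imageData now holds the video file's
    // bytes, and dataUTI reports a movie UTI (e.g. com.apple.quicktime-movie)
    // instead of public.jpeg.
    NSLog(@"dataUTI - %@", dataUTI);
}];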

It is important to note that slow motion videos are of type AVComposition, not AVURLAsset. An AVComposition object combines media data from multiple sources together.
Exporting a slow motion video
To achieve this, I basically went through a three-step process:
Create an output URL for the video
Configure an export session
Export the video and grab the URL!
PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVComposition class]] && ((AVComposition *)asset).tracks.count == 2) {
        // Slow motion video. See here: https://overflow.buffer.com/2016/02/29/slow-motion-video-ios/
        // Output URL of the slow motion file.
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = paths.firstObject;
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeSlowMoVideo-%u.mov", arc4random() % 1000]];
        NSURL *url = [NSURL fileURLWithPath:myPathDocs];

        // Begin slow motion video export
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;

        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                if (exporter.status == AVAssetExportSessionStatusCompleted) {
                    NSURL *URL = exporter.outputURL;
                    self.filePath = URL.absoluteString;
                    // NSData *videoData = [NSData dataWithContentsOfURL:URL];
                    // Upload
                    // [self uploadSelectedVideo:video data:videoData];
                }
            });
        }];
    }
}];
Please see this wonderful blog for slow motion videos in iOS.

The following code snippet is for Swift 3/4:
PHImageManager.default().requestAVAsset(forVideo: asset, options: nil, resultHandler: { (asset, _, _) in
    // AVAsset has two relevant subclasses: AVComposition and AVURLAsset.
    // Slow motion videos come back as AVComposition;
    // normal videos come back as AVURLAsset.
    // For a slow motion video, check for AVComposition and create an exporter
    // to write the video to a local file path, then use that file to play/upload.
    if asset!.isKind(of: AVComposition.self) {
        let avCompositionAsset = asset as! AVComposition
        if avCompositionAsset.tracks.count > 1 {
            let exporter = AVAssetExportSession(asset: avCompositionAsset, presetName: AVAssetExportPresetHighestQuality)
            exporter!.outputURL = self.fetchOutputURL()
            exporter!.outputFileType = AVFileTypeMPEG4
            exporter!.shouldOptimizeForNetworkUse = true
            exporter!.exportAsynchronously {
                DispatchQueue.main.sync {
                    // Use this URL for uploading or playing the video
                    let url = exporter!.outputURL
                }
            }
        }
    } else {
        // Normal videos are stored as AVURLAsset
        let url = (asset as! AVURLAsset).url
    }
})
// Fetch local path
func fetchOutputURL() -> URL {
    let documentDirectory = getDocumentsDirectory() as NSString
    let path = documentDirectory.appendingPathComponent("test.mp4")
    return URL(fileURLWithPath: path)
}

// Helper assumed by fetchOutputURL() but not shown in the original snippet
func getDocumentsDirectory() -> String {
    return NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
}

// Slow motion video
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionOriginal;
Request the AVAsset from PHImageManager:
[[PHImageManager defaultManager] requestAVAssetForVideo:videoAsset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        // Use the URL to get the file contents
        NSURL *URL = [(AVURLAsset *)asset URL];
        NSData *videoData = [NSData dataWithContentsOfURL:URL];
        NSNumber *fileSizeValue = nil;
        [URL getResourceValue:&fileSizeValue forKey:NSURLFileSizeKey error:nil];
    }
}];

Related

Objective-C: how to add a filter to an existing video like the Instagram application?

I'm trying to add a filter to a video after recording it, just like Instagram. After searching for a way, I found GPUImage, but that didn't solve it, since in this code the video is written to a directory before being displayed (causing a delay):
// Set a path for temporarily storing the video in the documents directory
NSURL *movieURL = [self dataFilePath:@"tempVideo.mp4"]; // URL where we want to save our new edited video
Is there a way to preview the filtered video before saving it, in order not to take that time? If so, how is it done?
Also, I found CIFilter in the Apple documentation, but still cannot find anything that states how to add filters to a video.
In addition, some of the code samples I found are written in Swift.
Thanks in advance.
I couldn't find a way to stop the delay that was happening with GPUImage, so I tried working with SCRecorder instead.
What I have done is:
[_player setItemByUrl:videoURL];
instead of:
[_player setItemByAsset:_recordSession.assetRepresentingSegments];
This way I'm able to play and add a filter to an existing video, just like Instagram.
To export the video with the chosen filter I used this code:
- (void)saveToCameraRoll {
    NSString *fileName = videoURL.lastPathComponent;
    NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *docsDir = [dirPaths objectAtIndex:0];
    NSString *videoPath = [NSString stringWithFormat:@"%@/%@", docsDir, fileName];
    NSURL *assetUrl = [NSURL fileURLWithPath:videoPath];
    AVAsset *asset = [AVAsset assetWithURL:assetUrl];
    SCFilter *exportFilter = [self.filterSwitcherView.selectedFilter copy];
    SCAssetExportSession *exportSession = [[SCAssetExportSession alloc] initWithAsset:asset];
    NSURL *urlFile = [NSURL URLWithString:[NSString stringWithFormat:@"%@/%@.mov", docsDir, fileName]];
    exportSession.outputUrl = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@", urlFile]];
    filterURL = exportSession.outputUrl;
    exportSession.videoConfiguration.filter = exportFilter;
    exportSession.videoConfiguration.preset = SCPresetHighestQuality;
    exportSession.audioConfiguration.preset = SCPresetHighestQuality;
    exportSession.videoConfiguration.maxFrameRate = 35;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.delegate = self;
    exportSession.contextType = SCContextTypeAuto;
    self.exportSession = exportSession;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.error == nil) {
            [[UIApplication sharedApplication] beginIgnoringInteractionEvents];
            [exportSession.outputUrl saveToCameraRollWithCompletion:^(NSString * _Nullable path, NSError * _Nullable error) {
                [[UIApplication sharedApplication] endIgnoringInteractionEvents];
                if (error == nil) {
                    // Success
                }
            }];
        } else {
            NSLog(@"Error: %@", exportSession.error);
        }
    }];
}
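As an aside, for previewing a filter before saving (the original ask), AVFoundation can apply a Core Image filter during playback via an AVVideoComposition, so nothing has to be written to disk first. This is a different technique from SCRecorder; a minimal sketch, assuming iOS 9+ and that videoURL points at the recorded file, with CIPhotoEffectInstant as a stand-in filter:
AVAsset *asset = [AVAsset assetWithURL:videoURL];
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectInstant"];
AVVideoComposition *composition = [AVMutableVideoComposition
    videoCompositionWithAsset:asset
    applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
        // Filter each frame as it is requested for display
        [filter setValue:request.sourceImage forKey:kCIInputImageKey];
        CIImage *output = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent];
        [request finishWithImage:output context:nil];
    }];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
item.videoComposition = composition; // the filter is applied live during playback
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
The same composition can later be assigned to an AVAssetExportSession's videoComposition to write the filtered file once the user is happy with the preview.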

iOS: Get Audio from AVAudioRecorder

Disclaimer New to AVAudioRecorder
What I'm doing I'm working on an app that uses the iPhone microphone to record sound. After the sound is recorded, I need to convert the sound (should be AVAsset, right?) into NSData to send to our backend.
What's the issue The issue is I am not sure how to "get" the audio that is supposed to be recorded with the AVAudioRecorder. AVAudioRecorder has a delegate method called - (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag. I would have expected the actual AVAsset containing the audio to be passed from this delegate method, but it is not. What it does give me is the aRecorder object, which has a .url property on it. When I NSLog the url from the passed aRecorder, it shows up. In fact I can NSLog the length of the file in the code below:
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag
{
    DLog(@"audioRecorderDidFinishRecording:successfully: %@", aRecorder);
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:aRecorder.url options:nil];
    CMTime audioDuration = audioAsset.duration;
    float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
    NSLog(@"asset length = %f", audioDurationSeconds); // Logs 7.051 seconds, so I know it's "there".
    self.audioURL = aRecorder.url;
}
Problem When I pass self.audioURL to the next view controller's self.mediaURL and try to grab the file from the AssetsLibrary (similarly to how I did before), the asset is not returned from the AssetsLibrary, even though when I po self.mediaURL it indeed logs the correct URL:
if (self.mediaURL) {
    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
    [assetLibrary assetForURL:self.mediaURL resultBlock:^(ALAsset *asset) {
        if (asset) {
            // This block does NOT get called...
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            Byte *buffer = (Byte *)malloc((long)rep.size);
            NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(long)rep.size error:nil];
            NSMutableData *body = [[NSMutableData alloc] init];
            body = [NSMutableData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
            [dataToSendToServer setObject:body forKey:@"audioData"];
        }
    } failureBlock:^(NSError *error) {
        NSLog(@"FAILED TO ACCESS AUDIO FROM URL: %@!", self.mediaURL);
    }];
}
else {
    NSLog(@"NO AUDIO DATA!");
}
Because I am new to AVAudioRecorder, perhaps I am just not designing this flow correctly. Could anyone help me out in getting the actual audio data?
Thanks!
AVAudioRecorder records to a file, not to the Asset Library.
So you can simply read the data from that file.
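A minimal sketch of that, reusing the delegate method from the question (the error handling is illustrative):
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag
{
    NSError *error = nil;
    // The recorder wrote a file at aRecorder.url; read its bytes directly,
    // no ALAssetsLibrary involved.
    NSData *audioData = [NSData dataWithContentsOfURL:aRecorder.url options:0 error:&error];
    if (audioData) {
        // Send audioData to the backend
    } else {
        NSLog(@"Could not read recording: %@", error);
    }
}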

How to set metadata for AVFileTypeMPEG4 file via AVAssetExportSession?

I'm trying to use AVAssetExportSession to set the metadata of an AVFileTypeMPEG4 file, but it doesn't work. If I change the file type to AVFileTypeQuickTimeMovie, it works. But I need an MP4 file, and I can't find any documentation saying that metadata cannot be set on an AVFileTypeMPEG4 file. Has anyone set metadata successfully?
Here is the code that I used:
NSMutableArray *metadata = [NSMutableArray array];
AVMutableMetadataItem *metaItem = [AVMutableMetadataItem metadataItem];
metaItem.key = AVMetadataCommonKeySource;
metaItem.keySpace = AVMetadataKeySpaceCommon;
metaItem.value = @"Test metadata";
[metadata addObject:metaItem];

AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
exportSession.metadata = metadata;
exportSession.audioMix = audioMix;
exportSession.videoComposition = videoComposition;
exportSession.outputFileType = AVFileTypeMPEG4; // AVFileTypeQuickTimeMovie works
NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"testMetadata.mp4"];
exportSession.outputURL = [NSURL fileURLWithPath:outputFilePath];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            // todo
        } else {
            // todo
        }
    });
}];
Try it with:
metaItem.key = AVMetadataiTunesMetadataKeyDescription;
metaItem.keySpace = AVMetadataKeySpaceiTunes;
I tried the other keyspaces, but only iTunes worked for me.
Apple filters metadata depending on the output type. They don't consider iTunes metadata valid for MPEG4 output, so they strip it.
Some options:
Use AVFileTypeQuickTimeMovie: MOV is closely related to MP4 and is often compatible; this depends on what you are looking to do (see the sketch after this list).
Try other types (some people report success with the MPV type).
Use a library to write custom data/atoms (mp4v2 works, for example). Lots of work, but the only real way to achieve it.
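A minimal sketch of the first option against the asker's code (only the output type and the file extension change):
// Keep the metadata by exporting a QuickTime movie instead of MP4
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"testMetadata.mov"];
exportSession.outputURL = [NSURL fileURLWithPath:outputFilePath];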

How can I determine file size on disk of a video PHAsset in iOS8

I can request a video PHAsset using the Photos framework in iOS8. I'd like to know how big the file is on disk. There doesn't seem to be a property of PHAsset to determine that. Does anyone have a solution? (Using Photos framework not required)
Edit
As of iOS 9.3, using requestImageDataForAsset on a video-type PHAsset will return an image, which is the first frame of the video, so it doesn't work anymore. Use the following method instead. For normal videos the request options can be nil, but for slow motion videos PHVideoRequestOptionsVersionOriginal needs to be set.
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionOriginal;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        AVURLAsset *urlAsset = (AVURLAsset *)asset;
        NSNumber *size;
        [urlAsset.URL getResourceValue:&size forKey:NSURLFileSizeKey error:nil];
        NSLog(@"size is %f", [size floatValue] / (1024.0 * 1024.0)); // size is 43.703005
    }
}];
Original answer:
For PHAsset, use this:
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    float imageSize = imageData.length;
    // Convert to megabytes
    imageSize = imageSize / (1024 * 1024);
    NSLog(@"%f", imageSize);
}];
For ALAsset:
ALAssetRepresentation *rep = [asset defaultRepresentation];
float imageSize = rep.size/(1024.0*1024.0);
I tested on one video asset, PHAsset shows the size as 43.703125, ALAsset shows the size as 43.703005.
Edit
For PHAsset, there is another way to get the file size. But as @Alfie Hanssen mentioned, it works on normal videos; for slow motion videos the following method will return an AVComposition asset in the block, so I added a check for its type. For slow motion videos, use the requestImageDataForAsset method.
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:nil resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        AVURLAsset *urlAsset = (AVURLAsset *)asset;
        NSNumber *size;
        [urlAsset.URL getResourceValue:&size forKey:NSURLFileSizeKey error:nil];
        NSLog(@"size is %f", [size floatValue] / (1024.0 * 1024.0)); // size is 43.703005
        NSData *data = [NSData dataWithContentsOfURL:urlAsset.URL];
        NSLog(@"length %f", [data length] / (1024.0 * 1024.0)); // data size is 43.703005
    }
}];
Swift version with file size formatting:
let options = PHVideoRequestOptions()
options.version = .original
PHImageManager.default().requestAVAsset(forVideo: asset, options: options) { avAsset, _, _ in
    if let urlAsset = avAsset as? AVURLAsset { // Could be an AVComposition for slow motion video
        if let resourceValues = try? urlAsset.url.resourceValues(forKeys: [.fileSizeKey]),
           let fileSize = resourceValues.fileSize {
            let formatter = ByteCountFormatter()
            formatter.countStyle = .file
            let string = formatter.string(fromByteCount: Int64(fileSize))
            print(string)
        }
    }
}
There is a pretty high chance that the video whose size you want to know is not of type AVURLAsset. It is expected that under the hood there are several files your video is composited from (for example raw samples, slow-mo time ranges, filters, etc.), because you want to know the size of a concrete playable file. I'm not sure how well the estimated file size matches reality in this case, but this is how it can be done:
PHImageManager.defaultManager().requestExportSessionForVideo(asset, options: nil, exportPreset: AVAssetExportPresetHighestQuality, resultHandler: { (assetExportSession, info) -> Void in
    // Here you set the values that specify your video (original, after edit, slow-mo, ...)
    // and that affect the resulting size.
    // The time range defaults to zero-to-infinite, so it needs to be set prior to
    // file size computations. The time scale is, I believe, not important here
    // (it is "only" a preferred value).
    assetExportSession.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(asset.duration, 30))
    let HERE_YOU_ARE = assetExportSession.estimatedOutputFileLength
})

How to add and retrieve custom meta data to a video file in iOS programming?

I am trying to save a video with custom metadata relevant to my app, and trying to retrieve it when the user selects that video from the library. I am not sure if I am saving the metadata right, as I am not able to see anything when I try to retrieve it. I am also not sure if I am retrieving the metadata correctly. I am new to iOS; any help is appreciated. I have searched many threads and the developer library but could not get this to work.
I am trying to save the metadata in the recordingDidFinishToOutputFileURL delegate function. The video is getting saved in the library.
NSMutableArray *metadata = [NSMutableArray array];
AVMutableMetadataItem *mi = [AVMutableMetadataItem metadataItem];
mi.key = AVMetadataCommonKeyTitle;
mi.keySpace = AVMetadataKeySpaceCommon;
mi.value = @"title";
[metadata addObject:mi];

NSLog(@"Output saving: %@", outputFileURL);
AVAsset *video = [AVAsset assetWithURL:outputFileURL];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.metadata = metadata;
exportSession.outputURL = outputFileURL;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"done processing video!");
    UISaveVideoAtPathToSavedPhotosAlbum(outputFileURL.path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
}];
I am trying to retrieve the video in the didFinishPickingMediaWithInfo delegate function to check the metadata, but I am not able to see anything in the completion handler:
if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
    video_selected = TRUE;
    NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
    NSLog(@"video has %@", videoURL.path);
    AVAsset *videoAsset = [AVAsset assetWithURL:videoURL];
    NSLog(@"Loading metadata...");
    NSArray *keys = [[NSArray alloc] initWithObjects:@"commonMetadata", nil];
    NSMutableArray *metadata = [[NSMutableArray alloc] init];
    [videoAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
        [metadata removeAllObjects];
        for (NSString *format in [videoAsset availableMetadataFormats]) {
            [metadata addObjectsFromArray:[videoAsset metadataForFormat:format]];
            NSLog(@"Printing metadata - %@", metadata);
        }
    }];
}
Check your errors. You're exporting to a URL where the file already exists, so the session is exporting into itself, which won't work. Just export to a different location, as sketched below.
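A minimal sketch of that fix (the file name is an assumption for illustration):
// Export to a fresh URL instead of the recording's own outputFileURL
NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"withMetadata.mp4"];
NSURL *exportURL = [NSURL fileURLWithPath:exportPath];
[[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil]; // the export fails if a file already exists here
exportSession.outputURL = exportURL;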
These keys work fine:
AVMetadataCommonKeyAuthor
AVMetadataCommonKeyDescription
AVMetadataCommonKeyCopyrights
MP4 files support the ISO userdata keyspace and the iTunes metadata keyspace. The above keys don't have an ISO userdata mapping, so that's why they're not working. Likewise, you can't put AVMetadataQuickTimeMetadataKey* in an MP4 file because it doesn't support that keyspace.
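So in the asker's snippet, swapping the key is enough; a minimal sketch:
AVMutableMetadataItem *mi = [AVMutableMetadataItem metadataItem];
mi.key = AVMetadataCommonKeyDescription; // has an ISO userdata mapping, unlike AVMetadataCommonKeyTitle
mi.keySpace = AVMetadataKeySpaceCommon;
mi.value = @"title";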
var metadata: [AVMetadataItem] = []
let mi = AVMutableMetadataItem()
mi.identifier = AVMetadataIdentifier.commonIdentifierDescription
mi.keySpace = .common
mi.value = "Milan Description" as (NSCopying & NSObjectProtocol)?
metadata.append(mi)

let mi1 = AVMutableMetadataItem()
mi1.identifier = AVMetadataIdentifier.commonIdentifierCopyrights
mi1.keySpace = .common
mi1.value = "Milan copy" as (NSCopying & NSObjectProtocol)?
metadata.append(mi1)

print("Output saving: \(trackObj.getTrackUrl())")
let video = AVAsset(url: trackObj.getTrackUrl())
let exportSession = AVAssetExportSession(asset: video, presetName: AVAssetExportPresetPassthrough)
exportSession?.metadata = metadata
exportSession?.outputURL = volumeBoosterViewModel.audioResultRootFolderURL.appendingPathComponent(trackObj.filePathName)
exportSession?.outputFileType = AVFileType.mp3
exportSession?.exportAsynchronously(completionHandler: { [weak self] () -> Void in
    if exportSession!.status == AVAssetExportSession.Status.completed {
        // All is working fine!
        print("success")
        // Get the metadata back (yourURL: the exported file's URL)
        let asset = AVAsset(url: yourURL)
        if asset.commonMetadata.count > 0 {
            for item in asset.metadata {
                print(item.value)
            }
        }
    }
})
