How to get video attributes in iOS8 share extension - ios

I am trying to create a share extension to share videos using the new iOS 8 app extensions. I want to open the Photos app and share a video through my extension.
I get the video URL with the following code:
NSExtensionItem *inputItem = self.extensionContext.inputItems.firstObject;
NSItemProvider *provider = inputItem.attachments[0];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    if ([provider hasItemConformingToTypeIdentifier:(NSString *)kUTTypeQuickTimeMovie])
    {
        [provider loadItemForTypeIdentifier:@"com.apple.quicktime-movie" options:nil completionHandler:^(NSURL *path, NSError *error){
            if (path)
            {
                dispatch_async(dispatch_get_main_queue(), ^{
                    _moviePath = path;
                });
            }
        }];
    }
});
But I only get the video file URL:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0062.MOV
I also want to get more video attributes, like size and duration.
I used the following code, but it does not work:
NSDictionary *opts = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:_moviePath options:opts];
int second = urlAsset.duration.value / urlAsset.duration.timescale;
and this code:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:_moviePath];
CMTime duration = playerItem.duration;
float seconds = CMTimeGetSeconds(duration);
NSLog(#"duration: %.2f", seconds);
Neither of them works,
and Xcode reports:
Error [IRForTarget]: Call to a symbol-only function 'memset' that is not present in the target
error: 0 errors parsing expression
error: The expression could not be prepared to run in the target
Can you help me get the video attributes? Thank you very much!

Now the following code does work:
NSDictionary *opts = [NSDictionary
dictionaryWithObject:[NSNumber numberWithBool:NO]
forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:_moviePath options:opts];
int second = urlAsset.duration.value / urlAsset.duration.timescale;
And I don't know why it did not work before!
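For completeness, here is a minimal sketch of how the other requested attributes could be read once _moviePath is available (the track lookup and NSLog formatting are only illustrative; a real file may have no video track):
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:_moviePath options:nil];

// Duration in seconds.
Float64 seconds = CMTimeGetSeconds(asset.duration);

// Pixel dimensions of the video track.
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGSize dimensions = videoTrack.naturalSize;

// File size in bytes, read from the file URL itself.
NSNumber *fileSize = nil;
[_moviePath getResourceValue:&fileSize forKey:NSURLFileSizeKey error:nil];

NSLog(@"duration: %.2f s, dimensions: %@, file size: %@ bytes",
      seconds, NSStringFromCGSize(dimensions), fileSize);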

Related

AVURLAsset load webvtt file

I am trying to use AVURLAsset to load a webvtt file.
Below is my code.
NSString *urlAddress = @"http://somewhere/some.vtt";
NSURL *urlStream = [[NSURL alloc] initWithString:urlAddress];
AVAsset *avAsset = [AVURLAsset URLAssetWithURL:urlStream options:nil];
NSArray *requestKeys = [NSArray arrayWithObjects:@"tracks", @"playable", nil];
[avAsset loadValuesAsynchronouslyForKeys:requestKeys completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        // completion block here
        AVKeyValueStatus status = [avAsset statusOfValueForKey:@"tracks" error:nil];
        if (status == AVKeyValueStatusLoaded) {
            // loaded block
            // Question 1
            CMTime assetTime = [avAsset duration];
            Float64 duration = CMTimeGetSeconds(assetTime);
            NSLog(@"%f", duration);
            // Question 2
            AVMediaSelectionGroup *subtitle = [avAsset mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicLegible];
            NSLog(@"%@", subtitle);
        }
        else {
            // not-loaded block
        }
    });
}];
Question 1: It always goes into the "loaded block", but I find that the avAsset's duration is not complete. Does that mean the data is not loaded? How should I modify this?
Question 2: I am trying to use it for my AVPlayer's subtitles, but the AVMediaSelectionGroup is always null. What should I do?
For question 1, add duration to your keys:
NSArray *requestKeys = @[@"tracks", @"playable", @"duration"];
I've posted a solution over here: https://stackoverflow.com/a/37945178/171933 Basically you need to use an AVMutableComposition to join the video with the subtitles and then play back that composition.
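As a rough illustration of that approach (a minimal sketch, assuming local videoURL and subtitleURL files; the variable names are hypothetical), the composition joins the video track with the text track from the .vtt asset, and the player then plays the composition:
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVURLAsset *subtitleAsset = [AVURLAsset URLAssetWithURL:subtitleURL options:nil];

AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

// Copy the video track into the composition.
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:fullRange
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject]
                     atTime:kCMTimeZero
                      error:nil];

// Copy the subtitle (text) track from the .vtt asset into the composition.
AVMutableCompositionTrack *textTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeText
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[textTrack insertTimeRange:fullRange
                   ofTrack:[[subtitleAsset tracksWithMediaType:AVMediaTypeText] firstObject]
                    atTime:kCMTimeZero
                     error:nil];

// Play back the composition; the subtitle track is rendered as part of the item.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];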
About your second question: mediaSelectionGroupForMediaCharacteristic seems to only be supported when these "characteristics" are already baked into either the media file or your m3u8 stream, according to this statement by an Apple engineer (bottom of the page).

iOS: Get Audio from AVAudioRecorder

Disclaimer: New to AVAudioRecorder
What I'm doing I'm working on an app that uses the iPhone microphone to record sound. After the sound is recorded, I need to convert it (it should be an AVAsset, right?) into NSData to send to our backend.
What's the issue The issue is I am not sure how to "get" the audio that is recorded with the AVAudioRecorder. AVAudioRecorder has a delegate method called - (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag. I would have expected the actual AVAsset containing the audio to be passed from this delegate method, but it is not. What it does give me is the aRecorder object, which has a .url property on it. When I NSLog the url from the passed aRecorder, it shows up. In fact, I can NSLog the length of the file in the code below:
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag
{
    DLog(@"audioRecorderDidFinishRecording:successfully: %@", aRecorder);
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:aRecorder.url options:nil];
    CMTime audioDuration = audioAsset.duration;
    float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
    NSLog(@"asset length = %f", audioDurationSeconds); // Logs 7.051 seconds, so I know it's "there".
    self.audioURL = aRecorder.url;
}
Problem When I pass self.audioURL to the next view controller's self.mediaURL and try to grab the file from the asset library (similarly to how I did before), the asset is not returned from the asset library (even though when I po self.mediaURL it does log the correct URL):
if (self.mediaURL) {
    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
    [assetLibrary assetForURL:self.mediaURL resultBlock:^(ALAsset *asset) {
        if (asset) {
            // This block does NOT get called...
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            Byte *buffer = (Byte *)malloc((long)rep.size);
            NSUInteger buffered = [rep getBytes:buffer fromOffset:0.0 length:(long)rep.size error:nil];
            NSMutableData *body = [[NSMutableData alloc] init];
            body = [NSMutableData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
            [dataToSendToServer setObject:body forKey:@"audioData"];
        }
    } failureBlock:^(NSError *error) {
        NSLog(@"FAILED TO ACCESS AUDIO FROM URL: %@!", self.mediaURL);
    }];
}
else {
    NSLog(@"NO AUDIO DATA!");
}
Because I am new to AVAudioRecorder, perhaps I am just not designing this flow correctly. Could anyone help me out in getting the actual audio data?
Thanks!
AVAudioRecorder records to a file, not to the Asset Library.
So you can simply read the data from that file.
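For example, a minimal sketch inside audioRecorderDidFinishRecording:successfully: (dataToSendToServer is taken from the question; no asset library is involved):
NSError *readError = nil;
NSData *audioData = [NSData dataWithContentsOfURL:aRecorder.url
                                          options:NSDataReadingMappedIfSafe
                                            error:&readError];
if (audioData) {
    [dataToSendToServer setObject:audioData forKey:@"audioData"];
} else {
    NSLog(@"Could not read recorded audio: %@", readError);
}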

Looping a Video in AVFoundation AVSampleBufferDisplayLayer

I am trying to play a video in a loop on an AVSampleBufferDisplayLayer. I can get it to play through once with no problem. But when I try to loop it, it doesn't keep playing.
Per the answer to AVFoundation to reproduce a video loop, there isn't a way to rewind the AVAssetReader, so I re-create it. (I did see the answer for Looping a video with AVFoundation AVPlayer?, but AVPlayer is more full-featured. I am reading from a file, but I still want the AVSampleBufferDisplayLayer.)
One hypothesis is that I need to strip some of the H264 headers, but I have no idea whether that would help (or how). Another is that it has something to do with the CMTimebase, but I've tried several things to no avail.
Code below, based on Apple's WWDC talk on Direct Access to Video Encoding:
- (void)viewDidLoad {
    [super viewDidLoad];

    NSString *filepath = [[NSBundle mainBundle] pathForResource:@"sample-mp4" ofType:@"mp4"];
    NSURL *fileURL = [NSURL fileURLWithPath:filepath];
    AVAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];

    UIView *view = self.view;
    self.videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
    self.videoLayer.bounds = view.bounds;
    self.videoLayer.position = CGPointMake(CGRectGetMidX(view.bounds), CGRectGetMidY(view.bounds));
    self.videoLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    self.videoLayer.backgroundColor = [[UIColor greenColor] CGColor];

    CMTimebaseRef controlTimebase;
    CMTimebaseCreateWithMasterClock(CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase);
    self.videoLayer.controlTimebase = controlTimebase;
    CMTimebaseSetTime(self.videoLayer.controlTimebase, CMTimeMake(5, 1));
    CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);

    [[view layer] addSublayer:_videoLayer];

    dispatch_queue_t assetQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0); // ??? right queue?

    __block AVAssetReader *assetReaderVideo = [self createAssetReader:asset];
    __block AVAssetReaderTrackOutput *outVideo = [assetReaderVideo outputs][0];
    if ([assetReaderVideo startReading])
    {
        [_videoLayer requestMediaDataWhenReadyOnQueue:assetQueue usingBlock:^{
            while ([_videoLayer isReadyForMoreMediaData])
            {
                CMSampleBufferRef sampleVideo;
                if (([assetReaderVideo status] == AVAssetReaderStatusReading) && (sampleVideo = [outVideo copyNextSampleBuffer])) {
                    [_videoLayer enqueueSampleBuffer:sampleVideo];
                    CFRelease(sampleVideo);
                    CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
                }
                else {
                    [_videoLayer stopRequestingMediaData];
                    //CMTimebaseSetTime(_videoLayer.controlTimebase, CMTimeMake(5, 1));
                    //CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);
                    //CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
                    assetReaderVideo = [self createAssetReader:asset];
                    outVideo = [assetReaderVideo outputs][0];
                    [assetReaderVideo startReading];
                    //sampleVideo = [outVideo copyNextSampleBuffer];
                    //[_videoLayer enqueueSampleBuffer:sampleVideo];
                }
            }
        }];
    }
}
- (AVAssetReader *)createAssetReader:(AVAsset *)asset {
    NSError *error = nil;
    AVAssetReader *assetReaderVideo = [[AVAssetReader alloc] initWithAsset:asset error:&error];
    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetReaderTrackOutput *outVideo = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTracks[0] outputSettings:nil]; //dic];
    [assetReaderVideo addOutput:outVideo];
    return assetReaderVideo;
}
Thanks so much.
Try making the loop in Swift, then bridge the Objective-C files with the Swift files. Google has many answers about bridging and looping, so just search for it with Swift.
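Another avenue worth trying, sketched here only as an assumption: because the re-created reader hands out samples whose presentation timestamps start over at the beginning of the file, the layer's control timebase has already run past them, so they are never shown. Flushing the layer and rewinding the timebase before re-registering the request block may help (assetQueue is the queue from the question; theSameBlock stands for the original request block, which in practice you would factor into a method or store in a variable so it can re-register itself):
// Inside the else branch, instead of only stopping:
[_videoLayer stopRequestingMediaData];
[_videoLayer flush];                                          // drop any queued samples
CMTimebaseSetTime(_videoLayer.controlTimebase, kCMTimeZero);  // rewind the playback clock
CMTimebaseSetRate(_videoLayer.controlTimebase, 1.0);

assetReaderVideo = [self createAssetReader:asset];
outVideo = [assetReaderVideo outputs][0];
if ([assetReaderVideo startReading]) {
    // Re-register the block so the layer starts pulling samples again.
    [_videoLayer requestMediaDataWhenReadyOnQueue:assetQueue usingBlock:theSameBlock];
}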

Saved video filtering on iOS

How can I filter a video that is already saved in the photo library on iOS?
I got the URLs of videos in the library using the AssetsLibrary framework,
then made a preview for the video.
As the next step, I want to build a filtering step for the video using CIFilter.
For the real-time case, I implemented video filtering with AVCaptureVideoDataOutputSampleBufferDelegate.
But for a saved video, I don't know how to build the filtering process.
Do I use AVAsset? If I must use that, how can I filter it? And how do I save the result?
Thank you, as always.
I hope this will help you
AVAsset *theAVAsset = [[AVURLAsset alloc] initWithURL:mNormalVideoURL options:nil];
NSError *error = nil;
float width = theAVAsset.naturalSize.width;
float height = theAVAsset.naturalSize.height;
AVAssetReader *mAssetReader = [[AVAssetReader alloc] initWithAsset:theAVAsset error:&error];
[theAVAsset release];
NSArray *videoTracks = [theAVAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
mPrefferdTransform = [videoTrack preferredTransform];
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
AVAssetReaderTrackOutput *mAssetReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:options];
[mAssetReader addOutput:mAssetReaderOutput];
[mAssetReaderOutput release];
[mAssetReader startReading]; // without this the reader never leaves the Unknown state
CMSampleBufferRef buffer = NULL;
while ([mAssetReader status] == AVAssetReaderStatusReading) {
    buffer = [mAssetReaderOutput copyNextSampleBuffer]; // read the next frame
    // ... filter the frame here, then release it
    if (buffer) {
        CFRelease(buffer);
    }
}
You should have a look at CVImageBufferRef pixBuf = CMSampleBufferGetImageBuffer(buffer); that gives you the pixel buffer of the frame, so you can apply a filter to pixBuf. But I find that the performance is not good. If you have any new ideas, we can discuss them further.
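To make the "apply a filter to pixBuf" step concrete, here is a minimal Core Image sketch for one decoded frame (the filter name and intensity are just examples; rendering back into the source buffer assumes it is writable, otherwise render into a fresh buffer from an AVAssetWriterInputPixelBufferAdaptor and append it to an AVAssetWriter to save the filtered movie):
CVImageBufferRef pixBuf = CMSampleBufferGetImageBuffer(buffer);
CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixBuf];

// Build and apply a filter.
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:@0.8 forKey:kCIInputIntensityKey];
CIImage *outputImage = filter.outputImage;

// Render the filtered image back into a pixel buffer.
CIContext *ciContext = [CIContext contextWithOptions:nil];
[ciContext render:outputImage toCVPixelBuffer:pixBuf];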

Render movie to a OpenGL texture in iOS

Is it possible to render a movie to an OpenGL texture in real time using the Apple iOS frameworks?
I've seen it done in an old NeHe tutorial using glTexSubImage2D, but I'm wondering how I can access the RGB data using the Apple frameworks?
init
NSString *mPath = [[NSBundle mainBundle] pathForResource:@"movie" ofType:@"m4v"];
NSURL *url = [NSURL fileURLWithPath:mPath];
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:options];
imgGen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
imgGen.requestedTimeToleranceBefore = kCMTimeZero;
imgGen.requestedTimeToleranceAfter = kCMTimeZero;
each frame
double time = 0.xyz * CMTimeGetSeconds(asset.duration); // 0.xyz = some fraction of the duration
CMTime reqTime = CMTimeMakeWithSeconds(time, asset.duration.timescale), actTime;
NSError *err = nil;
CGImageRef ref = [imgGen copyCGImageAtTime:reqTime actualTime:&actTime error:&err];
//... GL calls to make an image from the CGImageRef
This method is incredibly slow for real-time rendering, and I can only generate ~15 frames. One way may be to generate frames on the fly asynchronously, but surely it can be done in real time? The most time-consuming part is the copyCGImageAtTime call.
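A faster route, sketched here as an assumption rather than a tested recipe, is to skip CGImage entirely: decode frames with an AVAssetReader configured for kCVPixelFormatType_32BGRA and upload each CVPixelBuffer through a CVOpenGLESTextureCache (eaglContext and trackOutput below are assumed to already exist in your setup):
// Create the cache once, tied to your EAGL context.
CVOpenGLESTextureCacheRef textureCache;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, eaglContext, NULL, &textureCache);

// Per frame: pull a sample from the reader and wrap its pixel buffer as a GL texture.
CMSampleBufferRef sample = [trackOutput copyNextSampleBuffer];
if (sample) {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sample);
    CVOpenGLESTextureRef texture = NULL;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                 textureCache,
                                                 pixelBuffer,
                                                 NULL,
                                                 GL_TEXTURE_2D,
                                                 GL_RGBA,
                                                 (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                                                 (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                                                 GL_BGRA,
                                                 GL_UNSIGNED_BYTE,
                                                 0,
                                                 &texture);
    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    // ... draw with the bound texture, then clean up:
    CFRelease(texture);
    CFRelease(sample);
}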
