Is it possible to render a movie to an OpenGL texture in real time using the Apple iOS frameworks?
I've seen it done in an old NeHe tutorial using glTexSubImage2D, but I'm wondering: how can I access the RGB data using the Apple frameworks?
init
NSString *mPath = [[NSBundle mainBundle] pathForResource:@"movie" ofType:@"m4v"];
NSURL *url = [NSURL fileURLWithPath:mPath];
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:options];
imgGen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
imgGen.requestedTimeToleranceBefore = kCMTimeZero;
imgGen.requestedTimeToleranceAfter = kCMTimeZero;
each frame
double fraction = 0.xyz; // placeholder fraction of the way through the movie, as in the original
double time = fraction * CMTimeGetSeconds(asset.duration);
CMTime reqTime = CMTimeMakeWithSeconds(time, asset.duration.timescale), actTime;
NSError *err = nil;
CGImageRef ref = [imgGen copyCGImageAtTime:reqTime actualTime:&actTime error:&err];
//... GL calls to make an image from the CGImageRef
This method is incredibly slow for real-time rendering, and I can only generate ~15 frames. One option might be to generate frames asynchronously on the fly, but surely it can be done in real time? The most time-consuming part is the copyCGImageAtTime call.
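One approach that avoids AVAssetImageGenerator entirely is to play the movie with AVPlayer and pull BGRA pixel buffers from an AVPlayerItemVideoOutput each frame. A minimal sketch, assuming iOS 6+ and an already-created GL texture texId matching the video size (texId and the display-link driver are assumptions, not part of the original code):

// Setup (once): attach a video output that vends BGRA pixel buffers.
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
[item addOutput:videoOutput];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];

// Per frame (e.g. from a CADisplayLink callback): upload the current frame.
CMTime now = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:now]) {
    CVPixelBufferRef pb = [videoOutput copyPixelBufferForItemTime:now itemTimeForDisplay:NULL];
    CVPixelBufferLockBaseAddress(pb, kCVPixelBufferLock_ReadOnly);
    // Assumes bytes-per-row == width * 4; otherwise upload row by row.
    glBindTexture(GL_TEXTURE_2D, texId);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                    (GLsizei)CVPixelBufferGetWidth(pb), (GLsizei)CVPixelBufferGetHeight(pb),
                    GL_BGRA_EXT, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(pb));
    CVPixelBufferUnlockBaseAddress(pb, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferRelease(pb);
}

On iOS 5+ a CVOpenGLESTextureCache can map the pixel buffer into a texture without the glTexSubImage2D copy, which is the usual way to hit full frame rate.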
Related
I am very frustrated: for the last two days I have been searching for a way to edit a video frame and replace it, at its actual time, with the edited frame, but I have been unable to do it.
I have seen many Stack Overflow links, but none fits my requirement; the iFrame extractor does not work.
Get image from a video frame in iPhone
In the meantime, my plan was to extract all the frames and save them in an array of dictionaries, each frame with its time. Then, when I get a frame from the running video, I edit it and loop over the stored frames, matching the actual time; when a match is found, I replace the original frame with the edited one and finally write the video back out from all the frames.
To do so I used:
Get all frames of Video IOS 6
but it crashes after extracting some images.
I have written my code like this:
-(void)editMovie:(id)sender
{
    float FPS = 1;
    NSString *videoPath = [[NSBundle mainBundle] pathForResource:@"IMG_0879" ofType:@"MOV"];
    NSURL *videoURL = [NSURL fileURLWithPath:videoPath];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.requestedTimeToleranceAfter = kCMTimeZero;
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * FPS; i++) {
        @autoreleasepool {
            CMTime time = CMTimeMake(i, FPS);
            NSError *err;
            CMTime actualTime;
            CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
            UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
            [self saveImage:generatedImage atTime:actualTime]; // Intended to save to the documents directory, not memory
            CGImageRelease(image);
        }
    }
}
-(void)saveImage:(UIImage *)image atTime:(CMTime)time
{
    Float64 t = CMTimeGetSeconds(time); // was time.timescale, which is the ticks-per-second unit, not the time
    NSMutableDictionary *dict = [[NSMutableDictionary alloc] init];
    [dict setObject:image forKey:@"image"];
    [dict setObject:[NSString stringWithFormat:@"%f", t] forKey:@"time"];
    [arrFrame addObject:dict]; // Note: this keeps every frame in memory
    NSLog(@"ArrayImg=%@", arrFrame);
}
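The crash is most likely memory exhaustion from keeping every UIImage in arrFrame. A minimal sketch of a disk-backed variant, where only a file path per frame stays in memory (the frame_%f.png naming scheme is an assumption for illustration):

-(void)saveImage:(UIImage *)image atTime:(CMTime)time
{
    // Write the frame to the documents directory, keyed by its time in seconds.
    Float64 seconds = CMTimeGetSeconds(time);
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                          NSUserDomainMask, YES) firstObject];
    NSString *path = [docs stringByAppendingPathComponent:
                          [NSString stringWithFormat:@"frame_%f.png", seconds]];
    [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
    // Remember only the path and time, not the decoded image.
    [arrFrame addObject:@{ @"path" : path, @"time" : @(seconds) }];
}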
My requirement is:
1. I need to play a video.
2. While the video is playing, I have to pause it and get the image (frame) at the pause time.
3. I have to edit the captured image and save it, replacing the original frame.
4. When I play the video again, the edited image should appear in the video.
Please give me an example if you have one. I have downloaded many projects and examples from links on Stack Overflow and other sites, but none is right, or even fulfills 20% of my requirement.
Please share any example or ideas you have.
I will be obliged to you. Thanks in advance.
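For the last step of the pipeline described above, writing the video back out from the edited frames, here is a minimal sketch using AVAssetWriter. All names (outputURL, width, height) are assumptions for illustration, and the UIImage-to-CVPixelBuffer conversion is elided:

NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                            AVVideoWidthKey  : @(width),
                            AVVideoHeightKey : @(height) };
AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                               outputSettings:settings];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                             sourcePixelBufferAttributes:nil];
[writer addInput:input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];
// For each frame, wait until input.readyForMoreMediaData is YES, convert the
// (edited) UIImage to a CVPixelBuffer, and append it at its presentation time:
//     [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
[input markAsFinished];
[writer finishWritingWithCompletionHandler:^{ /* check writer.status here */ }];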
I am using the code below to create a video thumbnail from a URL. The code works, but it takes a long time and jams the app until the image has been created.
Here is my code:
NSString *one = self.currentList.videoLink;
NSURL * imageURL = [NSURL URLWithString:one];
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:imageURL options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(2,1);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *oneme = [[UIImage alloc] initWithCGImage:oneRef];
CGImageRelease(oneRef); // copyCGImageAtTime follows the Create rule; release it to avoid a leak
[self.videoImage setImage:oneme];
self.videoImage.contentMode = UIViewContentModeScaleToFill;
As I said, the code works fine. Can anyone help me solve the delay in creating the thumbnail?
Thanks, and I hope the question is clear.
If the one URL is a remote URL, then you are networking synchronously, and that alone is a lot of your problem right there. You're blocking the main thread while you network ("jamming the app", as you put it). Network properly, with NSURLSession or AFNetworking or whatever, so that you don't block the main thread.
(By the way, blocking the main thread for long enough will get your app killed by the watchdog on a device. You might not even get into the App Store if Apple notices you're doing that.)
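Independently of the networking, the frame decode itself can also be moved off the main thread: AVAssetImageGenerator has an asynchronous API. A minimal sketch, reusing generate1 and self.videoImage from the question:

// Request the frame asynchronously; the handler runs on a background queue.
CMTime time = CMTimeMake(2, 1);
[generate1 generateCGImagesAsynchronouslyForTimes:@[ [NSValue valueWithCMTime:time] ]
                                completionHandler:^(CMTime requestedTime,
                                                    CGImageRef imageRef,
                                                    CMTime actualTime,
                                                    AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *thumb = [UIImage imageWithCGImage:imageRef];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.videoImage setImage:thumb]; // UIKit work goes back on the main thread
        });
    }
}];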
I've been going through a lot of code for this, but I've not found a working solution yet. I'm using Xcode 5.1.1 and an iPhone Retina 4-inch. I want to get a single image from a video by tapping on it. After that, I'll edit the image and apply the effects to the entire video. Thank you.
UPDATE:
I've found the code below for the same purpose. It's not working on the simulator either. Can somebody tell me what the problem is?
-(UIImage *)generateThumbImage:(NSString *)filepath
{
    NSURL *url = [NSURL fileURLWithPath:filepath];
    AVAsset *asset = [AVAsset assetWithURL:url];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    CMTime time = [asset duration];
    time.value = 0; // ask for the very first frame
    NSError *error = nil;
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:&error];
    if (!imageRef) {
        NSLog(@"thumbnail failed: %@", error); // passing error:NULL hides this failure
        return nil;
    }
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef); // CGImageRef won't be released by ARC
    return thumbnail;
}
This code generates an image from a particular time in the loaded video and converts it from a CGImage to a UIImage.
The code is commented to explain its parts.
- (UIImage *)loadImage {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:vidURL options:nil]; // Create an object for the video.
    AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60); // Select a time of 1/60th of a second.
    CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err]; // Get the image at that time.
    NSLog(@"err==%@, imageRef==%@", err, imgRef); // If something is not as expected, the error is logged here.
    UIImage *result = [[UIImage alloc] initWithCGImage:imgRef]; // Convert from CGImage to UIImage so it can go straight into an image view.
    CGImageRelease(imgRef); // copyCGImageAtTime follows the Create rule; ARC won't release it.
    return result;
}
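copyCGImageAtTime blocks until the frame is decoded, so a caller would typically run loadImage off the main thread. A possible usage (the imageView outlet is hypothetical):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *frame = [self loadImage]; // the synchronous decode happens here
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = frame; // update UIKit back on the main thread
    });
});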
Have you tried the AVFoundation framework?
That's what it's for. The documentation is on the Apple site:
AV Foundation framework provides essential services for working with
time-based audiovisual media on iOS and OS X. Through a modern
Objective-C interface, you can easily play, capture, edit, or encode
media formats such as QuickTime movies and MPEG-4 files.
I am trying to create a sharing extension to share videos using the new iOS 8 app extensions. I want to open the Photos app and share the video via my extension.
I get the video URL with the following code:
NSExtensionItem *inputItem = self.extensionContext.inputItems.firstObject;
NSItemProvider *provider = inputItem.attachments[0];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    if ([provider hasItemConformingToTypeIdentifier:(NSString *)kUTTypeQuickTimeMovie])
    {
        [provider loadItemForTypeIdentifier:@"com.apple.quicktime-movie" options:nil completionHandler:^(NSURL *path, NSError *error){
            if (path)
            {
                dispatch_async(dispatch_get_main_queue(), ^{
                    _moviePath = path;
                });
            }
        }];
    }
});
But I only get the video file URL:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0062.MOV
I also want to get more of the video's attributes, like its size and duration.
I used the following code, but it did not work:
NSDictionary *opts = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:_moviePath options:opts];
int second = urlAsset.duration.value / urlAsset.duration.timescale;
and this code:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:_moviePath];
CMTime duration = playerItem.duration;
float seconds = CMTimeGetSeconds(duration);
NSLog(@"duration: %.2f", seconds);
Neither of them works, and Xcode reports:
Error [IRForTarget]: Call to a symbol-only function 'memset' that is not present in the target
error: 0 errors parsing expression
error: The expression could not be prepared to run in the target
Can you help me get the video attributes? Thank you very much!
UPDATE: The following code now works:
NSDictionary *opts = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO]
                                                 forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:_moviePath options:opts];
int second = urlAsset.duration.value / urlAsset.duration.timescale;
And I don't know why it did not work before!
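A likely explanation: loadItemForTypeIdentifier's completion handler runs asynchronously, so _moviePath was probably still nil the first time the duration code ran; reading the attributes from inside (or after) that handler works. A minimal sketch that also gets the file size, assuming _moviePath has been set by then:

// Duration in seconds from the asset, file size in bytes from the file system.
AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:_moviePath options:nil];
Float64 seconds = CMTimeGetSeconds(urlAsset.duration);
NSDictionary *attrs = [[NSFileManager defaultManager]
                          attributesOfItemAtPath:_moviePath.path error:NULL];
unsigned long long bytes = [attrs fileSize];
NSLog(@"duration: %.2f s, size: %llu bytes", seconds, bytes);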
How can I filter a video that is saved in the photo library on iOS?
I got the URLs of the videos in the library using the AssetsLibrary framework, then made a preview for the video.
As a next step, I want to filter the video using CIFilter.
For the real-time case, I implemented the filtering with AVCaptureVideoDataOutputSampleBufferDelegate.
But for a saved video, I don't know how to do the filtering.
Do I use AVAsset? If so, how do I filter it, and how do I save the result?
Thank you, as always.
I hope this will help you:
NSError *error = nil;
AVAsset *theAVAsset = [[AVURLAsset alloc] initWithURL:mNormalVideoURL options:nil];
NSArray *videoTracks = [theAVAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
float width = videoTrack.naturalSize.width;   // naturalSize on AVAsset is deprecated; read it from the track
float height = videoTrack.naturalSize.height;
mPrefferdTransform = [videoTrack preferredTransform];
AVAssetReader *mAssetReader = [[AVAssetReader alloc] initWithAsset:theAVAsset error:&error];
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
AVAssetReaderTrackOutput *mAssetReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:options];
[mAssetReader addOutput:mAssetReaderOutput];
[mAssetReader startReading]; // without this, the status never becomes AVAssetReaderStatusReading
CMSampleBufferRef buffer = NULL;
while ([mAssetReader status] == AVAssetReaderStatusReading) {
    buffer = [mAssetReaderOutput copyNextSampleBuffer]; // read the next frame
    if (buffer) {
        // ... process the frame here ...
        CFRelease(buffer); // copyNextSampleBuffer follows the Create rule
    }
}
[mAssetReaderOutput release]; // MRC, as in the original; release only after reading is done
[mAssetReader release];
[theAVAsset release];
You should have a look at CMSampleBufferGetImageBuffer(buffer): it returns a CVImageBufferRef, from which you can get the base address of the image data and apply your filter to the pixel buffer. But I find the performance is not good. If you have any new ideas, we can discuss them further.
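To answer the "how do I filter it" part: each decoded buffer can be run through Core Image and rendered back into the same pixel buffer. A minimal sketch (CISepiaTone is an arbitrary example filter; ciContext is an assumed, reusable CIContext, and the buffer is BGRA as configured above):

// Wrap the pixel buffer in a CIImage, filter it, and render back into the buffer.
CVPixelBufferRef pixBuf = CMSampleBufferGetImageBuffer(buffer);
CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixBuf];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:@0.8 forKey:kCIInputIntensityKey];
CIImage *outputImage = [filter outputImage];
// ciContext should be created once, e.g. [CIContext contextWithOptions:nil].
[ciContext render:outputImage toCVPixelBuffer:pixBuf];
// The filtered buffer can then be appended to an
// AVAssetWriterInputPixelBufferAdaptor to save the result as a new movie.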