I'm using the code below to take a series of UIImages and convert them into a .mov. For some reason I keep getting a green border at the bottom or on the right side, depending on whether it's a landscape or portrait photo. The images are 215x320.
How can I remove the green borders? Is there a better way of creating a .mov from UIImages?
- (void)createMov
{
UIImage *first = [self.videoFrames objectAtIndex:0];
CGSize frameSize = first.size;
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:fileName] fileType:AVFileTypeQuickTimeMovie
error:&error];
if (error) {
NSLog(#"error creating AssetWriter: %#",[error description]);
}
int numPixels = first.size.width * first.size.height;
int bitsPerSecond = numPixels * 11.04;
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:frameSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:frameSize.height], AVVideoHeightKey,
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInteger:bitsPerSecond], AVVideoAverageBitRateKey,
[NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey,
nil], AVVideoCompressionPropertiesKey,
nil];
AVAssetWriterInput* writerInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
NSMutableDictionary *attributes = [[NSMutableDictionary alloc] init];
[attributes setObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB] forKey: (NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.width] forKey:(NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.height] forKey:(NSString*)kCVPixelBufferHeightKey];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:attributes];
[videoWriter addInput:writerInput];
// fixes all errors
writerInput.expectsMediaDataInRealTime = YES;
//Start a session:
BOOL start = [videoWriter startWriting];
NSLog(#"Session started? %d", start);
[videoWriter startSessionAtSourceTime:kCMTimeZero];
// Writing.
CVPixelBufferRef buffer = NULL;
buffer = [self pixelBufferFromCGImage:[first CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
if (result == NO) //fails on 3GS, but works on iPhone 4
NSLog(@"failed to append buffer");
if(buffer)
CVBufferRelease(buffer);
[NSThread sleepForTimeInterval:0.05];
int fps = [Utils frameRate];
int i = 0;
for (UIImage *imgFrame in self.videoFrames) {
i = [self addFrame:adaptor videoWriter:videoWriter buffer:buffer imgFrame:imgFrame i:i fps:fps];
}
//Finish the session:
[writerInput markAsFinished];
[videoWriter finishWriting];
CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
[videoWriter release];
[writerInput release];
}
- (int)addFrame:(AVAssetWriterInputPixelBufferAdaptor *)adaptor videoWriter:(AVAssetWriter *)videoWriter buffer:(CVPixelBufferRef)buffer imgFrame:(UIImage *)imgFrame i:(int)i fps: (float)fps
{
if (adaptor.assetWriterInput.readyForMoreMediaData) {
i++;
NSLog(#"inside for loop %d",i);
CMTime frameTime = CMTimeMake(1, fps);
CMTime lastTime=CMTimeMake(i, fps);
CMTime presentTime=CMTimeAdd(lastTime, frameTime);
buffer = [self pixelBufferFromCGImage:[imgFrame CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
if (result == NO) { //fails on 3GS, but works on iPhone 4
NSLog(@"failed to append buffer");
NSLog(@"The error is %@", [videoWriter error]);
}
if (buffer) {
CVBufferRelease(buffer);
}
[NSThread sleepForTimeInterval:0.05];
} else {
NSLog(#"error");
i--;
}
[NSThread sleepForTimeInterval:0.02];
return i;
}
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, CGImageGetWidth(image),
CGImageGetHeight(image), kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
&pxbuffer);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
size_t rowBytes = CVPixelBufferGetBytesPerRow(pxbuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, CGImageGetWidth(image),
CGImageGetHeight(image), 8, rowBytes, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
It seems I had a similar issue, and I got an idea after reading the two posts below.
http://forum.videohelp.com/threads/314973-green-line-at-bottom-of-video-window
http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Features
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithUnsignedLong:sourceWidth], AVVideoWidthKey,
[NSNumber numberWithUnsignedLong:sourceHeight], AVVideoHeightKey,
nil];
AVAssetWriterInput *writerInput =
[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
Before you start writing data with AVAssetWriter, adjust sourceWidth and sourceHeight with this method:
size_t fixedLineValue(size_t sourceValue){
size_t divide = sourceValue%4;
size_t finalValue = sourceValue;
if (divide) {
finalValue=(sourceValue/4+1)*4;
}
return finalValue;
}
I can't say for sure why this works; most likely the H.264 encoder pads frame dimensions up to a block boundary, and the uninitialized padding shows up as the green border, so rounding the dimensions avoids the padding. In any case, after doing this the ugly green borders were removed. Please comment if there's a better answer and let us know. Thanks.
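For completeness, this is how the rounding might be applied before building the writer settings (a minimal sketch of my own; `image` stands for the first source UIImage):

size_t sourceWidth = fixedLineValue((size_t)image.size.width);
size_t sourceHeight = fixedLineValue((size_t)image.size.height);
// Create the CVPixelBuffers with these same rounded dimensions, and draw the
// image scaled to fill the whole buffer, so the padded edge is never left uninitialized.

With the 215x320 input from the question, this yields 216x320.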
Related
I want to animate images smoothly while converting them to video. Despite searching SO, I am unable to understand how to achieve it. I tried changing the rotation angle (CGAffineTransformMakeRotation), translations, and scaling, but didn't find a way to get smooth animations. Here's how I am converting the array of photos to video:
- (void)createVideoWithArrayImages:(NSMutableArray*)images size:(CGSize)size time:(float)time output:(NSURL*)output {
//getting a random path
NSError *error;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:output fileType:AVFileTypeMPEG4 error: &error];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput: videoWriterInput sourcePixelBufferAttributes:nil];
videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput: videoWriterInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
//convert uiimage to CGImage.
NSInteger fps = 30;
int frameCount = 0;
for(UIImage *img in images) {
//for(VideoFrame * frm in imageArray)
NSLog(#"**************************************************");
//UIImage * img = frm._imageFrame;
buffer = [self videoPixelBufferFromCGImage:[img CGImage] andSize:size andAngle:(int)[images indexOfObject:img]];
double numberOfSecondsPerFrame = time / images.count;
double frameDuration = fps * numberOfSecondsPerFrame;
BOOL append_ok = NO;
int j = 0;
while (!append_ok && j < fps) {
if (adaptor.assetWriterInput.readyForMoreMediaData) {
//print out status:
NSLog(#"Processing video frame (%d,%d)",frameCount,(int)[images count]);
CMTime frameTime = CMTimeMake(frameCount * frameDuration,(int32_t) fps);
NSLog(#"Frame Time : %f", CMTimeGetSeconds(frameTime));
append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
if(!append_ok) {
NSError *error = videoWriter.error;
if(error!=nil) {
NSLog(#"Unresolved error %#,%#.", error, [error userInfo]);
}
}
}
else {
printf("adaptor not ready %d, %d\n", frameCount, j);
[NSThread sleepForTimeInterval:0.1];
}
j++;
}
if (!append_ok) {
printf("error appending image %d times %d\n, with error.", frameCount, j);
}
frameCount++;
NSLog(#"**************************************************");
}
[videoWriterInput markAsFinished];
[videoWriter finishWriting];
videoWriter = nil;
if(buffer != NULL)
CVPixelBufferRelease(buffer);
NSLog(#"************ write standard video successful ************");
}
Here the CVPixelBufferRef is returned as follows:
- (CVPixelBufferRef)videoPixelBufferFromCGImage: (CGImageRef) image andSize:(CGSize) size andAngle:(int)angle {
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, [NSNumber numberWithBool:YES],kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options, &pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4*size.width, rgbColorSpace, kCGImageAlphaPremultipliedFirst);
NSParameterAssert(context);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
I tried adding translations to the CVPixelBufferRef, but it didn't work out for me. Any guide or help would be very useful.
You don't specify exactly what you want to achieve, but I've used AVVideoCompositionCoreAnimationTool to include animated CALayers in an AVMutableVideoComposition.
Here is the reference to that class
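For example, a minimal sketch (my own illustration, not production code; `videoURL` and the animation values are placeholders):

AVAsset *asset = [AVAsset assetWithURL:videoURL]; // the video you already wrote
AVMutableVideoComposition *composition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, composition.renderSize.width, composition.renderSize.height);
videoLayer.frame = parentLayer.frame;
[parentLayer addSublayer:videoLayer];
// Any Core Animation attached to layers in this tree is rendered into the output.
CALayer *overlayLayer = [CALayer layer];
overlayLayer.frame = parentLayer.frame;
CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = [NSNumber numberWithFloat:0.0f];
fade.toValue = [NSNumber numberWithFloat:1.0f];
fade.duration = 2.0;
fade.beginTime = AVCoreAnimationBeginTimeAtZero; // use this instead of 0.0 in video compositions
fade.removedOnCompletion = NO;
[overlayLayer addAnimation:fade forKey:@"fade"];
[parentLayer addSublayer:overlayLayer];
composition.animationTool = [AVVideoCompositionCoreAnimationTool
videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
// Assign `composition` as the videoComposition of an AVAssetExportSession to render the result.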
Hope that helps!
I have an NSMutableArray of images from which I need to create a video and save it in .mp4 format.
I tried code found in many places to produce a video from images, but it does not work: it creates a blank file.
-(void)writeImageAsMovie:(NSArray *)array toPath:(NSString*)path size:(CGSize)size duration:(int)duration
{
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* writerInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
//convert uiimage to CGImage.
for (int i = 0; i < [array count]; i++) { //NSMutableArray
buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:i] CGImage]];
NSLog(#"%#",[[array objectAtIndex:i] CGImage]);
}
[adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
//Write samples:
//Finish the session:
[writerInput markAsFinished];
}
-(CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, self.view.frame.size.width,
self.view.frame.size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
&pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, self.view.frame.size.width,
self.view.frame.size.height, 8, 4*self.view.frame.size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
NSParameterAssert(context);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
What is wrong?
Can anyone help me to find a solution please?
Is there any other way that I can create video from images?
Use [videoWriter finishWriting]; after [writerInput markAsFinished]; in the writeImageAsMovie method.
In iOS 7 it is deprecated, so use
[videoWriter finishWritingWithCompletionHandler:^(){
NSLog(@"finished writing");
}];
This is very important for the video creation; it solved the issue.
Thanks to all for the help.
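Also note (my observation, beyond what the question asked): the posted loop creates a buffer for every image but only appends the last one, and always at kCMTimeZero. A minimal sketch of a corrected write phase, assuming 30 fps:

int32_t fps = 30; // assumed frame rate
for (int i = 0; i < [array count]; i++) {
CVPixelBufferRef buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:i] CGImage]];
while (!writerInput.readyForMoreMediaData) { // wait until the input can accept data
[NSThread sleepForTimeInterval:0.05];
}
// Each frame gets its own presentation time, i/fps seconds into the movie.
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(i, fps)];
CVPixelBufferRelease(buffer);
}
[writerInput markAsFinished];
[videoWriter finishWritingWithCompletionHandler:^(){
NSLog(@"finished writing");
}];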
I have used the following code to create a video from images.
This code works fine when I select images from the camera roll that were downloaded from the web or are screenshots, but images taken with the camera appear zoomed in in the movie.
I don't know what is wrong with the camera images.
Can anyone please help me resolve this issue?
-(IBAction)createV:(id)sender
{
NSString *documentsDirectory = [NSHomeDirectory()
stringByAppendingPathComponent:@"Documents"];
NSString *videoOutputPath = [documentsDirectory stringByAppendingPathComponent:@"test_output.mp4"];
CGSize imageSize = [DatabaseAccess getusersetsize];
double nospf =[[[NSUserDefaults standardUserDefaults] valueForKey:@"duration"] intValue];
NSUInteger fps = 10;
NSMutableArray *imageArray;// = [DatabaseAccess getimagelist:@"select imgname from tbl_userimage"];
NSArray* imagePaths = [DatabaseAccess getimagelist:@"select imgname,strftime('%d-%m-%Y', tdate) as tdate from tbl_userimage"];
imageArray = [[NSMutableArray alloc] initWithCapacity:imagePaths.count];
int i=0;
for (NSString* path in [imagePaths valueForKey:@"image"] )
{
if ([[NSUserDefaults standardUserDefaults] boolForKey:@"disdate"])
{
CGSize imgsize = [DatabaseAccess getusersetsize];
//[imageArray addObject:[[DatabaseAccess drawText:[[imagePaths valueForKey:@"date"] objectAtIndex:i] inImage:[UIImage imageWithContentsOfFile:[DatabaseAccess documentsPathForFileName:path]] atPoint:CGPointMake(imgsize.width-250,imgsize.height-60) ] fixOrientation]];
[imageArray addObject:[DatabaseAccess drawText:[[imagePaths valueForKey:@"date"] objectAtIndex:i] inImage:[UIImage imageWithContentsOfFile:[DatabaseAccess documentsPathForFileName:path]] atPoint:CGPointMake(imgsize.width-250,imgsize.height-60) ]];
}
else
{
[imageArray addObject:[UIImage imageWithContentsOfFile:[DatabaseAccess documentsPathForFileName:path]]];
NSLog(#"%#",path);
// [imageArray addObject:[UIImage imageNamed:path]];
}
i++;
}
[self exportImages:imageArray asVideoToPath:videoOutputPath withFrameSize:imageSize framesPerSecond:fps numberOfSecondsPerFrame:nospf];
}
- (void)exportImages:(NSMutableArray *)imageArray asVideoToPath:(NSString *)videoOutputPath withFrameSize:(CGSize)imageSize framesPerSecond:(NSUInteger)fps numberOfSecondsPerFrame:(double)numberOfSecondsPerFrame {
NSError *error = nil;
NSString *documentsDirectory = [NSHomeDirectory()
stringByAppendingPathComponent:#"Documents"];
NSFileManager *fileMgr = [NSFileManager defaultManager];
if ([fileMgr removeItemAtPath:videoOutputPath error:&error] != YES)
NSLog(#"Unable to delete file: %#", [error localizedDescription]);
////////////// end setup ///////////////////////////////////
NSLog(#"Start building video from defined frames.");
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:videoOutputPath] fileType:AVFileTypeMPEG4 error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:imageSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:imageSize.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(videoWriterInput);
NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput:videoWriterInput];
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
//convert uiimage to CGImage.
int frameCount = 0;
//double numberOfSecondsPerFrame = 6;
double frameDuration = fps * numberOfSecondsPerFrame;
//for(VideoFrame * frm in imageArray)
NSLog(#"**************************************************");
for(UIImage * img in imageArray)
{
//UIImage * img = frm._imageFrame;
buffer = [self pixelBufferFromCGImage:[img CGImage]];
BOOL append_ok = NO;
int j = 0;
while (!append_ok && j < 30) {
if (adaptor.assetWriterInput.readyForMoreMediaData) {
//print out status:
NSLog(#"Processing video frame (%d,%lu)",frameCount,(unsigned long)[imageArray count]);
CMTime frameTime = CMTimeMake(frameCount,(int32_t) numberOfSecondsPerFrame);
//CMTime frameTime = CMTimeMake(frameCount*frameDuration,(int32_t) fps);
// NSLog(#"%#",frameTime);
append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
if(!append_ok){
NSError *error = videoWriter.error;
if(error!=nil) {
NSLog(#"Unresolved error %#,%#.", error, [error userInfo]);
}
}
}
else {
printf("adaptor not ready %d, %d\n", frameCount, j);
[NSThread sleepForTimeInterval:0.1];
}
j++;
}
if (!append_ok) {
printf("error appending image %d times %d\n, with error.", frameCount, j);
}
frameCount++;
}
NSLog(#"**************************************************");
//Finish the session:
[videoWriterInput markAsFinished];
[videoWriter finishWriting];
NSLog(#"Write Ended");
[self playMovie:videoOutputPath];
}
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image {
// CGSize size = CGSizeMake(400, 200);
CGSize size = [DatabaseAccess getusersetsize];
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
size.width,
size.height,
kCVPixelFormatType_32ARGB,
(__bridge CFDictionaryRef) options,
&pxbuffer);
if (status != kCVReturnSuccess){
NSLog(#"Failed to create pixel buffer");
}
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
size.height, 8, 4*size.width, rgbColorSpace,
kCGImageAlphaPremultipliedFirst);
//kCGImageAlphaNoneSkipFirst);
NSParameterAssert(context);
//CGContextConcatCTM(context, frameTransform);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
I have solved the image zoomed-in issue using this code:
-(UIImage*)scaleImage:(UIImage*)image toSize:(CGSize)newSize
{
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
Resize the UIImage before converting it to a CGImage, and make sure you resize the image width to a multiple of 16.
CGSize your_size = CGSizeMake(1600, 800);
UIImage *tempImg = [self scaleImage:img toSize:your_size];
buffer = [self pixelBufferFromCGImage:[tempImg CGImage]];
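If your source sizes vary, you could round the width up yourself (a small helper of my own, analogous to the fixedLineValue function earlier in this thread, just with 16 instead of 4):

size_t roundUpToMultipleOf16(size_t value) {
size_t remainder = value % 16;
return remainder ? value + (16 - remainder) : value;
}

CGSize your_size = CGSizeMake(roundUpToMultipleOf16((size_t)img.size.width),
roundUpToMultipleOf16((size_t)img.size.height));
UIImage *tempImg = [self scaleImage:img toSize:your_size];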
In my application, I am converting an array of images to a video. For that, I am using the code below.
NSFileManager *fileMgr = [NSFileManager defaultManager];
NSString *documentsDirectory = [NSHomeDirectory()
stringByAppendingPathComponent:@"Documents"];
NSString *videoOutputPath = [documentsDirectory stringByAppendingPathComponent:@"test_output.mp4"];
//NSLog(@"-->videoOutputPath= %@", videoOutputPath);
// get rid of existing mp4 if exists...
if ([fileMgr removeItemAtPath:videoOutputPath error:&error] != YES)
NSLog(#"Unable to delete file: %#", [error localizedDescription]);
NSUInteger fps = 30;
////////////// end setup ///////////////////////////////////
NSLog(#"Start building video from defined frames.");
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:videoOutputPath] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:288], AVVideoWidthKey,
[NSNumber numberWithInt:352], AVVideoHeightKey,
nil];
AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(videoWriterInput);
NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput:videoWriterInput];
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
//convert uiimage to CGImage.
int frameCount = 0;
//for(VideoFrame * frm in imageArray)
NSLog(#"**************************************************");
for(UIImage * img in finalArrayVal)
{
//UIImage * img = frm._imageFrame;
buffer = [self pixelBufferFromCGImage:[img CGImage]];
BOOL append_ok = NO;
int j = 0;
while (!append_ok && j < 30) {
if (adaptor.assetWriterInput.readyForMoreMediaData) {
//print out status:
NSLog(#"Processing video frame (%d,%lu)",frameCount,(unsigned long)[finalArrayVal count]);
CMTime frameTime = CMTimeMake(frameCount,(int32_t) fps);
append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
if(!append_ok){
NSError *error = videoWriter.error;
if(error!=nil) {
NSLog(#"Unresolved error %#,%#.", error, [error userInfo]);
}
}
}
else {
printf("adaptor not ready %d, %d\n", frameCount, j);
[NSThread sleepForTimeInterval:0.1];
}
j++;
}
if (!append_ok) {
printf("error appending image %d times %d\n, with error.", frameCount, j);
}
frameCount++;
}
NSLog(#"**************************************************");
//Finish the session:
[videoWriterInput markAsFinished];
[videoWriter finishWriting];
It converts the array of images to a video file, but it crops part of the image, exactly as shown below, when I convert it to video.
Input image:
Output video screenshot:
You can see the upper portion of the image is largely cut off. Note that I have set AVCaptureSessionPreset352x288 as the defaultAVCaptureSessionPreset for the camera. I need the full image converted into the video frames. Can anyone provide a solution to the issue stated above?
After studying this, I replaced the line below so as to pass an additional parameter (the size of the image):
buffer = [myController pixelBufferFromImage:[img CGImage] andSize:imageSize];
Instead of this :
buffer = [self pixelBufferFromCGImage:[img CGImage]];
And added this method in myController
+ (CVPixelBufferRef) pixelBufferFromImage: (CGImageRef) image andSize:(CGSize) size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
size.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,&pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
size.height, 8, 4*size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
NSParameterAssert(context);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
I have done image-to-video conversion on iPhone (of course, I got the code from Stack Overflow questions). The problem is that the speed of the recorded video is very fast: it plays through in about 2 seconds even though I have around 2250 frames. I know the problem is with the frame rate, but I don't know how to correct it.
My code is below:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectoryPath = [paths objectAtIndex:0];
NSString *myFilePath = [documentsDirectoryPath stringByAppendingPathComponent:@"test.mov"];
if ([self openVideoFile:myFilePath withSize:CGSizeMake (480.0, 320.0)]) {
for (int i=1; i<2226; i++) {
NSString *imagename=[NSString stringWithFormat:@"1 (%i).jpg",i];
UIImage *image=[ UIImage imageNamed:imagename];
[self writeImageToMovie:[image CGImage]];
}
[videoWriter finishWriting];
}
else {
NSLog(#"friled to open video file");
}
This code is in the calling function; the definitions of the functions are given below.
- (BOOL) openVideoFile: (NSString *) path withSize:(CGSize)imageSize {
CGSize size = CGSizeMake (480.0, 320.0);//imageSize;
NSError *error = nil;
videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
if (error != nil){
NSLog(#"error>>>> %#",error);
return NO;
}
NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithDouble:size.width], AVVideoCleanApertureWidthKey,
[NSNumber numberWithDouble:size.height], AVVideoCleanApertureHeightKey,
[NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
[NSNumber numberWithInt:10], AVVideoCleanApertureVerticalOffsetKey,
nil];
NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:1], AVVideoPixelAspectRatioHorizontalSpacingKey,
[NSNumber numberWithInt:1],AVVideoPixelAspectRatioVerticalSpacingKey,
nil];
NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
videoCleanApertureSettings, AVVideoCleanApertureKey,
videoAspectRatioSettings, AVVideoPixelAspectRatioKey,
nil];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
codecSettings,AVVideoCompressionPropertiesKey,
[NSNumber numberWithDouble:size.width], AVVideoWidthKey,
[NSNumber numberWithDouble:size.height], AVVideoHeightKey,
nil];
writerInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
NSMutableDictionary * bufferAttributes = [[NSMutableDictionary alloc] init];
[bufferAttributes setObject: [NSNumber numberWithInt: kCVPixelFormatType_32ARGB]
forKey: (NSString *) kCVPixelBufferPixelFormatTypeKey];
[bufferAttributes setObject: [NSNumber numberWithInt: 480]
forKey: (NSString *) kCVPixelBufferWidthKey];
[bufferAttributes setObject: [NSNumber numberWithInt: 320]
forKey: (NSString *) kCVPixelBufferHeightKey];
adaptor = [[AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:nil] retain];
NSMutableDictionary* attributes;
attributes = [NSMutableDictionary dictionary];
int width = 480;
int height = 320;
[attributes setObject:[NSNumber numberWithInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithInt:width] forKey: (NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithInt:height] forKey: (NSString*)kCVPixelBufferHeightKey];
CVReturn theError = CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL, (CFDictionaryRef) attributes, &pixelBufferPool);
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];
writerInput.expectsMediaDataInRealTime = YES;
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
buffer = NULL;
lastTime = kCMTimeZero;
presentTime = kCMTimeZero;
return YES;
}
- (void) writeImageToMovie:(CGImageRef)image
{
if([writerInput isReadyForMoreMediaData])
{
buffer = [self pixelBufferFromCGImage:image];
BOOL success = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
if (!success) NSLog(#"Failed to appendPixelBuffer");
CVPixelBufferRelease(buffer);
presentTime = CMTimeAdd(lastTime, CMTimeMake(1, 1000));//I think problem is here but what will be given for correct output
lastTime = presentTime;
}
else
{
NSLog(#"error - writerInput not ready");
}
}
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
CGSize size = CGSizeMake (480.0, 320.0);
CVPixelBufferRef pxbuffer;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
if (pixelBufferPool == NULL) NSLog(@"pixelBufferPool is null!");
CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, pixelBufferPool, &pxbuffer);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
size.height, 8, 4*size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(90, 10, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
What should I do with the CMTime variables, and how can I set them correctly?
One more thing: how can I add audio to this video?
Your PTSs are very close together. Instead of CMTimeMake(1, 1000), why not 30 fps: CMTimeMake(1, 30)?
I revised your code and came to a solution. In my case I capture an image of my view every 0.1 seconds, so each frame should last 0.1 seconds (10 fps):
presentTime = CMTimeAdd(lastTime, CMTimeMake(1, 10));
The reason your video is too fast is that CMTimeMake(1, 1000) advances the presentation time by only 1/1000 of a second per image. You can make it 1 image per second like this:
presentTime = CMTimeAdd(lastTime, CMTimeMake(1, 1));
Try this out; hope it helps you.
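In general, CMTimeMake(value, timescale) represents value/timescale seconds, so the time you add per frame should be 1/fps. A minimal sketch (`fps` is whatever playback rate you want, my assumption being 30):

int32_t fps = 30; // desired playback rate
CMTime frameDuration = CMTimeMake(1, fps); // each frame lasts 1/30 s
presentTime = CMTimeAdd(lastTime, frameDuration);
lastTime = presentTime;
// At 30 fps, the asker's 2250 frames would play for 2250 / 30 = 75 seconds.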
Add these frameworks: AVFoundation, CoreMedia, and CoreVideo.
I am able to generate the video, but not able to get audio. Has anyone used this code to create a video with audio?
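Regarding the audio question (a sketch under assumptions, since none of the code above handles audio): you can add a second AVAssetWriterInput for audio to the same writer and append audio CMSampleBufferRefs to it, for example ones read from a file with AVAssetReader:

AudioChannelLayout channelLayout;
memset(&channelLayout, 0, sizeof(channelLayout));
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
[NSNumber numberWithFloat:44100.0f], AVSampleRateKey,
[NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[NSData dataWithBytes:&channelLayout length:sizeof(channelLayout)], AVChannelLayoutKey,
[NSNumber numberWithInt:128000], AVEncoderBitRateKey,
nil];
AVAssetWriterInput *audioInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioSettings];
[videoWriter addInput:audioInput]; // must be added before startWriting
// Then append sample buffers via [audioInput appendSampleBuffer:...]
// and call [audioInput markAsFinished] before finishing the writer.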