I've tried to make a video app like Vine using AVFoundation. I can now save video through AVCaptureVideoDataOutput and play it back, but somehow the audio doesn't work and I don't know why.
I'm a beginner at iOS development, so my explanation may not be clear. I hope you understand what I'm trying to say and can give me some tips.
This is the code I'm using.
Setting up AVCaptureVideoDataOutput and AVCaptureAudioDataOutput:
AVCaptureVideoDataOutput* videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[CaptureSession addOutput:videoDataOutput];
videoDataOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA],kCVPixelBufferPixelFormatTypeKey,
nil];
dispatch_queue_t videoQueue = dispatch_queue_create("VideoQueue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
AVCaptureAudioDataOutput *audioDataOutput = [[AVCaptureAudioDataOutput alloc] init];
[CaptureSession addOutput:audioDataOutput];
dispatch_queue_t audioQueue = dispatch_queue_create("AudioQueue", NULL);
[audioDataOutput setSampleBufferDelegate:self queue:audioQueue];
Setting up AVAssetWriter and AVAssetWriterInput:
- (void)makeWriter{
pathString = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/capture.mov"];
exportURL = [NSURL fileURLWithPath:pathString];
if ([[NSFileManager defaultManager] fileExistsAtPath:exportURL.path])
{
[[NSFileManager defaultManager] removeItemAtPath:exportURL.path error:nil];
}
NSError* error;
writer = [[AVAssetWriter alloc] initWithURL:exportURL
fileType:AVFileTypeQuickTimeMovie
error:&error];
NSDictionary* videoSetting = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:1280], AVVideoWidthKey,
[NSNumber numberWithInt:720], AVVideoHeightKey,
nil];
videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSetting];
videoWriterInput.expectsMediaDataInRealTime = YES;
// Add the audio input
AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary* audioOutputSettings = nil;
// Both types of audio settings cause the output video file to be corrupted.
if( NO ) {
// should work from iphone 3GS on and from ipod 3rd generation
audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithInt: kAudioFormatMPEG4AAC ], AVFormatIDKey,
[ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
[ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
[ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
[ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
nil];
} else {
// should work on any device, but requires more space
audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey,
[ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
[ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
[ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
[ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
nil ];
}
audioWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType: AVMediaTypeAudio
outputSettings: audioOutputSettings ];
audioWriterInput.expectsMediaDataInRealTime = YES;
// add input
[writer addInput:videoWriterInput];
[writer addInput:audioWriterInput];
}
And finally the CaptureOutput code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
if ((isPause) && (isRecording)) { return; }
if( !CMSampleBufferDataIsReady(sampleBuffer) ){return;}
if( isRecording == YES ) {
isWritting = YES;
if( writer.status != AVAssetWriterStatusWriting ) {
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];
}
if( [videoWriterInput isReadyForMoreMediaData] ) {
CFRetain(sampleBuffer);
CMSampleBufferRef newSampleBuffer = [self offsetTimmingWithSampleBufferForVideo:sampleBuffer];
[videoWriterInput appendSampleBuffer:newSampleBuffer];
CFRelease(sampleBuffer);
CFRelease(newSampleBuffer);
}
writeFrames++;
}
}
- (CMSampleBufferRef)offsetTimmingWithSampleBufferForVideo:(CMSampleBufferRef)sampleBuffer
{
CMSampleBufferRef newSampleBuffer;
CMSampleTimingInfo sampleTimingInfo;
sampleTimingInfo.duration = CMTimeMake(1, 30);
sampleTimingInfo.presentationTimeStamp = CMTimeMake(writeFrames, 30);
sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
sampleBuffer,
1,
&sampleTimingInfo,
&newSampleBuffer);
return newSampleBuffer;
}
At least one problem is that you append all of the sample buffers to the videoWriterInput. You need to append the samples coming from the audio output to the audioWriterInput.
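Something like this rough sketch is what I mean; it assumes the videoDataOutput/audioDataOutput objects and the two writer inputs from your code are kept as instance variables, and it only illustrates the routing, it is not a drop-in replacement:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
    if( !CMSampleBufferDataIsReady(sampleBuffer) || !isRecording ){ return; }
    if( writer.status != AVAssetWriterStatusWriting ) {
        [writer startWriting];
        // start the session at the first buffer's timestamp so audio and video keep their relative timing
        [writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if( captureOutput == videoDataOutput && videoWriterInput.readyForMoreMediaData ) {
        // video buffers go to the video writer input
        [videoWriterInput appendSampleBuffer:sampleBuffer];
    } else if( captureOutput == audioDataOutput && audioWriterInput.readyForMoreMediaData ) {
        // audio buffers go to the audio writer input
        [audioWriterInput appendSampleBuffer:sampleBuffer];
    }
}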
You should check out this SO question and answer:
performance-issues-when-using-avcapturevideodataoutput-and-avcaptureaudiodataout
These can also be useful:
http://red-glasses.com/index.php/tutorials/ios4-take-photos-with-live-video-preview-using-avfoundation/
https://github.com/benlodotcom/MyAVControllerDemo
streaming video FROM an iPhone
Here I am getting sample buffers using an asset reader and then processing each frame for customization purposes. But the audio is missing from the final video saved to Documents. I know there is another way to process each frame, "applyingCIFiltersWithHandler", but I need each sample buffer so I can render an image or a filter over it. Can you suggest a solution for this?
NSError *error;
NSString *path = [[NSBundle mainBundle] pathForResource:@"recordmovie" ofType:@"mov"];
NSURL *videoURL = [NSURL fileURLWithPath:path];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
// add audio track here
AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
NSDictionary *readerOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange], kCVPixelBufferPixelFormatTypeKey, nil];
CGSize renderSize = [videoTrack naturalSize];
/*
NSDictionary *readerOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264 , AVVideoCodecKey,
renderSize.width , AVVideoWidthKey,
renderSize.height , AVVideoHeightKey,
AVVideoScalingModeResizeAspectFill,AVVideoScalingModeKey, nil];
*/
AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
outputSettings:readerOutputSettings];
AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary* audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithInt: kAudioFormatMPEG4AAC], AVFormatIDKey,
[ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
[ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
[ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
[ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
nil];
NSDictionary *settings = @{ AVFormatIDKey : [NSNumber numberWithInt:kAudioFormatLinearPCM] };
AVAssetReaderTrackOutput *audioTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:settings];
[reader addOutput:readerOutput];
[reader addOutput:audioTrackOutput];
[reader startReading];
NSMutableArray *samples = [[NSMutableArray alloc] init];
CMSampleBufferRef sample;
while((sample = [readerOutput copyNextSampleBuffer])) {
[samples addObject:(__bridge id)sample];
CFRelease(sample);
}
NSString *outputPath = [self getDocumentsUrlForFilterMovie];
NSURL *outputURL = [NSURL fileURLWithPath:outputPath];
AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:outputURL
fileType:AVFileTypeQuickTimeMovie
error:&error];
NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
@(videoTrack.estimatedDataRate), AVVideoAverageBitRateKey,
nil];
NSDictionary *writerOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:videoTrack.naturalSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:videoTrack.naturalSize.height], AVVideoHeightKey,
videoCompressionProps, AVVideoCompressionPropertiesKey,
nil];
AVAssetWriterInput *writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
outputSettings:writerOutputSettings
sourceFormatHint:(__bridge CMFormatDescriptionRef)[videoTrack.formatDescriptions lastObject]];
[writerInput setExpectsMediaDataInRealTime:NO];
[writer addInput:writerInput];
AVAssetWriterInput *WriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
WriterAudioInput.expectsMediaDataInRealTime = YES;
if([writer canAddInput:WriterAudioInput]) {
[writer addInput:WriterAudioInput];
}
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil];
[writer startWriting];
[writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)samples[0])];
//NSMutableArray *audioSamples = [[NSMutableArray alloc] init];
while((sample = [audioTrackOutput copyNextSampleBuffer])) {
//[audioSamples addObject:(__bridge id)sample];
[WriterAudioInput appendSampleBuffer:sample];
while (!WriterAudioInput.readyForMoreMediaData) {
[NSThread sleepForTimeInterval:0.1];
}
CFRelease(sample);
}
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
[filter setDefaults];
[filter setValue:@(1) forKey:kCIInputIntensityKey];
//CIImage *outputImage = filter.outputImage;
for(NSInteger i = 0; i < samples.count; i++) {
CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)samples[i]);
//CVPixelBufferRef videoFrameBuffer = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)samples[samples.count - i - 1]);
CVPixelBufferRef videoFrameBuffer = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)samples[i]);
CIImage *frameImage = [CIImage imageWithCVPixelBuffer:videoFrameBuffer];
[filter setValue:frameImage forKey:kCIInputImageKey];
CIImage *outputImage = filter.outputImage;
//}
[self->ciContext render:outputImage toCVPixelBuffer:videoFrameBuffer bounds:outputImage.extent colorSpace:self->colorSpace];
while (!writerInput.readyForMoreMediaData) {
[NSThread sleepForTimeInterval:0.1];
}
// [writerInput appendSampleBuffer:videoFrameBuffer];
[pixelBufferAdaptor appendPixelBuffer:videoFrameBuffer withPresentationTime:presentationTime];
}
[writerInput markAsFinished];
[writer finishWritingWithCompletionHandler:^(){
//[self.delegate didFinishReverse:YES andVideoURL:outputURL withError:error];
NSLog(@"Finish video rendering");
}];
});
I had missed appending audio to the audio asset writer input. I fixed the issue by appending the audio sample buffers as well.
Here I am adding code that gets the audio and video sample buffers from an existing video, then writes and saves the result to the local Documents directory. You can apply filters and render images over specific frames, or over all frames, within a given frame area.
NSError *error;
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetReader *assetReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:nil];
AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
AVAssetTrack *audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
NSDictionary *videoReaderOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange], kCVPixelBufferPixelFormatTypeKey, nil];
AVAssetReaderTrackOutput* assetReaderVideoTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoReaderOutputSettings];
AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary* audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithInt: kAudioFormatMPEG4AAC], AVFormatIDKey,
[ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
[ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
[ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
[ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
nil];
NSDictionary *audioDecodesettings = @{ AVFormatIDKey : [NSNumber numberWithInt:kAudioFormatLinearPCM] };
AVAssetReaderTrackOutput *assetReaderAudioTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:audioDecodesettings];
[assetReader addOutput:assetReaderVideoTrackOutput];
[assetReader addOutput:assetReaderAudioTrackOutput];
[assetReader startReading];
NSMutableArray *samples = [[NSMutableArray alloc] init];
CMSampleBufferRef sample;
while((sample = [assetReaderVideoTrackOutput copyNextSampleBuffer])) {
[samples addObject:(__bridge id)sample];
CFRelease(sample);
}
NSString *outputPath = [self getDocumentsUrlForFilterMovie];
NSURL *outputURL = [NSURL fileURLWithPath:outputPath];
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
fileType:AVFileTypeQuickTimeMovie
error:&error];
NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
@(videoTrack.estimatedDataRate), AVVideoAverageBitRateKey,
nil];
NSDictionary *writerOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:videoTrack.naturalSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:videoTrack.naturalSize.height], AVVideoHeightKey,
videoCompressionProps, AVVideoCompressionPropertiesKey,
nil];
AVAssetWriterInput *videoWriterInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
outputSettings:writerOutputSettings
sourceFormatHint:(__bridge CMFormatDescriptionRef)[videoTrack.formatDescriptions lastObject]];
[videoWriterInput setExpectsMediaDataInRealTime:NO];
[assetWriter addInput:videoWriterInput];
AVAssetWriterInput *audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
audioWriterInput.expectsMediaDataInRealTime = YES;
if([assetWriter canAddInput:audioWriterInput]) {
[assetWriter addInput:audioWriterInput];
}
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:videoWriterInput sourcePixelBufferAttributes:nil];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)samples[0])];
while((sample = [assetReaderAudioTrackOutput copyNextSampleBuffer])) {
[audioWriterInput appendSampleBuffer:sample];
while (!audioWriterInput.readyForMoreMediaData) {
[NSThread sleepForTimeInterval:0.1];
}
CFRelease(sample);
}
for(NSInteger i = 0; i < samples.count; i++) {
CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)samples[i]);
CVPixelBufferRef videoFrameBuffer = nil;
if(frameRenderType == KVideoNormal) {
videoFrameBuffer = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)samples[i]);
} else if (frameRenderType == KVideoReverse) {
videoFrameBuffer = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)samples[samples.count - i - 1]);
}
if(self.filters.count > 0) {
CIImage *frameImage = [CIImage imageWithCVPixelBuffer:videoFrameBuffer];
for(CIFilter *filter in self.filters) {
[filter setValue:frameImage forKey:kCIInputImageKey];
frameImage = filter.outputImage;
}
[self->ciContext render:frameImage toCVPixelBuffer:videoFrameBuffer bounds:frameImage.extent colorSpace:self->colorSpace];
}
while (!videoWriterInput.readyForMoreMediaData) {
[NSThread sleepForTimeInterval:0.1];
}
[pixelBufferAdaptor appendPixelBuffer:videoFrameBuffer withPresentationTime:presentationTime];
}
[videoWriterInput markAsFinished];
[assetWriter finishWritingWithCompletionHandler:^(){
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(@"Finished video processing");
});
}];
});
I'm using AVAssetReader and AVAssetWriter to reverse an audio file. However, the resulting reversed audio is very jerky.
What's the best practice for reversing an audio file?
Any help is much appreciated.
-(void)reverseAudio:(NSURL *)videoURL andVideoAsset:(AVURLAsset *)videoAsset{
AVAssetReader *video2AssetReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:nil];
video2AssetReader.timeRange = CMTimeRangeFromTimeToTime(kCMTimeZero, [videoAsset duration]);
NSArray *audioTracks = [videoAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack* audioTrack = [audioTracks objectAtIndex:0];
NSDictionary *outputSettingsDict = [[NSDictionary alloc] initWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM],AVFormatIDKey,
[NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsNonInterleaved,
nil];
AVAssetReaderTrackOutput *readerAudioTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:outputSettingsDict];
[video2AssetReader addOutput:readerAudioTrackOutput];
[video2AssetReader startReading];
// read in the samples
NSMutableArray *audioSamples = [[NSMutableArray alloc] init];
CMSampleBufferRef audioSample;
while((audioSample = [readerAudioTrackOutput copyNextSampleBuffer])){
[audioSamples addObject:(__bridge id)audioSample];
CFRelease(audioSample);
}
videoReverseProcess3TotalFrames = audioSamples.count;
NSLog(@"AUDIO SAMPLES COUNT = %f", videoReverseProcess3TotalFrames);
[video2AssetReader cancelReading];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *videoPath = [documentsDirectory stringByAppendingPathComponent:@"videoReverseAudioFile.m4a"];
NSError *error = nil;
if([[NSFileManager defaultManager] fileExistsAtPath:videoPath]){
[[NSFileManager defaultManager] removeItemAtPath:videoPath error:&error];
if(error){
NSLog(@"VIDEO DELETE FAILED");
}
else{
NSLog(@"VIDEO DELETED");
}
}
NSURL *audioExportURL = [[NSURL alloc] initFileURLWithPath:videoPath];
AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:audioExportURL fileType:AVFileTypeAppleM4A error:&error];
AudioChannelLayout channelLayout;
memset(&channelLayout, 0, sizeof(AudioChannelLayout));
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
NSDictionary *audioCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt: kAudioFormatMPEG4AAC], AVFormatIDKey,
[NSNumber numberWithFloat:44100.0], AVSampleRateKey,
[NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[NSNumber numberWithInt:128000], AVEncoderBitRateKey,
[NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey, nil];
AVAssetWriterInput *writerAudioInput;
writerAudioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
writerAudioInput.expectsMediaDataInRealTime = NO;
if([writer canAddInput:writerAudioInput]){
[writer addInput:writerAudioInput];
}
else{
NSLog(@"ERROR ADDING AUDIO");
}
[writer startWriting];
CMTime timeStamp = CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)audioSamples[0]);
[writer startSessionAtSourceTime:timeStamp];
while(audioSamples.count > 0){
if(writer && writerAudioInput.readyForMoreMediaData){
CMSampleBufferRef audioBufferRef = (__bridge CMSampleBufferRef)audioSamples[audioSamples.count - 1];
[writerAudioInput appendSampleBuffer:audioBufferRef];
[audioSamples removeObjectAtIndex:audioSamples.count - 1];
}
else{
[NSThread sleepForTimeInterval:0.2];
}
}
if(writer.status != AVAssetWriterStatusCancelled){
[writerAudioInput markAsFinished];
[writer finishWritingWithCompletionHandler:^{
}];
}
}
You are not reversing the audio, you are just reversing the order of the audio fragments (buffers).
So for the input S1, S2, S3, S4 you produce the output S4, S3, S2, S1, but inside these fragments the frames are still in their original order.
You need to reverse the buffer data too.
Update #1
Here is an example of how you can do this.
- (void)reverseAudio:(AVURLAsset *)videoAsset {
AVAssetReader *video2AssetReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:nil];
video2AssetReader.timeRange = CMTimeRangeFromTimeToTime(kCMTimeZero, [videoAsset duration]);
NSArray *audioTracks = [videoAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack* audioTrack = [audioTracks objectAtIndex:0];
NSDictionary *outputSettingsDict = [[NSDictionary alloc] initWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM],AVFormatIDKey,
[NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsNonInterleaved,
nil];
AVAssetReaderTrackOutput *readerAudioTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:outputSettingsDict];
[video2AssetReader addOutput:readerAudioTrackOutput];
[video2AssetReader startReading];
// read in the samples
CMTime timeStamp = kCMTimeInvalid;
NSMutableArray *audioSamples = [[NSMutableArray alloc] init];
CMSampleBufferRef audioSample;
while ((audioSample = [readerAudioTrackOutput copyNextSampleBuffer])) {
[audioSamples addObject:(__bridge id)[self reverseSampleBuffer:audioSample]];
if (CMTIME_IS_INVALID(timeStamp)) {
timeStamp = CMSampleBufferGetPresentationTimeStamp(audioSample);
}
CFRelease(audioSample);
}
NSLog(@"AUDIO SAMPLES COUNT = %d", (int)audioSamples.count);
[video2AssetReader cancelReading];
// rest of the code
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *videoPath = [documentsDirectory stringByAppendingPathComponent:@"videoReverseAudioFile.m4a"];
NSError *error = nil;
if ([[NSFileManager defaultManager] fileExistsAtPath:videoPath]) {
[[NSFileManager defaultManager] removeItemAtPath:videoPath error:&error];
if (error) {
NSLog(@"VIDEO DELETE FAILED");
} else {
NSLog(@"VIDEO DELETED");
}
}
NSURL *audioExportURL = [[NSURL alloc] initFileURLWithPath:videoPath];
AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:audioExportURL fileType:AVFileTypeAppleM4A error:&error];
AudioChannelLayout channelLayout;
memset(&channelLayout, 0, sizeof(AudioChannelLayout));
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
NSDictionary *audioCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
[NSNumber numberWithFloat:44100.0], AVSampleRateKey,
[NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[NSNumber numberWithInt:128000], AVEncoderBitRateKey,
[NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey, nil];
AVAssetWriterInput *writerAudioInput;
writerAudioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
writerAudioInput.expectsMediaDataInRealTime = NO;
if ([writer canAddInput:writerAudioInput]) {
[writer addInput:writerAudioInput];
} else {
NSLog(@"ERROR ADDING AUDIO");
}
[writer startWriting];
[writer startSessionAtSourceTime:timeStamp];
while (audioSamples.count > 0) {
if(writer && writerAudioInput.readyForMoreMediaData) {
CMSampleBufferRef audioBufferRef = (__bridge CMSampleBufferRef)audioSamples[audioSamples.count - 1];
[writerAudioInput appendSampleBuffer:audioBufferRef];
[audioSamples removeObjectAtIndex:audioSamples.count - 1];
} else {
[NSThread sleepForTimeInterval:0.2];
}
}
if (writer.status != AVAssetWriterStatusCancelled) {
[writerAudioInput markAsFinished];
[writer finishWritingWithCompletionHandler:^{
}];
}
}
- (CMSampleBufferRef)reverseSampleBuffer:(CMSampleBufferRef)buffer {
AudioBufferList list;
CMBlockBufferRef dataBuffer = NULL;
// TODO check result code
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(buffer,
NULL,
&list,
sizeof(list),
NULL,
NULL,
kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
&dataBuffer);
CMItemCount numberOfSamples = CMSampleBufferGetNumSamples(buffer);
for (int i = 0; i < list.mNumberBuffers; i++) {
SInt16 *samples = (SInt16 *)list.mBuffers[i].mData;
for (int j = 0; j < numberOfSamples / 2; j++) {
SInt16 t = samples[j];
samples[j] = samples[numberOfSamples - 1 - j];
samples[numberOfSamples - 1 - j] = t;
}
}
CMFormatDescriptionRef format = CMSampleBufferGetFormatDescription(buffer);
CMSampleBufferRef result = NULL;
// TODO check result code
CMSampleBufferCreate(kCFAllocatorDefault, dataBuffer, true, NULL, NULL, format, 0, 0, NULL, 0, NULL, &result);
return result;
}
I want to record an audio file and upload it to a server in .wav format, but the recorder doesn't allow me to record a file in wav format.
For recording I have used this code:
NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey];
// Initiate and prepare the recorder
recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:nil];
recorder.delegate = self;
recorder.meteringEnabled = YES;
[recorder prepareToRecord];
Another way I have found is to convert the file from caf to wav after recording, but this is not working for me either.
I have used this code to convert the file from caf to wav:
-(BOOL)exportAssetAsWaveFormat:(NSURL*)filePath
{
NSError *error = nil ;
NSDictionary *audioSetting = [NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithFloat:44100.0], AVSampleRateKey,
[ NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[ NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
[ NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
[ NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
[ NSNumber numberWithBool:0], AVLinearPCMIsBigEndianKey,
[ NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
[ NSData data], AVChannelLayoutKey, nil ];
// NSString *audioFilePath = filePath;
AVURLAsset * URLAsset = [[AVURLAsset alloc] initWithURL:recorder.url options:nil];
if (!URLAsset) return NO ;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:URLAsset error:&error];
if (error) return NO;
NSArray *tracks = [URLAsset tracksWithMediaType:AVMediaTypeAudio];
if (![tracks count]) return NO;
AVAssetReaderAudioMixOutput *audioMixOutput = [AVAssetReaderAudioMixOutput
assetReaderAudioMixOutputWithAudioTracks:tracks
audioSettings :audioSetting];
if (![assetReader canAddOutput:audioMixOutput]) return NO ;
[assetReader addOutput :audioMixOutput];
if (![assetReader startReading]) return NO;
NSString *title = @"WavConverted";
NSArray *docDirs = NSSearchPathForDirectoriesInDomains (NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docDir = [docDirs objectAtIndex: 0];
NSString *outPath = [[docDir stringByAppendingPathComponent :title]
stringByAppendingPathExtension:@"wav"];
NSURL *outURL = [NSURL fileURLWithPath:outPath];
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outURL
fileType:AVFileTypeWAVE
error:&error];
if (error) return NO;
AVAssetWriterInput *assetWriterInput = [ AVAssetWriterInput assetWriterInputWithMediaType :AVMediaTypeAudio
outputSettings:audioSetting];
assetWriterInput. expectsMediaDataInRealTime = NO;
if (![assetWriter canAddInput:assetWriterInput]) return NO ;
[assetWriter addInput :assetWriterInput];
if (![assetWriter startWriting]) return NO;
[assetWriter startSessionAtSourceTime:kCMTimeZero ];
dispatch_queue_t queue = dispatch_queue_create( "assetWriterQueue", NULL );
[assetWriterInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
NSLog(@"start");
while (1)
{
if ([assetWriterInput isReadyForMoreMediaData]) {
CMSampleBufferRef sampleBuffer = [audioMixOutput copyNextSampleBuffer];
if (sampleBuffer) {
[assetWriterInput appendSampleBuffer :sampleBuffer];
CFRelease(sampleBuffer);
} else {
[assetWriterInput markAsFinished];
break;
}
}
}
[assetWriter finishWriting];
NSLog(@"finish %@", assetWriter);
}];
return YES;
// dispatch_release(queue);
}
Thanks in advance.
Any help would be really appreciated.
You are uploading the file to a server, so I recommend not using the wav format, because it is larger than most other formats. Use caf, m4a, or another compressed format instead.
You can record a wav file by setting AVFormatIDKey to kAudioFormatLinearPCM; no other encoding format will work.
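For example, a minimal Linear PCM settings dictionary could look like the sketch below (the 44.1 kHz sample rate, 2 channels, and 16-bit depth are just reasonable defaults, not requirements); pass it as the settings when creating the AVAudioRecorder and make sure the output URL ends in .wav:
// hypothetical `audioSetting` dictionary for recording Linear PCM (wav)
NSDictionary *audioSetting = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                              [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                              [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                              [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                              nil];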
NSURL *url = [NSURL fileURLWithPath: outPath];
NSError *err = nil;
audioRecorder = [[ AVAudioRecorder alloc] initWithURL:url
settings: audioSetting
error:&err];
//prepare to record
[audioRecorder setDelegate:self];
[audioRecorder prepareToRecord];
audioRecorder.meteringEnabled = YES;
[audioRecorder recordForDuration:(NSTimeInterval)10000000000];
[audioRecorder record];
Try this:
NSString *wavFilePath = [[NSBundle mainBundle] pathForResource:@"sampleaudio" ofType:@"wav"];
NSURL *assetURL = [NSURL fileURLWithPath:wavFilePath];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
NSError *assetError = nil;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:songAsset
error:&assetError]
;
if (assetError) {
NSLog (@"error: %@", assetError);
return;
}
AVAssetReaderOutput *assetReaderOutput = [AVAssetReaderAudioMixOutput
assetReaderAudioMixOutputWithAudioTracks:songAsset.tracks
audioSettings: nil];
if (! [assetReader canAddOutput: assetReaderOutput]) {
NSLog (@"can't add reader output... die!");
return;
}
[assetReader addOutput: assetReaderOutput];
NSString *strcafFileName = [NSString stringWithFormat:@"%@.caf",[wavFilePath stringByDeletingPathExtension]];
NSString *cafFilePath = [delegate.strCassettePathSide stringByAppendingPathComponent:strcafFileName];
NSURL *exportURL = [NSURL fileURLWithPath:cafFilePath];
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:exportURL
fileType:AVFileTypeCoreAudioFormat
error:&assetError];
if (assetError)
{
NSLog (@"error: %@", assetError);
return;
}
AudioChannelLayout channelLayout;
memset(&channelLayout, 0, sizeof(AudioChannelLayout));
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
[NSNumber numberWithFloat:11025], AVSampleRateKey,
[NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
[NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
nil];
AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
outputSettings:outputSettings];
if ([assetWriter canAddInput:assetWriterInput])
{
[assetWriter addInput:assetWriterInput];
}
else
{
NSLog(@"can't add asset writer input... die!");
return;
}
assetWriterInput.expectsMediaDataInRealTime = NO;
[assetWriter startWriting];
[assetReader startReading];
AVAssetTrack *soundTrack = [songAsset.tracks objectAtIndex:0];
CMTime startTime = CMTimeMake (0, soundTrack.naturalTimeScale);
[assetWriter startSessionAtSourceTime: startTime];
__block UInt64 convertedByteCount = 0;
dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[assetWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue
usingBlock: ^
{
while (assetWriterInput.readyForMoreMediaData)
{
CMSampleBufferRef nextBuffer = [assetReaderOutput copyNextSampleBuffer];
if (nextBuffer)
{
// append buffer
[assetWriterInput appendSampleBuffer: nextBuffer];
convertedByteCount += CMSampleBufferGetTotalSampleSize (nextBuffer);
CMSampleBufferInvalidate(nextBuffer);
CFRelease(nextBuffer);
nextBuffer = NULL;
}
else
{
[assetWriterInput markAsFinished];
// [assetWriter finishWriting];
[assetReader cancelReading];
break;
}
}
}];
I'm trying to capture a movie using AVAssetWriter. On the iPhone 5 everything is fine; it captures and saves the movie like a charm.
But when I try to capture a movie on the iPhone 4, the sample buffer skips some frames and the movie comes out badly.
So, this is my code:
- (void) initCaptureSession{
// openSession and set quality to 1280x720
session = [[AVCaptureSession alloc] init];
if([session canSetSessionPreset:AVCaptureSessionPreset640x480]) session.sessionPreset = AVCaptureSessionPresetHigh;
// get devices for audio and video
deviceVideo = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
deviceAudio = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
// create input of audio and video
inputVideo = [AVCaptureDeviceInput deviceInputWithDevice:deviceVideo error:&error];
if (!inputVideo) NSLog(@"ERROR: trying to open camera: %@", error);
inputAudio = [AVCaptureDeviceInput deviceInputWithDevice:deviceAudio error:&error];
if (!inputAudio) NSLog(@"ERROR: trying to open audio: %@", error);
// CMTime maxDuration = CMTimeMake(60, 1);
// create output audio and video
outputVideo = [[AVCaptureVideoDataOutput alloc] init];
outputVideo.alwaysDiscardsLateVideoFrames = NO;
outputVideo.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
outputAudio = [[AVCaptureAudioDataOutput alloc] init];
// add inputs and outputs in the current session
[session beginConfiguration];
if ([session canAddInput:inputVideo])[session addInput:inputVideo];
if ([session canAddInput:inputAudio])[session addInput:inputAudio];
if ([session canAddOutput:outputVideo]) [session addOutput:outputVideo];
if ([session canAddOutput:outputAudio]) [session addOutput:outputAudio];
[session commitConfiguration];
// turn off the torch
[deviceVideo lockForConfiguration:&error];
if([deviceVideo hasTorch] && [deviceVideo isTorchModeSupported:AVCaptureTorchModeOff]) [deviceVideo setTorchMode:AVCaptureTorchModeOff];
[deviceVideo unlockForConfiguration];
[self configDevice];
// create the preview view to show the video
captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[captureVideoPreviewLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
[captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
captureVideoPreviewLayer.frame = viewPreview.bounds;
[viewPreview.layer addSublayer:captureVideoPreviewLayer];
CALayer *viewLayer = viewPreview.layer;
[viewLayer setMasksToBounds:YES];
[captureVideoPreviewLayer setFrame:[viewLayer bounds]];
[viewLayer addSublayer:captureVideoPreviewLayer];
// dispatch outputs to delegate in a queue
dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[outputVideo setSampleBufferDelegate:self queue:queue];
[outputAudio setSampleBufferDelegate:self queue:queue];
// dispatch_release(queue);
[session startRunning];
}
-(BOOL) setupWriter{
urlOutput = [self tempFileURL];
NSError *error = nil;
videoWriter = [[AVAssetWriter alloc] initWithURL:urlOutput fileType:AVFileTypeMPEG4 error:&error];
NSParameterAssert(videoWriter);
// Add metadata
NSArray *existingMetadataArray = videoWriter.metadata;
NSMutableArray *newMetadataArray = nil;
if (existingMetadataArray) {
newMetadataArray = [existingMetadataArray mutableCopy];
} else {
newMetadataArray = [[NSMutableArray alloc] init];
}
AVMutableMetadataItem *mutableItemLocation = [[AVMutableMetadataItem alloc] init];
mutableItemLocation.keySpace = AVMetadataKeySpaceCommon;
mutableItemLocation.key = AVMetadataCommonKeyLocation;
mutableItemLocation.value = [NSString stringWithFormat:@"%+08.4lf%+09.4lf/", location.latitude, location.longitude];
AVMutableMetadataItem *mutableItemModel = [[AVMutableMetadataItem alloc] init];
mutableItemModel.keySpace = AVMetadataKeySpaceCommon;
mutableItemModel.key = AVMetadataCommonKeyModel;
mutableItemModel.value = [[UIDevice currentDevice] model];
[newMetadataArray addObject:mutableItemLocation];
[newMetadataArray addObject:mutableItemModel];
videoWriter.metadata = newMetadataArray;
// video Configuration
NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:640], AVVideoCleanApertureWidthKey,
[NSNumber numberWithInt:360], AVVideoCleanApertureHeightKey,
[NSNumber numberWithInt:2], AVVideoCleanApertureHorizontalOffsetKey,
[NSNumber numberWithInt:2], AVVideoCleanApertureVerticalOffsetKey,
nil];
NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:1], AVVideoPixelAspectRatioHorizontalSpacingKey,
[NSNumber numberWithInt:1],AVVideoPixelAspectRatioVerticalSpacingKey,
nil];
NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:1024000], AVVideoAverageBitRateKey,
[NSNumber numberWithInt:90],AVVideoMaxKeyFrameIntervalKey,
videoCleanApertureSettings, AVVideoCleanApertureKey,
videoAspectRatioSettings, AVVideoPixelAspectRatioKey,
AVVideoProfileLevelH264Main30, AVVideoProfileLevelKey,
nil];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
codecSettings,AVVideoCompressionPropertiesKey,
[NSNumber numberWithInt:640], AVVideoWidthKey,
[NSNumber numberWithInt:360], AVVideoHeightKey,
nil];
videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
NSParameterAssert(videoWriterInput);
videoWriterInput.expectsMediaDataInRealTime = YES;
// Add the audio input
AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
NSDictionary* audioOutputSettings = nil;
// Both types of audio settings cause the output video file to be corrupted.
// if( NO ) {
// should work from iphone 3GS on and from ipod 3rd generation
audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithInt: kAudioFormatMPEG4AAC ], AVFormatIDKey,
[ NSNumber numberWithInt: 2 ], AVNumberOfChannelsKey,
[ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
[ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
[ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
nil];
// } else {
// // should work on any device requires more space
// audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:
// [ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey,
// [ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
// [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
// [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
// [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
// nil ];
// }
audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeAudio outputSettings: audioOutputSettings];
audioWriterInput.expectsMediaDataInRealTime = YES;
// add input
[videoWriter addInput:videoWriterInput];
[videoWriter addInput:audioWriterInput];
return YES;
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
if( !CMSampleBufferDataIsReady(sampleBuffer) ){
NSLog( @"sample buffer is not ready. Skipping sample" );
return;
}
if(isRecording == YES ){
lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if(videoWriter.status != AVAssetWriterStatusWriting ){
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:lastSampleTime];
}
if( captureOutput == outputVideo ){
[self newVideoSample:sampleBuffer];
} else if( captureOutput == outputAudio) {
[self newAudioSample:sampleBuffer];
}
}
}
-(void) newVideoSample:(CMSampleBufferRef)sampleBuffer{
if( isRecording ){
if( videoWriter.status > AVAssetWriterStatusWriting ) {
NSLog(@"Warning: writer status is %d", videoWriter.status);
if( videoWriter.status == AVAssetWriterStatusFailed )
NSLog(@"Error: %@", videoWriter.error);
return;
}
while (!videoWriterInput.readyForMoreMediaData) {
NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
[[NSRunLoop currentRunLoop] runUntilDate:maxDate];
}
if( ![videoWriterInput appendSampleBuffer:sampleBuffer] )
NSLog(@"Unable to write to video input");
}
}
-(void) newAudioSample:(CMSampleBufferRef)sampleBuffer{
if( isRecording ){
if( videoWriter.status > AVAssetWriterStatusWriting ) {
NSLog(@"Warning: writer status is %d", videoWriter.status);
if( videoWriter.status == AVAssetWriterStatusFailed )
NSLog(@"Error: %@", videoWriter.error);
return;
}
while (!audioWriterInput.readyForMoreMediaData) {
NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
[[NSRunLoop currentRunLoop] runUntilDate:maxDate];
}
if( ![audioWriterInput appendSampleBuffer:sampleBuffer] )
NSLog(@"Unable to write to audio input");
}
}
-(void) startVideoRecording {
if( !isRecording ){
NSLog(@"start video recording...");
if( ![self setupWriter] ) {
NSLog(@"Setup Writer Failed");
return;
}
isRecording = YES;
recorded = NO;
}
}
-(void) stopVideoRecording {
if( isRecording ) {
isRecording = NO;
btRecord.hidden = NO;
btRecording.hidden = YES;
[timerToRecord invalidate];
timerToRecord = nil;
// [session stopRunning];
[videoWriter finishWritingWithCompletionHandler:^{
if (videoWriter.status != AVAssetWriterStatusFailed && videoWriter.status == AVAssetWriterStatusCompleted) {
videoWriterInput = nil;
audioWriterInput = nil;
videoWriter = nil;
NSLog(@"finishWriting returned successful");
recorded = YES;
} else {
NSLog(@"finishWriting returned unsuccessful");
}
}];
NSLog(@"video recording stopped");
[self performSelector:@selector(openPlayer) withObject:nil afterDelay:0.5];
}
}
When I remove these lines:
while (!audioWriterInput.readyForMoreMediaData) {
NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
[[NSRunLoop currentRunLoop] runUntilDate:maxDate];
}
I got this error:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** -[AVAssetWriterInput appendSampleBuffer:] A sample buffer cannot be appended when readyForMoreMediaData is NO.'
On the iPhone 5 I'm not using this loop.
I read some examples here, but I didn't understand how to make the movie smoother on the iPhone 4.
If anyone has a suggestion or a full example of making movies with AVAssetWriter on the iPhone 3GS, iPhone 4, iPhone 4S and iPhone 5, I would be very grateful.
Thanks
After a week of fighting with AVFoundation, and after watching WWDC 2012 - Session 520, I came up with a good solution.
First I record the movie using AVCaptureMovieFileOutput with the session preset AVCaptureSessionPreset640x480.
After recording, the user chooses whether to save and share, just save, or delete the movie.
If the user wants to save (or save and share), I take the recorded movie and compress it separately:
first I compress the video, then I compress the audio, and finally I merge the tracks.
See my code:
-(void)exportMediaWithURL:(NSURL *)url location:(CLLocationCoordinate2D)location mirror:(BOOL)mirror{
urlMedia = url;
locationMedia = location;
videoRecorded = NO;
audioRecorded = NO;
asset = [AVAsset assetWithURL:urlMedia];
progressVideo = 0.0;
progressAudio = 0.0;
progressMarge = 0.0;
progressFactor = 3.0;
mirrored = mirror;
limitTime = CMTimeMake(1000*60, 1000);
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^() {
NSError *error;
AVKeyValueStatus stats = [asset statusOfValueForKey:@"tracks" error:&error];
if(stats == AVKeyValueStatusLoaded){
if([[asset tracksWithMediaType:AVMediaTypeVideo] count] > 0) video_track = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
if([[asset tracksWithMediaType:AVMediaTypeAudio] count] > 0) audio_track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
if(!audio_track) progressFactor = 1.0;
if(video_track){
if (CMTimeCompare(asset.duration, limitTime) > 0) {
totalTime = limitTime;
}else{
totalTime = asset.duration;
}
[self exportVideo];
}
}
}];
}
-(void)exportVideo{
NSError *error;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
AVAssetReaderOutput *videoOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:video_track outputSettings:videoSettings];
[assetReader addOutput:videoOutput];
assetReader.timeRange = CMTimeRangeMake(kCMTimeZero, totalTime);
// start session to make a movie
if (assetVideoWriter.status == AVAssetWriterStatusUnknown) {
if ([self setupWriterVideo]) {
if ([assetVideoWriter startWriting]) {
[assetVideoWriter startSessionAtSourceTime:kCMTimeZero];
}
}
}
if([assetReader startReading]){
BOOL videoDone = NO;
CMSampleBufferRef bufferVideo;
while (!videoDone) {
if ([assetReader status]== AVAssetReaderStatusReading ) bufferVideo = [videoOutput copyNextSampleBuffer];
if(bufferVideo){
[self newVideoSample:bufferVideo];
CFRelease(bufferVideo);
}else{
videoDone = YES;
}
}
// finish
[videoWriterInput markAsFinished];
[assetVideoWriter finishWritingWithCompletionHandler:^{}];
// workaround to resolve the dealloc problem when using a block to delegate something
while (!videoRecorded) {
if (assetVideoWriter.status == AVAssetWriterStatusCompleted) {
videoWriterInput = nil;
assetVideoWriter = nil;
videoRecorded = YES;
if (audio_track) {
[self exportAudio];
}else{
NSMutableDictionary *infoToSend = [NSMutableDictionary new];
[infoToSend setValue:urlOutputVideo forKey:@"url_media"];
[[NSNotificationCenter defaultCenter] postNotificationName:EXPORT_STATUS_DONE object:self userInfo:infoToSend];
}
}
}
}
}
-(void)exportAudio{
NSError *error;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];
NSDictionary* audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey, nil];
AVAssetReaderOutput *audioOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audio_track outputSettings:audioSettings];
[assetReader addOutput:audioOutput];
assetReader.timeRange = CMTimeRangeMake(kCMTimeZero, totalTime);
// start session to make a movie
if (assetAudioWriter.status == AVAssetWriterStatusUnknown) {
if ([self setupWriterAudio]) {
if ([assetAudioWriter startWriting]) {
[assetAudioWriter startSessionAtSourceTime:kCMTimeZero];
}
}
}
if([assetReader startReading]){
BOOL audioDone = NO;
CMSampleBufferRef bufferAudio;
while (!audioDone) {
if ([assetReader status]== AVAssetReaderStatusReading ) bufferAudio = [audioOutput copyNextSampleBuffer];
if(bufferAudio){
[self newAudioSample:bufferAudio];
CFRelease(bufferAudio);
}else{
audioDone = YES;
}
}
// finish
[audioWriterInput markAsFinished];
[assetAudioWriter finishWritingWithCompletionHandler:^{}];
// workaround to resolve the dealloc problem when using a block to delegate something
while (!audioRecorded) {
if (assetAudioWriter.status == AVAssetWriterStatusCompleted) {
audioWriterInput = nil;
assetAudioWriter = nil;
audioRecorded = YES;
[self margeFile];
}
}
}
}
-(void)margeFile{
AVURLAsset *assetVideo = [AVURLAsset assetWithURL:urlOutputVideo];
AVAssetTrack *video_track_marge = [[assetVideo tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVURLAsset *assetAudio = [AVURLAsset assetWithURL:urlOutputAudio];
AVAssetTrack *audio_track_marge = [[assetAudio tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CMTime startTime = CMTimeMake(1, 1);
CMTimeRange timeRangeVideo = CMTimeRangeMake(kCMTimeZero, assetVideo.duration);
CMTimeRange timeRangeAudio = CMTimeRangeMake(kCMTimeZero, assetAudio.duration);
AVMutableComposition * composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
if(mirrored) compositionVideoTrack.preferredTransform = CGAffineTransformMakeRotation(M_PI);
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error;
[compositionVideoTrack insertTimeRange:timeRangeVideo ofTrack:video_track_marge atTime:startTime error:&error];
[compositionAudioTrack insertTimeRange:timeRangeAudio ofTrack:audio_track_marge atTime:startTime error:&error];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
exportSession.outputFileType = AVFileTypeAppleM4V;
exportSession.outputURL = [self tempFileURL:media_mixed];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.metadata = newMetadataArray;
exportSession.timeRange = CMTimeRangeMake(CMTimeMakeWithSeconds(1.0, 600), totalTime);
[exportSession exportAsynchronouslyWithCompletionHandler:^{
NSMutableDictionary *infoToSend = [NSMutableDictionary new];
switch (exportSession.status) {
case AVAssetExportSessionStatusCompleted:
[infoToSend setValue:exportSession.outputURL forKey:@"url_media"];
[[NSNotificationCenter defaultCenter] postNotificationName:EXPORT_STATUS_DONE object:self userInfo:infoToSend];
break;
case AVAssetExportSessionStatusExporting:
[[NSNotificationCenter defaultCenter] postNotificationName:EXPORT_STATUS_EXPORTING object:self];
break;
case AVAssetExportSessionStatusFailed:
NSLog(@"failed");
break;
}
}];
while (exportSession.status == AVAssetExportSessionStatusExporting) {
progressMarge = exportSession.progress;
[self postProgress];
}
}
-(BOOL) setupWriterVideo{
urlOutputVideo = [self tempFileURL:media_video];
NSError *error = nil;
assetVideoWriter = [[AVAssetWriter alloc] initWithURL:urlOutputVideo fileType:AVFileTypeMPEG4 error:&error];
NSParameterAssert(assetVideoWriter);
// Add metadata
NSArray *existingMetadataArray = assetVideoWriter.metadata;
if (existingMetadataArray) {
newMetadataArray = [existingMetadataArray mutableCopy];
} else {
newMetadataArray = [[NSMutableArray alloc] init];
}
AVMutableMetadataItem *mutableItemLocation = [[AVMutableMetadataItem alloc] init];
mutableItemLocation.keySpace = AVMetadataKeySpaceCommon;
mutableItemLocation.key = AVMetadataCommonKeyLocation;
mutableItemLocation.value = [NSString stringWithFormat:@"%+08.4lf%+09.4lf/", locationMedia.latitude, locationMedia.longitude];
AVMutableMetadataItem *mutableItemModel = [[AVMutableMetadataItem alloc] init];
mutableItemModel.keySpace = AVMetadataKeySpaceCommon;
mutableItemModel.key = AVMetadataCommonKeyModel;
mutableItemModel.value = [[UIDevice currentDevice] model];
[newMetadataArray addObject:mutableItemLocation];
[newMetadataArray addObject:mutableItemModel];
assetVideoWriter.metadata = newMetadataArray;
assetVideoWriter.shouldOptimizeForNetworkUse = YES;
videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:[self videoConfiguration]];
NSParameterAssert(videoWriterInput);
videoWriterInput.expectsMediaDataInRealTime = NO;
// add input
[assetVideoWriter addInput:videoWriterInput];
return YES;
}
-(BOOL) setupWriterAudio{
urlOutputAudio = [self tempFileURL:media_audio];
NSError *error = nil;
assetAudioWriter = [[AVAssetWriter alloc] initWithURL:urlOutputAudio fileType:AVFileTypeAppleM4A error:&error];
NSParameterAssert(assetAudioWriter);
audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:[self audioConfiguration]];
audioWriterInput.expectsMediaDataInRealTime = NO;
// add input
[assetAudioWriter addInput:audioWriterInput];
return YES;
}
- (NSDictionary *)videoConfiguration{
// video Configuration
// float bitsPerPixel;
// int numPixels = 640.0 * 360.0;
// int bitsPerSecond;
//
// // Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
// if ( numPixels < (640 * 360.0) )
// bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
// else
// bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh.
//
// bitsPerSecond = numPixels * bitsPerPixel;
NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:640], AVVideoCleanApertureWidthKey,
[NSNumber numberWithInt:360], AVVideoCleanApertureHeightKey,
[NSNumber numberWithInt:2], AVVideoCleanApertureHorizontalOffsetKey,
[NSNumber numberWithInt:2], AVVideoCleanApertureVerticalOffsetKey,
nil];
NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:1],AVVideoPixelAspectRatioHorizontalSpacingKey,
[NSNumber numberWithInt:1],AVVideoPixelAspectRatioVerticalSpacingKey,
nil];
NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:1024000], AVVideoAverageBitRateKey,
[NSNumber numberWithInt:90],AVVideoMaxKeyFrameIntervalKey,
videoCleanApertureSettings, AVVideoCleanApertureKey,
videoAspectRatioSettings, AVVideoPixelAspectRatioKey,
AVVideoProfileLevelH264Main30, AVVideoProfileLevelKey,
nil];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
codecSettings,AVVideoCompressionPropertiesKey,
[NSNumber numberWithInt:640], AVVideoWidthKey,
[NSNumber numberWithInt:360], AVVideoHeightKey,
nil];
return videoSettings;
}
-(NSDictionary *)audioConfiguration{
// Add the audio input
AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
NSDictionary* audioOutputSettings = nil;
// Both type of audio inputs causes output video file to be corrupted.
// if( NO ) {
// should work from iphone 3GS on and from ipod 3rd generation
audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithInt: kAudioFormatMPEG4AAC ], AVFormatIDKey,
[ NSNumber numberWithInt: 2 ], AVNumberOfChannelsKey,
[ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
[ NSNumber numberWithInt: 128000 ], AVEncoderBitRateKey,
[ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
nil];
// } else {
// // should work on any device requires more space
// audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:
// [ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey,
// [ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
// [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
// [ NSNumber numberWithInt: 2 ], AVNumberOfChannelsKey,
// [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
// nil ];
// }
return audioOutputSettings;
}
-(void) newVideoSample:(CMSampleBufferRef)sampleBuffer{
if( assetVideoWriter.status > AVAssetWriterStatusWriting ) {
if( assetVideoWriter.status == AVAssetWriterStatusFailed )
NSLog(@"Error: %@", assetVideoWriter.error);
return;
}
if (assetVideoWriter.status == AVAssetWriterStatusWriting ) {
while (!videoWriterInput.readyForMoreMediaData) NSLog(@"waiting for video");
if (videoWriterInput.readyForMoreMediaData) {
CMTime presTime = CMSampleBufferGetPresentationTimeStamp( sampleBuffer );
float valueLoading = ((float)presTime.value / presTime.timescale);
float valueTotal = ((float)totalTime.value / totalTime.timescale);
progressVideo = valueLoading / valueTotal;
[self postProgress];
if (![videoWriterInput appendSampleBuffer:sampleBuffer]) NSLog(@"Unable to write to video input");
}
}
}
-(void) newAudioSample:(CMSampleBufferRef)sampleBuffer{
if( assetAudioWriter.status > AVAssetWriterStatusWriting ) {
if( assetAudioWriter.status == AVAssetWriterStatusFailed )
NSLog(@"Error: %@", assetAudioWriter.error);
return;
}
if (assetAudioWriter.status == AVAssetWriterStatusWriting ) {
while (!audioWriterInput.readyForMoreMediaData) NSLog(@"waiting for audio");
if (audioWriterInput.readyForMoreMediaData) {
CMTime presTime = CMSampleBufferGetPresentationTimeStamp( sampleBuffer );
float valueLoading = ((float)presTime.value / presTime.timescale);
float valueTotal = ((float)totalTime.value / totalTime.timescale);
progressAudio = valueLoading / valueTotal;
[self postProgress];
if (![audioWriterInput appendSampleBuffer:sampleBuffer]) {
NSLog(@"Unable to write to audio input");
}
}
}
}
- (void)postProgress{
float totalProgress = (progressVideo + progressAudio + progressMarge) / progressFactor;
NSMutableDictionary *infoToSend = [NSMutableDictionary new];
[infoToSend setValue:[NSNumber numberWithFloat:totalProgress] forKey:@"progress"];
[[NSNotificationCenter defaultCenter] postNotificationName:EXPORT_STATUS_EXPORTING object:self userInfo:infoToSend];
}
- (NSURL *)tempFileURL:(int)typeMedia {
NSString *outputPath;
switch (typeMedia) {
case media_video:
outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output_export.mp4"];
break;
case media_audio:
outputPath = [[NSString alloc] initWithFormat:#"%#%#", NSTemporaryDirectory(), #"output_export.m4a"];
break;
case media_mixed:
outputPath = [[NSString alloc] initWithFormat:#"%#%#", NSTemporaryDirectory(), #"mixed.mp4"];
break;
}
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath]) [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
return outputURL;
}
- (void) dealloc {
NSLog(#"dealloc video exporter");
[[NSNotificationCenter defaultCenter] removeObserver:self];
assetVideoWriter = nil;
assetAudioWriter = nil;
videoWriterInput = nil;
audioWriterInput = nil;
urlMedia = nil;
urlOutputVideo = nil;
urlOutputAudio = nil;
urlOutputFinal = nil;
}
@end
If someone has something to add, please post it here!
Set AVAssetWriterInput.outputSettings[AVVideoCompressionPropertiesKey][AVVideoAllowFrameReorderingKey] = @(NO).
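For example, here is a minimal sketch (not from the original post) of how that key could be set on the 640x360 H.264 input used above; the variable names are just illustrative:
// Sketch only: same H.264 / 640x360 settings as earlier, with frame reordering disabled.
NSDictionary *compressionProps = @{ AVVideoAllowFrameReorderingKey : @(NO) };
NSDictionary *videoSettings = @{ AVVideoCodecKey : AVVideoCodecH264,
AVVideoWidthKey : @640,
AVVideoHeightKey : @360,
AVVideoCompressionPropertiesKey : compressionProps };
videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];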
I have done image-to-video conversion on iPhone (of course, I got the code from Stack Overflow questions). The problem is that the recorded video plays far too fast: it finishes within about 2 seconds even though I have around 2250 frames. I know the problem is the frame rate, but I don't know how to correct it.
My code is below.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectoryPath = [paths objectAtIndex:0];
NSString *myFilePath = [documentsDirectoryPath stringByAppendingPathComponent:@"test.mov"];
if ([self openVideoFile:myFilePath withSize:CGSizeMake (480.0, 320.0)]) {
for (int i=1; i<2226; i++) {
NSString *imagename = [NSString stringWithFormat:@"1 (%i).jpg", i];
UIImage *image=[ UIImage imageNamed:imagename];
[self writeImageToMovie:[image CGImage]];
}
[videoWriter finishWriting];
}
else {
NSLog(#"friled to open video file");
}
This code is in the calling function; the definitions of the functions are given below.
- (BOOL) openVideoFile: (NSString *) path withSize:(CGSize)imageSize {
CGSize size = CGSizeMake (480.0, 320.0);//imageSize;
NSError *error = nil;
videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
if (error != nil){
NSLog(#"error>>>> %#",error);
return NO;
}
NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithDouble:size.width], AVVideoCleanApertureWidthKey,
[NSNumber numberWithDouble:size.height], AVVideoCleanApertureHeightKey,
[NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
[NSNumber numberWithInt:10], AVVideoCleanApertureVerticalOffsetKey,
nil];
NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:1], AVVideoPixelAspectRatioHorizontalSpacingKey,
[NSNumber numberWithInt:1],AVVideoPixelAspectRatioVerticalSpacingKey,
nil];
NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
videoCleanApertureSettings, AVVideoCleanApertureKey,
videoAspectRatioSettings, AVVideoPixelAspectRatioKey,
nil];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
codecSettings,AVVideoCompressionPropertiesKey,
[NSNumber numberWithDouble:size.width], AVVideoWidthKey,
[NSNumber numberWithDouble:size.height], AVVideoHeightKey,
nil];
writerInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
NSMutableDictionary * bufferAttributes = [[NSMutableDictionary alloc] init];
[bufferAttributes setObject: [NSNumber numberWithInt: kCVPixelFormatType_32ARGB]
forKey: (NSString *) kCVPixelBufferPixelFormatTypeKey];
[bufferAttributes setObject: [NSNumber numberWithInt: 480]
forKey: (NSString *) kCVPixelBufferWidthKey];
[bufferAttributes setObject: [NSNumber numberWithInt: 320]
forKey: (NSString *) kCVPixelBufferHeightKey];
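// Note: this bufferAttributes dictionary is never used below; the adaptor is created with
// sourcePixelBufferAttributes:nil and the buffers come from the pixelBufferPool created further down.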
adaptor = [[AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:nil] retain];
NSMutableDictionary* attributes;
attributes = [NSMutableDictionary dictionary];
int width = 480;
int height = 320;
[attributes setObject:[NSNumber numberWithInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithInt:width] forKey: (NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithInt:height] forKey: (NSString*)kCVPixelBufferHeightKey];
CVReturn theError = CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL, (CFDictionaryRef) attributes, &pixelBufferPool);
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];
writerInput.expectsMediaDataInRealTime = YES;
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
buffer = NULL;
lastTime = kCMTimeZero;
presentTime = kCMTimeZero;
return YES;
}
- (void) writeImageToMovie:(CGImageRef)image
{
if([writerInput isReadyForMoreMediaData])
{
buffer = [self pixelBufferFromCGImage:image];
BOOL success = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
if (!success) NSLog(@"Failed to appendPixelBuffer");
CVPixelBufferRelease(buffer);
presentTime = CMTimeAdd(lastTime, CMTimeMake(1, 1000)); // I think the problem is here, but what should be used for the correct output?
lastTime = presentTime;
}
else
{
NSLog(@"error - writerInput not ready");
}
}
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
CGSize size = CGSizeMake (480.0, 320.0);
CVPixelBufferRef pxbuffer;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
if (pixelBufferPool == NULL) NSLog(@"pixelBufferPool is null!");
CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, pixelBufferPool, &pxbuffer);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
size.height, 8, 4*size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(90, 10, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
What should I do with the CMTime variables, and how do I set them correctly?
One more question: how can I add audio to this video?
Your PTSs are very close together. Instead of CMTimeMake(1, 1000), why not 30 fps: CMTimeMake(1, 30)?
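For example, here is a minimal sketch of that change inside writeImageToMovie, assuming a constant 30 fps (everything else stays as posted):
// Advance the presentation time by one frame at 30 fps (1/30 s per image).
presentTime = CMTimeAdd(lastTime, CMTimeMake(1, 30));
lastTime = presentTime;
// With ~2250 frames this gives roughly 2250 / 30 = 75 seconds of video,
// instead of about 2250 / 1000 ≈ 2.25 seconds with CMTimeMake(1, 1000).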
I revised your code and came up with a solution. In my case I capture an image of my view every 0.1 seconds, so each frame lasts 1/10 of a second:
presentTime = CMTimeAdd(lastTime, CMTimeMake(1, 10));
The reason your video is too fast is that your ratio is 1 second to 1000 images. You can make it 1 image per second:
presentTime = CMTimeAdd(lastTime, CMTimeMake(1, 1));
Try this out; I hope it helps.
Add these frameworks:
AVFoundation
CoreMedia
CoreVideo
I am able to generate the video, but I am not able to get any audio. Has anyone used this code and created a video with audio?
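This is not from the original posts, but one common way to get audio into a movie generated this way is to mux a separate audio file into it with AVMutableComposition and export the result with AVAssetExportSession. A rough sketch, assuming the silent movie is at videoURL and the soundtrack is at audioURL (both hypothetical names):
// Hypothetical sketch: combine the generated (silent) movie with an audio file.
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;
// Insert the whole video track, then lay the audio over the same time range.
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject] atTime:kCMTimeZero error:&error];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:kCMTimeZero error:&error];
// Export the mixed composition to a new file.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"with_audio.mov"]];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{
NSLog(@"export status: %ld, error: %@", (long)exporter.status, exporter.error);
}];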