iPhone iOS: [AVAssetWriterInput appendSampleBuffer:] Cannot call method when status is 0 - ios

I am trying to append CMSampleBufferRefs to an AVAssetWriterInput and I keep getting a crash with error:
[AVAssetWriterInput appendSampleBuffer:] Cannot call method when status is 0
Code:
in viewDidLoad
NSArray *cachePaths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES);
NSString *cacheDirectory = [cachePaths firstObject];
NSString *filename = @"test.mp4";
NSString *filePath = [cacheDirectory stringByAppendingPathComponent:filename];
[[NSFileManager defaultManager] removeItemAtPath:filePath error:nil];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:filePath];
NSError *errors;
assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:(NSString *)kUTTypeMPEG4 error:&errors];
videoWriteInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
audioWriteInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:audioSettings];
[audioWriteInput addTrackAssociationWithTrackOfInput:videoWriteInput type:AVTrackAssociationTypeTimecode];
audioWriteInput.expectsMediaDataInRealTime = YES;
videoWriteInput.expectsMediaDataInRealTime = YES;
Record functions
-(void)prepareVideo {
if (![assetWriter startWriting]) {
NSLog(#"%li, %#", assetWriter.status, assetWriter.error.localizedDescription);
}
}
-(void)recordVideo {
recordingVideo = YES;
[assetWriter startSessionAtSourceTime:kCMTimeZero];
}
delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
if (recordingVideo) {
if ([captureOutput isKindOfClass:[AVCaptureVideoDataOutput class]]) {
if (assetWriter.status != 0) {
[videoWriteInput appendSampleBuffer:sampleBuffer];
}
}
else if ([captureOutput isKindOfClass:[AVCaptureAudioDataOutput class]]) {
if (assetWriter.status != 0) {
[audioWriteInput appendSampleBuffer:sampleBuffer];
}
}
}
}
The status of the AVAssetWriter is 1, so I know this is not the issue.

Did you add the AVAssetWriterInput to the AVAssetWriter?
Normally you would do that in viewDidLoad:
[self.assetWriter addInput:self.assetWriterInput];
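For completeness, a minimal sketch of that step using the property names from the question (assetWriter, videoWriteInput, audioWriteInput), guarded with canAddInput::
if ([assetWriter canAddInput:videoWriteInput]) {
    [assetWriter addInput:videoWriteInput]; // inputs must be added before startWriting is called
}
if ([assetWriter canAddInput:audioWriteInput]) {
    [assetWriter addInput:audioWriteInput];
}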

Related

Unsupported header format while converting a file from .mov to .wav, and it happens only on devices below iOS 11 (like iOS 9, 10)

Below is my code for the .mov to .wav conversion:
- (void)mp4ForURL:(NSURL *)videoURL{
// Create the asset url with the video file
AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
// Check if video is supported for conversion or not
if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality])////************************************////
{
//Create Export session
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]initWithAsset:avAsset presetName:AVAssetExportPresetMediumQuality];////************************************////
//Creating temp path to save the converted video
NSString* documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString* myDocumentPath = [documentsDirectory stringByAppendingPathComponent:@"temp.mp4"];
NSURL *url = [[NSURL alloc] initFileURLWithPath:myDocumentPath];
//Check if the file already exists then remove the previous file
if ([[NSFileManager defaultManager]fileExistsAtPath:myDocumentPath])
{
[[NSFileManager defaultManager]removeItemAtPath:myDocumentPath error:nil];
}
exportSession.outputURL = url;
//set the output file format if you want to make it in other file format (ex .3gp)
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.shouldOptimizeForNetworkUse = YES;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch ([exportSession status])
{
case AVAssetExportSessionStatusFailed:
NSLog(#"Export session failed");
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Export canceled");
break;
case AVAssetExportSessionStatusCompleted:
{
//Video conversion finished
NSLog(#"Successful!");
[self convertMP4toMP3withFile:myDocumentPath];
}
break;
default:
break;
}
}];
}
else
{
NSLog(#"Video file not supported!");
}
}
-(void)convertMP4toCAFwithFile:(NSString*)dstPath //Converted to Core Audio Format .caf
{
NSURL *dstURL = [NSURL fileURLWithPath:dstPath];
AVMutableComposition* newAudioAsset = [AVMutableComposition composition];
AVMutableCompositionTrack* dstCompositionTrack;
dstCompositionTrack = [newAudioAsset addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAsset* srcAsset = [AVURLAsset URLAssetWithURL:dstURL options:nil];
AVAssetTrack* srcTrack = [[srcAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CMTimeRange timeRange = srcTrack.timeRange;
NSError* error;
if(NO == [dstCompositionTrack insertTimeRange:timeRange ofTrack:srcTrack atTime:kCMTimeZero error:&error]) {
NSLog(#"track insert failed: %#\n", error);
return;
}
AVAssetExportSession* exportSesh = [[AVAssetExportSession alloc] initWithAsset:newAudioAsset presetName:AVAssetExportPresetPassthrough];
exportSesh.outputFileType = AVFileTypeCoreAudioFormat;
exportSesh.outputURL = dstURL;
[[NSFileManager defaultManager] removeItemAtURL:dstURL error:nil];
[exportSesh exportAsynchronouslyWithCompletionHandler:^{
AVAssetExportSessionStatus status = exportSesh.status;
NSLog(#"exportAsynchronouslyWithCompletionHandler: %li\n", (long)status);
if(AVAssetExportSessionStatusFailed == status) {
NSLog(#"FAILURE: %#\n", exportSesh.error);
} else if(AVAssetExportSessionStatusCompleted == status) {
NSLog(#"SUCCESS!\n");
NSError *error;
// Build a unique name for the converted .caf file
NSString *onlyPath = [dstPath stringByDeletingLastPathComponent];
NSInteger randomNumber = arc4random() % 100000;
strDateAndTime = [self getCurrentDateAndTime];
strAudioName = [NSString stringWithFormat:@"%@_%ld_%@.%@", @"Audio", (long)randomNumber, strDateAndTime, @"caf"];
NSString *toPathString = [NSString stringWithFormat:@"%@/%@", onlyPath, strAudioName];
[[NSFileManager defaultManager] moveItemAtPath:dstPath toPath:toPathString error:&error];
//[self uploadAudioOnAWSFromPath:toPathString];
[self convertToWavForFilePath:toPathString];
}
}];
}
- (void)convertToWavForFilePath:(NSString *)cafFilePath
{
// set up an AVAssetReader to read from the iPod Library
// NSString *cafFilePath = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"caf"];
NSURL *assetURL = [NSURL fileURLWithPath:cafFilePath];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
NSError *assetError = nil;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:songAsset
error:&assetError];
if (assetError) {
NSLog (#"error: %#", assetError);
return;
}
AVAssetReaderOutput *assetReaderOutput = [AVAssetReaderAudioMixOutput
assetReaderAudioMixOutputWithAudioTracks:songAsset.tracks
audioSettings: nil];
if (! [assetReader canAddOutput: assetReaderOutput]) {
NSLog (#"can't add reader output... die!");
return;
}
[assetReader addOutput: assetReaderOutput];
NSInteger randomNumber = arc4random() % 100000;
_finalAudioName = [NSString stringWithFormat:@"%@_%ld_%@", @"Audio", (long)randomNumber, strDateAndTime];
// NSString *title = @"MyRec";
NSArray *docDirs = NSSearchPathForDirectoriesInDomains (NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docDir = [docDirs objectAtIndex: 0];
__block NSString *wavFilePath = [[docDir stringByAppendingPathComponent:_finalAudioName]
stringByAppendingPathExtension:@"wav"];
_finalAudioName = [wavFilePath lastPathComponent];
if ([[NSFileManager defaultManager] fileExistsAtPath:wavFilePath])
{
[[NSFileManager defaultManager] removeItemAtPath:wavFilePath error:nil];
}
NSURL *exportURL = [NSURL fileURLWithPath:wavFilePath];
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:exportURL
fileType:AVFileTypeWAVE
error:&assetError];
if (assetError)
{
NSLog (#"error: %#", assetError);
return;
}
AudioChannelLayout channelLayout;
memset(&channelLayout, 0, sizeof(AudioChannelLayout));
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
[NSNumber numberWithFloat:44100.0], AVSampleRateKey,
[NSNumber numberWithInt:1], AVNumberOfChannelsKey,
[NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
[NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
[NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
nil];
AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
outputSettings:outputSettings];
if ([assetWriter canAddInput:assetWriterInput])
{
[assetWriter addInput:assetWriterInput];
}
else
{
NSLog (#"can't add asset writer input... die!");
return;
}
assetWriterInput.expectsMediaDataInRealTime = NO;
[assetWriter startWriting];
[assetReader startReading];
AVAssetTrack *soundTrack = [songAsset.tracks objectAtIndex:0];
CMTime startTime = CMTimeMake (0, soundTrack.naturalTimeScale);
[assetWriter startSessionAtSourceTime: startTime];
__block UInt64 convertedByteCount = 0;
dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[assetWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue
usingBlock: ^
{
while (assetWriterInput.readyForMoreMediaData)
{
CMSampleBufferRef nextBuffer = [assetReaderOutput copyNextSampleBuffer];
if (nextBuffer)
{
// append buffer
[assetWriterInput appendSampleBuffer:nextBuffer];
convertedByteCount += CMSampleBufferGetTotalSampleSize(nextBuffer);
CMTime progressTime = CMSampleBufferGetPresentationTimeStamp(nextBuffer);
CMTime sampleDuration = CMSampleBufferGetDuration(nextBuffer);
if (CMTIME_IS_NUMERIC(sampleDuration))
progressTime = CMTimeAdd(progressTime, sampleDuration);
float dProgress = CMTimeGetSeconds(progressTime) / CMTimeGetSeconds(songAsset.duration);
NSLog(@"%f", dProgress);
CFRelease(nextBuffer); // copyNextSampleBuffer returns a retained buffer; release it to avoid leaking
}
else
{
[assetWriterInput markAsFinished];
[assetReader cancelReading];
[assetWriter finishWritingWithCompletionHandler:^{
[self uploadAudioOnAWSFromPath:wavFilePath];
}];
}
}
}];
}
In the above code, the final converted .wav file has to be sent to an AWS server so we can get text from it. For speech-to-text we use the Google Speech API, and whenever I send a file converted on a device below iOS 11 (like iOS 9 or 10), it shows me this error:
Error: WAV header indicates an unsupported format.
The above error is shown by the Google Speech API while converting the audio file (.wav) to text.
The code works perfectly on the latest iOS 11 devices; only devices below iOS 11 (like iOS 9, 10) do not work properly.
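One way to narrow this down, as a diagnostic sketch rather than a fix, is to log the stream description of the produced file on an affected device (wavFilePath is the variable from the code above) and compare it against the requested settings; if the format ID, sample rate, bit depth, or channel count differ, the header that Google rejects was written differently on the older system:
AVURLAsset *checkAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:wavFilePath] options:nil];
AVAssetTrack *checkTrack = [[checkAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
CMAudioFormatDescriptionRef fmt = (__bridge CMAudioFormatDescriptionRef)[checkTrack.formatDescriptions firstObject];
const AudioStreamBasicDescription *asbd = CMAudioFormatDescriptionGetStreamBasicDescription(fmt);
// The writer settings above request kAudioFormatLinearPCM, 44100 Hz, 16-bit, mono
NSLog(@"formatID=%u rate=%.0f bits=%u channels=%u",
      (unsigned)asbd->mFormatID, asbd->mSampleRate,
      (unsigned)asbd->mBitsPerChannel, (unsigned)asbd->mChannelsPerFrame);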

How to save a captured image to the documents directory

I captured an image using the code below:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
CALayer *viewLayer = self.vImagePreview.layer;
NSLog(#"viewLayer = %#", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}
[session addInput:input];
[session startRunning];
_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[_stillImageOutput setOutputSettings:outputSettings];
[session addOutput:_stillImageOutput];
When I press the button:
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in _stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
NSLog(#"about to request a capture from: %#", _stillImageOutput);
[_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments)
{
// Do something with the attachments.
NSLog(#"attachements: %#", exifAttachments);
}
else
NSLog(#"no attachments");
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
self.vImage.image = image;
_vImage.hidden=YES;
UIStoryboard *storybord = [UIStoryboard storyboardWithName:@"Main" bundle:nil];
shareViewController *shareview = [storybord instantiateViewControllerWithIdentifier:@"share"];
[self presentViewController:shareview animated:YES completion:nil];
shareview.shareimageview.image = image;
NSMutableArray *temparray = [NSMutableArray arrayWithObjects:image, nil];
NSMutableArray *newparsetile = [@[@"you"] mutableCopy];
shareview.newtile = newparsetile;
shareview.selectedimgarray = temparray;
[[NSNotificationCenter defaultCenter] postNotificationName:@"Shareimage" object:image];
}];
How do I save the output image to the device's documents directory? Can anybody help me out? An answer with code is appreciated, since I am new to iOS and Objective-C. People who want to customize the camera like Instagram can use my code; it is 100% working.
NSData *pngData = UIImagePNGRepresentation(image);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths objectAtIndex:0]; //Get the docs directory
NSString *filePath = [documentsPath stringByAppendingPathComponent:@"image_name.png"]; //Add the file name
[pngData writeToFile:filePath atomically:YES]; //Write the file
// Saving it to documents direcctory
NSArray *directoryPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentDirectory = [directoryPaths objectAtIndex:0];
NSString* filePath = [documentDirectory stringByAppendingPathComponent:@"FileName.png"];
NSData *imageData = // Some Image data;
NSURL *url = [NSURL fileURLWithPath:filePath];
if ([imageData writeToURL:url atomically:YES]) {
NSLog(#"Success");
}
else{
NSLog(#"Error");
}
You can use the above code to save an image to the documents directory. Instead of the imageData variable you can pass your own.
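Note that in the capture handler above, imageData is already JPEG (it comes from jpegStillImageNSDataRepresentation:), so you can also write it out directly without re-encoding to PNG; a minimal sketch, with a hypothetical file name:
NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *jpgPath = [[dirs firstObject] stringByAppendingPathComponent:@"captured.jpg"]; // hypothetical name
NSError *writeError = nil;
if (![imageData writeToFile:jpgPath options:NSDataWritingAtomic error:&writeError]) {
    NSLog(@"Failed to save image: %@", writeError);
}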

How to create an AVAudioPCMBuffer from NSData

I receive MP3 from a stream and convert it to PCM data. I am trying to use AVAudioEngine, but I cannot create an AVAudioPCMBuffer without a file.
- (void)playAudio:(NSNotification *)notification {
self.runAudioThread = YES;
dispatch_queue_t globalDispatchDefaultQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0);
[_player play];
dispatch_async(globalDispatchDefaultQueue, ^{
while (_runAudioThread) {
if (_rawDataArray.count > BUFFER_COUNT) {
NSMutableData *rawData = [[NSMutableData alloc] init];
NSArray *rawDataArray = _rawDataArray.copy;
@synchronized(_rawDataArray) {
[self.rawDataArray removeAllObjects];
}
for (AudioPacketInfo *audioPacketInfo in rawDataArray) {
[rawData appendData:audioPacketInfo.data];
}
NSData *mp3Data = [AudioUtil makeMP3Data:rawData];
if (mp3Data) {
NSData *pcmData = [AudioUtil convertToPCMWithData:mp3Data];
if (pcmData.length > 0) {
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = paths[0];
NSString *filePath = [documentsDirectory stringByAppendingString:@"/temp.mp3"];
[[NSFileManager defaultManager] createFileAtPath:filePath contents:mp3Data attributes:nil];
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:[NSURL fileURLWithPath:filePath] error:nil]; // a filesystem path needs fileURLWithPath:, not URLWithString:
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length];
[audioFile readIntoBuffer:buffer error:nil];
[_player scheduleBuffer:buffer completionHandler:nil];
}
}
}
}
});
}
How would I do that, converting MP3 to PCM without the temp-file round trip?
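You can fill an AVAudioPCMBuffer from NSData directly by describing the raw PCM with an AVAudioFormat and copying the bytes in. A minimal sketch, assuming convertToPCMWithData: yields 16-bit interleaved stereo at 44.1 kHz; adjust the format (and the format the player node is connected with) to whatever your decoder actually produces:
AVAudioFormat *format = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatInt16
                                                         sampleRate:44100.0
                                                           channels:2
                                                        interleaved:YES];
AVAudioFrameCount frames = (AVAudioFrameCount)(pcmData.length / format.streamDescription->mBytesPerFrame);
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:format frameCapacity:frames];
buffer.frameLength = frames; // mark the buffer as full
// For an interleaved format, all channels live behind channel pointer 0
memcpy(buffer.int16ChannelData[0], pcmData.bytes, frames * format.streamDescription->mBytesPerFrame);
[_player scheduleBuffer:buffer completionHandler:nil];
This avoids the temp-file round trip through AVAudioFile entirely.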

iOS - Read a video file frame-by-frame, image processing, then save as new video file

I am trying to read a video frame-by-frame from the iPhone photo album. After image processing, I will save the frames as a new video. My code runs without any error, but there is no new video in the album.
Here is my code.
// Video writer init
- (BOOL)setupAssetWriterForURL:(CMFormatDescriptionRef)formatDescription
{
float bitsPerPixel;
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);
int numPixels = dimensions.width * dimensions.height;
int bitsPerSecond;
if ( numPixels < (640 * 480) )
bitsPerPixel = 4.05;
else
bitsPerPixel = 11.4;
bitsPerSecond = numPixels * bitsPerPixel;
NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInteger:dimensions.width], AVVideoWidthKey,
[NSNumber numberWithInteger:dimensions.height], AVVideoHeightKey,
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInteger:bitsPerSecond], AVVideoAverageBitRateKey,
[NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey,
nil], AVVideoCompressionPropertiesKey,
nil];
if ([assetWriter canApplyOutputSettings:videoCompressionSettings forMediaType:AVMediaTypeVideo]) {
assetWriterVideoIn = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
assetWriterVideoIn.expectsMediaDataInRealTime = YES;
assetWriterVideoIn.transform = [self transformFromCurrentVideoOrientationToOrientation:self.referenceOrientation];
if ([assetWriter canAddInput:assetWriterVideoIn])
[assetWriter addInput:assetWriterVideoIn];
else {
NSLog(#"Couldn't add asset writer video input.");
return NO;
}
}
else {
NSLog(#"Couldn't apply video output settings.");
return NO;
}
return YES;
}
Read video
- (void)readMovie:(NSURL *)url
{
AVURLAsset * asset = [AVURLAsset URLAssetWithURL:url options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:
^{
dispatch_async(dispatch_get_main_queue(),
^{
AVAssetTrack * videoTrack = nil;
NSArray * tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if ([tracks count] == 1)
{
videoTrack = [tracks objectAtIndex:0];
NSError * error = nil;
// _movieReader is a member variable
AVAssetReader *movieReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
if (error)
NSLog(#"_movieReader fail!\n");
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings =
[NSDictionary dictionaryWithObject:value forKey:key];
[movieReader addOutput:[AVAssetReaderTrackOutput
assetReaderTrackOutputWithTrack:videoTrack
outputSettings:videoSettings]];
[movieReader startReading];
while ([movieReader status] == AVAssetReaderStatusReading)
{
AVAssetReaderTrackOutput * output = [movieReader.outputs objectAtIndex:0];
CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
if (sampleBuffer)
{
if ( !assetWriter ) {
outputURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/%llu.mov", NSTemporaryDirectory(), mach_absolute_time()]];
NSError *error = nil;
assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&error];
if (error)
[self showError:error];
if (assetWriter) {
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
[self setupAssetWriterForURL:formatDescription];
}
}
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer,0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t bufferWidth = CVPixelBufferGetWidth(imageBuffer);
size_t bufferHeight = CVPixelBufferGetHeight(imageBuffer);
unsigned char *baseAddress = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);
for (size_t row = 0; row < bufferHeight; row++) {
// Walk each row from its start so any row padding (bytesPerRow > width * 4) is respected
unsigned char *pixel = baseAddress + row * bytesPerRow;
for (size_t column = 0; column < bufferWidth; column++) {
// Compute the gray value once, before overwriting the channels
unsigned char gray = (pixel[0] + pixel[1] + pixel[2]) / 3;
pixel[0] = gray;
pixel[1] = gray;
pixel[2] = gray;
pixel += 4;
}
}
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
if ( assetWriter ) {
[self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeVideo];
}
CFRelease(sampleBuffer);
}
}
if (assetWriter) {
[assetWriterVideoIn markAsFinished];
assetWriter = nil;
[assetWriter finishWriting];
assetWriterVideoIn = nil;
assetWriter = nil;
[self saveMovieToCameraRoll];
}
else {
[self showError:[assetWriter error]];
}
}
});
}];
}
- (void) writeSampleBuffer:(CMSampleBufferRef)sampleBuffer ofType:(NSString *)mediaType
{
if ( assetWriter.status == AVAssetWriterStatusUnknown ) {
if ([assetWriter startWriting]) {
[assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
}
else {
[self showError:[assetWriter error]];
}
}
if ( assetWriter.status == AVAssetWriterStatusWriting ) {
if (mediaType == AVMediaTypeVideo) {
if (assetWriterVideoIn.readyForMoreMediaData) {
if (![assetWriterVideoIn appendSampleBuffer:sampleBuffer]) {
[self showError:[assetWriter error]];
}
}
}
}
}
- (void)saveMovieToCameraRoll
{
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeVideoAtPathToSavedPhotosAlbum:outputURL
completionBlock:^(NSURL *assetURL, NSError *error) {
if (error){
[self showError:error];
NSLog(#"save fail");
}
else
{
[self removeFile:outputURL];
NSLog(#"!!!");
}
}];
}
- (void)removeFile:(NSURL *)fileURL
{
NSFileManager *fileManager = [NSFileManager defaultManager];
NSString *filePath = [fileURL path];
if ([fileManager fileExistsAtPath:filePath]) {
NSError *error;
BOOL success = [fileManager removeItemAtPath:filePath error:&error];
if (!success)
[self showError:error];
}
}
Any suggestions?
I am a bit late, but it might be helpful for others: the code is almost right. Simply comment out one line in the readMovie: method and it will work:
//assetWriter = nil; commented line
[assetWriter finishWriting];
assetWriterVideoIn = nil;
assetWriter = nil;
[self saveMovieToCameraRoll];
}
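If you target iOS 6 or later, a slightly safer variant of this teardown keeps the writer alive until writing has actually completed, using finishWritingWithCompletionHandler: (plain finishWriting is deprecated there):
[assetWriterVideoIn markAsFinished];
[assetWriter finishWritingWithCompletionHandler:^{
    [self saveMovieToCameraRoll];
    assetWriterVideoIn = nil;
    assetWriter = nil; // release the writer only after writing has finished
}];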
And here is an answer for creating a movie from images.
Hope you will get some help from there.

Application crashes while importing songs from the iPod library on iPhone for iOS 5.0

Hello, I am using the frameworks below:
#import <MediaPlayer/MediaPlayer.h>
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>
In one of my button events I have implemented the code below to open the library:
MPMediaPickerController *mediaPicker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
mediaPicker.delegate = self;
mediaPicker.allowsPickingMultipleItems = YES; // this is the default
[self presentModalViewController:mediaPicker animated:YES];
[mediaPicker release];
And in the delegate methods of MPMediaPickerController I implemented the code below:
#pragma mark MPMediaPickerController delegate methods
- (void)mediaPicker: (MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
// We need to dismiss the picker
[self dismissModalViewControllerAnimated:YES];
// Assign the selected item(s) to the music player and start playback.
counterIpod = [mediaItemCollection.items count];
totalcollection = counterIpod;
if (totalcollection > 10) {
NSString *str = [NSString stringWithFormat:#"App Only supports importing 10 songs at a time"];
UIAlertView *connectionAlert = [[UIAlertView alloc] initWithTitle:#"Message !" message:str delegate:self cancelButtonTitle:#"OK" otherButtonTitles:nil];
[connectionAlert show];
[connectionAlert release];
}
else {
[self performSelector:@selector(saveMediaItem:) withObject:mediaItemCollection afterDelay:0.1];
//[self saveMediaItem:mediaItemCollection];
//[self showLoadingView];
}
}
- (void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker {
// User did not select anything
// We need to dismiss the picker
[self dismissModalViewControllerAnimated:YES];
}
#pragma mark Save Item Collection to documentsDirectory
-(void)saveMediaItem:(MPMediaItemCollection *)mediaItemCollection {
for (int i = 0; i < [mediaItemCollection.items count]; i++) {
[self exportAssetAsSourceFormat:[[mediaItemCollection items] objectAtIndex:i]];
NSLog(#"for loop : %d", i);
}
NSArray *itemsArray1 = appDelegate.mediaItemCollection1.items;
MPMediaItemCollection *mediaItemCollection2;
if ([itemsArray1 count] != 0) {
mediaItemCollection2 = [self collectionByAppendingCollection:mediaItemCollection];
}
else {
mediaItemCollection2 = mediaItemCollection;
}
[self saveMediaItemAfterDeletting:mediaItemCollection2];
}
-(void)saveMediaItemAfterDeletting:(MPMediaItemCollection *)mediaItemCollection {
NSMutableData* data = [[NSMutableData alloc] init];
NSKeyedArchiver *archiver = [[NSKeyedArchiver alloc] initForWritingWithMutableData:data];
[archiver encodeObject:mediaItemCollection forKey:@"my_playlist"];
[archiver finishEncoding];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *filePath = [NSString stringWithFormat:@"%@/playlist.data", documentsDirectory];
NSLog(@"file path = %@", filePath);
[data writeToFile:filePath atomically:YES];
if ([[NSFileManager defaultManager] fileExistsAtPath:filePath]){
NSLog(@"file exists : ===========>>>>>>>>>>>");
} else {
NSLog(@"file doesn't exist");
}
//NSLog(#"archiving playlist success = %d", success);
[archiver release];
[data release];
[self UpdateMediaCollection];
}
-(NSString*) getExtension:(MPMediaItem *)item {
// [self showLoadingView];
NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
// JP
// AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
// initWithAsset:songAsset
// presetName:AVAssetExportPresetPassthrough];
NSArray *tracks = [songAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *track = [tracks objectAtIndex:0];
id desc = [track.formatDescriptions objectAtIndex:0];
const AudioStreamBasicDescription *audioDesc = CMAudioFormatDescriptionGetStreamBasicDescription((CMAudioFormatDescriptionRef)desc);
FourCharCode formatID = audioDesc->mFormatID;
//exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];
//exportSession.audioMix = exportAudioMix;
NSString *fileType = nil;
NSString *ex = nil;
switch (formatID) {
case kAudioFormatLinearPCM:
{
UInt32 flags = audioDesc->mFormatFlags;
if (flags & kAudioFormatFlagIsBigEndian) {
fileType = #"public.aiff-audio";
ex = #"aif";
} else {
fileType = #"com.microsoft.waveform-audio";
ex = #"wav";
}
}
break;
case kAudioFormatMPEGLayer3:
fileType = #"com.apple.quicktime-movie";
ex = #"mp3";
break;
case kAudioFormatMPEG4AAC:
fileType = #"com.apple.m4a-audio";
ex = #"m4a";
break;
case kAudioFormatAppleLossless:
fileType = #"com.apple.m4a-audio";
ex = #"m4a";
break;
default:
break;
}
return ex;
}
#pragma mark Convert Item separate item collection and store songs into directory
- (void)exportAssetAsSourceFormat:(MPMediaItem *)item {
// [self showLoadingView];
NSLog(#"export asset called");
NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
NSLog(#"\n>>>> assetURL : %#",[assetURL absoluteString]);
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
// JP
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
initWithAsset:songAsset
presetName:AVAssetExportPresetPassthrough];
NSArray *tracks = [songAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *track = [tracks objectAtIndex:0];
id desc = [track.formatDescriptions objectAtIndex:0];
const AudioStreamBasicDescription *audioDesc = CMAudioFormatDescriptionGetStreamBasicDescription((CMAudioFormatDescriptionRef)desc);
FourCharCode formatID = audioDesc->mFormatID;
//exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];
//exportSession.audioMix = exportAudioMix;
NSString *fileType = nil;
NSString *ex = nil;
switch (formatID) {
case kAudioFormatLinearPCM:
{
UInt32 flags = audioDesc->mFormatFlags;
if (flags & kAudioFormatFlagIsBigEndian) {
fileType = #"public.aiff-audio";
ex = #"aif";
} else {
fileType = #"com.microsoft.waveform-audio";
ex = #"wav";
}
}
break;
case kAudioFormatMPEGLayer3:
fileType = #"com.apple.quicktime-movie";
ex = #"mp3";
break;
case kAudioFormatMPEG4AAC:
fileType = #"com.apple.m4a-audio";
ex = #"m4a";
break;
case kAudioFormatAppleLossless:
fileType = #"com.apple.m4a-audio";
ex = #"m4a";
break;
default:
break;
}
exportSession.outputFileType = fileType;
NSString *fileName = nil;
fileName = [NSString stringWithString:[item valueForProperty:MPMediaItemPropertyTitle]];
fileName = [[fileName stringByAppendingString:@"-"] stringByAppendingString:[item valueForProperty:MPMediaItemPropertyArtist]];
NSArray *fileNameArray = nil;
fileNameArray = [fileName componentsSeparatedByString:@" "];
fileName = [fileNameArray componentsJoinedByString:@""];
NSLog(@">>>>> fileName = %@", fileName);
NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filePath = [[docDir stringByAppendingPathComponent:fileName] stringByAppendingPathExtension:ex];
NSLog(#"filePath = %#", filePath);
if ([[NSFileManager defaultManager] fileExistsAtPath:filePath]) {
//NSLog(#"file exist::::::::::==============>>>>>>>>>>>>>>>>>");
counterIpod--;
if(counterIpod == 0) {
//[self showAlertView];
//[self hideLoadingView];
}
NSString *str = [NSString stringWithFormat:#"Loading %d of %d Beats", totalcollection - counterIpod ,totalcollection];
[lbl performSelectorOnMainThread:#selector(setText:) withObject:str waitUntilDone:NO];
//NSLog(#"loading string : %#", str);
return;
}
//NSLog(#"file not exist ===========>>>>>>>>>");
// -------------------------------------
int fileNumber = 0;
NSString *fileNumberString = nil;
NSString *fileNameWithNumber = nil;
while ([[NSFileManager defaultManager] fileExistsAtPath:filePath]) {
fileNumber++;
fileNumberString = [NSString stringWithFormat:@"-%02d", fileNumber];
fileNameWithNumber = [fileName stringByAppendingString:fileNumberString];
filePath = [[docDir stringByAppendingPathComponent:fileNameWithNumber] stringByAppendingPathExtension:ex];
//NSLog(#"filePath = %#", filePath);
}
// -------------------------------------
myDeleteFile(filePath);
exportSession.outputURL = [NSURL fileURLWithPath:filePath];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
if (exportSession.status == AVAssetExportSessionStatusCompleted) {
NSLog(#"export session completed");
counterIpod--;
NSString *str = [NSString stringWithFormat:#"Loading %d of %d Beats", totalcollection - counterIpod ,totalcollection];
//[self performSelector:#selector(setLabelText:) withObject:str afterDelay:0.02];
[lbl performSelectorOnMainThread:#selector(setText:) withObject:str waitUntilDone:NO];
NSLog(#"loading string : %#", str);
if(counterIpod == 0) {
//[self showAlertView];
//[self hideLoadingView];
}
} else {
NSLog(#"export session error");
counterIpod--;
NSString *str = [NSString stringWithFormat:#"Loading %d of %d Beats", totalcollection - counterIpod ,totalcollection];
[lbl performSelectorOnMainThread:#selector(setText:) withObject:str waitUntilDone:NO];
//return NO;
if(counterIpod == 0) {
//[self showAlertView];
//[self hideLoadingView];
}
}
[exportSession release];
}];
//[appDelegate hideLoadingView];
}
#pragma mark method to delete file from document directory
void myDeleteFile (NSString* path) {
// NSLog(#"file path delete file :::::::::: %#", path);
if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
NSError *deleteErr = nil;
[[NSFileManager defaultManager] removeItemAtPath:path error:&deleteErr];
if (deleteErr) {
NSLog (#"Can't delete %#: %#", path, deleteErr);
}
}
}
The above code works without any error on iOS 4.0 and earlier versions, but on iOS 5.0 it crashes on the device. I have not been able to resolve this issue for the last 15 days.
Thanks in advance for the help.
I have solved this issue.
Just comment out this line:
fileName = [[fileName stringByAppendingString:@"-"] stringByAppendingString:[item valueForProperty:MPMediaItemPropertyArtist]];
because some songs have a nil artist, so it crashes.
Some of the songs you are using have an artist name while others have a blank one, so appending the missing name to the string is what crashes the app.
Hope you understand what I mean.
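Rather than dropping the artist entirely, a minimal nil-safe sketch (same variables as the question's code) appends the artist only when one is present:
NSString *artist = [item valueForProperty:MPMediaItemPropertyArtist];
if (artist.length > 0) { // skip a nil or empty artist instead of crashing
    fileName = [[fileName stringByAppendingString:@"-"] stringByAppendingString:artist];
}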
