My app is like Vine or Instagram: it records short clips and joins them together. I have a problem with memory management when creating AVAsset objects.
If I create 100 AVAsset objects from the start, everything works fine. But if I keep deleting some of them, or even all of them, and then create new ones, I receive memory warnings and the app also responds to memory pressure. I checked the running app with Instruments and there is no leaked memory.
Below is how I create and remove AVAsset objects in the assets array. Does anyone know what could be the problem?
- (void) copyFileToDocuments:(NSURL *)fileURL
{
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];
NSError *error;
if (![[NSFileManager defaultManager] copyItemAtURL:fileURL toURL:[NSURL fileURLWithPath:destinationPath] error:&error]) {
if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
[[self delegate] captureManager:self didFailWithError:error];
}
}
// add the asset to the array of pieces
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:destinationPath]];
[self.assets addObject:asset];
}
-(void) deleteLastAsset
{
AVAsset *asset = [self.assets lastObject];
[self.delegate removeTimeFromDuration:CMTimeGetSeconds(asset.duration)];
NSURL *fileURL = nil;
if ([asset isKindOfClass:AVURLAsset.class])
{
AVURLAsset *urlAsset = (AVURLAsset*)asset;
fileURL = urlAsset.URL;
}
if (fileURL)
[self removeFile:fileURL];
[self.assets removeLastObject];
}
- (void) startRecording
{
if ([[UIDevice currentDevice] isMultitaskingSupported]) {
[self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
}
[self removeFile:[[self recorder] outputFileURL]];
[[self recorder] startRecordingWithOrientation:self.orientation];
}
- (void) removeFile:(NSURL *)fileURL
{
NSString *filePath = [fileURL path];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:filePath]) {
NSError *error;
if ([fileManager removeItemAtPath:filePath error:&error] == NO) {
if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
[[self delegate] captureManager:self didFailWithError:error];
}
}
}
}
-(void)startRecordingWithOrientation:(AVCaptureVideoOrientation)videoOrientation
{
AVCaptureConnection *videoConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]];
if ([videoConnection isVideoOrientationSupported])
[videoConnection setVideoOrientation:videoOrientation];
[[self movieFileOutput] startRecordingToOutputFileURL:[self outputFileURL] recordingDelegate:self];
}
-(void)stopRecording
{
[[self movieFileOutput] stopRecording];
}
I am having an issue with getting an audio file that I record on my iPhone to play out of my Watch speakers.
Here is the code I have for sending from iPhone to Watch.
- (void)sendLiveAudioRecordingToWatch
{
NSError *error;
NSFileManager *files = [NSFileManager defaultManager];
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *record = [docs stringByAppendingPathComponent: @"LiveAudio.m4a"];
NSURL *URL = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier: @"group.myCompany.myApp"];
NSURL *outputURL = [URL URLByAppendingPathComponent: @"PhoneToWatch.mp4"];
NSString *path = [NSString stringWithFormat: @"%@", outputURL];
NSLog(@"%@", [NSURL fileURLWithPath: path]);
[files removeItemAtPath: [outputURL path] error: &error];
BOOL fileExists = [files isReadableFileAtPath: record];
if (fileExists)
{
NSLog(#"Live Audio Recording Saved!");
[files copyItemAtPath: record toPath: [outputURL path] error: &error];
if (!error)
{
NSLog(#"Temporary Recording Copied To Group");
BOOL deleteRecord = [files removeItemAtPath: record error: &error];
if (deleteRecord)
{
NSLog(#"Temporary Recording Removed");
if ([WCSession isSupported])
{
WCSession *session = [WCSession defaultSession];
NSLog(#"WCSession isSupported");
[session setDelegate: self];
[session activateSession];
if ([session isReachable])
{
NSLog(#"File Is Being Transferred");
[[WCSession defaultSession] transferFile: outputURL metadata: nil];
}
else
{
NSLog(#"Watch Not Reachable");
}
}
}
else
{
NSLog(#"Failed To Delete Temorary File");
}
}
else
{
NSLog(#"Error Copying Live Audio To Group");
}
}
else
{
NSLog(#"Error Locating Temporary Audio Location");
}
}
Here is the code I have for receiving on Watch from iPhone.
- (void)session: (WCSession *)session didReceiveFile: (WCSessionFile *)file
{
WKAudioFileAsset *asset = [WKAudioFileAsset assetWithURL: [file fileURL]];
WKAudioFilePlayerItem *playerItem = [WKAudioFilePlayerItem playerItemWithAsset: asset];
WKAudioFilePlayer *player = [WKAudioFilePlayer playerWithPlayerItem: playerItem];
[player play];
}
iOS 9.2 watchOS 2.0
Has anyone experienced this, or was able to get this working?
I got this working. WKAudioFilePlayer is used for playing through a Bluetooth device, not the speakers of the Apple Watch. In watchOS 2.0 and higher, you can use:
[self presentMediaPlayerControllerWithURL:url options:options completion:^(BOOL didPlayToEnd, NSTimeInterval endTime, NSError * __nullable error){}];
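For context, here is a minimal sketch (my assumption, not part of the original answer) of how that call can be wired into the file-receipt handler on the Watch, with autoplay enabled and the presentation dispatched to the main queue:
// Sketch: assumes this WKInterfaceController subclass is also the WCSession delegate
- (void)session:(WCSession *)session didReceiveFile:(WCSessionFile *)file
{
    // Assumption: autoplay the received file as soon as the player is presented
    NSDictionary *options = @{ WKMediaPlayerControllerOptionsAutoplayKey : @YES };
    dispatch_async(dispatch_get_main_queue(), ^{
        [self presentMediaPlayerControllerWithURL:[file fileURL]
                                          options:options
                                       completion:^(BOOL didPlayToEnd, NSTimeInterval endTime, NSError *error) {
            if (error) {
                NSLog(@"Playback error: %@", error);
            }
        }];
    });
}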
I am using UIImagePickerController to record video; the problem is that it records the video in MOV format, but I need MP4 for Android compatibility.
I convert the video to MP4 using the code below, but it takes far too long: around 30 to 35 seconds for a 6-second video.
Any way to record the video directly in MP4 format, or a faster conversion method, would be of great help. Thanks in advance.
-(void)movToMp4:(NSURL *)videoURL{ // method for mov to mp4 conversion
AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
if ([compatiblePresets containsObject:AVAssetExportPresetLowQuality])
{
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]initWithAsset:avAsset presetName:AVAssetExportPresetPassthrough];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString* videoPath = [NSString stringWithFormat:@"%@/xyz.mp4", [paths objectAtIndex:0]];
exportSession.outputURL = [NSURL fileURLWithPath:videoPath];
exportSession.outputFileType = AVFileTypeMPEG4;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch ([exportSession status]) { // the completed case is where I call my delegate
case AVAssetExportSessionStatusCompleted: {
[self.delegate mp4Response:videoPath];
break;
}
default:
break;
}
}];
}
}
Yes, because it takes time to load the video from the library/album using AVURLAsset.
So you need to use a completion block here to load the video from the library.
Also, the line
[self.delegate mp4Response:videoPath];
which is inside a completion block, should be called on the main thread.
Follow this approach:
UIImagePickerController delegate method to get videos from library.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSURL *localUrl = (NSURL *)[info valueForKey:UIImagePickerControllerMediaURL];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString* videoPath = [NSString stringWithFormat:@"%@/xyz.mp4", [paths objectAtIndex:0]];
NSURL *outputURL = [NSURL fileURLWithPath:videoPath];
[self convertVideoToLowQuailtyWithInputURL:localUrl outputURL:outputURL handler:^(AVAssetExportSession *exportSession)
{
if (exportSession.status == AVAssetExportSessionStatusCompleted) {
NSLog(#"Capture video complete");
[self performSelectorOnMainThread:#selector(doneCompressing) withObject:nil waitUntilDone:YES];
}
}];
[self dismissViewControllerAnimated:YES completion:nil];
}
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL*)inputURL outputURL:(NSURL*)outputURL handler:(void (^)(AVAssetExportSession*))handler {
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeMPEG4;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
handler(exportSession);
}];
}
In the didFinishPickingMediaWithInfo method, note this line:
[self performSelectorOnMainThread:@selector(doneCompressing) withObject:nil waitUntilDone:YES];
It calls one more method, doneCompressing, on the main thread (in the foreground), so that you can call the delegate method from doneCompressing. This will reduce the time.
- (void) doneCompressing {
[self.delegate mp4Response:videoPath];
}
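Alternatively (just a sketch, not from the original answer), you can dispatch the delegate call to the main queue directly inside the export completion handler and skip the extra doneCompressing step:
if (exportSession.status == AVAssetExportSessionStatusCompleted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        // videoPath is assumed to be the same output path built in didFinishPickingMediaWithInfo
        [self.delegate mp4Response:videoPath];
    });
}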
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSLog(#"file info = %#", info);
NSString *pickedType = [info objectForKey:#"UIImagePickerControllerMediaType"];
NSData *videoData;
if ([pickedType isEqualToString:#"public.movie"]){
NSURL *videoUrl=(NSURL*)[info objectForKey:UIImagePickerControllerMediaURL];
videoData = [NSData dataWithContentsOfURL:videoUrl];
}
[self dismissViewControllerAnimated:YES completion:^{
//
[self writeFileData:videoData];
}];
}
// to get path of document directory
- (NSString *)applicationDocumentsDirectory
{
// return [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
return documentsDirectory;
}
- (void) writeFileData:(NSData *)fileData{
float size = fileData.length / (1024 * 1024);
NSString *fileName = nil;
NSString *strPath = nil;
NSString *documentsDirectory = [self applicationDocumentsDirectory];
double CurrentTime = CACurrentMediaTime();
fileName = [NSString stringWithFormat:@"%d.mp4",(int)CurrentTime];
strPath = [documentsDirectory stringByAppendingPathComponent:fileName];
NSFileManager *filemanager=[[NSFileManager alloc] init];
NSError *er;
if (![filemanager fileExistsAtPath:documentsDirectory]) {
[filemanager createDirectoryAtPath:documentsDirectory withIntermediateDirectories:YES attributes:nil error:&er];
NSLog(#"error in folder creation = %#", er);
}
NSLog(#"size of data = %lu", (unsigned long)[fileData length]);
BOOL saved = [fileData writeToFile:strPath atomically:YES];
if (saved) {
NSURL *videoURL = [NSURL fileURLWithPath:strPath];
// now you can handle the mp4 video from videoURL
}
else
return;
}
Basically I'm streaming audio to other iOS devices through Multipeer Connectivity. I am using this tutorial, and right now I can stream music to other devices and have them play it. However, the local host device doesn't play the music. In order to do this, I basically tried:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
self.outputStreamer = [[TDAudioOutputStreamer alloc] initWithOutputStream:[self.session outputStreamForPeer:peers[0]]];
[self.outputStreamer streamAudioFromURL:[self.song valueForProperty:MPMediaItemPropertyAssetURL]];
[self.outputStreamer start];
self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:[self.song valueForProperty:MPMediaItemPropertyAssetURL]error: NULL];
[self.player play];
}
peers is an array of connected peers; everything is working fine with that. If I comment out the last two lines (the AVAudioPlayer), then streaming to the other devices works, and vice versa. It seems like I can only do one or the other. (self.player is declared in the .h; it is fine.)
Any solution to this double audio playing? Thanks in advance.
You have to create an object of TDAudioInputStreamer on the client end too.
self.inputStream = [[TDAudioInputStreamer alloc] initWithInputStream:stream];
[self.inputStream start];
Do this when you create the output stream.
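For reference, here is a minimal sketch of where the input streamer is usually created on the receiving peer; the MCSessionDelegate callback is standard MultipeerConnectivity, and the property names follow the snippet above:
// Sketch: MCSessionDelegate callback on the client (receiving) device
- (void)session:(MCSession *)session didReceiveStream:(NSInputStream *)stream withName:(NSString *)streamName fromPeer:(MCPeerID *)peerID
{
    // TDAudioInputStreamer comes from the tutorial's library
    self.inputStream = [[TDAudioInputStreamer alloc] initWithInputStream:stream];
    [self.inputStream start];
}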
You can pick your song with the media picker, then you need to convert the asset:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
[self dismissViewControllerAnimated:YES completion:nil];
someMutableArray = [mediaItemCollection items];
NSLog(#"%#",someMutableArray);
MPMediaItem *song=[mediaItemCollection.items objectAtIndex:0];
NSString * type = [song valueForProperty:MPMediaItemPropertyMediaType];
NSURL * url = [song valueForProperty:MPMediaItemPropertyAssetURL];
NSDictionary*dict=[[NSDictionary alloc] init];
AVAsset *asset = [AVAsset assetWithURL:url];
NSArray * metadata = [asset commonMetadata];
NSArray * metadata1 = [asset metadata];
NSArray * metadata2 = [asset availableMetadataFormats];
NSMutableDictionary *info = [NSMutableDictionary dictionary];
info[#"title"] = [song valueForProperty:MPMediaItemPropertyTitle] ? [song valueForProperty:MPMediaItemPropertyTitle] : #"";
info[#"artist"] = [song valueForProperty:MPMediaItemPropertyArtist] ? [song valueForProperty:MPMediaItemPropertyArtist] : #"";
NSNumber *duration=[song valueForProperty:MPMediaItemPropertyPlaybackDuration];
int fullminutes = floor([duration floatValue] / 60); // fullminutes is an int
int fullseconds = trunc([duration floatValue] - fullminutes * 60); // fullseconds is an int
info[#"duration"] = [NSString stringWithFormat:#"%d:%d", fullminutes, fullseconds];
MPMediaItemArtwork *artwork = [song valueForProperty:MPMediaItemPropertyArtwork];
UIImage *image = [artwork imageWithSize:CGSizeMake(150, 150)];
NSData * data = UIImageJPEGRepresentation(image, 0.0);
image = [UIImage imageWithData:data];
if (image)
self.songArtWorkImageView.image = image;
else
self.songArtWorkImageView.image = nil;
self.songTitleLbl.text = [NSString stringWithFormat:@"%@ \n[Artist : %@]", info[@"title"], info[@"artist"]];
[_session sendData:[NSKeyedArchiver archivedDataWithRootObject:[info copy]] toPeers:_session.connectedPeers withMode:MCSessionSendDataReliable error: nil];
@try {
if(_session && _session.connectedPeers && [_session.connectedPeers count] > 0) {
for(int i=0;i<someMutableArray.count;i++){
MPMediaItem *song = [someMutableArray objectAtIndex:i];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[song valueForProperty:MPMediaItemPropertyAssetURL] options:nil];
[self convertAsset: asset complition:^(BOOL Success, NSString *filePath) {
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC), dispatch_get_main_queue(), ^{
if(Success) {
if(image) {
[self saveImage: image withComplition:^(BOOL status, NSString *imageName, NSURL *imageURL) {
if(status) {
@try {
[_session sendResourceAtURL:imageURL withName:imageName toPeer:[_session.connectedPeers objectAtIndex:0] withCompletionHandler:^(NSError *error) {
if (error) {
NSLog(#"Failed to send picture to %#", error.localizedDescription);
return;
}
//Clean up the temp file
NSFileManager *fileManager = [NSFileManager defaultManager];
[fileManager removeItemAtURL:imageURL error:nil];
}];
}
@catch (NSException *exception) {
}
}
}];
}
@try {
if(!self.outputStream) {
NSArray * connnectedPeers = [_session connectedPeers];
if([connnectedPeers count] != 0) {
[self outputStreamForPeer:[_session.connectedPeers objectAtIndex:0]];
}
}
}
@catch (NSException *exception) {
}
if(self.outputStream) {
if(!self.outputStreamer) {
self.outputStreamer = [[TDAudioOutputStreamer alloc] initWithOutputStream:self.outputStream];
}
[self.outputStreamer initStream:filePath];
if(self.outputStreamer) {
[self.outputStreamer start];
}
}
}
else {
[UIView showMessageWithTitle:@"Error!" message:@"Error occurred!" showInterval:1.5];
}
});
}];
}
}
}
@catch (NSException *exception) {
NSLog(@"Exception: %@", [exception debugDescription]);
}
}
My iOS application just got rejected because it stores data in Documents, so it gets backed up by iCloud; this is not allowed since the data is downloaded from a server.
But even though I'm using addSkipBackupAttributeToItemAtURL, the data still shows up in the iCloud backup. Here is the code I have used:
NSError *error = nil;
NSData *data = [NSData dataWithContentsOfURL:url options:0 error:&error];
if(!error)
{
[self HidePreLoader];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSLog(#"_paths--->%#",paths);
NSString *path = [paths objectAtIndex:0];
NSLog(#"_path---->%#",path);
NSString *dataPath = [path stringByAppendingPathComponent:#"/MyFolder"];
NSLog(#"_dataPath---->%#",dataPath);
if (![[NSFileManager defaultManager] fileExistsAtPath:dataPath])
{
[[NSFileManager defaultManager] createDirectoryAtPath:dataPath withIntermediateDirectories:NO attributes:nil error:&error];
}
NSString *zipPath = [dataPath stringByAppendingPathComponent:downfilename];
[data writeToFile:zipPath options:0 error:&error];
if(!error)
{
[self HidePreLoader];
ZipArchive *za = [[ZipArchive alloc] init];
if ([za UnzipOpenFile: zipPath]) {
BOOL ret = [za UnzipFileTo: dataPath overWrite: YES];
if (NO == ret){} [za UnzipCloseFile];
NSLog(#"folderPath--->%#",folderPath);
// Here is the code I have used
NSURL *guidesURL = [NSURL fileURLWithPath:folderPath];
[guidesURL setResourceValue:[NSNumber numberWithBool:YES] forKey:NSURLIsExcludedFromBackupKey error:NULL];
// The following code also doesn't work
//NSURL *guidesURL = [NSURL fileURLWithPath:folderPath];
//[self addSkipBackupAttributeToItemAtURL:guidesURL];
[self HidePreLoader];
NSString *path = nil;
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPad)
{
path = [folderPath stringByAppendingPathComponent:@"module-B1-ipadmain-swipe-tablet.html"];
}
else
{
path = [folderPath stringByAppendingPathComponent:@"module-B1-ipadmain-swipe-phone.html"];
}
NSLog(#"path--->%#",path);
dispatch_async(dispatch_get_main_queue(), ^{
appDel.currentReadingPlanWithPath = path;
appDel.moduleDownloaded = YES;
[appDel SetCurrentDownloadPath];
NSLog(#"file download success");
[progview setHidden:YES];
[progval setHidden:YES];
[self HidePreLoader];
downloadView.userInteractionEnabled=YES;
[self SetModuleDownloaded:@"true"];
[self ShowAlert:@"Download" message:@"Success"];
[self HidePreLoader];
});
[self HidePreLoader];
}
}
else
{
NSLog(#"Error saving file %#",error);
}
}
else
{
NSLog(#"Error downloading zip file: %#", error);
}
});
The addSkipBackupAttributeToItemAtURL method:
-(BOOL)addSkipBackupAttributeToItemAtURL:(NSURL *)URL
{
assert([[NSFileManager defaultManager] fileExistsAtPath: [URL path]]);
NSError *error = nil;
BOOL success = [URL setResourceValue: [NSNumber numberWithBool: YES] forKey: NSURLIsExcludedFromBackupKey error: &error];
if(!success){
NSLog(#"Error excluding %# from backup %#", [URL lastPathComponent], error);
}
return success;
}
When I try the above code, iCloud still shows the application's data on iOS 7. Please help me solve this issue.
Please check the iOS version before using the addSkipBackupAttributeToItemAtURL function.
Also follow this URL:
https://developer.apple.com/library/ios/qa/qa1719/_index.html
and this URL
https://developer.apple.com/library/ios/documentation/iPhone/Conceptual/iPhoneOSProgrammingGuide/PerformanceTuning/PerformanceTuning.html#//apple_ref/doc/uid/TP40007072-CH8-SW9
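As a rough sketch of that version check (NSURLIsExcludedFromBackupKey is available from iOS 5.1, so the 5.1 cutoff below is based on that assumption):
- (void)excludeFromBackupIfSupported:(NSURL *)URL
{
    // NSURLIsExcludedFromBackupKey is only available on iOS 5.1 and later
    NSString *systemVersion = [[UIDevice currentDevice] systemVersion];
    if ([systemVersion compare:@"5.1" options:NSNumericSearch] != NSOrderedAscending) {
        [self addSkipBackupAttributeToItemAtURL:URL];
    }
}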
Hi, I am using the AVCam library for automatic image capture. I don't want to save the image to the photo library; I want to save it to the Documents directory. It saves the image, but I have a problem when I load this image: it gives a bad access crash.
- (void) captureStillImage
{
AVCaptureConnection *stillImageConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self stillImageOutput] connections]];
if ([stillImageConnection isVideoOrientationSupported])
[stillImageConnection setVideoOrientation:orientation];
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
if (error) {
if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
[[self delegate] captureManager:self didFailWithError:error];
}
}
};
if (imageDataSampleBuffer != NULL) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
UIImage *image = [[UIImage alloc] initWithData:imageData];
NSArray *sysPaths = NSSearchPathForDirectoriesInDomains( NSDocumentDirectory, NSUserDomainMask, YES );
NSString *docDirectory = [sysPaths objectAtIndex:0];
NSString *filePath = [NSString stringWithFormat:@"%@/Image.jpg", docDirectory];
NSData *imageDataToSave = [NSData dataWithData:UIImagePNGRepresentation(image)];
[imageDataToSave writeToFile:filePath atomically:YES];
//[self saveImage:image];
[image release];
[library release];
}
else
completionBlock(nil, error);
if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
[[self delegate] captureManagerStillImageCaptured:self];
}
}];
}
and here is how I load the image:
NSArray *sysPaths = NSSearchPathForDirectoriesInDomains( NSDocumentDirectory, NSUserDomainMask, YES );
NSString *docDirectory = [sysPaths objectAtIndex:0];
NSString *filePath = [NSString stringWithFormat:@"%@/Image.jpg", docDirectory];
UIImage* loadedImage = [UIImage imageWithContentsOfFile:filePath];
[ImageView setImage:loadedImage];
The bad access happens when this loadedImage is assigned to any UIImageView.
While writing the file, try:
[UIImagePNGRepresentation(self.imageView.image) writeToFile:pngPath atomically:YES];
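In other words, keep the image format and file path consistent between writing and reading. A minimal sketch, with Image.png as an example file name:
// Write the captured image as PNG to Documents
NSString *docDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *pngPath = [docDirectory stringByAppendingPathComponent:@"Image.png"];
[UIImagePNGRepresentation(image) writeToFile:pngPath atomically:YES];

// Later, read it back from the same path
UIImage *loadedImage = [UIImage imageWithContentsOfFile:pngPath];
[self.imageView setImage:loadedImage];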