Essentially I am looking to concatenate AVAsset files. I've got a rough idea of what to do but I'm struggling with loading the audio files.
I can play the files with an AVAudioPlayer, and I can see them in the directory from my terminal, but when I attempt to load them with AVURLAsset the tracks property always returns an empty array.
The URL I am using:
NSURL *firstAudioFileLocation = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", workingDirectory, @"/temp.pcm"]];
Which results in:
file:///Users/evolve/Library/Developer/CoreSimulator/Devices/8BF465E8-321C-47E6-BF2E-049C5E900F3C/data/Containers/Data/Application/4A2D29B2-E5B4-4D07-AE6B-1DD15F5E59A3/Documents/temp.pcm
The asset being loaded:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
However when calling this:
NSLog(@" total tracks %@", test.tracks);
My output is always total tracks ().
My subsequent calls to add them to my AVMutableCompositionTrack end up crashing the app as the AVAsset seems to not have loaded correctly.
I have played with other variations for loading the asset including:
NSURL *alternativeLocation = [[NSBundle mainBundle] URLForResource:@"temp" withExtension:@"pcm"];
As well as trying to load AVAsset with the options from the documentation:
NSDictionary *assetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
How do I load the tracks from a local resource, recently created by the AVAudioRecorder?
EDIT
I had a poke around and found that I can record and load a .caf file instead. It seems .pcm is unsupported by AVAsset; this page was also a great help: https://developer.apple.com/documentation/avfoundation/avfiletype
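For anyone hitting the same wall, this is roughly how the recorder can be pointed at a .caf file instead. A sketch, not my exact code: it assumes the same workingDirectory as above and plain 16-bit mono PCM settings.

```objc
// Sketch: record linear PCM into a Core Audio Format (.caf) container,
// which AVAsset can open. workingDirectory is assumed from the question.
NSURL *outputURL = [NSURL fileURLWithPath:
    [workingDirectory stringByAppendingPathComponent:@"temp.caf"]];
NSDictionary *settings = @{
    AVFormatIDKey:          @(kAudioFormatLinearPCM),
    AVSampleRateKey:        @44100.0,
    AVNumberOfChannelsKey:  @1,
    AVLinearPCMBitDepthKey: @16,
    AVLinearPCMIsFloatKey:  @NO,
};
NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:outputURL
                                                        settings:settings
                                                           error:&error];
[recorder record];
```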
An AVAsset load is not instantaneous; you need to wait for the data to become available. Example:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
[test loadValuesAsynchronouslyForKeys:@[@"playable", @"tracks"] completionHandler:^{
    // Now tracks is available
    NSLog(@" total tracks %@", test.tracks);
}];
A more detailed example can be found in the documentation.
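If you want to catch load failures too, the completion handler can check each key's status via AVAsynchronousKeyValueLoading — something along these lines:

```objc
// Sketch: check whether the "tracks" key actually loaded before using it.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        NSLog(@"total tracks %@", asset.tracks);
    } else {
        // AVKeyValueStatusFailed, AVKeyValueStatusCancelled, etc.
        NSLog(@"tracks failed to load: %@", error);
    }
}];
```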
Related
Problem: I want to modify an iTunes song's beats per minute.
Solution I am trying: modifying the beats per minute of an audio AVAsset, using this code:
AVMutableMetadataItem *metaBeats = [AVMutableMetadataItem metadataItem];
metaBeats.identifier = AVMetadataIdentifierID3MetadataBeatsPerMinute;
metaBeats.key = AVMetadataIdentifierID3MetadataBeatsPerMinute;
metaBeats.keySpace = AVMetadataKeySpaceID3;
metaBeats.value = [NSNumber numberWithUnsignedInteger:40];
I have also tried the other identifier, AVMetadataIdentifieriTunesMetadataBeatsPerMin, but neither option works at all.
The AVAssetExportSession export function itself works fine. I have seen other Q&As on Stack Overflow that update the common-key metadata tags, and those work fine, but this does not.
Can anybody tell me what I am doing wrong? Any help will be appreciated. Thanks
Here is the link to project code
https://www.dropbox.com/s/6cdos0k21b2fi3y/MusicEffectBeats.zip?dl=0
NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"sample"
                                          withExtension:@"m4a"];
AVAsset *asset = [AVAsset assetWithURL:inputURL];
In the above code, I cannot find your sample.m4a file.
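One hedged guess about the metadata code itself: an AVMetadataItem identifier already encodes both the key and the key space, so assigning an identifier constant to the key property mismatches the ID3 key space. A minimal variant to try (untested; exportSession stands in for your AVAssetExportSession):

```objc
// Sketch: let the identifier imply key/keySpace instead of setting them
// to mismatched constants. ID3's TBPM frame traditionally carries a string.
AVMutableMetadataItem *metaBeats = [AVMutableMetadataItem metadataItem];
metaBeats.identifier = AVMetadataIdentifierID3MetadataBeatsPerMinute;
metaBeats.value = @"40";
exportSession.metadata = @[metaBeats];
```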
I am saving the recorded file to the Documents directory and then merging it with other audio files stored in the app bundle, using m4a encoding for all files. The file path format is path/to/documents/directory/filename.m4a, extension included.
I am creating AVURLAsset like this:
NSURL* audioURL = [NSURL fileURLWithPath:audioPath];
AVURLAsset* audioAsset = [AVURLAsset assetWithURL:audioURL];
Logged like this:
NSLog(@"filename: %@", audioAsset.URL.lastPathComponent); // filename: filename.m4a
But the AVURLAsset is neither readable nor playable, and when I get the AVAssetTrack array with tracksWithMediaType: like this:
NSArray<AVAssetTrack *>* tracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack* audioTrack = [tracks objectAtIndex:0]; //Exception occurs
Exception: *** -[__NSArrayM objectAtIndex:]: index 0 beyond bounds for empty array thrown
because the AVAssetTrack array is empty. I am testing on the simulator with iOS 10.3.1.
Tried this according to iCreative's comment:
AVMediaSelectionGroup *audioTracks = [audioAsset mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible];
NSLog(@"MediaSelectionGroup count: %lu", (unsigned long)audioTracks.options.count); // count is 0
There isn't any issue in this code; the problem was before this function. I was calling the merging function before stopping the recorder, so the file had not yet been released by the recorder while the merger was accessing it.
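In code, the fix looks roughly like this (a sketch; mergeAudioFiles stands in for the actual merging method, and self.recorder is the AVAudioRecorder):

```objc
// Stop the recorder first, and only merge once it reports it is finished,
// so the file on disk is complete and released.
- (void)stopRecordingAndMerge {
    self.recorder.delegate = self;
    [self.recorder stop];
}

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder
                           successfully:(BOOL)flag {
    if (flag) {
        [self mergeAudioFiles]; // safe: the recorded file is fully written now
    }
}
```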
AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:@"myUrl.mp4"]];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithItem:item];
playerViewController.player = player;
[player play];
How can I know when the asset (or item, or whatever) has fully downloaded the video? I need this so I can cache it for later.
Right now I'm using:
NSData *videoData = [NSData dataWithContentsOfURL:myUrl];
NSCache *cache = [NSCache new];
[cache setObject:videoData forKey:myUrl];
And when I retrieve the data from the NSCache, I write it to a file and play it:
NSData *videoData = [cache objectForKey:myUrl];
[videoData writeToFile:myPath atomically:YES];
And then
NSURL *url = [NSURL fileURLWithPath:myPath];
AVAsset *asset = [AVAsset assetWithURL:url];
...etc., and play. But it is a little bit slow.
If the video data were contained in the AVAsset (or the AVPlayerItem), I could store that instead, but I need to know when it has finished downloading.
How can I implement caching in a different way, or improve this approach? Help.
We can download any data from the cloud using NSURLSession. While a download is in progress, and on completion, NSURLSession fires delegate methods, so we can track both progress and completion events. You can implement the downloading functionality with NSURLSession as shown here:
https://www.raywenderlich.com/110458/nsurlsession-tutorial-getting-started
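A minimal completion-handler sketch (videoURL and cachedURL are placeholders; for progress events you would use a delegate-based NSURLSession instead of the shared one):

```objc
// Download the video, then move it out of the temporary location before
// the system deletes it. Once the move succeeds, the file is fully
// downloaded and can be played from disk.
NSURLSessionDownloadTask *task = [[NSURLSession sharedSession]
    downloadTaskWithURL:videoURL
      completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
          if (error != nil) {
              return;
          }
          [[NSFileManager defaultManager] moveItemAtURL:location
                                                  toURL:cachedURL
                                                  error:nil];
      }];
[task resume];
```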
I have an app with a vertical collection view of video items; each item should repeat itself continuously until swiped up or down.
So I need to avoid streaming the file every time, and avoid re-downloading files that were downloaded earlier.
I came up with a slightly different approach:
Stream the AVURLAsset,
When playback reaches the end, use AVAssetExportSession to save the file to disk,
Then create a new AVPlayerItem with a new AVAsset backed by the saved file,
Call AVPlayer's replaceCurrentItem(with:) to swap in the new item.
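The steps above can be sketched as follows (assuming a self.player and a file URL cachedFileURL you control):

```objc
// When the streamed item finishes, export what was downloaded to disk,
// then loop playback from the local copy.
[[NSNotificationCenter defaultCenter]
    addObserverForName:AVPlayerItemDidPlayToEndTimeNotification
                object:self.player.currentItem
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
    AVAsset *asset = self.player.currentItem.asset;
    AVAssetExportSession *export =
        [AVAssetExportSession exportSessionWithAsset:asset
                                          presetName:AVAssetExportPresetPassthrough];
    export.outputURL = cachedFileURL;
    export.outputFileType = AVFileTypeQuickTimeMovie;
    [export exportAsynchronouslyWithCompletionHandler:^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            AVPlayerItem *localItem = [AVPlayerItem playerItemWithURL:cachedFileURL];
            [self.player replaceCurrentItemWithItem:localItem];
        }
    }];
}];
```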
But this is lame, I guess: I have to stream the file two times, first on the initial run and again when saving it. I'd like to just cache the AVAsset in memory for the current application session, so I don't need to keep files.
I am working on an app where I need to upload videos to a server. There are two cases:
Shoot a video using UIImagePickerController, generate a thumbnail, and then upload it to the server.
Pick a video from the Photos gallery, generate a thumbnail, and then upload it to the server.
Now the only difference between the two is:
When I use the generateCGImagesAsynchronouslyForTimes:completionHandler: method, I get a call in its completionHandler block and I get an AVAsset. Now I am using the code below to get its URL:
NSURL *path_url = [(AVURLAsset*)asset URL];
This is where I think things are getting messed up, because I am getting something like this in case 2 (when I pick the video from the gallery):
file:///var/mobile/Media/DCIM/102APPLE/IMG_2439.mp4
So I can't upload it in case 2, while case 1 works fine. Is it something related to the sandbox?
What's the difference between these 2 paths?
file:///private/var/mobile/Containers/Data/Application/DA4632E3-FA25-4EBE-9102-62495BF105BF/tmp/trim.07786CFE-2477-4146-9EA0-0A04042A8D05.MOV
file:///var/mobile/Media/DCIM/102APPLE/IMG_2439.mp4
I guess 1) is the app-sandbox path.
In iOS, every app is like an island: it runs in its own sandbox environment. So if you want to upload a video that is not in your sandbox, you have to copy it into your sandbox first, and then you can upload it. This is how you can do that:
NSURL *path_url = [(AVURLAsset*)asset URL];
PHAssetResource *asset_resource = [[PHAssetResource assetResourcesForAsset:[fetchResult lastObject]]firstObject];
PHAssetResourceRequestOptions *options = [PHAssetResourceRequestOptions new];
options.networkAccessAllowed = YES;
NSURL *newURL = [self getSandboxURLFromURL:path_url];
[[PHAssetResourceManager defaultManager] writeDataForAssetResource:asset_resource toFile:newURL options:options completionHandler:^(NSError * _Nullable error) {
//here you will get the newURL that you will use...
}];
// method to get sandbox URL
- (NSURL *)getSandboxURLFromURL:(NSURL *)photos_gallery_url {
    NSString *last_path_component = [photos_gallery_url lastPathComponent];
    NSString *pathToWrite = [NSTemporaryDirectory() stringByAppendingPathComponent:last_path_component];
    NSURL *localpath = [NSURL fileURLWithPath:pathToWrite];
    return localpath;
}
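For completeness, the fetchResult used above is assumed to come from a Photos fetch — for example, something like this (assetLocalIdentifier is hypothetical; it would come from the picker, e.g. the UIImagePickerControllerPHAsset info key on iOS 11+):

```objc
// Hedged sketch: fetch the PHAsset behind the picked item by its
// local identifier, yielding the fetchResult queried above.
PHFetchResult<PHAsset *> *fetchResult =
    [PHAsset fetchAssetsWithLocalIdentifiers:@[assetLocalIdentifier] options:nil];
```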
Right now, I'm using MNAVChapters to get the chapter metadata for audio files and MPMusicPlayerController to play them.
This works great until I try to load an Audible (AA) book's chapters: MPMediaItemPropertyAssetURL returns nil because this is a "protected" file. Is there an alternative way to read the chapter metadata?
Current non-working code:
NSURL *assetURL = [self.mpmediaitem valueForProperty:MPMediaItemPropertyAssetURL]; //this is null :(
AVAsset *asset = [AVAsset assetWithURL:assetURL];
MNAVChapterReader *reader = [MNAVChapterReader new];
NSArray *chapters = [reader chaptersFromAsset:asset];
I just noticed that this works in iOS 9.