Right now, I'm using MNAVChapters to get the chapter metadata for audio files and MPMusicPlayerController to play them.
This works great until I try to load an Audible (AA) book's chapters. MPMediaItemPropertyAssetURL returns nil because this is a "protected" file. Is there an alternative way to read the chapter metadata?
Current non-working code:
NSURL *assetURL = [self.mpmediaitem valueForProperty:MPMediaItemPropertyAssetURL]; //this is null :(
AVAsset *asset = [AVAsset assetWithURL:assetURL];
MNAVChapterReader *reader = [MNAVChapterReader new];
NSArray *chapters = [reader chaptersFromAsset:asset];
I just noticed that this works in iOS 9.
Related
Problem: I want to modify an iTunes song's beats per minute.
Solution I am trying: modifying the beats-per-minute metadata of an AVAsset. I have used this code:
AVMutableMetadataItem *metaBeats = [AVMutableMetadataItem metadataItem];
metaBeats.identifier = AVMetadataIdentifierID3MetadataBeatsPerMinute;
metaBeats.key = AVMetadataIdentifierID3MetadataBeatsPerMinute;
metaBeats.keySpace = AVMetadataKeySpaceID3;
metaBeats.value = [NSNumber numberWithUnsignedInteger:40];
I have also tried the other key, AVMetadataIdentifieriTunesMetadataBeatsPerMin, but neither option works.
The AVAssetExportSession export itself is working fine.
I have seen other Q&As on Stack Overflow that update metadata tags in the common key space, and those work fine, but this does not work at all.
Can anybody tell me what I am doing wrong? Any help will be appreciated. Thanks.
Here is the link to the project code:
https://www.dropbox.com/s/6cdos0k21b2fi3y/MusicEffectBeats.zip?dl=0
NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"sample"
                                          withExtension:@"m4a"];
AVAsset *asset = [AVAsset assetWithURL:inputURL];
In the above code, I cannot see your sample.m4a file.
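For what it's worth, here is a minimal sketch of a variant that may behave better for an .m4a source: in the iTunes key space, the key should be the iTunes key constant rather than an identifier, and the item is written out via an export session. This is a sketch, not the asker's project code; asset and outputURL are assumed to already exist.
AVMutableMetadataItem *metaBeats = [AVMutableMetadataItem metadataItem];
metaBeats.keySpace = AVMetadataKeySpaceiTunes;
metaBeats.key = AVMetadataiTunesMetadataKeyBeatsPerMin; // key constant, not an identifier
metaBeats.identifier = AVMetadataIdentifieriTunesMetadataBeatsPerMin;
metaBeats.value = @40;

AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
session.outputURL = outputURL; // assumed destination file URL
session.outputFileType = AVFileTypeAppleM4A;
session.metadata = @[metaBeats];
[session exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"Export finished with status %ld", (long)session.status);
}];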
Essentially I am looking to concatenate AVAsset files. I've got a rough idea of what to do but I'm struggling with loading the audio files.
I can play the files with an AVAudioPlayer, and I can see them in the directory via my terminal, but when I attempt to load them with AVURLAsset the tracks array always comes back empty.
The URL I am using:
NSURL *firstAudioFileLocation = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", workingDirectory, @"/temp.pcm"]];
Which results in:
file:///Users/evolve/Library/Developer/CoreSimulator/Devices/8BF465E8-321C-47E6-BF2E-049C5E900F3C/data/Containers/Data/Application/4A2D29B2-E5B4-4D07-AE6B-1DD15F5E59A3/Documents/temp.pcm
The asset being loaded:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
However when calling this:
NSLog(#" total tracks %#", test.tracks);
My output is always total tracks ().
My subsequent calls to add them to my AVMutableCompositionTrack end up crashing the app, as the AVAsset seems not to have loaded correctly.
I have played with other variations for loading the asset including:
NSURL *alternativeLocation = [[NSBundle mainBundle] URLForResource:@"temp" withExtension:@"pcm"];
As well as trying to load AVAsset with the options from the documentation:
NSDictionary *assetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
How do I load the tracks from a local resource, recently created by the AVAudioRecorder?
EDIT
I had a poke around and found I can record and load a file with a .caf extension instead.
It seems .pcm is unsupported by AVAsset; this page was also a great help: https://developer.apple.com/documentation/avfoundation/avfiletype
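In case it helps anyone else, a minimal sketch of recording straight to a .caf file (linear PCM in a Core Audio Format container), which AVAsset can then open. The path and settings here are illustrative, and the AVAudioSession setup is omitted.
NSURL *recordURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"temp.caf"]];
NSDictionary *settings = @{AVFormatIDKey: @(kAudioFormatLinearPCM),
                           AVSampleRateKey: @44100.0,
                           AVNumberOfChannelsKey: @1,
                           AVLinearPCMBitDepthKey: @16};
NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:recordURL settings:settings error:&error];
[recorder record];
// ...later, once recording has stopped, AVURLAsset can load tracks from recordURL.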
An AVAsset load is not instantaneous. You need to wait for the data to be available. Example:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
[test loadValuesAsynchronouslyForKeys:@[@"playable", @"tracks"] completionHandler:^{
    // Now tracks is available
    NSLog(@" total tracks %@", test.tracks);
}];
A more detailed example can be found in the documentation.
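For instance, a hedged variant of the snippet above that checks the load status before touching tracks, the way the documentation's examples do; the error handling here is illustrative:
[test loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [test statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        NSLog(@"total tracks %@", test.tracks);
    } else {
        NSLog(@"tracks not loaded yet: %@", error);
    }
}];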
AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:@"myUrl.mp4"]];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithItem:item];
playerViewController.player = player;
[player play];
How can I know when the asset (or item) has fully downloaded the video?
I need it for caching later.
Now I am using:
NSData *videoData = [NSData dataWithContentsOfURL:myUrl];
NSCache *cache = [NSCache new];
[cache setObject:videoData forKey:myUrl];
And when I retrieve the data from the NSCache, I write it to a file and play it:
NSData *videoData = [cache objectForKey:myUrl];
[videoData writeToFile:myPath atomically:YES];
And then
NSURL *url = [NSURL fileURLWithPath:myPath];
AVAsset *asset = [AVAsset assetWithURL:url];
...etc., and play it.
But it is a little bit slow.
If the video data were contained in the AVAsset, I could store the asset (or the AVPlayerItem), but I need to know when it has finished downloading.
How can I implement caching in a different way, or improve this approach? Help.
We can download any data from the cloud using NSURLSession. While a download is in progress and when it completes, NSURLSession fires delegate methods, so we can observe both the progress and the completion events through them. You can implement the downloading functionality using NSURLSession as in the sketch below and in this tutorial:
https://www.raywenderlich.com/110458/nsurlsession-tutorial-getting-started
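A minimal sketch of that delegate-based approach; the VideoDownloader class name and the fixed "cached.mp4" destination are hypothetical placeholders:
@interface VideoDownloader : NSObject <NSURLSessionDownloadDelegate>
@end

@implementation VideoDownloader

- (void)startDownload:(NSURL *)url {
    NSURLSession *session = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration]
                                                          delegate:self
                                                     delegateQueue:[NSOperationQueue mainQueue]];
    [[session downloadTaskWithURL:url] resume];
}

// Progress events arrive here while the download is running.
- (void)URLSession:(NSURLSession *)session
      downloadTask:(NSURLSessionDownloadTask *)downloadTask
      didWriteData:(int64_t)bytesWritten
 totalBytesWritten:(int64_t)totalBytesWritten
totalBytesExpectedToWrite:(int64_t)totalBytesExpectedToWrite {
    if (totalBytesExpectedToWrite > 0) {
        NSLog(@"progress: %.2f", (double)totalBytesWritten / totalBytesExpectedToWrite);
    }
}

// Completion: the file is now fully downloaded; move it out of its temporary
// location before the system deletes it, then cache/play it from disk.
- (void)URLSession:(NSURLSession *)session
      downloadTask:(NSURLSessionDownloadTask *)downloadTask
didFinishDownloadingToURL:(NSURL *)location {
    NSURL *documents = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                              inDomains:NSUserDomainMask].firstObject;
    NSURL *destination = [documents URLByAppendingPathComponent:@"cached.mp4"];
    [[NSFileManager defaultManager] moveItemAtURL:location toURL:destination error:NULL];
}

@end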
I have an app with vertical collection view of video items, each item should repeat itself continuously, until swiped up or down.
So I want to avoid streaming the file every time, and avoid re-downloading files that were downloaded earlier.
I came up with a slightly different approach:
1. Stream the AVURLAsset.
2. When playback reaches the end, use AVAssetExportSession to save the file to disk.
3. Create a new AVPlayerItem with a new AVAsset from the saved file.
4. Call AVPlayer's replaceCurrentItem(with:).
But this is lame, I guess. I have to stream the file two times: once on the first run, and then again when saving it. I'd like to just cache the AVAsset in memory for the current application session, so I don't need to keep files.
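If keeping the AVAsset objects around for the session is enough, a minimal sketch of that idea is below, using an NSCache keyed by URL. Note this caches the asset object, not the downloaded media bytes, so it mainly saves re-creating assets; the AssetCache name is hypothetical.
@interface AssetCache : NSObject
+ (AVAsset *)assetForURL:(NSURL *)url;
@end

@implementation AssetCache

+ (AVAsset *)assetForURL:(NSURL *)url {
    static NSCache<NSURL *, AVURLAsset *> *cache;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{ cache = [NSCache new]; });

    AVURLAsset *asset = [cache objectForKey:url];
    if (asset == nil) {
        asset = [AVURLAsset URLAssetWithURL:url options:nil];
        [cache setObject:asset forKey:url];
    }
    return asset;
}

@end

// Usage: build a fresh player item from the cached asset for each cell.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:[AssetCache assetForURL:videoURL]];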
I am working on an app where I need to upload videos to a server. Now here I have 2 cases:
Shoot a video using UIImagePickerController, generate a thumbnail, and then upload it to the server.
Pick a video from the Photos gallery, generate a thumbnail, and then upload it to the server.
Now the only difference between the two is:
When I use the 'generateImageAsynchronouslyForTimes:completionHandler:' method, I get a call in its completionHandler block and I get an AVAsset. I am using the code below to get its URL:
NSURL *path_url = [(AVURLAsset*)asset URL];
This is where I think things are getting messed up, because I am getting something like this in case 2 (when I pick a video from the gallery):
file:///var/mobile/Media/DCIM/102APPLE/IMG_2439.mp4
So I can't upload it, while case 1 works fine. Is it something related to the sandbox?
What's the difference between these 2 paths?
file:///private/var/mobile/Containers/Data/Application/DA4632E3-FA25-4EBE-9102-62495BF105BF/tmp/trim.07786CFE-2477-4146-9EA0-0A04042A8D05.MOV
file:///var/mobile/Media/DCIM/102APPLE/IMG_2439.mp4
I guess 1) is the app sandbox path.
In iOS, every app is like an island: it runs in its own sandbox environment. So if you want to upload a video that is not in your sandbox, you will have to copy that video into your sandbox first, and then you can upload it. This is how you can do this:
NSURL *path_url = [(AVURLAsset *)asset URL];
// fetchResult is assumed to be a PHFetchResult for the picked video
PHAssetResource *asset_resource = [[PHAssetResource assetResourcesForAsset:[fetchResult lastObject]] firstObject];
PHAssetResourceRequestOptions *options = [PHAssetResourceRequestOptions new];
options.networkAccessAllowed = YES; // allow a download from iCloud if needed
NSURL *newURL = [self getSandboxURLFromURL:path_url];
[[PHAssetResourceManager defaultManager] writeDataForAssetResource:asset_resource toFile:newURL options:options completionHandler:^(NSError * _Nullable error) {
    // here you will get the newURL that you will use...
}];
// method to get the sandbox URL
- (NSURL *)getSandboxURLFromURL:(NSURL *)photos_gallery_url {
    NSString *last_path_component = [photos_gallery_url lastPathComponent];
    NSString *pathToWrite = [NSTemporaryDirectory() stringByAppendingString:last_path_component];
    NSURL *localpath = [NSURL fileURLWithPath:pathToWrite];
    return localpath;
}
I use AVAudioPlayer to play an audio file and a UISlider to show the user the current time. At first everything looked fine, but then I noticed that the audio player returns the wrong file duration. For example, it returns a duration of 3.5 seconds when the file is actually 6 seconds long.
Do you know what can cause this problem?
Below you can see my code, which returns the file duration:
- (NSTimeInterval)audioDurationAtURL:(NSURL *)url
{
    NSError *error;
    NSData *data = [NSData dataWithContentsOfURL:url];
    _audioPlayer = [[AVAudioPlayer alloc] initWithData:data error:&error];
    return _audioPlayer.duration;
}
To add a bit to TonyMkenu's answer, AVAsset is an alternative with the ability to give you a more accurate duration.
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVAsset_Class/Reference/Reference.html#//apple_ref/occ/instp/AVAsset/duration
If you specify providesPreciseDurationAndTiming = YES, then AVAsset will decode the file if needed to determine its duration with accuracy. If the decode time is too long for your use, you can disable it.
In my situation, I use the AVURLAsset subclass:
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:localURL options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
float t = CMTimeGetSeconds(asset.duration);
AVAudioPlayer appears to return the correct duration of a file only when it is ready to play it, so try checking the length of the audio file after [_audioPlayer play];
Under certain compression formats, the audio player may change its estimate of the duration as it learns more about the song by playing (and hence decoding) more of it - https://stackoverflow.com/a/16265186
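A small sketch of the check described above, assuming url points at the audio file in question:
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
[player prepareToPlay];
NSLog(@"duration before play: %f", player.duration); // may be an estimate
[player play];
NSLog(@"duration after play: %f", player.duration);  // should now be accurate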
In my case, the problem was that I had added an audio file to the project, edited it (making it longer or shorter), and then deleted and re-added it to the Xcode project.
Essentially the project was caching the old file. To debug this I renamed the audio file, added the new audio file to the project, and after that the duration reported by the player was always correct, before and after calling play.