Caching Video iOS

AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:@"myUrl.mp4"]];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
playerViewController.player = player;
[player play];
How can I know when the asset (or item, or whatever holds the downloaded video) has been fully downloaded?
I need that so I can cache it for later.
Now I'm using:
NSData *videoData = [NSData dataWithContentsOfURL:myUrl];
NSCache *cache = [NSCache new];
[cache setObject:videoData forKey:myUrl];
And when I retrieve the data from NSCache, I write it to a file and play it:
NSData *videoData = [cache objectForKey:myUrl];
[videoData writeToFile:myPath atomically:YES];
And then
NSURL *url = [NSURL fileURLWithPath:myPath];
AVAsset *asset = [AVAsset assetWithURL:url];
...etc., and play.
But it is a little bit slow.
If the video data lives inside the AVAsset (or the AVPlayerItem), I could store that instead, but I need to know when it has finished downloading.
How can I implement caching in a different way, or improve this one? Help.

We can download any data from the cloud using NSURLSession. While a download is in progress and when it completes, NSURLSession fires delegate methods, so we can track both progress and completion events through them. You can implement the downloading functionality with NSURLSession as described here:
https://www.raywenderlich.com/110458/nsurlsession-tutorial-getting-started
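A minimal sketch of that delegate-based approach (the class name VideoDownloader and the Caches destination are assumptions, not from the question):

#import <Foundation/Foundation.h>

// Sketch: download a video with NSURLSession and learn when it has fully arrived.
@interface VideoDownloader : NSObject <NSURLSessionDownloadDelegate>
- (void)downloadVideoAtURL:(NSURL *)url;
@end

@implementation VideoDownloader

- (void)downloadVideoAtURL:(NSURL *)url {
    NSURLSessionConfiguration *config = [NSURLSessionConfiguration defaultSessionConfiguration];
    NSURLSession *session = [NSURLSession sessionWithConfiguration:config
                                                          delegate:self
                                                     delegateQueue:[NSOperationQueue mainQueue]];
    [[session downloadTaskWithURL:url] resume];
}

// Progress: called repeatedly while bytes arrive.
- (void)URLSession:(NSURLSession *)session
      downloadTask:(NSURLSessionDownloadTask *)downloadTask
      didWriteData:(int64_t)bytesWritten
 totalBytesWritten:(int64_t)totalBytesWritten
totalBytesExpectedToWrite:(int64_t)totalBytesExpectedToWrite {
    NSLog(@"progress: %.2f", (double)totalBytesWritten / (double)totalBytesExpectedToWrite);
}

// Completion: the file is fully downloaded to a temporary location; move it before returning.
- (void)URLSession:(NSURLSession *)session
      downloadTask:(NSURLSessionDownloadTask *)downloadTask
didFinishDownloadingToURL:(NSURL *)location {
    NSURL *caches = [[NSFileManager defaultManager] URLsForDirectory:NSCachesDirectory
                                                           inDomains:NSUserDomainMask].firstObject;
    NSURL *destination = [caches URLByAppendingPathComponent:downloadTask.originalRequest.URL.lastPathComponent];
    [[NSFileManager defaultManager] moveItemAtURL:location toURL:destination error:nil];
    // The cached file can now be played with [AVAsset assetWithURL:destination].
}

@end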

I have an app with a vertical collection view of video items; each item should repeat itself continuously until it is swiped up or down.
So I need to avoid streaming the file every time, and avoid re-downloading files that were downloaded earlier.
I came up with a slightly different approach:
Stream an AVURLAsset,
When playback reaches the end, use AVAssetExportSession to save the file to disk (see the sketch after this list),
Then create a new AVPlayerItem with a new AVAsset backed by the saved file,
Call AVPlayer's replaceCurrentItem(with:).
But this is lame, I guess. I have to stream the file two times: once on the first run, and then again when saving the file. I'd like to just cache the AVAsset in memory for the current application session, so I don't need to keep files.
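For reference, a minimal sketch of the export step described in the list above (the preset, output type, and Caches location are assumptions):

#import <AVFoundation/AVFoundation.h>

// Sketch: once the asset has streamed to the end, export it to a local file for reuse.
- (void)exportAsset:(AVAsset *)asset toFileNamed:(NSString *)fileName {
    NSURL *caches = [[NSFileManager defaultManager] URLsForDirectory:NSCachesDirectory
                                                           inDomains:NSUserDomainMask].firstObject;
    NSURL *outputURL = [caches URLByAppendingPathComponent:fileName];

    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeMPEG4;

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            // The saved file can back a new AVPlayerItem on the next loop.
            NSLog(@"exported to %@", outputURL);
        } else {
            NSLog(@"export failed: %@", exporter.error);
        }
    }];
}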

Related

AVAsset tracks is empty

Essentially I am looking to concatenate AVAsset files. I've got a rough idea of what to do, but I'm struggling with loading the audio files.
I can play the files with an AVAudioPlayer, and I can see them in the directory via my terminal, but when I attempt to load them with AVURLAsset it always returns an empty array for tracks.
The URL I am using:
NSURL *firstAudioFileLocation = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", workingDirectory, @"/temp.pcm"]];
Which results in:
file:///Users/evolve/Library/Developer/CoreSimulator/Devices/8BF465E8-321C-47E6-BF2E-049C5E900F3C/data/Containers/Data/Application/4A2D29B2-E5B4-4D07-AE6B-1DD15F5E59A3/Documents/temp.pcm
The asset being loaded:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
However when calling this:
NSLog(@" total tracks %@", test.tracks);
My output is always total tracks ().
My subsequent calls to add them to my AVMutableCompositionTrack end up crashing the app as the AVAsset seems to not have loaded correctly.
I have played with other variations for loading the asset including:
NSURL *alternativeLocation = [[NSBundle mainBundle] URLForResource:@"temp" withExtension:@"pcm"];
As well as trying to load AVAsset with the options from the documentation:
NSDictionary *assetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
How do I load the tracks from a local resource, recently created by the AVAudioRecorder?
EDIT
I had a poke around and found I can record and load a file with the .caf extension.
It seems .pcm is unsupported by AVAsset; this page was also of great help: https://developer.apple.com/documentation/avfoundation/avfiletype
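For completeness, a hedged sketch of recording into a .caf container that AVAsset can open afterwards (the recorder settings are assumptions; only the .caf extension comes from the edit above):

#import <AVFoundation/AVFoundation.h>

// Sketch: record linear PCM into a Core Audio Format (.caf) file instead of raw .pcm.
- (AVAudioRecorder *)startRecordingToCAF {
    NSURL *documents = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                              inDomains:NSUserDomainMask].firstObject;
    NSURL *recordingURL = [documents URLByAppendingPathComponent:@"temp.caf"];

    NSDictionary *settings = @{AVFormatIDKey: @(kAudioFormatLinearPCM),
                               AVSampleRateKey: @44100.0,
                               AVNumberOfChannelsKey: @1};

    NSError *error = nil;
    AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:recordingURL
                                                            settings:settings
                                                               error:&error];
    [recorder record];
    // Later: [recorder stop]; then the file loads with
    // AVURLAsset *asset = [AVURLAsset URLAssetWithURL:recordingURL options:nil];
    return recorder;
}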
An AVAsset load is not instantaneous. You need to wait for the data to be available. Example:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
[test loadValuesAsynchronouslyForKeys:@[@"playable", @"tracks"] completionHandler:^{
    // Now tracks is available
    NSLog(@" total tracks %@", test.tracks);
}];
A more detailed example can be found in the documentation.
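Since the original goal was concatenation, here is a hedged sketch of what might follow once the tracks have loaded (the composition setup is an assumption and reuses the test asset from the snippet above):

// Sketch: append the loaded asset's first audio track to a mutable composition.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *audioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];

[test loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
    NSError *error = nil;
    AVAssetTrack *sourceTrack = [test tracksWithMediaType:AVMediaTypeAudio].firstObject;
    if (sourceTrack) {
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, test.duration)
                            ofTrack:sourceTrack
                             atTime:composition.duration // append at the current end
                              error:&error];
    }
}];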

How to manually change the streaming video quality in AVPlayer in iOS?

I am building an application in which online streaming is handled by AVPlayer (the default iOS player).
I want to add a button for HD streaming. How do I achieve that?
The solution I found was to ensure that the underlying AVAsset is ready to return basic info, such as its duration, before feeding it to the AVPlayer. AVAsset has a method loadValuesAsynchronouslyForKeys: which is handy for this:
AVAsset *asset = [AVAsset assetWithURL:self.mediaURL];
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    AVPlayerItem *newItem = [[AVPlayerItem alloc] initWithAsset:asset];
    [self.avPlayer replaceCurrentItemWithPlayerItem:newItem];
}];
In my case the URL is a network resource, and replaceCurrentItemWithPlayerItem: will actually block for several seconds waiting for this information to download otherwise.
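For the HD button itself, one property that exists on AVPlayerItem for capping or un-capping HLS variant selection is preferredPeakBitRate. A hedged sketch (the 1 Mbps cap is an arbitrary example, not from the question):

// Sketch: toggle between unrestricted ("HD") and bitrate-capped ("SD") variant selection.
- (void)setHDEnabled:(BOOL)hdEnabled forItem:(AVPlayerItem *)item {
    // 0 means no limit, so AVPlayer may pick the highest-quality variant;
    // a non-zero value caps variant selection to roughly that bitrate.
    item.preferredPeakBitRate = hdEnabled ? 0 : 1000000;
}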

How to save an array of AVPlayerItems to file or NSUserDefaults

I am currently trying to figure out a way to create a playlist from an array of AVPlayerItems and save the array to NSUserDefaults, but it won't work.
Here is some code:
AVAsset *asset = [AVAsset assetWithURL:_videoURL];
AAPLPlayerViewController __weak *weakSelf = self;
NSArray *oldItemsArray = [weakSelf.player items];
AVPlayerItem *newPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
[weakSelf.player insertItem:newPlayerItem afterItem:nil];
[weakSelf queueDidChangeFromArray:oldItemsArray toArray:[self.player items]];
I am trying to create a playlist and store the AVPlayerItems from the queue of the AVQueuePlayer, but since [self.player items] contains complex objects (AVPlayerItem and AVAsset), the app crashes when I try to save it.
I've also tried to store it in a mutable dictionary, but no method I've tried has worked. Can anyone with experience suggest a good way of creating a playlist and storing it for the AVQueuePlayer?
I am using the AVFoundationQueuePlayer Objective-C sample from the iOS developer downloads at this link:
https://developer.apple.com/library/ios/samplecode/AVFoundationQueuePlayer-iOS/Introduction/Intro.html#//apple_ref/doc/uid/TP40016104
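AVPlayerItem and AVAsset are not property-list types, which is why NSUserDefaults rejects them. One possible workaround, sketched here as an assumption rather than anything from the sample: persist only the underlying asset URLs and rebuild the AVPlayerItems on launch (the "playlistURLs" key is hypothetical).

// Sketch: save plist-friendly URL strings, then rebuild the queue from them.
- (void)savePlaylistFromPlayer:(AVQueuePlayer *)player {
    NSMutableArray<NSString *> *urlStrings = [NSMutableArray array];
    for (AVPlayerItem *item in player.items) {
        if ([item.asset isKindOfClass:[AVURLAsset class]]) {
            [urlStrings addObject:((AVURLAsset *)item.asset).URL.absoluteString];
        }
    }
    [[NSUserDefaults standardUserDefaults] setObject:urlStrings forKey:@"playlistURLs"];
}

- (AVQueuePlayer *)restorePlaylist {
    NSArray<NSString *> *urlStrings =
        [[NSUserDefaults standardUserDefaults] stringArrayForKey:@"playlistURLs"];
    NSMutableArray<AVPlayerItem *> *items = [NSMutableArray array];
    for (NSString *urlString in urlStrings) {
        NSURL *url = [NSURL URLWithString:urlString];
        if (url) {
            [items addObject:[AVPlayerItem playerItemWithURL:url]];
        }
    }
    return [AVQueuePlayer queuePlayerWithItems:items];
}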

How to play a radio station on button press in iOS

I want to play a radio station when someone presses the play radio button. However, it does not work.
Here is my code:
NSURL *url = [NSURL URLWithString:@"http://www.radio.net.pk/"];
NSData *data = [NSData dataWithContentsOfURL:url];
AVAudioPlayer *audio = [[AVAudioPlayer alloc] initWithData:data error:nil];
[audio play];
The AVAudioPlayer class only lets you play sound in any audio format available in iOS and OS X.
This means the AVAudioPlayer class does not provide support for streaming audio over HTTP URLs; the URL used with initWithContentsOfURL: must be a local file URL.
So you would have to save the NSData locally (in chunks) and play it with AVAudioPlayer.
It will never work this way.
Does radio.net.pk offer any APIs?
dataWithContentsOfURL will return the HTML of that page, which is what you are trying to play with AVAudioPlayer. Please go through the website again and check for a developer API.
// Use a direct stream URL for this channel
NSURL *url6 = [NSURL URLWithString:@"http://sc6.spacialnet.com:35464/"];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:url6]; // wrap the stream URL in a player item
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];    // hand the player item to AVPlayer
// or, equivalently:
// AVPlayer *player = [AVPlayer playerWithURL:url6];

AVAudioPlayer returns wrong file duration

I use AVAudioPlayer to play an audio file and a UISlider to show the user the current time. At first everything looked fine, but then I noticed that the audio player returns the wrong file duration. For example, it returns a duration of 3.5 seconds when the file's real duration is 6 seconds.
Do you know what can cause this problem?
Below is my code that returns the file duration:
- (NSTimeInterval)audioDurationAtURL:(NSURL *)url
{
    NSError *error;
    NSData *data = [NSData dataWithContentsOfURL:url];
    _audioPlayer = [[AVAudioPlayer alloc] initWithData:data error:&error];
    return _audioPlayer.duration;
}
To add a bit to TonyMkenu's answer, AVAsset is an alternative with the ability to give you a more accurate duration.
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVAsset_Class/Reference/Reference.html#//apple_ref/occ/instp/AVAsset/duration
If you request precise duration and timing (providesPreciseDurationAndTiming = YES, set via the AVURLAssetPreferPreciseDurationAndTimingKey option), then AVAsset will decode the file if needed to determine its duration accurately. If the decode time is too long for your use, you can disable it.
In my situation, I use the AVURLAsset subclass:
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:localURL options:[NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil]];
float t = CMTimeGetSeconds(asset.duration);
AVAudioPlayer appears to return the correct duration of a file only once it is ready to play it, so try checking the length of the audio file after [_audioPlayer play];
Under certain compression formats, the audio player may change its estimate of the duration as it learns more and more about the song by playing (and hence decoding) more and more of it - https://stackoverflow.com/a/16265186
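A minimal sketch of that ordering, reading the duration only after the player has prepared and started the file (the helper name is hypothetical):

// Sketch: let the player decode the file before trusting .duration.
- (NSTimeInterval)durationOfAudioAtURL:(NSURL *)url {
    NSError *error = nil;
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    [player prepareToPlay]; // preloads buffers and starts decoding
    [player play];
    return player.duration; // the reported duration is more reliable once playback has begun
}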
In my case, the problem was that I had added an audio file to the project, then edited it (making it longer or shorter), then deleted it and re-added it to the Xcode project.
Essentially, the project was caching the old file. To debug this I renamed the audio file and added the new file to the project; after that, the duration reported by the player was always correct, both before and after calling play.
