First of all, thank you for your help.
The questions:
1. I need to test a lot of videos, and I wrote two methods to test them.
2. When a playing video's status changes, I set avPlayer = nil and avPlayerItem = nil, and I also added @autoreleasepool {}, but the memory still slowly increases. I used Instruments but did not find any memory leaks; I read a lot of questions and did not find a solution.
3. Here is my code:
- (void)CheckVideoURlCanPlay {
    @autoreleasepool {
        VideoPlayDataObj *videoPlay = [self.needCheckArr objectAtIndex:0];
        NSString *str = videoPlay.videoURL;
        NSURL *url = [NSURL URLWithString:str];

        self.playItem = [AVPlayerItem playerItemWithURL:url];
        self.player = [AVPlayer playerWithPlayerItem:self.playItem];

        self.playLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        self.playLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        self.playLayer.frame = CGRectMake(100, 80, 100, 100);
        [self.view.layer addSublayer:self.playLayer];

        [self.player play];
        [self.playItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    }
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSString *,id> *)change context:(void *)context {
    @autoreleasepool {
        if ([keyPath isEqualToString:@"status"]) {
            if (self.player.currentItem.status == AVPlayerItemStatusReadyToPlay) {
                @autoreleasepool {
                    [self.player.currentItem.asset cancelLoading];
                    [self.player.currentItem cancelPendingSeeks];
                    [self.player cancelPendingPrerolls];
                    [self.playItem removeObserver:self forKeyPath:@"status"];
                    self.playItem = nil;
                    self.player = nil;
                }
                [self.needCheckArr removeObjectAtIndex:0];
                [NSThread sleepForTimeInterval:5];
                [self CheckVideoURlCanPlay];
            }
        }
    }
}
4. I also tried releasing the current view, but that does not release everything. If you play a large number of videos, memory climbs and the app is eventually killed.
5. I wonder: after loading a video, does it produce dirty memory?
This is my app's memory at launch: 13 MB.
After testing some videos and popping the view: [memory screenshot]
After testing some videos again and popping the view: [memory screenshot]
6. Finally, I hope you can help me. Thank you very much.
It does not look like you are ever removing the AVPlayerLayer from the view:
[self.playLayer removeFromSuperlayer];
Also, I cannot see the point of @autoreleasepool here, especially if you have ARC turned on (which you almost certainly do!). @autoreleasepool is usually only useful when you have ARC turned off, or when you are churning through a lot of memory (as in, many megabytes) in a single main-loop invocation and need to control when it gets cleaned up.
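For example, the ready-to-play branch of the question's KVO callback could detach the layer before dropping the player (a sketch using the question's own property names):

if ([keyPath isEqualToString:@"status"] &&
    self.player.currentItem.status == AVPlayerItemStatusReadyToPlay) {
    [self.playItem removeObserver:self forKeyPath:@"status"];
    [self.playLayer removeFromSuperlayer]; // stop the view's layer tree from retaining the player layer
    self.playLayer = nil;                  // and drop our own reference to it
    self.playItem = nil;
    self.player = nil;
}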
Related
So I have an AVPlayer playing a live stream of an .m3u8 video, and from what I've found searching, it looks like you can't use AVAssetImageGenerator to take a screenshot; instead you should use AVPlayerItemVideoOutput and
- (CVPixelBufferRef)copyPixelBufferForItemTime:(CMTime)itemTime itemTimeForDisplay:(CMTime *)outItemTimeForDisplay
but when I try to get the outputs from my AVPlayer,
NSArray *outputs = self.mainPlayer.currentItem.outputs;
I get an empty array.
The video plays just fine. Ultimately, what I want is a method like this:
-(UIImage *)frameFor:(CMTime)time;
At some point the CALayer on the view needs to be getting this image data, so there has to be a way to grab it. I tried just capturing the CALayer my AVPlayerLayer is attached to, but I don't get anything more than the blank view color (bright pink, just to make sure it's returning something). There has to be some way of grabbing this data.
You're getting an empty array when you do
NSArray *outputs = self.mainPlayer.currentItem.outputs;
because the AVPlayerItemVideoOutput object needs to be added first:
NSDictionary *settings = @{(id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]};
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
[playerItem addOutput:self.videoOutput];
and then when the AVPlayerItem's status is AVPlayerItemStatusReadyToPlay you can capture the current frame using copyPixelBufferForItemTime:itemTimeForDisplay:. For that, you need to create your AVPlayer object and add an observer to the AVPlayerItem's status property:
self.player = [AVPlayer playerWithPlayerItem:playerItem];
[self.player.currentItem addObserver:self forKeyPath:@"status" options:0 context:NULL];
and then in your callback function
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (self.player.currentItem.status == AVPlayerItemStatusReadyToPlay)
    {
        CMTime currentTime = self.player.currentItem.currentTime;
        CVPixelBufferRef buffer = [self.videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
        UIImage *image = [UIImage imageWithCIImage:ciImage];
        // use `image`
        CVBufferRelease(buffer); // the "copy" in copyPixelBufferForItemTime: means the caller owns the buffer
    }
}
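From there, the frameFor: method the question asks for could be sketched roughly like this (assuming self.videoOutput is the output added above; it renders through a CIContext because a UIImage built with imageWithCIImage: has no bitmap backing):

- (UIImage *)frameFor:(CMTime)time
{
    // Only try to copy a buffer if the output actually has one for this time.
    if (![self.videoOutput hasNewPixelBufferForItemTime:time]) {
        return nil;
    }
    CVPixelBufferRef buffer = [self.videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
    if (!buffer) {
        return nil;
    }
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CVBufferRelease(buffer); // we own the buffer returned by the copy... method
    return image;
}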
If you only need to capture a thumbnail and you don't have to play the video first, you can use a simple utility that I've created that follows this logic.
(https://github.com/acotilla91/ACThumbnailGenerator)
How to use:
double bitRate = 1000000; // force a specific video bit rate (can be used to cap video quality and improve performance); pass 0 to use the default bit rate
self.thumbnailGenerator = [[ACThumbnailGenerator alloc] initWithPreferredBitRate:bitRate];
NSURL *videoURL = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
int position = 10; // video position (in seconds) from which the thumbnail should be extracted; always pass 0 for live streams
[self.thumbnailGenerator loadImageFrom:videoURL position:position withCompletionBlock:^(UIImage *image) {
// use `image`
}];
Hope it helps.
Swift version:
ACThumbnailGenerator-Swift
Usage:
var generator: ACThumbnailGenerator!
func captureSomeImage() {
    let streamUrl = URL(string: "https://p-events-delivery.akamaized.net/18oijbasfvuhbfsdvoijhbsdfvljkb6/m3u8/hls_vod_mvp.m3u8")!
    generator = ACThumbnailGenerator(streamUrl: streamUrl)
    generator.delegate = self
    generator.captureImage(at: 300)
}

func generator(_ generator: ACThumbnailGenerator, didCapture image: UIImage, at position: Double) {
    // Use `image`
}
I'm trying to implement a fade-in effect based on AVPlayer + AVAudioMix + AVAudioMixInputParameters. It basically works, except that the first time the audio is played after starting my app there is a click at the beginning. Subsequent plays work perfectly, but the first-time glitch is quite stable and reproducible.
My Play button is enabled only after the AVPlayerItem's status is set to ready, so it's impossible to fire a play method while the player is not ready. In fact it doesn't matter how long I wait after loading the audio file and constructing all the objects.
This happens on OS X, I haven't tested it on iOS (yet).
Note that for this test you need an audio file that starts with sound and not silence. Here is my stripped down code without the GUI part (testFadeIn is the entry point):
static AVPlayer* player;
static void* PlayerItemStatusObserverContext = &PlayerItemStatusObserverContext;
- (void)testFadeIn
{
    AVURLAsset* asset = [AVURLAsset.alloc initWithURL:[NSURL fileURLWithPath:@"Helicopter.m4a"] options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
    AVPlayerItem* item = [AVPlayerItem playerItemWithAsset:asset];
    player = [AVPlayer playerWithPlayerItem:item];
    [item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:PlayerItemStatusObserverContext];
}
- (void)observeValueForKeyPath:(NSString*)keyPath ofObject:(id)object change:(NSDictionary*)change context:(void*)context
{
    if (context == PlayerItemStatusObserverContext)
    {
        AVPlayerStatus status = (AVPlayerStatus)[[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        if (status == AVPlayerStatusReadyToPlay)
        {
            [self applyFadeIn];
            [self performSelector:@selector(play:) withObject:nil afterDelay:1.0];
        }
    }
}
- (void)applyFadeIn
{
    assert(player.currentItem.tracks.firstObject);
    AVMutableAudioMixInputParameters* fadeIn = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:player.currentItem.tracks.firstObject];
    [fadeIn setVolume:0 atTime:kCMTimeZero];
    [fadeIn setVolume:1 atTime:CMTimeMake(2, 1)];

    NSMutableArray* paramsArray = [NSMutableArray new];
    [paramsArray addObject:fadeIn];
    AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = paramsArray;
    player.currentItem.audioMix = audioMix;
}

- (void)play:(id)unused
{
    [player play];
}
Click! What is wrong with this?
Edit:
An obvious workaround that I use at the moment: when the player reports it's ready, I do a short 100 ms playback with volume = 0, then restore currentTime and volume, and only then report to the main app that the player is ready. This way there are no clicks. Interestingly, anything less than 100 ms still gives the click.
This seems like an issue with something being cached by AVFoundation after the first playback. It's neither the tracks, since they are available when I set the fade-in params, nor the seek status.
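For reference, that warm-up could be expressed roughly like this (a sketch; the 100 ms figure is purely empirical, and `player` is the static AVPlayer from the code above):

- (void)warmUpThenReportReady
{
    // Run the audio pipeline briefly at zero volume so the first real
    // playback doesn't click.
    player.volume = 0;
    [player play];
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [player pause];
        [player seekToTime:kCMTimeZero];
        player.volume = 1;
        // ...now tell the main app that the player is ready.
    });
}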
I have an AVPlayer class all set up that streams an audio file. It's a bit long, so I can't post the whole thing here. What I am stuck on is how to allow the user to replay the audio file after they have finished listening to it once. When it finishes the first time, I correctly receive a notification AVPlayerItemDidPlayToEndTimeNotification. When I go to replay it, I immediately receive the same notification, which blocks me from replaying it.
How can I reset this such that the AVPlayerItem doesn't think that it has already played the audio file? I could deallocate everything and set it up again, but I believe that would force the user to download the audio file again, which is pointless and slow.
Here are some parts of the class that I think are relevant. The output that I get when attempting to replay the file looks like this. The first two lines are exactly what I would expect, but the third is a surprise.
is playing
no timer
audio player has finished playing audio
- (id)initWithURL:(NSString *)urlString
{
    self = [super init];
    if (self) {
        self.isPlaying = NO;
        self.verbose = YES;
        if (self.verbose) NSLog(@"url: %@", urlString);

        NSURL *url = [NSURL URLWithString:urlString];
        self.playerItem = [AVPlayerItem playerItemWithURL:url];
        self.player = [[AVPlayer alloc] initWithPlayerItem:self.playerItem];

        [self determineAudioPlayTime:self.playerItem];
        self.lengthOfAudioInSeconds = @0.0f;

        [self.player addObserver:self forKeyPath:@"status" options:0 context:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(itemDidFinishPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.playerItem];
    }
    return self;
}
// this is what gets called when the user clicks the play button after they have
// listened to the file and the AVPlayerItemDidPlayToEndTimeNotification has been received
- (void)playAgain {
    [self.playerItem seekToTime:kCMTimeZero];
    [self toggleState];
}

- (void)toggleState {
    self.isPlaying = !self.isPlaying;
    if (self.isPlaying) {
        if (self.verbose) NSLog(@"is playing");
        [self.player play];
        if (!timer) {
            NSLog(@"no timer");
            CMTime audioTimer = CMTimeMake(0, 1);
            [self.player seekToTime:audioTimer];
            timer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                     target:self
                                                   selector:@selector(updateProgress)
                                                   userInfo:nil
                                                    repeats:YES];
        }
    } else {
        if (self.verbose) NSLog(@"paused");
        [self.player pause];
    }
}
- (void)itemDidFinishPlaying:(NSNotification *)notification {
    if (self.verbose) NSLog(@"audio player has finished playing audio");
    [[NSNotificationCenter defaultCenter] postNotificationName:@"audioFinished" object:self];
    [timer invalidate];
    timer = nil;
    self.totalSecondsPlayed = [NSNumber numberWithInt:0];
    self.isPlaying = NO;
}
You can call the seek method when your player receives AVPlayerItemDidPlayToEndTimeNotification:
func itemDidFinishPlaying() {
    self.player.seek(to: CMTime.zero)
    self.player.play()
}
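In Objective-C, wired into the asker's class, the same idea might look roughly like this (a sketch; it uses seekToTime:completionHandler: so state is reset only once the rewind has actually landed):

- (void)itemDidFinishPlaying:(NSNotification *)notification {
    if (self.verbose) NSLog(@"audio player has finished playing audio");
    // Rewind in place; the already-buffered media is kept, so nothing is re-downloaded.
    [self.playerItem seekToTime:kCMTimeZero completionHandler:^(BOOL finished) {
        self.isPlaying = NO; // the next playAgain/toggleState starts cleanly from zero
    }];
}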
Apple recommends using AVQueuePlayer with an AVPlayerLooper.
Here's Apple's (slightly revised) sample code:
AVQueuePlayer *queuePlayer = [[AVQueuePlayer alloc] init];
AVAsset *asset = // AVAsset with its 'duration' property value loaded
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
// Create a new player looper with the queue player and template item
self.playerLooper = [AVPlayerLooper playerLooperWithPlayer:queuePlayer
templateItem:playerItem];
// Begin looping playback
[queuePlayer play];
The AVPlayerLooper does all the event listening and playing for you, and the queue player is used to create what they call a "treadmill pattern". This pattern essentially chains multiple instances of the same AVPlayerItem in a queue player, moving each finished item back to the beginning of the queue.
The advantage of this approach is that it lets the framework preroll the next asset (the same asset in this case, but its start still needs prerolling) before it arrives, reducing the latency between the asset's end and its looped start.
This is described in greater detail at ~15:00 in the video here: https://developer.apple.com/videos/play/wwdc2016/503/
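Apple's comment glosses over loading the duration. With AVAsset's key-value loading API, that step might look roughly like this (a sketch; `videoURL` stands for whatever URL you loop, and the queue player is assumed to be kept in a property so it outlives this scope):

AVAsset *asset = [AVAsset assetWithURL:videoURL];
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"duration" error:&error] != AVKeyValueStatusLoaded) {
        return; // duration failed to load; inspect `error`
    }
    // Build the item and looper back on the main queue once 'duration' is ready.
    dispatch_async(dispatch_get_main_queue(), ^{
        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
        self.playerLooper = [AVPlayerLooper playerLooperWithPlayer:self.queuePlayer
                                                      templateItem:playerItem];
        [self.queuePlayer play];
    });
}];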
A lot of time wasted on this - I have fairly straightforward playback using AVAudioPlayer, but when the file finishes playing I get this message in the debug window:
objc[39752]: Object 0x7304d80 of class _NSThreadPerformInfo
autoreleased with no pool in place - just leaking - break on
objc_autoreleaseNoPool() to debug
Here are my play and callback functions (I have simplified the filename creation but the error is the same):
@implementation PlaybackEngine {
    int _pbIndex;
    AVAudioPlayer *_player;
    NSArray *fileList;
    BufferStores *_BS;
}
....
- (BOOL)startPlayback {
    NSURL *playURL = [[NSURL alloc] initWithString:[[DOCUMENTS_FOLDER stringByAppendingPathComponent:@"28Jan13_17:13:21.aif"] stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]];
    printf("Playback file URL = %s\n", [[playURL path] UTF8String]);

    _player = [[AVAudioPlayer alloc] initWithContentsOfURL:playURL error:nil];
    _player.delegate = self;
    [_player prepareToPlay];
    [_player play];
    self.playing = YES;
    return YES;
}
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    // check for player stopping due to bad audio data
    if (!flag) {
        printf("Player finished playing due to bad audio data");
    }
    self.playing = NO;
    _player = nil;
}
I've tried setting a breakpoint on objc_autoreleaseNoPool but it never gets hit. What blindingly obvious ARC mistake am I making?
Well, I seem to have solved it. Apple Dev Support told me that this error should never happen, which made me think it must be some obscure bug in the 5.x SDK, but in the end I tidied up all my classes, redefining ivars as either properties or properties in a class extension as appropriate, made sure all my property modifiers were correct, and so on, and the problem has gone.
Sorry I can't explicitly pinpoint the culprit, but it was clearly a memory management issue.
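For what it's worth, "redefining ivars as properties in an extension" for the PlaybackEngine above might look like this (purely an illustration; BufferStores is the asker's own type):

// PlaybackEngine.m
@interface PlaybackEngine ()
@property (nonatomic) int pbIndex;
@property (nonatomic, strong) AVAudioPlayer *player;
@property (nonatomic, copy) NSArray *fileList;
@property (nonatomic, strong) BufferStores *BS;
@end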
I want to use AVURLAsset to play a video file that is on a server, not a local file. I have read that AVURLAsset can't be used directly for remote files.
I read another Stack Overflow question,
AVURLAsset cannot load with remote file
which has a method for using AVURLAsset with remote files, but I am not able to understand it fully. My observer is not being called. Can someone please help me? I don't want to use AVPlayer to play the video, for a reason: I am grabbing frames from the AVAsset and then rendering them as textures in OpenGL, so I need to do this with AVURLAsset only.
Here is the code to look at:
- (void)startPlayer
{
    NSURL *url = [NSURL fileURLWithPath:@"http://gamooz.com/wildlife.mp4"];
    pItem = [AVPlayerItem playerItemWithURL:url];
    player = [AVPlayer playerWithPlayerItem:pItem];
    [player play];
    [pItem addObserver:self forKeyPath:@"status" options:0 context:nil];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    NSLog(@"heyy");
    if ([keyPath isEqualToString:@"status"])
    {
        AVPlayerItem *pItemTemp = (AVPlayerItem *)object;
        if (pItemTemp.status == AVPlayerItemStatusReadyToPlay)
        {
            // now I can use the player item's asset
            asset = (AVURLAsset *)pItemTemp.asset;
        }
    }
}
But the observer is never getting called. Why is that?
I also put the observer code in another function and tried to check whether the playerItem is ready to play:
- (void)checkForPlayer
{
    if (pItem.status == AVPlayerItemStatusReadyToPlay)
    {
        asset = (AVURLAsset *)pItem.asset;
    }
}
It never reports a status equal to ready to play.