After setting mPlayer.usesExternalPlaybackWhileExternalScreenIsActive to YES in AVPlayerDemoPlaybackViewController of Apple's AVPlayerDemo sample project, how do you throttle the scrubbing so it doesn't lag behind on the AppleTV?
What I mean is that when you move the slider back and forth really fast, the AppleTV performs each and every seekToTime operation, but takes longer to do it than the user takes to slide.
One of the problems with the demo is that it uses both the "Touch Drag Inside" and "Value Changed" events, which causes it to send the same value twice. If you remove "Value Changed" it improves a bit, but it still lags.
I've tried rounding to whole seconds and only sending seekToTime when the second changes, but that doesn't seem to help much. What I really need is to send fewer commands the faster the user moves the slider, and more when the user moves it slower.
Any ideas on how to accomplish this?
The UISlider already somewhat throttles itself. The faster you move it the fewer values you get from point A to point B. This isn't enough to stop the seek operations from stacking up over AirPlay.
You can, however, use the seekToTime:completionHandler: to prevent the stack up like this:
if (seeking) {
    return;
}
seeking = YES;
[player seekToTime:CMTimeMakeWithSeconds(time, NSEC_PER_SEC) completionHandler:^(BOOL finished) {
    seeking = NO;
}];
This drops any new seeks until the one in progress finishes. This seems to work well. You just need to make sure to send one last seek operation after the user stops scrubbing.
While an NSTimer can do the same thing, it's less accurate, and results will vary depending on the latency of the connection. The completionHandler used in this manner ensures that seeks do not stack up, regardless of latency times.
I also found that the "Value Changed" action of the UISlider can happen before any touch start actions. So it's better to use the touch drag inside/outside actions instead, which are guaranteed to happen after a touch start.
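For example, a sketch of that wiring (the action names and the pendingSeekTime variable are assumptions for illustration, not part of the sample project), with scrub: connected to the drag actions and scrubEnded: connected to the touch-up actions:
// Hypothetical slider actions: scrub: is wired to Touch Drag Inside/Outside,
// scrubEnded: to Touch Up Inside/Outside; pendingSeekTime is an assumed ivar.
- (IBAction)scrub:(UISlider *)slider {
    pendingSeekTime = slider.value;   // assuming the slider's range is 0..duration in seconds
    if (seeking) {
        return;                       // a seek is already in flight; drop this one
    }
    seeking = YES;
    [player seekToTime:CMTimeMakeWithSeconds(pendingSeekTime, NSEC_PER_SEC) completionHandler:^(BOOL finished) {
        seeking = NO;
    }];
}
- (IBAction)scrubEnded:(UISlider *)slider {
    // One final, unconditional seek so playback lands exactly where the user let go.
    [player seekToTime:CMTimeMakeWithSeconds(pendingSeekTime, NSEC_PER_SEC) completionHandler:^(BOOL finished) {
        seeking = NO;
    }];
}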
An improved version of Luke's answer, with some additional code:
static NSTimeInterval ToleranceForAsset(AVAsset *asset) {
    NSTimeInterval tolerance = 0.0;
    for (AVAssetTrack *track in asset.tracks) {
        NSTimeInterval trackTolerance = CMTimeGetSeconds(track.minFrameDuration);
        tolerance = MAX(tolerance, trackTolerance);
    }
    return tolerance;
}
@interface MyPlayerWrapper ()
@property (strong, nonatomic) AVPlayer *player;
@property (assign, nonatomic) NSTimeInterval playerTime;
@property (assign, nonatomic, getter=isSeeking) BOOL seeking;
@property (assign, nonatomic) NSTimeInterval latestSetTime;
@end
@implementation MyPlayerWrapper
- (NSTimeInterval)playerTime {
    return CMTimeGetSeconds(self.player.currentItem.currentTime);
}
- (void)setPlayerTime:(NSTimeInterval)playerTime {
    NSTimeInterval tolerance = ToleranceForAsset(self.player.currentItem.asset);
    if (tolerance > 0.0) {
        // round down to the nearest seek tolerance (for example 1/30 sec)
        playerTime = floor(playerTime / tolerance) * tolerance;
    }
    self.latestSetTime = playerTime;
    if (self.isSeeking) {
        return;
    }
    self.seeking = YES;
    [self.player seekToTime:CMTimeMakeWithSeconds(playerTime, self.player.currentItem.duration.timescale)
            toleranceBefore:kCMTimeZero
             toleranceAfter:kCMTimeZero
          completionHandler:^(BOOL finished) {
        self.seeking = NO;
        // If the user kept scrubbing while this seek was in flight, issue one more
        // seek so playback ends up at the most recently requested time.
        NSTimeInterval currentTime = CMTimeGetSeconds(self.player.currentItem.currentTime);
        if (ABS(currentTime - self.latestSetTime) > MAX(tolerance, DBL_EPSILON)) {
            self.playerTime = self.latestSetTime;
        }
    }];
}
@end
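As a usage sketch (assuming a MyPlayerWrapper instance exposed as self.wrapper and a slider whose maximumValue has been set to the item's duration in seconds; neither is part of the answer above):
// Hypothetical slider action: the slider's value maps directly to seconds,
// and setPlayerTime: takes care of rounding and throttling the seeks.
- (IBAction)sliderValueChanged:(UISlider *)slider {
    self.wrapper.playerTime = slider.value;
}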
Question:
How can I make sure that code executed due to a run loop event (timer, user interaction, performSelector, etc.) has the same concept of "now"?
Background:
Say an event handler takes 100 ms to execute; that means [NSDate date] will return a slightly different "now" depending on when during the execution you make the call. If you are very unlucky with the timing you might even end up with different dates between the calls.
This creates problems for things that rely on the current time for doing various calculations since those calculations can differ during the execution.
Of course, for a specific event handler you could just store the date in the AppDelegate or similar or pass it on in each call starting from the entry point.
However, I want something safer and automatic. Ideally I want to know at what time the current run loop started processing the event. Something I can simply replace [NSDate date] with and always get the same result until the next event is fired.
I looked into the documentation of NSRunLoop without much luck. I also looked into CADisplayLink for potential workarounds. Neither provided a clear cut answer.
It feels like this should be a common thing to need, not something that needs "workarounds". My guess is that I am looking in the wrong places or using the wrong search terms.
Code Example:
UIView *_foo, *_fie;
NSDate *_hideDate;
- (void)handleTimer
{
    [self checkVisible:_foo];
    [self checkVisible:_fie];
}
- (void)checkVisible:(UIView *)view
{
    view.hidden = [_hideDate timeIntervalSinceNow] < 0;
}
In this case we could end up with _fie being hidden when _foo is still visible since "now" has changed by a very small amount between calls.
This is a very simplified example in which a fix is trivial by simply calling [NSDate date] and sending that instance to all callers. It is the general case that I am interested in though where call chains might be very deep, cyclic, re-entrant, etc.
NSRunLoop is a wrapper for CFRunLoop. CFRunLoop has features that NSRunLoop doesn't expose, so sometimes you have to drop down to the CF level.
One such feature is observers, which are callbacks you can register to be called when the run loop enters different phases. The phase you want in this case is an after-waiting observer, which is called after the run loop receives an event (from a source, or due to a timer firing, or due to a block being added to the main queue).
Let's add a wakeDate property to NSRunLoop:
// NSRunLoop+wakeDate.h
#import <Foundation/Foundation.h>
@interface NSRunLoop (wakeDate)
@property (nonatomic, strong, readonly) NSDate *wakeDate;
@end
With this category, we can ask an NSRunLoop for its wakeDate property any time we want, for example like this:
#import "AppDelegate.h"
#import "NSRunLoop+wakeDate.h"
@implementation AppDelegate
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    NSTimer *timer = [NSTimer timerWithTimeInterval:0.5 repeats:YES block:^(NSTimer *timer){
        NSLog(@"timer: %.6f", NSRunLoop.currentRunLoop.wakeDate.timeIntervalSinceReferenceDate);
    }];
    [NSRunLoop.currentRunLoop addTimer:timer forMode:NSRunLoopCommonModes];
    return YES;
}
@end
To implement this property, we'll create a WakeDateRecord class that we can attach to the run loop as an associated object:
// NSRunLoop+wakeDate.m
#import "NSRunLoop+wakeDate.h"
#import <objc/runtime.h>
@interface WakeDateRecord : NSObject
@property (nonatomic, strong) NSDate *date;
- (instancetype)initWithRunLoop:(NSRunLoop *)runLoop;
@end
static const void *wakeDateRecordKey = &wakeDateRecordKey;
@implementation NSRunLoop (wakeDate)
- (NSDate *)wakeDate {
    WakeDateRecord *record = objc_getAssociatedObject(self, wakeDateRecordKey);
    if (record == nil) {
        record = [[WakeDateRecord alloc] initWithRunLoop:self];
        objc_setAssociatedObject(self, wakeDateRecordKey, record, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
    }
    return record.date;
}
@end
The run loop can run in different modes, and although there are a small number of common modes, new modes can in theory be created on the fly. If you want an observer to be called in a particular mode, you have to register it for that mode. So, to ensure that the reported date is always correct, we'll remember not just the date but also the mode in which we recorded the date:
@implementation WakeDateRecord {
    NSRunLoop *_runLoop;
    NSRunLoopMode _dateMode;
    NSDate *_date;
    CFRunLoopObserverRef _observer;
}
To initialize, we just store the run loop and create the observer:
- (instancetype)initWithRunLoop:(NSRunLoop *)runLoop {
    if (self = [super init]) {
        _runLoop = runLoop;
        _observer = CFRunLoopObserverCreateWithHandler(nil, kCFRunLoopEntry | kCFRunLoopAfterWaiting, true, -2000000, ^(CFRunLoopObserverRef observer, CFRunLoopActivity activity) {
            [self setDate];
        });
    }
    return self;
}
When asked for the date, we first check whether the current mode is different from the mode in which we last recorded the date. If it is, then the date wasn't updated when the run loop awoke in the current mode. That means the observer wasn't registered for the current mode, so we register it now and update the date immediately:
- (NSDate *)date {
    NSRunLoopMode mode = _runLoop.currentMode;
    if (![_dateMode isEqualToString:mode]) {
        // My observer didn't run when the run loop awoke in this mode, so it must not be registered in this mode yet.
        NSLog(@"debug: WakeDateRecord registering in mode %@", mode);
        CFRunLoopAddObserver(_runLoop.getCFRunLoop, _observer, (__bridge CFRunLoopMode)mode);
        [self setDate];
    }
    return _date;
}
When we update the date, we also need to update the stored mode:
- (void)setDate {
    _date = [NSDate date];
    _dateMode = _runLoop.currentMode;
}
@end
An important warning about this solution: the observer fires once per pass through the run loop. The run loop can service multiple timers and multiple blocks added to the main queue during a single pass. All of the serviced timers or blocks will see the same wakeDate.
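For example, the checkVisible: method from the question could read the run loop's wake date instead of calling timeIntervalSinceNow, so that _foo and _fie are evaluated against the same instant:
- (void)checkVisible:(UIView *)view
{
    // Every call during one pass of the run loop sees the same "now".
    NSDate *now = NSRunLoop.currentRunLoop.wakeDate;
    view.hidden = [_hideDate timeIntervalSinceDate:now] < 0;
}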
I am creating a simple drum machine. This function controls the time between each sample that is played (and thus the tempo of the drum machine). I need to control the tempo with a slider, so I'm hoping to drive the 'time duration until next step' value from it. However, when I tried to do this, I got an error telling me "time is part of NSDate".
- (void)run
{
    @autoreleasepool
    {
        // get current time
        NSDate *time = [NSDate date];
        // keep going around the while loop while the sequencer is running
        while (self.running)
        {
            // sleep until the next step is due
            [NSThread sleepUntilDate:time];
            // update step
            int step = self.step + 1;
            // wrap around if we reached NUMSTEPS
            if (step >= NUMSTEPS)
                step = 0;
            // store
            self.step = step;
            // time duration until next step
            time = [time dateByAddingTimeInterval:0.5];
        }
        // exit thread
        [NSThread exit];
    }
}
This tells me NSTimeInterval is an incompatible type:
// time duration until next step
time = [time dateByAddingTimeInterval: self.tempoControls];
Here is where the slider is declared
.m
- (IBAction)sliderMoved:(UISlider *)sender
{
    AppDelegate *app = [[UIApplication sharedApplication] delegate];
    if (sender == self.tempoSlider)
    {
        PAEControl *tempoControl = app.tempoControls[app.editIndex];
        tempoControl.value = self.tempoSlider.value;
    }
}
.h
@interface DetailController : UIViewController
@property (weak, nonatomic) IBOutlet UISlider *tempoSlider;
- (IBAction)sliderMoved:(UISlider *)sender;
Any help would be much appreciated, thanks in advance.
It looks like self.tempoControls is an array of PAEControl objects. The method named dateByAddingTimeInterval: needs an argument of type NSTimeInterval (aka double). It looks like you're trying to pass in this array instead.
Try changing this line -
time = [time dateByAddingTimeInterval: self.tempoControls];
To maybe this -
PAEControl* tempoControl = self.tempoControls[self.editIndex];
time = [time dateByAddingTimeInterval: (NSTimeInterval)tempoControl.value];
On another note, if this is all running on the main thread, be aware that you are blocking it and the UI will become very unresponsive.
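If the loop is currently started on the main thread, a minimal sketch of moving it off (assuming -run is the method shown above) would be:
// Sketch: detach the sequencer loop onto its own thread so the main thread,
// and with it the tempo slider, stays responsive.
[NSThread detachNewThreadSelector:@selector(run) toTarget:self withObject:nil];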
I am trying to make a simple progress bar in SpriteKit. To simplify the example I will use an SKLabelNode and its text property, which will indicate the progress.
Here is the code (GameScene.m):
#import "GameScene.h"
typedef void (^CompletionHandler)(void);
@interface GameScene ()
@property (nonatomic, strong) SKLabelNode *progressBar;
@end
@implementation GameScene
- (void)didMoveToView:(SKView *)view {
    /* Setup your scene here */
    self.progressBar = [SKLabelNode labelNodeWithFontNamed:@"Chalkduster"];
    self.progressBar.fontColor = [SKColor redColor];
    self.progressBar.fontSize = 24.0;
    self.progressBar.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
    [self addChild:self.progressBar];
    [self loadSceneAssetsWithCompletionHandler:^{
        [self setupScene:view];
    }];
}
- (void)loadSceneAssetsWithCompletionHandler:(CompletionHandler)handler {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        // Load the shared assets in the background.
        // This part simulates a bunch of assets (textures, emitters, etc.) loading in the background.
        for (int i = 0; i <= 100; i++) {
            [NSThread sleepForTimeInterval:0.01];
            float progressValue = (float)i;
            dispatch_async(dispatch_get_main_queue(), ^{
                self.progressBar.text = [NSString stringWithFormat:@"%1.f", progressValue];
            });
        }
        if (!handler) {
            return;
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            // Call the completion handler back on the main queue.
            handler();
        });
    });
}
Now I am trying to find an obvious way, if there is any, to update the progress based on percentage of data loaded from background queue.
The problem is that I want to show the percentage of loaded data from 0 to 100, but I don't know how to tell what 1% of a bunch of textures, emitters, and nodes would be; in other words, I don't know how to update the progress bar after each 1% is loaded, because I am working with different kinds of objects. Is there any way to check the state of a certain background queue to see how much work is left to be executed (loaded)?
Does anybody have any ideas or suggestions?
If you are looking to get an update on, for example, how much is left to load for a texture atlas then the answer is you can't.
You can, however, keep a "load items" counter and update it each time an asset finishes loading. For example, say you have 13 texture atlases to load and some sound files; both have completion handlers in their loading methods:
- (void)preloadWithCompletionHandler:(void (^)(void))completionHandler
+ (SKAction *)playSoundFileNamed:(NSString *)soundFile waitForCompletion:(BOOL)wait
Every time an asset finishes loading, update your counter. To be honest though, I am not sure this is really necessary, since loading usually happens very quickly. Displaying a generic "loading" message for a few seconds is probably your best (and easiest) option.
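If you do want to show real progress, a rough sketch of the counter idea (the atlas names and the two-atlas total here are made up for illustration; sounds and other assets could be counted the same way):
// Hypothetical example: bump a counter as each atlas finishes preloading and
// show the percentage in the progress label on the main queue.
NSArray *atlases = @[[SKTextureAtlas atlasNamed:@"Sprites1"],
                     [SKTextureAtlas atlasNamed:@"Sprites2"]];
NSUInteger total = atlases.count;
__block NSUInteger loaded = 0;
for (SKTextureAtlas *atlas in atlases) {
    [atlas preloadWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            loaded++;
            self.progressBar.text = [NSString stringWithFormat:@"%.0f", 100.0 * loaded / total];
        });
    }];
}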
In my application I have an object CameraHandler that uses GPUImage to detect certain movement from camera. It is initialized in my GameViewController.
The CameraHandler successfully detects movements and fires the relevant methods; however, it locks up the GameViewController's view for a significant amount of time (~5 to 10 seconds) before any of the changes are displayed on screen. Once the CameraHandler detects a change, it fires a method that changes the background of the top view on the view controller and displays a UIAlertView (for testing purposes). As I said, this only happens 5-10 seconds after the method is called. I know the program itself is not frozen because I get the relevant log output from the methods. I've tried different techniques to fix this but have come up empty-handed for several weeks now.
In GameViewController (where I call and initiate the CameraHandler):
- (void)startRound {
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0ul);
    dispatch_async(queue, ^{
        [_shotDetector captureStillImage];
        dispatch_sync(dispatch_get_main_queue(), ^{
            NSLog(@"finish capture still image thread");
        });
    });
}
/* this method gets called from CameraHandler once it detects movement */
- (void)shotLifted:(NSNumber *)afterTime {
    NSLog(@"shot lifted fired");
    UIAlertView *lost = [[UIAlertView alloc] initWithTitle:@"Good Job!" message:@"Shot lifted in time" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [lost show];
    [_questionView setBackgroundColor:[UIColor whiteColor]];
    NSLog(@"shot lifted done");
}
CameraHandler.h
@interface CameraHandler : NSObject <GPUImageVideoCameraDelegate>
@property (strong) GPUImageOutput<GPUImageInput> *filter, *emptyFilter;
@property (strong) GPUImageVideoCamera *videoCamera;
@property (strong) GPUImagePicture *sourcePicture;
@property (strong) GPUImageOutput *pictureOutput;
@property (strong) GPUImageStillCamera *stillCamera;
@property (strong) __block UILabel *shotRemovedLabel;
@property (strong) __block NSDate *startTime;
@property (strong) NSMutableArray *averageLum;
@property (strong) id delegate;
@property (strong) GPUImageLuminosity *lumin;
CameraHandler.m - relevant method
- (void)startBlackoutShotMotionAnalysis {
    NSLog(@"starting shot motion analysis");
    [_videoCamera addTarget:self.filter];
    [_sourcePicture processImage];
    [_sourcePicture addTarget:self.filter];
    [_videoCamera startCameraCapture];
    _lumin = [[GPUImageLuminosity alloc] init];
    [self.filter addTarget:_lumin];
    __block int i = 0;
    __unsafe_unretained GameViewController *weakDelegate = self.delegate;
    // begin luminosity detecting of live-video from uiimage
    [(GPUImageLuminosity *)_lumin setLuminosityProcessingFinishedBlock:^(CGFloat luminosity, CMTime frameTime) {
        if (i < 60) {
            if (i > 10) {
                _startTime = [NSDate date];
                [_averageLum addObject:[NSNumber numberWithFloat:luminosity]];
            }
            i++;
        }
        else {
            CGFloat average = [[_averageLum valueForKeyPath:@"@avg.floatValue"] floatValue];
            CGFloat difference = fabsf(luminosity - average);
            if (difference > 0.05) {
                NSTimeInterval liftedAfter = [_startTime timeIntervalSinceDate:[NSDate date]];
                [weakDelegate performSelector:@selector(shotLifted:) withObject:[NSNumber numberWithFloat:liftedAfter]];
                [_videoCamera stopCameraCapture];
                NSLog(@"should turn white now");
                return;
            }
        }
    }];
    NSLog(@"finished returning executing starBlackoutMotionAnalysis Method");
}
NSLOG OUTPUT:
2014-04-08 20:22:45.450 Groupy[2887:5c0f] starting shot motion analysis
2014-04-08 20:22:46.152 Groupy[2887:5c0f] finished returning executing starBlackoutMotionAnalysis Method
2014-04-08 20:22:48.160 Groupy[2887:1303] shot lifted fired
2014-04-08 20:22:48.221 Groupy[2887:1303] shot lifted done
2014-04-08 20:22:48.290 Groupy[2887:1303] should turn white now
Any help in the right direction would be huge. I've been struggling to figure this out. Thanks!
The first thing I usually look for when I see unusually delayed updates in the UI is whether or not my UI updating code is being executed on the main queue. Apart from a few exceptions, you should always dispatch any UI-related code to the main queue, or you'll get weird behaviour like this.
From what I can see, you perform the shotLifted: selector directly from within the luminosityProcessingFinishedBlock. We can safely assume that GPUImage will be calling that block off the main thread. This means that your code to initialise and show the alert view is happening off the main thread too.
To change this, you should try wrapping your call to shotLifted: in a block and dispatch that to the main queue:
dispatch_async(dispatch_get_main_queue(), ^{
    [weakDelegate performSelector:@selector(shotLifted:) withObject:obj];
});
Or alternatively you can do:
[weakDelegate performSelectorOnMainThread:@selector(shotLifted:) withObject:obj waitUntilDone:NO];
I have a UITextfield and a UIButton. The user can enter, for example, search word such as "dog" or "cat" and it will trigger a method in another class that runs on a custom dispatch GCD queue to fetch the images (around 100 or so).
Everything works fine, except that if the user, in the midst of fetching, decides to enter another search word such as "cat" and presses the fetch button again, I would like to be able to stop the thread/method that is still fetching the images for the previous search term.
I have thought about NSThread (something I have never used before) or blocks (to get notified once the method has finished running), but the problem with blocks is that I only get notified once the method has finished doing its thing, whereas what I need is to tell it to stop fetching (because the user has decided on another search and entered another search term).
Can someone please point me to some samples showing how to stop a loop/method while it is running on a custom GCD queue, before it has finished? Thanks in advance.
I'm using NSOperation and NSOperationQueue to cluster markers on a map in the background and to cancel the operation if necessary.
The function to cluster the markers is implemented in a subclass of NSOperation:
ClusterMarker.h:
@class ClusterMarker;
@protocol ClusterMarkerDelegate <NSObject>
- (void)clusterMarkerDidFinish:(ClusterMarker *)clusterMarker;
@end
@interface ClusterMarker : NSOperation
- (id)initWithMarkers:(NSSet *)markerSet delegate:(id<ClusterMarkerDelegate>)delegate;
// the "return value"
@property (nonatomic, strong) NSSet *markerSet;
// use the delegate pattern to inform someone that the operation has finished
@property (nonatomic, weak) id<ClusterMarkerDelegate> delegate;
@end
and ClusterMarker.m:
@implementation ClusterMarker
- (id)initWithMarkers:(NSSet *)markerSet delegate:(id<ClusterMarkerDelegate>)delegate
{
    if (self = [super init]) {
        self.markerSet = markerSet;
        self.delegate = delegate;
    }
    return self;
}
- (void)main {
    @autoreleasepool {
        if (self.isCancelled) {
            return;
        }
        // perform some Überalgorithmus that fills self.markerSet (the "return value")
        // inform the delegate that you have finished
        [(NSObject *)self.delegate performSelectorOnMainThread:@selector(clusterMarkerDidFinish:) withObject:self waitUntilDone:NO];
    }
}
@end
You could use your controller to manage the queue,
self.operationQueue = [[NSOperationQueue alloc] init];
self.operationQueue.name = @"Überalgorithmus.TheKillerApp.makemyday.com";
// make sure to have only one algorithm running
self.operationQueue.maxConcurrentOperationCount = 1;
to enqueue operations, kill previous operations and the like,
ClusterMarker *clusterMarkerOperation = [[ClusterMarker alloc] initWithMarkers:self.xmlMarkerSet delegate:self];
// this sets isCancelled in ClusterMarker to true. you might want to check that variable frequently in the algorithm
[self.operationQueue cancelAllOperations];
[self.operationQueue addOperation:clusterMarkerOperation];
and to respond to the callbacks when the operation has finished:
- (void)clusterMarkerDidFinish:(ClusterMarker *)clusterMarker
{
    self.clusterMarkerSet = clusterMarker.markerSet;
    GMSProjection *projection = [self.mapView projection];
    for (MapMarker *m in self.clusterMarkerSet) {
        m.coordinate = [projection coordinateForPoint:m.point];
    }
    // DebugLog(@"now clear map and refreshData: self.clusterMarkerSet.count=%d", self.clusterMarkerSet.count);
    [self.mapView clear];
    [self refreshDataInGMSMapView:self.mapView];
}
If I remember correctly I used this tutorial on raywenderlich.com as a starter.
I would recommend using NSOperation, as it has a cancel method. Note that cancellation is cooperative: the running operation must check isCancelled and bail out itself.
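Inside main, the long-running work therefore needs to check isCancelled periodically so that cancelAllOperations actually takes effect; a sketch of what that might look like (the loop body is just a placeholder for one step of the clustering algorithm):
- (void)main {
    @autoreleasepool {
        NSMutableSet *result = [NSMutableSet set];
        for (id marker in self.markerSet) {
            // Bail out as soon as the queue cancels this operation.
            if (self.isCancelled) {
                return;
            }
            // ... placeholder for one clustering step ...
            [result addObject:marker];
        }
        self.markerSet = result;
        // inform the delegate on the main thread, as in the answer above
        [(NSObject *)self.delegate performSelectorOnMainThread:@selector(clusterMarkerDidFinish:)
                                                    withObject:self
                                                 waitUntilDone:NO];
    }
}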