iOS - How To Find Which Method Is Taking Up So Much Time

In my app, a user can "speed-read" text by having words flashed on the screen at a speed that they set. I implemented this in my UIViewController with a repeating NSTimer that updates a UILabel with the word at the next index, but it's not running as fast as it should.
For example, I tested it with 100 words at 1000 words per minute. Instead of taking 6 seconds like it should, it takes 6.542045 seconds to finish flashing all of the words. This is a big problem, since I'm supposed to report back to the user how long it took them to read the text.
How do I find out which part of the code is taking so long? Is it the updating of the UILabel that's eating up the extra ~0.54 seconds?
EDIT
My sample project can be viewed here: https://github.com/cnowak7/RSVPTesting
The flashText method that I have should fire only 100 times. Well, 101 if we count the call where the method realizes there are no more words and invalidates the NSTimer. But in the console, at the end of a read, I can see that the method is fired 111 times. I think I might be doing this the wrong way.
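A minimal sketch of the kind of setup described (Swift; simplified, and everything apart from flashText is a placeholder name rather than the actual project code):

import UIKit

class SpeedReadViewController: UIViewController {
    // Placeholder properties; the actual project differs in detail.
    @IBOutlet weak var wordLabel: UILabel!
    var words: [String] = []
    var wordIndex = 0
    var timer: Timer?
    var startDate: Date?

    func startFlashing(wordsPerMinute: Double) {
        let interval = 60.0 / wordsPerMinute   // 0.06 s per word at 1000 wpm
        wordIndex = 0
        startDate = Date()
        timer = Timer.scheduledTimer(timeInterval: interval,
                                     target: self,
                                     selector: #selector(flashText),
                                     userInfo: nil,
                                     repeats: true)
    }

    @objc func flashText() {
        guard wordIndex < words.count else {
            // No more words: stop the timer and report the elapsed time.
            timer?.invalidate()
            timer = nil
            if let start = startDate {
                print("Finished in \(Date().timeIntervalSince(start)) s")
            }
            return
        }
        wordLabel.text = words[wordIndex]
        wordIndex += 1
    }
}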

Your specific question seems to be: How do I find out which part of the code is taking so long? Is it the updating of the UILabel that's eating up the extra ~0.54 seconds?
Inside Instruments, provided with Xcode, is a Time Profiler tool.
https://developer.apple.com/library/ios/documentation/DeveloperTools/Conceptual/InstrumentsUserGuide/Instrument-TimeProfiler.html
You can run your code and watch this tool to see exactly how much time is being spent executing every part of your routines. It will break down exactly which method is taking the most time, by percentage of overall time and concrete time spans, giving you a precise understanding of where you should focus your efforts in shaving off those precious partial seconds through refactoring/optimizations.
I'm an Objective-C guy, so rather than try to muddle my way through a Swift example, I'll let this tutorial do the talking:
https://www.raywenderlich.com/97886/instruments-tutorial-with-swift-getting-started
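As a quick sanity check alongside Instruments, you can also time a suspect statement manually. A minimal sketch (Swift, placeholder names; note this measures only the property setter, since the actual rendering happens later in the run loop, which is why Time Profiler gives the fuller picture):

import QuartzCore
import UIKit

// Crude manual timing around a statement you suspect is slow.
func timeLabelUpdate(_ label: UILabel, word: String) {
    let start = CACurrentMediaTime()
    label.text = word                              // the code under suspicion
    let elapsed = CACurrentMediaTime() - start
    print(String(format: "Label update took %.3f ms", elapsed * 1000.0))
}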

Whenever you want to know about time consumption in iOS, you should reach for Instruments and select the Time Profiler instrument.
Time Profiler will help you track down the code that is taking too much time.

Related

Gtkmm 3.0 draw blinking shapes and use of timeouts

In a Gtk::DrawingArea I have a pixbuf showing the layout of my house, on which I draw the measured room temperatures. I would also like to draw the state of my shutters on it with some lines. When, and only when, a shutter changes its state, I would love to make these lines blink at a 1-second interval. I assume I would have to use a timeout triggered every second to redraw the lines for the shutters. I am already using a timeout every 2 minutes to fetch new data from the internet to show on my screen. I could set up the timeout to fire every second, and then I would have to remember when my last 2-minute fetch was accomplished in order to trigger the next one on time. But since my shutters are not changing state 99.9 percent of their lifetime, I rarely need blinking at all, and it feels over-engineered to call a method every second just to make a line blink. Is there a smarter way to do this?
I could post a lot of code here, but I think that would not help anybody understand my question. I would be thankful for any hint.

NSTimer Milliseconds Accuracy [duplicate]

This question already has answers here:
Format realtime stopwatch timer to the hundredth using Swift (2 answers)
Closed 7 years ago.
I've got a 9-figure number that needs to be incremented by 500 each second, but I decided to increment the number every millisecond instead and update the label that displays it. I'm using an NSTimer, but as I've read everywhere, they're not accurate, nor meant to be. I've tried using CFAbsoluteTimeGetCurrent but couldn't get it to work; simply using NSTimer yields an inaccurate value. The incrementing doesn't stop between sessions: each time the user opens the app, it simply adds up the accumulated value and starts incrementing again.
Any ideas?
Update:
Even though most answers point in the right direction, I solved my issue a little differently. Thanks to all who answered, and to Martin.
I used a CADisplayLink instead of an NSTimer and got pretty accurate and consistent results. Notice I say pretty accurate because the results are not totally accurate, but since in my case I'm incrementing a 9-figure number, the deviations aren't noticeable, and my numbers are corrected as soon as the view appears again.
You can get accuracy by using a very small time interval:
timer = [NSTimer scheduledTimerWithTimeInterval:0.001 target:self selector:@selector(countup) userInfo:nil repeats:YES];
Use a timer only as a trigger to update the screen; do not rely on the exact time between fires.
Keep an NSDate that represents the start time, and when the timer fires, use the current date to calculate the difference and update the label.
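A minimal sketch of that approach, using the asker's 500-per-second figure (names are illustrative):

import UIKit

class CounterViewController: UIViewController {
    // The timer only triggers redraws; the displayed value
    // always comes from the clock, so it cannot drift.
    @IBOutlet weak var counterLabel: UILabel!
    let startDate = Date()
    var timer: Timer?

    override func viewDidLoad() {
        super.viewDidLoad()
        timer = Timer.scheduledTimer(timeInterval: 0.05,
                                     target: self,
                                     selector: #selector(updateLabel),
                                     userInfo: nil,
                                     repeats: true)
    }

    @objc func updateLabel() {
        let elapsed = Date().timeIntervalSince(startDate)
        counterLabel.text = String(Int(elapsed * 500.0))   // 500 per second
    }
}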
Consider using CADisplayLink to update your interface — it gives you very accurate numbers of the time passed since the previous frame was drawn, so you can always keep your UILabel up to date, regardless of how high or low your framerate is.
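A sketch of the CADisplayLink variant (illustrative names; the value is still derived from elapsed time, so the frame rate only affects how smoothly the label updates):

import QuartzCore
import UIKit

final class DisplayLinkCounter: NSObject {
    let label: UILabel
    let startTime = CACurrentMediaTime()
    var displayLink: CADisplayLink?

    init(label: UILabel) {
        self.label = label
        super.init()
        displayLink = CADisplayLink(target: self, selector: #selector(tick(_:)))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick(_ link: CADisplayLink) {
        // Derive the value from the clock, not from counting frames.
        let elapsed = CACurrentMediaTime() - startTime
        label.text = String(Int(elapsed * 500.0))   // 500 per second
    }

    func stop() {
        displayLink?.invalidate()
    }
}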
The time won't be perfectly accurate, but it doesn't need to be: if you get the accurate current time each time you go into the loop and add 500 x (whole number of seconds elapsed), you will get a display that increases by 500 each second (plus or minus 50-100 milliseconds).
The advantage of this approach is that you won't get an ever-increasing discrepancy in the timing, only ever 50-100 milliseconds.
If you want the timer to stop when the user switches out of the app, then you need to disable the timer when the app becomes inactive - have a look at this tutorial on the Ray Wenderlich site http://www.raywenderlich.com/92428/background-modes-ios-swift-tutorial

"Calibrate" NSTimer for UI updates

I'm trying to implement a countdown feature for my program. It's a seconds timer, so I use an NSTimer object with a 1.0-second interval to update the UI. But in order not to accumulate error (every 1.0-second interval incurs a little bit of lag), the program calculates the absolute difference between the current time and the beginning time to get the remaining time displayed in the UI.
The problem is, after the NSTimer object runs for a significant time (say half an hour), it's no longer "synced" with absolute time due to accumulated error: the UI update happens between two "absolute" seconds. For example, if the countdown starts at 00:00:00.000, at first the UI updates at 00:00:01.000, 00:00:02.000 ... but after a while it becomes 00:30:03.567 or something like that.
Any idea how I can deal with this? Are there any other better ways to implement this? Thanks!
One high-level idea is to detect when the timer is getting too far out of sync based on your absolute time calculation. When it gets past a specific threshold, say 0.01 seconds or whatever you desire, cancel the current timer and start a new one after an appropriate delay that gets it back "in sync".
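A sketch of that idea, assuming a 1-second repeating timer and a recorded start date; the 10 ms threshold and the names are illustrative:

import Foundation

final class CalibratedCountdown {
    let startDate = Date()
    var timer: Timer?
    let driftThreshold = 0.01   // illustrative: re-sync past 10 ms of drift

    func start() {
        startRepeatingTimer()
    }

    private func startRepeatingTimer() {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
            self?.tick()
        }
    }

    private func tick() {
        // The displayed value always comes from the absolute difference.
        let elapsed = Date().timeIntervalSince(startDate)
        updateUI(secondsElapsed: Int(elapsed.rounded()))

        // Distance from the nearest whole second tells us the drift.
        let drift = elapsed - elapsed.rounded()
        if abs(drift) > driftThreshold {
            // Cancel the drifting timer and restart on the next whole second.
            timer?.invalidate()
            let delay = 1.0 - elapsed.truncatingRemainder(dividingBy: 1.0)
            DispatchQueue.main.asyncAfter(deadline: .now() + delay) { [weak self] in
                self?.tick()
                self?.startRepeatingTimer()
            }
        }
    }

    private func updateUI(secondsElapsed: Int) {
        print("Elapsed: \(secondsElapsed) s")   // stand-in for the real label update
    }
}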

Interpreting downward spikes in Time Profiler

I'd appreciate some help on how I should interpret some results I get from Time Profiler and Activity Monitor. I couldn't find anything on this on the site, probably because the question is rather specific. However, I imagine I'm not the only one not sure what to read into the spikes they get on the Time Profiler.
I'm trying to figure out why my game is having regular hiccups on the iPhone 4. I'm trying to run it at 60 FPS, so I know it's tricky on such an old device, but I know some other games manage that fine. I'm using Unity, but this is a more general question about interpreting Instruments results. I don't have enough reputation to post images, and I can only post two links, so I can't post everything I'd like.
Here is what I get running my game on Time Profiler:
Screenshot of Time Profiler running my game
As far as I understand (but please correct me if I'm wrong), this graph is showing how much CPU my game uses during each sample the Time Profiler takes (I've set the samples to be taken once per millisecond). As you can see, there are frequent downward spikes in that graph, which (based on looking at the game itself as it plays) coincide with the hiccups in the game.
Additionally, the spikes are more common while I touch the device, especially if I move my finger on it continuously (which is what I did while playing our game above). (I couldn't make a comparable non-touching version because my game requires touching, but see below for a comparison.)
What confuses me here is that the spikes are downward: if my code were inefficient, doing too many calculations on some frames, I'd expect to see upward spikes, not downward. So here are the theories I've managed to come up with:
1) The downward spikes represent something else stealing CPU time (like a background task, or the CPU's speed itself varying, or something). Because less time is available for my processing, I get hiccups, and it also shows as my app using less CPU.
2) My code is in fact inefficient, causing spikes every now and then. Because the processing isn't finished in one frame, it continues into the next, but only needs a little extra time there. That means that on the second frame, it uses less CPU, resulting in a downward spike. (It is my understanding that iOS frames are always of equal length, say 1/60 s, so the third frame cannot start early even if we spent just a little extra time on the second.)
3) This is just a sampling problem, caused by the fact that the sampling frequency is 1ms while the frame length is about 16ms.
The first two theories would make sense to me, and would also explain why our game has hiccups while some lighter games don't: 1) lighter games would not suffer so badly from stolen CPU, because they don't need that much CPU to begin with; 2) lighter games don't have as many spikes of their own.
However, some other tests seem to go against each of these theories:
1) If frames always get stolen like this, I'd expect similar spikes to appear in other games too. However, testing with another game (from the App Store, also made with Unity), I don't get them (I had an image to show this, but unfortunately I cannot post it).
Note: This game has lots of hiccups while running in the Time Profiler as well, so hiccups don't seem to always mean downward spikes.
2) To test the hypothesis that my app is simply spiking, I wrote a program (again in Unity) that wastes a consistent number of milliseconds per frame (by running a loop until the specified time has passed according to the system clock). Here's what I get in Time Profiler when I make it waste 8 ms per frame:
Screenshot of Time Profiler running my time waster app
As you can see, the downward spikes are still there, even though the app really shouldn't be able to cause spikes. (You can also see the effect of touching here, as I didn't touch it for the first half of the visible graph, and touched it continuously for the second.)
3) If this were due to a mismatch between the framerate and the sampling rate, I'd expect to see a lot more oscillation. Surely my app would use 100% of the milliseconds until it's done with a frame, then drop to zero?
So I'm pretty confused about what to make of this. I'd appreciate any insight you can provide into this, and if you can tell me how to fix it, all the better!
Best regards,
Tommi Horttana
Have you tried Unity's profiler? Does it show similar results? Note that Unity3D has two profilers on iOS:
editor profiler - Pro only (but there is a 30-day trial)
internal profiler - you have to enable it in the Xcode project's source
Look at http://docs.unity3d.com/Manual/MobileProfiling.html; maybe something there will give you a hint.
If I had to guess, I'd check one of the most common sources of timing hiccups: the Mono garbage collector.
Try running it yourself at a set frequency (like every 250 ms) and see if there is a difference in the pattern:
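// Force a Mono garbage-collection pass at a moment you control,
// instead of letting it interrupt an arbitrary frame.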
System.GC.Collect();

What is the best way (performance-wise) to flash things at a varying speed?

I have an app where I flash words at a constant speed. Say it's set to 60 words a minute: each word then shows for 1 second. That was pretty easy to accomplish with NSTimer.
However, I want to make it a little more intelligent now: longer words show for slightly longer than shorter words. I've figured out the math for calculating this, but I'm not sure how, in Objective-C, to present one word for, say, 0.15 seconds, then another word for 0.18 seconds, then a third word for 0.04 seconds, and so on, depending on the length of the word.
Would just using a delay be the best way to do this?
You could use performSelector:withObject:afterDelay:, but it isn't necessarily very easy to manage.
You could use a repeating NSTimer and set its fireDate for each new update required. This is relatively expensive, but less so than repeatedly creating new timers.
You could use CADisplayLink with a combination of duration and frameInterval to get updates at multiples of the screen refresh rate. This should probably be the most performant and accurate.
But, overall, you shouldn't worry about performance until you have some evidence of a problem and / or have done some profiling. Think instead about what features you need and how easy they are to implement with each solution.
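For illustration, here is a sketch of one straightforward variant of these ideas: a chain of non-repeating timers, where each word schedules the next with its own duration (Swift; the duration formula is a stand-in for the asker's own math, and the names are illustrative):

import UIKit

final class VariableSpeedFlasher {
    let label: UILabel
    let words: [String]
    var index = 0
    var timer: Timer?

    init(label: UILabel, words: [String]) {
        self.label = label
        self.words = words
    }

    // Stand-in formula: a base time plus a per-character bonus.
    private func duration(for word: String) -> TimeInterval {
        return 0.10 + 0.02 * Double(word.count)
    }

    func start() {
        showNextWord()
    }

    private func showNextWord() {
        guard index < words.count else { return }
        let word = words[index]
        index += 1
        label.text = word
        // One non-repeating timer per word; each fire schedules the next.
        timer = Timer.scheduledTimer(withTimeInterval: duration(for: word),
                                     repeats: false) { [weak self] _ in
            self?.showNextWord()
        }
    }
}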
