I am trying to detect the speed of my internet connection using NSURLConnection. I start downloading a file; in the NSURLConnection delegates I start a timer, and when the download finishes I take the elapsed time along with the data received and run the calculation below to get MB/sec.
if let startTime = startTime {
    let elapsed = NSDate().timeIntervalSinceDate(startTime) // seconds since the download began
    NSLog("\(length) -- \(elapsed)")
    let bytesPerSecond = Double(length) / elapsed            // length is bytes, so this is bytes/sec
    var result = CGFloat(bytesPerSecond / 1024)              // divide by 1024
    result = result * 0.0078125                              // 0.0078125 = 1/128
    result = result * 0.0009765625                           // 0.0009765625 = 1/1024
    return result
}
My question is: why am I dividing by 1024 here? Because if I don't, I get something in bits/bytes...
I am assuming I am getting seconds from NSDate().timeIntervalSinceDate(startTime) and bytes from NSData's length.
I think I am getting the right value, but I am not sure. Let me know why it's necessary to divide by 1024!
I was dividing by 1024 in the example you took this from simply because looking at bytes per second yields a number that is too large to make sense of (and it suggests a misleading degree of accuracy, given the variability in that number).
By dividing by 1024, you get kilobytes per second. To get megabytes per second, you'd divide by 1024 × 1024. If that original code sample claimed to produce megabytes per second, the lone 1024 was a mistake, as dividing by 1024 alone yields kilobytes per second.
So use whatever measurement you want: bytes per second, kilobytes per second, or megabytes per second. Or multiply megabytes per second by 8 to get megabits per second (another common measure of speed). Just divide the bytes per second by the appropriate factor.
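For example, here is a minimal Swift sketch of the conversions, with sample values standing in for your length and elapsed:

    // Sample values standing in for the question's `length` and `elapsed`:
    let length = 1_048_576               // bytes received
    let elapsed = 2.0                    // seconds elapsed

    let bytesPerSecond = Double(length) / elapsed
    let kilobytesPerSecond = bytesPerSecond / 1024              // KB/s
    let megabytesPerSecond = bytesPerSecond / (1024 * 1024)     // MB/s
    let megabitsPerSecond = megabytesPerSecond * 8              // Mbps
    print(kilobytesPerSecond, megabytesPerSecond, megabitsPerSecond) // 512.0 0.5 4.0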
Is there any good way to estimate Realm file size on iOS, and what is the highest safe Realm DB size? I am building an app that gathers time-series data, and I am now working out which granularity to choose.
Let's say that I would like to keep the following amount: a set of 15 doubles, saved every 5 seconds for 2 hours a day, kept for 10 years.
That makes approximately 15 × (60/5) × 60 × 2 × 365 × 10 ≈ 78 million double entries. I suppose this amount is too big to store on an iPhone, so I should only keep data for, say, 1 year at most?
Let's check the math
You said
Set of 15 doubles, saved each 5 seconds for 2 hours in a day, kept for
10 years.
Here's that in numbers:
8 bytes per double × 15 doubles × 1,440 samples per day (see below) × 3,650 days
= 630,720,000 bytes
Now take 630,720,000 bytes × 0.00000095367432 (i.e. divide by 1024 × 1024) ≈ 601 MB in binary (MiB), or about 631 MB in decimal.
That could easily be done in Realm, and on a phone: roughly 600 MB is a lot less than a 64 GB phone can hold.
I am sure there is more overhead involved, so the numbers may vary, but it seems possible unless I missed something.
Samples per day: two hours is 7,200 seconds, and 7,200 / 5 = 1,440.
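Here is that arithmetic as a quick Swift sketch, if you want to play with the parameters (the names are mine, not from the question):

    let bytesPerDouble = 8
    let doublesPerSample = 15
    let samplesPerDay = (2 * 60 * 60) / 5        // 2 hours at one sample per 5 s = 1440
    let days = 365 * 10

    let totalBytes = bytesPerDouble * doublesPerSample * samplesPerDay * days
    print(totalBytes)                             // 630720000
    print(Double(totalBytes) / (1024 * 1024))     // ≈ 601.5 (binary MB, i.e. MiB)
    print(Double(totalBytes) / 1_000_000)         // ≈ 630.7 (decimal MB)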
I have been working on an H.264 hardware-accelerated encoder implementation using VideoToolbox's VTCompressionSession for a while now, and a consistent problem has been the unreliable bitrate coming out of it. I have read many forum posts and looked through existing code, and have tried to follow suit, but the bitrate out of my encoder is almost always somewhere between 5% and 50% off the target, and on occasion I've seen huge errors, like 400% overshoot, where even one frame can be twice the size of the given average bitrate.
My session is set up as follows:
kVTCompressionPropertyKey_AverageBitRate = desired bitrate
kVTCompressionPropertyKey_DataRateLimits = [desired bitrate / 8, 1]; accounting for bits vs bytes
kVTCompressionPropertyKey_ExpectedFrameRate = framerate (30, 15, 5, or 1 fps)
kVTCompressionPropertyKey_MaxKeyFrameInterval = 1500
kVTCompressionPropertyKey_MaxKeyFrameIntervalDuration = 1500 / framerate
kVTCompressionPropertyKey_AllowFrameReordering = NO
kVTCompressionPropertyKey_ProfileLevel = kVTProfileLevel_H264_Main_AutoLevel
kVTCompressionPropertyKey_RealTime = YES
kVTCompressionPropertyKey_H264EntropyMode = kVTH264EntropyMode_CABAC
kVTCompressionPropertyKey_BaseLayerFrameRate = framerate / 2
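In Swift, that configuration might be applied in one call along these lines (a sketch, not the actual production code; session, bitrate, and framerate are placeholders):

    import Foundation
    import VideoToolbox

    // Sketch: apply the configuration above as a single dictionary.
    func configure(_ session: VTCompressionSession, bitrate: Int, framerate: Int) {
        let properties: [CFString: Any] = [
            kVTCompressionPropertyKey_AverageBitRate: bitrate,
            kVTCompressionPropertyKey_DataRateLimits: [bitrate / 8, 1], // [bytes, seconds]
            kVTCompressionPropertyKey_ExpectedFrameRate: framerate,
            kVTCompressionPropertyKey_MaxKeyFrameInterval: 1500,
            kVTCompressionPropertyKey_MaxKeyFrameIntervalDuration: 1500 / framerate,
            kVTCompressionPropertyKey_AllowFrameReordering: false,
            kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel,
            kVTCompressionPropertyKey_RealTime: true,
            kVTCompressionPropertyKey_H264EntropyMode: kVTH264EntropyMode_CABAC,
            kVTCompressionPropertyKey_BaseLayerFrameRate: framerate / 2,
        ]
        let status = VTSessionSetProperties(session, propertyDictionary: properties as CFDictionary)
        if status != noErr {
            NSLog("VTSessionSetProperties failed: \(status)")
        }
    }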
And I adjust the average bitrate and data-rate values throughout the session to try to compensate for the volatility (if the output is too high, I reduce them a bit; if too low, I increase them, with limits on how far they can move in either direction).
I create the session and then apply the above configuration as a single dictionary using VTSessionSetProperties and feed frames into it like this:
VTCompressionSessionEncodeFrame(compressionSessionRef,
                                static_cast<CVImageBufferRef>(pixelBuffer),
                                CMTimeMake(capturetime, 1000),
                                kCMTimeInvalid,
                                frameProperties,
                                frameDetailsStruct,
                                &encodeInfoFlags);
So I'm supplying timing information as the API says to do.
Then I add up the size of the output for each frame and divide over a periodic time window to determine the outgoing bitrate and its error from the target. This is where I see the significant volatility.
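Roughly, the measurement looks like this sketch (simplified; BitrateMeter and the one-second window are illustrative, not my exact code):

    import CoreMedia
    import Foundation

    // Sketch: sum encoded output sizes from the compression output callback
    // and report the realized bitrate once per measurement window.
    final class BitrateMeter {
        private var bytesInWindow = 0
        private var windowStart = CFAbsoluteTimeGetCurrent()

        // Call with each encoded CMSampleBuffer from the output callback.
        func record(_ sampleBuffer: CMSampleBuffer) {
            if let dataBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) {
                bytesInWindow += CMBlockBufferGetDataLength(dataBuffer)
            }
            let elapsed = CFAbsoluteTimeGetCurrent() - windowStart
            if elapsed >= 1.0 {                               // 1 s window; tune as needed
                let bitsPerSecond = Double(bytesInWindow * 8) / elapsed
                NSLog("outgoing bitrate: \(bitsPerSecond) bps")
                bytesInWindow = 0
                windowStart = CFAbsoluteTimeGetCurrent()
            }
        }
    }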
I'm looking for any help in getting the bitrate under control, as I'm not sure what to do at this point. Thank you!
I think you should check the frame timestamps you set in VTCompressionSessionEncodeFrame; they seem to affect the bitrate. If you change the frame rate, change the frame timestamps accordingly.
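For example (a sketch of the idea; the names are mine): derive each presentation timestamp from a frame counter so the timestamps advance exactly 1/framerate apart, since the rate controller uses them to budget bits per frame.

    import CoreMedia

    let framerate: Int32 = 30
    var frameIndex: Int64 = 0

    // Presentation time for the next frame, spaced exactly 1/framerate apart.
    func nextPresentationTime() -> CMTime {
        let pts = CMTimeMake(value: frameIndex, timescale: framerate)
        frameIndex += 1
        return pts
    }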
I am trying to calculate the reads/second and writes/second in my Cassandra 2.1 cluster. After searching and reading, I came across the JMX bean
org.apache.cassandra.metrics:type=ClientRequest,scope=Write,name=Latency
Here I can see oneMinuteRate. I have started a brand-new cluster and collected these metrics from zero.
When I write my first record, I can see
Count = 1
OneMinuteRate = 0.01599111...
Does it mean that my writes/s figure is 0.0159911?
Or does it mean that, based on the last minute of data, my write latency is 0.01599, where write latency refers to the response time for writing a record?
Please help me understand the value.
Thanks.
It means that in the last minute, your writes were occurring at a rate of 0.01599 writes per second. Think about it this way: the rate of writes in the last 60 seconds would be
WritesInLastMinute ÷ 60
So in your case
1 ÷ 60 = 0.0166
Or more precisely, .01599.
If you observed no further writes after that, the value would descend down to zero over the next minute.
OneMinuteRate, FiveMinuteRate, and FifteenMinuteRate are exponential moving averages, meaning they are not simply readings divided by time; instead, as the name implies, they take an exponentially weighted series of averages, as below:
result(t) = (1 - w) × result(t - 1) + w × event_this_period
where w is the weighting factor and t is the tick time. In other words, they take a fixed fraction of the new reading, say 20%, and the remaining 80% from the old readings; it's the same way UNIX systems measure CPU load.
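Here is that recurrence as a small Swift sketch. (Dropwizard ticks every 5 seconds and, for the one-minute rate, uses w = 1 - e^(-5/60) ≈ 0.08; treat that constant as an assumption to verify against your metrics library's source.)

    import Foundation

    // Sketch of the exponentially weighted moving average behind OneMinuteRate.
    struct EWMARate {
        let w: Double               // weighting factor
        var rate = 0.0              // current result, events per second

        mutating func tick(events: Double, periodSeconds: Double) {
            let instantRate = events / periodSeconds
            rate = (1 - w) * rate + w * instantRate
        }
    }

    var oneMinuteRate = EWMARate(w: 1 - exp(-5.0 / 60.0))
    oneMinuteRate.tick(events: 1, periodSeconds: 5)  // one write in the first 5 s tick
    print(oneMinuteRate.rate)                        // small rate; decays toward zero if no more writes arrive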
However, where this applies to requests that a server receives, below is a chart from a single request to a server, with measurements taken by Dropwizard.
As you can see, a single request draws a curve over time. This is really useful for spotting trends, but I'm not sure these rates are great for monitoring live traffic, especially critical traffic.
I'd like to be able to determine the byte positions at which a given segment of an NSData-backed compressed MP3 file begins and ends.
For example, if I am playing an MP3 file using AVPlayer (or any player) that is 1 minute long and 1,000,000 bytes, I'd like to know approximately how many bytes into the file the 30-second mark falls, and then the 40-second mark.
Note that because the MP3 file is compressed, I can't just divide the byte count in half to find the 30-second mark.
If this can't be done with Swift/Objective-C, do you know if this determination can be done with any programming language? Thanks!
It turns out I had a different problem to solve: approximating the byte position of a specific time, say the 4:29 point of a 32:45-long podcast episode, to within a few seconds of accuracy.
I used a function along these lines to calculate the approximate byte position:
startTimeBytesPosition = (startTimeInSeconds / episodeDuration) * episodeFileSize
That function worked like a charm for some episodes, but for others the resulting start time would be off by about 30-40 seconds.
It turns out this inaccuracy was happening because some MP3s contain metadata at the very beginning of the file, and image files stored within that metadata can be 500 KB or more, so my calculation of time from byte position for any episode with an embedded image would be off by about 500 KB (which translated into about 30-40 seconds in this case).
To resolve this, I first determine the size in bytes of the metadata in the MP3 file, and then use that to offset the approximation function:
startTimeBytesPosition = metadataBytesOffset + (startTimeInSeconds / episodeDuration) * episodeFileSize
So far this code seems to be doing a good job of approximating time based on byte position accurately within a few seconds.
I should note that this assumes that the metadata for the image will always appear at the beginning of the mp3 file, and I don't know if that will always be the case.
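If it helps, here is a Swift sketch of the approach, assuming the front-of-file metadata is an ID3v2 tag (the usual container for embedded artwork). It reads the tag size from the 10-byte ID3v2 header, whose size field is a 4-byte "synchsafe" integer (7 significant bits per byte), and then scales over the remaining audio bytes:

    import Foundation

    // Sketch: size of a leading ID3v2 tag, or 0 if none is present.
    func id3v2TagSize(of data: Data) -> Int {
        guard data.count >= 10,
              data[0] == 0x49, data[1] == 0x44, data[2] == 0x33 else { // "ID3"
            return 0
        }
        // Bytes 6-9: synchsafe integer, 7 bits per byte.
        let size = (Int(data[6]) << 21) | (Int(data[7]) << 14)
                 | (Int(data[8]) << 7)  |  Int(data[9])
        return 10 + size // 10-byte header + tag body
    }

    // Sketch: approximate byte offset of a playback time, skipping the metadata.
    func approximateBytePosition(seconds: Double, duration: Double, data: Data) -> Int {
        let metadataBytesOffset = id3v2TagSize(of: data)
        let audioBytes = data.count - metadataBytesOffset
        return metadataBytesOffset + Int(seconds / duration * Double(audioBytes))
    }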
I am making an iOS app where the size of some files is displayed in MB. My question is whether it is correct to calculate 1000 bytes = 1 KB or 1024 bytes = 1 KB. I have seen that Finder on the Mac calculates with 1000 B, but an iOS file manager called iFile calculates with 1024 B. The Wikipedia article didn't really answer my question. I am asking specifically about file size, not HD capacity etc.
My question is whether it is correct to calculate 1000 bytes = 1 KB or
1024 bytes = 1 KB.
Both are correct, and both are used in different situations.
1024 is more common for file sizes, while 1000 is more common for physical disk sizes, but neither is used that way consistently. As you mention, some programs use 1000 for file sizes, and for memory cards 1024 is often used rather than 1000.
An example of how inconsistently the units are used is the 1.44 MB floppy disk. It's neither 1.44 * 1000 * 1000 bytes nor 1.44 * 1024 * 1024 bytes, but actually 1.44 * 1000 * 1024 bytes.
An effort was made to introduce the kibibyte, a unit that is always 1024 bytes. It never really caught on, but you can see it used sometimes.
A kilobyte was, and sometimes (usually?) still is, 1024 bytes. And a megabyte is 1024 KB, a gigabyte is 1024 MB, and so on. But lately, the decimal-lovers have redefined them to powers of 1000, making a kilobyte 8000 bits instead of a nice power of two. They renamed the old units "kibibytes" and "mebibytes", or KiB and MiB.
So, if you want to please both crowds¹, you can use KiB and powers of 1024. However, I'd suggest that, if you think it's worth the effort, you make it a setting the user can change, defaulting to binary KB.
¹ This isn't really pleasing both crowds, though. I personally hate seeing KiB. It shouldn't matter. When you need an exact measurement, measure in bytes and don't abbreviate.
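A sketch of how that setting could work using Foundation's ByteCountFormatter, which supports both conventions:

    import Foundation

    let formatter = ByteCountFormatter()
    formatter.countStyle = .binary                      // 1 KB = 1024 bytes
    print(formatter.string(fromByteCount: 1_500_000))   // "1.4 MB"
    formatter.countStyle = .decimal                     // 1 KB = 1000 bytes
    print(formatter.string(fromByteCount: 1_500_000))   // "1.5 MB"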
1024 B = 1 KB.
This 1000 B stuff is metric... ;)
The basic units in physics and math use SI prefixes:
K = 10^3,
M = 10^6
so 1 km is 1000 m, never 1024 m.
A lot of programs use the non-metric convention 1024 KB = 1 MB.
A historical quirk. :)
Windows uses the usual 1 KB = 1024 B.
But if you buy a 1 GB disk, you are buying 10^9 B.
The true unit of measurement for 1KB is 1024B: http://oxforddictionaries.com/definition/kilobyte?q=kilobyte
However, some manufacturers of software and hardware, in an effort to deceive consumers and make themselves look better, may calculate it as 1000 B. This is actually a fairly recent trend.
Kilo- denotes multiplication by one thousand (not 1024). Modern terminology reflects this fact:
1 kilobyte = 1000 bytes = 8000 bits
1 kibibyte = 1024 bytes = 8192 bits
Previous use of kilo (with bytes) was based on the approximation that 2^10 (1024) is merely close to 1000.
Imagine being tasked with coming up with a word that means 1000 bytes after some "loose approximation" had already taken the most obvious term you'd want to use. This led to the corrected meanings listed above.
This terminology has been standardized. The following is a quote from page 143 of The International System of Units:
The SI prefixes refer strictly to powers of 10. They should not be
used to indicate powers of 2 (for example, one kilobit represents 1000
bits and not 1024 bits). The names and symbols for prefixes to be used
with powers of 2 are recommended as follows:
kibi Ki 2^10
mebi Mi 2^20
gibi Gi 2^30
tebi Ti 2^40
pebi Pi 2^50
exbi Ei 2^60
zebi Zi 2^70
yobi Yi 2^80
The "bi" in the prefixes above is based on the word "binary". When you append "bit" or "byte" to them, you get the units listed here (where conversions are also provided).