Given a list of CMSampleBuffers that have been read in from an asset, I want to adjust the duration of the asset so that it's half the length (twice the speed) of the original.
Currently my function for generating new time stamps looks like:
func adjustTimeStampsForBuffers(buffers: [CMSampleBuffer]) -> [CMTime] {
    let frameCount = buffers.count
    // self.duration is CMTimeGetSeconds(asset.duration)
    let increment = Float(self.duration / 2) / Float(frameCount)
    return Array(0.stride(to: frameCount, by: 1)).enumerate().map {
        let seconds: Float64 = Float64(increment) * Float64($0.index)
        return CMTimeMakeWithSeconds(seconds, self.asset.duration.timescale)
    }
}
However, this doesn't seem to work and the output assets are in fact twice the length, not half. Can anybody point out where I'm going wrong?
Edit:
Thanks to @sschale, here's my final answer:
func adjustTimeStampsForBuffers(buffers: [CMSampleBuffer]) -> [CMTime] {
    return buffers.map {
        let time = CMSampleBufferGetPresentationTimeStamp($0)
        return CMTimeMake(time.value, time.timescale * 2)
    }
}
Instead of calculating new values, each buffer's existing timestamp is adjusted.
Based on my reading of the docs, it looks like self.asset.duration.timescale may be the key here, as changing it will influence the whole file (if I'm understanding correctly that that timescale applies to the whole file; otherwise you may need to adjust it in each of the buffers).
See here for more info as well.
Relevant section:
A CMTime is represented as a rational number, with a numerator (an
int64_t value), and a denominator (an int32_t timescale).
Conceptually, the timescale specifies the fraction of a second each
unit in the numerator occupies. Thus if the timescale is 4, each unit
represents a quarter of a second; if the timescale is 10, each unit
represents a tenth of a second, and so on. In addition to a simple
time value, a CMTime can represent non-numeric values: +infinity,
-infinity, and indefinite. Using a flag, CMTime indicates whether the time has been rounded at some point.
CMTimes contain an epoch number, which is usually set to 0, but can be
used to distinguish unrelated timelines: for example, it could be
incremented each time through a presentation loop, to differentiate
between time N in loop 0 from time N in loop 1.
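To make the arithmetic concrete, here is a minimal Swift sketch (standalone, not the asker's pipeline) showing that the seconds a CMTime represents are value / timescale, so doubling the timescale halves every presentation time:

import CoreMedia

let original = CMTime(value: 900, timescale: 600)                               // 900/600 = 1.5 s into the asset
let retimed = CMTime(value: original.value, timescale: original.timescale * 2)

print(CMTimeGetSeconds(original))   // 1.5
print(CMTimeGetSeconds(retimed))    // 0.75: same value, double timescale, half the time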
Related
I am getting the current playing time in seconds, but I need it in milliseconds. I tried currentTime.value / currentTime.timescale, but it didn't give the exact value.
CMTime currentTime = vPlayer.currentItem.currentTime; // playing time
CMTimeValue tValue = currentTime.value;
CMTimeScale tScale = currentTime.timescale;
NSTimeInterval time = CMTimeGetSeconds(currentTime);
NSLog(@"Time :%f", time); // This is in seconds, it misses the decimal value
double shot = (float)tValue / (float)tScale;
shotTimeVideo = [NSString stringWithFormat:@"%.2f", (float)tValue / (float)tScale];
Okay, first of all, the value you want is milliseconds, not seconds.
So you can just use CMTimeGetSeconds(time) to get seconds; then, if you want milliseconds, multiply the seconds by 1000 (as a float or double value).
For calculations that stay in CMTime there is the CMTime function CMTimeMultiplyByRatio(time, multiplier, divisor), which returns (time * multiplier) / divisor as a new CMTime.
Apple's Doc
@function CMTimeMultiplyByRatio
@abstract Returns the result of multiplying a CMTime by an integer, then dividing by another integer.
@discussion The exact rational value will be preserved, if possible without overflow. If an overflow
would occur, a new timescale will be chosen so as to minimize the rounding error.
Default rounding will be applied when converting the result to this timescale. If the
result value still overflows when timescale == 1, then the result will be either positive
or negative infinity, depending on the direction of the overflow.
If any rounding occurs for any reason, the result's kCMTimeFlags_HasBeenRounded flag will be
set. This flag will also be set if the CMTime operand has kCMTimeFlags_HasBeenRounded set.
If the denominator, and either the time or the numerator, are zero, the result will be
kCMTimeInvalid. If only the denominator is zero, the result will be either kCMTimePositiveInfinity
or kCMTimeNegativeInfinity, depending on the signs of the other arguments.
If time is invalid, the result will be invalid. If time is infinite, the result will be
similarly infinite. If time is indefinite, the result will be indefinite.
@result (time * multiplier) / divisor
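For what it's worth, here is a minimal Swift sketch of the two conversions discussed above: plain seconds-to-milliseconds arithmetic, and keeping the value as a CMTime with a millisecond timescale. The currentTime value here is just an illustrative stand-in for whatever CMTime you read from the player.

import CoreMedia

let currentTime = CMTime(value: 383, timescale: 100)        // 3.83 s, as an example

// 1. Plain arithmetic: seconds * 1000 gives milliseconds.
let milliseconds = CMTimeGetSeconds(currentTime) * 1000.0   // 3830.0

// 2. Stay in CMTime: rescale to a timescale of 1000 so .value is in milliseconds.
let msTime = CMTimeConvertScale(currentTime, timescale: 1000, method: .roundHalfAwayFromZero)
print(msTime.value)                                         // 3830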
A little late, but may be useful for others.
You can get the timestamp in milliseconds from a CMTime object with:
CMTimeConvertScale(yourCMTime, timescale: 1000, method: CMTimeRoundingMethod.roundHalfAwayFromZero)
An example:
var yourtime: CMTime = CMTimeMake(value: 1234567, timescale: 10)
var timemilli: CMTime = CMTimeConvertScale(yourtime, timescale: 1000, method: CMTimeRoundingMethod.roundHalfAwayFromZero)
var timemillival: Int64 = timemilli.value
print("timemilli \(timemillival)")
which will produce
timemilli 123456700
I'm using AVPlayer to play a live stream. This stream supports one hour of catch-up, which means the user can seek back up to one hour and play from there. My question is: how do I know the exact position the player is currently playing at? I need to display the current position on the player view. For example, if the user is playing half an hour behind live, display -30:00; if the user is playing the latest content, the player should show 00:00 or "live". Thanks.
Swift solution:
override func getLiveDuration() -> Float {
    var result: Float = 0.0
    if let items = player.currentItem?.seekableTimeRanges {
        if !items.isEmpty {
            let range = items[items.count - 1]
            let timeRange = range.timeRangeValue
            let startSeconds = CMTimeGetSeconds(timeRange.start)
            let durationSeconds = CMTimeGetSeconds(timeRange.duration)
            result = Float(startSeconds + durationSeconds)
        }
    }
    return result
}
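A usage sketch (hypothetical, assuming player is the same AVPlayer used above): the returned value is the position of the live edge in seconds, so you can seek straight to it.

// Seek to the live edge using the duration computed above.
let liveEdge = getLiveDuration()
player.seek(to: CMTime(seconds: Double(liveEdge), preferredTimescale: 1000))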
To get the live position and seek to it, you can use the seekableTimeRanges of AVPlayerItem:
CMTimeRange seekableRange = [player.currentItem.seekableTimeRanges.lastObject CMTimeRangeValue];
CGFloat seekableStart = CMTimeGetSeconds(seekableRange.start);
CGFloat seekableDuration = CMTimeGetSeconds(seekableRange.duration);
CGFloat livePosition = seekableStart + seekableDuration;
[player seekToTime:CMTimeMake(livePosition, 1)];
Also, when you seek some time back, you can get the current playing position by calling the currentTime method:
CGFloat current = CMTimeGetSeconds([self.player.currentItem currentTime]);
CGFloat diff = livePosition - current;
I know this question is old, but I had the same requirement and I believe the existing solutions don't properly address the intent of the question.
What I did for this requirement was to gather the current point in time, the starting time, and the total duration of the seekable range.
I'll explain something before going further: the current point in time can surpass (starting time + total duration). This is due to the way HLS is structured as TS segments. TS segments are small chunks of playable video; you could have, in your seekable range, 5 TS segments of 10 seconds each. This doesn't mean that 50 seconds is the full length of the live stream; there is around a full segment more (so 60 seconds of playtime total), but it isn't categorized as seekable since you shouldn't seek to that segment. If you were to do so, you'd notice rebuffering in most instances (because the source may still be creating the next TS segment when you have already reached the end of playback).
What I did was check whether the current stream time is beyond the seekable range; if so, we are live on the stream. If it isn't, you can easily calculate how far behind live you are by subtracting the starting time and the total duration from the current time.
if let timeRange = player.currentItem?.seekableTimeRanges.last?.timeRangeValue {
    let start = timeRange.start.seconds
    let totalDuration = timeRange.duration.seconds
    let currentTime = player.currentTime().seconds

    let secondsBehindLive = currentTime - totalDuration - start
}
The code above will give you a negative number of seconds behind "live", or more specifically behind the start of the latest TS segment, and a positive number or zero when it's playing the latest TS segment.
To be honest, I don't really know when seekableTimeRanges will have more than one value; it has always been just one for the streams I have tested with. But if you find more than one value in your streams, you may have to figure out whether to add up all the ranges' durations, which time range to use as the start value, and so on. At least for my use case, this was enough.
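Tying this back to the original question (showing -30:00 versus live), here is a hedged sketch of how the secondsBehindLive value from above could be turned into a display string; the one-second tolerance is an arbitrary choice, not something from the answers above.

import Foundation

// Hypothetical formatter: clearly behind live -> "-MM:SS", otherwise -> "LIVE".
func displayString(forSecondsBehindLive seconds: Double) -> String {
    guard seconds < -1 else { return "LIVE" }   // within ~1 s of the live edge
    let behind = Int((-seconds).rounded())
    return String(format: "-%02d:%02d", behind / 60, behind % 60)
}

// displayString(forSecondsBehindLive: -1800) == "-30:00"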
I'm trying to generate a spectrogram from an AVAudioPCMBuffer in Swift. I install a tap on an AVAudioMixerNode and receive a callback with the audio buffer. I'd like to convert the signal in the buffer to a [Float:Float] dictionary where the key represents the frequency and the value represents the magnitude of the audio on the corresponding frequency.
I tried using Apple's Accelerate framework but the results I get seem dubious. I'm sure it's just in the way I'm converting the signal.
I looked at this blog post amongst other things for a reference.
Here is what I have:
self.audioEngine.mainMixerNode.installTapOnBus(0, bufferSize: 1024, format: nil, block: { buffer, when in

    let bufferSize: Int = Int(buffer.frameLength)

    // Set up the transform
    let log2n = UInt(round(log2(Double(bufferSize))))
    let fftSetup = vDSP_create_fftsetup(log2n, Int32(kFFTRadix2))

    // Create the complex split value to hold the output of the transform
    var realp = [Float](count: bufferSize/2, repeatedValue: 0)
    var imagp = [Float](count: bufferSize/2, repeatedValue: 0)
    var output = DSPSplitComplex(realp: &realp, imagp: &imagp)

    // Now I need to convert the signal from the buffer to complex value, this is what I'm struggling to grasp.
    // The complexValue should be UnsafePointer<DSPComplex>. How do I generate it from the buffer's floatChannelData?
    vDSP_ctoz(complexValue, 2, &output, 1, UInt(bufferSize / 2))

    // Do the fast Fourier forward transform
    vDSP_fft_zrip(fftSetup, &output, 1, log2n, Int32(FFT_FORWARD))

    // Convert the complex output to magnitude
    var fft = [Float](count: Int(bufferSize / 2), repeatedValue: 0.0)
    vDSP_zvmags(&output, 1, &fft, 1, vDSP_Length(bufferSize / 2))

    // Release the setup
    vDSP_destroy_fftsetup(fftSetup)

    // TODO: Convert fft to [Float:Float] dictionary of frequency vs magnitude. How?
})
My questions are:
1. How do I convert the buffer.floatChannelData to UnsafePointer<DSPComplex> to pass to the vDSP_ctoz function? Is there a different/better way to do it, maybe even bypassing vDSP_ctoz?
2. Is this different if the buffer contains audio from multiple channels? How is it different when the buffer audio channel data is or isn't interleaved?
3. How do I convert the indices in the fft array to frequencies in Hz?
4. Anything else I may be doing wrong?
Update
Thanks everyone for suggestions. I ended up filling the complex array as suggested in the accepted answer. When I plot the values and play a 440 Hz tone on a tuning fork it registers exactly where it should.
Here is the code to fill the array:
var channelSamples: [[DSPComplex]] = []
for var i = 0; i < channelCount; ++i {
    channelSamples.append([])
    let firstSample = buffer.format.interleaved ? i : i * bufferSize
    for var j = firstSample; j < bufferSize; j += buffer.stride * 2 {
        channelSamples[i].append(DSPComplex(real: buffer.floatChannelData.memory[j],
                                            imag: buffer.floatChannelData.memory[j + buffer.stride]))
    }
}
The channelSamples array then holds a separate array of samples for each channel.
To calculate the magnitude I used this:
var spectrum = [Float]()
for var i = 0; i < bufferSize / 2; ++i {
    let imag = out.imagp[i]
    let real = out.realp[i]
    let magnitude = sqrt(pow(real, 2) + pow(imag, 2))
    spectrum.append(magnitude)
}
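As an aside, the same magnitudes can be computed without the loop using an Accelerate call; this is only a sketch, assuming out is the DSPSplitComplex holding the FFT output and n = bufferSize / 2 (Swift 2 syntax to match the code above):

import Accelerate

// vDSP_zvabs writes the complex magnitudes sqrt(realp[i]^2 + imagp[i]^2) in one call.
var spectrum = [Float](count: n, repeatedValue: 0)
vDSP_zvabs(&out, 1, &spectrum, 1, vDSP_Length(n))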
1: Hacky way: you can just cast the float array to a pointer to DSPComplex, since a DSPComplex is simply a pair of Floats (real and imaginary) laid out one after another.
2: It depends on whether the audio is interleaved or not. If it's interleaved (which it is in most cases), the left and right channels are in the same array with a stride of 2.
3: The lowest frequency in your case is the frequency of a period of 1024 samples. At a 44100 Hz sample rate that period is ~23 ms, so the lowest frequency of the spectrum is 1/(1024/44100), about 43 Hz. The next bin is twice that (~86 Hz), then ~129 Hz, and so on; the bins are spaced ~43 Hz apart (see the sketch after this list).
4: You have installed a callback handler on an audio bus. This is likely run frequently and with real-time thread priority, so you should not do anything that has the potential to block (it will likely result in priority inversion and glitchy audio). In particular, don't:
- Allocate memory (realp and imagp: [Float](...) is shorthand for Array<Float> and is likely allocated on the heap). Pre-allocate these.
- Call lengthy operations such as vDSP_create_fftsetup(), which also allocates memory and initialises it. Again, you can allocate this once, outside of your callback.
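Regarding point 3, a minimal sketch of the bin-index-to-frequency mapping, assuming a 1024-sample FFT at a 44100 Hz sample rate as in the discussion above:

let sampleRate = 44100.0
let fftSize = 1024

// Bin k of a real FFT of length fftSize corresponds to k * sampleRate / fftSize Hz.
let binFrequencies = (0..<fftSize / 2).map { Double($0) * sampleRate / Double(fftSize) }

print(binFrequencies[1])    // ~43 Hz
print(binFrequencies[2])    // ~86 Hz
print(binFrequencies[10])   // ~431 Hz, the bin nearest a 440 Hz tuning-fork tone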
I'm doing some GLSL fractals, and I'd like to make the calculations bail out if they're taking too long, to keep the frame rate up (without having to figure out what's good for each existing device and any future ones).
It would be nice if there were a timer I could check every 10 iterations or something....
Failing that, it seems the best approach might be to track how long it took to render the previous frame (or previous N frames) and change the "iterate to" number dynamically as a uniform...?
Or some other suggestion? :)
As it appears there's no good way to do this on the GPU, one can take a simple approach and "tune" the "bail after this number of iterations" threshold outside the shader, once per frame.
CFTimeInterval previousTimestamp = CFAbsoluteTimeGetCurrent();
// gl calls here
CFTimeInterval frameDuration = CFAbsoluteTimeGetCurrent() - previousTimestamp;

float msecs = frameDuration * 1000.0;
if (msecs < 0.2) {
    _dwell = MIN(_dwell + 16., 256.);
} else if (msecs > 0.4) {
    _dwell = MAX(_dwell - 4., 32.);
}
So my "dwell" is kept between 32 and 256, and more optimistically raised than decreased, and is pushed as a uniform in the "gl calls here" section.
I have seen some examples of CMTime (three separate links), but I still don't get it. I'm using an AVCaptureSession with AVCaptureVideoDataOutput and I want to set the max and min frame rate of the output. My problem is I just don't understand the CMTime struct.
Apparently CMTimeMake(value, timeScale) should give me value frames every 1/timeScale seconds for a total of value/timeScale seconds, or am I getting that wrong?
Why isn't this documented anywhere in order to explain what this does?
If it does truly work like that, how would I get it to have an indefinite number of frames?
If its really simple, I'm sorry, but nothing has clicked just yet.
A CMTime struct represents a length of time that is stored as a rational number (see CMTime Reference). CMTime has a value and a timescale field, and represents the time value/timescale seconds.
CMTimeMake is a function that returns a CMTime structure, for example:
CMTime t1 = CMTimeMake(1, 10); // 1/10 second = 0.1 second
CMTime t2 = CMTimeMake(2, 1); // 2 seconds
CMTime t3 = CMTimeMake(3, 4); // 3/4 second = 0.75 second
CMTime t4 = CMTimeMake(6, 8); // 6/8 second = 0.75 second
The last two time values t3 and t4 represent the same time value, therefore
CMTimeCompare(t3, t4) == 0
If you set the videoMinFrameDuration of an AVCaptureConnection, it does not make a difference whether you set
connection.videoMinFrameDuration = CMTimeMake(1, 20); // or
connection.videoMinFrameDuration = CMTimeMake(2, 40);
In both cases the minimum time interval between frames is set to 1/20 = 0.05 seconds.
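To make the equivalence concrete, here is a minimal Swift sketch of just the CMTime arithmetic (no capture session involved):

import CoreMedia

// Two CMTimes with different timescales but the same rational value: 1/20 s = 2/40 s.
let a = CMTime(value: 1, timescale: 20)
let b = CMTime(value: 2, timescale: 40)

print(CMTimeCompare(a, b))   // 0, i.e. they compare equal
print(CMTimeGetSeconds(a))   // 0.05, a minimum frame interval of 1/20 s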
My experience differs.
For let testTime = CMTime(seconds: 3.83, preferredTimescale: 100)
If you set a breakpoint and look in the debugger side window it says:
"383 100ths of a second"
Testing by seeking to a fixed offset in a video in AVPlayer has confirmed this.
So put the actual number of seconds in the seconds field, and the precision in the preferredTimescale field. So 100 means precision of hundredths of a second.
Doing
let testTime = CMTime(seconds: 3.83, preferredTimescale: 1000)
Still seeks to the same place in the video, but it displays in the debugger side window as "3833 1000ths of a second"
Doing
let testTime = CMTime(seconds: 3.83, preferredTimescale: 1)
Does not seek to the same place in the video, because it's been truncated, and it displays in the debugger side window as "3 seconds". Notice that the .83 part has been lost due to the preferredTimescale.
In short: CMTimeMake(value, timescale) represents value/timescale seconds, while CMTime(seconds:preferredTimescale:) takes the number of seconds directly and uses preferredTimescale only as the precision with which to store it.
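A short sketch illustrating the difference in precision (mirroring the debugger observations above; the exact stored values can differ slightly depending on rounding):

import CoreMedia

let precise = CMTime(seconds: 3.83, preferredTimescale: 1000)
let coarse = CMTime(seconds: 3.83, preferredTimescale: 1)

// preferredTimescale controls how finely the seconds value can be stored.
print(precise.value, precise.timescale)   // roughly 3830 and 1000, still 3.83 s
print(CMTimeGetSeconds(precise))          // 3.83
print(CMTimeGetSeconds(coarse))           // a whole number of seconds; the .83 is lost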