I'm using CMTime with AVAsset for a video clip. To trim the video without saving a new video file, I just want to keep track of the start time and the duration.
CMTimeGetSeconds() returns a Float64; what would be the best way to store this in Core Data?
I can't use an NSNumber, as the float type rounds the Float64 way too much: 1.200000 becomes 1.0000 when I create my NSNumber.
Thanks in advance
Based on your comments, it is highly likely that the videoTrack object adjusts the duration to a nice round number that makes sense for video playback. Try creating an NSNumber and printing it without setting it to the duration property, and you will probably get the exact same result. Also make sure the attribute's data type is set to Double in the Core Data model.
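For example, a minimal sketch of storing the trim points as plain Doubles (assuming a hypothetical NSManagedObject subclass with Double attributes startSeconds and durationSeconds; those names are mine, not from the question):

```swift
import AVFoundation

// Hypothetical Core Data entity: Clip { startSeconds: Double, durationSeconds: Double }
let start = CMTime(value: 6, timescale: 5)           // 1.2 s
clip.startSeconds = CMTimeGetSeconds(start)          // Float64 is a typealias for Double,
clip.durationSeconds = CMTimeGetSeconds(duration)    // so nothing is rounded on the way in
```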
I am trying to implement an SLM app for iOS using AudioKit. Therefore I need to determine different loudness values to a) display the current loudness (averaged over a second) and b) do further calculations (e.g. to calculate the "Equivalent Continuous Sound Level" over a longer time span). The app should be able to track frequency-weighted decibel values like dB(A) and dB(C).
I do understand that some of the issues I'm facing are related to my general lack of understanding in the field of signal and audio processing. My question is how one would approach this task with AudioKit. I will describe my current process and would like to get some input:
Create an instance of AKMicrophone and a AKFrequencyTracker on this microphone
Create a Timer instance with some interval (currently 1/48_000.0)
Inside the timer: retrieve the amplitude and frequency. Calculate a decibel value from the amplitude with 20 * log10(amplitude) + calibrationOffset (calibration offset will be determined per device model with the help of a professional SLM). Calculate offsets for the retrieved frequency according to frequency-weighting (A and C) and apply these to the initial dB value. Store dB, dB(A) and dB(C) values in an array.
Calculate the average of each array over the given timeframe (1 second).
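Step 3 above can be sketched like this. The calibration offset and the idea of applying a single-frequency weighting correction follow the question's description; the A-weighting curve itself is the standard IEC 61672 formula:

```swift
import Foundation

// Amplitude → dB with a per-device calibration offset (offset value is an assumption).
func decibels(amplitude: Double, calibrationOffset: Double) -> Double {
    return 20 * log10(amplitude) + calibrationOffset
}

// Standard A-weighting correction in dB for a frequency in Hz (IEC 61672).
func aWeighting(frequency f: Double) -> Double {
    let f2 = f * f
    let numerator = pow(12194.0, 2) * f2 * f2
    let denominator = (f2 + pow(20.6, 2))
        * sqrt((f2 + pow(107.7, 2)) * (f2 + pow(737.9, 2)))
        * (f2 + pow(12194.0, 2))
    return 20 * log10(numerator / denominator) + 2.0
}

let dB  = decibels(amplitude: 0.05, calibrationOffset: 100)
let dBA = dB + aWeighting(frequency: 1000)   // weight at 1 kHz is ≈ 0 dB by definition
```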
I read somewhere else that using a Timer is not the best approach for this. What else could I use for the "sampling"? What exactly is the sampling frequency of AKFrequencyTracker? Will this frequency be sufficient to determine dB(A) and dB(C) values, or will I need an AKFFTTap for this? How are the values retrieved from AKFrequencyTracker averaged, i.e. what time frame is used for the RMS?
Possibly related questions: Get dB(a) level from AudioKit in swift, AudioKit FFT conversion to dB?
Let's say I have an AVAudioFile with a duration of 10 seconds. I want to load that file into an AVAudioPCMBuffer but I only want to load the audio frames that come after a certain number of seconds/milliseconds or after a certain AVAudioFramePosition.
It doesn't look like AVAudioFile's readIntoBuffer methods give me that kind of precision so I'm assuming I'll have to work at the AVAudioBuffer level or lower?
You just need to set the AVAudioFile's framePosition property before reading.
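A minimal sketch, assuming `url` points at your audio file and `startSeconds` is the point you want to start reading from:

```swift
import AVFoundation

let file = try AVAudioFile(forReading: url)
let startSeconds = 2.5
let startFrame = AVAudioFramePosition(startSeconds * file.processingFormat.sampleRate)
let framesToRead = AVAudioFrameCount(file.length - startFrame)

let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                              frameCapacity: framesToRead)!
file.framePosition = startFrame               // seek before reading
try file.read(into: buffer, frameCount: framesToRead)
```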
I am using the MIKMIDI framework and this is using the AudioToolbox type MusicTimeStamp
How can I convert this timestamp to milliseconds?
MusicTimeStamp is a raw beat count; you need to know the tempo (and the tempo map, since tempo isn't invariant) of the music you're working with in order to convert it into milliseconds.
Outside of a MusicSequence, a MusicTimeStamp can't be mapped to a wall time.
Edit: A CoreMedia CMTime can be converted to wall times if that helps.
There's new API for this in MIKMIDI. It's in a branch (1.8) as I write this, but should be merged soon, and released in the 1.8 release. It makes it much easier to do the conversion you're asking about.
In the context of a sequence, do:
let seconds = sequence.timeInSeconds(forMusicTimeStamp: musicTimeStamp)
There's also a method to convert in the opposite direction. MIKMIDISequencer has very similar, but more sophisticated (to account for looping, tempo override, etc.) methods to do the same kinds of conversions.
Without this new API in MIKMIDI, you can still use MusicSequenceGetSecondsForBeats(). You can get the underlying MusicSequence for an MIKMIDISequence using its musicSequence property:
let sequence = mikmidiSequence.musicSequence  // the underlying MusicSequence
var timeInSeconds = Float64(0)
MusicSequenceGetSecondsForBeats(sequence, musicTimeStamp, &timeInSeconds)
As far as I know this doesn't take into account looping even if you're doing it with the MusicPlayer API, and certainly not an overridden tempo if one is set on MIKMIDISequencer, so you should prefer MIKMIDI's API above if possible.
Using an MPMusicPlayerController, is there a way to query the track's length?
Using .currentPlaybackTime I can figure out how far I am into the track, but I have no way of knowing how long the track is.
My [MPMediaItem valueForProperty:MPMediaItemPropertyAssetURL] returns null for some tracks, so using an AVAsset won't work for me.
Easy, just use:
NSNumber *duration=[item valueForProperty:MPMediaItemPropertyPlaybackDuration];
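In Swift the same MPMediaItem property is exposed directly (assuming `item` is your MPMediaItem):

```swift
import MediaPlayer

let duration = item.playbackDuration   // TimeInterval, in seconds
```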
Given an audio file with a duration of, say, 10 s, how do I find out the number of samples between 2 s and 8 s?
If it's LPCM (i.e. not compressed), then use the sample rate.
in pseudocode:
double sampleRate = audioFile.getSampleRate();
// you may also need to account for channel count here
size_t sampleCount = sampleRate * (8-2);
-- where (8-2) represents "between 2s and 8s"
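The pseudocode above translates to Swift with AVAudioFile roughly like this (a sketch; `url` is assumed):

```swift
import AVFoundation

let file = try AVAudioFile(forReading: url)
let sampleRate = file.processingFormat.sampleRate
let framesBetween2sAnd8s = AVAudioFramePosition(sampleRate * (8.0 - 2.0))
// This is per channel; multiply by Int(file.processingFormat.channelCount)
// if you want the total interleaved sample count.
```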
You can use the duration property of the audio player for that purpose.
The duration property gives the duration of the audio player's current item (i.e. the file that is currently being played).
So you can supply the audio files to the player one by one, check their durations, and perform the required operations.
Cheers
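A sketch with AVAudioPlayer (`url` is assumed to point at the file in question):

```swift
import AVFoundation

let player = try AVAudioPlayer(contentsOf: url)
let seconds = player.duration   // TimeInterval, in seconds
```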