Using an MPMusicPlayerController, is there a way to query the track's length?
Using .currentPlaybackTime I can figure out how far I am into the track, but I have no way of knowing how long the track is.
My [MPMediaItem valueForProperty:MPMediaItemPropertyAssetURL] returns nil for some tracks, so using an AVAsset won't work for me.
Easy, just use:
NSNumber *duration=[item valueForProperty:MPMediaItemPropertyPlaybackDuration];
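In Swift, a minimal sketch along the same lines (assuming the system music player; any MPMusicPlayerController works):
import MediaPlayer

// Minimal sketch: read the length of the now-playing track.
let player = MPMusicPlayerController.systemMusicPlayer
if let item = player.nowPlayingItem {
    let duration = item.playbackDuration              // TimeInterval, in seconds
    let remaining = duration - player.currentPlaybackTime
    print("Track length: \(duration)s, remaining: \(remaining)s")
}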
Let's say I have an AVAudioFile with a duration of 10 seconds. I want to load that file into an AVAudioPCMBuffer but I only want to load the audio frames that come after a certain number of seconds/milliseconds or after a certain AVAudioFramePosition.
It doesn't look like AVAudioFile's readIntoBuffer methods give me that kind of precision, so I'm assuming I'll have to work at the AVAudioBuffer level or lower?
You just need to set the AVAudioFile's framePosition property before reading.
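For example, a minimal Swift sketch (fileURL and the start offset are placeholders for your own values):
import AVFoundation

// Minimal sketch: read only the frames after a given start time.
let file = try AVAudioFile(forReading: fileURL)
let startSeconds = 2.5
let startFrame = AVAudioFramePosition(startSeconds * file.processingFormat.sampleRate)
let frameCount = AVAudioFrameCount(file.length - startFrame)

if let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                 frameCapacity: frameCount) {
    file.framePosition = startFrame    // seek before reading
    try file.read(into: buffer)        // fills from the current framePosition
}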
I need to send audio data in real time in PCM format: 8 kHz, 16-bit, mono.
The audio must be sent as an array of chars with a length:
(<#char *data#>, <#int len#>).
I'm a beginner in audio processing and can't really understand how to accomplish that. My best attempt was to convert to the iLBC format, but that didn't work. Is there any sample showing how to record audio and convert it to another format? I have already read Learning Core Audio by Chris Adamson and Kevin Avila, but I didn't find a solution that works.
Simply, what I need is:
(record)->(convert?)-> send(char *data, int length);
Because I need to send the data as arrays of chars, I can't use a player.
EDIT:
I managed to make everything work with recording and with reading buffers. What I can't manage is:
// ref is an array of AudioQueueBufferRef; forward each filled buffer's raw bytes
if (ref[i]->mAudioDataByteSize != 0) {
    char *data = (char *)ref[i]->mAudioData;
    sendData(mHandle, data, ref[i]->mAudioDataByteSize);
}
This is not really a beginner task. The solutions are to use the RemoteIO Audio Unit, the Audio Queue API, or an AVAudioEngine installTap(onBus:) block. These give you near-real-time buffers of audio samples (Int16s, Floats, etc.), depending on the buffer size, that you can convert, compress, and pack into other data types or arrays, usually via a callback function or block that you provide to do whatever you want with the incoming recorded sample buffers.
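As an illustration only, here is a minimal Swift sketch of the AVAudioEngine tap approach, resampling to 8 kHz / 16-bit / mono with AVAudioConverter; sendData and mHandle stand in for the question's own transport:
import AVFoundation

// Minimal sketch, not production code: tap the microphone, convert each
// buffer to 8 kHz / 16-bit / mono PCM, and hand the raw bytes onward.
// (A real app also needs AVAudioSession configuration and mic permission.)
let engine = AVAudioEngine()
let input = engine.inputNode
let inputFormat = input.outputFormat(forBus: 0)

let targetFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                 sampleRate: 8000,
                                 channels: 1,
                                 interleaved: true)!
let converter = AVAudioConverter(from: inputFormat, to: targetFormat)!

input.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { buffer, _ in
    let ratio = targetFormat.sampleRate / inputFormat.sampleRate
    let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio) + 1
    guard let converted = AVAudioPCMBuffer(pcmFormat: targetFormat,
                                           frameCapacity: capacity) else { return }
    var consumed = false
    var error: NSError?
    _ = converter.convert(to: converted, error: &error) { _, outStatus in
        if consumed {                      // feed the tap buffer exactly once
            outStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return buffer
    }
    if let samples = converted.int16ChannelData?[0] {
        let byteCount = Int(converted.frameLength) * MemoryLayout<Int16>.size
        // sendData(mHandle, samples, byteCount)   // your char*/length transport
    }
}

try engine.start()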
I am using the MIKMIDI framework, which uses the AudioToolbox type MusicTimeStamp.
How can I convert this timestamp to milliseconds?
A MusicTimeStamp is a raw beat count; you need to know the tempo (and the tempo map, since tempo isn't invariant) of the music you're working with in order to convert it into milliseconds.
Outside of a MusicSequence, a MusicTimeStamp can't be mapped to a wall time.
Edit: A CoreMedia CMTime can be converted to wall times, if that helps.
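For instance, a tiny sketch of that CMTime aside:
import CoreMedia

// CMTime is value/timescale in seconds, so it maps straight to wall-clock time.
let t = CMTime(value: 3, timescale: 2)   // 3/2 = 1.5 seconds
let seconds = CMTimeGetSeconds(t)        // 1.5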
There's new API for this in MIKMIDI. It's in a branch (1.8) as I write this, but it should be merged soon and included in the 1.8 release. It makes the conversion you're asking about much easier.
In the context of a sequence, do:
let seconds = sequence.timeInSeconds(forMusicTimeStamp: musicTimeStamp)
There's also a method to convert in the opposite direction. MIKMIDISequencer has very similar, but more sophisticated (to account for looping, tempo override, etc.) methods to do the same kinds of conversions.
Without this new API in MIKMIDI, you can still use MusicSequenceGetSecondsForBeats(). You can get the underlying MusicSequence for an MIKMIDISequence using its musicSequence property:
var timeInSeconds = Float64(0)
// sequence.musicSequence is the underlying MusicSequence of the MIKMIDISequence
MusicSequenceGetSecondsForBeats(sequence.musicSequence, musicTimeStamp, &timeInSeconds)
As far as I know, this doesn't take looping into account even if you're using the MusicPlayer API, and it certainly ignores an overridden tempo if one is set on MIKMIDISequencer, so prefer MIKMIDI's API above if possible.
I'm trying to use the currentPlaybackRate property on MPMusicPlayerController to adjust the tempo of a music track as it plays. The property works as expected when the rate is less than 0.90 or greater than 1.13, but for the range just above and below 1, there seems to be no change in tempo. Here's what I'm trying:
UIAppDelegate.musicPlayer = [MPMusicPlayerController iPodMusicPlayer];
// ... load the music player with a track from the library ...
[UIAppDelegate.musicPlayer play];

- (void)speedUp {
    UIAppDelegate.musicPlayer.currentPlaybackRate += 0.03125;
}

- (void)speedDown {
    UIAppDelegate.musicPlayer.currentPlaybackRate -= 0.03125;
}
I can monitor the value of currentPlaybackRate and see that it's being set correctly, but there seems to be no difference in playback tempo until the 0.90 or 1.13 threshold has been crossed. Does anyone have any guidance or experience on the matter?
I'm no expert, but I suspect this phenomenon may be merely an artefact of the algorithm used to change the playback speed without raising or lowering the pitch. It's a tricky business, and here it must be done in real time without much distortion, so probably an integral multiple of the tempo is needed. You might want to read the Wikipedia article on time stretching: http://en.wikipedia.org/wiki/Audio_timescale-pitch_modification
Actually, I've found the problem: the statement myMusicPlayer.currentPlaybackRate = 1.2 must be placed after the call to .play(). If you set the rate before .play(), it has no effect.
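In other words (a minimal Swift sketch; myMusicPlayer stands in for your own player):
myMusicPlayer.play()
myMusicPlayer.currentPlaybackRate = 1.2   // works here; setting it before play() is ignored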
I'm using CMTime with AVAssets for a video clip. To trim the video without saving a new video file, I just want to keep track of the start time and the duration.
CMTimeGetSeconds() returns a Float64; what would be the best way to store this in Core Data?
I can't use an NSNumber, as the float type rounds the Float64 far too much: 1.200000 becomes 1.0000 when I create my NSNumber.
Thanks in advance.
Based on your comments, it is highly likely that the videoTrack object adjusts the duration to a nice round number that makes sense for video playback. Try creating an NSNumber and printing it without setting it to the duration property, and you will probably get the exact same result. Also make sure the attribute's data type is set to Double in the Core Data model.
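To confirm that NSNumber itself isn't the culprit, a quick Swift sketch:
import Foundation
import CoreMedia

// Float64 survives the NSNumber round-trip intact, so any rounding is
// happening upstream (e.g. in the duration value itself).
let seconds: Float64 = CMTimeGetSeconds(CMTime(value: 12, timescale: 10))  // 1.2
let boxed = NSNumber(value: seconds)
print(boxed.doubleValue)   // 1.2, no precision lost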