Writing and reading data to an NSMutableArray at the same time - iOS

I am trying to make a progressive-download audio player that stores as much of the audio as possible while playing it.
The format of the audio is stream-optimized m4a.
My idea was to keep the audio packets the streamer receives in memory rather than saving them to a file, in order to keep things fast.
And by the nature of m4a files I can't write and read the file on disk at the same time anyway...
So I stream and parse audio packets from a remote source, then put them into a singleton NSMutableArray...
While the streamer downloads audio packets, the player reads and plays audio packets from the NSMutableArray at the same time...
An average file has around 11,000 audio packets, so the array's count reaches about 11,000.
NSMutableDictionary *myDict = [[NSMutableDictionary alloc] init];
NSData *inputData = [NSData dataWithBytes:inInputData length:inPacketDescriptions[i].mDataByteSize];
[myDict setObject:inputData forKey:@"inInputData"];
NSNumber *numberBytes = [NSNumber numberWithInt:inNumberBytes];
[myDict setObject:numberBytes forKey:@"inNumberBytes"];
NSNumber *numberPackets = [NSNumber numberWithInt:inNumberPackets];
[myDict setObject:numberPackets forKey:@"inNumberPackets"];
NSNumber *mStartOffset = [NSNumber numberWithInt:inPacketDescriptions[i].mStartOffset];
NSNumber *mDataByteSize = [NSNumber numberWithInt:inPacketDescriptions[i].mDataByteSize];
NSNumber *mVariableFramesInPacket = [NSNumber numberWithInt:inPacketDescriptions[i].mVariableFramesInPacket];
[myDict setObject:mStartOffset forKey:@"mStartOffset"];
[myDict setObject:mDataByteSize forKey:@"mDataByteSize"];
[myDict setObject:mVariableFramesInPacket forKey:@"mVariableFramesInPacket"];
[sharedCache.baseAudioCache addObject:myDict];
My question is: will I encounter deadlocks at some point?
Is this a good practice for audio streaming?

I would really recommend switching to an immutable NSArray once you've finished building the NSMutableArray.
You can also use @synchronized to lock the NSMutableArray:
@synchronized(yourMutableArray) {
[yourMutableArray stuffMethod];
}
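For example, both the streamer (producer) and the player (consumer) could guard every access with the same lock. This is just a minimal sketch, not from the original answer; it assumes the singleton's array is called baseAudioCache as in the question, and nextPacketIndex is a hypothetical counter owned by the player:
// Producer side (streamer callback): append the parsed packet under the lock
@synchronized(sharedCache.baseAudioCache) {
    [sharedCache.baseAudioCache addObject:myDict];
}
// Consumer side (player): read the next packet under the same lock
NSDictionary *packet = nil;
@synchronized(sharedCache.baseAudioCache) {
    if (nextPacketIndex < [sharedCache.baseAudioCache count]) {
        packet = [sharedCache.baseAudioCache objectAtIndex:nextPacketIndex];
        nextPacketIndex++;
    }
}
A serial dispatch queue that owns every read and write of the array is another common way to get the same guarantee.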

Related

Displaying AVAudio player duration and elapsed time on iPad lock screen (background)

So far this is my attempt to have the lock screen display how much time has elapsed in the audio MP3 file and how long the audio MP3 file is in total...
Here is my array of objects:
NSArray *madMoneyArray = [NSArray arrayWithObjects:[NSString stringWithFormat:@"Episode %d", a],
@"Jim Cramer",
@"Mad Money Podcast",
madMoneyArtwork,
[NSNumber numberWithFloat:(float)_audioPlayer.currentPlaybackTime],
[NSNumber numberWithDouble:(double)_audioPlayer.duration],
[NSNumber numberWithDouble:(double)1.0],
@"MadMoneyPodcast.png",
madMoneyURL, nil];
Here is my array of keys:
NSArray *array = [NSArray arrayWithObjects:MPMediaItemPropertyTitle,
MPMediaItemPropertyArtist,
MPMediaItemPropertyAlbumTitle,
MPMediaItemPropertyArtwork,
MPNowPlayingInfoPropertyElapsedPlaybackTime,
MPMediaItemPropertyPlaybackDuration,
MPNowPlayingInfoPropertyPlaybackRate, nil];
I successfully put the objects and their keys into the MPNowPlayingInfoCenter defaultCenter with this:
songInfo = [NSDictionary dictionaryWithObjects:[[_podcastArray objectAtIndex:(value - 1)] subarrayWithRange:NSMakeRange(0, imageIndex)] forKeys:_keys];
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:songInfo];
_podcastArray is where I hold several arrays with these objects and keys. Now when I go to the lock screen, everything I want to be shown is shown (i.e. title, artist name, album name, album artwork). However, the one thing I can't get to show up is the stupid elapsed time and duration.
I have done a lot of research and implemented what I thought was the right thing to do, but apparently it isn't right/working.
I could really use some help here, thanks.
Try replacing
[NSNumber numberWithFloat:(float)_audioPlayer.currentPlaybackTime]
with
[NSNumber numberWithDouble:_audioPlayer.currentPlaybackTime]
Also check that _audioPlayer.duration is not 0.
Your madMoneyArray has some extra values; replace it with this:
NSArray *madMoneyArray = [NSArray arrayWithObjects:[NSString stringWithFormat:@"Episode %d", a],
@"Jim Cramer",
@"Mad Money Podcast",
madMoneyArtwork,
[NSNumber numberWithDouble:_audioPlayer.currentPlaybackTime],
[NSNumber numberWithDouble:(double)_audioPlayer.duration],
[NSNumber numberWithInt:1],
nil];
Try this, hope it helps!
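One more thing that may be worth trying (my own sketch, not part of the original answers): refresh the time-related keys and set the info again after playback has actually started, since _audioPlayer.duration can still be 0 at the moment the dictionary is first built:
NSMutableDictionary *nowPlaying = [NSMutableDictionary dictionaryWithDictionary:songInfo];
// Refresh the time-related keys once the player reports real values
[nowPlaying setObject:[NSNumber numberWithDouble:_audioPlayer.currentPlaybackTime] forKey:MPNowPlayingInfoPropertyElapsedPlaybackTime];
[nowPlaying setObject:[NSNumber numberWithDouble:_audioPlayer.duration] forKey:MPMediaItemPropertyPlaybackDuration];
[nowPlaying setObject:[NSNumber numberWithDouble:1.0] forKey:MPNowPlayingInfoPropertyPlaybackRate];
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:nowPlaying];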

How to detect BPM of audio file in iOS app

I tried to find the BPM using the AVFoundation framework, but I'm getting 0 as a result and am not able to get the BPM.
Here is my code,
MPMediaItem *mediaItem = [[collection items] objectAtIndex:0];
NSString *albumIDKey = [MPMediaItem persistentIDPropertyForGroupingType:MPMediaGroupingAlbum];
NSLog(@"mpmediaitem: %@", albumIDKey);
int BPM = [[mediaItem valueForProperty:MPMediaItemPropertyBeatsPerMinute] intValue];
NSString *bpm = [mediaItem valueForProperty:MPMediaItemPropertyBeatsPerMinute];
NSLog(@"bpm: %@", bpm);
NSURL *url = [mediaItem valueForProperty:MPMediaItemPropertyAssetURL];
Am I missing anything here?
The BPM is read from the metadata accompanying the audio file, which often is not present. It is not calculated from the audio.
Also be aware that any BPM metadata that does exist is based on the assumption that a track has a constant tempo, which is not always a safe assumption.
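So the first thing to check is whether the property is there at all; a nil/zero check like this is a reasonable guard (my sketch, not from the original answer):
NSNumber *bpmValue = [mediaItem valueForProperty:MPMediaItemPropertyBeatsPerMinute];
if (bpmValue == nil || [bpmValue intValue] == 0) {
    // The library entry simply has no BPM metadata; there is nothing to extract.
    NSLog(@"No BPM metadata for this item");
} else {
    NSLog(@"BPM from metadata: %d", [bpmValue intValue]);
}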
Quality audio-metadata can be obtained from The Echonest

Extracting unknown data from plist

I'm writing an iOS loader that loads data from a plist intending to send vertex data, etc. to the GPU via OpenGL. I can easily extract objects of standard types, like strings, integers, etc.
Where I get stumped is when I encounter what appears to be raw data as a dictionary object. The plist is a native file saved by my 3D modeling software, of which I'm not the author, so I don't know how the data was written into this object.
Some things I DO know about the object: it's likely an array of floats, each vertex needs a float value for X, Y, and Z, and there are 26 vertices in the example below.
Here's the actual data object in the plist file:
<key>vertex</key>
<data>
AAAAAL8AAAAAAAAAAAAAAAAAAAC/AAAAPwAAAAAAAAA+gAAAvwAA
AD7ds9cAAAAAPt2z2L8AAAA+f///AAAAAD8AAAC/AAAAsru9LgAA
AAA+3bPXvwAAAL6AAAEAAAAAPoAAAb8AAAC+3bPXAAAAALM7vS6/
AAAAvwAAAAAAAAC+gAADvwAAAL7ds9UAAAAAvt2z2L8AAAC+f//9
AAAAAL8AAAC/AAAAMczeLgAAAAC+3bPYvwAAAD5///0AAAAAvn//
+L8AAAA+3bPaAAAAAD6AAAA/AAAAPt2z1wAAAAAAAAAAPwAAAD8A
AAAAAAAAAAAAAD8AAAAAAAAAAAAAAD7ds9g/AAAAPn///wAAAAA/
AAAAPwAAALK7vS4AAAAAPt2z1z8AAAC+gAABAAAAAD6AAAE/AAAA
vt2z1wAAAACzO70uPwAAAL8AAAAAAAAAvoAAAz8AAAC+3bPVAAAA
AL7ds9g/AAAAvn///QAAAAC/AAAAPwAAADHM3i4AAAAAvt2z2D8A
AAA+f//9AAAAAL5///g/AAAAPt2z2gAAAAA=
</data>
Any ideas about how to read this? Here's where I am:
// get plist
NSString *path = [[NSBundle mainBundle] pathForResource:@"Cylinder" ofType:@"jas"];
NSDictionary *cheetahFile = [NSDictionary dictionaryWithContentsOfFile:path];
NSArray *objectArray = [cheetahFile objectForKey:@"Objects"];
NSDictionary *model = [objectArray objectAtIndex:1];
// get vertex count
GLshort vertCount = [[model valueForKey:@"vertexcount"] intValue];
// All good so far... but...
// get vertex data? ... this doesn't work:
NSMutableArray *vertArray = [NSMutableArray arrayWithObject:[model objectForKey:@"vertex"]];
P.S. Sorry in advance if I'm making a rookie mistake. I'm a designer by profession, not a programmer. So talk slow using soothing tones while I eat my crayons. :)
The <data> part is an encoded NSData object. You can do this:
NSData *vertexData = model[@"vertex"];
What you do with that data is a whole other discussion.
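That said, if the blob turns out to be a tightly packed array of 32-bit big-endian floats, three per vertex (X, Y, Z) -- a guess you would have to verify against the modeler's file format -- you could unpack it roughly like this:
NSData *vertexData = model[@"vertex"];
NSUInteger floatCount = vertexData.length / sizeof(uint32_t);   // with 26 vertices, expect a small multiple of 26
const uint32_t *rawWords = (const uint32_t *)vertexData.bytes;
float *vertices = malloc(floatCount * sizeof(float));
for (NSUInteger i = 0; i < floatCount; i++) {
    uint32_t hostWord = CFSwapInt32BigToHost(rawWords[i]);      // assumes big-endian storage; drop the swap if the floats come out as garbage
    memcpy(&vertices[i], &hostWord, sizeof(float));
}
// vertices[0], vertices[1], vertices[2] would then be X, Y, Z of the first vertex, and so on.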

Save game level in my app even when the app is closed?

I am making an RPG game for the iPhone and everything is working out great, but I need to know how to save my game level so that even if the user were to close the app running in the background, the entire game wouldn't start over again. I was even thinking of bringing back old-style gaming and making it so that you have to enter a password to start from where you left off. But even then I wouldn't know how to save the game properly. Plus, even if I did save the game, how would I be able to make it stay saved even when the app closes completely? So far I have tried adding save code to applicationWillTerminate but still nothing. Any help is appreciated.
I'm not sure if you want to save which level the user was on, or if you want to save the game state. If you simply want to save which level the user was on, you should go with @EricS's method (NSUserDefaults). It's a little more complicated to save game state. I would do something like this:
//Writing game state to file
//Some sample data
int lives = player.kLives;
int enemiesKilled = player.kEnemiesKilled;
int ammo = player.currentAmmo;
//Storing the sample data in an array
NSArray *gameState = [[NSArray alloc] initWithObjects: [NSNumber numberWithInt:lives], [NSNumber numberWithInt:enemiesKilled], [NSNumber numberWithInt:ammo], nil];
//Writing the array to a .plist file located at "path"
if([gameState writeToFile:path atomically:YES]) {
NSLog(@"Success!");
}
//Reading from file
//Reads the array stored in a .plist located at "path"
NSArray *lastGameState = [NSArray arrayWithContentsOfFile:path];
The resulting .plist is simply an ordered array of those three numbers.
Using an array would mean that upon reloading the game state, you would have to know the order that you stored the items in, which isn't that bad, but if you want a more reliable method, you could try using an NSDictionary like this:
//Writing game state to file
//Some sample data
int lives = player.kLives;
int enemiesKilled = player.kEnemiesKilled;
int ammo = player.currentAmmo;
int points = player.currentPoints;
//Store the sample data objects in an array
NSArray *gameStateObjects = [NSArray arrayWithObjects:[NSNumber numberWithInt:lives], [NSNumber numberWithInt:enemiesKilled], [NSNumber numberWithInt:points], [NSNumber numberWithInt:ammo], nil];
//Store their keys in a separate array
NSArray *gameStateKeys = [NSArray arrayWithObjects:@"lives", @"enemiesKilled", @"points", @"ammo", nil];
//Storing the objects and keys in a dictionary
NSDictionary *gameStateDict = [NSDictionary dictionaryWithObjects:gameStateObjects forKeys:gameStateKeys];
//Write to file
[gameStateDict writeToFile:path atomically: YES];
//Reading from file
//Reads the array stored in a .plist located at "path"
NSDictionary *lastGameState = [NSDictionary dictionaryWithContentsOfFile:path];
The resulting dictionary .plist stores each value under its key, so you don't have to remember the order.
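Restoring is then just a matter of reading the values back by key. A small sketch, assuming the same player properties used above are writable:
NSDictionary *lastGameState = [NSDictionary dictionaryWithContentsOfFile:path];
player.kLives = [[lastGameState objectForKey:@"lives"] intValue];
player.kEnemiesKilled = [[lastGameState objectForKey:@"enemiesKilled"] intValue];
player.currentPoints = [[lastGameState objectForKey:@"points"] intValue];
player.currentAmmo = [[lastGameState objectForKey:@"ammo"] intValue];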
To save the level:
[[NSUserDefaults standardUserDefaults] setInteger:5 forKey:@"level"];
To read the level:
NSInteger level = [[NSUserDefaults standardUserDefaults] integerForKey:@"level"];
I would set it whenever the user enters that level. You could wait until you are sent into the background, but there's really no point in waiting.

Is it possible to use AVAssetReader to get back a stereo channel layout?

I'd like to be able to get back AudioBufferList from AVAssetReader which has 2 buffers so that I can process the left and right audio through an AudioUnit. I tried using the output settings below but it will not read as long as I specify the stereo layout set by kAudioChannelLayoutTag_Stereo.
Is it possible for AVAssetReader to return a non-interleaved result?
If not, how would I convert it to a non-interleaved AudioBufferList? I have tried to use Audio Converter Services, but I cannot get it to accept either the input or output values for the AudioStreamBasicDescription (ASBD). If I cannot get the data in the format I want from AVAssetReader, I would at least like to be able to convert it to the format I need.
Any tips are appreciated.
- (NSDictionary *) getOutputSettings {
AudioChannelLayout channelLayout;
memset(&channelLayout, 0, sizeof(AudioChannelLayout));
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
[NSNumber numberWithFloat:44100.0], AVSampleRateKey,
[NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
[NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
[NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
nil];
return outputSettings;
}
I think that kAudioChannelLayoutTag_Stereo is requesting interleaved samples, so I'd lose it.
It all depends on what kind of AVAssetReaderOutput you're creating with those output settings. AVAssetReaderTrackOutput does no conversion beyond decoding to LPCM, but AVAssetReaderAudioMixOutput accepts a bunch more format keys; in fact, it probably IS an AVAssetReaderTrackOutput plus an AudioConverter.
I've learned that I can have AVAssetReader return results with the default output settings (nil), which gives me an interleaved result of float values. The buffer alternates left and right samples. I am able to work with these values, which are in the range of -1.0 to 1.0, but in order to play the audio it is necessary to scale them up to the range of a signed short, so I multiply them by SHRT_MAX and clamp them between SHRT_MIN and SHRT_MAX so the audio plays as expected.
Since the interleaved buffer carries the L and R values together, it is treated as 2 channels on 1 buffer, which is reflected in the AudioBufferList. Previously I was able to get back 2 buffers with 1 channel per buffer, but that is not really necessary now that I understand the very simple interleaved format.
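For reference, the scale-and-deinterleave step described above could look roughly like this (a sketch under the same assumptions: interleaved stereo floats in the -1.0 to 1.0 range):
#include <stddef.h>
#include <limits.h>
#include <math.h>
// Splits an interleaved L/R float buffer into two 16-bit channels,
// scaling by SHRT_MAX and clamping to the signed short range.
static void DeinterleaveToShorts(const float *interleaved, size_t frameCount, short *left, short *right) {
    for (size_t i = 0; i < frameCount; i++) {
        float l = interleaved[2 * i]     * SHRT_MAX;
        float r = interleaved[2 * i + 1] * SHRT_MAX;
        left[i]  = (short)fmaxf(SHRT_MIN, fminf(SHRT_MAX, l));
        right[i] = (short)fmaxf(SHRT_MIN, fminf(SHRT_MAX, r));
    }
}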
