How to detect the BPM of an audio file in an iOS app

I tried to find the BPM using the AVFoundation framework, but I get 0 as a result and am not able to get the BPM.
Here is my code:
MPMediaItem *mediaItem = [[collection items] objectAtIndex:0];
NSString *albumIDKey = [MPMediaItem persistentIDPropertyForGroupingType:MPMediaGroupingAlbum];
NSLog(@"mpmediaitem:%@", albumIDKey);
int BPM = [[mediaItem valueForProperty:MPMediaItemPropertyBeatsPerMinute] intValue];
NSString *bpm = [mediaItem valueForProperty:MPMediaItemPropertyBeatsPerMinute];
NSLog(@"bpm:%@", bpm);
NSURL *url = [mediaItem valueForProperty:MPMediaItemPropertyAssetURL];
Am I missing anything here?

The BPM is read from the metadata accompanying the audio file, which is often simply not present. It is not calculated from the audio itself.
Also be aware that any BPM metadata that does exist is built on the assumption that a track has a constant tempo, which is not always a safe assumption.
Quality audio metadata can be obtained from The Echonest.
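For what it's worth, here is a minimal sketch of the defensive check this implies. It assumes the same mediaItem as in the question; treating 0 as "no tag" is my assumption, since the property simply reflects whatever the file was tagged with.
// MPMediaItemPropertyBeatsPerMinute returns nil (hence intValue == 0)
// when the track's metadata carries no BPM tag, so treat 0 as
// "unknown" rather than as an actual tempo.
NSNumber *bpmValue = [mediaItem valueForProperty:MPMediaItemPropertyBeatsPerMinute];
if (bpmValue == nil || [bpmValue intValue] == 0) {
    NSLog(@"No BPM tag in this file's metadata; it would have to be computed or fetched elsewhere.");
} else {
    NSLog(@"Tagged BPM: %d", [bpmValue intValue]);
}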

Related

Writing to and reading from an NSMutableArray at the same time

I am trying to make a progressive-download audio player that stores as much of the audio as possible while playing it.
The format of the audio is stream-optimized m4a.
My idea was to pull the audio packets into memory with a streamer rather than saving them to a file, to keep things fast.
And by the nature of m4a files, I can't write and read the same file on disk at the same time anyway...
So I stream and parse audio packets from a remote source, then put them into a singleton NSMutableArray...
While the streamer downloads audio packets, the player reads and plays packets from the NSMutableArray at the same time...
An average file has around 11,000 audio packets, so the count of the array reaches about 11,000.
NSMutableDictionary *myDict = [[NSMutableDictionary alloc] init];
NSData *inputData = [NSData dataWithBytes:inInputData length:inPacketDescriptions[i].mDataByteSize];
[myDict setObject:inputData forKey:@"inInputData"];
NSNumber *numberBytes = [NSNumber numberWithInt:inNumberBytes];
[myDict setObject:numberBytes forKey:@"inNumberBytes"];
NSNumber *numberPackets = [NSNumber numberWithInt:inNumberPackets];
[myDict setObject:numberPackets forKey:@"inNumberPackets"];
NSNumber *mStartOffset = [NSNumber numberWithInt:inPacketDescriptions[i].mStartOffset];
NSNumber *mDataByteSize = [NSNumber numberWithInt:inPacketDescriptions[i].mDataByteSize];
NSNumber *mVariableFramesInPacket = [NSNumber numberWithInt:inPacketDescriptions[i].mVariableFramesInPacket];
[myDict setObject:mStartOffset forKey:@"mStartOffset"];
[myDict setObject:mDataByteSize forKey:@"mDataByteSize"];
[myDict setObject:mVariableFramesInPacket forKey:@"mVariableFramesInPacket"];
[sharedCache.baseAudioCache addObject:myDict];
My question is: will I encounter deadlocks at some point?
Is this a good practice for audio streaming?
I would really recommend using immutable NSArray copies once you've finished building the NSMutableArray.
You can also use @synchronized to lock the NSMutableArray:
@synchronized(yourMutableArray) {
    [yourMutableArray stuffMethod];
}
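As a minimal sketch of that pattern (the cache property and method names here are hypothetical, not from the question), both the streamer's writes and the player's reads would go through the same lock:
// Hypothetical shared cache guarded by @synchronized. The streamer
// thread (writer) and the player thread (reader) take the same lock,
// so neither ever sees a half-updated array. @synchronized is
// reentrant, so this alone will not deadlock.
- (void)cachePacket:(NSDictionary *)packetDict {
    @synchronized(self.baseAudioCache) {
        [self.baseAudioCache addObject:packetDict];
    }
}

- (NSDictionary *)packetAtIndex:(NSUInteger)index {
    @synchronized(self.baseAudioCache) {
        if (index < [self.baseAudioCache count]) {
            return [self.baseAudioCache objectAtIndex:index];
        }
        return nil; // the streamer hasn't downloaded this packet yet
    }
}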

AVURLAsset can't get mp3 duration in documents directory, but it works well with mp3 in app bundle

I have two mp3 files: one in the app bundle, the other in the user's Documents directory. I want to get the duration of each mp3 file.
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:newPath] options:nil];
[audioAsset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    CMTime audioDuration = audioAsset.duration;
    float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
    NSLog(@"duration:%f", audioDurationSeconds);
}];
This code works well with the mp3 file in the app bundle, but with the mp3 in the Documents directory it only logs "duration:0.000000". Why?
This is how I compose the path to the mp3 file in the Documents directory:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *path = [documentsDirectory stringByAppendingPathComponent:@"Professor_and_the_Plant.mp3"];
NSURL *mp3_url = [[NSURL alloc] initFileURLWithPath:path];
AVURLAsset defaults to working quickly. If the duration or metadata isn't readily available (i.e., in the ID3 header), it doesn't return it. To demand the correct data, even if it takes longer, you have to give it options:
static NSDictionary *options;
if (!options) {
    NSNumber *val = [[NSNumber alloc] initWithBool:YES];
    options = [[NSDictionary alloc] initWithObjectsAndKeys:val, AVURLAssetPreferPreciseDurationAndTimingKey, nil];
}
AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:options];
(I stuck the options in a static so I don't have to keep recreating and disposing of it.)
If your assets in documents vs. app bundle are tagged differently, this could be your problem.
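Putting the pieces together, here is a sketch of the whole load for the Documents copy. It builds a file URL with fileURLWithPath: (URLWithString: on a plain filesystem path, as in the question, can yield a URL that AVURLAsset cannot open) and checks the load status; 'path' is the variable from the question's path-building snippet.
// Sketch: precise-duration options + async key loading.
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
NSURL *mp3URL = [NSURL fileURLWithPath:path];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:mp3URL options:options];
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
        NSLog(@"duration: %f", CMTimeGetSeconds(asset.duration));
    } else {
        NSLog(@"duration not loaded: %@", error);
    }
}];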

Not able to find the DPI for an image in iOS

I want to find the DPI of an image that has been captured with the iPhone/iPad camera.
This is how I am trying to get the DPI:
CFDictionaryRef exifDict = CMGetAttachment(imageDataSampleBuffer,
                                           kCGImagePropertyExifDictionary,
                                           NULL);
originalExifDict = (__bridge NSMutableDictionary *)(exifDict);
[originalExifDict objectForKey:(NSString *)kCGImagePropertyDPIHeight];
[originalExifDict objectForKey:(NSString *)kCGImagePropertyDPIWidth];
However, both entries in the dictionary come back as 0.
What is the correct way to find the DPI?
Thanks in advance for the help.
CGSize size;
NSNumber *width = (NSNumber *)CFDictionaryGetValue(exifDict, kCGImagePropertyDPIWidth);
NSNumber *height = (NSNumber *)CFDictionaryGetValue(exifDict, kCGImagePropertyDPIHeight);
size.width = [width floatValue];
size.height = [height floatValue];
// Tell me whether this works or not.
The information isn't in the metadata that comes with your imageDataSampleBuffer. It is written (as 72 dpi) at the time the image is saved, unless you have first set it yourself when editing the metadata before the save.
For most purposes it is meaningless. However, some software uses it to calculate the "correct size" of an image when placing it in a document: a 3000-pixel-square image at 300 dpi will appear 10 inches (c. 25.4 cm) square; at 72 dpi it will be nearly 42 inches (c. 105.8 cm) square. Also, some online image uploaders (especially those used by stock photo libraries and the like) insist on images having a high-ish dpi.
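As an illustration of "setting it yourself before the save", here is a sketch using ImageIO; the output URL, the JPEG type, and the 300 dpi figure are all arbitrary assumptions, not something the question specifies.
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Sketch: write a chosen DPI into the image's metadata at save time.
// 'outputURL' and 'image' are assumed to exist in the surrounding code.
NSDictionary *properties = @{
    (NSString *)kCGImagePropertyDPIWidth  : @300,
    (NSString *)kCGImagePropertyDPIHeight : @300
};
CGImageDestinationRef dest = CGImageDestinationCreateWithURL(
    (__bridge CFURLRef)outputURL, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(dest, image.CGImage,
                           (__bridge CFDictionaryRef)properties);
CGImageDestinationFinalize(dest);
CFRelease(dest);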
If you are using UIImagePickerController, use the code below:
NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL
         resultBlock:^(ALAsset *asset) {
             NSDictionary *metadata = asset.defaultRepresentation.metadata;
             NSMutableDictionary *imageMetadata = [[NSMutableDictionary alloc] initWithDictionary:metadata];
             NSLog(@"imageMetaData from AssetLibrary %@", imageMetadata);
             NSNumber *dpi = [imageMetadata objectForKey:@"DPIHeight"];
             NSLog(@"Dpi: %@", dpi);
         }
        failureBlock:^(NSError *error) {
            NSLog(@"error %@", error);
        }];

Objective C Array Count String Issue

Okay, I am new to Objective-C and am trying hard to learn it on my own without bothering the Stack Overflow community too much, but it is really quite different from what I'm used to (C++).
I have come across an issue that I can't for the life of me figure out, and I'm sure it's going to be something stupid. I am pulling questions and answers from a website to display in my iOS application, using this code:
NSString *GetUrl = [NSString stringWithFormat:@"http://www.mywebpage.com/page.php"];
NSString *GetAllHtml = [NSString stringWithContentsOfURL:[NSURL URLWithString:GetUrl] encoding:1 error:nil];
NSString *PullWholeQuestion = [[GetAllHtml componentsSeparatedByString:@"<tr>"] objectAtIndex:1];
NSString *FinishWholeQuestion = [[PullWholeQuestion componentsSeparatedByString:@"</tr>"] objectAtIndex:0];
After I get the whole webpage I strip out each question. I want to do this in a loop, so basically I need to count how many elements there are in the array that FinishWholeQuestion comes from.
I found this snippet online that seemed to work in their example, but I can't reproduce it:
NSArray *stringArray = [NSArray arrayWithObjects:@"1", @"2", nil];
NSLog(@"count = %d", (int)[stringArray count]);
"componentsSeparatedByString" returns an NSArray object, not a single NSString.
An array object can contain zero, one or more NSString objects, depending on the input.
If you change "FinishWholeQuestion" into a NSArray object, you'll likely get a few components (separate by a string).
And now that I'm looking at your code a little more closely, I see you're making an assumption that your array is always valid (and has more than 2 entries, as evidenced by the "objectAtIndex: 1" bit).
You should also change the first character of all your Objective-C variables. Best practices in Objective-C are that the first character of variables should always be lower case.
Like this:
NSString *getUrl = [NSString stringWithFormat:@"http://www.mywebpage.com/page.php"];
NSString *getAllHtml = [NSString stringWithContentsOfURL:[NSURL URLWithString:getUrl] encoding:NSUTF8StringEncoding error:nil];
NSArray *allQuestions = [getAllHtml componentsSeparatedByString:@"<tr>"];
if ([allQuestions count] > 1)
{
    // assuming there are at least two entries in this array
    NSString *pullWholeQuestion = [allQuestions objectAtIndex:1];
    if (pullWholeQuestion)
    {
        NSString *finishWholeQuestion = [[pullWholeQuestion componentsSeparatedByString:@"</tr>"] objectAtIndex:0];
    }
}
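To get the loop the asker describes, here is a minimal sketch along the same lines; it assumes the same <tr>...</tr> row structure as above and skips index 0, which holds whatever HTML precedes the first <tr>:
NSArray *allQuestions = [getAllHtml componentsSeparatedByString:@"<tr>"];
NSLog(@"found %lu candidate rows", (unsigned long)([allQuestions count] - 1));
for (NSUInteger i = 1; i < [allQuestions count]; i++) {
    // everything before the closing </tr> is one whole question
    NSString *question = [[[allQuestions objectAtIndex:i]
                           componentsSeparatedByString:@"</tr>"] objectAtIndex:0];
    NSLog(@"question %lu: %@", (unsigned long)i, question);
}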

ALAsset GPS Metadata does not match the exif GPS data

I am updating my app to allow photo uploads to include GPS metadata when using UIImagePickerControllerSourceTypeSavedPhotosAlbum. The accuracy of the GPS data is very important. I am running into an issue where the location data derived using ALAsset differs from the photo's actual EXIF data, which I can see when opening the same photo in Photoshop.
I have used two methods to read the GPS data in Xcode:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset) {
    CLLocation *location = [myasset valueForProperty:ALAssetPropertyLocation];
    CLLocationCoordinate2D point = location.coordinate;
    latitudeString = [NSString stringWithFormat:@"%g", point.latitude];
    longitudeString = [NSString stringWithFormat:@"%g", point.longitude];
};
AND
ALAssetRepresentation *representation = [myasset defaultRepresentation];
NSDictionary *metadata = [representation metadata];
NSDictionary *gpsDict = [metadata objectForKey:@"{GPS}"];
NSNumber *latitudeNumber = [gpsDict objectForKey:@"Latitude"];
NSNumber *longitudeNumber = [gpsDict objectForKey:@"Longitude"];
if ([[gpsDict valueForKey:@"LatitudeRef"] isEqualToString:@"S"])
{
    //latitudeNumber = -latitudeNumber;
}
if ([[gpsDict valueForKey:@"LongitudeRef"] isEqualToString:@"W"])
{
    //longitudeNumber = -longitudeNumber;
}
On a representative photo, both sets of code above give me a latitude of 47.576333, which I converted to 47,34,35N.
If I look at the EXIF data in Photoshop, the latitude is 47,34,59N.
These numbers are close, but they aren't the same. This happens with about 30% of my photos. Any idea why?
Edit: Photoshop does not give seconds; it gives 34.59 minutes, which is indeed accurate.
Your conversion is wrong; Photoshop is more correct.
47.576333 (decimal degrees) converts to 47° 34.5799' (DM), which can be rounded to 47° 34.58', the format Photoshop evidently displays.
Converted to DMS it gives your value: 47° 34' 35" N.
So you confused the DMS (degrees, minutes, seconds) representation with the DM (degrees, decimal minutes) representation.
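A minimal sketch of the two conversions, using the latitude from the question:
// Decimal degrees -> DM (degrees, decimal minutes) and
// DMS (degrees, minutes, seconds), with 47.576333 from the question.
double deg = 47.576333;
int d = (int)deg;                         // 47
double minutesDecimal = (deg - d) * 60.0; // 34.5799... -> DM: 47° 34.58'
int m = (int)minutesDecimal;              // 34
double s = (minutesDecimal - m) * 60.0;   // 34.79... -> rounds to 35
NSLog(@"DM:  %d° %.2f'", d, minutesDecimal);
NSLog(@"DMS: %d° %d' %.0f\"", d, m, s);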
