ALAsset GPS Metadata does not match the exif GPS data - ios

I am updating my app to include GPS metadata in photo uploads when using UIImagePickerControllerSourceTypeSavedPhotosAlbum. The GPS data's accuracy is very important. I am running into an issue where the location data derived using ALAsset differs from the photo's actual EXIF data that I can see when opening the same photo in Photoshop.
I have used two methods to read the GPS data in Xcode:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset) {
    CLLocation *location = [myasset valueForProperty:ALAssetPropertyLocation];
    latitudeString = [NSString stringWithFormat:@"%g", location.coordinate.latitude];
    longitudeString = [NSString stringWithFormat:@"%g", location.coordinate.longitude];
};
AND
ALAssetRepresentation *representation = [myasset defaultRepresentation];
NSDictionary *metadata = [representation metadata];
NSDictionary *gpsDict = [metadata objectForKey:@"{GPS}"];
NSNumber *latitudeNumber = [gpsDict objectForKey:@"Latitude"];
NSNumber *longitudeNumber = [gpsDict objectForKey:@"Longitude"];
if ([[gpsDict valueForKey:@"LatitudeRef"] isEqualToString:@"S"]) {
    latitudeNumber = @(-latitudeNumber.doubleValue);
}
if ([[gpsDict valueForKey:@"LongitudeRef"] isEqualToString:@"W"]) {
    longitudeNumber = @(-longitudeNumber.doubleValue);
}
On a representative photo I am using as an example, both sets of code above give me a latitude of 47.576333, which converts to 47,34,35N.
If I look at the EXIF data in Photoshop, the latitude is 47,34,59N.
These numbers are close, but they aren't the same. This happens with about 30% of my photos. Any idea why?
Edit: Photoshop does not give seconds; it gives 34.59 minutes, which is indeed accurate.

Your conversion is wrong; Photoshop is more correct.
47.576333 (decimal degrees, DEG) converts to 47° 34.5799' (DM), which can be rounded to 47° 34.58',
which is the format Photoshop obviously displays.
Converted to DMS it gives your value: 47° 34' 35" N.
So you confused the DMS (Degrees Minutes Seconds) representation with DM (Degrees Minutes).

Related

Not able to find the DPI for an image in iOS

I want to find the DPI for an image that has been captured with the iPhone/iPad camera.
This is how I am trying to get the DPI:
CFDictionaryRef exifDict = CMGetAttachment(imageDataSampleBuffer,
                                           kCGImagePropertyExifDictionary,
                                           NULL);
originalExifDict = (__bridge NSMutableDictionary *)(exifDict);
[originalExifDict objectForKey:(NSString *)kCGImagePropertyDPIHeight];
[originalExifDict objectForKey:(NSString *)kCGImagePropertyDPIWidth];
However, both entries in the dictionary come out as 0.
What is the correct way to find the DPI?
Thanks in advance for the help.
CGSize size;
NSNumber *width = (NSNumber *)CFDictionaryGetValue(exifDict, kCGImagePropertyDPIWidth);
NSNumber *height = (NSNumber *)CFDictionaryGetValue(exifDict, kCGImagePropertyDPIHeight);
size.width = [width floatValue];
size.height = [height floatValue];
// Tell me whether this works or not.
The information isn't in the metadata that comes with your imageDataSampleBuffer. It is written (72 dpi) at the time the image is saved, unless you have, first, manually set it yourself when editing the metadata, before the save.
For most purposes it is meaningless. However, some software uses it to calculate the "correct size" of an image when placing it in a document: a 3000-pixel-square image at 300 dpi will appear 10 inches (c. 25.4 cm) square; at 72 dpi it will be nearly 42 inches (c. 105.8 cm) square. Also, some online image uploaders (especially those used by stock photo libraries and the like) insist on images having a high-ish dpi.
If you are using UIImagePickerController, use the code below:
NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL
         resultBlock:^(ALAsset *asset) {
             NSDictionary *metadata = asset.defaultRepresentation.metadata;
             NSMutableDictionary *imageMetadata = [[NSMutableDictionary alloc] initWithDictionary:metadata];
             NSLog(@"imageMetaData from AssetLibrary %@", imageMetadata);
             NSNumber *dpi = [imageMetadata objectForKey:@"DPIHeight"];
             NSLog(@"Dpi: %@", dpi);
         }
        failureBlock:^(NSError *error) {
            NSLog(@"error %@", error);
        }];

CLLocation distanceFromLocation: result differs when I calculate it myself

I have used CLLocation's distanceFromLocation: method to calculate the distance from some location.
But its result is slightly different from what I calculate using the haversine formula.
- (CLLocationDistance)distanceFromCoordinate:(CLLocationCoordinate2D)fromCoord {
    double earthRadius = 6371.0; // Earth's radius in kilometers
    // Get the difference between our two points, then convert the difference into radians
    double nDLat = RADIANS(fromCoord.latitude - self.coordinate.latitude);
    double nDLon = RADIANS(fromCoord.longitude - self.coordinate.longitude);
    double fromLat = RADIANS(self.coordinate.latitude);
    double toLat = RADIANS(fromCoord.latitude);
    double nA = pow(sin(nDLat / 2.0), 2) + cos(fromLat) * cos(toLat) * pow(sin(nDLon / 2.0), 2);
    double nC = 2.0 * atan2(sqrt(nA), sqrt(1 - nA));
    double nD = earthRadius * nC;
    return nD * 1000.0;
}
CLLocation *loc = [[CLLocation alloc] initWithLatitude:[location.latitude doubleValue]
                                             longitude:[location.longitude doubleValue]];
CLLocationDistance dist = [userLocation distanceFromLocation:loc];
CLLocationDistance dist2 = [userLocation distanceFromCoordinate:loc.coordinate];
Why are the two values different?
Should I init the location object with horizontalAccuracy and verticalAccuracy?
Your results are different because you are using different code.
You don't say how different.
Totally different? There's a bug in your code.
Big differences for places close together? Maybe your formula has problems with rounding errors.
Differences that grow as the places get further apart? Maybe your definition of distance is different; it should be the shortest distance along a path on the Earth's surface.
In general, the Earth is not a sphere but a spheroid. Taking that into account is more difficult but gives more precise results.
It is slightly different because the Earth's radius is not exactly 6371 km. Use a more appropriate Earth radius and you may get better results.
The official WGS84 equatorial radius is:
6,378,137 meters
I remember that iOS delivers exactly the same result.
Should I init the location object with horizontalAccuracy and
verticalAccuracy?
No, definitely not. Those attributes are hints about how accurate the position might be.
Distance is calculated from latitude and longitude only.
There are not many formulas:
- haversine formula (a bit slower than the law of cosines, otherwise fine)
- law of cosines (problematic on small distances when not using 64-bit precision)
- Vincenty's formula, which is more accurate; it uses an ellipsoidal Earth model
I also got the same problem, so I used the Google web service to calculate the distance. Use this method and you will get an accurate distance:
- (void)calculateDistance
{
    // http://maps.googleapis.com/maps/api/directions/json?origin=41.742964,-87.995971&destination=41.811511,-87.967923&mode=driving&sensor=false
    NSString *locationUrl = [NSString stringWithFormat:@"http://maps.googleapis.com/maps/api/directions/json?origin=%@,%@&destination=%@,%@&mode=driving&sensor=false",
                             origin.latitude, origin.longitude, destination.latitude, destination.longitude];
    NSLog(@"Location URL: %@", locationUrl);
    NSURL *finalUrl = [NSURL URLWithString:locationUrl];
    NSData *data = [NSData dataWithContentsOfURL:finalUrl];
    NSError *error;
    NSDictionary *json = [NSJSONSerialization JSONObjectWithData:data
                                                         options:kNilOptions
                                                           error:&error];
    NSArray *routes = [json objectForKey:@"routes"];
    NSArray *legs = [routes valueForKey:@"legs"];
    NSDictionary *newDistance = [legs valueForKey:@"distance"];
    NSArray *distanceList = [[newDistance valueForKey:@"text"] objectAtIndex:0];
    distance = [[distanceList objectAtIndex:0] floatValue];
    NSLog(@"%.1f", distance);
}
Hope it will help you. Note that this returns the driving distance along roads, not the great-circle distance, so it will not match distanceFromLocation:.

How to detect BPM of audio file in iOS app

I tried to find the BPM using the AVFoundation framework, but I am getting 0 as a result and am not able to get the BPM.
Here is my code:
MPMediaItem *mediaItem = [[collection items] objectAtIndex:0];
NSString *albumIDKey = [MPMediaItem persistentIDPropertyForGroupingType:MPMediaGroupingAlbum];
NSLog(@"mpmediaitem: %@", albumIDKey);
int BPM = [[mediaItem valueForProperty:MPMediaItemPropertyBeatsPerMinute] intValue];
NSNumber *bpm = [mediaItem valueForProperty:MPMediaItemPropertyBeatsPerMinute];
NSLog(@"bpm: %@", bpm);
NSURL *url = [mediaItem valueForProperty:MPMediaItemPropertyAssetURL];
Am I missing anything here?
The BPM is extracted from the metadata accompanying the audio file, which often is not present. It is not calculated from the audio.
Also be aware that any BPM metadata that does exist is flawed by the assumption that a track has a constant tempo, which is not always a safe assumption.
Quality audio metadata can be obtained from The Echo Nest.

Extracting latitude and longitude from image Objective c

I'm having a problem getting the latitude and longitude data from an image (which has geolocation details). I have imported the EXIF framework and I'm using the following code to achieve this:
NSData *jpegData = UIImageJPEGRepresentation(image, 0.5);
EXFJpeg* jpegScanner = [[EXFJpeg alloc] init];
[jpegScanner scanImageData: jpegData];
EXFMetaData* exifData = jpegScanner.exifMetaData;
id latitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLatitude]];
id longitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLongitude]];
NSLog(@"Latitude: %@ Longitude: %@", latitudeValue, longitudeValue);
But it's returning NULL for both latitude and longitude. Can anyone please tell me what I'm doing wrong in the above code? Please help me out. Thanks in advance!
You can do it with the ALAssets framework.
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset *photoAsset) {
    CLLocation *location = [photoAsset valueForProperty:ALAssetPropertyLocation];
    NSMutableDictionary *exifDataDict = [[NSMutableDictionary alloc] init];
    if (location != nil) {
        [exifDataDict setObject:[NSNumber numberWithDouble:location.coordinate.latitude] forKey:@"latitude"];
        [exifDataDict setObject:[NSNumber numberWithDouble:location.coordinate.longitude] forKey:@"longitude"];
    }
};
[assetsLibrary assetForURL:photoUrl resultBlock:resultBlock failureBlock:nil];
I had a similar issue once. While dealing with it I got the impression that UIImage strips all or some of the EXIF data. EXIFJpeg worked fine for me when the image data was read directly from a file, bundle, web service, etc., but I did not manage to extract any reasonable EXIF data when I stored the image in memory as a UIImage object and then used UIImageJPEGRepresentation to get the image data and the EXIF from that data.
I will not sign this in blood, but that was my impression, and using the "raw" data from the file did actually work for me. So I received the file from some server, then extracted the EXIF including geo tags (if any), and after that created the UIImage.

Does Core Image load image data immediately?

Let's say I want to find out the size of an image, so that if a user tries to load a 10,000x10,000-pixel image in my iPad app I can present them with a dialog and not crash. If I do [UIImage imageNamed:] or [UIImage imageWithContentsOfFile:], that will load my potentially large image into memory immediately.
If I use Core Image instead, say like this:
CIImage *ciImage = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:imgPath]];
Then ask my new CIImage for its size:
CGSize imgSize = ciImage.extent.size;
Will that load the entire image into memory to tell me this, or will it just look at the metadata of the file to discover the size of the image?
The imageWithContentsOfURL: method loads the image into memory, yes.
Fortunately, in iOS 4 Apple added CGImageSource for reading image metadata without loading the actual pixel data into memory; you can read about how to use it in this blog post (conveniently, it provides a code sample on how to get image dimensions).
EDIT: Pasted code sample here to protect against link rot:
#import <ImageIO/ImageIO.h>
NSURL *imageFileURL = [NSURL fileURLWithPath:...];
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((CFURLRef)imageFileURL, NULL);
if (imageSource == NULL) {
    // Error loading image
    ...
    return;
}
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:NO], (NSString *)kCGImageSourceShouldCache, nil];
CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, (CFDictionaryRef)options);
if (imageProperties) {
    NSNumber *width = (NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelWidth);
    NSNumber *height = (NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelHeight);
    NSLog(@"Image dimensions: %@ x %@ px", width, height);
    CFRelease(imageProperties);
}
The full API reference is also available here.
