Arrow to point to a particular location on iPad - iOS

I need to point an arrow at a particular location based on the current location. For that I am using the didUpdateHeading delegate method to get the heading value.
On iPhone (portrait mode) the arrow image shows perfectly, but on iPad, which runs in landscape mode, it does not show correctly.
The code I am using to find the bearing is:
- (void)UpdateCompass:(NSNotification *)notification
{
    CLLocation *newLocation = [[CLLocation alloc] initWithLatitude:m_nLatitude longitude:m_nLongitude];
    CLLocation *target = [[CLLocation alloc] initWithLatitude:questionLatitude longitude:questionLangitude];
    _latestBearing = [self getHeadingForDirectionFromCoordinate:newLocation.coordinate toCoordinate:target.coordinate];
    CGFloat degrees = _latestBearing - m_CLHeading.magneticHeading;
    CGAffineTransform cgaRotate = CGAffineTransformMakeRotation([self degreesToRadians:degrees]);
    m_arrowImage.transform = cgaRotate;
}
- (double)degreesToRadians:(double)degrees
{
    return degrees * M_PI / 180.0;
}
- (double)radiansToDegrees:(double)radians
{
    return radians * 180.0 / M_PI;
}
- (double)getHeadingForDirectionFromCoordinate:(CLLocationCoordinate2D)fromLoc toCoordinate:(CLLocationCoordinate2D)toLoc
{
    double lat1 = [self degreesToRadians:fromLoc.latitude];
    double lon1 = [self degreesToRadians:fromLoc.longitude];
    double lat2 = [self degreesToRadians:toLoc.latitude];
    double lon2 = [self degreesToRadians:toLoc.longitude];
    double dLon = lon2 - lon1;
    double y = sin(dLon) * cos(lat2);
    double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon);
    double radiansBearing = atan2(y, x);
    if (radiansBearing < 0.0)
        radiansBearing += 2 * M_PI;
    return [self radiansToDegrees:radiansBearing];
}
How can I make this work for iPad?

Check whether the device is an iPad or an iPhone and write your code accordingly:
if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad)
{
    // The device is an iPad running iPhone 3.2 or later.
}
else
{
    // The device is an iPhone or iPod touch.
}
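Beyond detecting the idiom, the likely root cause on iPad is that the heading is reported relative to the top of the device in portrait, while the iPad UI is rotated to landscape, so the arrow angle needs an orientation offset before rotating the image. A minimal sketch of that adjustment in plain C, with an illustrative orientation enum standing in for UIInterfaceOrientation (the exact offset per orientation should be verified on a device):

```c
typedef enum {
    ORIENT_PORTRAIT,
    ORIENT_PORTRAIT_UPSIDE_DOWN,
    ORIENT_LANDSCAPE_LEFT,
    ORIENT_LANDSCAPE_RIGHT
} Orientation;

/* Degrees to add to the raw device heading so it is measured
   relative to the top of the rotated UI. */
double heading_offset_deg(Orientation o)
{
    switch (o) {
    case ORIENT_PORTRAIT:             return 0.0;
    case ORIENT_LANDSCAPE_LEFT:       return 90.0;
    case ORIENT_PORTRAIT_UPSIDE_DOWN: return 180.0;
    case ORIENT_LANDSCAPE_RIGHT:      return 270.0;
    }
    return 0.0;
}

/* Arrow rotation = bearing - (heading + orientation offset), wrapped to [0, 360). */
double arrow_degrees(double bearing, double heading, Orientation o)
{
    double d = bearing - (heading + heading_offset_deg(o));
    while (d < 0.0)    d += 360.0;
    while (d >= 360.0) d -= 360.0;
    return d;
}
```

With a bearing of 90° and a heading of 0°, portrait leaves the arrow at 90°, while landscape-left cancels it to 0° — the same target, compensated for the rotated screen.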

Related

Get iPad's direction (N, E, S, W) [duplicate]

Is it possible to get the iPad's current direction (North, East, South, or West), for example via CLLocationManager?
The following code rotates a pin toward the direction of movement; from the pin's angle you can derive the iPad's direction, so it may be helpful to you.
-(void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading {
    // Note: this uses the magnetic heading; trueHeading is only valid
    // when location updates are also running.
    CLLocationDirection direction = newHeading.magneticHeading;
    CGFloat radians = -direction / 180.0 * M_PI;
    self.strAccuracy = [NSString stringWithFormat:@"%.1fmi", newHeading.headingAccuracy];
    [lblAccuracy setText:self.strAccuracy];
    // Rotate the bearing view
    [self rotateBearingView:bearingView radians:radians];
    // Rotate the needle
    CGFloat angle = RadiansToDegrees(radians);
    [self setLatLonForDistanceAndAngle];
    [self rotateArrowView:arrowView degrees:(angle + fltAngle)];
}
-(void)rotateArrowView:(UIView *)view degrees:(CGFloat)degrees
{
    CGAffineTransform transform = CGAffineTransformMakeRotation(DegreesToRadians(degrees));
    view.transform = transform;
}
-(void)setLatLonForDistanceAndAngle
{
    dblLat1 = DegreesToRadians(appDelegate.dblLatitude);
    dblLon1 = DegreesToRadians(appDelegate.dblLongitude);
    dblLat2 = DegreesToRadians(objClsProductSearch.dblLatitude);
    dblLon2 = DegreesToRadians(objClsProductSearch.dblLongitude);
    fltLat = dblLat2 - dblLat1;
    fltLon = dblLon2 - dblLon1;
}
-(float)getAngleFromLatLon
{
    // Angle between two points, from http://www.movable-type.co.uk/scripts/latlong.html
    double y = sin(fltLon) * cos(dblLat2);
    // Note: the last factor must be the cosine of the longitude *difference* (fltLon),
    // not of the absolute longitude dblLon2.
    double x = cos(dblLat1) * sin(dblLat2) - sin(dblLat1) * cos(dblLat2) * cos(fltLon);
    CGFloat angle = RadiansToDegrees(atan2(y, x));
    return angle;
}
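To answer the actual question (N, E, S, or W): once you have a heading in degrees, you can bucket it into cardinal directions. A plain-C sketch, assuming the heading is already normalised to [0, 360):

```c
/* Map a heading in degrees (0..360, 0 = north) to a cardinal direction.
   Each sector is 90 degrees wide, centred on the cardinal heading. */
const char *cardinal_direction(double heading)
{
    static const char *names[] = { "N", "E", "S", "W" };
    int sector = (int)((heading + 45.0) / 90.0) % 4;
    return names[sector];
}
```

So a heading of 350° still reads as "N", because it falls inside the 90° sector centred on north. For eight-point output (NE, SE, ...) the same idea works with 45° sectors.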

How to rotate custom marker image on google map objective c

Currently I am working on an app like the Uber iOS application. I have already integrated the Google Maps SDK, and I show a custom image for the user's current location. I am getting some drivers' current-location details (e.g. 100 drivers) from the server. I saved them in an NSArray and tried to display those latitudes & longitudes on Google Maps using the following code:
for (int i = 0; i < [latLongArr count]; i++)
{
    GMSMarker *marker = [[GMSMarker alloc] init];
    marker.position = CLLocationCoordinate2DMake([[(NSDictionary *)[latLongArr objectAtIndex:i] valueForKey:@"Latitude"] doubleValue],
                                                 [[(NSDictionary *)[latLongArr objectAtIndex:i] valueForKey:@"Longitude"] doubleValue]);
    marker.appearAnimation = kGMSMarkerAnimationPop;
    marker.title = @"Title";
    marker.snippet = @"Sub title";
    marker.map = self.gMapView;
}
But I am looking for UI design & functionality like this:
Can anyone help me show the user's current location and the drivers' list of annotations?
(How to rotate a custom marker image on a Google map)
The Directions API (Apple or Google Maps) returns a list of points. To calculate the angle between two points, you can:
func DegreeBearing(A: CLLocation, B: CLLocation) -> Double {
    var dlon = self.ToRad(degrees: B.coordinate.longitude - A.coordinate.longitude)
    let dPhi = log(tan(self.ToRad(degrees: B.coordinate.latitude) / 2 + M_PI / 4) /
                   tan(self.ToRad(degrees: A.coordinate.latitude) / 2 + M_PI / 4))
    if abs(dlon) > M_PI {
        dlon = (dlon > 0) ? (dlon - 2 * M_PI) : (2 * M_PI + dlon)
    }
    return self.ToBearing(radians: atan2(dlon, dPhi))
}
func ToRad(degrees: Double) -> Double {
    return degrees * (M_PI / 180)
}
func ToBearing(radians: Double) -> Double {
    return (ToDegrees(radians: radians) + 360).truncatingRemainder(dividingBy: 360)
}
func ToDegrees(radians: Double) -> Double {
    return radians * 180 / M_PI
}
and set the rotation on the marker:
marker.rotation = DegreeBearing(A: self.fromPoint, B: self.toPoint)
Updated Objective-C code below:
-(double)DegreeBearing:(CLLocation *)A locationB:(CLLocation *)B {
    double dlon = [self ToRad:(B.coordinate.longitude - A.coordinate.longitude)];
    double dPhi = log(tan([self ToRad:B.coordinate.latitude] / 2 + M_PI / 4) /
                      tan([self ToRad:A.coordinate.latitude] / 2 + M_PI / 4));
    if (fabs(dlon) > M_PI) {
        dlon = (dlon > 0) ? (dlon - 2 * M_PI) : (2 * M_PI + dlon);
    }
    return [self ToBearing:atan2(dlon, dPhi)];
}
-(double)ToRad:(double)degrees {
    return degrees * (M_PI / 180);
}
-(double)ToBearing:(double)radians {
    double degree = [self ToDegrees:radians];
    // fmod, not `degree + 360 % 360`: % binds tighter than + and would add 0.
    return fmod(degree + 360, 360);
}
-(double)ToDegrees:(double)radians {
    return radians * 180 / M_PI;
}

iOS using accelerometer to move an object within a circle

I am trying to use the accelerometer to move an image within a circle. The issue I am having is that when the image hits the edge of the circle, it jumps to the other side of the circle. My code is below:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    //NSLog(@"x : %g", acceleration.x);
    //NSLog(@"y : %g", acceleration.y);
    //NSLog(@"z : %g", acceleration.z);
    delta.x = acceleration.x * 10;
    delta.y = acceleration.y * 10;
    joypadCap.center = CGPointMake(joypadCap.center.x + delta.x, joypadCap.center.y - delta.y);
    distance = sqrtf(((joypadCap.center.x - 160) * (joypadCap.center.x - 160)) +
                     ((joypadCap.center.y - 206) * (joypadCap.center.y - 206)));
    //NSLog(@"Distance : %f", distance);
    touchAngle = atan2(joypadCap.center.y, joypadCap.center.x);
    NSLog(@"Angle : %f", touchAngle);
    if (distance > 50) {
        joypadCap.center = CGPointMake(160 - cosf(touchAngle) * 50, 206 - sinf(touchAngle) * 50);
    }
}
I was having the same issue when attempting to implement a circular spirit level using CMDeviceMotion. I found it was an issue with the coordinates passed to atan2(y,x). This function requires cartesian coordinates, with (0,0) in the centre of the view. However, the screen coordinates have (0,0) in the top left corner. I created methods to convert a point between the two coordinate systems, and now it's working well.
I put up a sample project here on github, but here's the most important part:
float distance = sqrtf(((point.x - halfOfWidth) * (point.x - halfOfWidth)) +
                       ((point.y - halfOfWidth) * (point.y - halfOfWidth)));
if (distance > maxDistance)
{
    // Convert point from the screen coordinate system to a cartesian coordinate system,
    // with (0,0) located in the centre of the view
    CGPoint pointInCartesianCoordSystem = [self convertScreenPointToCartesianCoordSystem:point
                                                                                 inFrame:self.view.frame];
    // Calculate the angle of the point, in radians, from the centre of the view
    CGFloat angle = atan2(pointInCartesianCoordSystem.y, pointInCartesianCoordSystem.x);
    // Get a new point on the edge of the circle
    point = CGPointMake(cosf(angle) * maxDistance, sinf(angle) * maxDistance);
    // Convert back to the screen coordinate system
    point = [self convertCartesianPointToScreenCoordSystem:point inFrame:self.view.frame];
}
And:
- (CGPoint)convertScreenPointToCartesianCoordSystem:(CGPoint)point
                                            inFrame:(CGRect)frame
{
    float x = point.x - (frame.size.width / 2.0f);
    float y = (point.y - (frame.size.height / 2.0f)) * -1.0f;
    return CGPointMake(x, y);
}
- (CGPoint)convertCartesianPointToScreenCoordSystem:(CGPoint)point
                                            inFrame:(CGRect)frame
{
    float x = point.x + (frame.size.width / 2.0f);
    float y = (point.y * -1.0f) + (frame.size.height / 2.0f);
    return CGPointMake(x, y);
}

How to keep rollingY consistent when user flips over iDevice while in landscape mode?

In my app, the game is set to landscape mode, allowing the user to flip the device with the game auto-rotating to match the angle. When the user tilts the device to the right, the hero travels right, and when it is tilted to the left, the hero goes left.
I have that part figured out, but the problem comes in when the user flips the iDevice over (because of the headphone jack or the placement of the speaker): the app auto-rotates, but rollingY sets the hero's position.x inverted. Tilting left makes the hero go right, and tilting right makes the hero go left. After looking at the NSLogs of the accelY value when flipping the iDevice around, it doesn't seem possible to reverse position.x when rotated.
I guess this is more of a math question.
Here is my code:
-(void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    #define kFilteringFactor 0.75
    static UIAccelerationValue rollingX = 0, rollingY = 0, rollingZ = 0;
    rollingX = (acceleration.x * kFilteringFactor) + (rollingX * (1.0 - kFilteringFactor));
    rollingY = (acceleration.y * kFilteringFactor) + (rollingY * (1.0 - kFilteringFactor));
    rollingZ = (acceleration.z * kFilteringFactor) + (rollingZ * (1.0 - kFilteringFactor));
    float accelX = rollingX;
    float accelY = rollingY;
    float accelZ = rollingZ;
    NSLog(@"accelX: %f, accelY: %f, accelZ: %f", accelX, accelY, accelZ);
    CGSize winSize = [CCDirector sharedDirector].winSize;
    #define kRestAccelX 0.6
    #define kShipxMaxPointsPerSec (winSize.height * 1.0)
    #define kMaxDiffX 0.2
    #define kRestAccelY 0.0
    #define kShipyMaxPointsPerSec (winSize.width * 0.5)
    #define kMaxDiffY 0.2
    float accelDiffX = kRestAccelX - ABS(accelX);
    float accelFractionX = accelDiffX / kMaxDiffX;
    float pointsPerSecX = kShipxMaxPointsPerSec * accelFractionX;
    float accelDiffY = -accelY;
    float accelFractionY = accelDiffY / kMaxDiffY;
    float pointsPerSecY = kShipyMaxPointsPerSec * accelFractionY;
    _shipPointsPerSecX = pointsPerSecY;
    _shipPointsPerSecY = pointsPerSecX;
    CCLOG(@"accelX: %f, pointsPerSecX: %f", accelX, pointsPerSecX);
    CCLOG(@"accelY: %f, pointsPerSecY: %f", accelY, pointsPerSecY);
}
-(void)updateShipPos:(ccTime)dt
{
    CGSize winSize = [CCDirector sharedDirector].winSize;
    float maxX = winSize.width - _ship.contentSize.width;
    float minX = _ship.contentSize.width / 2;
    float maxY = winSize.height - _ship.contentSize.height / 2;
    float minY = _ship.contentSize.height / 2;
    float newX = _ship.position.x + (_shipPointsPerSecX * dt);
    float newY = _ship.position.y + (_shipPointsPerSecY * dt);
    newX = MIN(MAX(newX, minX), maxX);
    newY = MIN(MAX(newY, minY), maxY);
    _ship.position = ccp(newX, newY);
}
The idea is, I'm trying to make it so that pointsPerSecY is positive when tilting to the right and negative when tilted to the left. The problem is when the user flips the device, they are inverted since the device is flipped. Is there a way to make it so that it will stay positive when tilted to the right and negative to the left, no matter the orientation, when flipped while being in landscape?
How about doing something like this?
if ([[UIApplication sharedApplication] statusBarOrientation] == UIInterfaceOrientationLandscapeLeft) {
    pointsPerSecY *= -1;
}

iOS 5 - AVCaptureDevice setting focus point and focus mode freezes the live camera picture

I'm using the following method to set point of focus since iOS 4:
- (void)focusAtPoint:(CGPoint)point
{
    AVCaptureDevice *device = [[self captureInput] device];
    NSError *error;
    if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus] &&
        [device isFocusPointOfInterestSupported])
    {
        if ([device lockForConfiguration:&error]) {
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        } else {
            NSLog(@"Error: %@", error);
        }
    }
}
On iOS 4 devices this works without any problems. But on iOS 5 the live camera feed freezes and after some seconds gets completely black. There is no exception or error thrown.
The error won't occur if I comment out either setFocusPointOfInterest or setFocusMode. So the combination of them both will lead to this behavior.
The point you've passed to setFocusPointOfInterest: is incorrect. That is what causes the freeze.
Add this method to your program and use the value returned by this function
- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates
{
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = [[self videoPreviewView] frame].size;
    AVCaptureVideoPreviewLayer *videoPreviewLayer = [self prevLayer];
    if ([[self prevLayer] isMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }
    if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
        pointOfInterest = CGPointMake(viewCoordinates.y / frameSize.height, 1.f - (viewCoordinates.x / frameSize.width));
    } else {
        CGRect cleanAperture;
        for (AVCaptureInputPort *port in [[[[self captureSession] inputs] lastObject] ports]) {
            if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
                cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
                CGSize apertureSize = cleanAperture.size;
                CGPoint point = viewCoordinates;
                CGFloat apertureRatio = apertureSize.height / apertureSize.width;
                CGFloat viewRatio = frameSize.width / frameSize.height;
                CGFloat xc = .5f;
                CGFloat yc = .5f;
                if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    if (viewRatio > apertureRatio) {
                        CGFloat y2 = frameSize.height;
                        CGFloat x2 = frameSize.height * apertureRatio;
                        CGFloat x1 = frameSize.width;
                        CGFloat blackBar = (x1 - x2) / 2;
                        if (point.x >= blackBar && point.x <= blackBar + x2) {
                            xc = point.y / y2;
                            yc = 1.f - ((point.x - blackBar) / x2);
                        }
                    } else {
                        CGFloat y2 = frameSize.width / apertureRatio;
                        CGFloat y1 = frameSize.height;
                        CGFloat x2 = frameSize.width;
                        CGFloat blackBar = (y1 - y2) / 2;
                        if (point.y >= blackBar && point.y <= blackBar + y2) {
                            xc = ((point.y - blackBar) / y2);
                            yc = 1.f - (point.x / x2);
                        }
                    }
                } else if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    if (viewRatio > apertureRatio) {
                        CGFloat y2 = apertureSize.width * (frameSize.width / apertureSize.height);
                        xc = (point.y + ((y2 - frameSize.height) / 2.f)) / y2;
                        yc = (frameSize.width - point.x) / frameSize.width;
                    } else {
                        CGFloat x2 = apertureSize.height * (frameSize.height / apertureSize.width);
                        yc = 1.f - ((point.x + ((x2 - frameSize.width) / 2)) / x2);
                        xc = point.y / frameSize.height;
                    }
                }
                pointOfInterest = CGPointMake(xc, yc);
                break;
            }
        }
    }
    return pointOfInterest;
}
I want to add some additional info to @Louis's answer.
According to Apple's documentation (please pay attention to the bold part):
In addition, a device may support a focus point of interest. You test for support using focusPointOfInterestSupported. If it’s supported, you set the focal point using focusPointOfInterest. You pass a CGPoint where {0,0} represents the top left of the picture area, and {1,1} represents the bottom right in landscape mode with the home button on the right—this applies even if the device is in portrait mode.
We should take the orientation into account when calculating the focusPointOfInterest.
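For the simplest case — plain "resize" gravity, an unmirrored preview, and the landscape-with-home-button-right reference frame quoted above — the view-to-point-of-interest mapping reduces to the swap-and-flip from the first branch of the answer, which can be checked in plain C (names and struct are illustrative):

```c
typedef struct { double x, y; } PointD;

/* Convert a tap in view coordinates to a focus point of interest in {0,0}..{1,1},
   where {0,0} is the top-left of the picture in landscape with the home button
   on the right. Covers only the AVLayerVideoGravityResize case. */
PointD view_to_poi(PointD tap, double viewWidth, double viewHeight)
{
    PointD poi = { tap.y / viewHeight, 1.0 - tap.x / viewWidth };
    return poi;
}
```

A tap at the view's top-left corner maps to {0, 1} and the centre maps to {0.5, 0.5}, consistent with the 90° rotation between the portrait view and the sensor's landscape frame.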
