I want to apply a moving background according to device orientation. If the device is held vertically upright and it is rotated around the y axis, then the background should move left or right according to the amount of rotation.
Other icons on the background will stay static, so it will give a 3D effect. For reference, I need the exact same effect as shown in this video: http://www.youtube.com/watch?v=429kM-yXGz8
The code I have developed so far is:
- (void)updateViewsWithAcceleration:(CMAcceleration)acceleration
{
    // Low-pass filter the raw accelerometer values
    _accelX = (acceleration.x * 0.1) + (_accelX * 0.9);
    _accelY = (acceleration.y * 0.1) + (_accelY * 0.9);
    _accelZ = (acceleration.z * 0.1) + (_accelZ * 0.9);
    // Offset the shadow views by the filtered values (height component uses bounds.size.height)
    [_backgroundShadow setFrame:CGRectMake(_accelY * 50, _accelZ * 50, _backgroundShadow.bounds.size.width, _backgroundShadow.bounds.size.height)];
    [_titleShadow setFrame:CGRectMake(_accelY * 50, _accelZ * 50, _titleShadow.bounds.size.width, _titleShadow.bounds.size.height)];
}
Just an FYI: You're looking at the acceleration, not the gyroscope; try this instead:
- (void)viewDidLoad {
    [super viewDidLoad];
    cmmm = [[CMMotionManager alloc] init];
    if (cmmm.gyroAvailable) {
        cmmm.gyroUpdateInterval = 1.0 / 60.0;
        // Register the handler with the motion manager so it is actually called on each update
        [cmmm startGyroUpdatesToQueue:[NSOperationQueue mainQueue]
                          withHandler:^(CMGyroData *gyroData, NSError *error) {
            CMRotationRate rotate = gyroData.rotationRate;
            NSLog(@"rotation rate: {%6.2f, %6.2f, %6.2f}", rotate.x, rotate.y, rotate.z);
        }];
    } else {
        NSLog(@"No gyro");
        [cmmm release];
    }
}
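The handler above only logs the rotation rate; it never moves anything. As a rough sketch of how the gyro data could actually drive the background (not part of the original answer; the _backgroundView ivar, _backgroundAngle accumulator, and kPointsPerRadian constant are assumptions), you could integrate the y-axis rotation rate into an angle and map that to a horizontal offset:
// Minimal sketch, assuming ivars cmmm, _backgroundView, _backgroundAngle and a scale
// constant kPointsPerRadian -- none of these come from the original code.
- (void)startBackgroundParallax
{
    cmmm = [[CMMotionManager alloc] init];
    if (!cmmm.gyroAvailable) return;
    cmmm.gyroUpdateInterval = 1.0 / 60.0;
    [cmmm startGyroUpdatesToQueue:[NSOperationQueue mainQueue]
                      withHandler:^(CMGyroData *gyroData, NSError *error) {
        // Integrate the rotation rate (radians per second) around the device's y axis.
        // Using the nominal update interval is good enough for a sketch; real code would
        // use the sample timestamps to avoid drift.
        _backgroundAngle += gyroData.rotationRate.y * cmmm.gyroUpdateInterval;
        // Shift only the background; the icons on top stay put, which gives the 3D parallax look.
        CGFloat offset = _backgroundAngle * kPointsPerRadian;
        _backgroundView.center = CGPointMake(self.view.bounds.size.width / 2.0 + offset,
                                             _backgroundView.center.y);
    }];
}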
How can I get the camera's line-of-sight angle (relative to north, in degrees, zero being north, 180 being south) when a picture is taken using an iPhone (and not the direction the device is moving)?
For example, I am walking towards north (0 degrees) and taking a picture to my right (which is 90 degrees), and I want that 90 degrees as my result, because that is the iPhone's line-of-sight angle relative to north.
A somewhat accurate result within ±20 degrees is fine.
You can use the gyroscope API in iOS.
See this tutorial for example: http://ios-programming.blogspot.fr/2011/11/gyroscope-accelerometer-ios-sdk.html
Use the CMMotionManager class for receiving motion data. Don't forget to include the CoreMotion framework.
#import <CoreMotion/CoreMotion.h>
- (void)startProcess
{
    CMMotionManager *motionManager = [CMMotionManager new];
    motionManager.deviceMotionUpdateInterval = .2;
    if (motionManager.deviceMotionAvailable) {
        NSOperationQueue *motionQueue = [[NSOperationQueue alloc] init];
        // Reference frame whose X axis points toward true north
        CMAttitudeReferenceFrame attitudeFrame = CMAttitudeReferenceFrameXTrueNorthZVertical;
        [motionManager startDeviceMotionUpdatesUsingReferenceFrame:attitudeFrame
                                                            toQueue:motionQueue
                                                        withHandler:^(CMDeviceMotion *newMotionData, NSError *newError) {
            CMAcceleration gravity = newMotionData.gravity;
            CGFloat rotation = [self rotationForGravity:gravity];
            // ... use the rotation value
        }];
    }
}
- (CGFloat)rotationForGravity:(CMAcceleration)customGravity
{
    UIInterfaceOrientation orientation = [UIApplication sharedApplication].statusBarOrientation;
    if (orientation == UIInterfaceOrientationLandscapeLeft)
        return customGravity.y;
    else
        return -customGravity.y;
}
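The handler above gives you a value derived from gravity, but not yet an angle relative to north. Given the asker's ±20 degree tolerance, a simpler route is the compass heading from Core Location rather than Core Motion; the sketch below is my own, not part of the answer, and assumes the class has a locationManager property and adopts CLLocationManagerDelegate. The heading is reported relative to the top of the device, which tracks the camera's line of sight reasonably well while the phone is raised to take a picture.
#import <CoreLocation/CoreLocation.h>
- (void)startHeadingUpdates
{
    self.locationManager = [[CLLocationManager alloc] init];
    self.locationManager.delegate = self;
    self.locationManager.headingFilter = 5; // only report changes of 5 degrees or more
    [self.locationManager startUpdatingHeading];
}
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    // trueHeading is relative to true north (0 = north, 90 = east); it is negative
    // when unavailable, in which case fall back to magneticHeading.
    CLLocationDirection heading = (newHeading.trueHeading >= 0) ? newHeading.trueHeading
                                                                : newHeading.magneticHeading;
    NSLog(@"Line-of-sight angle: %.0f degrees", heading);
}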
I created a category (extension) of the UIView class in order to implement the parallax effect throughout my entire project. I would advise you to do the same; it's so much easier.
#pragma mark - Handling Effects
- (void)setMotion:(CGFloat)motion
{
if (motion == 0) {
for (UIMotionEffect *motionEffect in self.motionEffects) {
[self removeMotionEffect:motionEffect];
}
}
else {
UIInterpolatingMotionEffect *horizontalEffect = [[UIInterpolatingMotionEffect alloc] initWithKeyPath:@"center.x" type:UIInterpolatingMotionEffectTypeTiltAlongHorizontalAxis];
horizontalEffect.minimumRelativeValue = @(motion);
horizontalEffect.maximumRelativeValue = @(-motion);
UIInterpolatingMotionEffect *verticalEffect = [[UIInterpolatingMotionEffect alloc] initWithKeyPath:@"center.y" type:UIInterpolatingMotionEffectTypeTiltAlongVerticalAxis];
verticalEffect.minimumRelativeValue = @(motion);
verticalEffect.maximumRelativeValue = @(-motion);
[self addMotionEffect:horizontalEffect];
[self addMotionEffect:verticalEffect];
}
}
- (CGFloat)motion
{
if (self.motionEffects.count == 0) {
return 0;
}
else {
UIInterpolatingMotionEffect *horizontalEffect = (UIInterpolatingMotionEffect *)[self.motionEffects objectAtIndex:0];
return [horizontalEffect.minimumRelativeValue floatValue];
}
}
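For completeness, the accessors above would presumably live in a UIView category declared along these lines (the file and category names are my assumption):
// UIView+Parallax.h (hypothetical name)
@interface UIView (Parallax)
// No ivar is needed; the setter and getter above use the view's motionEffects array as storage.
@property (nonatomic, assign) CGFloat motion;
@end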
Example:
view1.motion = 10;
view2.motion = 15;
view3.motion = 20;
The problem is that the effect is not quite accurate. Yes, it looks like parallax, but I don't get the strong feeling that I'm immersed in 3D scenery; the views just slide a bit to the left or a bit to the right.
Is there any way I can calculate the parallax motion (.motion) of a UIView based on its size?
I'm guessing it involves angular sizes and viewpoints, but I have no idea how those are measured...
This equation describes the size of objects relative to the distance they are placed at:
angularSize = realSize / distance;
By implementing your own initialization method:
- (instancetype)initWithRealSize:(CGSize)realSize;
You can change the size of the view by adjusting its distance.
The motion value can be calculated as follows:
horizontalMotion = angularSize.width / 10.0;
verticalMotion = angularSize.height / 10.0;
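Putting those pieces together, a minimal sketch of what such an initializer could look like (the initWithRealSize:distance: signature, the divisor of 10.0, and the ParallaxItemView name are illustrative assumptions, built on the motion category from the question):
#import "UIView+Parallax.h" // hypothetical category providing the motion property
@interface ParallaxItemView : UIView
- (instancetype)initWithRealSize:(CGSize)realSize distance:(CGFloat)distance;
@end
@implementation ParallaxItemView
- (instancetype)initWithRealSize:(CGSize)realSize distance:(CGFloat)distance
{
    // Farther objects appear smaller: angularSize = realSize / distance
    CGSize angularSize = CGSizeMake(realSize.width / distance, realSize.height / distance);
    self = [super initWithFrame:CGRectMake(0, 0, angularSize.width, angularSize.height)];
    if (self) {
        // ...and they should also move less, so tie the motion amplitude to the same angular size.
        self.motion = angularSize.width / 10.0;
    }
    return self;
}
@end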
I am using this code to implement infinite looping, but I get gaps for 1-2 seconds every time the offscreen image's coordinates are changed. Why do they appear? How can I fix it? I am also using SpriteBuilder.
#import "MainScene.h"
static const CGFloat scrollSpeed =100.f;
@implementation MainScene {
CCPhysicsNode *_world;
CCNode *_oneb;
CCNode *_twob;
NSArray *_bb;
}
- (void)didLoadFromCCB {
_bb = @[_oneb, _twob];
}
-(void)update:(CCTime)delta{
_world.position=ccp(_world.position.x - (scrollSpeed * delta), _world.position.y ); // moving world
for (CCNode *ground in _bb) {
// get the world position of the ground
CGPoint groundWorldPosition = [_world convertToWorldSpace:ground.position];
// get the screen position of the ground
CGPoint groundScreenPosition = [self convertToNodeSpace:groundWorldPosition];
// if the left corner is one complete width off the screen, move it to the right
if (groundScreenPosition.x <= (-1 * ground.contentSize.width)) {
ground.position = ccp(ground.position.x + 2 * ground.contentSize.width, ground.position.y);
}
}
}
@end
EDIT: I changed -1 to -0.5. Works fine!
It seems like you are using a small image for the 3.5-inch iPhone on a 4-inch iPhone simulator. What is the resolution of your background image?
EDIT: In my game I have an infinite loop, too. Maybe my code will help you. The first background sprite should be 1137x640 and the second 1136x640, and you will never have gaps again. Hope it helps.
init method:
backgroundSprite = [CCSprite spriteWithFile:@"background.png"];
backgroundSprite.anchorPoint = ccp(0,0);
backgroundSprite.position = ccp(0,0);
[self addChild:backgroundSprite z:0];
backgroundSprite2 = [CCSprite spriteWithFile:@"background2.png"];
backgroundSprite2.anchorPoint = ccp(0,0);
backgroundSprite2.position = ccp([backgroundSprite boundingBox].size.width,0);
[self addChild:backgroundSprite2 z:0];
tick method:
backgroundSprite.position = ccp(backgroundSprite.position.x-1,backgroundSprite.position.y);
backgroundSprite2.position = ccp(backgroundSprite2.position.x-1,backgroundSprite2.position.y);
if (backgroundSprite.position.x<-[backgroundSprite boundingBox].size.width) {
backgroundSprite.position = ccp(backgroundSprite2.position.x+[backgroundSprite2 boundingBox].size.width,backgroundSprite.position.y);
}
if (backgroundSprite2.position.x<-[backgroundSprite2 boundingBox].size.width) {
backgroundSprite2.position = ccp(backgroundSprite.position.x+[backgroundSprite boundingBox].size.width,backgroundSprite2.position.y);
}
I'm trying to develop an app with an "Around Me"-like feature: a list of locations with small directional arrows on the side.
Bearing and offset to the different locations weren't a problem thanks to Stack Overflow, and compensating for the compass lag worked well with the following tutorial:
http://www.sundh.com/blog/2011/09/stabalize-compass-of-iphone-with-gyroscope/
Everything works fine with only one location in the UITableView.
But when there is more than one location, the arrows don't turn smoothly; it feels like my iPhone isn't fast enough to do the calculations and turn all these arrows, but I don't know how to do it better.
At the moment I'm doing this (without the location-specific directional offset):
I save the UIImageViews of all the cells in an array.
When a new yaw value arrives, I loop through the array and update each image's rotation.
if(motionManager.isDeviceMotionAvailable) {
// Listen to events from the motionManager
motionHandler = ^ (CMDeviceMotion *motion, NSError *error) {
CMAttitude *currentAttitude = motion.attitude;
float yawValue = currentAttitude.yaw; // Use the yaw value
// Yaw values are in radians (-π to π); here we convert to degrees
float yawDegrees = CC_RADIANS_TO_DEGREES(yawValue);
currentYaw = yawDegrees;
// We add new compass value together with new yaw value
yawDegrees = newCompassTarget + (yawDegrees - offsetG);
// Degrees should always be positive
if(yawDegrees < 0) {
yawDegrees = yawDegrees + 360;
}
compassDif.text = [NSString stringWithFormat:@"Gyro: %f",yawDegrees]; // Debug
float gyroDegrees = (yawDegrees*radianConst);
// If there is a new compass value the gyro graphic animates to this position
if(updateCompass) {
[self setRotateArrow:gyroDegrees animated:YES];
[UIView commitAnimations];
updateCompass = 0;
} else {
[self setRotateArrow:gyroDegrees animated:NO];
[UIView commitAnimations];
}
};
// Register the handler so the block above actually receives motion updates
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:motionHandler];
}
and the setRotateArrow:animated method:
- (void) setRotateArrow:(float)degrees animated:(BOOL)animated{
UIImage *arrowImage = [UIImage imageNamed:@"DirectionArrow.png"];
for (int i = 0; i<arrowImageViews.count; i++) {
[(UIImageView *)[arrowImageViews objectAtIndex:i] setImage:arrowImage];
CGFloat arrowTransform = degrees;
//Rotate the Arrow
CGAffineTransform rotate = CGAffineTransformMakeRotation(arrowTransform);
[(UIImageView *)[arrowImageViews objectAtIndex:i] setTransform:rotate];
}
}
If anyone has an idea how to make the arrows' rotation follow the device rotation smoothly, I would be very thankful.
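One thing that stands out in setRotateArrow:animated: is that it re-creates the UIImage and re-assigns it to every image view on every motion update, which is a lot of work per sample. A hedged sketch of a lighter variant (assuming the arrow image is assigned to each UIImageView once, when the cell is configured, so that only the transform changes per update):
- (void)setRotateArrow:(float)radians animated:(BOOL)animated
{
    CGAffineTransform rotate = CGAffineTransformMakeRotation(radians);
    void (^apply)(void) = ^{
        for (UIImageView *arrowView in arrowImageViews) {
            arrowView.transform = rotate;
        }
    };
    if (animated) {
        // A short animation smooths the jump between successive yaw samples.
        [UIView animateWithDuration:0.1 animations:apply];
    } else {
        apply();
    }
}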
In Android, the API provides the field of view angle:
Camera.Parameters.getHorizontalViewAngle()
Camera.Parameters.getVerticalViewAngle()
What's the equivalent in iOS?
I don't want to hard-code those values because that's not flexible.
I'm not entirely sure what "horizontal" and "vertical" mean in this context, but I think of two calculations: the rotation about the "z" axis (i.e. how level we are with the horizon in the photo), and how much the device is tilted forward or backward (i.e. the rotation about the "x" axis, namely whether it is pointing up or down). You can do this using Core Motion. Just add the framework to your project and then you can do something like:
Make sure to import CoreMotion header:
#import <CoreMotion/CoreMotion.h>
Define a few class properties:
#property (nonatomic, strong) CMMotionManager *motionManager;
#property (nonatomic, strong) NSOperationQueue *deviceQueue;
Start the motion manager:
- (void)startMotionManager
{
self.deviceQueue = [[NSOperationQueue alloc] init];
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 5.0 / 60.0;
[self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical
toQueue:self.deviceQueue
withHandler:^(CMDeviceMotion *motion, NSError *error)
{
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
CGFloat x = motion.gravity.x;
CGFloat y = motion.gravity.y;
CGFloat z = motion.gravity.z;
// how much is it rotated around the z axis
CGFloat rotationAngle = atan2(y, x) + M_PI_2; // in radians
CGFloat rotationAngleDegrees = rotationAngle * 180.0f / M_PI; // in degrees
// how far is it tilted forward and backward
CGFloat r = sqrtf(x*x + y*y + z*z);
CGFloat tiltAngle = (r == 0.0 ? 0.0 : acosf(z/r)); // in radians
CGFloat tiltAngleDegrees = tiltAngle * 180.0f / M_PI - 90.0f; // in degrees
}];
}];
}
When done, stop the motion manager:
- (void)stopMotionManager
{
[self.motionManager stopDeviceMotionUpdates];
self.motionManager = nil;
self.deviceQueue = nil;
}
I'm not doing anything with the values here, but you can save them in class properties which you can then access elsewhere in your app. Or you could dispatch UI updates back to the main queue right from here. A bunch of options.
Since this is iOS 5 and higher, if the app supports earlier versions you might also want to weakly link Core Motion and then check that everything is OK; if not, just accept that you're not going to be capturing the orientation of the device:
if ([CMMotionManager class])
{
// ok, core motion exists
}
And, in case you're wondering about my fairly arbitrary choice of twelve times per second, in the Event Handling Guide for iOS, they suggest 10-20/second if just checking the orientation of the device.
In iOS 7.0+, you can obtain the FOV angle of a camera by reading this property:
https://developer.apple.com/documentation/avfoundation/avcapturedeviceformat/1624569-videofieldofview?language=objc
AVCaptureDevice *camera;
camera = ...
float fov = [[camera activeFormat] videoFieldOfView];
NSLog(@"FOV=%f(deg)", fov);
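Note that videoFieldOfView is the horizontal angle. If you also need the vertical angle (the counterpart of Android's getVerticalViewAngle()), one hedged way, sketched below, is to derive it from the active format's pixel dimensions, assuming square pixels:
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
// Sketch: derive the vertical FOV from the horizontal FOV and the format's aspect ratio.
static float VerticalFieldOfView(AVCaptureDevice *camera)
{
    AVCaptureDeviceFormat *format = camera.activeFormat;
    float hFovRadians = format.videoFieldOfView * M_PI / 180.0f;
    CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
    float aspect = (float)dims.height / (float)dims.width; // e.g. 1080 / 1920
    float vFovRadians = 2.0f * atanf(tanf(hFovRadians / 2.0f) * aspect);
    return vFovRadians * 180.0f / M_PI; // back to degrees
}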