With UIInterpolatingMotionEffect, you twist the iPhone and an image moves.
Now imagine a red block that "bounces around" on screen, using UICollisionBehavior and UIDynamicItemBehavior. When the user twists the iPhone, I want the boxes to start moving with a similar physics feel to UIInterpolatingMotionEffect.
http://tinypic.com/player.php?v=b67mlc%3E&s=8#.VBVEq0uZNFx
Aside, a UX explanation: the bouncy effect (example: SMS on iPhone) has the same "feel" as the parallax image effect in iOS. (By "feel" I really just mean the same speed and acceleration.) This would be a third effect: like parallax it would "move things slightly", but they would "keep moving", bouncing a little. (You could say it somewhat combines the feel of the bouncy-lists effect and the parallax-images effect.)
Now: it's relatively easy to do what I describe, using CMAccelerometerData, and applying pushes using UIPushBehaviorModeInstantaneous. But it's a lot of messy code.
In contrast, UIInterpolatingMotionEffect is ridiculously easy to use.
Essentially: how can I get the values from UIInterpolatingMotionEffect (which I will then use as pushes)? Cheers!
A similar thought: simply display the values of UIInterpolatingMotionEffect?
That question asks simply how one can easily "just get" the values from UIInterpolatingMotionEffect; i.e., it seems incredible that one has to go to the effort of carefully subclassing CALayer, etc.
It's an interesting notion, updating some behavior via UIInterpolatingMotionEffect, though I suspect it's not designed for that. If you want to update behaviors based upon accelerometer information, I personally would have thought that CMMotionManager is ideal for that purpose.
The desired UX isn't entirely clear from the video clip, but it looks like the video shows things continuing to slide in the direction the phone is tilted until you stop tilting the phone. If that's what you want, I'd be inclined to marry CMMotionManager with a UIKit Dynamics UIGravityBehavior:
self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:container];

UICollisionBehavior *collision = [[UICollisionBehavior alloc] initWithItems:container.subviews];
collision.translatesReferenceBoundsIntoBoundary = YES;
[self.animator addBehavior:collision];

UIGravityBehavior *gravity = [[UIGravityBehavior alloc] initWithItems:container.subviews];
gravity.gravityDirection = CGVectorMake(0, 0);
[self.animator addBehavior:gravity];

self.motionManager = [[CMMotionManager alloc] init];

typeof(self) __weak weakSelf = self;

[self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMDeviceMotion *motion, NSError *error) {
    if (weakSelf.referenceAttitude == nil) {
        weakSelf.referenceAttitude = motion.attitude;
    } else {
        CMAttitude *attitude = motion.attitude;
        [attitude multiplyByInverseOfAttitude:weakSelf.referenceAttitude];
        gravity.gravityDirection = CGVectorMake(attitude.roll * 5.0, attitude.pitch * 5.0);
    }
}];
If you want them to move precisely in accordance with the attitude, stopping the movement when you stop moving the device (like UIInterpolatingMotionEffect does), you could use a UIAttachmentBehavior, something like:
UIAttachmentBehavior *attachment = [[UIAttachmentBehavior alloc] initWithItem:viewToAttachTo attachedToAnchor:viewToAttachTo.center];
[self.animator addBehavior:attachment];

self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 1.0 / 20.0;

typeof(self) __weak weakSelf = self;
CGPoint originalAnchorPoint = viewToAttachTo.center;

[self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMDeviceMotion *motion, NSError *error) {
    if (weakSelf.referenceAttitude == nil) {
        weakSelf.referenceAttitude = motion.attitude;
    } else {
        CMAttitude *attitude = motion.attitude;
        [attitude multiplyByInverseOfAttitude:weakSelf.referenceAttitude];
        attachment.anchorPoint = CGPointMake(originalAnchorPoint.x + attitude.roll * 10.0, originalAnchorPoint.y + attitude.pitch * 10.0);
    }
}];
Note, in both of those examples, I'm applying the adjustments to the behaviors as the device shifts from its orientation when the user started the app. Thus I capture a CMAttitude property, referenceAttitude, holding the attitude of the device when the user fired up the app, and then use multiplyByInverseOfAttitude on subsequent updates to apply gravity relative to that original orientation. You could, obviously, use a predetermined attitude if you wanted, too.
But hopefully the above illustrates one approach to tackling this sort of UX.
Related
Let's say we have an SCNNode and we want to rotate it, change its flight direction, and move it in space with no gravity. The node currently is just a camera. It should react to gyroscope and accelerometer data. I think the scene update can be done in - (void)renderer:(id <SCNSceneRenderer>)aRenderer didSimulatePhysicsAtTime:(NSTimeInterval)time.
The problem I'm facing is how to utilise the gyroscope and accelerometer data to calculate the flight direction, and how to calculate the rotation direction and movement. I refreshed my memory of the 3D rotation matrix for each axis (x, y, z), but I'm still missing the part mentioned above. Maybe I'm overcomplicating this and the solution is very easy.
- (void)viewDidLoad
{
    SCNView *scnView = (SCNView *) self.view;
    SCNScene *scene = [SCNScene scene];
    scnView.scene = scene;

    _cameraNode = [[SCNNode alloc] init];
    _cameraNode.camera = [SCNCamera camera];
    _cameraNode.camera.zFar = 500;
    _cameraNode.position = SCNVector3Make(0, 60, 50);
    _cameraNode.rotation = SCNVector4Make(1, 0, 0, -M_PI_4*0.75);
    [scene.rootNode addChildNode:_cameraNode];
    scnView.pointOfView = _cameraNode;

    [self setupAccelerometer];
}

- (void)setupAccelerometer
{
    _motionManager = [[CMMotionManager alloc] init];
    GameViewController * __weak weakSelf = self;

    if ([[GCController controllers] count] == 0 && [_motionManager isAccelerometerAvailable] == YES) {
        [_motionManager setAccelerometerUpdateInterval:1/60.0];
        [_motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
            [weakSelf accelerometerDidChange:accelerometerData.acceleration];
        }];
    }
}

- (void)accelerometerDidChange:(CMAcceleration)acceleration
{
    // acceleration.x
    // acceleration.y
    // acceleration.z
    // now we have the data, saved somewhere
}
If you want to combine gyroscope and accelerometer data to get an orientation, you'd probably be better off letting CoreMotion do that for you. The methods listed under Managing Device Motion Updates in the docs get you CMDeviceMotion objects; from one of those you can get a CMAttitude that represents an orientation, and from that you can get a rotation matrix or quaternion that you can apply to a node in your scene. (Possibly after further transforming it so that a neutral orientation fits into your scene the way you want.)
After you've oriented the node the way you want, getting a flight direction from that should be pretty simple:
Choose a flight direction in terms of the local coordinate space of the node and make a vector that points in that direction; e.g. a camera looks in the -z direction of its node, so if you want the camera node to look along its flight path you'll use a vector like {0, 0, -1}.
Convert that vector to scene space with convertPosition:toNode: and the scene's rootNode. This gets you the flight direction in terms of the scene, taking the node's orientation into account.
Use the converted vector to move the node, either by assigning it as the velocity for the node's physics body or by using it to come up with a new position and moving the node there with an action or animation.
I'd like to make it so that when the user shakes the device, the ball rattles around inside the object on screen. I'm assuming I need to set up an invisible box for it to collide with. It doesn't matter if it moves randomly or follows a predefined path, whichever is easiest.
I think I understand the "activate on shake" part of the code; it's just the ball/object movement I'm not sure of.
This should work:
// You need a @property (nonatomic, strong) UIDynamicAnimator *animator; in your .h
self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.viewToBounceAroundIn];

UICollisionBehavior *collision = [[UICollisionBehavior alloc] initWithItems:@[self.viewThatBouncesAround]];
collision.translatesReferenceBoundsIntoBoundary = YES;
[self.animator addBehavior:collision];

UIPushBehavior *push = [[UIPushBehavior alloc] initWithItems:@[self.viewThatBouncesAround] mode:UIPushBehaviorModeInstantaneous];
push.magnitude = 1; // Play with this; it's how much force is applied to your object
push.angle = 0;     // Play with this too
[self.animator addBehavior:push];
I typed this away from a compiler, so let me know if it works. The idea is that you use UIKit Dynamics as a physics engine: a UICollisionBehavior lets the item bounce around inside the box, and a UIPushBehavior applies the initial force.
If the item slows down too quickly for you, or loses too much energy when it bounces off walls, you can adjust its properties:
UIDynamicItemBehavior *behavior = [[UIDynamicItemBehavior alloc] initWithItems:@[self.itemThatBouncesAround]];
behavior.friction = 0;   // no friction; play with this
behavior.elasticity = 1; // completely elastic; play with this
[self.animator addBehavior:behavior];
I have created a simple .sks particle file in my project. I would like to know (a) how I can display this particle effect in my view, and (b) how I can add the appropriate parameters so that the particles travel in the direction the device is being held (very similar to the iOS 7 dynamic wallpapers, for example). In my case, if I have stones falling straight down and the device is tilted to the right, the stones should start falling at a different angle. I'd really appreciate some advice.
I was actually thinking about applying this effect to my own game as well. This is a crude answer and I haven't actually tried combining these two elements, but it should definitely give you some pointers.
@property (strong, nonatomic) SKEmitterNode *starEmitter;
@property (strong, nonatomic) CMMotionManager *motionManager;

// Create your emitter
NSString *starPath = [[NSBundle mainBundle] pathForResource:@"startButtonDisappear" ofType:@"sks"];
self.starEmitter = [NSKeyedUnarchiver unarchiveObjectWithFile:starPath];

// Create a motion manager
self.motionManager = [CMMotionManager new];
[self.motionManager startAccelerometerUpdates];
In your update method, calculate the gravity vector and assign it to the acceleration properties of the emitter.
- (void)update:(NSTimeInterval)currentTime {
    CMAccelerometerData *data = self.motionManager.accelerometerData;

    // You really don't want NSLogs in this method in a release build, by the way
    NSLog(@"x [%2.1f] y [%2.1f] z [%2.1f]", data.acceleration.x, data.acceleration.y, data.acceleration.z);

    CGVector gravity;
    if ([[UIApplication sharedApplication] statusBarOrientation] == UIInterfaceOrientationLandscapeLeft) {
        gravity = CGVectorMake(data.acceleration.x, data.acceleration.y);
    } else {
        gravity = CGVectorMake(-data.acceleration.x, -data.acceleration.y);
    }

    CGFloat tmp = gravity.dy;
    gravity.dy = gravity.dx;
    gravity.dx = tmp;

    // self.physicsWorld.gravity = gravity;
    self.starEmitter.xAcceleration = gravity.dx;
    self.starEmitter.yAcceleration = gravity.dy;
    NSLog(@"Emitter acceleration set to [%+4.2f,%+4.2f]", gravity.dx, gravity.dy);
}
In Android, the API provides the field of view angle:
Camera.Parameters.getHorizontalViewAngle()
Camera.Parameters.getVerticalViewAngle()
What's the equivalent in iOS?
I don't want to pre-write those values because it's not flexible.
I'm not entirely sure what "horizontal" and "vertical" mean in this context, but I think of two calculations: the rotation about the z axis (i.e., how level we are with the horizon in the photo), and how much the device is tilted forward or backward (i.e., the rotation about the x axis: is it pointing up or down). You can do this using Core Motion. Just add it to your project and then you can do something like the following.
Make sure to import CoreMotion header:
#import <CoreMotion/CoreMotion.h>
Define a few class properties:
@property (nonatomic, strong) CMMotionManager *motionManager;
@property (nonatomic, strong) NSOperationQueue *deviceQueue;
Start the motion manager:
- (void)startMotionManager
{
    self.deviceQueue = [[NSOperationQueue alloc] init];
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 5.0 / 60.0;

    [self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical
                                                            toQueue:self.deviceQueue
                                                        withHandler:^(CMDeviceMotion *motion, NSError *error)
    {
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            CGFloat x = motion.gravity.x;
            CGFloat y = motion.gravity.y;
            CGFloat z = motion.gravity.z;

            // how much it is rotated around the z axis
            CGFloat rotationAngle = atan2(y, x) + M_PI_2;                  // in radians
            CGFloat rotationAngleDegrees = rotationAngle * 180.0f / M_PI;  // in degrees

            // how far it is tilted forward or backward
            CGFloat r = sqrtf(x*x + y*y + z*z);
            CGFloat tiltAngle = (r == 0.0 ? 0.0 : acosf(z / r));           // in radians
            CGFloat tiltAngleDegrees = tiltAngle * 180.0f / M_PI - 90.0f;  // in degrees
        }];
    }];
}
When done, stop the motion manager:
- (void)stopMotionManager
{
    [self.motionManager stopDeviceMotionUpdates];
    self.motionManager = nil;
    self.deviceQueue = nil;
}
I'm not doing anything with the values here, but you can save them in class properties which you can then access elsewhere in your app. Or you could dispatch UI updates back to the main queue right from here. A bunch of options.
Since this is iOS 5 and higher, if the app supports earlier versions you might also want to weakly link Core Motion and then check that everything is OK; if not, just accept that you won't be capturing the orientation of the device:
if ([CMMotionManager class])
{
    // ok, Core Motion exists
}
And, in case you're wondering about my fairly arbitrary choice of twelve times per second, in the Event Handling Guide for iOS, they suggest 10-20/second if just checking the orientation of the device.
In iOS 7.0+, you can obtain the FOV angle of a camera by reading the videoFieldOfView property:
https://developer.apple.com/documentation/avfoundation/avcapturedeviceformat/1624569-videofieldofview?language=objc
AVCaptureDevice *camera = ...;
float fov = [[camera activeFormat] videoFieldOfView];
NSLog(@"FOV=%f (deg)", fov);
I would like to know how many degrees the iPhone is leaning when held upright; that is, if I point the iPhone up, I would like to know how many degrees it is looking up at. I think I can use the gyroscope and Core Motion for this, and I think pitch is what I want; however, the gyroscope's pitch is a rate of rotation in radians per second. I am not really interested in how quickly the user moved the iPhone, just in the leaning angle. So if I am pointing the phone up to take a picture, I would like to know the angle it is pointing up at. Any ideas??
Thank you.
Gyro data is in radians per second, but what you are looking for is the CMDeviceMotion.attitude property. It gives the attitude of the device in radians relative to some frame of reference.
Create a class variable motionManager and initialize it:
motionManager = [[CMMotionManager alloc] init];
motionManager.deviceMotionUpdateInterval = 0.1f;

[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue] withHandler:^(CMDeviceMotion *motion, NSError *error) {
    [self processMotion:motion];
}];
Process the updates. You are looking for pitch, but this sample shows all three values so you can play around and decide what you need:
- (void)processMotion:(CMDeviceMotion *)motion {
    NSLog(@"Roll: %.2f Pitch: %.2f Yaw: %.2f", motion.attitude.roll, motion.attitude.pitch, motion.attitude.yaw);
}
These are Euler angles; you also have the option of getting a rotationMatrix or a quaternion, each with its own advantages and disadvantages.