Let's say we have an SCNNode that we want to rotate, change flight direction with, and move through space with no gravity. The node is currently just a camera, and it should react to gyroscope and accelerometer data. I think the scene update can be done in - (void)renderer:(id <SCNSceneRenderer>)aRenderer didSimulatePhysicsAtTime:(NSTimeInterval)time.
The problem I'm facing is how to use the gyroscope and accelerometer data to calculate the flight direction, and how to calculate the rotation direction and movement. I refreshed my memory on the 3D rotation matrix for each axis (x, y, z), but I'm still missing the part mentioned above. Maybe I'm overcomplicating it and the solution is simple.
- (void)viewDidLoad
{
    [super viewDidLoad];

    SCNView *scnView = (SCNView *)self.view;
    SCNScene *scene = [SCNScene scene];
    scnView.scene = scene;

    _cameraNode = [[SCNNode alloc] init];
    _cameraNode.camera = [SCNCamera camera];
    _cameraNode.camera.zFar = 500;
    _cameraNode.position = SCNVector3Make(0, 60, 50);
    _cameraNode.rotation = SCNVector4Make(1, 0, 0, -M_PI_4 * 0.75);
    [scene.rootNode addChildNode:_cameraNode];
    scnView.pointOfView = _cameraNode;

    [self setupAccelerometer];
}
- (void)setupAccelerometer
{
    _motionManager = [[CMMotionManager alloc] init];
    GameViewController * __weak weakSelf = self;
    if ([[GCController controllers] count] == 0 && [_motionManager isAccelerometerAvailable]) {
        [_motionManager setAccelerometerUpdateInterval:1.0 / 60.0];
        [_motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                             withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
            [weakSelf accelerometerDidChange:accelerometerData.acceleration];
        }];
    }
}
- (void)accelerometerDidChange:(CMAcceleration)acceleration
{
    // acceleration.x, acceleration.y, acceleration.z
    // now we have the data; save it somewhere for the render loop
}
If you want to combine gyroscope and accelerometer data to get an orientation, you'd probably be better off letting CoreMotion do that for you. The methods listed under Managing Device Motion Updates in the docs get you CMDeviceMotion objects; from one of those you can get a CMAttitude that represents an orientation, and from that you can get a rotation matrix or quaternion that you can apply to a node in your scene. (Possibly after further transforming it so that a neutral orientation fits into your scene the way you want.)
After you've oriented the node the way you want, getting a flight direction from that should be pretty simple:
Choose a flight direction in terms of the local coordinate space of the node and make a vector that points in that direction; e.g. a camera looks in the -z direction of its node, so if you want the camera node to look along its flight path you'll use a vector like {0, 0, -1}.
Convert that vector to scene space with convertPosition:toNode: and the scene's rootNode. This gets you the flight direction in terms of the scene, taking the node's orientation into account.
Use the converted vector to move the node, either by assigning it as the velocity for the node's physics body or by using it to come up with a new position and moving the node there with an action or animation.
I downloaded the sample code from https://github.com/yshrkt/VuforiaSampleSwift to implement Vuforia with Swift. It didn't work out of the box with the latest SDK, but I have successfully loaded the xml/dat dataSet, and when I point the camera at the target, the method ViewController.createStonesScene(with view: VuforiaEAGLView) -> SCNScene is called. Unfortunately, I don't see anything in the camera preview. There should be a plane with a background color, but nothing is added. Can someone help me resolve this? I want to display a simple plane for a start. My project is at: https://www.dropbox.com/s/fk71oay1sopc1vp/test.zip?dl=1
Please update the Vuforia license key in AppDelegate; I can't provide it as I'm not the owner of this application.
I've gotten to the point where I know the scene is being displayed, but the camera transformations must be wrong: either _cameraNode.camera.projectionTransform or _cameraNode.transform (or both).
I added zNear = 0.01 to the camera, which was a small step forward. Now, if I apply only cameraNode.transform and leave cameraNode.camera.projectionTransform disabled, I can see a node of the correct size, but it's rotated 90 degrees, and when I move the camera it moves 90 degrees off (moving up/down moves it left/right). The relevant code is below:
- (void)setNeedsChangeSceneWithUserInfo:(NSDictionary *)userInfo {
    SCNScene *scene = [self.sceneSource sceneForEAGLView:self userInfo:userInfo];
    if (scene == nil) {
        return;
    }

    SCNCamera *camera = [SCNCamera camera];
    _cameraNode = [SCNNode node];
    _cameraNode.camera = camera;
    _cameraNode.camera.zNear = 0.01;
    // _cameraNode.camera.projectionTransform = _projectionTransform;
    [scene.rootNode addChildNode:_cameraNode];

    _renderer.scene = scene;
    _renderer.pointOfView = _cameraNode;
}
// Set the camera node's matrix
- (void)setCameraMatrix:(Vuforia::Matrix44F)matrix {
    SCNMatrix4 extrinsic = [self SCNMatrix4FromVuforiaMatrix44:matrix];
    SCNMatrix4 inverted = SCNMatrix4Invert(extrinsic);
    _cameraNode.transform = inverted;
}

- (void)setProjectionMatrix:(Vuforia::Matrix44F)matrix {
    _projectionTransform = [self SCNMatrix4FromVuforiaMatrix44:matrix];
    // _cameraNode.camera.projectionTransform = _projectionTransform;
}
So now I need to rotate the whole scene by 90 degrees. I suspect the fix belongs in _cameraNode.camera.projectionTransform = _projectionTransform;, but when I enable that line I can't see anything anymore. How do I apply a 90° rotation to this scene?
Here is a video of what I mean: https://www.dropbox.com/s/z6pwaztlfyad8fx/ScreenRecording_09-03-2018%2015-01-17.MP4?dl=0
I think the relevant code is in VuforiaManager.mm:
// Cache the projection matrix
const Vuforia::CameraCalibration& cameraCalibration = Vuforia::CameraDevice::getInstance().getCameraCalibration();
_projectionMatrix = Vuforia::Tool::getProjectionGL(cameraCalibration, 0.05f, 5000.0f);
[_eaglView setProjectionMatrix:_projectionMatrix];
Anyone know how to fix this?
The quaternion you get from CMAttitude seems flawed. In Unity I can get the iPhone's rotation, apply it to a 3D object, and the object rotates as you rotate the iPhone. In Xcode the behavior seems different.
To set up a quick test project, follow these steps:
In Xcode, create a new project from the iOS > Game template (this gives us a 3d object to test on).
In GameViewController.h add #import <CoreMotion/CoreMotion.h> and #property (strong, nonatomic) CMMotionManager *motionManager;
In GameViewController.m, remove the animation at line 46 ([ship runAction:...]) and don't allow camera control at line 55 (not necessary)
In GameViewController.m add to bottom of ViewDidLoad
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 0.1;
[self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
                                        withHandler:^(CMDeviceMotion *motion, NSError *error) {
    CMQuaternion q = motion.attitude.quaternion;
    ship.rotation = SCNVector4Make(q.x, q.y, q.z, q.w);
    if (error) { NSLog(@"%@", error); }
}];
To avoid annoyances, in Project > Targets > Project Name > Deployment Info > Device Orientation, allow only Portrait (or just lock your iPhone's rotation)
When I build this to my iPhone the 3d object (an airplane) doesn't follow the rotation of the phone. Pitching kind of works.
Is this intended? How am I using this wrong?
In 3D space there are different representations of rotations.
You are using the rotation property of SCNNode:
The four-component rotation vector specifies the direction
of the rotation axis in the first three components and the
angle of rotation (in radians) in the fourth.
But you are assigning the device rotation as quaternion.
Either assign the quaternion to the orientation property of SCNNode or use the yaw, pitch and roll angles of CMAttitude and assign these to the eulerAngles property of SCNNode:
The node’s orientation, expressed as pitch, yaw, and roll angles,
each in radians.
In my app I load models from different files (the format is the same) and they have different geometry: big, small, wide, etc. The object and camera positions are hard-coded, and in some cases I don't see anything because the camera isn't pointing at the object.
Maybe there is a way to normalise a model before adding it to the scene.
Update.
With Moustach's answer I came up with the following solution:
// import the object from a file
SCNNode *object = [importer load:path];
object.position = SCNVector3Make(0, 0, 0);
[scene.rootNode addChildNode:object];

// create and add a camera to the scene
SCNNode *cameraNode = [SCNNode node];
cameraNode.camera = [SCNCamera camera];
// to avoid view clipping
cameraNode.camera.automaticallyAdjustsZRange = YES;

// set the camera position in front of the object
SCNVector3 sphereCenter;
CGFloat sphereRadius;
[object getBoundingSphereCenter:&sphereCenter radius:&sphereRadius];
cameraNode.position = SCNVector3Make(sphereCenter.x, sphereCenter.y, sphereCenter.z + 2 * sphereRadius);
[scene.rootNode addChildNode:cameraNode];
Works well for me.
You can calculate the bounding box of your mesh and scale the node based on the size you get, so that it matches the other objects.
I am unable to detect a collision between two sprites, one of which is moving via an action.
I have a sprite of class Enemy moving across the screen via CCMoveTo, and another sprite of class Hero controlled by touch; both are added to a scene in class MainGame.
The following shows how the two sprites are added to the scene and how the Enemy is given its action:
MainGame.m
-(id) init {
    if ((self = [super init])) {
        _enemies = [[NSMutableArray alloc] init];
        _enemyLink = [Enemy nodeWithTheGame:self];
        _enemyLink.position = ccp(10, 10);
        [self.enemies addObject:_enemyLink];

        CCMoveTo *test = [CCMoveTo actionWithDuration:40 position:ccp(500, 250)];
        [_enemyLink runAction:test];

        _heroArray = [[NSMutableArray alloc] init];
        _heroLink = [Hero nodeWithTheGame:self location:ccp(100, 100)];
        [_heroArray addObject:_heroLink];

        [self scheduleUpdate];
    }
    return self; // was missing: init must return self
}
-(void)update:(ccTime)delta {
    if (CGRectIntersectsRect([self.enemyLink.enemySprite boundingBox],
                             [self.heroLink.heroSprite boundingBox])) {
        NSLog(@"rect intersects rect");
    }
    for (CCSprite *enemy in self.enemies) {
        NSLog(@"enemy position: %f, %f", enemy.position.x, enemy.position.y);
    }
}
However, I am unable to detect a collision between these two sprites during or after the action. If I move the Hero to position (0, 0) on the scene, the log in my code does trigger and the two engage in attacks.
The for loop in update shows that the Enemy sprite's position is constantly changing during the action, which is why I'm stumped that the collision is not being detected.
Have you checked that both enemy and hero positions are tested within the same coordinates space?
This is because boundingBox() is local, relative to the node's parent; if the nodes you compare do not have the same parent, the check will be invalid.
You can use convertToNodeSpace()/convertToWorldSpace() prior to checking bounding boxes.
Another answer that may be relevant to your case here: A sprite bounding box shows wrong position after moving the layer.
I have created a simple .sks particle file in my project. I'd like to know (a) how to use this particle emitter in my view, and (b) how to set the appropriate parameters so that the particles travel in the direction the device is being held, very similar to the iOS 7 dynamic wallpapers. In my case, if I have stones falling straight down and the device is tilted to the right, the stones should start falling at a different angle. I'd really appreciate some advice.
I was actually thinking about applying this effect to my own game as well. This is a crude answer and I haven't actually tried combining these two elements, but it should definitely give you some pointers.
@property (strong, nonatomic) SKEmitterNode *starEmitter;
@property (strong, nonatomic) CMMotionManager *motionManager;

// Create your emitter
NSString *starPath = [[NSBundle mainBundle] pathForResource:@"startButtonDisappear" ofType:@"sks"];
self.starEmitter = [NSKeyedUnarchiver unarchiveObjectWithFile:starPath];

// Create a motion manager and start accelerometer updates
self.motionManager = [CMMotionManager new];
[self.motionManager startAccelerometerUpdates];
In your update method, calculate the gravity vector and assign it to the emitter's acceleration properties.
- (void)update:(NSTimeInterval)currentTime {
    CMAccelerometerData *data = self.motionManager.accelerometerData;
    // You really don't want NSLogs in this method in a release build, by the way
    NSLog(@"x [%2.1f] y [%2.1f] z [%2.1f]", data.acceleration.x, data.acceleration.y, data.acceleration.z);

    CGVector gravity;
    if ([[UIApplication sharedApplication] statusBarOrientation] == UIInterfaceOrientationLandscapeLeft) {
        gravity = CGVectorMake(data.acceleration.x, data.acceleration.y);
    } else {
        gravity = CGVectorMake(-data.acceleration.x, -data.acceleration.y);
    }
    // swap the components: device x maps to scene y in landscape
    CGFloat tmp = gravity.dy;
    gravity.dy = gravity.dx;
    gravity.dx = tmp;

    // self.physicsWorld.gravity = gravity;
    self.starEmitter.xAcceleration = gravity.dx;  // was gravity.x, which doesn't exist on CGVector
    self.starEmitter.yAcceleration = gravity.dy;
    NSLog(@"Emitter acceleration set to [%+4.2f,%+4.2f]", gravity.dx, gravity.dy);
}