I have been using 3ds Max for a long time, so I know my way around XYZ axes. What blew my mind in Xcode, rotating an SCNNode, is the w component of SCNVector4.
Can someone explain in detail how to use this? I have searched a lot but I can't make my object spin as I desire. Can anyone help me spin it 180 degrees onto its back? I would also appreciate an explanation of further rotations. I have seen this link, but there were things I didn't understand:
http://www.cprogramming.com/tutorial/3d/rotationMatrices.html
I believe that you are trying to rotate nodes (the rotation property).
From the documentation:
The four-component rotation vector specifies the direction of the rotation axis in the first three components and the angle of rotation (in radians) in the fourth.
You might find it easier to use eulerAngles:
The node’s orientation, expressed as pitch, yaw, and roll angles, each in radians
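For the concrete case in the question, a 180-degree spin about the vertical axis can be written with either property. A minimal sketch (assuming `node` stands in for the node you want to flip):

```swift
import SceneKit

let node = SCNNode() // stands in for the node you want to spin

// rotation: the first three components define the axis, w is the angle in radians.
node.rotation = SCNVector4(0, 1, 0, Float.pi) // 180 degrees about the y axis

// The equivalent with eulerAngles (pitch, yaw, roll):
node.eulerAngles = SCNVector3(0, Float.pi, 0)
```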
Use .transform to rotate a node:
node.transform = SCNMatrix4Mult(node.transform, SCNMatrix4MakeRotation(angle, x, y, z))
If you want to rotate your node 180 degrees about the x axis:
node.transform = SCNMatrix4Mult(node.transform, SCNMatrix4MakeRotation(Float.pi, 1, 0, 0))
I'd like to simulate the shift of a tilt-shift/perspective-control lens in SceneKit on macOS.
Imagine the user has the camera facing a tall building at ground level, I'd like to be able to shift the 'lens' so that the projective distortion shifts (see e.g. Wikipedia).
Apple provides lots of physically-based parameters for SCNCamera (sensor height, aperture blade count), but I can't see anything obvious for this. It seems to exist in Unity.
Crucially I'd like to shift the lens so that the object stays in the same position relative to the camera. Obviously I could move the camera to get the effect, but the object needs to stay centred in the viewport (and I can't see a way to modify the viewport either). I've tried to modify the .projectionTransform matrix directly, but it was unsuccessful.
Thanks!
There is no API on SCNCamera that does that out of the box. As you guessed, one has to create a custom projection matrix and set it on the projectionTransform property.
I finally worked out the correct adjustment to the projection matrix. The maths is quite confusing to follow, because it's a 4x4 matrix rather than the 3x4 or 4x3 you'd use for a plain camera projection matrix, which makes it especially hard to work out whether it expects row vectors or column vectors.
Anyway, the correct element is .m32 for the y axis:
let camera = SCNNode()
camera.camera = SCNCamera()
let yShift: CGFloat = 1.0
camera.camera!.projectionTransform.m32 = yShift
Presumably .m31 will shift in the x axis, but I have to admit I haven't tested this.
When I thought about it a bit more, I also realised that the effect I actually wanted involves moving the camera too. Adjusting .m32 simulates moving the sensor, which will appear to move the subject relative to the camera, as if you had a wide angle lens and you were moving the crop. To keep the subject centred in frame, you need to move the camera's position too.
With a bit (a lot) of help from this blog post and in particular this code, I implemented this too:
let distance: CGFloat = 1.0 // calculate the distance from the subject here
let fovRadians = camera.camera!.fieldOfView * CGFloat.pi / 180.0
let yAdjust = tan(fovRadians / 2) * distance * yShift
camera.position = camera.position - camera.worldUp * yAdjust // SCNVector3 '-' and '*' operators come from the linked code
(any interested readers could presumably work out the x axis shift from the source above)
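As a sketch only: on the untested assumption that .m31 mirrors .m32 and the same trigonometry applies horizontally, the x-axis version might look like this (note that fieldOfView is the vertical FOV, so a non-square viewport would need an aspect-ratio correction):

```swift
import SceneKit

let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()

let xShift: CGFloat = 1.0
let distance: CGFloat = 1.0 // calculate the distance from the subject here

// Assumption: .m31 shifts the projection along x as .m32 does along y (untested).
cameraNode.camera!.projectionTransform.m31 = xShift

// Compensate the camera position so the subject stays centred, using the
// simd accessors to avoid hand-rolled SCNVector3 operators.
let fovRadians = cameraNode.camera!.fieldOfView * CGFloat.pi / 180.0
let xAdjust = tan(fovRadians / 2) * distance * xShift
cameraNode.simdPosition -= cameraNode.simdWorldRight * Float(xAdjust)
```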
I'm new to ARKit. I want to get the direction from anchor 1 to anchor 2. Currently, I can get the position from transform.columns.3. However, this works only for a fixed axis (the z axis always points toward the user).
How can I compare two anchors with respect to all six degrees of freedom (position plus pitch, yaw, roll)? What should I read to get more detailed information about this?
func showDirection(of object: ARAnchor) { // only works for a fixed axis
    if let currentFrame = sceneView.session.currentFrame {
        print("diff(x) = \(currentFrame.camera.transform.columns.3.x - object.transform.columns.3.x)")
        print("diff(y) = \(currentFrame.camera.transform.columns.3.y - object.transform.columns.3.y)")
        print("diff(z) = \(currentFrame.camera.transform.columns.3.z - object.transform.columns.3.z)")
    }
}
I think my answer to this other user's question may be helpful. Basically, using SceneKit or ARKit, you can find the orientations of the camera and of your target anchor, and do some quaternion math to find the axis and angle of the relative rotation between them on x, y and z axes. My example assumed a SceneKit/ARKit app, which allows you to use quaternions instead of matrices, but the math should essentially be the same for ARKit transforms. If you use ARKit's simd_float4x4 transform matrices, you could find one matrix in the space of the other (A.inverse * B) and use the resulting matrix to glean relative position and orientation.
Your question was a little hard to follow, as I'm not sure if the orientation of the anchor you're targeting matters in your case, but this should help as far as comparing two anchors with respect to pitch, yaw and roll.
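As a sketch of the matrix route (names assumed; a and b stand for anchorA.transform and anchorB.transform):

```swift
import simd

// Two world transforms, e.g. anchorA.transform and anchorB.transform.
let a = matrix_identity_float4x4
var b = matrix_identity_float4x4
b.columns.3 = simd_float4(0, 0, -1, 1) // put B one metre in front of A

// B expressed in A's coordinate space:
let relative = simd_mul(a.inverse, b)

// Relative position is the last column; relative orientation can be read
// from the upper-left 3x3, here as a quaternion.
let relativePosition = simd_make_float3(relative.columns.3)
let relativeOrientation = simd_quatf(relative)
```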
The effect I'm trying to achieve is to have an arrow pointed out from the camera's pointOfView position, aligned with the scene (and gravity) on the x and z axis, but pointing in the same direction as the camera. It might look something like this:
Right now, I have its x and z euler angles set to 0, and its y set to match that of ARSCNView.pointOfView.eulerAngles.y. The problem is that as I rotate the device, eulerAngles.y can end up with the same value at different headings. For example, facing the device in one direction, my euler angles are:
x: 2.52045, y: -0.300239, z: 3.12887
Facing it in another direction, the eulerAngles are:
x: -0.383826, y: -0.305686, z: -0.0239297
Even though these directions are quite far apart, the y value is still pretty much the same. The different x and z values mean the y value alone doesn't represent which direction the device is facing. As a result, my arrow follows the camera's heading up to a point, and then starts rotating back in the opposite direction. How can I zero out the x and z values in a way that gives me a truthful y value, which I can then use to orient my arrow?
Do not use euler angles to detect 'real' or 'truthful' values.
Those values are always correct. To work with rotation you have to use either matrices or quaternions.
Remember: it is possible to define many different sets of 'euler angles' from a single rotation matrix.
Each euler angle is relative to the previous one.
SceneKit applies these rotations in the reverse order of the
components:
1. first roll
2. then yaw
3. then pitch
So, to calculate the value you want, I would suggest computing the direction vector and its projection onto the horizontal (XZ) plane; the heading is the angle of that projection. This angle is not a 'euler' angle.
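A sketch of that projection in SceneKit terms (assuming sceneView.pointOfView is the camera node; headingAngle is a hypothetical helper name):

```swift
import SceneKit

// Hypothetical helper: derive a gravity-aligned heading from the camera's
// forward direction instead of trusting eulerAngles.y.
func headingAngle(for pointOfView: SCNNode) -> Float {
    // worldFront is the node's -z axis expressed in world space.
    let f = pointOfView.worldFront
    // Project onto the horizontal (XZ) plane and take the angle about y.
    return atan2(-f.x, -f.z)
}

// Usage, e.g. in the render loop:
// arrowNode.eulerAngles = SCNVector3(0, headingAngle(for: sceneView.pointOfView!), 0)
```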
I have a problem that has been puzzling me for the last few days. I have a camera pose obtained with OpenCV that is right-handed (X-right, Y-up, Z-back), and I would like to visualize it in Unity (X-right, Y-up, Z-forward), but I cannot manage to get it right.
I tried using either quaternions or matrices. It should just be a matter of mirroring the Z axis and setting the rotation of the camera's transform in Unity to the computed transformation, but I cannot get the conversion right.
With quaternions I tried to mirror by negating the Z and W terms, and I got a coordinate system of (X-right, Y-down, Z-forward); it makes sense, but it is not what I want to achieve. With matrices I think I should multiply my right-handed camera matrix by an identity matrix with element [2,2] set to -1, but I don't get what I want.
I am definitely missing something, probably something really stupid I forgot :)
Has anybody a suggestion?
A quaternion can be thought of as a rotation around an axis a = (ax, ay, az) by an angle theta:
qx = ax * sin(theta/2)
qy = ay * sin(theta/2)
qz = az * sin(theta/2)
qw = cos(theta/2)
In a right-handed coordinate system a rotation of theta will be counter-clockwise, while in a left-handed coordinate system a rotation of theta will be clockwise (depending on your point-of-view, of course).
So to get your quaternion from a right-handed system to Unity's Left-Handed system you have to account for two factors:
The Z-Axis is negated
The direction of rotation is flipped from CCW to CW
The first factor is accounted for by negating the qz component of the quaternion. The second factor is accounted for by flipping the axis of rotation (rotating by 90 degrees around (1, 0, 0) is the inverse of rotating 90 degrees around (-1, 0, 0)).
If your original right-handed quaternion is q and your left-handed quaternion is q', that means you end up with:
q' = (-qx, -qy, qz, qw)
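That conversion is easy to check numerically. A minimal sketch in plain Swift (no Unity types; names are assumptions):

```swift
import Foundation

// Quaternion as (x, y, z, w); a hypothetical minimal container.
struct Quat { var x, y, z, w: Double }

// Right-handed -> Unity's left-handed convention: negate qx and qy.
func rightToLeftHanded(_ q: Quat) -> Quat {
    Quat(x: -q.x, y: -q.y, z: q.z, w: q.w)
}

// Example: a 90-degree rotation about the x axis in the right-handed system.
let theta = Double.pi / 2
let q = Quat(x: sin(theta / 2), y: 0, z: 0, w: cos(theta / 2))
let qPrime = rightToLeftHanded(q)
// qPrime encodes the same physical rotation, now expressed as 90 degrees
// about (-1, 0, 0), i.e. with the direction of rotation flipped.
print(qPrime.x, qPrime.y, qPrime.z, qPrime.w)
```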
Additional Note
Quaternions don't inherently have a handedness. A quaternion q applies equally well in RH or LH coordinate systems. However, when you apply the quaternion to a spatial vector, the resulting transformation takes on the handedness of the vector's space.
I have done a tiny bit of 3D graphics in the past. When you move or rotate a SceneKit sprite, does it automatically update its translation matrix, or do you have to make it yourself?
Are "position" and "eulerAngles" both properties that are... absolute.
For example if I am in sprite kit and set the translation to (1, 0) it will be at that point relative to the origin.
And if I set the z rotation to 90 it will be rotated 90 degrees.
And if I increment the translation x (with +=), it will start going in a line.
And the same for zRotation: if incremented, it will rotate. In SceneKit, if I do similar things to the translation and euler angle values, will they do the same thing?
Also, what exactly does the accelerometer think it's measuring? Is it like the amount of motion in a certain period? So basically, is it the delta between two successive positions of the device?
Yes, this question is definitely broad; however, these are much better placed together here than scattered across three tiny posts.
So, let me see if I can help.
Translation matrix? It has a transform matrix that includes translation, scale and rotation, and yes, it is automatically updated when you change any of the three, and vice versa.
If I understood correctly, yes, just like in SpriteKit. They are relative to the parent's coordinates. A position of (1, 0, 0) means the node (its center, unless you change its pivot, the anchorPoint in SpriteKit) will be at distance 1 from its parent's origin along the parent's X axis.
The same works for rotation: if NodeA has a 30-degree rotation about the X axis and you add a NodeB with a 20-degree rotation about X inside NodeA, NodeB will visually have a 50-degree rotation about X.
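The composition described above can be sketched directly (angle values assumed from the example):

```swift
import SceneKit

let nodeA = SCNNode()
nodeA.eulerAngles.x = 30 * .pi / 180 // 30 degrees about x

let nodeB = SCNNode()
nodeB.eulerAngles.x = 20 * .pi / 180 // 20 degrees about x, relative to nodeA
nodeA.addChildNode(nodeB)

// nodeB's world orientation is the composition: visually 50 degrees about x.
```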
The accelerometer measures the acceleration forces on the device at a specific moment, along the device's three axes. Its unit is not m/s^2 but g, the gravitational acceleration (approximately 9.8 m/s^2). An important detail is that this measurement includes the acceleration due to gravity as well.
So, if you measure the acceleration with the device standing upright (orthogonal to the ground), you would expect roughly (0, -1, 0) (or (0, 1, 0) if upside down).
Lying flat on the ground, it would be about (0, 0, -1 or 1) (depending on whether the screen faces the ceiling or the ground).
For every tick (at the accelerometer's update rate), it reports the acceleration imposed on the device at that moment. That's not a delta itself, but deltas can easily be calculated if you store successive values.
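A sketch of that bookkeeping with Core Motion (the update interval and queue choice are assumptions):

```swift
import CoreMotion

let motionManager = CMMotionManager()
var previous: CMAcceleration?

motionManager.accelerometerUpdateInterval = 1.0 / 60.0
motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let current = data?.acceleration else { return }
    if let p = previous {
        // The delta between two successive samples, in g.
        print("delta:", current.x - p.x, current.y - p.y, current.z - p.z)
    }
    previous = current
}
```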