DirectX: having a sphere orbit a point at a given distance, over time

I'm trying to make a basic model of the solar system in DirectX. I want to have the planets orbit the sun and the moons orbit the planets. So I have a planet/star/moon class which takes a pointer to its parent's position vector. From that, how would I be able to make the object orbit at a set speed?
So for example the moon would have a pointer to the earth's position vector and a distance to stay from the earth. I'd need the moon to orbit the earth over 24 hours (or any time frame, I'm sure I would be able to adjust that myself). Similarly the earth would have a pointer to the sun's position and would rotate around that over 365 days.
I just don't know how to work out the orbital position.

Try this thread for ideas:
Making an object orbit a fixed point in directx?
As an aside: be warned that graphics hardware uses single-precision floats. Depending on the scales involved, you may find you run out of precision quite quickly on something the scale of the solar system. If this happens, don't forget you can sort object groups by Z and then render them at a large distance but with full, local Z-buffer precision. You then need to clear the Z-buffer and draw the next local group forward.
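For the orbital position itself, the usual approach is to treat the orbit as a parametric circle around the parent's current position. Here is a minimal sketch in plain C# (the class layout and names like OrbitingBody, OrbitRadius, and OrbitPeriod are illustrative, not from the question; System.Numerics.Vector3 stands in for whatever vector type your DirectX wrapper provides):

    using System;
    using System.Numerics;

    class OrbitingBody
    {
        public Vector3 Position;
        public OrbitingBody Parent;   // null for the sun
        public float OrbitRadius;     // distance to keep from the parent
        public float OrbitPeriod;     // seconds per full revolution

        // Call once per frame with the total elapsed time in seconds.
        public void UpdateOrbit(float totalSeconds)
        {
            if (Parent == null) return;

            // Fraction of the orbit completed, converted to an angle in radians.
            float angle = 2f * MathF.PI * (totalSeconds / OrbitPeriod);

            // Parametric circle in the XZ plane around the parent's current position.
            Position = Parent.Position + new Vector3(
                OrbitRadius * MathF.Cos(angle),
                0f,
                OrbitRadius * MathF.Sin(angle));
        }
    }

Update the sun first, then the planets, then the moons, so each body reads an already-updated parent position; scaling OrbitPeriod lets you map 24 hours or 365 days onto whatever simulation speed you want.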

How to determine the rotation angle?

I'm trying to implement a russian roulette game and want to brute-force the solution for it. Here is my problem. I'm going to hard-code the relative angles of the numbers on the wheel (e.g. there are 36 numbers, and each number is offset 10 degrees from the next; the one at the top, in the 12 o'clock position, will be 0, the next 10, and so on). I will rotate the wheel randomly and then determine its rotation based on some values that I can calculate (startPosition to finishedPosition). The wheel is an ImageView. Is there a way to actually do this? For example, get the top-left x,y position for its start and end, then calculate by some formula how much it rotated. Or is there a better way to do this? There is not much source code to show, so this is more of a mathematical question than a Swift one. Any feedback is much appreciated.
To calculate the rotation, you need the coordinates of three points: the start location (sx, sy) and the end location (ex, ey) of the same point after rotation, and the center of rotation (cx, cy).
Then you can find the angle using the atan2 function:
rot_angle = atan2((ex-cx)*(sx-cx)+(ey-cy)*(sy-cy), (ex-cx)*(sy-cy)-(ey-cy)*(sx-cx))
Note: I used the argument order (x, y) from here, while most languages use the reverse order (y, x), so check which order you really need (I have no experience with iOS languages). Also, the result might be in radians or in degrees (the above link doesn't specify it clearly).
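In C# (which, like Swift, uses the (y, x) argument order for atan2), the formula above becomes something like this sketch:

    using System;

    static class RotationMath
    {
        // Signed angle (radians) that rotates the start point (sx, sy)
        // onto the end point (ex, ey) about the center (cx, cy).
        public static double RotationAngle(
            double sx, double sy, double ex, double ey, double cx, double cy)
        {
            double dot   = (ex - cx) * (sx - cx) + (ey - cy) * (sy - cy);
            double cross = (ex - cx) * (sy - cy) - (ey - cy) * (sx - cx);

            // Math.Atan2 takes (y, x), so the arguments are swapped
            // relative to the (x, y) order used in the formula above.
            return Math.Atan2(cross, dot);
        }
    }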
Your question doesn't make much sense. If you rotate the wheel randomly, calculate the random value as an angle. If you want to change the previous rotation by some random angle, then do the math on the starting rotation and the ending rotation; that is just adding and subtracting angles (modulo 2π). Then you will know how far it has rotated without having to calculate it.
Assuming you're talking about a roulette wheel, and not "Russian Roulette" (in American English at least, that term involves pointing a loaded revolver at your head), you'll need to track both the wheel rotation and the ball rotation. To apply the rotation to the wheel, you'll just take the image of the wheel and rotate it on the Z axis around its x/y center point.
To plot the ball, you'll need to use trig to calculate the center of the ball from the radius of the track the ball follows and the angle. But again, always track the angle, and then convert the angle to an x/y center point for the ball when you plot it. Don't discard the angle and then have to convert back from the ball position to its angle. That's silly.
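The trig in question is just the parametric circle again; here is a sketch (the wheel center and track radius are values you supply, and the names are hypothetical):

    using System;

    static class BallPlotter
    {
        // Convert a ball angle (radians from the X+ axis) into a screen-space
        // center point, given the wheel center and the radius of the ball's track.
        public static (double X, double Y) BallCenter(
            double wheelCenterX, double wheelCenterY,
            double trackRadius, double ballAngle)
        {
            return (wheelCenterX + trackRadius * Math.Cos(ballAngle),
                    wheelCenterY + trackRadius * Math.Sin(ballAngle));
        }
    }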

Converting from Real world coordinate System to another

I have an issue I am trying to sort out.
So basically I have a coordinate system where +x is traveling east, +y is up, and +z is traveling north. Effectively I have taken lat/long and projected it to OSGB.
I have a list of points in this coordinate system, but when I render them they are flipped on one axis, the Z (north) axis, so my point list looks incorrect. This is because the rendering API has the +z axis running the other way.
I was thinking my solution could be: have all my objects/3D models/points etc. drawn in my "real world" coordinate system, then at the last moment before I render, apply a scale matrix of (1, 1, -1) to each of the world matrices so that the Z axis is flipped on everything.
So if my real-world projected coordinate is 281852; 161.488; 655844,
after I apply my "RealWorldToXNA" matrix the point will be 281852; 161.488; -655844.
I will then apply the same thing to my camera so it renders from the correct position.
Will this work or am I missing something? I haven't done a lot of 3d maths lately and have suffered a bout of cerebral flatulence. Part of my brain thinks this will work, but another part thinks it shouldn't be so simple.
FYI, I used the solution in my question; I just tested it and it did in fact work as expected.
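For reference, a sketch of that last-moment flip (System.Numerics stands in for the XNA types here; CoordinateFlip and RealWorldToRender are placeholder names):

    using System.Numerics;

    static class CoordinateFlip
    {
        // Scale matrix that negates Z: maps (x, y, z) to (x, y, -z).
        static readonly Matrix4x4 RealWorldToRender = Matrix4x4.CreateScale(1f, 1f, -1f);

        // Apply as the very last transform before rendering.
        public static Matrix4x4 ToRenderSpace(Matrix4x4 world) => world * RealWorldToRender;

        // Apply the same flip to points, e.g. the camera position.
        public static Vector3 ToRenderSpace(Vector3 point) =>
            new Vector3(point.X, point.Y, -point.Z);
    }

One caveat: flipping a single axis mirrors triangle winding order, so you may also need to reverse the cull mode when rendering flipped geometry.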

How do I get more reliable Y position tracking for the Google Tango in Unity?

We have a Unity scene that uses area learning, which has been extremely reliable and consistent about XZ position. However, we are noticing that sometimes the Tango delta camera's Y position will "jump up" very high in the scene. When we force the Tango to relocalize (by covering the sensors for a few seconds), the Y position remains very off. At other times, the Y position varies by 0.5-1.5 Unity units when we first start up our Unity app on the Tango, while holding it in the exact same position in the exact same room and using the same ADF file. Is there a way to get more reliable Y position tracking and/or correct for these jumps?
(All XYZ coordinates in this context use the Unity convention: x is right, y is up, z is forward.)
The Y position should work the same as the XZ coordinates; it relocalizes to a height based on the ADF origin.
But note that the ADF's origin is where you started learning (recording) the ADF. Say you started the learning session holding the device normally; the ADF's origin might then be a little higher than ground level. When you construct a virtual world to relocalize in, you should take this height difference into consideration.
Another thing to check is that there is no offset or initial location set on the DeltaPoseController prefab. DeltaPoseController takes its initial starting transformation as an offset and adds the pose on top of it. For example, if my DeltaPoseController's starting position is (0,1,0) and my pose from the device is (0,1,1), then the actual position of DeltaPoseController in Unity will be (0,2,1). This applies to both translation and rotation.
Another, more advanced (and preferred) way of defining the ground level is to use the depth sensor to find the ground height. The Unity Augmented Reality example shows how to detect a plane and place a marker on it. You can apply a similar method to the ground plane: do a PlaneFinding and place the ground at the right height in Unity world space.
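As a sketch of the height compensation described above (GroundHeightOffset and adfOriginHeight are hypothetical names; you would attach something like this to the root of your virtual world, and the height value is an assumption you measure or estimate via a plane find):

    using UnityEngine;

    public class GroundHeightOffset : MonoBehaviour
    {
        // Height (meters) of the device above the floor when the ADF
        // recording was started. Supply this yourself: measure it, or
        // estimate it from a depth-based plane find.
        public float adfOriginHeight = 1.3f;

        void Start()
        {
            // Shift the virtual world down so y = 0 lines up with the
            // physical floor instead of the ADF origin.
            transform.position -= new Vector3(0f, adfOriginHeight, 0f);
        }
    }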

Convert world to object coordinates

The iPhone gyroscope reports rotation data relative to some reference attitude, and that reference doesn't change (unless multiplied). Let's say I face the wall using my iPhone camera and rotate 45 degrees left (roll += PI/4).
Now, if I lift the phone towards the ceiling, both yaw and pitch change, since the coordinate space is fixed (world coordinate space; it doesn't move or rotate with the phone). Is there a way to determine this angle (the one between the floor plane and the camera direction vector), given roll, yaw, and pitch?
Edit: Instead of opening another question, I'll try here. Luc's solution works, but how do I get the other two angles of rotation? I've read the info at the posted link, but it's been years since I studied linear algebra. This might be more of a math question than a programming one, actually.
I don't really code for iPhone so I'll trust you on the "real world coordinates" frame.
In that case, you want the dot product between the two z-axis vectors. That'll give you the cosine of the angle you're looking for, so you're nearly there. Since an angle between planes only really makes sense as a value between 0° and 90°, you actually have all the information you need in that cosine.
I'd go into a bit more detail, but read this page if you're interested; I'll just include the final result here, the rotation matrix for your three rotations:
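(A reconstruction, assuming the usual Tait-Bryan composition with yaw α about z, pitch β about y, and roll γ about x; this is consistent with the cos(β)cos(γ) third-column result used below.)

    R = R_z(\alpha) \, R_y(\beta) \, R_x(\gamma) =
    \begin{pmatrix}
      \cos\alpha\cos\beta & \cos\alpha\sin\beta\sin\gamma - \sin\alpha\cos\gamma & \cos\alpha\sin\beta\cos\gamma + \sin\alpha\sin\gamma \\
      \sin\alpha\cos\beta & \sin\alpha\sin\beta\sin\gamma + \cos\alpha\cos\gamma & \sin\alpha\sin\beta\cos\gamma - \cos\alpha\sin\gamma \\
      -\sin\beta & \cos\beta\sin\gamma & \cos\beta\cos\gamma
    \end{pmatrix}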
Now, the z-axis vector of the horizontal plane is (0,0,1) (read this as a column vector, though); rotated by this matrix, it is simply the matrix's third column.
So we take the dot product between that third column and our (0,0,1) vector, which gives cos(β)cos(γ), i.e. cos(pitch)*cos(roll).
In conclusion, the angle between your planes is arccos(cos(pitch)*cos(roll)). This value tells you how much your iPhone is inclined, though not in which direction, of course. But you can work that out from the values of the vector (the rightmost column of the matrix) we spoke of.
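A one-line sketch of that final computation in C# (the Math.Abs keeps the result in the 0° to 90° range mentioned above):

    using System;

    static class Inclination
    {
        // Angle (radians) between the floor plane and the device plane,
        // given pitch and roll in radians. The absolute value keeps the
        // result in [0, PI/2], since an angle between planes only makes
        // sense between 0 and 90 degrees.
        public static double FromPitchRoll(double pitch, double roll) =>
            Math.Acos(Math.Abs(Math.Cos(pitch) * Math.Cos(roll)));
    }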

What is this rotation behavior in XNA?

I am just starting out in XNA and have a question about rotation. When you multiply a vector by a rotation matrix in XNA, it goes counter-clockwise. This I understand.
However, let me give you an example of what I don't get. Let's say I load a random art asset into the pipeline. I then create a variable that increments every frame by 2 degrees (testRot += 0.034906585f) when the update method runs. The source of my confusion is that the asset rotates clockwise in this screen space. This confuses me, as a rotation matrix rotates a vector counter-clockwise.
One other thing: when I specify my position vector as well as my origin, I understand that I am rotating about the origin. Am I to assume that there are perpendicular axes passing through this asset's origin as well? If so, where does rotation start from? In other words, am I starting rotation from the top of the Y-axis or from the X-axis?
The XNA SpriteBatch works in Client Space, where "up" is Y-, not Y+ (as it is in Cartesian space, projection space, and what most people usually select for their world space). This makes the rotation appear clockwise (not counter-clockwise as it would in Cartesian space). The actual coordinates the rotation produces are the same.
Rotations are relative, so they don't really "start" from any specified position.
If you are using maths functions like sin or cos or atan2, then absolute angles always start from the X+ axis as zero radians, and the positive rotation direction rotates towards Y+.
The order of operations of SpriteBatch looks something like this:
Sprite starts as a quad with its top-left corner at (0,0), its size being the same as its texture size (or SourceRectangle).
Translate the sprite back by its origin (thus placing its origin at (0,0)).
Scale the sprite.
Rotate the sprite.
Translate the sprite by its position.
Apply the matrix from SpriteBatch.Begin.
This places the sprite in Client Space.
Finally, a matrix is applied to each batch to transform that Client Space into the Projection Space used by the GPU. (Projection Space runs from (-1,-1) at the bottom left of the viewport to (1,1) at the top right.)
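That sequence corresponds roughly to the following matrix composition (a sketch using System.Numerics in place of the XNA types; SpriteBatch's internals may differ in detail):

    using System.Numerics;

    static class SpriteTransform
    {
        // The SpriteBatch order of operations above, written as one matrix.
        // Row-vector convention, as in XNA: transforms apply left to right.
        public static Matrix4x4 Compose(
            Vector2 origin, Vector2 scale, float rotation,
            Vector2 position, Matrix4x4 beginMatrix)
        {
            return Matrix4x4.CreateTranslation(-origin.X, -origin.Y, 0f)   // origin to (0,0)
                 * Matrix4x4.CreateScale(scale.X, scale.Y, 1f)             // scale
                 * Matrix4x4.CreateRotationZ(rotation)                     // rotate
                 * Matrix4x4.CreateTranslation(position.X, position.Y, 0f) // move to position
                 * beginMatrix;                                            // SpriteBatch.Begin matrix
        }
    }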
Since you are new to XNA, allow me to introduce a library that will greatly help you out while you learn. It is called XNA Debug Terminal, an open-source project that allows you to run arbitrary code at runtime, so you can check whether your variables have the values you expect. All of this happens in a terminal display on top of your game, without pausing the game. It can be downloaded at http://www.protohacks.net/xna_debug_terminal
It is free and very easy to set up, so you really have nothing to lose.
