Transferring billiard game state from player to player with box2d - ios

I'm using Union server for my iOS (Starling) billiard game.
So far the connections work great.
My question is:
How would you handle transferring the ball positions from the opponent?
Let's say I make the break and I want to copy that shot to the other player.
Do you think it's a good idea to send a message over Union every frame (x, y)?
Will this cause latency problems?

First about your question:
The game is installed on both devices and the rules are the same, so on each shot send the white ball's force and any other properties you modify. On the receiving device apply that same force, and the ball will repeat the action performed by the sending user.
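The original question is Starling/ActionScript over Union, so the following is only a Swift-flavoured sketch of the message shape; ShotMessage and the closures are hypothetical. The point is that you serialize the cue ball's impulse once per shot and apply the same impulse on the receiving device:

```swift
import Foundation

// Hypothetical shot message: only the cue ball's impulse (and whatever else
// you modify) is sent once per shot, not a stream of per-frame positions.
struct ShotMessage: Codable {
    let impulseX: Float
    let impulseY: Float
    let spin: Float
}

// Sender: serialize the shot at the moment the player strikes.
func encodeShot(impulseX: Float, impulseY: Float, spin: Float) -> Data? {
    return try? JSONEncoder().encode(ShotMessage(impulseX: impulseX, impulseY: impulseY, spin: spin))
}

// Receiver: decode the message and apply the same impulse to the local cue
// ball so the local physics world replays the shot.
func applyShot(_ data: Data, applyImpulse: (Float, Float, Float) -> Void) {
    guard let shot = try? JSONDecoder().decode(ShotMessage.self, from: data) else { return }
    applyImpulse(shot.impulseX, shot.impulseY, shot.spin)
}
```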
Now the sad part:
I'll be the first one to disappoint you: even if you solve your problem and send the message from player to player without issues, the result won't be pleasant. Box2D's calculations are optimised for performance and computed with approximate accuracy, so the result will differ between runs. Even on the same device, the balls will end up in different locations on different runs. You won't notice it after one or two hits, but after playing for a minute you'll end up with different ball locations.
You can also try sending a message with every ball's position once all the balls stop moving, and relocate the remote user's balls accordingly. Once that "correcting" message has been received, return control to the player.
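A sketch of that "correcting" snapshot idea, assuming the same hypothetical JSON transport as above: once every ball has stopped, send each ball's position, reposition the remote table, then hand control back.

```swift
import Foundation

// Hypothetical snapshot entry for one ball on the table.
struct BallPosition: Codable {
    let ballID: Int
    let x: Float
    let y: Float
}

// Sender: build the snapshot when all balls have come to rest.
func makeCorrectionSnapshot(_ balls: [BallPosition]) -> Data? {
    return try? JSONEncoder().encode(balls)
}

// Receiver: teleport each local ball to the authoritative position,
// then return control to the player.
func applyCorrectionSnapshot(_ data: Data,
                             reposition: (Int, Float, Float) -> Void,
                             returnControlToPlayer: () -> Void) {
    guard let balls = try? JSONDecoder().decode([BallPosition].self, from: data) else { return }
    for ball in balls {
        reposition(ball.ballID, ball.x, ball.y)
    }
    returnControlToPlayer()
}
```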
We had a similar game and I just wrote my own 2D engine. If you're working only with ball-to-ball and ball-to-rectangle collisions, it's easy to write your own.

Related

ARKit Tracking - offsetting phone movement

Currently I am trying to record the movements and rotations of my SCNNode. I am recording the movement data to a CSV and then examining it afterwards. Everything is working fine except that when you move the phone, the data changes because the SCNNode changes in world space. To elaborate: the node isn't moving or rotating, but the movement of the phone is messing with the data in a way that makes it look like it's moving.
I have read Apple's documentation about ARSessionConfiguration.worldAlignment and I think it could be possible to cancel out the movement of the phone using the gravity property of the node (default worldAlignment).
Does anyone have any advice as to how I could achieve this?
Update:
As mentioned above, my original approach to solving this was to change the ARSessionConfiguration. When you change that, the only thing that really changes is where the SCNNode starts in world space. Therefore, the change did not affect how the movement is represented.

iOS game center game is out of sync

I've been building a Game Center game for iOS and it works great so far. I finally started testing the game and there's (obviously) some latency, which is causing the game to go out of sync.
Basically I have 2 players, each controlling a game character. The game characters can shoot fireballs, iceballs, etc. These attacks do damage and have effects: for example, the ice ball will freeze the opponent for 3 seconds if it makes contact, and the fireball will do extended fire damage for 3 seconds.
So the problem is, when I was playing against my brother, because of the latency my game said I had 40 health left and he was down to 0, while on his device the game said I had 0 and he had 20.
This means that attacks were registering/colliding on one device (based on the positions of the characters/fireballs) and not on the other, and vice versa.
I'm currently using the default peer-to-peer game-center architecture. Would using a client-server architecture (one person becomes the server) solve this out-of-sync problem?
If not, what other options do I have with the GameKit API?
I've found a solution to this problem.
Note: this will be a relatively long answer.
One thing that I did implement was the use of UDP for some of my data transfers that are not as critical as others. For example, since I'm sending movement data about 10 times a second, I figured it's OK if 1 or 2 of the 10 get lost once in a while.
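In GameKit terms that split looks roughly like the sketch below, assuming a GKMatch-based session; the payloads themselves are hypothetical blobs produced by your own serializer.

```swift
import GameKit

// Movement updates go out unreliably (UDP-style) about 10 times a second,
// so an occasional dropped packet is acceptable.
func sendMovement(_ movementData: Data, over match: GKMatch) {
    try? match.sendData(toAllPlayers: movementData, with: .unreliable)
}

// Critical events (damage, collisions, effects) must arrive, so they use
// the reliable mode instead.
func sendCriticalEvent(_ eventData: Data, over match: GKMatch) {
    try? match.sendData(toAllPlayers: eventData, with: .reliable)
}
```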
Now to the actual problem:
So what was happening is the following: since I'm using a p2p architecture, both clients see the game 'world' with a delay. This means that on my device, I see the enemy player at a position that is delayed by 100-200 milliseconds (so he was actually there 100-200 ms in the past).
The problem with this is that when I shoot a projectile at the enemy and I see a collision right on the edge of his sprite (his feet or head), on his screen he was already past that position, plus my fireball appears 100 ms delayed on his screen. This means that on his device he was able to dodge my attack. This can happen a random number of times, but I'd guess it's probably below 30% of the time; the other 70% of the time, both devices see the collision.
The solution
What I came up with is to send a message to the other player whenever either device sees a collision. And remember, each device has no idea whether the other saw a collision or not, so I have no way of knowing whether both devices saw a collision, or only one did while the other saw the attack being dodged.
This means I have to send a message to the other player every time there's a collision. Now, because of the way I've architected my game, when a collision message is sent I'm automatically applying the collision event (meaning the damage dealt and the projectile effect that took place, like an ice bolt freezing the player it collided with). This is problematic, because what if both devices saw the collision? Then both devices send each other a collision message and apply the collision again.
To get around this, I've added a "spell number" to each spell/attack, and when a collision happens, I save this number with the player that the collision happened with as the "last collision spell number". So if a collision takes place with that spell again, the player knows he/she already collided with that specific object, and the collision logic doesn't run twice.
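A minimal sketch of that bookkeeping, with hypothetical names: each spell carries a unique number, and a player remembers the last spell number that hit them, so the same collision reported by both devices is only applied once.

```swift
// PlayerState is illustrative; the duplicate check is the point.
final class PlayerState {
    var health = 100
    private var lastCollisionSpellNumber: Int?

    // Called both for locally detected collisions and for collision
    // messages received from the other device.
    func applyCollision(spellNumber: Int, damage: Int) {
        // Already handled this spell (locally or via the network)? Ignore it.
        guard spellNumber != lastCollisionSpellNumber else { return }
        lastCollisionSpellNumber = spellNumber
        health -= damage
    }
}
```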

handle sound triggered on contact without SCNAudioSource

How does one handle sound triggered on contact with other nodes in SceneKit without the SCNAudioSource class (SCNAudioSource is only available in iOS 9 and later)? What frameworks should be used, and which are preferable?
I will be doing the following: I am animating a box falling onto the ground and then rolling a few times. It would be nice if I could add a realistic "hit" sound every time the box hits the ground. I have also implemented a method that calculates the current linear and rotational speed so that I can make the volume level of a hit proportional to the speed of the box at that time.
I was thinking I could use physicsWorld(world: SCNPhysicsWorld, didBeginContact contact: SCNPhysicsContact) and produce the sound in there, but I am a little sceptical because this gets called very often.
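One possible approach (not from the original post): pre-load an AVAudioPlayer, which is available well before iOS 9, and trigger it from the contact callback with the volume scaled by the contact's collisionImpulse. The throttle interval and impulse scale below are assumptions to tune, and in current Swift the delegate method is spelled physicsWorld(_:didBegin:).

```swift
import SceneKit
import AVFoundation

// ContactSound is a hypothetical helper; assign an instance as the scene's
// physicsWorld.contactDelegate.
final class ContactSound: NSObject, SCNPhysicsContactDelegate {
    private let player: AVAudioPlayer
    private var lastPlayTime: CFAbsoluteTime = 0

    init?(soundURL: URL) {
        guard let audio = try? AVAudioPlayer(contentsOf: soundURL) else { return nil }
        audio.prepareToPlay()
        player = audio
        super.init()
    }

    func physicsWorld(_ world: SCNPhysicsWorld, didBegin contact: SCNPhysicsContact) {
        // The callback fires very often while the box rolls, so throttle it
        // and ignore glancing contacts.
        let now = CFAbsoluteTimeGetCurrent()
        guard now - lastPlayTime > 0.1, contact.collisionImpulse > 0.05 else { return }
        lastPlayTime = now

        // Louder sound for harder impacts, clamped to the valid volume range.
        player.volume = Float(min(contact.collisionImpulse / 5.0, 1.0))
        player.currentTime = 0
        player.play()
    }
}
```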

Is it possible to determine that all Sprite Kit scene collisions have played out?

I've made my way through the pages of SO questions regarding sprite-kit, the docs, available demos, and the vast empty space that is my brain to try and figure this one out but have come up short.
Working on a proof-of-concept that is a turn-based (using Game Center), physics-based game. Players take turns placing a node amongst a collection of other nodes and letting the physics play out. Where I'm flummoxed is in determining that all collisions have played out and the turn has ended.
It would seem that the nodes never stop colliding.
This is a tricky task with physics simulations.
First of all, with gravity enabled you always have a force acting upon bodies that may prevent them from resting. So you need to test every body every couple of steps (or every frame if you must) to see whether its velocity vector's length is smaller than a given threshold, then manually set the body's resting state to YES.
The bodies may wake up again during contacts, so it is crucial to perform this step after contact resolution in didSimulatePhysics.
The trick is in finding a threshold that is large enough to guarantee an end to the "physics entanglements" while not being too high to make objects stop where they clearly shouldn't.
Even then, you probably also need a timer that starts when every body's velocity is below another (higher) threshold, indicating that there is likely little force left coming from those bodies; if after a grace period (e.g. 10 seconds) the bodies are still not all resting but remain below this threshold, consider the collisions resolved (as far as the game is concerned).
Also, the number of bodies still not resting may be an indicator to end the game: e.g. if only 3 or 4 bodies out of dozens are still moving, there's probably not much left to happen anyway.
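A minimal Sprite Kit sketch of the thresholding above, assuming the movable nodes are named "piece"; both thresholds are placeholder values to tune, and the grace-period timer is omitted for brevity.

```swift
import SpriteKit

class TurnScene: SKScene {
    private let restingSpeed: CGFloat = 4.0   // below this, force the body to rest

    override func didSimulatePhysics() {
        var allResting = true
        enumerateChildNodes(withName: "piece") { node, _ in
            guard let body = node.physicsBody else { return }
            let v = body.velocity
            let speed = (v.dx * v.dx + v.dy * v.dy).squareRoot()
            if speed < self.restingSpeed && abs(body.angularVelocity) < 0.05 {
                // Put near-still bodies to sleep so they stop waking each other up.
                body.isResting = true
            } else {
                allResting = false
            }
        }
        if allResting {
            // Every piece is at rest: the collisions have played out, end the turn.
        }
    }
}
```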

iOS how to record moves and play them back

I have a game where the user controls a character (with his finger) and I would like to add functionality so the user can record his moves while playing and then play them back. The problem is that the game includes physics, and I guess it would be very hard to replicate the exact same moves. How can I implement such a system that will perfectly replay all the user's actions? Do I have to record every touch and then play back all the touches? Does anyone have any experience with this? I am using Box2D for physics.
"We record replays by storing keystrokes and frame numbers" - box2d.org/forum/viewtopic.php?f=3&t=1982&view=next Seems as if it's the only way to do it. Write these to a PLIST or something and you'll have your replays. Also, if your physics isn't already deterministic (ie. random) then just take down the random values too)
From the comments:
"Just record all the position and rotation state for all the objects every frame (or every other possibly), then, when you want to play things back simply skip the physics engine entirely and just reposition your objects every frame from your recorded position/rotation states.
All you'll need to do is make sure your frames for the play-back are the same duration as they were when the physics was running."
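And a sketch of that state-recording approach, written with SpriteKit nodes for concreteness; the same idea applies to Box2D bodies (record positions/angles, then set transforms directly with physics switched off). ReplayRecorder is illustrative, not from the original post.

```swift
import SpriteKit

struct FrameState {
    let position: CGPoint
    let rotation: CGFloat
}

final class ReplayRecorder {
    private var frames: [[FrameState]] = []   // one snapshot of every node per frame
    private var playbackIndex = 0

    // Call from your update loop while the physics simulation is live.
    func record(_ nodes: [SKNode]) {
        frames.append(nodes.map { FrameState(position: $0.position, rotation: $0.zRotation) })
    }

    // Call from your update loop during playback, with physics disabled
    // (e.g. isDynamic = false on the bodies). Returns false when finished.
    func playNextFrame(into nodes: [SKNode]) -> Bool {
        guard playbackIndex < frames.count else { return false }
        for (node, state) in zip(nodes, frames[playbackIndex]) {
            node.position = state.position
            node.zRotation = state.rotation
        }
        playbackIndex += 1
        return true
    }
}
```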
