Game in Swift 2 - "touchesBegan"? - ios

I want to build a game where, when I touch and drag, a line is created at the touch location that goes in the same direction as the drag and ends at the boundaries of the game (in this case my frame).
Any tips on how I could approach this problem?

Your search term is UIPanGestureRecognizer.
You'd create a pan gesture recognizer and attach it to your view. In the method that it calls, look for the state UIGestureRecognizerStateBegan. Record the start position, then respond to calls to your action method with the state UIGestureRecognizerStateChanged.
That should be enough to get you started. As Matt says, this is a fairly common use case, so you should be able to find sample code online if you can't work this out from the docs.
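A minimal sketch of that approach in Swift 2-era syntax (the view class, the selector name, and the 1000-point extension factor are placeholders, not anything prescribed by UIKit):

    import UIKit

    class LineDrawingView: UIView {

        private let lineLayer = CAShapeLayer()
        private var startPoint = CGPointZero

        override init(frame: CGRect) {
            super.init(frame: frame)
            // Clip sublayers so the extended line stops at the view's boundary.
            layer.masksToBounds = true
            lineLayer.strokeColor = UIColor.redColor().CGColor
            lineLayer.lineWidth = 2
            layer.addSublayer(lineLayer)
            addGestureRecognizer(UIPanGestureRecognizer(target: self, action: "handlePan:"))
        }

        required init?(coder aDecoder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        func handlePan(recognizer: UIPanGestureRecognizer) {
            switch recognizer.state {
            case .Began:
                // Record where the drag started.
                startPoint = recognizer.locationInView(self)
            case .Changed:
                // Extend the segment far past the finger; masksToBounds clips it
                // at the edge of the frame, which approximates "ends at the boundary".
                let current = recognizer.locationInView(self)
                let dx = current.x - startPoint.x
                let dy = current.y - startPoint.y
                let farPoint = CGPoint(x: startPoint.x + dx * 1000,
                                       y: startPoint.y + dy * 1000)
                let path = UIBezierPath()
                path.moveToPoint(startPoint)
                path.addLineToPoint(farPoint)
                lineLayer.path = path.CGPath
            default:
                break
            }
        }
    }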

Related

How do I terminate an ongoing SCNTransaction?

I use SCNTransaction to move game objects. More specifically, when the player taps somewhere on the screen, the object will move towards that destination. But sometimes the player may make a wrong move, so I want to create a button which can terminate all SCNTransactions.
However, unlike SKAction, which can be terminated with a simple line - self.removeAllActions(), SCNTransaction cannot be terminated or even paused from the outside according to the Apple Developer Documentation. Even worse, I find that before the object reaches its destination, its position has already changed to the destination's position, so I cannot simply use another SCNTransaction to counteract the ongoing one after knowing the object's current position.
Can anybody give me some hints? Thanks a lot.
SCNTransaction follows the same animation principles as Core Animation and CATransaction. To stop an animation you will have to set the model value to the current presentation value. For instance:
node.position = node.presentation.position
But if you are familiar with SKAction and would like to implement the same logic in your SceneKit app, you might want to have a look at SCNAction. They work identically.
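For illustration, a hedged sketch in Swift 2-era syntax (the function names are placeholders): driving the move with a keyed SCNAction lets a button cancel it mid-flight, much like removeAllActions() in SpriteKit.

    import SceneKit

    // Move a node with an SCNAction instead of an SCNTransaction so it can be cancelled.
    func moveNode(node: SCNNode, to destination: SCNVector3) {
        let move = SCNAction.moveTo(destination, duration: 2.0)
        node.runAction(move, forKey: "move")      // the key identifies this action later
    }

    // "Cancel" button handler: the node stops wherever it currently is.
    func cancelMove(node: SCNNode) {
        node.removeActionForKey("move")           // or node.removeAllActions()
    }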

Begin UIPanGesture Event From A Pressed State At Time Of Instantiation

Is there a way to begin a UIPanGestureRecognizer's gesture if the finger is already pressed at the time the object is instantiated?
I have a situation where, when a user holds their finger on the screen, I create a UIView under their finger.
I want them to be able to drag that around and as such I have put a UIPanGestureRecognizer inside the UIView.
The problem is that I need to take my finger off and put it back down to trigger the UIPanGestureRecognizer. I need it to start from an already-pressed state.
Do you know how I can activate a UIPanGestureRecognizer from an already-pressed state, i.e. can I get the touch event that's already active at the time of instantiation and pass it along?
You can do it, but the UIPanGestureRecognizer will need to exist already on the view behind the view you create (and you will then have to adjust your calculations based on this; not difficult).
The reason is that, under the circumstances you describe, the touch does not belong to the UIView you create - it belongs to the UIView behind it, the one that the user was originally touching. And given the nature of iOS touch delivery, you can't readily change that. So it will be simpler to let that view, the actual original touch view, do the processing of this touch.
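A rough sketch of that arrangement in Swift 2-era syntax (BackgroundView and draggedView are placeholder names): the pan recognizer lives on the view behind, and its translation is applied to the child view created under the finger.

    import UIKit

    class BackgroundView: UIView {

        // The view created under the user's finger while they hold down.
        var draggedView: UIView?

        override init(frame: CGRect) {
            super.init(frame: frame)
            addGestureRecognizer(UIPanGestureRecognizer(target: self, action: "handlePan:"))
        }

        required init?(coder aDecoder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        func handlePan(recognizer: UIPanGestureRecognizer) {
            guard let dragged = draggedView else { return }
            // Move the child by the pan's delta, then reset the translation so the
            // next callback reports only the movement since this one.
            let translation = recognizer.translationInView(self)
            dragged.center = CGPoint(x: dragged.center.x + translation.x,
                                     y: dragged.center.y + translation.y)
            recognizer.setTranslation(CGPointZero, inView: self)
        }
    }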
I think Matt's solution is best so I am going to mark it as correct.
However, my code structure wasn't going to allow me to implement it cleanly. Compounding the issue, the object doing the listening was using a UILongPressGestureRecognizer.
So my solution was as follows (a rough sketch appears after the list):
1. Create a callback in my ViewController that handles the longGestureOverride call.
2. Add a callback to the object listening for the long press that calls the longGestureOverride callback and passes along the point.
3. Manually move the object based on the point passed back.
4. If the user lifts their finger, disable the longGestureOverride callback and begin using the UIPanGestureRecognizer inside the new object.
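A rough sketch of steps 2 and 4 in Swift 2-era syntax (ListeningView and longGestureOverride are placeholder names taken from the description above, not real API):

    import UIKit

    class ListeningView: UIView {

        // The view controller sets this to receive the long-press position (step 2).
        var longGestureOverride: ((CGPoint) -> Void)?

        // Assumed to be wired up elsewhere as the action of a UILongPressGestureRecognizer.
        func handleLongPress(recognizer: UILongPressGestureRecognizer) {
            let point = recognizer.locationInView(self)
            switch recognizer.state {
            case .Began, .Changed:
                // Forward the point so the view controller can move the new view manually (step 3).
                longGestureOverride?(point)
            case .Ended, .Cancelled:
                // Finger lifted: stop overriding; the new view's own pan recognizer
                // takes over the next time the user touches it (step 4).
                longGestureOverride = nil
            default:
                break
            }
        }
    }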

Slide Effect for iOS

I'm new to developing iOS apps.
I've successfully implemented a swipe gesture recognizer.
What I was wondering is whether there is an easy-to-use recognizer, like the swipe gesture, that would let you implement the home screen's page-turning effect, but just on a small view in the view controller?
If you're unclear on what effect I mean: when you look at the iPhone's home screen, you can drag your finger and it responds instantly (unlike a swipe), and it also has some spring feeling to it. Is this an existing effect I can use, or do I have to program it manually? If so, is there a tutorial that explains this?
Thanks,
I hope my question makes sense.
Have a look at UIPanGestureRecognizer:
https://developer.apple.com/library/ios/documentation/uikit/reference/UIPanGestureRecognizer_Class/Reference/Reference.html
UIPanGestureRecognizer is a concrete subclass of UIGestureRecognizer that looks for panning (dragging) gestures. The user must be pressing one or more fingers on a view while they pan it. Clients implementing the action method for this gesture recognizer can ask it for the current translation and velocity of the gesture.
A panning gesture is continuous. It begins (UIGestureRecognizerStateBegan) when the minimum number of fingers allowed (minimumNumberOfTouches) has moved enough to be considered a pan. It changes (UIGestureRecognizerStateChanged) when a finger moves while at least the minimum number of fingers are pressed down. It ends (UIGestureRecognizerStateEnded) when all fingers are lifted.
Clients of this class can, in their action methods, query the UIPanGestureRecognizer object for the current translation of the gesture (translationInView:) and the velocity of the translation (velocityInView:). They can specify the view whose coordinate system should be used for the translation and velocity values. Clients may also reset the translation to a desired value.
Edit: The spring feeling you would need to implement yourself. Since iOS 7 there is UIKit Dynamics, which provides different animator-driven behaviors; for what you describe you may need UIGravityBehavior and maybe UICollisionBehavior. Look at the WWDC 2013 videos on this topic; I think you will find some examples there.
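As a rough illustration of tracking the finger directly and springing back when it is released (Swift 2-era syntax; the view, frame, and animation values are placeholders):

    import UIKit

    class PagingViewController: UIViewController {

        let pageView = UIView()            // the small view being dragged
        var restingCenter = CGPointZero    // where the view settles between drags

        override func viewDidLoad() {
            super.viewDidLoad()
            pageView.frame = CGRect(x: 40, y: 100, width: 240, height: 160)
            pageView.backgroundColor = UIColor.lightGrayColor()
            view.addSubview(pageView)
            restingCenter = pageView.center
            pageView.addGestureRecognizer(
                UIPanGestureRecognizer(target: self, action: "handlePan:"))
        }

        func handlePan(recognizer: UIPanGestureRecognizer) {
            let translation = recognizer.translationInView(view)
            switch recognizer.state {
            case .Changed:
                // Follow the finger immediately, like the home screen does.
                pageView.center = CGPoint(x: restingCenter.x + translation.x,
                                          y: restingCenter.y)
            case .Ended, .Cancelled:
                // Spring back to the resting position when the finger lifts.
                UIView.animateWithDuration(0.5, delay: 0,
                    usingSpringWithDamping: 0.6, initialSpringVelocity: 0.5,
                    options: [], animations: {
                        self.pageView.center = self.restingCenter
                    }, completion: nil)
            default:
                break
            }
        }
    }

For the real page-turning effect you would also compare the translation (or velocityInView:) against a threshold and animate to the next page instead of always snapping back.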

A last touch event does not arrive if the view leaves the screen programmatically

I have a situation where I apply an effect to a UIView when a touch begins and reverse that effect when the touch ends. So basically I am tracking the touchesBegan, touchesEnded, and touchesCancelled methods of UIView.
But the problem is that when the view goes off the screen, i.e. when it or one of its parents gets removed from its superview, it does not get any more touch events. Is there any way to deliver this "last" touchesEnded event to the view? Maybe if the UIView gets notified about becoming invisible, I can also use that event for this purpose.
OK, I am going to move the answers from the comments on the original question here to make a good summary of the important points.
The reason I am tracking touch events is that I want to apply some nice effects, such as glowing, when a touch starts and remove those effects when the touch ends.
The reason I cannot simulate touchesEnded when removing those views is that I do not remove them directly. Instead, I remove one of their ancestor views. I cannot keep track of ancestor views all the way up to the UIWindow; I think that is technically impossible. Instead, the framework should provide this as an event.
I solved my problem by overriding the -(void)willMoveToWindow:(UIWindow *)newWindow method and checking whether newWindow is nil.
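A sketch of that workaround in Swift 2-era syntax (the glow is stubbed out with a plain shadow; the helper names are placeholders):

    import UIKit

    class GlowingView: UIView {

        // Leaving the window counts as the missing "last" touch event.
        override func willMoveToWindow(newWindow: UIWindow?) {
            super.willMoveToWindow(newWindow)
            if newWindow == nil {
                removeGlowEffect()
            }
        }

        override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
            super.touchesBegan(touches, withEvent: event)
            addGlowEffect()
        }

        override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
            super.touchesEnded(touches, withEvent: event)
            removeGlowEffect()
        }

        override func touchesCancelled(touches: Set<UITouch>?, withEvent event: UIEvent?) {
            super.touchesCancelled(touches, withEvent: event)
            removeGlowEffect()
        }

        private func addGlowEffect() {
            layer.shadowColor = UIColor.yellowColor().CGColor
            layer.shadowRadius = 10
            layer.shadowOpacity = 1
        }

        private func removeGlowEffect() {
            layer.shadowOpacity = 0
        }
    }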

How do I implement multitouch on iOS

I'd like to implement multitouch, and I was hoping to get some sanity checks from the brilliant folks here. :)
From what I can tell, my strategy to detect and track multitouch is going to be to use the touchesBegan, touchesMoved, and touchesEnded methods, and to use the allTouches method of the event parameter to get visibility into all relevant touches at any particular time.
I was thinking I'd essentially use previousLocationInView as a way of linking the touches that arrive with new events to the currently active touches. For example, if touchesBegan reports a touch at x,y = 10,14, then in the next message I can use a touch's previous location to know which existing touch it is tied to, as a way of keeping track of one finger's continuous motion. Does this make sense? If it does, is there a better way to do it? I cannot hold onto UITouch or UIEvent pointers as a way of matching touches with previous touches, so I cannot go that route. All I can think to do is tie them together via their previousLocationInView value (and to work out which touches are 'new').
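For illustration, the matching idea described above might look roughly like this in Swift 2-era syntax (a sketch only; MultitouchView and activePoints are placeholder names):

    import UIKit

    class MultitouchView: UIView {

        // Last known location of each finger we are tracking.
        private var activePoints: [CGPoint] = []

        override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
            for touch in touches {
                activePoints.append(touch.locationInView(self))
            }
        }

        override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
            for touch in touches {
                // The stored point equal to this touch's previous location is the
                // same finger; update it to the new location.
                let previous = touch.previousLocationInView(self)
                if let index = activePoints.indexOf(previous) {
                    activePoints[index] = touch.locationInView(self)
                }
            }
        }

        override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
            for touch in touches {
                let previous = touch.previousLocationInView(self)
                if let index = activePoints.indexOf(previous) {
                    activePoints.removeAtIndex(index)
                }
            }
        }
    }

Exact point matching is fragile (the locations are floating-point values, and two fingers could report the same point), which is one reason the answer below suggests a different approach.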
You might want to take a look at gesture recognizers. From Apple's docs,
You could implement the touch-event handling code to recognize and handle these gestures, but that code would be complex, possibly buggy, and take some time to write. Alternatively, you could simplify the interpretation and handling of common gestures by using one of the gesture recognizer classes introduced in iOS 3.2. To use a gesture recognizer, you instantiate it, attach it to the view receiving touches, configure it, and assign it an action selector and a target object. When the gesture recognizer recognizes its gesture, it sends an action message to the target, allowing the target to respond to the gesture.
See the article on Gesture Recognizers and specifically the section titled "Creating Custom Gesture Recognizers." You will need an Apple Developer Center account to access this.
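As a minimal sketch of that suggestion in Swift 2-era syntax (placeholder class and selector names), the built-in recognizers do the multitouch bookkeeping for common gestures:

    import UIKit

    class PinchableView: UIView {

        override init(frame: CGRect) {
            super.init(frame: frame)
            multipleTouchEnabled = true
            addGestureRecognizer(UIPinchGestureRecognizer(target: self, action: "handlePinch:"))
            addGestureRecognizer(UIRotationGestureRecognizer(target: self, action: "handleRotate:"))
        }

        required init?(coder aDecoder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        func handlePinch(recognizer: UIPinchGestureRecognizer) {
            // The recognizer tracks both fingers and reports a single scale value.
            transform = CGAffineTransformScale(transform, recognizer.scale, recognizer.scale)
            recognizer.scale = 1
        }

        func handleRotate(recognizer: UIRotationGestureRecognizer) {
            transform = CGAffineTransformRotate(transform, recognizer.rotation)
            recognizer.rotation = 0
        }
    }

Gestures that the built-in recognizers don't cover are what the "Creating Custom Gesture Recognizers" section linked above is for.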
