For my custom UIView I've overridden the touchesBegan method. What I told it to do is change its layer's background color:
dispatch_async(dispatch_get_main_queue()) {
    self.layer.backgroundColor = clr_someCGColor
}
It acts weird. If I quickly tap the view while in Landscape, everything works perfectly, but if I do this in Portrait, I have to hold the touch for some time to see the result, even though touchesEnded is called right away when I quickly tap. What could be causing the delay in Portrait?
Remove the dispatch_async wrapper. All it does is cause a delay (we can't execute on the main thread until, as you rightly say, the tap ends and touchesEnded has come and gone). You are already on the main thread, in touchesBegan, so there is no need for this extra delay.
Even better, use a tap gesture recognizer.
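For illustration, a minimal sketch of both suggestions in Objective-C (the red color is a stand-in for the original clr_someCGColor):

// In the custom UIView subclass: set the color directly.
// touchesBegan: already runs on the main thread.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    self.layer.backgroundColor = [UIColor redColor].CGColor;
}

// Or, with a tap gesture recognizer, set up once (e.g. in the initializer):
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleTap:)];
[self addGestureRecognizer:tap];

- (void)handleTap:(UITapGestureRecognizer *)tap {
    self.layer.backgroundColor = [UIColor redColor].CGColor;
}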
I am implementing touchesBegan and touchesEnded in my iOS application, trying to detect when the user puts a finger on the screen and when they release it.
The problem I am having is that once touchesBegan has been called, if the user rotates the device while still holding a finger on the screen, touchesEnded does not get called when they let go.
Does anyone know why this might be happening?
Are you getting touchesCancelled instead?
In general, the system will call either touchesEnded or touchesCancelled after a touchesBegan, so code should deal with both. Touches can be cancelled for various reasons, such as a gesture recognizer taking over, a non-interactive animation starting on the view, an incoming phone call, etc.
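For example, a sketch of routing both paths through one cleanup helper (-endTracking is a hypothetical name):

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    [self endTracking]; // normal finger-up
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    [self endTracking]; // rotation, gesture recognizer takeover, phone call, etc.
}

- (void)endTracking {
    // Undo whatever touchesBegan: set up.
}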
I'm making an animation in a CALayer that needs to update smoothly over time. I tried the following approaches:
CADisplayLink for iOS, CVDisplayLink for OS X, plus calling setNeedsDisplay from the callback
CABasicAnimation for a custom displayTime property, with fromValue=0, toValue=1, duration=1, repeatCount=HUGE_VALF, cumulative=YES, and needsDisplayForKey: returning YES for @"displayTime".
These both work to draw the animation, but I'm not sure which is better. Does Core Animation internally use a display link for updating animations?
Furthermore, neither of these lets me know when the layer comes onscreen or goes offscreen. The animation even continues to run when the NSWindow is closed (-drawInContext: is continuously called). The kCAOnOrderOut action is not triggered, nor is -setHidden: called. How can the layer tell when it is on or offscreen?
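For reference, a minimal sketch of the second approach described above, assuming a CALayer subclass (the DisplayTimeLayer name is made up):

@interface DisplayTimeLayer : CALayer
@property (nonatomic) CGFloat displayTime;
@end

@implementation DisplayTimeLayer
@dynamic displayTime; // let Core Animation synthesize animatable accessors

+ (BOOL)needsDisplayForKey:(NSString *)key {
    if ([key isEqualToString:@"displayTime"]) return YES;
    return [super needsDisplayForKey:key];
}

- (void)drawInContext:(CGContextRef)ctx {
    // Redraw using self.displayTime, which holds the interpolated
    // value for the current frame while the animation runs.
}
@end

// Starting the repeating animation on an instance of that layer:
CABasicAnimation *anim = [CABasicAnimation animationWithKeyPath:@"displayTime"];
anim.fromValue = @0;
anim.toValue = @1;
anim.duration = 1;
anim.repeatCount = HUGE_VALF;
anim.cumulative = YES;
[layer addAnimation:anim forKey:@"displayTime"];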
I'm working on an app where the user is expected to rapidly touch and swipe across multiple UIViews, each of which is supposed to perform an action once the user's finger reaches it. I've got a lot of views, so the typical approach of iterating over each view to check whether a touch is inside its bounds is a no-go - there's just too much lag. Is there any other way to get touch events from one view to another view next to it? I thought maybe there was some way to cancel the touch event, but I've searched and so far have come up empty.
One of the big problems I have is that if I implement my touch handling in my view controller, touchesBegan only fires for the first touch - if the user touches something and then, without moving the first finger, taps on something else, that tap is not recorded in either touchesBegan or touchesMoved. But if I implement my touch handling in the UIViews themselves, once a view registers a touch, the views around it register nothing while the finger stays down and moves. Only if the user lifts the finger and puts it back down will the surrounding views register the touch.
So my question is: let's say I have two views side by side, my touch handling code is implemented in the views, and I put my finger down on view 1. I then slide my finger over to view 2 - what do I need to do to make view 2 register that touch, which started in view 1 and never "ended"?
Set the userInteractionEnabled property of the UIView to NO:
view.userInteractionEnabled = NO;
UIView has the following property:
@property(nonatomic, getter=isUserInteractionEnabled) BOOL userInteractionEnabled
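The idea is that with the subviews no longer intercepting touches, the parent view receives every touch event and can hit-test the subviews itself. A sketch of what that might look like in the parent (activateSubview: is a hypothetical method):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:self];
        for (UIView *subview in self.subviews) {
            // subview.frame is in self's coordinate space.
            if (CGRectContainsPoint(subview.frame, point)) {
                [self activateSubview:subview];
            }
        }
    }
}
// touchesBegan: would do the same for the initial touch-down.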
OK, I figured out what was going on. The thing is, my views are subviews of a scroll view, which is itself a subview of my main view. With scrollEnabled = NO, I could touch my subviews - but apparently the scroll view was only forwarding me the initial touch event, and all subsequent touches were treated as part of that initial event. Because of that, I had many weird problems: for example, touching two views one after the other would select and highlight both, but lifting the first finger off the screen would de-select both. This was not the desired behavior.
So what I did is subclass the scroll view and override the touch handling methods to forward the events to its next responder - its superview, which is the view where I'm doing my touch handling. Now it works!
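Roughly what such a subclass might look like (a sketch; the class name is made up, and each override just forwards to the next responder, which here is the superview doing the touch handling):

@interface ForwardingScrollView : UIScrollView
@end

@implementation ForwardingScrollView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesCancelled:touches withEvent:event];
}

@end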
I have a UITableView with custom cells; when you touch a cell, it adds a UIImageView as a subview.
I want the image to disappear after the user lifts their finger, and I have a touchesEnded method on the UIImageView subview, but it's never called.
It is only called if you lift your finger, press down again, and then release.
How do I get the method called when the original touch ends?
I'm kinda going for what Snapchat does when you view images.
The reason the subview does not get the touchesEnded event is that it has not received the touchesBegan event: these two come in pairs - whichever view gets the touches began is going to get the touches ended. Your UIImageView could not get touchesEnded because it wasn't there at the time; it gets touchesEnded the second time around when you press down and release because it's there for both events.
There are several ways around this problem:
Process view removal in the same place where you process the addition of the UIImageView - when you add the subview, store a __weak reference to it in a separate variable. When the view that added the UIImageView gets the touchesEnded event, go to that variable and remove the subview.
Keep the UIImageView there, but control its transparency - rather than adding and removing the subview, start it fully transparent, make it opaque on touch, and make it transparent again on release (see the sketch after this list).
Don't add the temporary UIImageView at all; use a CALayer instead - it looks like you are adding the image view simply to host an image for a short time. There is a much lighter-weight way of doing that: a CALayer. This approach should be easier to implement, because the layer does not participate in the handling of touch events.
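For instance, the second option might look like this (a sketch, assuming the cell keeps a hypothetical overlayImageView around, created once and initially transparent):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    self.overlayImageView.alpha = 1.0; // show on touch-down
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    self.overlayImageView.alpha = 0.0; // hide on release
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    self.overlayImageView.alpha = 0.0; // also hide if the touch is cancelled
}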
Presumably the touchesBegan method is on the cell; that's when the subview gets added.
The touch has already been recorded by the cell, so the cell's touchesEnded is what will be called when you lift your finger. That's where you need to handle removing the subview from the screen. Save a reference to the subview in the cell class, and if it is not nil, remove it in touchesEnded. Simple as that.
The touch began on the cell, before the subview existed, and that same touch is going to end on the cell. You can't transfer a touch to another view while it's in progress.
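In code, that might look like this (a sketch; addedImageView and fullImage are hypothetical properties on the cell):

@property (nonatomic, weak) UIImageView *addedImageView;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    UIImageView *imageView = [[UIImageView alloc] initWithImage:self.fullImage];
    [self addSubview:imageView]; // the cell retains the subview
    self.addedImageView = imageView; // keep only a weak reference
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    if (self.addedImageView != nil) {
        [self.addedImageView removeFromSuperview];
    }
}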
Make sure you don't drag your finger between touching the view and lifting it; if you drag, the touchesCancelled: method gets called instead. So I think your view is too small a touch target. If so, make the view bigger and try it again; it will work for you then.
I have a situation where I apply an effect to a UIView when a touch begins and reverse that effect when the touch ends. So basically I am tracking the touchesBegan, touchesEnded and touchesCancelled methods of the UIView.
But the problem is that when the view goes off screen, i.e. when it or one of its parents gets removed from its superview, it does not get any more touch events. Is there any way to deliver this "last" touchesEnded event to the view? Alternatively, if the UIView got notified about becoming invisible, I could use that event for the same purpose.
OK, I am going to move the answers from the comments on the original question here, to make a good summary of the important points.
The reason I am tracking touch events is that I want to apply some nice effects, such as glowing, on touch start and remove those effects on touch end.
The reason why I cannot simulate touchesEnded when removing those views is that I do not remove them directly; instead I remove one of their ancestor views. I cannot keep track of ancestor views all the way up to the UIWindow - I think that is technically impossible. Instead, the framework should provide this as an event, I think.
I solved my problem by overriding the -(void)willMoveToWindow:(UIWindow *)newWindow method and checking whether newWindow is nil.
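For completeness, that override might look like this (a sketch; -removeTouchEffects stands in for whatever undoes the glow):

- (void)willMoveToWindow:(UIWindow *)newWindow {
    [super willMoveToWindow:newWindow];
    if (newWindow == nil) {
        // The view (or one of its ancestors) is leaving the window:
        // treat it as the final "touch ended" and reverse the effect.
        [self removeTouchEffects];
    }
}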