I built a SplitView class that looks like the picture below:
As you can see, the SplitView always has two subviews, so it has two properties: leftView and rightView.
The SplitView's job is to manage the size proportion of its subviews.
If I swipe to the left, it moves the separator and changes the size of each subview, so it looks like this:
It works perfectly until I use a UITableView as the leftView and rightView.
This is because the UITableView inside it is the one that processes the touch event, not the superview (the SplitView).
And because the code that intercepts and responds to the touch is in SplitView, nothing happens when the UITableView inside it is the one that receives the touch event.
To work around this, I override hitTest to make the SplitView the responder, not the UITableView inside it.
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    return self
}
Now I can get the touch event in the SplitView even though the user swipes on the UITableView:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        self.startInfo = StartInfo(timestamp: touch.timestamp,
                                   touchLocation: touch.location(in: self),
                                   separatorLocation: separatorView.frame.origin)
    }
}
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first, let startInfo = startInfo {
        let location = touch.location(in: self)
        if moveHorizontally == nil {
            let deltaX = abs(location.x - startInfo.touchLocation.x)
            let deltaY = abs(location.y - startInfo.touchLocation.y)
            if deltaX > 4.0 || deltaY > 4.0 {
                moveHorizontally = deltaX > deltaY
            }
        }
        if let moveHorizontally = moveHorizontally {
            if moveHorizontally {
                print("the user intends to adjust separator position")
                adjustSeparatorViewPosition(usingTouchLocation: location)
            } else {
                print("the user intends to scroll the table view")
                if rightView.frame.contains(location) {
                    rightView.touchesMoved(touches, with: event) // doesn't work
                } else {
                    leftView.touchesMoved(touches, with: event) // doesn't work
                }
            }
        }
    }
}
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    startInfo = nil
    moveHorizontally = nil
    leftView.touchesEnded(touches, with: event)
    guard let touch = touches.first else { return }
    if touch.location(in: self).x < self.bounds.width / 2 {
        UIView.animate(withDuration: 0.3) {
            self.setSeparatorLocationX(to: self.separatorMinX)
        }
    } else {
        UIView.animate(withDuration: 0.3) {
            self.setSeparatorLocationX(to: self.separatorMaxX)
        }
    }
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    startInfo = nil
    moveHorizontally = nil
    leftView.touchesCancelled(touches, with: event)
}
As you can see in this code, I added a short delay, just like UIScrollView does when the user swipes, to work out the user's intention: whether they want to scroll horizontally or vertically.
- If the swipe direction is horizontal, I want to adjust the separator location.
- If the swipe direction is vertical, I want to forward the event to the UITableView inside it (for example the leftView).
But forwarding the touch event using rightView.touchesMoved(touches, with: event) doesn't work.
How can I forward the touch event from the SplitView to the UITableView inside it?
Related
So I've been messing around trying to get the coordinates of touches on the screen. So far I can get the coordinates of one touch with this:
override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
    let touch = touches.anyObject()! as UITouch
    let location = touch.locationInView(self.view)
    println(location)
}
But when touching with two fingers I only get the coordinates of the first touch. Multi-touch works (I tested with this little tutorial: http://www.techotopia.com/index.php/An_Example_Swift_iOS_8_Touch,_Multitouch_and_Tap_Application). So my question is, how do I get the coordinates of the second (and third, fourth...) touch?
** Updated to Swift 4 and Xcode 9 (8 Oct 2017) **
First of all, remember to enable multi-touch events by setting
self.view.isMultipleTouchEnabled = true
in your UIViewController's code, or by using the corresponding storyboard option in Xcode.
Otherwise you'll always get a single touch in touchesBegan (see documentation here).
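For context, a minimal sketch of where that flag might live, assuming a plain UIViewController subclass; the class name is only illustrative:
import UIKit

class MultiTouchViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Without this, the touches set passed to touchesBegan will never
        // contain more than one UITouch.
        view.isMultipleTouchEnabled = true
    }
}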
Then, inside touchesBegan, iterate over the set of touches to get their coordinates:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        let location = touch.location(in: self.view)
        print(location)
    }
}
The given touches argument is a set of detected touches.
You only see one touch because you select just one of the touches with:
touches.anyObject() // Selects a random object (touch) from the set
In order to get all touches, iterate over the given set:
for obj in touches.allObjects {
    let touch = obj as UITouch
    let location = touch.locationInView(self.view)
    println(location)
}
You have to iterate over the different touches. That way you can access every touch.
for touch in touches {
    // Handle touch
    let touchLocation = touch.locationInView(self.view)
}
In Swift 1.2 this has changed, and touchesBegan now provides a Set of NSObjects.
To iterate through them, cast the touches collection as a Set of UITouch objects as follows:
override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent) {
    let touchSet = touches as! Set<UITouch>
    for touch in touchSet {
        let location = touch.locationInView(self.view)
        println(location)
    }
}
For Swift 3, based on @Andrew's answer:
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    let touchSet = touches
    for touch in touchSet {
        let location = touch.location(in: self.view)
        print(location)
    }
}
EDIT: My bad, that's not answering your question. I had the same problem and someone linked me to this previous answer.
Anyway, I had to change a few things to make it work in Swift 3; here is my current code:
var fingers = [String?](repeating: nil, count: 5)

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    for touch in touches {
        let point = touch.location(in: self.view)
        for (index, finger) in fingers.enumerated() {
            if finger == nil {
                fingers[index] = String(format: "%p", touch)
                print("finger \(index+1): x=\(point.x) , y=\(point.y)")
                break
            }
        }
    }
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesMoved(touches, with: event)
    for touch in touches {
        let point = touch.location(in: self.view)
        for (index, finger) in fingers.enumerated() {
            if let finger = finger, finger == String(format: "%p", touch) {
                print("finger \(index+1): x=\(point.x) , y=\(point.y)")
                break
            }
        }
    }
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesEnded(touches, with: event)
    for touch in touches {
        for (index, finger) in fingers.enumerated() {
            if let finger = finger, finger == String(format: "%p", touch) {
                fingers[index] = nil
                break
            }
        }
    }
}
I still have a little problem, but I think it's linked to the gesture recognizer in my code.
That should do the trick, though: it will print the coordinates of each touch point to your console.
In Swift 3 and 4
Identify touch pointer by its hash:
// SmallDraw
func pointerHashFromTouch(_ touch: UITouch) -> Int {
    return Unmanaged.passUnretained(touch).toOpaque().hashValue
}
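As a hedged illustration of how that hash might be used, here is a small sketch that tracks active touches in a dictionary keyed by the pointer hash; the activeTouches property and the view-controller context are assumptions, not part of SmallDraw:
var activeTouches: [Int: CGPoint] = [:]

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        // Remember where each finger went down, keyed by its pointer hash.
        activeTouches[pointerHashFromTouch(touch)] = touch.location(in: self.view)
    }
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        // Forget the finger once it lifts off.
        activeTouches.removeValue(forKey: pointerHashFromTouch(touch))
    }
}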
I have a drawing app with a canvas larger than the size of the phone screen. I want to implement scrolling with two fingers and drawing with one finger. So far I can make the scrolling work just fine, but when it comes to drawing, the line begins and then the view where the drawing happens loses control of the touch, so that only the first part of the line is drawn. I think the scroll view takes control back. Dots can be drawn just fine.
This is my subclassed UIScrollView:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touches = event?.touches(for: self) else { return }
    if touches.count < 2 {
        self.next?.touchesBegan(touches, with: event)
    } else {
        super.touchesBegan(touches, with: event)
    }
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touches = event?.touches(for: self) else { return }
    if touches.count < 2 {
        self.next?.touchesEnded(touches, with: event)
    } else {
        super.touchesEnded(touches, with: event)
    }
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touches = event?.touches(for: self) else { return }
    if touches.count < 2 {
        self.next?.touchesMoved(touches, with: event)
    } else {
        super.touchesMoved(touches, with: event)
    }
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touches = event?.touches(for: self) else { return }
    if touches.count < 2 {
        self.next?.touchesCancelled(touches, with: event)
    } else {
        super.touchesCancelled(touches, with: event)
    }
}

override func touchesShouldCancel(in view: UIView) -> Bool {
    if type(of: view) == UIScrollView.self {
        return true
    }
    return false
}
You will need a Long Press Gesture Recognizer connected to your scroll view, set with a Min Duration of 0 seconds, configured to recognize only 1 touch, and with the Cancels Touches in View option active.
You can find all of these options under the Attributes Inspector in Interface Builder.
Please play a little with the Tolerance setting to fine-tune the results.
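If you prefer to configure the same recognizer in code rather than in Interface Builder, a rough equivalent might look like this; scrollView and handleLongPress(_:) are assumed names for your scroll view property and an @objc action method, not something from the question:
let longPress = UILongPressGestureRecognizer(target: self,
                                             action: #selector(handleLongPress(_:)))
longPress.minimumPressDuration = 0      // "Min Duration" of 0 seconds
longPress.numberOfTouchesRequired = 1   // recognize only 1 touch
longPress.cancelsTouchesInView = true   // the "Cancels touches in view" option
scrollView.addGestureRecognizer(longPress)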
Is it possible to detect touches and get the location of a touch from a UIViewController that is currently being used as the previewingContext view controller for 3D Touch? (I want to change the image within the preview controller when the touch moves from left to right.)
I've tried both touchesBegan and touchesMoved; neither of them is fired.
class ThreeDTouchPreviewController: UIViewController {

    func getLocationFromTouch(touches: Set<UITouch>) -> CGPoint? {
        guard let touch = touches.first else { return nil }
        return touch.location(in: self.view)
    }

    // Not fired
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let location = getLocationFromTouch(touches: touches)
        print("LOCATION", location)
    }

    // Not fired
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        let location = getLocationFromTouch(touches: touches)
        print("LOCATION", location)
    }
}
I even tried adding a UIPanGestureRecognizer.
I'm attempting to replicate Facebook's 3D Touch feature, where the user can move a finger from left to right to change the currently displayed image.
Video for context: https://streamable.com/ilnln
If you want to achieve the same result as Facebook's 3D Touch feature, you need to create your own 3D Touch gesture recognizer class:
import UIKit.UIGestureRecognizerSubclass

class ForceTouchGestureRecognizer: UIGestureRecognizer {

    var forceValue: CGFloat = 0
    var isForceTouch: Bool = false

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesBegan(touches, with: event)
        handleForceWithTouches(touches: touches)
        state = .began
        self.isForceTouch = false
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesMoved(touches, with: event)
        handleForceWithTouches(touches: touches)
        if self.forceValue > 6.0 {
            state = .changed
            self.isForceTouch = true
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesEnded(touches, with: event)
        state = .ended
        handleForceWithTouches(touches: touches)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesCancelled(touches, with: event)
        state = .cancelled
        handleForceWithTouches(touches: touches)
    }

    func handleForceWithTouches(touches: Set<UITouch>) {
        if touches.count != 1 {
            state = .failed
            return
        }
        guard let touch = touches.first else {
            state = .failed
            return
        }
        forceValue = touch.force
    }
}
Now you can add this gesture to your view controller in the viewDidLoad method:
override func viewDidLoad() {
    super.viewDidLoad()

    let gesture = ForceTouchGestureRecognizer(target: self, action: #selector(imagePressed(sender:)))
    self.view.addGestureRecognizer(gesture)
}
Now you can lay out your controller's UI in the storyboard. Add a cover view above the UICollectionView and a UIImageView in the center, and connect them to IBOutlets in code, along the lines of the sketch below.
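A rough sketch of the outlets and data the handler below assumes; the property names simply mirror those used in imagePressed(sender:), and the exact types are an assumption rather than part of the original answer:
@IBOutlet weak var collectionView: UICollectionView?
@IBOutlet weak var coverView: UIView?               // dims the collection view while previewing
@IBOutlet weak var selectedImageView: UIImageView?  // shows the enlarged image in the center
var images: [UIImage] = []                          // data source backing the collection view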
Now you can add the handler method for the gesture:
// @objc is needed so the method can be referenced from #selector under Swift 4 and later.
@objc func imagePressed(sender: ForceTouchGestureRecognizer) {
    let location = sender.location(in: self.view)
    guard let indexPath = collectionView?.indexPathForItem(at: location) else { return }
    let image = self.images[indexPath.row]
    switch sender.state {
    case .changed:
        if sender.isForceTouch {
            self.coverView?.isHidden = false
            self.selectedImageView?.isHidden = false
            self.selectedImageView?.image = image
        }
    case .ended:
        print("force: \(sender.forceValue)")
        if sender.isForceTouch {
            self.coverView?.isHidden = true
            self.selectedImageView?.isHidden = true
            self.selectedImageView?.image = nil
        } else {
            // TODO: handle selecting items of UICollectionView here,
            // you can refer to this SO question for more info: https://stackoverflow.com/questions/42372609/collectionview-didnt-call-didselectitematindexpath-when-superview-has-gesture
            print("Did select row at indexPath: \(indexPath)")
            self.collectionView?.selectItem(at: indexPath, animated: true, scrollPosition: .centeredVertically)
        }
    default: break
    }
}
From this point you need to customize your view to make it look the same way Facebook's does.
I also created a small example project on GitHub, https://github.com/ChernyshenkoTaras/3DTouchExample, to demonstrate it.
In my main view in a UIViewController I have a mapView and another view (let's say view A) that sits above the mapView. Both of them have frames equal to self.view.bounds. View A is a resizable rectangle, similar to those used to crop images. My goal is to let the user specify an area on the map, so I want the user to be able to zoom in and out of the map as well as change the rectangle's width and height proportions, since restricting view A to a non-resizable square would limit it too much.
I got this project from GitHub, https://github.com/justwudi/WDImagePicker, from which I am using the resizable rectangle functionality. In the second picture of the GitHub link, there's a rectangle with 8 dots and a shaded area outside. I want the touch to pass through to the map behind view A if the user touches the shaded area. Only if the user touches the area inside the dots, or the dots themselves (meaning they want to resize the rectangle), should view A recognize the touch. So I modified the touch-handling code in view A and have this:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        if cropBorderView.frame.contains(touch.location(in: self)) {
            print("touch contains - touchesbegan")
            //self.isUserInteractionEnabled = true
        } else {
            print("Touch does not contain - touchesbegan")
            self.touchesCancelled(touches, with: event)
            //return
        }
        let touchPoint = touch.location(in: cropBorderView)

        anchor = self.calculateAnchorBorder(touchPoint)
        fillMultiplyer()
        resizingEnabled = true
        startPoint = touch.location(in: self.superview)
    }
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    print("inside touches moved")
    if let touch = touches.first {
        if cropBorderView.frame.contains(touch.location(in: self)) {
            print("touch contains - touchesmoved")
            //self.isUserInteractionEnabled = true
        } else {
            print("Touch does not contain - touchesmoved ")
            self.touchesCancelled(touches, with: event)
            //return
        }
        if resizingEnabled! {
            self.resizeWithTouchPoint(touch.location(in: self.superview))
        }
    }
}
It does recognize the touch both inside and outside, as I wanted, but it does not stop the touch when I touch outside. This means calling self.touchesCancelled(touches, with: event) is not working. Calling return crashes and does not work either. Are there any solutions to this problem?
Thank you for your time and consideration.
touchesCancelled(_:with:) is just a notification about the touch; it will not work this way.
As far as I understand, you implemented the touch handlers in your overlay UIView. If so, you can try replacing the call to self.touchesCancelled(touches, with: event) with the cancelTracking(with:) function from the UIControl class:
else {
    print("Touch does not contain - touchesmoved ")
    self.cancelTracking(with: event)
}
Updated solution, based on hitTest:
I've checked possible solutions and it seems that you can use hitTest(_:with:) to avoid unnecessary touch recognition. The following example is a Swift playground; you can tap and drag touches and see what happens in the console:
import UIKit
import PlaygroundSupport

class JSGView: UIView {
    var centerView = UIView()

    override func didMoveToSuperview() {
        frame = CGRect(x: 0, y: 0, width: 320, height: 480)
        backgroundColor = UIColor.clear

        centerView.frame = CGRect(x: 110, y: 190, width: 100, height: 100)
        centerView.backgroundColor = UIColor.blue
        addSubview(centerView)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        dump(event)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let touch = touches.first {
            if hitTest(touch.location(in: self), with: event) != nil {
                print("Touch passed hit test and seems valid")
                super.touchesCancelled(touches, with: event)
                return
            }
        }
        print("Touch isn't passed hit test and will be ignored")
        super.touchesMoved(touches, with: event)
    }

    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        if centerView.bounds.contains(centerView.convert(point, from: self)) {
            return centerView
        }
        return nil
    }
}

class JSGViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.frame = CGRect(x: 0, y: 0, width: 320, height: 480)
        let customView = JSGView()
        view.addSubview(customView)
    }
}

let controller = JSGViewController()
PlaygroundPage.current.liveView = controller.view
In my AppDelegate, I want to detect when a tap event has been made on the status bar. In order to do so, I need to get the CGPoint from the event. How do I get it from this code?
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    super.touchesBegan(touches, withEvent: event)
    let location = // how to get a CGPoint ????
    let statusBarFrame = UIApplication.sharedApplication().statusBarFrame
    if CGRectContainsPoint(statusBarFrame, location) {
        print("Status bar touched")
    } else {
        print("Not touched")
    }
}
You're using Swift 2; you could retrieve the tap location as follows:
override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent) {
    if let touch = touches.first as? UITouch {
        // The view you would like to get the tap location from.
        let tapPoint = touch.locationInView(self.view)

        let statusBarFrame = UIApplication.sharedApplication().statusBarFrame
        if CGRectContainsPoint(statusBarFrame, tapPoint) {
            print("Status bar touched")
        } else {
            print("Not touched")
        }
    }
}
A UITouch object has a method locationInView: that allows you to find the location of a touch in a particular view. From the docs:
Returns the current location of the receiver in the coordinate system of the given view.
The view object in whose coordinate system you want the touch located. A custom view that is handling the touch may specify self to get the touch location in its own coordinate system. Pass nil to get the touch location in the window’s coordinates.
So try:
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    super.touchesBegan(touches, withEvent: event)
    guard let touch = touches.first else {
        return
    }
    let location = touch.locationInView(nil)
    let statusBarFrame = UIApplication.sharedApplication().statusBarFrame
    if CGRectContainsPoint(statusBarFrame, location) {
        print("Status bar touched")
    } else {
        print("Not touched")
    }
}
This works regardless of device orientation
Swift 5.1
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    guard let touch = touches.first else {
        return
    }
    let location = touch.location(in: nil)
}
You'd have to have a view that you touched. If you've managed to get your app delegate to receive touch events, you could try getting the touch location using the app delegate's window.
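For example, a minimal sketch of that suggestion in current Swift; going through the app delegate's window is just one way to reach a window, and the responder that actually receives the touches depends on your setup:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    guard let touch = touches.first else { return }
    // The app delegate's window is the root of the view hierarchy, so a
    // location in the window can be compared against the status bar frame.
    guard let window = UIApplication.shared.delegate?.window ?? nil else { return }
    let location = touch.location(in: window)
    if UIApplication.shared.statusBarFrame.contains(location) {
        print("Status bar touched")
    }
}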