I know this has probably been asked before, but I've seen many approaches and I don't know which is best for me, so please don't send me a link to another post unless it addresses my problem directly.
I have a view controller with a UIView at the top, like a header (the header is bigger than it appears because it is partially hidden above the top of the screen). That view contains a UIButton: on touch-up-inside it reveals the entire header view, and tapping again returns it to its starting position (changing the frame with an animation). I also want to be able to drag the view, changing its position only on the y axis (dragging up and down). I thought about adding the drag-inside/outside events to the button, but those don't give me the position of the finger. I also need to know when the user releases the drag, so the view can animate to either of its two possible states (fully shown or partially hidden). Is this a touchesBegan / touchesMoved / touchesEnded thing? If it is, please provide a code example. I also want to do the same with another view, but that one is on the left side and moves on the x axis. Alternatively, maybe it can be done with the drag events if I can just save a CGPoint of the last touch; maybe that's better. Any help or other suggestions are appreciated.
Look at using a UIPanGestureRecognizer to detect the touch movements. Use the gesture's translationInView: to set the view's y position. The translation is the total movement since the start of the gesture, so you don't need to remember and accumulate the offset position yourself.
The main thing to worry about while implementing this is bounding the view's y position, so that no matter how far the user drags, the view won't go too high or too low on the screen.
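For example, a minimal sketch (assuming the header view is exposed as a headerView property, startY is a CGFloat property you add to remember the frame's origin when the drag begins, and the hidden/shown y offsets are hypothetical values for your layout):

static const CGFloat kHiddenY = -200.0; // hypothetical y for the partially hidden state
static const CGFloat kShownY  = 0.0;    // hypothetical y for the fully shown state

- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
    [self.headerView addGestureRecognizer:pan];
}

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    CGPoint translation = [pan translationInView:self.view];

    if (pan.state == UIGestureRecognizerStateBegan) {
        self.startY = self.headerView.frame.origin.y; // remember where the drag started
    } else if (pan.state == UIGestureRecognizerStateChanged) {
        CGRect frame = self.headerView.frame;
        // Bound the y position so the view can't be dragged past its two states.
        frame.origin.y = MIN(kShownY, MAX(kHiddenY, self.startY + translation.y));
        self.headerView.frame = frame;
    } else if (pan.state == UIGestureRecognizerStateEnded ||
               pan.state == UIGestureRecognizerStateCancelled) {
        // Snap to the nearest of the two states when the finger lifts.
        CGFloat midpoint = (kHiddenY + kShownY) / 2.0;
        CGFloat targetY = (self.headerView.frame.origin.y > midpoint) ? kShownY : kHiddenY;
        [UIView animateWithDuration:0.3 animations:^{
            CGRect frame = self.headerView.frame;
            frame.origin.y = targetY;
            self.headerView.frame = frame;
        }];
    }
}

The same approach works for your left-side view; just use the translation's x component and bound the x position instead.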
Use a UIPanGestureRecognizer; it's a class dedicated to handling exactly this kind of drag/pan gesture.
Everything is described in Apple's documentation, including examples, so you should find your answer there.
There is also some sample code in the Apple Developer Library that shows you how to use gesture recognizers if needed.
I'm currently working on an interactive view that relies heavily on the user's touch location. I have found that there are a few ways to interact with a UITapGestureRecognizer while VoiceOver is on, but when I tap a point, the values I get are very wrong. I've looked elsewhere, but my use case is outside the norm, so there isn't much out there to tell me what is going on. Has anyone experienced this before?
I am aware that I can change the accessibilityTraits to UIAccessibilityTraitAllowsDirectInteraction, which gives me the correct screen point, but I would like to know what is causing this issue, at the very least for the sake of knowledge. To interact with the UITapGestureRecognizer, I either double-tap or 3D-touch by pressing hard on the screen. The latter method doesn't work for the tap gesture but does work for the pan gesture.
This is the only line I use to get my screen points (my map view is a UIImageView):
CGPoint screenPoint = [tapGesture locationInView:map];
I'm using a map of a building and I try to tap the same corner or landmark for my testing. I know I can't hit the same exact point every time, but I do use a stylus and I can get pretty close.
Without VoiceOver on I would get the result: (35.500, 154.363)
With VoiceOver on and tapping in generally the same spot, I get: (187.500, 197.682)
The point I am using to test is on the left side of the screen, but the result with VoiceOver on is in the middle of the screen. I believe the y-axis value may have changed because of my toolbar's size, but I have no idea what is throwing off the x-axis value. If more information is needed, let me know.
UPDATE: Upon further investigation, it turns out that the UITapGestureRecognizer will always return (187.500, 197.682) no matter where I touch in the map view when VoiceOver is on. That point seems to be the middle of the map view. Oddly enough though, the UIPanGestureRecognizer will give me the correct (x,y) for my view if I use the 3D touch while VoiceOver is on.
On a side note not related to the problem at hand: if I use the accessibility trait UIAccessibilityTraitAllowsDirectInteraction, the method UIAccessibilityConvertFrameToScreenCoordinates returns a frame that is higher than my view. It works fine if I do not change the trait.
Your problem may stem from the reference point used when VoiceOver is on.
Verify what your point coordinates refer to: view or screen coordinates?
I suggest you take a look at the following elements:
accessibilityFrame
accessibilityFrameInContainerSpace
UIAccessibilityConvertFrameToScreenCoordinates
Depending on your project, these elements may be useful for achieving your purpose.
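For instance, a quick check you could run (a sketch; map and tapGesture are the names from your snippet) to compare the view's frame in screen coordinates against the points the recognizer reports:

// Convert the map view's bounds into screen coordinates so the tap
// locations can be compared in the same coordinate space.
CGRect screenFrame = UIAccessibilityConvertFrameToScreenCoordinates(map.bounds, map);
NSLog(@"map frame on screen: %@", NSStringFromCGRect(screenFrame));

// Log the gesture's location in both spaces.
CGPoint viewPoint = [tapGesture locationInView:map];
CGPoint windowPoint = [tapGesture locationInView:nil]; // window/screen coordinates
NSLog(@"view point: %@, window point: %@",
      NSStringFromCGPoint(viewPoint), NSStringFromCGPoint(windowPoint));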
So I have this project that I took over from somebody else, and they implemented this OneFingerRotationGestureRecognizer (https://github.com/melle/OneFingerRotationGestureDemo/blob/master/OneFingerRotationGestureDemo/OneFingerRotationGestureRecognizer.m) for a circular slider. Additionally, they added a UITapGestureRecognizer on top of that, so you can tap a value within the circular slider and the value jumps to that specific one. The problem is, when I drag the control just a very small amount (imagine putting your thumb onto it and tilting left/right), the UITapGestureRecognizer also fires! This is a problem because I want to be able to grab the circular slider wherever I want (there is no handle or anything), and when I only drag it a little, the value just jumps to the spot where I did that small drag. Somehow I need to cancel the tap gesture as soon as the OneFingerRotationGestureRecognizer starts registering touches. I tried what is described here, as sketched below: https://developer.apple.com/documentation/uikit/touches_presses_and_gestures/coordinating_multiple_gesture_recognizers/preferring_one_gesture_over_another?language=objc but didn't have any success with that :-(.
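For reference, this is essentially the setup from that page (the recognizer names stand in for the ones in my project):

// Only let the tap fire once the rotation recognizer has failed, so that a
// tiny drag counts as a rotation rather than a tap.
[tapGesture requireGestureRecognizerToFail:rotationGesture];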
What can I do? I'm afraid the solution is so simple that I just don't see it.
When swiping between stories in Instagram's new "Stories" feature (you know, that cube-like transition when going from one story to another), I can't manage to understand how they do it!
First of all, if you dig deeper into the functionality, you find that it works exactly like a UIPageViewController's transition:
- It bounces when swiping fast from one view to another.
- You can pause the swipe in the middle of the transition by touching the screen.
The development team couldn't have used a solution based on the better-known workarounds out there, e.g.:
https://www.appcoda.com/custom-view-controller-transitions-tutorial/
because, as far as I know, my two statements above are not possible to achieve with anything other than a UIPageViewController.
This leaves me thinking that the Instagram developer team gained access to a new transition style for the UIPageViewController (also known as a cube scroll). Or is it a workaround that I'm not aware of?
Any ideas?
I took a shot at recreating this functionality a while back. You can check the source code on GitHub: https://github.com/oyvind-hauge/OHCubeView
I'm using a scroll view (with paging enabled), and I manipulate each subview as a function of its current x-offset in the scroll view. The actual animations are done on each subview's layer using Core Animation (more specifically, by transforming an identity matrix, given by CATransform3DIdentity, using CATransform3DRotate).
The shadow effects are also applied to the subviews' layers (view.layer.opacity), with the amount of shadow determined by how much of the view is showing on screen.
My implementation solves both of your concerns (it bounces when swiping, and you can pause swipes mid-transition). I'm sure this could also have been implemented using a UIPageViewController, but I hate working with those.
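The core of the approach looks roughly like this (a simplified sketch in the spirit of the project, not its exact code; childViews is assumed to hold the page-sized subviews, and the rotation sign may need flipping depending on the fold direction you want):

// Move a layer's anchor point without shifting the view on screen.
static void setAnchorPoint(CGPoint anchorPoint, UIView *view) {
    CGPoint newPoint = CGPointMake(view.bounds.size.width * anchorPoint.x,
                                   view.bounds.size.height * anchorPoint.y);
    CGPoint oldPoint = CGPointMake(view.bounds.size.width * view.layer.anchorPoint.x,
                                   view.bounds.size.height * view.layer.anchorPoint.y);
    CGPoint position = view.layer.position;
    position.x += newPoint.x - oldPoint.x;
    position.y += newPoint.y - oldPoint.y;
    view.layer.position = position;
    view.layer.anchorPoint = anchorPoint;
}

// UIScrollViewDelegate callback: tilt each page around its vertical edge
// as a function of how far it has been scrolled.
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    CGFloat pageWidth = scrollView.bounds.size.width;
    CGFloat maxAngle = M_PI_2; // 90 degrees at a full page of offset

    [self.childViews enumerateObjectsUsingBlock:^(UIView *view, NSUInteger i, BOOL *stop) {
        // This page's offset from the current scroll position, in page units.
        CGFloat offset = (scrollView.contentOffset.x - i * pageWidth) / pageWidth;

        // Hinge on the edge that touches the neighboring page.
        setAnchorPoint(CGPointMake(offset > 0 ? 1.0 : 0.0, 0.5), view);

        CATransform3D transform = CATransform3DIdentity;
        transform.m34 = -1.0 / 500.0; // perspective
        view.layer.transform = CATransform3DRotate(transform, -offset * maxAngle, 0, 1, 0);
    }];
}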
I think you are overthinking the controller's part here. The effect can easily be achieved using a CATransformLayer and a three-sided, cube-like view structure: one view aligns with the screen plane, and two others are rotated -90 and 90 degrees on their y axes. Then use a pan gesture to rotate the scene. After a successful 90-degree turn (in either direction), you can either quickly reset the scene (so that continued rotation appears seamless, while the camera has actually shifted back to its initial position) or do a full 360-degree rotation and just update the previous and next "pages". A single controller can handle this scene. If you prefer to have each page be a controller, that is possible too: use one controller for the scene, add the page controllers as child controllers, and set up their views as described above.
See this article for more information on CATransformLayer. Their example already creates something that is quite close to what you need.
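A rough sketch of the scene setup (illustrative values; in a real setup each face would host a page's content instead of a plain colored layer):

// A three-sided "cube" built from a CATransformLayer: the middle face aligns
// with the screen plane, the other two are rotated ±90° on the y axis.
CATransformLayer *cube = [CATransformLayer layer];
cube.frame = self.view.bounds;

CGFloat side = CGRectGetWidth(self.view.bounds);
CGPoint center = CGPointMake(CGRectGetMidX(cube.bounds), CGRectGetMidY(cube.bounds));
CGFloat angles[] = { -M_PI_2, 0, M_PI_2 };

for (NSUInteger i = 0; i < 3; i++) {
    CALayer *face = [CALayer layer]; // placeholder; would host a page's content
    face.backgroundColor = [UIColor colorWithHue:i / 3.0 saturation:0.8
                                      brightness:0.9 alpha:1.0].CGColor;
    face.bounds = CGRectMake(0, 0, side, CGRectGetHeight(cube.bounds));
    face.position = center;
    // Rotate each face into place, then push it out to the cube's surface.
    CATransform3D t = CATransform3DMakeRotation(angles[i], 0, 1, 0);
    face.transform = CATransform3DTranslate(t, 0, 0, side / 2.0);
    [cube addSublayer:face];
}

// Perspective goes on the superlayer; the pan gesture then rotates the cube:
//   cube.transform = CATransform3DMakeRotation(panAngle, 0, 1, 0);
CATransform3D perspective = CATransform3DIdentity;
perspective.m34 = -1.0 / 1000.0;
self.view.layer.sublayerTransform = perspective;
[self.view.layer addSublayer:cube];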
I'm facing a delicate problem handling touch events. This is probably not a usual thing to do, but I think it is possible. I just don't know how...
I have a main view containing subviews (A) and (B), each with a lot of subviews 1, 2, 3, 4, 5, ...:
MainView
    SubView(A)
        1
        2
        3
    SubView(B)
        1
        2
        3
Some of these sub-subviews (1, 2, 4) are scroll views.
I want to switch between A and B with a two-finger pan.
I have tried attaching a UIPanGestureRecognizer to the main view, but the scroll views cancel the touches, and it only works sometimes.
I need a consistent method to first capture the touches, detect whether they form a two-finger pan, and only then decide whether to pass the touches down (or up... I'm not sure) the responder chain.
I tried to create a top-level view to handle that, but I can't get the touches to pass through that view.
I have found a lot of people with similar problems, but couldn't derive a solution to this problem from their answers.
If anyone could shed some light on this, that would be great, as I'm already getting desperate.
You can create a top-level view to capture the touches and their coordinates, then check whether the touch coordinates fall inside the subviews. You can do that using the
BOOL CGRectContainsPoint(CGRect rect, CGPoint point)
function, where rect is the frame of the view and point is the location of the touch.
Please note that frames and touch locations are relative to their superviews, so you need to convert them to the coordinate system of the app window.
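A sketch of how that check might look inside the top-level capturing view (subviewsToCheck is a hypothetical array holding the views you care about):

// Inside the top-level capturing view: decide whether a touch falls on one
// of the tracked subviews, comparing everything in window coordinates.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint windowPoint = [touch locationInView:nil]; // window coordinates

    for (UIView *subview in self.subviewsToCheck) {
        // Convert the subview's frame into window coordinates before testing.
        CGRect windowFrame = [subview.superview convertRect:subview.frame toView:nil];
        if (CGRectContainsPoint(windowFrame, windowPoint)) {
            NSLog(@"Touch landed on %@", subview);
        }
    }
}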
Or maybe this can be more helpful:
Receiving touch events on more then one UIView simultaneously
I've got a UIView which the user drags on and off of the screen (like a drawer).
The UIView has a bunch of buttons, some always visible, others hidden until the user 'opens' the drawer.
For some reason, the UIButtons that are 'off screen' in the initial UIView presentation aren't passed events when they're later moved onto the screen,
while the others receive all the events all the time.
This seems buggy to me; I would have thought the SDK would handle all this itself?
I've got a very simple example which you can take a look at: http://cl.ly/2r1c0k2p361B3B1A461L
Thanks in advance.
After a bit more research, it turned out to have to do with the view's clipping boundaries.
The buttons were being drawn, but were essentially outside the view's frame, so no events were registered.
More details in this question: Programmatically Created UIButtons Below Screen Not Working?
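For anyone hitting the same issue: one common fix (a sketch, not taken from the linked question) is to make the container's frame cover the full drag range, or to override hit-testing on the container so touches on out-of-bounds subviews are still delivered:

// A container view subclass that lets subviews outside its bounds
// receive touches (hypothetical class name).
@interface DrawerView : UIView
@end

@implementation DrawerView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Default hit-testing returns nil when the point is outside this view's
    // bounds, so check each subview explicitly instead.
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint converted = [subview convertPoint:point fromView:self];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit) {
            return hit;
        }
    }
    return [super hitTest:point withEvent:event];
}

@end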