UIWebView GestureRecognizer for showing a toolbar - iOS

I have a UIWebView (inside a view) and a toolbar which is hidden most of the time.
This is quite common behaviour for iPad magazines:
Tapping on the page hides or shows the toolbar, but the toolbar is hidden by default.
I am using shouldRecognizeSimultaneouslyWithGestureRecognizer
The behaviour right now does this:
- If the user taps the page (web view), it toggles the toolbar state using the gesture recognizer
- If the user taps the page on an interactive element such as a web link within the UIWebView, it responds to that interactive link but ALSO toggles the toolbar.
The desired behaviour is this:
- If the user taps the page on a non-interactive area, it toggles the toolbar state
- If the user taps the page on an interactive area, it ONLY responds to the web view interaction and does NOT toggle the toolbar.
Note there is an almost identical question here:
Gesture recognition with UIWebView
Even though it is marked as resolved, if you read it through you will see the solution did not work for the poster: he is still getting a dual response when he (and I) want an either/or response. I did try posting a follow-up question, but that was deleted, probably because the moderator believed the issue was resolved.

If the UIWebView, for whatever reason, catches the touch, you're not going to be able to get the UIGestureRecognizer callback as well. My only recommendation is to make whatever happens in the web page when you tap on something execute some JavaScript, and then catch that JavaScript on the native side. You can take this as an example.
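A minimal sketch of that JavaScript bridge, assuming a hypothetical toolbar:// scheme and a toolbar outlet in the same view controller (UIWebView is deprecated in favour of WKWebView, but it is what the question uses). Instead of a native tap recognizer, injected JavaScript decides whether the tap hit a link and only forwards non-link taps to the native side:

```swift
import UIKit

class PageViewController: UIViewController, UIWebViewDelegate {

    @IBOutlet var webView: UIWebView!
    @IBOutlet var toolbar: UIToolbar!

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.delegate = self
    }

    func webViewDidFinishLoad(_ webView: UIWebView) {
        // Inject a document-level tap handler: taps on links behave normally,
        // taps anywhere else navigate to a (hypothetical) toolbar:// URL
        // that the native side can intercept.
        let js = """
            document.addEventListener('click', function (e) {
                if (e.target.closest && e.target.closest('a')) { return; } // real links pass through
                window.location.href = 'toolbar://toggle';                 // signal the native side
            }, true);
            """
        webView.stringByEvaluatingJavaScript(from: js)
    }

    func webView(_ webView: UIWebView, shouldStartLoadWith request: URLRequest,
                 navigationType: UIWebView.NavigationType) -> Bool {
        // Catch the JavaScript signal and toggle the toolbar natively.
        if request.url?.scheme == "toolbar" {
            toolbar.isHidden.toggle()
            return false // don't navigate anywhere for the fake URL
        }
        return true // links and everything else load as usual
    }
}
```

The same idea ports to WKWebView by replacing the custom scheme with a WKScriptMessageHandler.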

Related

React Native press and hold, drag finger to another touchable and capture touch by that view

In my React Native app I'm trying to have a button that the user can long press and then, without lifting their finger, interact with another view. Here is roughly what I want:
Think of it like how 3D Touch/long press worked prior to iOS 13/14 (depending on the place in the system and the device): the user either 3D touched or long-pressed a button, for example an app icon, and a contextual menu popped up. Then, without lifting the finger, the user could slide onto one of the menu's buttons and release, triggering that button's tap.
I have complete control over my buttons, touchables, and views (even the tab bar is custom, as opposed to the illustrations I made above).
How can I achieve this? (I'm on React Native 0.63)
There may be a better solution to this, but off the top of my head I would use the Gesture Responder System:
https://reactnative.dev/docs/gesture-responder-system
You can have one container view that wraps the tab bar and the buttons. Then listen to the onResponderMove event to decide when those buttons should appear; this may happen, for example, when locationY exceeds some value.
You can also use the onResponderRelease event (again with the help of the locationX and locationY parameters) to determine whether the finger was released above the button.

How Do I Trigger a Button By Sliding Onto It?

I'm working on an app with a musical keyboard component.
I need 2 types of "sent events" to trigger the keys of the keyboard (UIButtons).
1) "Touch Down" triggers the buttons they way I need it to
2) The 2nd way I need buttons to be triggered is by sliding onto a button,from another button/key to the side of it as if it is "touched down" upon, when it is slid upon from the left or right.
How do I achieve this?
You can't do this using the built-in control events of the buttons, for the simple reason that you don't get an event in a button at all unless the touch is initially in that button (as I explain here: https://stackoverflow.com/a/40414929/341994).
Still, this doesn't sound very hard to do. The simplest approach is probably to put the touch response (such as a gesture recognizer) into the common superview of all the buttons. The superview can then track the gesture. And it can very easily find out which button the touch is currently inside at any given moment. So it can manage the whole interaction. It can even send messages to the buttons telling them when to highlight and unhighlight. (And if you aren't going to use the button touch handling for anything, you might even want to give up the idea that these are buttons; they could just be views or custom controls that look like buttons.)
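A minimal sketch of that superview-tracking idea, using raw touch handling rather than a gesture recognizer. KeyboardView, keys and onKeyPressed are hypothetical names, and the key buttons need isUserInteractionEnabled set to false (or to be plain views) so the superview actually receives the touches:

```swift
import UIKit

// Hypothetical container for the key "buttons". The superview owns the touch and
// decides which key the finger is currently over, so a key fires on touch-down
// and also when the finger slides onto it from a neighbouring key.
class KeyboardView: UIView {

    var keys: [UIButton] = []            // the key views; give them isUserInteractionEnabled = false
    var onKeyPressed: ((Int) -> Void)?   // hypothetical callback with the key's index

    private var currentKeyIndex: Int?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        updateCurrentKey(for: touches)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        updateCurrentKey(for: touches)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        setCurrentKey(nil)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        setCurrentKey(nil)
    }

    private func updateCurrentKey(for touches: Set<UITouch>) {
        guard let point = touches.first?.location(in: self) else { return }
        setCurrentKey(keys.firstIndex { $0.frame.contains(point) })
    }

    private func setCurrentKey(_ index: Int?) {
        guard index != currentKeyIndex else { return }
        // Unhighlight the previous key, then highlight and "play" the new one.
        if let old = currentKeyIndex { keys[old].isHighlighted = false }
        if let new = index {
            keys[new].isHighlighted = true
            onKeyPressed?(new)
        }
        currentKeyIndex = index
    }
}
```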

iOS UIPageViewController page control getting out of sync

Whenever I swipe the page view controller and, at the same time, tap the UIPageControl at the bottom on the side opposite to the swipe direction, the page that is currently being displayed and the page number shown in the pageControl end up out of sync.
Has anyone ever had this weird issue and solved it?
Let me know if you need any additional info.
Just checked out the docs for UIPageControl. I never realized this myself, but you can use page controls for input:
When a user taps a page control to move to the next or previous page, the control sends the UIControlEventValueChanged event for handling by the delegate. The delegate can then evaluate the currentPage property to determine the page to display. The page control advances only one page in either direction.
My suggestion would be to either disable the page control or update your app to respond to input on it. Setting userInteractionEnabled to false on my page control resolved the problem for me.
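A sketch of both options, assuming a hypothetical PagedViewController that owns the UIPageViewController, the UIPageControl and an array of page view controllers:

```swift
import UIKit

class PagedViewController: UIViewController {

    // Hypothetical outlets/state, just for illustration.
    @IBOutlet var pageControl: UIPageControl!
    var pageViewController: UIPageViewController!
    var pages: [UIViewController] = []
    var displayedPageIndex = 0

    override func viewDidLoad() {
        super.viewDidLoad()

        // Option 1: make the page control display-only, so taps on it can never
        // knock it out of sync with the page view controller.
        pageControl.isUserInteractionEnabled = false

        // Option 2: actually handle taps, which arrive as a .valueChanged event.
        // pageControl.isUserInteractionEnabled = true
        // pageControl.addTarget(self, action: #selector(pageControlChanged(_:)),
        //                       for: .valueChanged)
    }

    @objc func pageControlChanged(_ sender: UIPageControl) {
        let newIndex = sender.currentPage
        let direction: UIPageViewController.NavigationDirection =
            newIndex >= displayedPageIndex ? .forward : .reverse
        pageViewController.setViewControllers([pages[newIndex]],
                                              direction: direction,
                                              animated: true,
                                              completion: nil)
        displayedPageIndex = newIndex
    }
}
```

Either way, the page view controller's delegate still has to update pageControl.currentPage (and displayedPageIndex here) when a swipe finishes, otherwise the two drift apart from the other direction as well.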

Keyboard covers the whole UIWebView

I am making an iPad app and I have 3 small UIWebViews in the view. I have realized that if there is an HTML input field and I tap on it, the keyboard appears and covers one of the web views. I can move the web view up, but how can I detect which web view's input field was tapped?
I would say use a UITapGestureRecognizer active on the whole view. Using the tap recognizer, find the location where the tap landed.
Based on that location, you can tell which web view was tapped.
Based on which web view was tapped, you can shift the view up.
Hope this will do the trick.
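A minimal sketch of that approach, assuming the three web views sit in an outlet collection and that sliding the whole root view up is acceptable; keyboard-hide handling and precise coordinate-space conversion of the keyboard frame are left out for brevity, and all names are illustrative:

```swift
import UIKit

class MultiWebViewController: UIViewController, UIGestureRecognizerDelegate {

    @IBOutlet var webViews: [UIWebView]!     // outlet collection with the three web views
    private var lastTappedWebView: UIWebView?

    override func viewDidLoad() {
        super.viewDidLoad()

        // A tap recognizer over the whole view that only records where the tap
        // landed; it must not steal touches from the web views.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        tap.cancelsTouchesInView = false
        tap.delegate = self
        view.addGestureRecognizer(tap)

        NotificationCenter.default.addObserver(self,
                                               selector: #selector(keyboardWillShow(_:)),
                                               name: UIResponder.keyboardWillShowNotification,
                                               object: nil)
    }

    // Recognize alongside the web views' own gesture recognizers.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        let point = recognizer.location(in: view)
        lastTappedWebView = webViews.first { $0.frame.contains(point) }
    }

    @objc private func keyboardWillShow(_ note: Notification) {
        guard let tapped = lastTappedWebView,
              let frameValue = note.userInfo?[UIResponder.keyboardFrameEndUserInfoKey] as? NSValue
        else { return }
        // If the keyboard covers the web view that was last tapped, slide the view up.
        let keyboardHeight = frameValue.cgRectValue.height
        let overlap = tapped.frame.maxY - (view.bounds.height - keyboardHeight)
        if overlap > 0 {
            view.frame.origin.y = -overlap
        }
    }
}
```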

iOS system icons and custom buttons

I'm working on my first app, and the problem I have is that the application interface design is quite customised (even though it is a tab bar based app). Now, in one of the view controllers I need to present the user with the print interaction controller to print images. The thing is, I don't use a navigation bar or a toolbar, system or otherwise. I have managed to attach a target-action method to a custom button. However, Apple states that the printing interface should be presented from a system button (the one that looks like an arrow, kind of). The question is: is there any way of putting a system icon inside a button that is not inside a bar? Or would it be OK to somehow tell the user (with an overlay or something) that by tapping the button I'm using (the button is a red ribbon coming down from a picture frame) they will get the printing options?
Apple says:
Although the print button can be any custom button, it is recommended that you use the system item-action button shown in Figure 6-1.
I'd interpret that to mean that you can use your own button if you want to.
You might want to consider having a toolbar at the top of the view for this particular tab. Just appearing on this tab. This would make the issue moot.
You could also have the toolbar "slide in" and "slide out" from the top to provide access to this (and other?) actions. A single or double tap could instigate such an action.
Unfortunately, Apple doesn't expose the images for the system bar button items in any reasonable manner. If you'd like access to them, I suggest using the bug reporter system at Apple's developer site to request that.
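For what it's worth, presenting the print UI from a custom button is straightforward. A sketch, with the class, outlet and image property names being hypothetical; on iPad the print options appear in a popover anchored to whatever rect you pass in:

```swift
import UIKit

class FrameViewController: UIViewController {

    var imageToPrint: UIImage!   // hypothetical: the picture shown in the frame

    // Wired to the custom "red ribbon" button's Touch Up Inside event.
    @IBAction func ribbonTapped(_ sender: UIButton) {
        let printInfo = UIPrintInfo(dictionary: nil)
        printInfo.outputType = .photo
        printInfo.jobName = "Framed picture"

        let controller = UIPrintInteractionController.shared
        controller.printInfo = printInfo
        controller.printingItem = imageToPrint

        // On iPad the print options appear in a popover; anchoring it to the
        // custom button makes it clear where the action came from.
        let anchor = view.convert(sender.bounds, from: sender)
        controller.present(from: anchor, in: view, animated: true) { _, completed, error in
            if let error = error, !completed {
                print("Printing failed: \(error)")
            }
        }
    }
}
```

And if you do go with the slide-in toolbar idea, the system action glyph comes for free via UIBarButtonItem(barButtonSystemItem: .action, target:action:), and UIPrintInteractionController can be presented straight from that item with present(from:animated:completionHandler:).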
