How to get (X,Y) coordinate of touch in UIWebView - ios

I have a UIWebView that displays a generated html table. When the user taps on a cell in the html table, my app needs to know which cell they've tapped, and the (x,y) coordinate of the tap location so I can display a popover at that point.
I've implemented shouldStartLoadWithRequest in my UIWebView delegate. In my web page, I've embedded JavaScript code that captures the touch event and passes what should be the (x,y) coordinate of the touched point in a URL request as follows:
var x, y;
function init()
{
// Add an event listener for touch events - set x and y to the coordinate of the touch point
document.addEventListener('touchstart', function(event) {
x = event.touches[0].clientX;
y = event.touches[0].clientY;
}, false);
}
function cellTapped(event)
{
window.location.href="file://myapp/dostuff?x=" + x + "&y=" + y;
}
In my html table, each cell gets an onclick event that calls cellTapped():
<td onclick="cellTapped(event)">...</td>
So whenever the user touches anywhere in the UIWebView, I get the coordinate of the touch point, which I save off in x and y. If they touch within one of the table cells, I receive the touch event (which sets x and y), then cellTapped() gets called and I set window.location.href, passing the (x,y) coordinate into my app.
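For reference, the delegate side that catches this URL looks roughly like the following (a sketch; the "myapp" host check and the query parsing are illustrative, based on the URL format above):
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
{
    NSURL *url = request.URL;
    if ([url.host isEqualToString:@"myapp"]) {
        CGFloat x = 0.0f, y = 0.0f;
        // Pull x and y out of the query string ("x=...&y=...").
        for (NSString *pair in [url.query componentsSeparatedByString:@"&"]) {
            NSArray *keyValue = [pair componentsSeparatedByString:@"="];
            if ([keyValue count] != 2) continue;
            if ([[keyValue objectAtIndex:0] isEqualToString:@"x"]) x = [[keyValue objectAtIndex:1] floatValue];
            if ([[keyValue objectAtIndex:0] isEqualToString:@"y"]) y = [[keyValue objectAtIndex:1] floatValue];
        }
        // ... use (x, y) to position the popover ...
        return NO; // don't actually navigate to the fake URL
    }
    return YES;
}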
This all works beautifully. Unless the user has zoomed or scrolled the UIWebView. When they zoom or scroll, the x and y coordinates I'm getting from event.touches[0].clientX and event.touches[0].clientY are off by some varying number of pixels (varies with the amount of zoom and how far up/down or left/right the web view is scrolled).
Is there some way to determine the zoom ratio and scroll position of the web view so that I can adjust my x and y coordinates accordingly? The zoomScale and contentOffset properties from UIScrollView do not seem to be exposed in UIWebView.

Use the UIGestureRecognizerDelegate approach:
Declare UIGestureRecognizerDelegate conformance in your declaration file (i.e. your .h file).
Step 1: Set the delegate of the gesture recognizer (in the .m file):
UITapGestureRecognizer *webViewTapped = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapAction:)];
webViewTapped.numberOfTapsRequired = 1;
webViewTapped.delegate = self;
[webView addGestureRecognizer:webViewTapped];
[webViewTapped release];
Step 2: Implement this delegate method (in the .m file):
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
return YES;
}
Step 3: Now implement the tapAction function:
- (void)tapAction:(UITapGestureRecognizer *)sender
{
CGPoint point = [sender locationInView:self.view]; // get x and y from here
}
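From there, showing the popover at the tapped location is straightforward. A minimal sketch for the body of tapAction:, assuming a UIPopoverController stored in a popover property and the web view in a webView property (both names are placeholders):
CGPoint point = [sender locationInView:self.webView];
// Anchor the popover on a tiny rect at the touch point.
CGRect anchor = CGRectMake(point.x, point.y, 1.0f, 1.0f);
[self.popover presentPopoverFromRect:anchor
                              inView:self.webView
            permittedArrowDirections:UIPopoverArrowDirectionAny
                            animated:YES];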

EDIT: In iOS 5 and above, the scrollView property of UIWebView is exposed and accessible so this becomes a non-issue. In my case, I still need to support devices running iOS 4 (believe it or not...), so the following solves it for older versions.
By looping through the subviews of my UIWebView, I can find the underlying UIScrollView, then use its zoomScale and contentOffset properties to find the zoom and scroll position:
UIScrollView *scrollView = nil;
for (UIView *view in myWebView.subviews)
{
    if ([view isKindOfClass:[UIScrollView class]])
    {
        // Found the underlying UIScrollView object
        scrollView = (UIScrollView *)view;
        // Read the zoom and scroll offsets
        float zoom = scrollView.zoomScale;
        float xOffset = scrollView.contentOffset.x;
        float yOffset = scrollView.contentOffset.y;
    }
}
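With those values, the coordinates coming from the page can be mapped into the web view's coordinate space. A sketch, under the assumption that the JavaScript passes document coordinates (pageX/pageY-style values; if the values are already viewport-relative, drop the offset terms):
// pageX / pageY: document coordinates received from the JavaScript side (placeholder names).
// zoom, xOffset, yOffset: values read from the underlying UIScrollView above.
CGFloat viewX = pageX * zoom - xOffset;
CGFloat viewY = pageY * zoom - yOffset;
CGPoint pointInWebView = CGPointMake(viewX, viewY); // where to anchor the popover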
I don't know if Apple would approve of this for app store submission, since I assume they had their reasons for not exposing the underlying UIScrollView object, but it does solve my problem. My app is distributed under an Enterprise license anyway, so app store submission isn't an issue for me.

Related

Expand UIScrollView interactive area and differentiate swiping and tapping

I'm using a UIScrollView to make a gallery-like UI with paging functionality. Basically like this:
Since I need paging, I set the width of the scroll view equal to the width of a single page, in my example, the width of the pink rectangle.
But I want two extra things:
Tapping the yellow or blue area should bring the corresponding rectangle to the center.
One can scroll/swipe on the yellow or blue area (outside the scroll view), which means the entire width of the screen is scrollable.
I followed this thread and added - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event. BUT by doing so, I can only achieve my second goal. When I set a selector or delegate to handle taps on the yellow and blue areas, it doesn't work. Any idea about it?
That answer you referenced is one of my old favorites. It doesn't contemplate your first requirement, but I think it can handle it very neatly with just the addition of a tap gesture recognizer.
Create it on your "ClipView":
UITapGestureRecognizer *tapGR = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tap:)];
[self.myClipView addGestureRecognizer:tapGR];
// myClipView is the view that contains the paging scroll view
- (void)tap: (UITapGestureRecognizer *)gr {
// there are a few challenges here:
// 1) get the tap location in the correct coordinate system
// 2) convert that to which "page" was tapped
// 3) scroll to that page
}
Challenge 1) is easy thanks to the gesture recognizer's locationInView: method:
CGPoint location = [gr locationInView:self.scrollView];
For challenge 2) we need to work out what page within your scroll view was tapped. That can be done with pretty simple arithmetic given the page width.
// assuming you have something like this
#define kPAGE_WIDTH // some float
// page is just how many page-widths are represented by location.x
NSInteger page = floor(location.x / kPAGE_WIDTH);
Now, challenge 3) is easy because we can convert a page to its scroll position straightforwardly...
CGFloat x = page * kPAGE_WIDTH;
[self.scrollView setContentOffset:CGPointMake(x, 0.0f) animated:YES];
Or, all in one chunk of code...
- (void)tap:(UITapGestureRecognizer *)gr {
    CGPoint location = [gr locationInView:self.scrollView];
    NSInteger page = floor(location.x / kPAGE_WIDTH);
    CGFloat x = page * kPAGE_WIDTH;
    [self.scrollView setContentOffset:CGPointMake(x, 0.0f) animated:YES];
}
EDIT
You may also want to exclude the "current page" area from the gesture recognizer. That's simply done by qualifying the test in the tap method.
The only trick is to get the tap position in the same coordinate system as the scroll view's frame, that is, the clip view...
CGPoint locationInClipper = [gr locationInView:gr.view];
And the SDK provides a nice method to test...
BOOL inScrollView = [self.scrollView pointInside:locationInClipper withEvent:nil];
So...
- (void)tap:(UITapGestureRecognizer *)gr {
    CGPoint locationInClipper = [gr locationInView:gr.view];
    BOOL inScrollView = [self.scrollView pointInside:locationInClipper withEvent:nil];
    if (!inScrollView) {
        CGPoint location = [gr locationInView:self.scrollView];
        NSInteger page = floor(location.x / kPAGE_WIDTH);
        CGFloat x = page * kPAGE_WIDTH;
        [self.scrollView setContentOffset:CGPointMake(x, 0.0f) animated:YES];
    }
}
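For completeness, the hit-test override from the referenced answer (which the question already uses for the second goal) looks roughly like this; a sketch assuming the clip view keeps an outlet to the paging scroll view:
// ClipView.m - pass touches that land outside the scroll view's frame on to
// the scroll view, so the entire screen width is scrollable.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *child = [super hitTest:point withEvent:event];
    if (child == self) {
        return self.scrollView;
    }
    return child;
}
The tap recognizer added to the clip view still fires for those touches, because the scroll view it returns is one of the clip view's subviews.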

How to make Clickable Image area with zoom in & out in iOS

I have an image as a background and I want to make certain parts of this image clickable, with zooming in and out. Is there any way to do something like that?
Don't create a new view for your gesture recognizer. The recognizer implements a locationInView: method. Set it up for the view that contains the sensitive region. In the handleGesture: handler, hit-test the region you care about like this:
0) Do all this on the view that contains the region you care about. Don't add a special view just for the gesture recognizer.
1) Setup mySensitiveRect
@property (assign, nonatomic) CGRect mySensitiveRect;
@synthesize mySensitiveRect=_mySensitiveRect;
self.mySensitiveRect = CGRectMake(0.0, 240.0, 320.0, 240.0);
2) Create your gestureRecognizer:
gr = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handleGesture:)];
[self.view addGestureRecognizer:gr];
// if not using ARC, you should [gr release];
// mySensitiveRect coords are in the coordinate system of self.view
- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer {
    CGPoint p = [gestureRecognizer locationInView:self.view];
    if (CGRectContainsPoint(self.mySensitiveRect, p)) {
        // Add your zooming code here
    } else {
        NSLog(@"got a tap, but not where I need it");
    }
}
The sensitive rect should be initialized in myView's coordinate system, the same view to which you attach the recognizer.
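As for the zooming itself, if the image sits inside a UIScrollView, one option is to zoom in on the sensitive rect when it is hit (a sketch; self.scrollView and the toggle behavior are assumptions, not part of the answer above):
// Inside the CGRectContainsPoint branch: zoom in on the sensitive region,
// or zoom back out if we are already zoomed in.
// Note: zoomToRect: expects the rect in the zoomed view's coordinate
// system; convert self.mySensitiveRect first if the two differ.
if (self.scrollView.zoomScale > self.scrollView.minimumZoomScale) {
    [self.scrollView setZoomScale:self.scrollView.minimumZoomScale animated:YES];
} else {
    [self.scrollView zoomToRect:self.mySensitiveRect animated:YES];
}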
Apple has a demo app called PhotoScroller that implements a zoomable, scrollable set of images (in a page view controller, but you don't need that.) That would be a good starting point for what you need.
Their sample apps used to be built into the Xcode docs. Since Xcode 6 I haven't seen them linked in the docs any more.
You can download PhotoScroller from Apple's online iOS Developer Library. (link)

iOS: Implement a rotating wheel with custom views being horizontally stationary [duplicate]

I am looking for a little guidance to start figuring out an animation that tracks finger movement and moves a collection of UIButtons along the outer path of a circle.
I am picturing it will kind of have a revolver feel to it, like each one locks into place at the bottom,
or like swiping through one of the slide inserts of these.
Thanks in advance.
(Sample code on GitHub)
It's not really that difficult, there's just a lot of trigonometry involved.
What I'm going to describe is not an animation, since you asked in the title for it to track the position of your finger. An animation would involve its own timing function, but since we're using a touch gesture, we can use the inherent timing of that event and just rotate the view accordingly. (TL;DR: the user keeps the time of the movement, not an implicit timer.)
Keeping track of the finger
First of all, let's define a convenient class that keeps track of the angle; I'm gonna call it DialView. It's really just a subclass of UIView that has the following property:
DialView.h
@interface DialView : UIView
@property (nonatomic, assign) CGFloat angle;
@end
DialView.m
- (void)setAngle:(CGFloat)angle
{
_angle = angle;
self.transform = CGAffineTransformMakeRotation(angle);
}
The UIButtons can be contained within this view (I'm not sure if you want the buttons to be responsible for the rotation? I'm gonna use a UIPanGestureRecognizer, since it's the most convenient way).
Let's build the view controller that will handle a pan gesture inside our DialView; let's also keep a reference to the DialView.
MyViewController.h
@class DialView;
@interface ViewController : UIViewController
// The previously defined dial view
@property (nonatomic,weak) IBOutlet DialView *dial;
// UIPanGestureRecognizer selector method
- (IBAction)didReceiveSpinPanGesture:(UIPanGestureRecognizer*)gesture;
@end
It's up to you how you hook up the pan gesture; personally, I did it in the nib file. Now, the main body of this function:
MyViewController.m
- (IBAction)didReceiveSpinPanGesture:(UIPanGestureRecognizer*)gesture
{
// This struct encapsulates the state of the gesture
struct state
{
CGPoint touch; // Current touch position
CGFloat angle; // Angle of the view
CGFloat touchAngle; // Angle between the finger and the view
CGPoint center; // Center of the view
};
// Static variable to record the beginning state
// (alternatively, use a @property or an _ivar)
static struct state begin;
CGPoint touch = [gesture locationInView:nil];
if (gesture.state == UIGestureRecognizerStateBegan)
{
begin.touch = touch;
begin.angle = self.dial.angle;
begin.center = self.dial.center;
begin.touchAngle = CGPointAngle(begin.touch, begin.center);
}
else if (gesture.state == UIGestureRecognizerStateChanged)
{
struct state now;
now.touch = touch;
now.center = begin.center;
// Get the current angle between the finger and the center
now.touchAngle = CGPointAngle(now.touch, now.center);
// The angle of the view shall be the original angle of the view
// plus or minus the difference between the two touch angles
now.angle = begin.angle - (begin.touchAngle - now.touchAngle);
self.dial.angle = now.angle;
}
else if (gesture.state == UIGestureRecognizerStateEnded)
{
// (To be Continued ...)
}
}
CGPointAngle is a function I made up; it's just a nice wrapper for atan2 (and I'm throwing in CGPointDistance for free if you call NOW!):
CGFloat CGPointAngle(CGPoint a, CGPoint b)
{
return atan2(a.y - b.y, a.x - b.x);
}
CGFloat CGPointDistance(CGPoint a, CGPoint b)
{
return sqrt(pow((a.x - b.x), 2) + pow((a.y - b.y), 2));
}
The key here is that there are two angles to keep track of:
The angle of the view itself
The angle formed between your finger and the center of the view.
In this case, we want the view's angle to be originalAngle + deltaFinger, and that's what the above code does; I just encapsulate all the state in a struct.
Checking the radius
If you want to track touches only near "the border of the view", use the CGPointDistance function and check whether the distance between begin.center and now.touch falls within a specific range.
That's homework for ya!
Snapping back
Ok, now, this part is an actual animation, since the user no longer has control of it.
The snapping can be achieved by having a defined set of angles and, when the user lifts the finger (gesture ended), snapping back to the closest one, like this:
else if (gesture.state == UIGestureRecognizerStateEnded)
{
// Number of "buttons"
NSInteger buttons = 8;
// Angle between buttons
CGFloat angleDistance = M_PI*2 / buttons;
// Get the closest angle
CGFloat closest = round(self.dial.angle / angleDistance) * angleDistance;
[UIView animateWithDuration:0.15 animations:^{
self.dial.angle = closest;
}];
}
The buttons variable is just a stand-in for the number of buttons in the view. The equation is actually super simple; it just uses rounding to snap to the closest angle.
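One piece the answer leaves out is laying the buttons out around the dial in the first place. A minimal sketch (the radius, button size, and count here are made-up values):
// Place `count` buttons evenly around the dial (e.g. from viewDidLoad).
NSInteger count = 8;
CGFloat radius = 120.0f;
CGPoint dialCenter = CGPointMake(CGRectGetMidX(self.dial.bounds), CGRectGetMidY(self.dial.bounds));
for (NSInteger i = 0; i < count; i++) {
    CGFloat angle = (M_PI * 2 / count) * i;
    UIButton *button = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    button.frame = CGRectMake(0.0f, 0.0f, 44.0f, 44.0f);
    button.center = CGPointMake(dialCenter.x + radius * cos(angle),
                                dialCenter.y + radius * sin(angle));
    [self.dial addSubview:button];
}
To keep the buttons horizontally upright while the dial spins (the "horizontally stationary" part of the question), counter-rotate each button whenever the angle changes, e.g. button.transform = CGAffineTransformMakeRotation(-angle) inside setAngle:.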

UIButton position while UIImage zoom on UIScrollView

I have a scenario where I need to implement an offline map concept, for which I am using an image of the map on a UIScrollView that zooms with a pinch gesture, which works fine.
Problem
I have a UIButton on the map. While zooming, the button does not track its position with respect to the UIImageView being scaled. I am able to reframe the button without affecting its size, but the position is wrong.
TL;DR:
I need to reproduce the map-view-with-annotations concept on a UIScrollView with a UIImage on it. Can anyone help?
Thanks in advance :)
I have found the answer to this. I initially stored the button's frame in a CGRect called initialButtonFrame. Then I updated the button frame (only the origin, not the size, as I didn't want the button to zoom like the image) using the scroll view delegate:
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale
{
[self manageImageOnScrollView];//here i managed the image's coordinates and zoom
[self manageButtonCoordinatesWithRespectToImageWithScale:scale];
}
-(void)manageButtonCoordinatesWithRespectToImageWithScale:(float)scaleFactor
{
//initialButtonFrame is frame of button
self.button.frame = CGRectMake((initialButtonFrame.origin.x * scaleFactor),
(initialButtonFrame.origin.y * scaleFactor),
initialButtonFrame.size.width,
initialButtonFrame.size.height);
[self.scrollView addSubview:self.button];// I removed the button from superview while zooming and later added with updated button coordinates which I got here
}
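If the button should track continuously while the pinch is still in progress, the same repositioning can also be driven from scrollViewDidZoom:, which fires repeatedly during the gesture (a sketch reusing the helper above):
- (void)scrollViewDidZoom:(UIScrollView *)scrollView
{
    // Reposition the button on every zoom change so it never lags behind.
    [self manageButtonCoordinatesWithRespectToImageWithScale:scrollView.zoomScale];
}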
If you know your current offset and zoom of your map, you should be able to compute the position of your button:
//Assuming your map image has its origin at 0, 0
CGFloat mapOffsetX, mapOffsetY; // these would come from your map as you calculated it
CGFloat mapZoomFactor; // 1.0 means not zoomed, 3.0 means zoomed in 3x, etc.
CGPoint buttonAnchorPosition; // the position of your button on your map at 1.0 zoom
CGFloat buttonX = buttonAnchorPosition.x * mapZoomFactor + mapOffsetX;
CGFloat buttonY = buttonAnchorPosition.y * mapZoomFactor + mapOffsetY;
CGPoint buttonPosition = CGPointMake(buttonX, buttonY);
button.center = buttonPosition;
Try that, good luck

New foursquare venue detail map

I really love the way Foursquare designed the venue detail view, especially the map with the venue location in the "header" of the view... How was it done? The details are obviously some UIScrollView (maybe a UITableView?) and behind it (in the header) there is a map, so when you scroll up the map is being uncovered as the scroll view bounces... Does anyone have an idea how to do this?
Here's the way I managed to reproduce it:
You need a UIViewController with a UIScrollView as its view. Then, the content of the UIView you add to your scroll view should look like this:
- The frame of the MKMapView has a negative y position. In this case, we can only see 100pts of the map in the default state (before dragging).
- You need to disable zooming and scrolling on your MKMapView instance.
Then, the trick is to move down the centerCoordinate of the MKMapView when you drag down, and adjust its center position.
For that, we compute how much latitude one point represents, so that we know how far the center coordinate of the map should be moved when the view is dragged by x points on the screen:
- (void)viewDidLoad {
[super viewDidLoad];
UIScrollView* scrollView = (UIScrollView*)self.view;
[scrollView addSubview:contentView];
scrollView.contentSize = contentView.frame.size;
scrollView.delegate = self;
center = CLLocationCoordinate2DMake(43.6010, 7.0774);
mapView.region = MKCoordinateRegionMakeWithDistance(center, 1000, 1000);
mapView.centerCoordinate = center;
//We compute how much latitude represent 1point.
//so that we know how much the center coordinate of the map should be moved
//when being dragged.
CLLocationCoordinate2D referencePosition = [mapView convertPoint:CGPointMake(0, 0) toCoordinateFromView:mapView];
CLLocationCoordinate2D referencePosition2 = [mapView convertPoint:CGPointMake(0, 100) toCoordinateFromView:mapView];
deltaLatFor1px = (referencePosition2.latitude - referencePosition.latitude)/100;
}
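(For reference, the snippets here assume a few ivars/outlets that aren't shown in the answer; roughly something like this:)
IBOutlet UIView *contentView;      // the content placed inside the scroll view
IBOutlet MKMapView *mapView;       // the map behind it, with the negative y frame
CLLocationCoordinate2D center;     // the venue coordinate
double deltaLatFor1px;             // latitude represented by one point on screen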
Once those properties are initialized, we need to implement the behavior of the UIScrollViewDelegate. When we drag, we convert the movement expressed in points to a latitude, and then we move the center of the map by half of this value.
- (void)scrollViewDidScroll:(UIScrollView *)theScrollView {
CGFloat y = theScrollView.contentOffset.y;
// did we drag ?
if (y<0) {
//we moved y pixels down, how much latitude is that ?
double deltaLat = y*deltaLatFor1px;
//Move the center coordinate accordingly
CLLocationCoordinate2D newCenter = CLLocationCoordinate2DMake(center.latitude-deltaLat/2, center.longitude);
mapView.centerCoordinate = newCenter;
}
}
You get the same behavior as the Foursquare app (but better: in the Foursquare app the map recentering tends to jump; here, changing the center is done smoothly).
The example above is nice. If you need more help, I think they're using something very similar to RBParallaxTableViewController. https://github.com/Rheeseyb/RBParallaxTableViewController
It's essentially the same effect that Path uses for its header photo.
Yonel's answer is nice, but I found a problem, as I have a pin at the center of the map. Because of the negative y, the pin is hidden under my UINavigationBar.
So instead of setting the negative y, I correct my mapView's frame according to the scroll offset.
My mapView is 320 x 160:
_mapView.frame = CGRectMake(0, 160, 320, -160+y);
Hope this helps someone.
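(In context, that line replaces the negative-y setup and lives in the same scrollViewDidScroll: delegate method shown above; a sketch:)
- (void)scrollViewDidScroll:(UIScrollView *)theScrollView {
    CGFloat y = theScrollView.contentOffset.y;
    // Resize the 320 x 160 map according to the scroll offset,
    // instead of giving it a negative y origin.
    _mapView.frame = CGRectMake(0, 160, 320, -160 + y);
}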

Resources