iOS zoom view using BubbleView at the touch location

How do I create a bubble view at the touch location? (Xcode, iOS)
I have a problem zooming an image at a certain location using a bubble view at the touch location. I want to zoom the spot I touch, with the bubble displayed above my finger. But with the code below, it zooms a location above my touch and shows the bubble there instead. How do I zoom the touch location and show the bubble above my finger?
UITouch *touch = [touches anyObject];
CGPoint touchLocation = [touch locationInView:self.view];
if (CGRectContainsPoint(self.MeterView.frame, touchLocation)) {
    // Note: the x origin is hardcoded to 3, so the bubble does not
    // follow the finger horizontally.
    _zoomView = [[BubbleView alloc] initWithFrame:CGRectMake(3, touchLocation.y - 120, 120, 120)];
    [_zoomView setZoomScale:2.0];
}

There are a couple of very good projects on GitHub that provide bubble-zoom functionality that follows your finger:
iOS-MagnifyingGlass
BKZoomView
I hope they help.

Related

Correct touch location after zooming and panning a pdf drawn in CATiledLayer on top of the UIScrollview in iOS

I am working on an atlas app that displays a map, rendered from a PDF file, which I can zoom and pan. I am using vfr Reader for this purpose and it is working fine. I want to detect the touch location so that I can determine which state was selected. I get the correct coordinate when the view is not zoomed or panned, using the code below:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:theScrollView];
}
But when I zoom and pan, the touch location changes and I no longer get the correct state selected. How can I get the correct selected state?
While debugging the vfr Reader classes I found that the ReaderContentPage class gives the exact touch location even after zooming. You can get the point in its processSingleTap: method, as below:
- (id)processSingleTap:(UITapGestureRecognizer *)recognizer
{
    CGPoint point = [recognizer locationInView:self];
}
`point` holds the correct touch location even after zooming. Then use a delegate method to hand the coordinates back to the class that needs them.

How to draw a polygon and calculate the area of that polygon on mapview in iOS

Please take a look at this link:
http://www.daftlogic.com/projects-google-maps-area-calculator-tool.htm
There, the user taps on some locations, which form a polygon, and the area covered by the polygon is shown in the output. I want to achieve the same thing in iOS with MapKit.
Currently I am using the following simple code to get the touched coordinates on the map view, but I haven't got much further than that. Please share any sample links or code.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.isDrawingPolygon)
        return;

    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.mapView];
    CLLocationCoordinate2D coordinate = [self.mapView convertPoint:location toCoordinateFromView:self.mapView];
    [self addCoordinate:coordinate replaceLastObject:YES];
}

Looking for an alternative to touchesMoved to detect any touch event within a defined area?

I have a virtual keyboard in my app with six keys, and the whole thing is just an image in a UIImageView. I determined the exact x and y coordinates corresponding to each 'key' in the image and used the following code to respond to the user interacting with the keyboard:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];
    if (point.y < 333 && point.y > 166 && point.x < 72 && point.x > 65) {
        NSLog(@"Key pressed");
    }
    // Repeat per key...
}
However, I have realized that this approach is not very robust: changing orientation (portrait to landscape) or changing devices invalidates my x and y coordinates and causes problems.
So I am looking for an alternative to hardcoded x and y values, and to touchesMoved in general. Ideally it would be a button that calls its action when tapped, or when the user drags a finger into its area (even very slowly; I tried swipe detection before and it required too exaggerated a movement).
Is it possible to set up a button that fires when tapped, or when a touch that started outside the button proceeds into it? If not, what are my alternatives?
Thanks SE!
You need the winSize property, which will fix the problem you are having with screen sizes:
CGSize size = [[CCDirector sharedDirector] winSize];
I believe you are using Cocos2D? If so, you can use this size property instead of hardcoding numbers. :)
To convert your point, use:
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:[touch view]];
location = [[CCDirector sharedDirector] convertToGL:location];
To see if it's within the bounds/box of the button, you could try:
if (CGRectContainsPoint([self.myButton boundingBox], location)) {
    // Execute method
}
This all assumes you are using Cocos2D and a CCSprite for your button.
It should work on any screen size, portrait or landscape. :)

read x,y pixel touched ipad/iphone?

Is there any way to detect which pixels are being touched while a hand/finger is kept on the screen (iPhone/iPad)? Essentially I want to capture the shape of my hand (not fingerprint-level detail).
Thanks.
What you want to achieve is sadly not possible. Current devices can only detect up to 11 touches as points (more info in this post); there is no way to get the real touch area or the exact touched pixels.
If you are looking for the coordinate of the touch point, use the following code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentTouchPosition = [touch locationInView:self.view];
    NSLog(@"%f,%f", currentTouchPosition.x, currentTouchPosition.y);
}

Changing locationInView UITouch

I'm using the code below to drag an image across the screen. The one thing I'd like to change is how the image tracks the touch. Right now the image's position is set to every touch location, so whenever I try to move it, it snaps to my finger. How do I make it move relative to my finger instead? E.g., if the image is in the middle of the screen and I put my finger at the bottom-right of the screen, the image shouldn't jump to my finger. Hope that makes sense. Thanks in advance.
// get touch event
UITouch *touch = [[event allTouches] anyObject];
// get the touch location
CGPoint touchLocation = [touch locationInView:touch.view];
// move the image view
image.center = touchLocation;
This may help: uiimage-detecting-touch-and-dragging
