I am working on an Atlas app that displays a map from a PDF file, which the user can zoom and pan. I am using vfr Reader for this purpose and it works fine. I want to detect the touch location so that I can determine which state was selected. I get the correct coordinate when the view is not zoomed or panned, using the code below:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:theScrollView];
}
But when I zoom and pan, the touch location changes and I no longer get the correct state. How can I get the correct selected state?
On debugging the vfr Reader classes I found that I can get the exact touch location in the ReaderContentPage class. This class reports the correct touch location even after zooming. You can get the point in the processSingleTap: method as below:
- (id)processSingleTap:(UITapGestureRecognizer *)recognizer
{
    CGPoint point = [recognizer locationInView:self];
}
CGPoint point gives the correct touch location. Then use a delegate method to pass the coordinates back to the class that needs them.
How to create a bubble view for the touch location (Xcode for iOS)
I have a problem zooming the image at a certain location using a bubble view at the touch location.
I want to zoom the location I touch and display the bubble above my finger. But with the code below, it zooms and shows the bubble at a location above where I actually touched. How can I zoom the touch location and show the bubble directly above my finger?
UITouch *touch = [touches anyObject];
CGPoint touchLocation = [touch locationInView:self.view];
if (CGRectContainsPoint(self.MeterView.frame, touchLocation)) {
    // The x origin is hard-coded to 3, so the bubble is not centred on the touch
    _zoomView = [[BubbleView alloc] initWithFrame:CGRectMake(3, touchLocation.y - 120, 120, 120)];
    [_zoomView setZoomScale:2.0];
}
There are a couple of very good projects on GitHub that provide bubble-zoom functionality that follows your finger:
iOS-MagnifyingGlass
BKZoomView
I hope they help you.
I want to shift the position of some UILabels when the user swipes the screen--specifically, move them the same distance the user swiped their finger.
How would I go about detecting and implementing this?
Thanks
Override touchesBegan:withEvent: and touchesMoved:withEvent:. Keep track of the starting location in touchesBegan:withEvent:. If repeated calls to touchesMoved:withEvent: (it is called while the drag is happening) show that you are moving fast enough for the action to be a swipe rather than just a drag, use the difference between the starting location and the current touch location to animate moving the UILabels.
You can do that by overriding the following UIView methods.
In touchesBegan:withEvent: you should store the position where the user first touches the screen, so you can identify a left or right swipe when the touches end.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    self.startPoint = [touch locationInView:self];
}
Record the final position in touchesEnded:withEvent:. By comparing these two positions you can determine the direction of movement.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint endPoint = [touch locationInView:self];
    if (self.startPoint.x < endPoint.x) {
        // Right swipe
    } else {
        // Left swipe
    }
}
I hope this helps.
Is there any way to detect which pixels you are touching while keeping your hand/finger on the screen (iPhone/iPad)? Essentially, drawing the shape of my hand (not as detailed as a fingerprint).
Thanks.
What you want to achieve is sadly not possible. Current devices can only detect up to 11 simultaneous touches, each reported as a point (more info in this post). There is no way to get the real touch area or the actual touched pixels.
If you are looking for the coordinate of the touch point, use the following code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentTouchPosition = [touch locationInView:self.view];
    NSLog(@"%f, %f", currentTouchPosition.x, currentTouchPosition.y);
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    NSLog(@"%f %f", currentPoint.x, currentPoint.y);
}
I want to develop a paint app for my iPad.
When I use this code and drag a finger to paint a line on my pad, it prints
(x,1),(x,3),(x,6),(x,7),(x,12),(x,15),(x,18)....
I thought it should print
(x,1),(x,2),(x,3),(x,4),(x,5),(x,6),(x,7),(x,8),(x,9),(x,10),(x,11),(x,12),(x,13),(x,14),(x,15),(x,16),(x,17),(x,18)....
Can touchesMoved: not deliver contiguous coordinates?
It depends on the speed of the swipe.
If you swipe really slowly you'll probably get (x,1),(x,2),(x,3),(x,4),(x,5),(x,6),(x,7),(x,8),(x,9),(x,10), but if you swipe fast you may get as few as (x,1),(x,5),(x,10).
If you are developing a paint app you will have to check whether the user has lifted their finger, and if not, draw the line segment between consecutive points.
Good luck!