I'm using UIScrollView to build a gallery-like UI with paging. Basically like this:
Since I need paging, I set the width of the scroll view equal to the width of a single page, in my example the width of the pink rectangle.
But I want two extra things:
Tapping the yellow or blue area should bring the corresponding rectangle to the center.
One can scroll/swipe on the yellow or blue area (outside the scroll view), which means the entire width of the screen is scrollable.
I followed this thread and added - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event. But by doing so, I can only achieve my second goal. When I set a selector or delegate to handle taps on the yellow and blue areas, it doesn't work. Any idea about it?
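For reference, the hitTest: override suggested in that thread typically looks like the sketch below (the scrollView property name is illustrative; the override simply forwards touches that land on the clip view itself to the paging scroll view):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *child = [super hitTest:point withEvent:event];
    if (child == self) {
        // the touch landed on the clip view itself (the yellow/blue area),
        // so route it to the paging scroll view
        return self.scrollView;
    }
    return child;
}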
That answer you referenced is one of my old favorites. It doesn't contemplate your first requirement, but I think that approach can handle it very neatly with just the addition of a tap gesture recognizer.
Create it on your "ClipView":
UITapGestureRecognizer *tapGR = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tap:)];
[self.myClipView addGestureRecognizer:tapGR];
// myClipView is the view that contains the paging scroll view
- (void)tap: (UITapGestureRecognizer *)gr {
// there are a few challenges here:
// 1) get the tap location in the correct coordinate system
// 2) convert that to which "page" was tapped
// 3) scroll to that page
}
Challenge 1) is easy thanks to the gesture recognizer, which answers locationInView:
CGPoint location = [gr locationInView:self.scrollView];
For challenge 2) we need to work out what page within your scroll view was tapped. That can be done with pretty simple arithmetic given the page width.
// assuming you have something like this
#define kPAGE_WIDTH // some float
// page is just how many page-widths are represented by location.x
NSInteger page = floor(location.x/kPAGE_WIDTH);
Now, challenge 3) is easy because we can convert a page to its scroll position straightforwardly...
CGFloat x = page * kPAGE_WIDTH;
[self.scrollView setContentOffset:CGPointMake(x, 0.0f) animated:YES];
Or, all in one chunk of code...
- (void)tap: (UITapGestureRecognizer *)gr {
CGPoint location = [gr locationInView:self.scrollView];
NSInteger page = floor(location.x/kPAGE_WIDTH);
CGFloat x = page * kPAGE_WIDTH;
[self.scrollView setContentOffset:CGPointMake(x, 0.0f) animated:YES];
}
EDIT
You may also want to exclude the "current page" area from the gesture recognizer. That's simply done by qualifying the test in the tap method.
The only trick is to get the tap position in the same coordinate system as the scroll view's frame, that is, the clip view...
CGPoint locationInClipper = [gr locationInView:gr.view];
And the SDK provides a nice method to test...
BOOL inScrollView = [self.scrollView pointInside:locationInClipper withEvent:nil];
So...
- (void)tap: (UITapGestureRecognizer *)gr {
CGPoint locationInClipper = [gr locationInView:gr.view];
BOOL inScrollView = [self.scrollView pointInside:locationInClipper withEvent:nil];
if (!inScrollView) {
CGPoint location = [gr locationInView:self.scrollView];
NSInteger page = floor(location.x/kPAGE_WIDTH);
CGFloat x = page * kPAGE_WIDTH;
[self.scrollView setContentOffset:CGPointMake(x, 0.0f) animated:YES];
}
}
Related
I am working on an iOS map app that includes an interactive map. The interactive map is a subclass of UIImageView and is placed in a scroll view. My view hierarchy is shown below:
When the user taps some part of the map, the view controller performs an animated segue (like a zoom-in to that area of the map). I can start the segue from any point on the screen, but to do this properly I need the exact coordinates of the user's tap relative to the screen itself. Since the ImageView sits on top of the ScrollView, it uses a different coordinate system, larger than the screen size. It doesn't matter which area of the map has been tapped; what matters is the tapped CGPoint on the physical screen.
The ImageView uses its own code to get the coordinates of a tap:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];

    // cancel the previous touch-ended event
    [NSObject cancelPreviousPerformRequestsWithTarget:self];

    CGPoint touchPoint = [[touches anyObject] locationInView:self];
    NSValue *touchValue = [NSValue valueWithCGPoint:touchPoint];

    // perform a new one
    [self performSelector:@selector(_performHitTestOnArea:)
               withObject:touchValue
               afterDelay:0.1];
}
When I add a gesture recognizer, it works, but then the ImageView doesn't receive any touches and therefore can't trigger the segue.
The gesture recognizer code I attempted to use:
UITapGestureRecognizer *rec = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapRecognized:)];
[someView addGestureRecognizer:rec];
[rec release];
// elsewhere
- (void)tapRecognized:(UITapGestureRecognizer *)recognizer
{
if(recognizer.state == UIGestureRecognizerStateRecognized)
{
CGPoint point = [recognizer locationInView:recognizer.view];
// again, point.x and point.y have the coordinates
}
}
So, is there any way to get the coordinates in the two different reference systems, or to make these recognizers work simultaneously without interfering with each other?
Solved
I use this code to convert the touched point from one view's reference system to another:
CGPoint pointInViewCoords = [self.parentView convertPoint:self.imageView.touchPoint fromView:self.imageView];
Where self.parentView is the "View" in the hierarchy image, which has the size of the screen.
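As a sketch of how this fits together (the startZoomSegueFromPoint: helper below is an assumed name, not part of the original code), the controller can then use both coordinate systems:
// Sketch: pointInMap is in the image view's own (map-sized) coordinate system;
// pointOnScreen is in the screen-sized parent view's coordinate system.
CGPoint pointInMap    = self.imageView.touchPoint;
CGPoint pointOnScreen = [self.parentView convertPoint:pointInMap fromView:self.imageView];
[self startZoomSegueFromPoint:pointOnScreen]; // hypothetical helper that drives the animated segue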
I want to do a smooth swipe animation. The swipe should only be possible when the user swipes the page from the right or left border; a swipe starting from the middle of the page should not be possible. Both directions should work, left to right and right to left.
I have tried lots of swipe animation sample and demo code, but it's not what I want. I want an animation like this: https://itunes.apple.com/in/app/clear-tasks-to-do-list/id493136154?mt=8
In this app, when you touch the right border it swipes smoothly. Please guide me on how to do this animation. Thanks in advance.
Sorry for the late reply. Just saw this question.
If you want your swipe operation to happen from the edges, create 2 subviews at the far ends (left and right) of your main view and give them a width of 30 or 40.
I believe you have 2 other views popping in from the left and right. So in order to do this you need to add those 2 views right on top of your main view.
Now for the left view, set its right horizontal space constraint connecting to the main view to a value less than (-1) x the width of the main view. For the right view, set its right horizontal space constraint connecting to the main view to a value greater than the width of the main view, so that both views sit outside the main view.
(X stands for a value greater than or equal to the main view's width.)
Add two NSLayoutConstraint variables as IBOutlets holding these 2 constraints.
NSLayoutConstraint *leftViewHorizondalRightPadding;
NSLayoutConstraint *rightViewHorizondalRightPadding;
Now add the UISwipeGestureRecognizers to these subviews (indicated in orange).
UISwipeGestureRecognizer *leftToRightSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
[leftToRightSwipe setDirection:UISwipeGestureRecognizerDirectionRight];
[self.leftSubview addGestureRecognizer:leftToRightSwipe];
UISwipeGestureRecognizer *rightToLeftSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
[rightToLeftSwipe setDirection:UISwipeGestureRecognizerDirectionLeft];
[self.rightSubview addGestureRecognizer:rightToLeftSwipe];
Now, in the swipe handler, distinguish between the swipe directions:
-(void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
NSLog(#"Swipe received.");
if (recognizer.direction == UISwipeGestureRecognizerDirectionRight) {
//It's leftToRight
leftViewHorizondalRightPadding.constant = 0;
[UIView animateWithDuration:1
animations:^{
[self.view layoutIfNeeded];
}];
}
else {
//It's rightToLeft
rightViewHorizondalRightPadding.constant = 0;
[UIView animateWithDuration:1
animations:^{
[self.view layoutIfNeeded];
}];
}
}
This will make a swipe animation from left to right and right to left.
Hope this helps.
After you create the 2 swipe gesture recognisers you should set their delegates. Then use this delegate method:
UISwipeGestureRecognizer *_swipeLeft;
UISwipeGestureRecognizer *_swipeRight;
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
static const CGFloat borderWidth = 50.0f;
if(gestureRecognizer == _swipeLeft) {
return [gestureRecognizer locationInView:self].x > self.frame.size.width - borderWidth;
}
else if(gestureRecognizer == _swipeRight) {
return [gestureRecognizer locationInView:self].x < borderWidth;
}
return YES;
}
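For completeness, a sketch of creating the two recognisers and wiring up the delegate (this assumes the code lives in the same view subclass, that the class adopts UIGestureRecognizerDelegate, and that handleSwipe: stands in for whatever your own target method is):
_swipeLeft = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
_swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
_swipeLeft.delegate = self; // so gestureRecognizerShouldBegin: above gets called
[self addGestureRecognizer:_swipeLeft];

_swipeRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
_swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
_swipeRight.delegate = self;
[self addGestureRecognizer:_swipeRight];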
Do note that for smooth swiping/dragging you will probably need to use a pan gesture or even a long press gesture recogniser rather than the swipe gesture. They are very similar, except that the long press takes a bit of time to begin (which is settable). If you use them you may still want to use the same delegate method. Or you can simply do all the checks in the gesture's target method. Try something like this:
CGPoint gestureStartPoint;
- (void)dragFromBorder:(UIGestureRecognizer *)sender {
static const CGFloat borderWidth = 50.0f;
switch (sender.state) {
case UIGestureRecognizerStateBegan: {
CGPoint location = [sender locationInView:self];
if(location.x > borderWidth && location.x < self.frame.size.width-borderWidth) {
//break the gesture
sender.enabled = NO;
sender.enabled = YES;
}
else {
gestureStartPoint = location;
}
break;
}
case UIGestureRecognizerStateChanged: {
CGPoint location = [sender locationInView:self];
CGFloat deltaX = location.x - gestureStartPoint.x;
UIView *viewToMove;     // placeholder: the view you are sliding in
CGPoint defaultCenter;  // placeholder: that view's resting center
viewToMove.center = CGPointMake(defaultCenter.x+deltaX, defaultCenter.y);
break;
}
case UIGestureRecognizerStateEnded:
case UIGestureRecognizerStateCancelled: {
CGPoint location = [sender locationInView:self];
CGFloat deltaX = location.x - gestureStartPoint.x;
/*
if(deltaX > someWidth) {
show the left view
}
else if(deltaX < -someWidth) {
show the right view
}
else {
put everything back the way it was
}
*/
break;
}
default:
break;
}
}
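And a sketch of hooking that handler up to a pan gesture recogniser (as noted above, a long press recogniser could be swapped in the same way):
UIPanGestureRecognizer *edgeDrag =
    [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragFromBorder:)];
[self addGestureRecognizer:edgeDrag];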
In iOS 7 there is a gesture recogniser specifically for gestures beginning from the edge of the screen. You should use this.
I can't help with your "smooth" problem, because you haven't said what your current animation looks like or how you are doing it. But a pan gesture, like the one linked, which directly updates view positions, will track the user's movement much more smoothly than a swipe.
On the iPad, when you put your finger outside the top or bottom edge of the screen and then drag it onto the screen, a menu is revealed. How can I implement that?
There is specifically a Gesture Recogniser class for this, introduced in iOS 7. It's the UIScreenEdgePanGestureRecognizer. The documentation for it is here. Check it out.
To test this in the simulator, just start the drag from near the edge (~15 points).
Also, you will have to create a gestureRecognizer for each edge. You can't OR edges together, so UIRectEdgeAll won't work.
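A sketch of what that looks like for the top and bottom edges (handleEdgePan: is a placeholder for your own handler):
// one recognizer per edge, since edges cannot be OR'd together
UIScreenEdgePanGestureRecognizer *topEdgePan =
    [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self action:@selector(handleEdgePan:)];
topEdgePan.edges = UIRectEdgeTop;
[self.view addGestureRecognizer:topEdgePan];

UIScreenEdgePanGestureRecognizer *bottomEdgePan =
    [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self action:@selector(handleEdgePan:)];
bottomEdgePan.edges = UIRectEdgeBottom;
[self.view addGestureRecognizer:bottomEdgePan];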
There is a simple example here. Hope this helps!
Well, you can do something like this. This example covers the case where you want your pan gesture to work only when the user swipes within 20 px of the right-hand side of the screen.
First of all, add the gesture to your view:
- (void)addGestures {
if (!_panGesture) {
_panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
[_panGesture setDelegate:self];
[self.view addGestureRecognizer:_panGesture];
}
}
After adding it, check whether the touch you received belongs to the pan gesture, and then act accordingly:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
CGPoint point = [touch locationInView:self.view];
if (gestureRecognizer == _panGesture) {
return [self slideMenuForGestureRecognizer:gestureRecognizer withTouchPoint:point];
}
return YES;
}
Here is how you can check whether your touch is contained in the region where you want it to be
-(BOOL)isPointContainedWithinBezelRect:(CGPoint)point {
CGRect rightBezelRect;
CGRect tempRect;
// this will be the width measured in from CGRectMaxXEdge (the right edge), thus identifying the region
CGFloat bezelWidth = 20.0;
CGRectDivide(self.view.bounds, &rightBezelRect, &tempRect, bezelWidth, CGRectMaxXEdge);
return CGRectContainsPoint(rightBezelRect, point);
}
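The slideMenuForGestureRecognizer:withTouchPoint: method referenced above isn't shown in the original; a minimal sketch of it, built on the bezel test, might look like this:
- (BOOL)slideMenuForGestureRecognizer:(UIGestureRecognizer *)gesture withTouchPoint:(CGPoint)point {
    // only let the pan begin when the touch starts inside the 20 pt bezel on the
    // right-hand edge; handlePanGesture: then drives the actual menu sliding
    return [self isPointContainedWithinBezelRect:point];
}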
I'm looking to animate bubbles with text on them to slide on and off the screen. The ideal implementation for this animation is iOS's horizontal scroll with paging enabled. I definitely want the "bounce" when I reach the end of the speech bubbles, and I definitely want the bubbles to track the finger until a certain point before they slide off the screen. I believe this is not the same as a swipe (which is just a flick in one direction).
However, the problem with the horizontal scroll is that it is optimized for a static number of images. I will have a dynamic number of images, and as far as I can tell, you cannot dynamically append images to the horizontal scroller. The idea is that the app dynamically adds content to the scroller as you progress through it.
The scroller was easy enough to get going, but I'm going to have to tear it down now. How can I get started with the gesture (I'm not sure the standard gesture recognizers will work for me at this point) as well as the animation? I've never worked with that portion of iOS code before.
I'm not sure if I follow your question entirely, but if you want to animate the movement of something based upon a gesture, you can use a UIPanGestureRecognizer and change the center of whatever subview you want. For example, in viewDidLoad you would:
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(movePiece:)];
[whateverViewYouWantToAnimate addGestureRecognizer:panGesture];
You can then have your gesture recognizer move it where ever you want:
- (void)movePiece:(UIPanGestureRecognizer *)gestureRecognizer
{
static CGPoint originalCenter;
if (gestureRecognizer.state == UIGestureRecognizerStateBegan)
{
originalCenter = [gestureRecognizer view].center;
}
else if (gestureRecognizer.state == UIGestureRecognizerStateChanged)
{
CGPoint translation = [gestureRecognizer translationInView:self.view];
gestureRecognizer.view.center = CGPointMake(originalCenter.x + translation.x, originalCenter.y);
// if you wanted to animate both left/right and up/down, it would be:
// gestureRecognizer.view.center = CGPointMake(originalCenter.x + translation.x, originalCenter.y + translation.y);
}
else if (gestureRecognizer.state == UIGestureRecognizerStateEnded)
{
// replace this offscreen CGPoint with something that makes sense for your app
CGPoint offscreen = CGPointMake(480, gestureRecognizer.view.center.y);
[UIView animateWithDuration:0.5
animations:^{
gestureRecognizer.view.center = offscreen;
}
completion:^(BOOL finished){
// when you're done, you might want to do whatever cleanup
// is appropriate for your app (e.g. do you want to remove it?)
[gestureRecognizer.view removeFromSuperview];
}];
}
}
When I touch a line plot in my multi-line plot graph, the method for displaying a symbol with the values corresponding to the touched point is often not called:
-(void)scatterPlot:(CPTScatterPlot *)plot plotSymbolWasSelectedAtRecordIndex:(NSUInteger)index;
This problem also happens with bar plots. The plotSymbolMarginForHitDetection property is set to a high value, but it has no effect. How can I improve hit detection on my graph?
There is no scatter plot delegate method for detecting hits on the line between plot points. If that's what you're after, you'll need to use a plot space delegate. Handle the touch event and look through the plot data to find which line segment (if any) is near the touched point.
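If it helps, here is a rough sketch of that plot-space-delegate approach. The specifics are assumptions: self.plotData (your own ordered array of x values as NSNumber), self.scatterPlot, the simple "closest record" search, and the assumption that the incoming point is in plot-area coordinates; delegate signatures also vary slightly between Core Plot versions, so check this against yours.
- (BOOL)plotSpace:(CPTPlotSpace *)space shouldHandlePointingDeviceUpEvent:(UIEvent *)event atPoint:(CGPoint)point
{
    // CPTNativeEvent is UIEvent on iOS
    CPTXYPlotSpace *plotSpace = (CPTXYPlotSpace *)space;
    CGFloat plotWidth = plotSpace.graph.plotAreaFrame.plotArea.bounds.size.width;

    // map the touched x-coordinate into the currently visible data range
    double dataX = plotSpace.xRange.locationDouble +
                   (point.x / plotWidth) * plotSpace.xRange.lengthDouble;

    // walk your own data and pick the record closest to the touch
    NSUInteger closestIndex = NSNotFound;
    double closestDistance = DBL_MAX;
    for (NSUInteger i = 0; i < self.plotData.count; i++) {
        double distance = fabs([self.plotData[i] doubleValue] - dataX);
        if (distance < closestDistance) {
            closestDistance = distance;
            closestIndex = i;
        }
    }

    if (closestIndex != NSNotFound) {
        [self scatterPlot:self.scatterPlot plotSymbolWasSelectedAtRecordIndex:closestIndex];
    }
    return NO; // we handled the touch; don't let the plot space also scroll/zoom
}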
Bar plots aren't as complicated. Any touch inside a bar should trigger the delegate method. You might have issues if the bars are very narrow. The only solution in that case is to make them wider.
Another way to increase your "hit area" is to monitor all touches in your graph and translate that to the closest index.
To do this, you'll have to make sure the delegate is nil (since you are manually monitoring).
self.myBarPlot.delegate = nil;
Then, on your CPTGraphHostingView, set up your UIGestureRecognizers. I've found that using both tap and pan recognizers works best. Set these up like so:
UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(graphTapped:)];
[self.hostView addGestureRecognizer:tapRecognizer];
UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(graphPanned:)];
[self.hostView addGestureRecognizer:panRecognizer];
The recognizers will monitor when your hostView has been tapped or panned. From there, you can easily translate the location of the touch to an index by doing the following.
- (void)graphTapped:(UITapGestureRecognizer *)sender {
if (sender.state == UIGestureRecognizerStateEnded) {
[self gestureUpdated:sender];
}
}
- (void)graphPanned:(UIPanGestureRecognizer *)sender {
if (sender.state == UIGestureRecognizerStateEnded || sender.state == UIGestureRecognizerStateChanged) {
[self gestureUpdated:sender];
}
}
- (void)gestureUpdated:(UIGestureRecognizer *)sender {
CGFloat width = self.hostView.frame.size.width;
CGPoint loc = [sender locationInView:self.hostView];
NSInteger index = (loc.x / width) * [self numberOfRecordsForPlot:self.myBarPlot];
NSLog(#"Touch index: %li", index);
}
Now that we have an index, just go ahead and do what you did in your original delegate callback.
For bar plots:
[self barPlot:self.myBarPlot barWasSelectedAtRecordIndex:index];
For scatter plots (untested):
[self scatterPlot:self.myScatterPlot plotSymbolWasSelectedAtRecordIndex:index];
Voilà!