How would it be possible to swipe (left) html pages?
Handling gestures happening on a web view is tricky, since UIWebView is really greedy with touches and wants to handle them all.
As usual with web views, you could try and do everything with Javascript. You can read something about it in this S.O. post.
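For reference, here is a rough sketch of that JavaScript route: inject a small touch handler once the page has loaded and report swipes back to the app through a custom URL scheme. The myapp:// scheme, the 50px threshold, and the delegate wiring are my own placeholders, not anything from the linked post:

// UIWebViewDelegate callback: after each page load, inject a naive swipe detector.
// The myapp:// scheme and the 50px threshold below are placeholders.
- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    NSString *js =
        @"document.addEventListener('touchstart', function (e) {"
        @"  window.__startX = e.touches[0].pageX;"
        @"}, false);"
        @"document.addEventListener('touchend', function (e) {"
        @"  var dx = e.changedTouches[0].pageX - window.__startX;"
        @"  if (dx < -50) { window.location = 'myapp://swipe-left'; }"
        @"  if (dx >  50) { window.location = 'myapp://swipe-right'; }"
        @"}, false);";
    [webView stringByEvaluatingJavaScriptFromString:js];
}

// Intercept the fake navigation and turn it into an app-level event.
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
{
    if ([request.URL.scheme isEqualToString:@"myapp"]) {
        // handle swipe-left / swipe-right here
        return NO;
    }
    return YES;
}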
The way I prefer to handle pinching (zoom) and swiping on a web view, though, is the following:
subclass UIWindow:
@interface MyWebNavigatorWindow : UIWindow {
install an instance of that window type as the window for your app in application:didFinishLaunchingWithOptions:
_window = [[MyWebNavigatorWindow alloc] initWithFrame:rect];
Alternatively, you can assign a class to your window object in Interface Builder.
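For context, a minimal sketch of the programmatic setup; MyWebViewController is just a placeholder for whatever controller hosts your UIWebView:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    CGRect rect = [[UIScreen mainScreen] bounds];
    _window = [[MyWebNavigatorWindow alloc] initWithFrame:rect];
    _window.rootViewController = [[MyWebViewController alloc] init]; // placeholder controller hosting the UIWebView
    [_window makeKeyAndVisible];
    return YES;
}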
handle swipes and pinches in sendEvent in your MyWebNavigatorWindow class:
- (void)sendEvent:(UIEvent*)event {
NSSet* allTouches = [event allTouches];
UITouch* touch = [allTouches anyObject];
UIView* touchView = [touch view];
You will need a mechanism in sendEvent to detect when the touch happens inside of your web view. In my case, I "register" all the views that I want to handle touch for and then check if the touch is inside them:
for (UIView* view in self.controlledViews) { // self.controlledViews is the array of all "registered" views
if ([touchView isDescendantOfView:view]) {
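The "registration" itself can be as simple as a method on the window; a sketch, assuming controlledViews is an NSMutableArray property on MyWebNavigatorWindow (registerView: is a name I made up):

// MyWebNavigatorWindow.m — keeps track of the views whose touches we want to inspect.
- (void)registerView:(UIView *)view
{
    if (!self.controlledViews) {
        self.controlledViews = [NSMutableArray array];
    }
    [self.controlledViews addObject:view];
}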
Back in sendEvent:, I then handle the various touch phases to detect what kind of gesture it is (the snippet is not compilable as-is, but it gives the idea):
if (touch.phase == UITouchPhaseBegan) {
NSLog(@"TOUCH BEGAN");
_initialView = touchView;
startTouchPosition1 = [touch locationInView:self];
startTouchTime = touch.timestamp;
if ([allTouches count] > 1) {
startTouchPosition2 = [[[allTouches allObjects] objectAtIndex:1] locationInView:self];
previousTouchPosition1 = startTouchPosition1;
previousTouchPosition2 = startTouchPosition2;
}
}
if (touch.phase == UITouchPhaseMoved) {
NSLog(@"TOUCH MOVED");
if ([allTouches count] > 1) {
CGPoint currentTouchPosition1 = [[[allTouches allObjects] objectAtIndex:0] locationInView:self];
CGPoint currentTouchPosition2 = [[[allTouches allObjects] objectAtIndex:1] locationInView:self];
CGFloat currentFingerDistance = CGPointDist(currentTouchPosition1, currentTouchPosition2);
CGFloat previousFingerDistance = CGPointDist(previousTouchPosition1, previousTouchPosition2);
if (fabs(currentFingerDistance - previousFingerDistance) > ZOOM_DRAG_MIN) {
NSNumber* movedDistance = [NSNumber numberWithFloat:currentFingerDistance - previousFingerDistance];
if (currentFingerDistance > previousFingerDistance) {
NSLog(@"zoom in");
[[NSNotificationCenter defaultCenter] postNotificationName:NOTIFICATION_ZOOM_IN object:movedDistance];
} else {
NSLog(@"zoom out");
[[NSNotificationCenter defaultCenter] postNotificationName:NOTIFICATION_ZOOM_OUT object:movedDistance];
}
}
}
}
if (touch.phase == UITouchPhaseEnded) {
CGPoint currentTouchPosition = [touch locationInView:self];
// Check if it's a swipe
if (fabsf(startTouchPosition1.x - currentTouchPosition.x) >= SWIPE_DRAG_HORIZ_MIN &&
fabsf(startTouchPosition1.x - currentTouchPosition.x) > fabsf(startTouchPosition1.y - currentTouchPosition.y) &&
touch.timestamp - startTouchTime < 0.7
) {
// It appears to be a swipe.
if (startTouchPosition1.x < currentTouchPosition.x) {
NSLog(@"swipe right");
[[NSNotificationCenter defaultCenter] postNotificationName:NOTIFICATION_SWIPE_RIGHT object:touch];
} else {
NSLog(@"swipe left");
[[NSNotificationCenter defaultCenter] postNotificationName:NOTIFICATION_SWIPE_LEFT object:touch];
}
}
startTouchPosition1 = CGPointMake(-1, -1);
_initialView = nil;
}
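Whoever is interested in the gestures (typically the view controller driving the web view) just observes these notifications. Remember as well that the sendEvent: override must still call [super sendEvent:event], or the web view never receives any touches. A sketch of the observing side, where didSwipeLeft:/didSwipeRight: are placeholder names:

// In the view controller that reacts to the gestures posted by MyWebNavigatorWindow.
- (void)viewDidLoad
{
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(didSwipeLeft:)
                                                 name:NOTIFICATION_SWIPE_LEFT
                                               object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(didSwipeRight:)
                                                 name:NOTIFICATION_SWIPE_RIGHT
                                               object:nil];
}

- (void)didSwipeLeft:(NSNotification *)notification
{
    // e.g. load the next HTML page in the web view
}

- (void)didSwipeRight:(NSNotification *)notification
{
    // e.g. load the previous HTML page
}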
I have a character in my game that I can move left/right and make jump. I am using touchesBegan to do that. But when I move left (keeping the touch down makes the character go faster and faster), the character only jumps when I release the touch. I would like to jump and move left at the same time.
Here is some of my code :
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint point = [touch locationInView:self.view];
if((point.x < moveLeftButton.frame.size.width))
{
if((point.y > JumpButton.frame.size.height))
{
leftSide = YES;
sanchez.image = [UIImage imageNamed:@"characterL2"];
}
}
if((point.x) > (backGround.frame.size.width - moveRightButton.frame.size.width))
{
if((point.y > JumpButton.frame.size.height))
{
rightSide = YES;
sanchez.image = [UIImage imageNamed:@"characterR2"];
}
}
if( notJumping && (point.y < JumpButton.frame.size.height) && (sanchezStopAll == NO))
{
AudioServicesPlaySystemSound(jump);
sanchezJumping = 5.5;
notJumping = NO;
}
[self animation];
}
I am not using buttonPressed: because the buttons are hidden, and I think you can't tap hidden buttons.
This is how it looks if it's not hidden:
Thanks for any help.
You'll only get a single touch by default unless you set multipleTouchEnabled = YES. Then your code should work as expected.
For example:
- (void)viewDidLoad
{
[super viewDidLoad];
self.view.multipleTouchEnabled = YES;
}
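If it still misbehaves, note that touchesBegan:withEvent: only hands you the touches that just began, and the code above looks at just one of them. A sketch iterating over every new touch, reusing the button frames and flags from the question, so a movement touch and a jump touch that start in the same event are both handled:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Handle every touch that began, not just an arbitrary one.
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:self.view];

        if (point.x < moveLeftButton.frame.size.width &&
            point.y > JumpButton.frame.size.height) {
            leftSide = YES;
            sanchez.image = [UIImage imageNamed:@"characterL2"];
        }

        if (point.x > backGround.frame.size.width - moveRightButton.frame.size.width &&
            point.y > JumpButton.frame.size.height) {
            rightSide = YES;
            sanchez.image = [UIImage imageNamed:@"characterR2"];
        }

        if (notJumping && point.y < JumpButton.frame.size.height && sanchezStopAll == NO) {
            AudioServicesPlaySystemSound(jump);
            sanchezJumping = 5.5;
            notJumping = NO;
        }
    }
    [self animation];
}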
I am having trouble using initWithItem:attachedToItem:, which "initializes an attachment behavior that connects the center point of a dynamic item to the center point of another dynamic item."
But when I change the center point of the top view in my pan: method, only the top view moves around; I can't get the other views to move. Shouldn't they move together?
(BTW, I am trying to implement a pile of cards that all move around together when I pan the card on top.)
-(void)pinch:(UIPinchGestureRecognizer *)gesture
{
if(gesture.state == UIGestureRecognizerStateChanged){
CGPoint pinchCen = [gesture locationInView:self.cardArea];
if (gesture.scale <= 0.5 && !self.pileStat) {
self.pileStat = !self.pileStat;
NSUInteger number = [self.cardViews count];
UIView *topView = [self.cardViews lastObject];
[topView addGestureRecognizer:[[UIPanGestureRecognizer alloc]initWithTarget:self action:@selector(pan:)]];
for (int i = 0; i < number;i++) {
UIView *cardView = self.cardViews[i];
[UIView animateWithDuration:0.5 animations:^{cardView.center = CGPointMake(pinchCen.x+i%10*0.5, pinchCen.y+i%10*0.5);} completion:^(BOOL finished){
if(i != number - 1){
UIAttachmentBehavior *attach = [[UIAttachmentBehavior alloc]initWithItem:cardView attachedToItem:topView];
[self.animator addBehavior:attach];
}
}];
}
}
else if(gesture.scale > 1.5 && self.pileStat)
{
self.pileStat = !self.pileStat;
}
}else if (gesture.state == UIGestureRecognizerStateEnded){
gesture.scale = 1.0f;
}
}
-(void)pan:(UIPanGestureRecognizer *)gesture
{
if (gesture.state == UIGestureRecognizerStateChanged || gesture.state == UIGestureRecognizerStateEnded) {
UIView *topView = [self.cardViews lastObject];
CGPoint trans = [gesture translationInView:self.cardArea];
topView.center = CGPointMake(trans.x+topView.center.x, trans.y+topView.center.y);
[gesture setTranslation:CGPointMake(0, 0) inView:self.cardArea];
}
}
setCenter does not play nice with UIDynamicAnimator. Instead of changing the center coordinate of your top view, you should use another UIAttachmentBehavior that you attach to your touch. Replace your pan gesture handler with this method:
//Add a variable in your interface or header section called "touchAttachment" of type UIAttachmentBehavior
- (void) pan: (UIPanGestureRecognizer *) sender{
UIView* topCard = [self.cardViews lastObject];
if (sender.state == UIGestureRecognizerStateBegan){
_touchAttachment = [[UIAttachmentBehavior alloc] initWithItem:topCard
attachedToAnchor: [sender locationInView:self.view]];
[self.animator addBehavior:_touchAttachment];
}
else if (sender.state == UIGestureRecognizerStateChanged){
[_touchAttachment setAnchorPoint: [sender locationInView: self.view]];
}
else if (sender.state == UIGestureRecognizerStateEnded){
[self.animator removeBehavior: _touchAttachment];
_touchAttachment = nil;
}
}
Make sure you add the "touchAttachment" variable.
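Also double-check that the animator actually exists; behaviors only have an effect once they are added to a UIDynamicAnimator that has a reference view. A minimal sketch, assuming a lazily created animator property in the controller that owns cardArea:

// Lazily instantiated animator; cardArea is the superview holding the card views.
- (UIDynamicAnimator *)animator
{
    if (!_animator) {
        _animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.cardArea];
    }
    return _animator;
}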
Hope it helps :)
I am developing an enterprise App that injects touches. The method I use is the same with the one seen in the KIF framework (see https://github.com/square/KIF/tree/master/Additions).
There is at least one case where this method doesn't work:
When the user taps on the device, the injection stops working for scroll views: they no longer scroll when I inject moving touch events. Once the user scrolls on the device himself, the injection works again. It seems like some state is not set by KIF's method but is set by iOS when a real touch occurs.
Does anyone know more about this? Maybe someone has an idea which state it could be? I have already tried several private API calls that sounded right, but none worked.
Here is some code, describing the method I used:
Touch down:
UITouch *touch = [[[UITouch alloc] initAtLocation:point inWindow:hitView.window] autorelease];
[touch setPhase:UITouchPhaseBegan];
UIEvent *newEvent = [hitView _eventWithTouch:touch];
[_activeTouchEvents addObject:touch];
[[UIApplication sharedApplication] sendEvent:newEvent];
Touch move:
UITouch* touch = _activeTouchEvent;
[touch setPhase:UITouchPhaseMoved];
[touch setLocationInWindow:point];
UIEvent *newEvent = [hitView _eventWithTouch:touch];
[[UIApplication sharedApplication] sendEvent:newEvent];
Touch up:
UITouch* touch = _activeTouchEvent;
[touch setPhase:UITouchPhaseEnded];
[touch setLocationInWindow:point];
UIEvent *newEvent = [hitView _eventWithTouch:touch];
[[UIApplication sharedApplication] sendEvent:newEvent];
[_activeTouchEvent release];
_activeTouchEvent = nil;
UIView (extension)
- (UIEvent *)_eventWithTouch:(UITouch *)touch;
{
UIEvent *event = [[UIApplication sharedApplication] performSelector:@selector(_touchesEvent)];
CGPoint location = [touch locationInView:touch.window];
KIFEventProxy *eventProxy = [[KIFEventProxy alloc] init];
eventProxy->x1 = location.x;
eventProxy->y1 = location.y;
eventProxy->x2 = location.x;
eventProxy->y2 = location.y;
eventProxy->x3 = location.x;
eventProxy->y3 = location.y;
eventProxy->sizeX = 1.0;
eventProxy->sizeY = 1.0;
eventProxy->flags = ([touch phase] == UITouchPhaseEnded) ? 0x1010180 : 0x3010180;
eventProxy->type = 3001;
NSSet *allTouches = [event allTouches];
[event _clearTouches];
[allTouches makeObjectsPerformSelector:@selector(autorelease)];
[event _setGSEvent:(struct __GSEvent *)eventProxy];
[event _addTouch:touch forDelayedDelivery:NO];
[eventProxy release];
return event;
}
UITouch (extension)
- (id)initAtLocation:(CGPoint)point inWindow:(UIWindow *)window
{
self = [super init];
if (self == nil) {
return nil;
}
// Create a fake tap touch
_tapCount = 1;
_locationInWindow = point;
_previousLocationInWindow = _locationInWindow;
UIView *hitTestView = [window hitTest:_locationInWindow withEvent:nil];
_window = [window retain];
_view = [hitTestView retain];
_gestureView = [hitTestView retain];
_gestureRecognizers = [[NSMutableArray arrayWithArray:[hitTestView gestureRecognizers]] retain];
_phase = UITouchPhaseBegan;
_touchFlags._firstTouchForView = 1;
_touchFlags._isTap = 1;
_timestamp = [[NSProcessInfo processInfo] systemUptime];
return self;
}
- (void)setPhase:(UITouchPhase)phase;
{
_phase = phase;
_timestamp = [[NSProcessInfo processInfo] systemUptime];
}
Edit 1: The code block for "Touch down" was not right.
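To make the flow clearer, this is roughly how the pieces above are combined when I inject a drag; injectDragFrom:to:inWindow: is a name I made up for illustration, and the step count is arbitrary:

// Inject a synthetic drag from `start` to `end` using the categories shown above.
- (void)injectDragFrom:(CGPoint)start to:(CGPoint)end inWindow:(UIWindow *)window
{
    UIView *hitView = [window hitTest:start withEvent:nil];
    UITouch *touch = [[[UITouch alloc] initAtLocation:start inWindow:window] autorelease];

    // Touch down
    [touch setPhase:UITouchPhaseBegan];
    [[UIApplication sharedApplication] sendEvent:[hitView _eventWithTouch:touch]];

    // A handful of intermediate moves; a real gesture spreads these out over time.
    for (int step = 1; step <= 5; step++) {
        CGFloat t = step / 5.0f;
        CGPoint p = CGPointMake(start.x + (end.x - start.x) * t,
                                start.y + (end.y - start.y) * t);
        [touch setPhase:UITouchPhaseMoved];
        [touch setLocationInWindow:p];
        [[UIApplication sharedApplication] sendEvent:[hitView _eventWithTouch:touch]];
    }

    // Touch up
    [touch setPhase:UITouchPhaseEnded];
    [touch setLocationInWindow:end];
    [[UIApplication sharedApplication] sendEvent:[hitView _eventWithTouch:touch]];
}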
Right now I am developing an iPad app that uses a subclass of UITableViewCell with a UIPanGestureRecognizer to slide left and right. However, it only slides right and doesn't slide left... This is the code I am using:
- (id)initWithStyle:(UITableViewCellStyle)style reuseIdentifier:(NSString *)reuseIdentifier
{
self = [super initWithStyle:style reuseIdentifier:reuseIdentifier];
if (self)
{
// Initialization code
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
pan.delegate = self;
[self addGestureRecognizer:pan];
}
return self;
}
This is my handlePan: method,
CGPoint translation = [gesture locationInView:self];
self.center = CGPointMake(self.center.x + translation.x,
self.center.y);
if (self.center.x > 700) {
NSDictionary *dic = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:self.tag],@"number",nil];
[[NSNotificationCenter defaultCenter] postNotificationName:@"right" object:nil userInfo:dic];
dic = nil;
}
if (self.center.x < 0){
NSDictionary *dic = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:self.tag],@"number",nil];
[[NSNotificationCenter defaultCenter] postNotificationName:@"left" object:nil userInfo:dic];
dic = nil;
NSLog(@"Left Called");
}
[gesture setTranslation:CGPointZero inView:self];
No matter what I try, I can't seem to get the console to say "Left Called", i.e. the cell never slides left. I am really struggling with this issue and would appreciate any help at all.
The trick is to react to the changed state of the pan, and the center calculation can be much simpler...
- (void)handlePan:(UIPanGestureRecognizer *)gesture {
if ((gesture.state == UIGestureRecognizerStateChanged) ||
(gesture.state == UIGestureRecognizerStateEnded)) {
CGPoint newCenter = CGPointMake([gesture locationInView:self.superview].x, self.center.y);
self.center = newCenter;
// To determine the direction of the pan, check the sign of translation in view.
// This supplies the cumulative translation...
if ([gesture translationInView:self.superview].x > 0) {
NSLog(@">>>");
} else {
NSLog(@"<<<");
}
}
}
I think the way that you are determining left/right might be flawed.
CGPoint translation = [gesture locationInView:self];
self.center = CGPointMake(self.center.x + translation.x,
self.center.y);
self.center.x will only ever increase, because "translation" here is really the touch's location in the view, which is always positive.
What you probably want is to keep track of the original touch position and then compare it with the current location while swiping/moving. Try something like this:
- (void)handlePan:(UIPanGestureRecognizer *)panRecognizer {
if (panRecognizer.state == UIGestureRecognizerStateBegan)
{
// store the original touch point on first touch (use the superview, so the cell's own movement doesn't skew it)
self.originalPanTouchPoint = [panRecognizer locationInView:self.superview];
// store the cell's center prior to the pan
self.initialCellCenter = self.center;
} else {
CGPoint currentTouchPoint = [panRecognizer locationInView:self.superview];
CGFloat xTranslation = currentTouchPoint.x - self.originalPanTouchPoint.x;
self.center = CGPointMake(self.initialCellCenter.x + xTranslation, self.initialCellCenter.y);
}
}
I have really been stuck on this for a week.
I have a UIBarButtonItem inside a UINavigationItem; the hierarchy is like this:
The bar button item is a wrapper around a segmented control. The UIBarButtonItem and UISegmentedControl are created programmatically, but the others are made in IB.
In this case, I want to show a view after the bar button item is pressed or touched. From several threads I read in this forum, I learned that UIBarButtonItem doesn't inherit from UIResponder, so I chose the navigation bar to receive the touch and defined a frame for it.
This is the code I wrote:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
navBar = self.navigationController.navigationBar;
int index = _docSegmentedControl.selectedSegmentIndex;
NSLog(@"index at touchesBegan: %d", index);
CGFloat x;
if (index == 0) {
x = 0.0;
}else if (index == 1) {
x = widthSegment + 1;
}else if (index == 2) {
x = 2*widthSegment + 1;
}else if (index == 3) {
x = 3*widthSegment+ 1;
}else if (index == 4) {
x = 4*widthSegment + 1;
}
CGRect frame = CGRectMake(x, 0.00, widthSegment, 46.00);
UITouch *touch = [touches anyObject];
CGPoint gestureStartPoint = [touch locationInView:navBar];
NSLog(@"gesturestart : %f, %f", gestureStartPoint.x, gestureStartPoint.y);
if (CGRectContainsPoint(frame, gestureStartPoint)) {
[NSObject cancelPreviousPerformRequestsWithTarget:self selector:@selector(segmentItemTapped:) object:[self navBar]];
NSLog(@"cancel popover");
}
}
navBar is declared in myViewController.h, and I set it as an IBOutlet.
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
int index = _docSegmentedControl.selectedSegmentIndex;
NSLog(@"index at touchesEnded: %d", index);
navBar = self.navigationController.navigationBar;
CGFloat x;
if (index == 0) {
x = 0.0;
}else if (index == 1) {
x = widthSegment + 1;
}else if (index == 2) {
x = 2*widthSegment + 1;
}else if (index == 3) {
x = 3*widthSegment+ 1;
}else if (index == 4) {
x = 4*widthSegment + 1;
}
CGRect frame = CGRectMake(x, 0.00, widthSegment, 46.00);
UITouch *touch = [touches anyObject];
CGPoint gestureLastPoint = [touch locationInView:navBar];
NSLog(@"lastPoint: %@", NSStringFromCGPoint(gestureLastPoint));
if (CGRectContainsPoint(frame, gestureLastPoint)) {
if (touch.tapCount <= 2) {
[self performSelector:@selector(segmentItemTapped:) withObject:nil afterDelay:0.0];
}
}
}
touchesBegan and touchesEnded are detected when I tap on the toolbar, NOT on the navigation bar.
I did implement the hitTest: method, like this:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
UIView *touchedView = [super hitTest:point withEvent:event];
NSSet* touches = [event allTouches];
// handle touches if you need
return touchedView;
}
but it still doesn't make anything better.
Can somebody explain why this happens?
Regards
-Risma-
You can simply add an action to the bar button item, like this:
[self.mybarbutton setAction:@selector(barButtonTapped:)];
[self.mybarbutton setTarget:self];
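For completeness, the target also needs a matching action method, and since the bar button item here just wraps a UISegmentedControl, you can also hook the control itself instead of sniffing touches on the navigation bar. The method names below are placeholders:

// Action invoked when the bar button item is tapped.
- (void)barButtonTapped:(UIBarButtonItem *)sender
{
    // show the view / popover here
}

// Alternative: react to the wrapped segmented control directly.
// [_docSegmentedControl addTarget:self
//                          action:@selector(segmentItemTapped:)
//                forControlEvents:UIControlEventValueChanged];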