I have a RootViewController with a UIScrollView (boardScrollView).
This boardScrollView has a UIImageView as subview (boardImage) to create the board. I can zoom in and out and scroll on the boardImage within the boardScrollView. Works great!
Now I want to drag & drop other UIImageViews onto the boardImage within the boardScrollView, and also OUT of the boardScrollView.
For these other UIImageViews (Tiles) I have created a subclass of the UIImageView class (TileViewClass).
I have drag & drop working for dropping a Tile INTO the boardScrollView/boardImage and for dragging INSIDE the boardScrollView/boardImage, but I cannot get drag & drop to OUTSIDE the boardScrollView working.
I think this is because I cannot access the views of the RootViewController from the subclass.
Maybe it would even be better to move the Tile back to the top view (the window) in touchesBegan, so that the drop position is always determined from the same view.
But I don't know how this can be done...
I have tried [[UIApplication sharedApplication].keyWindow bringSubviewToFront:self.dragObject]; in the touchesBegan method, but this does not do the trick....
Maybe I am missing a removeFromSuperview somewhere?
Does anyone have an idea how I can get the drag & drop working?
RootViewController.h:
@interface RootViewController : UIViewController <UIScrollViewDelegate>
@property (nonatomic, strong) IBOutlet UIScrollView *boardScrollView;
@property (nonatomic, strong) UIImageView *dragObject;
@property (nonatomic, assign) CGPoint touchOffset;
@property (nonatomic, assign) CGPoint homePosition;
@property (nonatomic, strong) UIImageView *boardImage;
@end
RootViewController.m:
@implementation RootViewController
@synthesize boardScrollView;
@synthesize dragObject;
@synthesize touchOffset;
@synthesize homePosition;
@synthesize boardImage;
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view from its nib.
UIImage *image = [UIImage imageNamed:@"greyblue_numbered_15x15_900x900.png"];
self.boardImage = [[UIImageView alloc] initWithImage:image];
self.boardImage.frame = (CGRect){.origin=CGPointMake(0.0f, 0.0f), .size=image.size};
[self.boardScrollView addSubview:self.boardImage];
self.boardImage.userInteractionEnabled = YES;
self.boardScrollView.contentSize = image.size;
self.boardScrollView.canCancelContentTouches = NO;
self.boardScrollView.userInteractionEnabled = YES;
self.boardScrollView.clipsToBounds = YES;
}
@end
TileImageView.h:
#import <UIKit/UIKit.h>
@interface TileImageView : UIImageView
@property (nonatomic, strong) UIImageView *dragObject;
@property (nonatomic, assign) CGPoint touchOffset;
@end
TileImageView.m:
#import "TileImageView.h"
@implementation TileImageView
@synthesize dragObject;
@synthesize touchOffset;
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self) {
// Initialization code
self.exclusiveTouch = YES;
self.userInteractionEnabled = YES;
}
return self;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
if ([touches count] == 1) {
// one finger
CGPoint touchPoint = [[touches anyObject] locationInView:self.superview];
for (UIImageView *iView in self.superview.subviews) {
if ([iView isMemberOfClass:[TileImageView class]]) {
if (touchPoint.x > iView.frame.origin.x &&
touchPoint.x < iView.frame.origin.x + iView.frame.size.width &&
touchPoint.y > iView.frame.origin.y &&
touchPoint.y < iView.frame.origin.y + iView.frame.size.height)
{
self.dragObject = iView;
self.touchOffset = CGPointMake(touchPoint.x - iView.frame.origin.x,
touchPoint.y - iView.frame.origin.y);
[self.superview bringSubviewToFront:self];
}
}
}
}
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint touchPoint = [[touches anyObject] locationInView:self.superview];
CGRect newDragObjectFrame = CGRectMake(touchPoint.x - touchOffset.x,
touchPoint.y - touchOffset.y,
self.dragObject.frame.size.width,
self.dragObject.frame.size.height);
self.dragObject.frame = newDragObjectFrame;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
for (UIView *iView in self.superview.subviews) {
if ([iView isMemberOfClass:[UIScrollView class]])
{
CGPoint touchPointScreen = [[touches anyObject] locationInView:[UIApplication sharedApplication].keyWindow];
if (touchPointScreen.x > iView.frame.origin.x &&
touchPointScreen.x < iView.frame.origin.x + iView.frame.size.width &&
touchPointScreen.y > iView.frame.origin.y &&
touchPointScreen.y < iView.frame.origin.y + iView.frame.size.height)
{
for (UIView *iView2 in iView.subviews) {
[iView2 addSubview:self.dragObject];
CGPoint touchPointImage = [[touches anyObject] locationInView:iView2];
self.dragObject.frame = CGRectMake(touchPointImage.x - touchOffset.x,
touchPointImage.y - touchOffset.y,
self.dragObject.frame.size.width,
self.dragObject.frame.size.height);
}
}
self.dragObject = nil;
}
}
}
@end
Basically you're catching all touch events on the TileImageView and moving its origin CGPoint around.
When a touch ends, you're cycling through all subviews of your tile view's superview.
If any of these is a UIScrollView, you check whether the touch point lies within its frame.
So basically you are only checking the frame of your UIScrollView; therefore a touch point outside it is never handled.
Solution: you have to check the view into which you want to drop the TileImageView! You can assume it is a sibling of your UIScrollView (or so I assume).
The other way, which I would recommend: create a view (a drop-ground view) in which you catch the touches (i.e. implement touchesEnded).
There you can drop your views around and check for subclasses of, e.g., drop views.
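A rough sketch of that first approach (names like dropTarget are hypothetical, and this assumes the drop targets are siblings of the scroll view): convert the touch into window coordinates, test it against each candidate view, and re-parent the tile.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
// locationInView:nil gives window coordinates, so this works
// no matter which view currently contains the tile
CGPoint windowPoint = [[touches anyObject] locationInView:nil];
for (UIView *dropTarget in self.window.rootViewController.view.subviews) {
CGRect targetFrameInWindow = [dropTarget convertRect:dropTarget.bounds toView:nil];
if (CGRectContainsPoint(targetFrameInWindow, windowPoint)) {
// re-parent the tile and convert the point so it doesn't visually jump
[dropTarget addSubview:self.dragObject];
self.dragObject.center = [dropTarget convertPoint:windowPoint fromView:nil];
break;
}
}
self.dragObject = nil;
}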
Solution: I have delegated the touch methods to the RootViewController. There, I have access to all subviews and, most importantly, the highest view.
There, I call:
[self.view bringSubviewToFront:self.tmpDragObject];
And then it works!
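A minimal sketch of what that delegation could look like (TileImageViewDelegate and touchDelegate are made-up names, not from the original code): the tile forwards its touches, and the RootViewController, whose view is the common ancestor of the board and the outside area, does the actual moving.
#import <UIKit/UIKit.h>
@class TileImageView;
@protocol TileImageViewDelegate <NSObject>
- (void)tileView:(TileImageView *)tile touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
@end
@interface TileImageView : UIImageView
@property (nonatomic, weak) id<TileImageViewDelegate> touchDelegate;
@end
@implementation TileImageView
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
// hand the touches to the RootViewController; it converts the point
// into its own view's coordinates and moves the drag object there
[self.touchDelegate tileView:self touchesMoved:touches withEvent:event];
}
@end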
I am moving items in the view by touching them and dragging them to the place where I drop them.
I am using the touch events
touchesBegan, touchesMoved, touchesEnded,
and in touchesMoved I move the item's frame to the new location, and it works for me.
But then I found some code that uses a UIPanGestureRecognizer,
and now I can't decide which one to use.
The code to handle the pan was:
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
if (recognizer.state == UIGestureRecognizerStateBegan || recognizer.state == UIGestureRecognizerStateChanged) {
CGPoint translation = [recognizer translationInView:self.superview];
CGPoint translatedCenter = CGPointMake(self.center.x + translation.x, self.center.y + translation.y);
[self setCenter:translatedCenter];
[recognizer setTranslation:CGPointZero inView:self];
}
}
Note that I need the exact coordinates of the point I am touching.
The code you have is OK, but it seems to be running in a custom subclass of UIView (judging from the references to self). Ideally a UIView should not be handling this; it should be dealt with in the view controller to adhere to the MVC design pattern.
Something like this:
#import "ViewController.h"
@interface ViewController ()
@property (nonatomic, strong) UIPanGestureRecognizer *pan;
@property (nonatomic, strong) IBOutletCollection(UIView) NSArray<UIView *> *touchableViews;
@property (nonatomic, weak) UIView *currentMovingView;
@end
@implementation ViewController
#pragma mark - Lifecycle
- (void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
if (!self.pan) {
self.pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(didPan:)];
}
[self.view addGestureRecognizer:self.pan];
}
- (void)viewWillDisappear:(BOOL)animated
{
[super viewWillDisappear:animated];
[self.view removeGestureRecognizer:self.pan];
}
#pragma mark - Gestures
- (void)didPan:(UIPanGestureRecognizer *)gesture
{
switch (gesture.state) {
case UIGestureRecognizerStateBegan:
{
CGPoint loc = [gesture locationInView:self.view];
self.currentMovingView = [self viewForLocation:loc];
}
break;
case UIGestureRecognizerStateChanged:
{
if (self.currentMovingView == nil) {
return;
}
self.currentMovingView.center = [gesture locationInView:self.view];
}
break;
case UIGestureRecognizerStateEnded:
{
self.currentMovingView = nil;
}
break;
default:
break;
}
}
- (UIView *)viewForLocation:(CGPoint)loc
{
for (UIView *v in self.touchableViews) {
if (CGRectContainsPoint(v.frame, loc)) {
return v;
}
}
return nil;
}
@end
Bear in mind that this is untested code and will almost certainly need to be tweaked for different use cases, but it is neat and follows good practice.
I found some code that uses a panGestureRecognizer, and now I can't decide which one to use.
Unless you need something that cannot be achieved with UIPanGestureRecognizer, use it.
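To address the concern about exact coordinates: a pan recognizer can report the exact point under the finger, not just the translation. A small sketch, assuming the handler lives in a view subclass as in the question's code:
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
// locationInView: gives the exact touch coordinates, while
// translationInView: only gives the movement since the gesture began
CGPoint touchLocation = [recognizer locationInView:self.superview];
if (recognizer.state == UIGestureRecognizerStateChanged) {
self.center = touchLocation;
}
}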
I need to select text with a swipe gesture, without a long press. There seem to be two ways: one is to subclass UITextView (or do something with UITextView), the other is to use Core Text to build a new UI component.
What should I do?
This is another answer using a pan gesture.
Hope this is what you want.
- (IBAction)pan:(UIPanGestureRecognizer *)ges
{
CGPoint point = [ges locationInView:ges.view];
if (ges.state == UIGestureRecognizerStateBegan)
{
startPoint = point; // startPoint is a CGPoint instance variable remembering where the pan began
}
else if (ges.state == UIGestureRecognizerStateChanged || ges.state == UIGestureRecognizerStateEnded)
{
UITextPosition *start = [self.textView closestPositionToPoint:startPoint];
UITextPosition *end = [self.textView closestPositionToPoint:point];
UITextRange *range = [self.textView textRangeFromPosition:start toPosition:end];
[self.textView select:self.textView];
self.textView.selectedTextRange = range;
}
}
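For this to fire, the recognizer has to be attached to the text view somewhere (for example in viewDidLoad), and startPoint needs to be a CGPoint instance variable. A sketch; note that a pan gesture on a UITextView competes with its own scrolling, so you may want scrolling off while selecting:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
[self.textView addGestureRecognizer:pan];
self.textView.scrollEnabled = NO; // optional: stops the pan and the scroll from fighting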
UIResponder contains
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
Making a custom UITextView subclass and overriding these events could get you what you want.
Hope it helps.
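A minimal, untested sketch of that approach (SwipeSelectTextView is a made-up name): remember where the touch began and extend the selection to wherever the finger currently is.
@interface SwipeSelectTextView : UITextView
@end
@implementation SwipeSelectTextView
{
UITextPosition *_startPosition; // position where the current touch began
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint p = [[touches anyObject] locationInView:self];
_startPosition = [self closestPositionToPoint:p];
[super touchesBegan:touches withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint p = [[touches anyObject] locationInView:self];
UITextPosition *end = [self closestPositionToPoint:p];
if (_startPosition && end) {
self.selectedTextRange = [self textRangeFromPosition:_startPosition toPosition:end];
}
[super touchesMoved:touches withEvent:event];
}
@end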
Just like baliman said, but a little modified.
First, separate textView.text into an array:
_swipeValues = [textView.text componentsSeparatedByString:@" "];
Second, in the gesture selector:
if(gestureReconize.direction == UISwipeGestureRecognizerDirectionLeft) {
swipeIndex++;
swipeIndex = (swipeIndex >= [self.swipeValues count]) ? 0 : swipeIndex;
} else {
swipeIndex--;
swipeIndex = (swipeIndex < 0) ? [self.swipeValues count] - 1 : swipeIndex;
}
NSString *selectedValue = [self.swipeValues objectAtIndex:swipeIndex];
Third, convert selectedValue to a range in textView.text:
NSRange range = [textView.text rangeOfString:selectedValue];
Finally, set textView.selectedRange:
[textView select:self]; //See http://stackoverflow.com/questions/1708608/uitextview-selectedrange-not-displaying-when-set-programatically
textView.selectedRange = range;
It works, but it is not perfect if your textView.text contains the same word more than once. One way to cope with repeats is sketched below.
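A hedged sketch of that idea: remember where the previous match ended (searchStart here is a hypothetical NSUInteger instance variable, not from the original answer) and search only the remaining text, wrapping around when nothing is found.
NSString *selectedValue = [self.swipeValues objectAtIndex:swipeIndex];
NSRange remaining = NSMakeRange(searchStart, textView.text.length - searchStart);
NSRange range = [textView.text rangeOfString:selectedValue options:0 range:remaining];
if (range.location == NSNotFound) {
// nothing left after the previous match: wrap around to the beginning
range = [textView.text rangeOfString:selectedValue];
}
if (range.location != NSNotFound) {
searchStart = NSMaxRange(range); // continue after this occurrence next time
textView.selectedRange = range;
}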
Hi, maybe something like this:
//
// ViewController.m
// SwipeTextFieldDemo
//
#import "ViewController.h"
@interface ViewController ()
@property (weak, nonatomic) IBOutlet UITextField *swipeTextField;
@property (strong, nonatomic) NSMutableArray *swipeValues;
@end
int swipeIndex = 0;
@implementation ViewController
@synthesize swipeTextField = _swipeTextField;
@synthesize swipeValues = _swipeValues;
// Create list of colors
- (NSMutableArray *)swipeValues
{
if(_swipeValues == nil) {
_swipeValues = [[NSMutableArray alloc] initWithObjects:@"Red",
@"Blue",
@"Green",
@"Yellow",
@"Orange",
@"Pink",
@"White",
@"Black",
nil];
}
return _swipeValues;
}
- (void)swipeColor:(UISwipeGestureRecognizer *)gestureReconize
{
if(gestureReconize.direction == UISwipeGestureRecognizerDirectionLeft) {
swipeIndex++;
swipeIndex = (swipeIndex >= [self.swipeValues count]) ? 0 : swipeIndex;
} else {
swipeIndex--;
swipeIndex = (swipeIndex < 0) ? [self.swipeValues count] - 1 : swipeIndex;
}
self.swipeTextField.text = [self.swipeValues objectAtIndex:swipeIndex];
}
- (void)viewDidLoad
{
[super viewDidLoad];
// Right swipe
UISwipeGestureRecognizer *swr = [[UISwipeGestureRecognizer alloc]
initWithTarget:self action:@selector(swipeColor:)];
[swr setNumberOfTouchesRequired:1];
[swr setDirection:UISwipeGestureRecognizerDirectionRight];
[self.swipeTextField addGestureRecognizer:swr];
// Left swipe
UISwipeGestureRecognizer *swl = [[UISwipeGestureRecognizer alloc]
initWithTarget:self action:@selector(swipeColor:)];
[swl setNumberOfTouchesRequired:1];
[swl setDirection:UISwipeGestureRecognizerDirectionLeft];
[self.swipeTextField addGestureRecognizer:swl];
self.swipeTextField.text = [self.swipeValues objectAtIndex:swipeIndex];
}
- (void)viewDidUnload
{
[self setSwipeTextField:nil];
[super viewDidUnload];
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
return (interfaceOrientation != UIInterfaceOrientationPortraitUpsideDown);
}
@end
//
// ViewController.h
// SwipeTextFieldDemo
//
#import <UIKit/UIKit.h>
@interface ViewController : UIViewController <UITextFieldDelegate, UIGestureRecognizerDelegate>
- (void)swipeColor:(UISwipeGestureRecognizer *)gestureReconize;
@end
I have been trying to implement a draggable UIButton in iOS by overriding the touchesMoved method.
The button shows up, however I am not able to drag it. What am I missing here?
This is what I referred to.
This is my .h file.
@interface ButtonAnimationViewController : UIViewController
@property (weak, nonatomic) IBOutlet UIButton *firstButton;
@end
And the .m file.
@implementation ButtonAnimationViewController
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint pointMoved = [touch locationInView:self.view];
self.firstButton.frame = CGRectMake(pointMoved.x, pointMoved.y, 73, 44);
}
@end
Here you have a fully working button dragging example using UIPanGestureRecognizer which, in my opinion, is easier. I tested it before posting the code. Let me know if you have any more questions:
@interface TSViewController ()
@property (nonatomic, strong) UIButton *firstButton;
@end
@implementation TSViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// this code is just to create and configure the button
self.firstButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
[self.firstButton setTitle:@"Button" forState:UIControlStateNormal];
self.firstButton.frame = CGRectMake(50, 50, 300, 40);
[self.view addSubview:self.firstButton];
// Create the Pan Gesture Recognizer and add it to our button
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragging:)];
[self.firstButton addGestureRecognizer:panGesture];
}
// this method will be called whenever the user wants to drag the button
-(void)dragging:(UIPanGestureRecognizer*)panGesture {
// if is not our button, return
if (panGesture.view != self.firstButton) {
return;
}
// if the gesture was 'recognized'...
if (panGesture.state == UIGestureRecognizerStateBegan || panGesture.state == UIGestureRecognizerStateChanged) {
// get the change (delta)
CGPoint delta = [panGesture translationInView:self.view];
CGPoint center = self.firstButton.center;
center.x += delta.x;
center.y += delta.y;
// and move the button
self.firstButton.center = center;
[panGesture setTranslation:CGPointZero inView:self.view];
}
}
@end
Hope it helps!
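As for why the original touchesMoved: approach never fires: a UIButton tracks its own touches, so touches that begin on the button are not delivered to the view controller's touchesMoved:. If you would rather avoid a gesture recognizer, the button's own drag control events can do the same job; a hedged sketch (buttonDragged:withEvent: is a made-up selector name):
// In viewDidLoad, after creating the button:
[self.firstButton addTarget:self action:@selector(buttonDragged:withEvent:) forControlEvents:UIControlEventTouchDragInside | UIControlEventTouchDragOutside];
// Target method: move the button so it follows the finger.
- (void)buttonDragged:(UIButton *)button withEvent:(UIEvent *)event
{
UITouch *touch = [[event touchesForView:button] anyObject];
button.center = [touch locationInView:self.view];
}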
Somebody help!
Here is what I want to implement: while a UIImageView is fading to alpha 0 (hidden), it can still be touched so that its alpha becomes 1 again (unhidden).
But the UIImageView does not receive touches while it is animating (i.e. while fading to alpha 0).
I tried many suggestions from Stack Overflow, but none of them worked.
They are:
UIViewAnimationOptionAllowUserInteraction
setUserInteractionEnabled:YES;
touchesBegan
GestureRecognizer options
etc..
Only the hitTest function worked, but not during the animation.
Please reply. Thank you.
Below is my code.
#import "ViewController.h"
#define AD @"text.png"
#import <QuartzCore/QuartzCore.h>
@interface ViewController ()
@end
@implementation ViewController
@synthesize scrollData=_scrollData;
@synthesize adImage=_adImage;
- (void)viewDidLoad
{
//_adImage is UIImageView
_adImage=[[adImage alloc] initWithImage:[UIImage imageNamed:AD]];
_scrollData.scrollView.delegate = self;
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleTapGestureCaptured:)];
[_adImage addGestureRecognizer:singleTap];
[_adImage setMultipleTouchEnabled:YES];
[_adImage setUserInteractionEnabled:YES];
[self.view addSubview:_adImage];
_adImage.alpha=0;
[_adImage setUp_pt:CGPointMake(160,250)];
_adImage.center=_adImage.up_pt;
[super viewDidLoad];
[self hideImage:_adImage delay:0];
[self becomeFirstResponder];
}
- (void)hideImageComplete:(UIView*)v
{
[self hideImage:v delay:0];
}
- (void)hideImage:(UIImageView*)v delay:(int)nDelay
{
[_adImage becomeFirstResponder];
[UIView animateWithDuration:1
delay:nDelay
options:
(UIViewAnimationOptionCurveEaseIn | UIViewAnimationOptionAllowUserInteraction)
animations: ^
{
_adImage.alpha=0.0f;
}
completion:^(BOOL completed){
[self hideImageComplete:v];
}];
}
- (void)singleTapGestureCaptured:(UITapGestureRecognizer *)gesture
{
NSLog(@"Gesture event on view");
_adImage.alpha=1;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
return YES;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch{
return YES;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
NSLog(@"hit!!");
[super touchesBegan:touches withEvent:event];
UITouch *touch = [touches anyObject];
CGPoint touchPoint = [touch locationInView:self.view];
CGPoint touchLocation = [touch locationInView:self.view];
_adImage.alpha=1;
}
@end
@implementation adImage
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint touchPoint = [touch locationInView:self];
if ([[[self layer] presentationLayer] hitTest:touchPoint]) {
[self.layer removeAllAnimations];
}
}
-(UIView*)hitTest:(CGPoint)point withEvent:(UIEvent *)event{
if ([[self.layer presentationLayer] hitTest:point]) {
NSLog(@"hit!!");
self.alpha=1;
return self;
}
return [super hitTest:point withEvent:event];
}
@synthesize up_pt;
@end
Here is my ViewController.h code.
#import <UIKit/UIKit.h>
@interface adImage : UIImageView {
}
//@property (strong, nonatomic) UIImageView *imageView;
@property (assign) CGPoint up_pt;
@end
@interface ViewController : UIViewController <UIScrollViewDelegate> {
}
- (void)hideImageComplete:(UIView*)v;
- (void)hideImage:(UIView*)v delay:(int)nDelay;
@property (strong, nonatomic) adImage *adImage;
@property (nonatomic, strong) IBOutlet UIWebView *scrollData;
@end
I have the answer for you, but first:
always name a class with a starting capital letter, i.e. AdImageView (it's an image view, not a plain UIView subclass, too);
I took your code but commented out all your touch methods, which you may or may not need;
the root issue here is that a UIGestureRecognizer won't fire on a view whose alpha is 0, so the animation to alpha = 0 is broken into two pieces: almost to 0, then to 0. If the user taps to cancel during the first piece, the completion block runs with completed == NO and the view's alpha is restored to 1.
Code:
- (void)hideImage:(UIImageView*)v delay:(int)nDelay
{
//[_adImage becomeFirstResponder];
[UIView animateWithDuration:(3.0f - 0.1f)
delay:nDelay
options: (UIViewAnimationOptionCurveEaseIn |
UIViewAnimationOptionAllowUserInteraction)
animations: ^
{
_adImage.alpha=0.1f;
}
completion:^(BOOL completed)
{
NSLog(@"completed=%d", completed);
if(completed) {
[UIView animateWithDuration:0.1f animations:^{
_adImage.alpha=0.0f;
}];
} else {
_adImage.alpha=1;
}
}];
}
- (void)singleTapGestureCaptured:(UITapGestureRecognizer *)gesture
{
NSLog(@"Gesture event on view");
[_adImage.layer removeAllAnimations];
}
This worked great for me on iOS7.1+:
self.viewWithTapgesture.backgroundColor = [UIColor colorWithWhite:1.0 alpha:0.01];
First, here is my code.
paintingView.h
#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
@protocol PaintingViewDelegate
- (void)makeLine:(CGPoint)touch;
@end
//CLASS INTERFACES:
@interface PaintingView : UIView
{
id <PaintingViewDelegate> delegate;
@private
CGPoint location;
CGPoint previousLocation;
Boolean firstTouch;
}
@property(nonatomic, readwrite) CGPoint location;
@property(nonatomic, readwrite) CGPoint previousLocation;
@property(nonatomic, assign) id <PaintingViewDelegate> delegate;
@end
paintingView.m
#import <QuartzCore/QuartzCore.h>
#import <OpenGLES/EAGLDrawable.h>
#import "PaintingView.h"
//CLASS IMPLEMENTATIONS:
@implementation PaintingView
@synthesize location;
@synthesize previousLocation;
@synthesize delegate;
// Implement this to override the default layer class (which is [CALayer class]).
// We do this so that our view will be backed by a layer that is capable of OpenGL ES rendering.
+ (Class) layerClass
{
return [CAEAGLLayer class];
}
// Handles the start of a touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
CGRect bounds = [self bounds];
UITouch* touch = [[event touchesForView:self] anyObject];
firstTouch = YES;
// Convert touch point from UIView referential to OpenGL one (upside-down flip)
location = [touch locationInView:self];
location.y = bounds.size.height - location.y;
[[self delegate]makeLine:location];
}
...
@end
viewController.h
#import "PaintingView.h"
@interface GMViewController : UIViewController <AGSMapViewLayerDelegate, PaintingViewDelegate>
{
PaintingView *painting1;
}
...
@end
viewController.m
...
- (void)viewDidLoad
{
[super viewDidLoad];
painting1 = [[PaintingView alloc] init];
[painting1 setDelegate:self];
}
...
- (void)makeLine:(CGPoint)touch
{
NSLog(@"TOUCH");
}
...
@end
My delegate method is never called. I modeled my delegate method on the iOS-developer.net resources example. There are no errors during compiling.
Fixed it by setting the paintingView's delegate with the line
[painting setDelegate: self];
in my viewController class file.
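For completeness, a sketch of what the wiring in viewDidLoad might look like (the frame value is illustrative; the view also has to be in the view hierarchy to receive touches):
- (void)viewDidLoad
{
[super viewDidLoad];
// create the painting view, make this controller its delegate, and put it on screen
painting1 = [[PaintingView alloc] initWithFrame:self.view.bounds];
[painting1 setDelegate:self]; // without this, makeLine: is never called
[self.view addSubview:painting1]; // the view must be on screen to receive touches
}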