I have the following problem:
I have a UIImageView that I can drag by touch, and a toolbar that I want to keep near that image view. This is what I'm doing at the moment:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // motion handling here; self.tool is the toolbar view
    CGFloat a = self.tool.frame.size.width;
    CGFloat b = self.tool.frame.size.height;
    self.tool.frame = CGRectMake(self.frame.origin.x + self.frame.size.width/2 + 50,
                                 self.frame.origin.y + self.frame.size.height/2 + 50,
                                 a, b);
}
It works fine, but sometimes the toolbar moves off screen. Is there a simple way to detect when that happens and move the toolbar to another point?
You can check it like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // motion handling here; self.tool is the toolbar view
    CGFloat a = self.tool.frame.size.width;
    CGFloat b = self.tool.frame.size.height;
    CGRect newFrame = CGRectMake(self.frame.origin.x + self.frame.size.width/2 + 50,
                                 self.frame.origin.y + self.frame.size.height/2 + 50,
                                 a, b);
    // only set the frame if it is still within the bounds of self.superview
    // (note: bounds, not frame — newFrame is in the superview's coordinate space)
    if (CGRectContainsRect(self.superview.bounds, newFrame)) {
        self.tool.frame = newFrame;
    }
}
You should be using a UIGestureRecognizer rather than touchesMoved:. Once you do, the gesture recognizer's action on the image view can move the toolbar view however it likes.
I want to scroll the image inside my UIImageView when my finger is inside the UIImageView's area and moving. I'm trying to do this in Objective-C. I got it moving, but it acts weird and doesn't work right. Can you please show me how to do that?
This is what I'm doing:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:self];
        for (id sublayer in view_shape.sublayers) {
            if (sublayer == imageView.layer) {
                if (CGRectContainsPoint(imageView.frame, touchLocation)) {
                    imageView.layer.contentsRect = CGRectMake(touchLocation.x / 1000,
                                                              touchLocation.y / 1000,
                                                              imageView.layer.contentsRect.size.width,
                                                              imageView.layer.contentsRect.size.height);
                }
            }
        }
    }
}
Why not use a UIScrollView and just add the imageView to it?
You can't do this with just an instance of UIImageView. An image view will just hold on to an image and draw that.
Either add it inside a UIScrollView if you want a simple scroll/paging interface,
(or)
add it to a UIView, attach a UIPanGestureRecognizer to that view, and handle the actions you get. Based on the translation, you can set the frame of the UIImageView.
In my application I have multiple small views joined together to form a big canvas. I'm properly getting the touch began/moved/ended events for each of those views separately. What I want now is this: if I touch view1 and drag my finger out of view1 and into the territory of view2 without lifting my finger, I want view2 to somehow be notified that I'm now inside it. Thanks.
I was able to do it using the touchesMoved: method. Here's the code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    CGPoint nowPoint = [touches.anyObject locationInView:self.view];
    NSLog(@"%f, %f", nowPoint.x, nowPoint.y);
    NSArray *viewsToCheck = [self.view subviews];
    for (UIView *v in viewsToCheck)
    {
        if ([v isKindOfClass:[CharacterTile class]])
        {
            if (CGRectContainsPoint(v.frame, nowPoint))
            {
                CharacterTile *ctTemp = (CharacterTile *)v;
                // perform your work with the subview.
            }
        }
    }
}
where CharacterTile is the class of the subviews added to self.view.
CGRectContainsPoint tells you whether the point touched by the user is inside a view or not.
I would like to get the UIImageView (or whatever I have) in the view according to the xPos and yPos, using this method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint tappedPt = [[touches anyObject] locationInView:viewIndex];
    int xPos = tappedPt.x;
    int yPos = tappedPt.y;
    NSLog(@"xPos %i yPos %i", xPos, yPos);
    // do something with xPos and yPos, like add them to an array
}
Thanks a lot!
It should not be too difficult to check whether a point lies in some rect, provided you know the frame of the UIImageView object.
But have you tried creating a subclass of UIView that draws the image, adding an object of that class to your view rather than a UIImageView, and implementing the touches methods in that UIView subclass rather than in your UIViewController class?
I don't think there is anything that walks through the child hierarchy for you. You can cut out a little of the work using the UIView method:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
where event can be nil, but you'll still have to apply that test to each child, and potentially its descendants.
You can get all subviews of the main view and then determine whether the touch point is inside each subview:
for (UIView *aView in [self subviews])
{
    if (CGRectContainsPoint(aView.frame, tappedPt))
    {
        return aView;
    }
}
But there is a problem: if several subviews overlap, you only get one view. You can modify the code to collect an array of views instead.
Just like the title says: when I touch the UIView, I want the touch point to glow. I came up with the idea of showing a glow image at the touch point, but how do I do that in code? Any example or idea? Thanks for sharing!
You can try the touchesBegan: event with the code below:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == self.my_view) // check whether your view was tapped
    {
        CGRect frame = [your_image_view frame];
        CGPoint touchLocation = [touch locationInView:touch.view];
        frame.origin.x = touchLocation.x;
        frame.origin.y = touchLocation.y;
        [your_image_view setFrame:frame];
        [self.my_view addSubview:your_image_view];
    }
}
Thanks..!
Well, the first step would be to detect the touch on your UIView; for that I would recommend using the high-level UIGestureRecognizer API and adding one to your view. Then you can use the callback to do whatever you want, perhaps some sort of animation? If so, you can use an animation block to animate your changes.
I have a UIImageView that is added as a subview. It shows up when a button is pressed.
When someone taps outside of the UIImageView in any part of the application, I want the UIImageView to go away.
@interface SomeMasterViewController : UITableViewController <clip>
<clip>
@property (strong, nonatomic) UIImageView *someImageView;
There are some hints on Stack Overflow and in Apple's documentation that sound like what I need.
Apple's : Gesture Recognizers
Apple's : UIView hitTest:withEvent
Apple's : UITouch Class Reference
Stackoverflow: Listening to UITouch event along with UIGestureRecognizer
(not likely needed, but..) CGRectContainsPoint, as mentioned in the post titled: Comparing a UITouch location to UIImageView rectangle
However, I want to check my approach here. It's my understanding that the code needs to:
1. Register a UITapGestureRecognizer to get all touch events that can happen in an application
2. Set the UITapGestureRecognizer's cancelsTouchesInView, delaysTouchesBegan, and delaysTouchesEnded to NO
3. Compare those touch events with someImageView (how? Using UIView's hitTest:withEvent:?)
Update: I am registering a UITapGestureRecognizer with the main UIWindow.
Final Unsolved Part
I have a handleTap:(UITapGestureRecognizer *) method that the UITapGestureRecognizer will call. How can I take the UITapGestureRecognizer it is given and see whether the tap falls outside of the UIImageView? The recognizer's locationInView: looks promising, but I do not get the results I expect: I expect to see a certain UIImageView when I tap on it, and not see the UIImageView when I tap somewhere else. I get the feeling that locationInView: is being used wrong.
Here is my call to the locationInView method:
- (void)handleTap:(UITapGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer.state != UIGestureRecognizerStateEnded) {
        NSLog(@"handleTap NOT given UIGestureRecognizerStateEnded so nothing more to do");
        return;
    }
    UIWindow *mainWindow = [[[UIApplication sharedApplication] delegate] window];
    CGPoint point = [gestureRecognizer locationInView:mainWindow];
    NSLog(@"point x,y computed as the location in a given view is %f %f", point.x, point.y);
    UIView *touchedView = [mainWindow hitTest:point withEvent:nil];
    NSLog(@"touchedView = %@", touchedView);
}
I get the following output:
<clip>point x,y computed as the location in a given view is 0.000000 0.000000
<clip>touchedView = <UIWindow: 0x8c4e530; frame = (0 0; 768 1024); opaque = NO; autoresize = RM+BM; layer = <UIWindowLayer: 0x8c4c940>>
I think you can just say [event touchesForView:<image view>]. If that returns an empty array, dismiss the image view. Do this in the table view controller's touchesBegan:withEvent:, and be sure to call [super touchesBegan:touches withEvent:event] or your table view will completely stop working. You probably don't even need to implement touchesEnded:/Cancelled:..., or touchesMoved:....
UITapGestureRecognizer definitely seems like overkill in this case.
You can use the touch-handling methods to do that:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
When the user first touches the screen, your touchesBegan: method is called.
In touchesBegan: you can get the touch point:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint pt = [[touches anyObject] locationInView:self];
}
So you have the point the user touched. Then you must find out whether that point is inside your UIImageView or not.
But if you can give tags to your UIImageViews, that will be pretty easy:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (yourImageView.tag == [touch view].tag) {
        [[self.view viewWithTag:yourImageView.tag] removeFromSuperview];
        [yourImageView release]; // manual retain/release; omit under ARC
    }
}