Enable UserInteraction to some part of UIViewController - ios

I have been researching this for the last 3 to 4 hours but haven't found any information. My issue is that I want to enable user interaction for only some part of a UIViewController.
Description:
I have a UIViewController to which I have added 30 table views. I store one value in the application. If that value is 1, I have to enable user interaction for tableview1 only; if the value is 2, then tableview2 only; etc.
Please let me know if I am not clear. Thank you for spending your valuable time. Thanks in advance.

A simple way to do it is to subclass UIView and override - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event.
Return NO for the part of the UIView (represented as ignoreRect in the sample) in which you want touches to be ignored.
@interface InteractionView ()
@property (nonatomic) CGRect ignoreRect;
@end

@implementation InteractionView

- (void)awakeFromNib {
    [super awakeFromNib];
    self.ignoreRect = CGRectMake(0.0f, 0.0f, 300.0f, 300.0f);
}

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    if (CGRectContainsPoint(self.ignoreRect, point)) {
        return NO;
    }
    return [super pointInside:point withEvent:event];
}

@end
If you need more control over the behavior, for example returning a specific view for a specific zone, or the top view of a specific zone, you may instead override:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if (CGRectContainsPoint(self.ignoreRect, point)) {
        return nil; // Edit this part if you want to return a chosen view instead
    }
    return [super hitTest:point withEvent:event];
}

Another solution, without overriding - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event, is to add a UIButton as a subview over the part of the UIView whose interaction you want to block.
For example, if you want to block interaction on the bottom half of the view:
UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
button.frame = CGRectMake(0, self.view.frame.size.height*0.5f, self.view.frame.size.width, self.view.frame.size.height*0.5);
[self.view addSubview:button];
Since the button will swallow the touch events, the bottom half of the view will be closed to user interaction.
EDIT
IBOutletCollection(UITableView) NSArray *allTableViews; // connect all your table views to this collection; set a tag on each table view in Interface Builder so you can reach it later
Then, when you want to enable/disable interaction for the related table view:
int tagToOpenInteraction = 1; // or whatever it is
for (UITableView *t in allTableViews)
{
    if (t.tag == tagToOpenInteraction)
        [t setUserInteractionEnabled:YES];
    else
        [t setUserInteractionEnabled:NO];
}
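If the value that decides which table view should be active is persisted somewhere, a minimal sketch of tying it to the loop above might look like this (purely for illustration, assuming the value lives in NSUserDefaults under a hypothetical key @"activeTableTag"):
// Hypothetical helper in the view controller; the NSUserDefaults key is an assumption.
- (void)updateTableInteraction
{
    NSInteger tagToOpenInteraction = [[NSUserDefaults standardUserDefaults] integerForKey:@"activeTableTag"];
    for (UITableView *t in allTableViews)
    {
        // Only the table view whose tag matches the stored value stays interactive.
        t.userInteractionEnabled = (t.tag == tagToOpenInteraction);
    }
}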

Related

How to change one button's title based on another button's state in objective-c?

I'm very new to objective-c. This is what I want to achieve:
I have two buttons named buttonOne and buttonTwo. The default title of buttonOne is "One". Now, when I touch buttonTwo, I want it to stay highlighted (background changes) like being pressed and at the same time I want the title of buttonOne to change to "Two". When I touch buttonTwo again, I want it to change back to its normal state and also the title of buttonOne will change back to "One". I've done something like this:
[_buttonTwo addTarget:self action:@selector(buttonTwoPressed:) forControlEvents:UIControlEventTouchDown];
[_buttonTwo addTarget:self action:@selector(buttonTwoReleased:) forControlEvents:UIControlEventTouchUpInside];
- (IBAction)buttonTwoPressed:(id)sender {
    [_buttonOne setTitle:@"Two" forState:UIControlStateNormal];
    [sender setBackgroundColor:[UIColor grayColor]];
}
- (IBAction)buttonTwoReleased:(id)sender {
    [_buttonOne setTitle:@"One" forState:UIControlStateNormal];
}
This doesn't work properly. When I touch buttonTwo, the background of buttonTwo changes but the title of buttonOne doesn't change. I have to constantly hold down buttonTwo to see the title of buttonOne change to "Two". Also, when I touch buttonTwo again, nothing happens.
Thanks in advance!
In your current code, you are resetting the title as soon as the user takes their finger off the button.
You can do it using a single action method:
[_buttonTwo addTarget:self action:@selector(buttonTwoPressed:) forControlEvents:UIControlEventTouchUpInside];
Implement the method like:
- (IBAction)buttonTwoPressed:(UIButton *)sender
{
    sender.selected = !sender.selected;
    if (sender.selected)
    {
        [_buttonOne setTitle:@"Two" forState:UIControlStateNormal];
        [sender setBackgroundColor:[UIColor grayColor]];
    }
    else
    {
        [_buttonOne setTitle:@"One" forState:UIControlStateNormal];
    }
}
There are circumstances that this implementation may overlook. For example, because you have no action for the UIControlEventTouchUpOutside control event, you catch the change of state on touch down but miss the change back when the touch moves off the button before the touch ends (dragging off the button before releasing).
I'd be looking at catching the change of control state rather than touch events. Unfortunately -(UIControlState)state is read-only, so there is no setter to override, but we can catch the events that immediately precede/cause the change of control state. This is UIControl stuff; let's call this strategyA.
Another approach might be to override the touch events from UIView; let's call this strategyB.
Both of these strategies leverage the default implementation, so it's important to pass the message on to super (the superclass, i.e. what would happen if we didn't implement the method at all) before we inform our delegate.
Subclass UIButton and do something like this. Don't forget you'll need to change the class of your buttons in your nib or storyboard or wherever you've set them up.
interface:
#import <UIKit/UIKit.h>

@protocol StateButtonDelegate;

@interface StateButton : UIButton
@property (nonatomic, weak) id<StateButtonDelegate> delegate;
@end

@protocol StateButtonDelegate <NSObject>
- (void)button:(StateButton *)button didChangeState:(UIControlState)newState;
@end
implementation:
// StateButton.m
#import "StateButton.h"

@implementation StateButton

// strategyA
- (void)setSelected:(BOOL)selected {
    [super setSelected:selected];
    [self.delegate button:self didChangeState:self.state];
}
- (void)setHighlighted:(BOOL)highlighted {
    [super setHighlighted:highlighted];
    [self.delegate button:self didChangeState:self.state];
}
- (void)setEnabled:(BOOL)enabled {
    [super setEnabled:enabled];
    [self.delegate button:self didChangeState:self.state];
}

// or
// strategyB
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    [self.delegate button:self didChangeState:self.state];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    [self.delegate button:self didChangeState:self.state];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    [self.delegate button:self didChangeState:self.state];
}

@end
Then adopt the protocol in your controller, set it as the delegate of both buttons, and you'll have some control:
- (void)button:(StateButton *)button didChangeState:(UIControlState)newState {
    if (button == _buttonOne) {
        if (newState == UIControlStateNormal) {              // UIControlStateNormal is 0, so test with == rather than &
            // doStuff
        } else if (newState & UIControlStateHighlighted) {
            // doOtherStuff
        } else if (newState & UIControlStateSelected) {
            // doDifferentStuffAgain
        }
    } else if (button == _buttonTwo) {
        if (newState == UIControlStateNormal) {
            // doStuff
        } else if (newState & UIControlStateHighlighted) {
            // doOtherStuff
        } else if (newState & UIControlStateSelected) {
            // doDifferentStuffAgain
        }
    }
}
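For completeness, wiring the delegate up might look something like this in the controller (a minimal sketch, assuming _buttonOne and _buttonTwo are StateButton instances connected as outlets and the controller declares conformance to StateButtonDelegate):
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Both buttons report their state changes back to this controller.
    _buttonOne.delegate = self;
    _buttonTwo.delegate = self;
}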

Detecting a touch anywhere on the screen

I want to know when a user has touched anywhere on the screen of my app.
I have looked into using -(UIResponder *)nextResponder, but unfortunately this will not work, as I am also reloading a table automatically, so this gets triggered when that occurs.
I have also tried a gesture recognizer, with the following code, but this will only recognise touches on the view, whereas I have many buttons the user will be using to operate the app. I would like to avoid adding a gesture recogniser or code for this to every button and segmented control I have on the screen.
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapOnView:)];
[self.mainView addGestureRecognizer:tap];

- (void)tapOnView:(UITapGestureRecognizer *)sender
{
    // do something
}
I have also tried -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event, but this has the same issue as the gesture recognizer.
I was wondering if there is any way I could achieve this task. I was hoping that I might be able to recognise the type of event from within the nextResponder, and then detect whether it is a button, for example.
EDIT: The reason I am working on this is that my app needs to stay active and the screen cannot be locked (so I have disabled screen locking). To avoid excessive use of power, I need to dim the screen, but then return the brightness back to the original level once the app is touched. I need this feature to occur on only one of my view controllers.
As mentioned by Ian MacDonald, using hitTest:withEvent: is a great solution to detect user interaction app-wide, including when buttons, text fields, etc., are selected.
My solution was to subclass UIWindow and implement the hitTest method.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // do your stuff here
    // return nil if you want to prevent interaction with UI elements
    return [super hitTest:point withEvent:event];
}
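In case it helps, one common way to make the app use such a UIWindow subclass is to create the window yourself in the app delegate (a minimal sketch, assuming the app does not use a storyboard-managed window; TouchDetectingWindow and YourRootViewController are hypothetical names for the subclass above and your existing root controller):
// AppDelegate.m
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // Replace the default window with the hit-test-overriding subclass.
    self.window = [[TouchDetectingWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.rootViewController = [[YourRootViewController alloc] init]; // hypothetical root controller
    [self.window makeKeyAndVisible];
    return YES;
}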
You could attach your UITapGestureRecognizer to your [[UIApplication sharedApplication] keyWindow].
Alternatively, you could override hitTest: of your root UIView.
Is there a particular task you are hoping to accomplish? There may be a better way than assigning an "anywhere" gesture.
Edit: Use hitTest:.
@interface PassthroughView : UIView
@property (readonly) id target;
@property (readonly) SEL selector;
@end

@implementation PassthroughView

- (void)setTarget:(id)target selector:(SEL)selector {
    _target = target;
    _selector = selector;
}

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    [_target performSelector:_selector];
    return nil;
}

@end
@implementation YourUIViewController {
    PassthroughView *anytouchView;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    // Add this at the end so it's above all other views.
    anytouchView = [[PassthroughView alloc] initWithFrame:self.view.bounds];
    [anytouchView setAutoresizingMask:UIViewAutoresizingFlexibleWidth|UIViewAutoresizingFlexibleHeight];
    [anytouchView setTarget:self selector:@selector(undim)];
    [anytouchView setHidden:YES];
    [self.view addSubview:anytouchView];
}

- (void)undim {
    [anytouchView setHidden:YES];
}

- (void)dim {
    [anytouchView setHidden:NO];
}

@end
Your edit adds more clarity to your question.
The reason I am working on this is that my app needs to stay active
and the screen cannot be locked (so I have disabled screen locking).
To avoid excessive use of power, I need to dim the screen, but then
return the brightness back to the original level once the app is
touched.
Since you are controlling the screen brightness, before dimming the screen you can present one transparent view controller on top of your root controller which does only one job: listen for a tap using a tap gesture recognizer. On tap you can dismiss that view controller and restore the brightness to its previous value.
By doing so you don't have to worry about buttons being tapped, as they will be below the transparent view controller. Since it's a whole new view controller sitting on top of the stack, you don't have to modify your existing code either.
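A rough sketch of that overlay controller could look like this (DimOverlayViewController and previousBrightness are names made up for illustration; you would present it without animation and with a presentation style that keeps the underlying content visible):
// DimOverlayViewController.m (hypothetical)
@implementation DimOverlayViewController {
    CGFloat previousBrightness;
}
- (void)viewDidLoad {
    [super viewDidLoad];
    self.view.backgroundColor = [UIColor clearColor];
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(screenTapped)];
    [self.view addGestureRecognizer:tap];
}
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Remember the current level, then dim.
    previousBrightness = [UIScreen mainScreen].brightness;
    [UIScreen mainScreen].brightness = 0.1f;
}
- (void)screenTapped {
    // Restore brightness and get out of the way.
    [UIScreen mainScreen].brightness = previousBrightness;
    [self dismissViewControllerAnimated:NO completion:nil];
}
@end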
OK, I have had a similar problem before.
As I remember, I subclassed UIWindow for full-screen detection and made it the first responder.
Then I overrode the touch handling there.
You can also use code like this to identify the control that is being touched:
#import <QuartzCore/QuartzCore.h>

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self.view setMultipleTouchEnabled:YES];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Enumerate over all the touches
    [touches enumerateObjectsUsingBlock:^(id obj, BOOL *stop) {
        // Get a single touch and its location
        UITouch *touch = obj;
        CGPoint touchPoint = [touch locationInView:self.view];
        ...
    }];
}
To disable the locking of the screen I used the code below:
[[UIApplication sharedApplication] setIdleTimerDisabled:YES];
I used the following calls to dim or restore the screen brightness:
[[UIScreen mainScreen] setBrightness:0.0f]; // and
[[UIScreen mainScreen] setBrightness:1.0f];

how to disregard touch events in topmost uiview when it is clear and a different uiview can handle them

I have a clear UIView which has gesture recognizers attached to it.
This clear UIView covers the entire superview to allow the gestures to be invoked from anywhere on it.
Under this clear UIView sit different components such as tables, buttons, collection views, etc.
The clear UIView has no idea what is under it at any time.
What I want: if a view under the clear UIView can handle a touch event (or any type of gesture), the clear view should disregard that event, and the event should pass through to the underlying view that can handle it.
I tried
-(UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
but I don't know how to make sure the underlying view can handle it.
- (id)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    id hitView = [super hitTest:point withEvent:event];
    if (hitView == self)
    {
        return nil;
    }
    else
    {
        return hitView;
    }
}
Add this to your clear view.
If the hit lands on the clear view itself, it just returns nil, so the touch falls through to whatever is underneath.
You can override the pointInside:withEvent: method. This method returns a Boolean value indicating whether the receiver contains the specified point. So if we return NO, your upper clear view becomes transparent to touch events and they are passed to the underlying views.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Returning NO means the clear UIView ignores all touch events,
    // so they fall through to the views underneath it.
    return NO;
}
Use the code below for your case:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitTestView = [super hitTest:point withEvent:event];
    if (hitTestView != nil) {
        // check for gestures
        if ([hitTestView.gestureRecognizers count] > 0)
            return hitTestView;
        // if it is a subclass of UIControl, like UIButton etc.
        else if ([hitTestView isKindOfClass:[UIControl class]])
            return hitTestView;
        // if it can handle touches
        else if ([hitTestView respondsToSelector:@selector(touchesBegan:withEvent:)])
            return hitTestView;
        else
            return nil;
    }
    else {
        return self;
    }
}
In the above code, if the subview returned by the hit test can handle the touch in any of these ways, we return that object to handle the touch. If there is no such hit-test view, we return the view itself.
I used some of these suggestions and ended up with the following solution:
I added the gesture recognizer to the bottommost superview in the hierarchy (not the topmost).
Then in that class I overrode:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *v = [super hitTest:point withEvent:event];
    // if v is nil then the touch wasn't in this view or its subviews
    if (v == nil)
    {
        return nil;
    }
    // in any case, if the top view is hidden, return the default value
    if (self.myTopView.hidden)
    {
        return v;
    }
    // if the view isn't hidden but the touch returned a control, pass the touch to the control
    if ([v isKindOfClass:[UIControl class]])
    {
        return v;
    }
    // decide what threshold counts as a touch
    CGFloat threshHold = 40;
    // if the touch wasn't on a control but could initiate a gesture, that view should get the touch
    if (v.gestureRecognizers)
    {
        threshHold = 30;
        // return v;
    }
    // check if the threshold should be bigger
    if ([self someCondition])
    {
        threshHold = 100;
    }
    // threshold according to its position - this is the dynamic part
    if (point.y > (self.myTopView.frame.origin.y - threshHold))
    {
        return self.handleBarView;
    }
    return v;
}

iOS 7.0 User interaction disabled for controls inside UITableView backgroundView

I have a custom view set as the background view of a UITableView, which is used to perform some actions when the table view is empty.
I have some buttons on the view, and an action is associated with each button.
For iOS < 7, the actions are called properly on buttons inside the background view.
But for iOS 7 and later, the actions set on buttons in the backgroundView aren't getting called. It seems interaction is disabled on the backgroundView.
Is this an issue with iOS 7? Has anyone else faced the same issue?
There is a UITableViewWrapperView view sitting in front of the background view intercepting interactions. Can you not just use the table's tableHeaderView property instead?
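If that route works for you, a minimal sketch of swapping the view over might look like this (emptyStateView is a stand-in name for your existing button-containing view, and the height is just an example; this assumes a UITableViewController or a tableView outlet):
// Instead of: self.tableView.backgroundView = emptyStateView;
emptyStateView.frame = CGRectMake(0, 0, self.tableView.bounds.size.width, 200); // example height
self.tableView.tableHeaderView = emptyStateView;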
Firstly, adding touchable items to a table view's background view is a bad approach.
Secondly, I'm not sure my fix will work for your case.
Try implementing the following methods in your BackgroundView class:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView != nil)
    {
        [self.superview bringSubviewToFront:self];
    }
    return hitView;
}

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    CGRect rect = self.bounds;
    BOOL isInside = CGRectContainsPoint(rect, point);
    if (!isInside)
    {
        for (UIView *view in self.subviews)
        {
            isInside = CGRectContainsPoint(view.frame, point);
            if (isInside)
                break;
        }
    }
    return isInside;
}
By overriding those methods you make all subviews of the backgroundView, as well as the backgroundView itself, touch sensitive.
Cheers!
UPDATE:
Sorry, this will not work. The backgroundView sits behind the view that holds the cells, so it will not receive touches.
As specified in my comment, this is a known issue in iOS 7, see the radar (http://openradar.appspot.com/14707569).
My solution, or workaround, was to implement a 'proxy' view on top of the table which offers a protocol to forward the hitTest to a delegate that implements that protocol.
EventFixBackrgoundView.h
@protocol EventFixBackrgoundViewDelegate <NSObject>
- (UIView *)eventFixHitTest:(CGPoint)point withEvent:(UIEvent *)event;
@end

@interface EventFixBackrgoundView : UIView
@property (nonatomic, weak) id <EventFixBackrgoundViewDelegate> delegate;
@end
EventFixBackrgoundView.m
#import "EventFixBackrgoundView.h"
#implementation EventFixBackrgoundView
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
if (self.delegate && [self.delegate respondsToSelector:#selector(eventFixHitTest:withEvent:)])
{
return [self.delegate eventFixHitTest:point withEvent:event];
}
return [super hitTest:point withEvent:event];
}
#end
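The answer doesn't show the delegate side; one possible sketch of it is below (an assumption for illustration: proxyView is an EventFixBackrgoundView laid over the table, and backgroundContentView is a made-up name for the view that actually holds the buttons).
// In the table view controller (hypothetical wiring):
- (UIView *)eventFixHitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Convert the point into the background content view and let it hit-test itself,
    // so its buttons receive the touch even though the table would normally swallow it.
    CGPoint converted = [self.proxyView convertPoint:point toView:self.backgroundContentView];
    return [self.backgroundContentView hitTest:converted withEvent:event];
}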

how to make an object move by touching it in iOS

I am a newbie in Objective-C and am trying to make a simple app in which, when you touch an object, it moves randomly for a while and then stops. Then you need to touch it again so it moves again and stops after a while.
I have searched for touch methods and some tutorials, but the problem is I don't know how to start. It seems like I need a function to move the object and one to detect the touch, but I don't know how to connect them and use them.
Here is a tutorial which helped me a lot to get an idea of the functionality, although it actually works in the opposite way from my app. Still, I cannot start programming it on my own.
http://xcodenoobies.blogspot.se/2010/11/under-construction.html
Any help would be appreciated regarding how to program my logic, how to find the right methods, and what types of variables I need.
Step 1: Put this code in your viewDidLoad method; it creates some UIImageViews and adds them to the view at random positions.
[self.view setTag:1];
for (int i = 0; i < 4; i++)
{
    int x = arc4random() % 300;
    int y = arc4random() % 400;
#warning set Image here
    UIImageView *imgview = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"someImage.png"]];
    [imgview setFrame:CGRectMake(x, y, 25, 25)];
    [imgview setUserInteractionEnabled:YES];
    [self.view addSubview:imgview];
}
Step 2: Define the touchesBegan method to handle touches and move objects around the view. We set tag = 1 on the view controller's view because we don't want to move the main view; only the subviews will be moved.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([[touch view] tag] != 1)
    {
        [UIView animateWithDuration:0.25f animations:^{
            int x = arc4random() % 300;
            int y = arc4random() % 400;
            [[touch view] setCenter:CGPointMake(x, y)];
        }];
    }
}
What you need is to add a gesture recognizer to the view you want to be able to touch:
// In a view controller or in a UIView subclass
UITapGestureRecognizer *tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
[self addGestureRecognizer:tapGestureRecognizer];
// Or [self.view addGestureRecognizer:tapGestureRecognizer];
- (void)handleTap:(UITapGestureRecognizer *)sender
{
    if (sender.state == UIGestureRecognizerStateEnded) {
        // Animate the view, move it around for a while etc.
        // For animating a view use animateWithDuration:animations:
    }
}
If I understand you correctly, I would say the easiest way to achieve what you request is to create a UIButton in Interface Builder and connect it to an IBAction which moves it to a random spot. You can then add a custom graphic to the button.
Create a public method in your view controller with return type IBAction.
Create a button in IB and connect its "Touch Up Inside" event to your IBAction.
In your IBAction method, generate a random x and y coordinate within the screen's bounds and animate the movement to that point.
I won't go into detail on the specific code since it would take way too much space; a rough sketch follows below.
Note that your question is very open and vague, which is not considered good style on StackOverflow. Also, you might want to save stuff like animations until you are a bit more experienced with iOS and Objective-C.
-V
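A rough sketch of such an IBAction, under the assumptions above (the action name is made up for illustration):
// Connected to the button's "Touch Up Inside" event in Interface Builder.
- (IBAction)moveButtonRandomly:(UIButton *)sender
{
    CGSize bounds = self.view.bounds.size;
    // Keep the whole button on screen by subtracting its size from the random range.
    CGFloat x = arc4random_uniform((uint32_t)(bounds.width - sender.frame.size.width));
    CGFloat y = arc4random_uniform((uint32_t)(bounds.height - sender.frame.size.height));
    [UIView animateWithDuration:0.5 animations:^{
        sender.frame = CGRectMake(x, y, sender.frame.size.width, sender.frame.size.height);
    }];
}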
Actually Apple made a demo for this.
http://developer.apple.com/library/ios/#samplecode/MoveMe/Introduction/Intro.html
You can try to modify this code to your needs. The actual methods you were looking for are:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
If this is the answer you were looking for, please mark it as accepted so this question can be considered closed :-).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
is a method called when the user touches the view. If you override this in the main view (the big view), you will have to work out whether the touched point is where the object is, using some helper method as described in your link. Otherwise you can subclass your object's class and implement the method there, so you don't have to explicitly check whether the touched point is on the object or not.
For your requirement, I'd say subclass UIImageView and put the touchesBegan implementation inside it; that will work just fine.
.h file
@interface StarView : UIImageView
@end
.m file
@implementation StarView

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // your code in the link with the related methods
    Destination = CGPointMake(arc4random() % 320, arc4random() % 480);
    // calculate steps based on the speed specified and the distance between the sprite's current location and the new destination
    xamt = ((Destination.x - self.center.x) / speed);
    yamt = ((Destination.y - self.center.y) / speed);
    // schedule the timer to execute moveBall after 0.02 seconds
    mainTimer = [NSTimer scheduledTimerWithTimeInterval:(0.02) target:self selector:@selector(moveBall) userInfo:nil repeats:NO];
}
Don't forget to copy the moveBall method (from the linked tutorial) after this.
In your main view, just make an instance of StarView and add it to the main view, as sketched below.
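A minimal sketch of that last step (the image name and frame are placeholders):
// In the view controller's viewDidLoad
StarView *star = [[StarView alloc] initWithFrame:CGRectMake(100, 100, 50, 50)];
star.image = [UIImage imageNamed:@"star.png"]; // placeholder image name
star.userInteractionEnabled = YES;             // UIImageView has user interaction disabled by default
[self.view addSubview:star];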

Resources