iOS Button Custom Image bounds

I have multiple buttons in the shape of an octagon, and they are all right next to each other, sharing edges. However, when I import the custom images for these buttons, the bounds of each custom shape is a square. As a result, part of one octagon's square frame overlaps the neighboring button. The visible octagon doesn't overlap; the hidden corners of its rectangular frame do, so a hidden part of one button sits on top of another. How do I make a button respond to touches only within the bounds of its custom shape?

Subclass UIButton and override -(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event. Check whether the point is inside your octagon shape and return YES in that case, otherwise NO. The buttons' frames will technically still overlap, but each button will consume touch events only inside its own shape.
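A minimal sketch of such a subclass, assuming the octagon is inscribed in the button's bounds with all four corners cut at equal insets (the OctagonButton name and the 0.29 inset factor are illustrative; for a regular octagon inscribed in a square of side s, the corner cut is s/(2+√2) ≈ 0.29s):

```objc
@interface OctagonButton : UIButton
@end

@implementation OctagonButton

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    CGRect b = self.bounds;
    CGFloat inset = b.size.width * 0.29f; // corner cut for a regular octagon
    // Trace the octagon clockwise from the top-left cut corner.
    UIBezierPath *path = [UIBezierPath bezierPath];
    [path moveToPoint:CGPointMake(CGRectGetMinX(b) + inset, CGRectGetMinY(b))];
    [path addLineToPoint:CGPointMake(CGRectGetMaxX(b) - inset, CGRectGetMinY(b))];
    [path addLineToPoint:CGPointMake(CGRectGetMaxX(b), CGRectGetMinY(b) + inset)];
    [path addLineToPoint:CGPointMake(CGRectGetMaxX(b), CGRectGetMaxY(b) - inset)];
    [path addLineToPoint:CGPointMake(CGRectGetMaxX(b) - inset, CGRectGetMaxY(b))];
    [path addLineToPoint:CGPointMake(CGRectGetMinX(b) + inset, CGRectGetMaxY(b))];
    [path addLineToPoint:CGPointMake(CGRectGetMinX(b), CGRectGetMaxY(b) - inset)];
    [path addLineToPoint:CGPointMake(CGRectGetMinX(b), CGRectGetMinY(b) + inset)];
    [path closePath];
    // `point` is already in the receiver's coordinate system.
    return [path containsPoint:point];
}

@end
```

Because only the touch handling changes, the overlapping square frames can stay exactly where they are in the layout.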

Related

How to make just the circle area of a button clickable, instead of the entire original rect [duplicate]

I need to make some triangular buttons that overlap each other.
While UIButtons can take transparent images as backgrounds, and UIControls can have custom views, the hit area of these is always rectangular. How can I create a triangular hit area for my buttons?
I come from a Flash background, so I would normally create a hit area for my view, but I don't believe I can do this in Cocoa.
Any tips?
You can achieve this by subclassing UIButton and providing your own:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
// Return YES if point lies inside your triangular shape; otherwise, NO.
}
Apple's UIView documentation provides the details, such as confirming that point is already in the receiver's coordinate system.
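For example, the stub above could be filled in with a UIBezierPath describing the triangle (a sketch, assuming the apex sits at the top center and the base runs along the bottom of the bounds; adjust the three points to match your image):

```objc
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    CGRect b = self.bounds;
    UIBezierPath *tri = [UIBezierPath bezierPath];
    [tri moveToPoint:CGPointMake(CGRectGetMidX(b), CGRectGetMinY(b))];
    [tri addLineToPoint:CGPointMake(CGRectGetMaxX(b), CGRectGetMaxY(b))];
    [tri addLineToPoint:CGPointMake(CGRectGetMinX(b), CGRectGetMaxY(b))];
    [tri closePath];
    return [tri containsPoint:point];
}
```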

iOS touch event triggering by movement of other view

Here's the scenario I am trying to implement:
I already have a view that lets the user draw doodles by touching inside it directly (like a doodle canvas). This view implements the touchesBegan, touchesMoved, and touchesEnded handlers to draw lines from the touch event parameters.
Now, instead of that, I want the user to be able to drag another UIView around on this canvas view and still draw lines, just as if they were touching it directly. For example, the user could drag a pen image view across the canvas to draw pen-style lines.
In this case, how can I transfer the movement of the pen image view to the canvas so it recognizes it? One more question: if I want the canvas view to recognize only the movement of dragged views, and not direct touches, what should I do?
(Sorry this question is a little general; I just want some pointers.) Thanks!
A better way to look at the problem is:
How can I transfer movement on the canvas into the location of a pen image view?
That's easy. You already have all the code that keeps track of movement in the canvas (touchesBegan, touchesMoved, touchesEnded), so all you need to do is change the center property of the pen view to track the movement in the canvas. (Obviously, you'll need to apply small X and Y offsets to put the center of the pen view at the correct location.)
The only non-obvious detail that you need to be aware of is that the pen view must have userInteractionEnabled set to NO. That way, the pen view won't interfere with touches reaching the canvas view.
Note that UIImageView has user interaction disabled by default, so you don't need to do anything if the pen view is a UIImageView. However, if you're using a generic UIView to display the pen, then you need to disable user interaction in storyboard under the Attributes inspector or disable it in code, e.g. in viewDidLoad.
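A sketch of the canvas view's touch handling under these assumptions (penView and -addLineToPoint: are hypothetical names for the pen image view property and your existing drawing routine; the offsets are placeholders for aligning the pen's tip with the touch point):

```objc
// Somewhere during setup: keep the pen view from intercepting touches.
// self.penView.userInteractionEnabled = NO;

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self];
    // Offset so the pen's tip, not its center, sits at the touch point.
    self.penView.center = CGPointMake(p.x + 12.0f, p.y - 24.0f);
    [self addLineToPoint:p]; // your existing drawing code
}
```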

Touch Event clicking UIImageView while rotating

My ImageView rotates, but while it rotates it doesn't recognize touches on itself.
Do you think it's okay if I create a button in code that sits over the ImageView and recognizes the touch?
When an animation is applied to a UIView, or to any UIView subclass object such as a UIImageView or UIButton, the view does not detect touch events, because when an animation is applied to a view, the animated property changes to its end value right away. What you are actually seeing on screen is the presentation layer of your view's layer.
To answer your question: yes, you can make a UIButton that covers the area of the UIImageView to detect touch events on it. That sounds like the easiest option to implement in this case.
Apart from that, this link may also help: Hit testing animating layers.
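Alternatively, the superview can hit-test the animating view directly. A sketch of that approach (imageView is a placeholder for your rotating image view; the key point is that hit testing is done against the in-flight presentation layer, not the model layer):

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self];
    // The presentation layer reflects the view's on-screen, mid-animation
    // position; the model layer has already jumped to the end value.
    if ([self.imageView.layer.presentationLayer hitTest:p]) {
        // The rotating image view was tapped.
    }
}
```

CALayer's hitTest: expects the point in the receiver's superlayer coordinate space, which here matches the superview's coordinates.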

Custom UIButton lower part does not recognize touch-up inside

I created a custom UIButton and placed it onto a view that shows certain information about a package. I want the whole area to be touchable as a button; that was my starting point. The problem is that only the upper half of the custom button is touchable, and the lower part is ignored. I set the background color to a solid one, and the frame seems to be OK.
UPDATE: If I add the same custom button to the superview it seems to be OK, but the coordinates are not right in that case. I need to convert the coordinates from the subview to the superview.
After struggling for many hours, I figured out that my outer frame was smaller than it was supposed to be.
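For the coordinate conversion mentioned in the update, UIView's convertRect:fromView: does the work. A sketch (infoView and button are placeholders for the view that originally contained the button and the button itself):

```objc
// Move the button to the superview while keeping its on-screen position.
CGRect frameInSuperview = [self.view convertRect:self.button.frame
                                        fromView:self.infoView];
self.button.frame = frameInSuperview;
[self.view addSubview:self.button];
```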

Using Cocos2d, to detect if a Sprite is tapped on, we need to do all the calculations?

For example, say we have 10 rectangular sprites, generated with random widths, heights, positions, and z-indices. Now the method
-(void) ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
is called. How do we know which sprite was tapped? I know the usual technique checks whether the tapped point is within the bounds of the sprite's rectangle, but in the case described above, what if rect A sits on top of rect B at B's top-left corner? When that corner is tapped, it could be rect A that was actually tapped; the tap point is inside both rects.
Do we have to do this manually, even taking the z-index into account (possibly looping through all sprites from the highest z-index to the lowest)?
What if the sprite is a triangle, and rotating? Isn't there a built-in way in Cocos2d that handles that?
(I ask because I went through some Core Graphics sample code a few days ago... it seems that in that case there would be two tap events, one for the main view and one for the subview, and we could check which view the user tapped without doing any calculation.)
A possible solution is a CCSprite subclass that registers itself as a CCStandardTouchDelegate or CCTargetedTouchDelegate, and then performs the appropriate operations on the sprite in those delegate methods.
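A sketch of such a subclass, assuming the cocos2d 2.x touch dispatcher API (the class name is illustrative). Returning YES from ccTouchBegan: together with swallowsTouches:YES claims the touch, so sprites below won't also receive it, which addresses the z-order question; for rotated or non-rectangular sprites, replace the bounds check with your own shape test in node space:

```objc
@interface TouchableSprite : CCSprite <CCTargetedTouchDelegate>
@end

@implementation TouchableSprite

- (void)onEnter {
    [super onEnter];
    // Lower priority values receive touches first, so the priority could
    // be derived from the sprite's z-order.
    [[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self
                                                              priority:0
                                                       swallowsTouches:YES];
}

- (void)onExit {
    [[[CCDirector sharedDirector] touchDispatcher] removeDelegate:self];
    [super onExit];
}

- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    // Node space accounts for the sprite's position, scale, and rotation.
    CGPoint p = [self convertTouchToNodeSpace:touch];
    CGRect local = CGRectMake(0, 0, self.contentSize.width,
                              self.contentSize.height);
    return CGRectContainsPoint(local, p); // YES claims (and swallows) the touch
}

@end
```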
