Get all touches on screen - iOS

I'm having a small issue. I'd like to receive all touches on the screen and, for each one, spawn a new video. The problem is that once a video is placed, it intercepts the touch points. I tried various values for locationInView but without any luck so far. Am I looking in the right place?
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint pointOnScreen = [[touches anyObject] locationInView:self.view];
    C4Movie *player = [C4Movie movieNamed:@"inception.mov"];
    player.shouldAutoplay = YES;
    player.loops = YES;
    player.center = pointOnScreen;
    [self.canvas addMovie:player];
}
@end

Try setting the userInteractionEnabled property of each video screen (assuming it is held in some sort of UIView) to NO - that way, touch events will pass through it and continue to be received by your handler.

Yes, you're looking in the right place, and Chris has it right about user interaction. You should try:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint pointOnScreen = [[touches anyObject] locationInView:self.view];
    C4Movie *player = [C4Movie movieNamed:@"inception.mov"];
    player.shouldAutoplay = YES;
    player.loops = YES;
    player.center = pointOnScreen;
    player.userInteractionEnabled = NO;
    [self.canvas addMovie:player];
}
However, you're going to run into an issue with adding videos: unfortunately, the iOS hardware only lets you have 4 video pipelines running at one time, so you'll hit that limit pretty quickly.
If you want to add things to the screen as you touch and drag your finger, then you could also do the above code inside of the following method:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    //work your magic here
}
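Given the four-pipeline limit mentioned above, one way to avoid hitting it is to cap how many players are alive at once. A sketch, assuming a mutable `players` array property and that a C4Movie can be removed from its canvas with removeFromSuperview (both assumptions, since the C4 API isn't shown here):

```objc
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint pointOnScreen = [[touches anyObject] locationInView:self.view];
    if (self.players.count >= 4) {
        // Recycle the oldest player before spawning a new one
        C4Movie *oldest = self.players[0];
        [oldest removeFromSuperview]; // assumption: C4Movie behaves like a UIView
        [self.players removeObjectAtIndex:0];
    }
    C4Movie *player = [C4Movie movieNamed:@"inception.mov"];
    player.shouldAutoplay = YES;
    player.loops = YES;
    player.center = pointOnScreen;
    player.userInteractionEnabled = NO;
    [self.canvas addMovie:player];
    [self.players addObject:player];
}
```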

Related

How to make an animated button in Sprite Kit

I have been trying to implement a method where the user can press down on the play button, which then changes the texture to the pressed-down image. Then, if the user decides not to continue with the action (such as starting the game), they can simply drag out of the sprite's frame/body, and the touch will no longer be treated as one that starts the action.
The problem with this implementation is that I can't make the touch on the button cancel if the user drags outside of the play button's sprite frame.
The code you will see below doesn't have the transition between the scenes, as I'd like to test the button's usability without having to quit the application every time to try the button.
Also, I have declared the majority of objects in the .h file.
Code from MenuScene.m:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    //Touch detection declaration
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInNode:self];
    touchNode = [self nodeAtPoint:touchLocation];
    if ([touchNode.name isEqualToString:@"playButton"]) {
        [playButtonSprite runAction:changePlayButtonTextureON];
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touchNode.name isEqualToString:@"playButton"]) {
        [playButtonSprite runAction:changePlayButtonTextureOFF];
    }
}
I would also like to know if there is an alternative method for detecting whether the touch is on the play button sprite node, though this may be solved once the previous issue is resolved.
The best way to handle this would be to subclass SKSpriteNode to handle touches on its own. You can see such an implementation in this project on GitHub; it cancels the touch when the touch goes out of the node's bounds.
The code you have posted is a good start, since the OFF action will not run when the touch ends outside the play button's bounds. In your case, since you are handling the touches from the scene itself, you can explicitly reset the play button's texture whenever a touch is detected outside the node's bounds.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInNode:self];
    touchNode = [self nodeAtPoint:touchLocation];
    if (![touchNode.name isEqualToString:@"playButton"]) {
        [playButtonSprite runAction:changePlayButtonTextureOFF];
    }
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    //Touch detection declaration
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInNode:self];
    touchNode = [self nodeAtPoint:touchLocation];
    if ([touchNode.name isEqualToString:@"playButton"]) {
        [playButtonSprite runAction:changePlayButtonTextureON];
        playButtonSprite.isON = YES;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (playButtonSprite.isON) {
        if ([touchNode.name isEqualToString:@"playButton"]) { // The user really wants to play!
            [self startPlaying];
        }
        [playButtonSprite runAction:changePlayButtonTextureOFF];
        playButtonSprite.isON = NO;
    }
}
Where changePlayButtonTextureON also sets playButtonSprite.isON to YES, and vice versa.
You will want to make a subclass of SKSpriteNode for playButtonSprite, and add that boolean property isON there.

Smoother sprite rotation in cocos2d?

I have a simple program where when you hold the screen the sprite moves up and when you let go the sprite moves down. So in the ccTouchesBegan function I rotate the sprite:
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    player.rotation = -5;
}
And then in the ccTouchesEnded function I rotate it again:
-(void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    player.rotation = 20;
}
So I was wondering if there is a way to make the rotation a little slower and smoother. It looks very clunky when I run it, and I want it to look more realistic. I am not using a physics engine (Box2D, Chipmunk, etc.).
You can use the CCRotateBy or CCRotateTo action classes to rotate it. For example, in C++:
player->runAction(CCRotateTo::create(1.0f, -5));
Here the first parameter specifies the duration of the rotation (currently 1 second) and the second is the angle. But if you touch the screen too frequently, even this may look weird.
The code below is for cocos2d-iPhone, but it is easy to translate to C++ as well.
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Create a fresh action each time; a finished action instance should not be reused.
    // CCRotateTo matches the absolute rotation values used in the question.
    CCRotateTo *rotTouchBegin = [CCRotateTo actionWithDuration:1.0f angle:-5];
    [player stopAllActions];
    [player runAction:rotTouchBegin];
}
-(void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CCRotateTo *rotTouchEnd = [CCRotateTo actionWithDuration:1.0f angle:20];
    [player stopAllActions];
    [player runAction:rotTouchEnd];
}
If you don't want to stop all the actions running on the player, tag your actions so that you can stop just the rotation, e.g. (kRotateTag is a hypothetical tag constant):
rotTouchBegin.tag = kRotateTag;
[player stopAction:[player getActionByTag:kRotateTag]];

Cocos2d: Creating movement when button is held

I am working on an iPhone game involving a spaceship moving from left to right on the screen. I want the ship to move only while the buttons are pressed. Here is my current code; it creates movement, but the movement doesn't stop when the button is no longer pressed.
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGSize winSize = [CCDirector sharedDirector].winSize;
    NSSet *allTouches = [event allTouches];
    UITouch *touch = [allTouches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];
    if (CGRectContainsPoint([_paddle2 boundingBox], location)) {
        int bottomOfScreenX = 0 + _Paddle1.contentSize.width/2;
        id action = [CCMoveTo actionWithDuration:5 position:ccp(bottomOfScreenX, winSize.height/3)];
        [_starShip runAction:action];
        [action setTag:1];
    }
}
-(void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    [_starShip stopActionByTag:1];
}
I believe it has to do with your use of ccTouchesBegan along with ccTouchEnded. Notice "touches" versus "touch": they need to be consistent. ccTouchesBegan handles multiple touch events, while ccTouchBegan is meant for a single touch event. Since it appears you are dealing with a single touch event, you do not really need ccTouchesBegan; switch it to ccTouchBegan and you should be fine.
The problem with this code is that the touch-began and touch-ended handlers use different argument types: ccTouchesBegan takes an NSSet while ccTouchEnded takes a UITouch. Either declare the ended handler to match the multi-touch form:
-(void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
or declare the began handler in the single-touch form:
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event;
The second option will require a slight tweak to the logic.
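For concreteness, here is a sketch of the second option applied to the question's code: a targeted single-touch pair with matching signatures (in cocos2d the targeted ccTouchBegan: returns a BOOL to claim the touch; the ivars are the ones from the question):

```objc
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGSize winSize = [CCDirector sharedDirector].winSize;
    CGPoint location = [[CCDirector sharedDirector] convertToGL:[touch locationInView:[touch view]]];
    if (CGRectContainsPoint([_paddle2 boundingBox], location)) {
        int bottomOfScreenX = 0 + _Paddle1.contentSize.width/2;
        id action = [CCMoveTo actionWithDuration:5 position:ccp(bottomOfScreenX, winSize.height/3)];
        [action setTag:1];
        [_starShip runAction:action];
    }
    return YES; // claim the touch so ccTouchEnded: is delivered to this layer
}

-(void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    [_starShip stopActionByTag:1];
}
```

Depending on your cocos2d version, the layer must also register as a targeted touch delegate (e.g. via the touch dispatcher's addTargetedDelegate:priority:swallowsTouches:) for these single-touch callbacks to fire.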

How to detect someone tapping outside of a UIImageView

I have a UIImageView that is added as a subview. It shows up when a button is pressed.
When someone taps outside of the UIImageView in any part of the application, I want the UIImageView to go away.
@interface SomeMasterViewController : UITableViewController <clip>
<clip>
@property (strong, nonatomic) UIImageView *someImageView;
There are some hints on Stack Overflow and in Apple's documentation that sound like what I need.
Apple's : Gesture Recognizers
Apple's : UIView hitTest:withEvent
Apple's : UITouch Class Reference
Stackoverflow: Listening to UITouch event along with UIGestureRecognizer
(not likely needed but..) - CGRectContainsPoint as mentioned in the following post titled: Comparing a UITouch location to UIImageView rectangle
However, I want to check my approach here. It's my understanding that the code needs to:
1. Register a UITapGestureRecognizer to get all touch events that can happen in the application
2. Set the UITapGestureRecognizer's cancelsTouchesInView, delaysTouchesBegan, and delaysTouchesEnded properties to NO
3. Compare those touch events with someImageView (how? Using UIView hitTest:withEvent:?)
Update: I am registering a UITapGestureRecognizer with the main UIWindow.
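For reference, that registration might look like the following sketch (all standard UIKit calls; the handler name handleTap: matches the method shown below):

```objc
UIWindow *window = [[[UIApplication sharedApplication] delegate] window];
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleTap:)];
// Let touches continue to reach the views underneath
tap.cancelsTouchesInView = NO;
tap.delaysTouchesBegan = NO;
tap.delaysTouchesEnded = NO;
[window addGestureRecognizer:tap];
```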
Final Unsolved Part
I have a handleTap:(UITapGestureRecognizer *) method that the UITapGestureRecognizer will call. How can I take the UITapGestureRecognizer it is given and see whether the tap falls outside of the UIImageView? The recognizer's locationInView: looks promising, but I do not get the results I expect: I expect to see a certain UIImageView when I click on it and not see the UIImageView when I click in another spot. I get the feeling that the locationInView: method is being used wrong.
Here is my call to the locationInView method:
- (void)handleTap:(UITapGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer.state != UIGestureRecognizerStateEnded) {
        NSLog(@"handleTap NOT given UIGestureRecognizerStateEnded so nothing more to do");
        return;
    }
    UIWindow *mainWindow = [[[UIApplication sharedApplication] delegate] window];
    CGPoint point = [gestureRecognizer locationInView:mainWindow];
    NSLog(@"point x,y computed as the location in a given view is %f %f", point.x, point.y);
    UIView *touchedView = [mainWindow hitTest:point withEvent:nil];
    NSLog(@"touchedView = %@", touchedView);
}
I get the following output:
<clip>point x,y computed as the location in a given view is 0.000000 0.000000
<clip>touchedView = <UIWindow: 0x8c4e530; frame = (0 0; 768 1024); opaque = NO; autoresize = RM+BM; layer = <UIWindowLayer: 0x8c4c940>>
I think you can just say [event touchesForView:<image view>]. If that returns an empty set, dismiss the image view. Do this in the table view controller's touchesBegan:withEvent:, and be sure to call [super touchesBegan:touches withEvent:event] or your table view will completely stop working. You probably don't even need to implement touchesEnded:/Cancelled:..., or touchesMoved:....
UITapGestureRecognizer definitely seems like overkill in this case.
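A sketch of that suggestion, assuming someImageView is the property from the question and this method lives in the table view controller:

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // touchesForView: returns the set of touches that landed on the given view
    if ([[event touchesForView:self.someImageView] count] == 0) {
        // The tap was outside the image view, so dismiss it
        [self.someImageView removeFromSuperview];
    }
    [super touchesBegan:touches withEvent:event]; // keep the table view working
}
```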
You can use touch functions to do that:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
When the user touches the screen, your touchesBegan: method is called first:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint pt = [[touches anyObject] locationInView:self];
}
So you have the point the user touched; you must then determine whether that point is inside your UIImageView or not. But if you can give tags to your UIImageViews, it will be much easier:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (yourImageView.tag != [touch view].tag) {
        // The touch landed on some other view, so dismiss the image view
        [[self.view viewWithTag:yourImageView.tag] removeFromSuperview];
    }
}
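If tags are not convenient, a hit test with CGRectContainsPoint works as well. A sketch, assuming the controller owns the someImageView property shown in the question:

```objc
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Convert the touch into the image view's superview coordinate system,
    // since frame is expressed in the superview's coordinates
    CGPoint pt = [[touches anyObject] locationInView:self.someImageView.superview];
    if (!CGRectContainsPoint(self.someImageView.frame, pt)) {
        [self.someImageView removeFromSuperview];
    }
}
```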

How to move/rearrange position of views on touch event?

Suppose, on my window, I have 3 views: a main one in the background and 2 up front.
Each view up front contains some content I'd like to move as part of the view upon touch. By "moved" I mean "rearrange positions relative to another view": upon touch, I'd like to pick up the view with all of its content and place it in the position currently occupied by another view.
Where would you get started on something like this?
Something like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UIView *touchedView = [[touches anyObject] view];
    // center is in the superview's coordinate system, so convert the point there
    CGPoint location = [[touches anyObject] locationInView:touchedView.superview];
    touchedView.center = location;
}
See also these methods in UIResponder.h:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
First you need to create a GestureRecognizer. Something along the lines of:
UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(doubleTap:)];
doubleTap.numberOfTapsRequired = 2;
[self addGestureRecognizer:doubleTap];
and add it to whatever view you want (or all three). From the sounds of it your main background view makes the most sense. Then create the doubleTap method, which will make one of your views move to where the other is:
-(void)doubleTap:(id)sender {
    view1.frame = view2.frame;
    [self bringSubviewToFront:view1];
}
I would also make sure all your subview content has its autoresizingMask set according to how you want the subviews to behave.
