Cocos2d ccDrawLine performance issue - iOS

I use cocos2d 2.0 and Xcode 4.5. I am trying to learn how to draw a line. I can draw a line, but after I draw a few lines a serious performance issue occurs on the Simulator.
The Simulator starts to freeze and draws lines very, very slowly, and worst of all (I guess because -(void)draw is called every frame) the label on the screen becomes bold.
before lines:
after lines:
I use the following code:
.m
-(id) init
{
    if( (self = [super init]) ) {
        CCLabelTTF *label = [CCLabelTTF labelWithString:@"Simple Line Demo" fontName:@"Marker Felt" fontSize:32];
        label.position = ccp(240, 300);
        [self addChild:label];
        _naughtytoucharray = [[NSMutableArray alloc] init];
        self.isTouchEnabled = YES;
    }
    return self;
}
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    BOOL isTouching;
    // determine if it's a touch you want, then return the result
    return isTouching;
}
-(void) ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint new_location = [touch locationInView:[touch view]];
    new_location = [[CCDirector sharedDirector] convertToGL:new_location];
    CGPoint oldTouchLocation = [touch previousLocationInView:touch.view];
    oldTouchLocation = [[CCDirector sharedDirector] convertToGL:oldTouchLocation];
    oldTouchLocation = [self convertToNodeSpace:oldTouchLocation];
    // add my touches to the naughty touch array
    [_naughtytoucharray addObject:NSStringFromCGPoint(new_location)];
    [_naughtytoucharray addObject:NSStringFromCGPoint(oldTouchLocation)];
}
-(void)draw
{
    [super draw];
    ccDrawColor4F(1.0f, 0.0f, 0.0f, 100.0f);   // note: alpha is expected in the 0.0f-1.0f range
    for (int i = 0; i < [_naughtytoucharray count]; i += 2)
    {
        CGPoint start = CGPointFromString([_naughtytoucharray objectAtIndex:i]);
        CGPoint end = CGPointFromString([_naughtytoucharray objectAtIndex:i + 1]);
        ccDrawLine(start, end);
    }
}
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    ManageTraffic *line = [ManageTraffic node];
    [self addChild:line z:99 tag:999];
}
I have seen a few Air Traffic Control games, such as Flight Control and ATC Mania, that work really well.
Does this performance issue occur because of ccDrawLine/UITouch *touch, or is it a common issue?
What might Flight Control and ATC Mania be using for line drawing?
Thanks in advance.
EDIT:
OK, I guess the problem is not ccDrawLine. The problem is that I call ManageTraffic *line = [ManageTraffic node]; every time a touch ends, which calls the node's init and overrides the scene:
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    ManageTraffic *line = [ManageTraffic node];
    [self addChild:line z:99 tag:999];
}
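A minimal sketch of one way to avoid re-adding the node on every touch end, assuming tag 999 from the snippet above (getChildByTag: is the standard cocos2d lookup):
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Only create the ManageTraffic node once; later touches reuse it.
    if ([self getChildByTag:999] == nil) {
        ManageTraffic *line = [ManageTraffic node];
        [self addChild:line z:99 tag:999];
    }
}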

There are three things going on:
1. You assess performance on the Simulator. Test it on a device, as Ben says.
2. You store points as strings and convert the strings back to CGPoint. That is terribly inefficient.
3. ccDrawLine is not exactly efficient. For a couple dozen line segments it's OK; in your case maybe not (see below).
For #2, create a point class with only a CGPoint property and use that to store points in the array. That removes the string conversion (or packing into NSData).
For #3, make sure new points are only added if the new point is at least n points away from the previous point. A distance of 10, for example, should reduce the number of points while still allowing for relatively fine line detail. A sketch of both ideas follows below.
Also regarding #3, I notice you add both the current and the previous point to the array. Why? You only need to add the new point, then draw from index 0 to 1, from 1 to 2, and so on; you only have to handle the case where there is a single point. The previous touch event's location is always the next touch event's previousLocation, so you are storing twice as many points as you need to.
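A minimal sketch of both suggestions, assuming the layer keeps its points in an NSMutableArray ivar named _points (the LinePoint class name and the 10-point threshold are illustrative, not from the question):
// A tiny wrapper so CGPoints can live in an NSArray without string conversion.
@interface LinePoint : NSObject
@property (nonatomic, assign) CGPoint point;
+ (instancetype)pointWithCGPoint:(CGPoint)p;
@end

@implementation LinePoint
+ (instancetype)pointWithCGPoint:(CGPoint)p
{
    LinePoint *lp = [[LinePoint alloc] init];
    lp.point = p;
    return lp;
}
@end

// In the layer: only keep a new point if it is far enough from the last one.
-(void) ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];

    LinePoint *last = [_points lastObject];
    // 10 points is an assumed threshold; tune it to taste.
    if (last == nil || ccpDistance(last.point, location) >= 10.0f) {
        [_points addObject:[LinePoint pointWithCGPoint:location]];
    }
}

-(void) draw
{
    [super draw];
    ccDrawColor4F(1.0f, 0.0f, 0.0f, 1.0f);
    // Draw consecutive points; no previousLocation bookkeeping needed.
    for (NSUInteger i = 1; i < [_points count]; i++) {
        CGPoint start = [(LinePoint *)[_points objectAtIndex:i - 1] point];
        CGPoint end   = [(LinePoint *)[_points objectAtIndex:i] point];
        ccDrawLine(start, end);
    }
}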

Related

How to create a button in Sprite Kit on iOS (similar to a toggle button)

I am trying to make a button in Sprite Kit using SKSpriteNode. I want the button image to change when it is pressed and revert back to the old image as soon as the press ends. What I have done till now is the following:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.startTouch = [[touches allObjects][0] locationInNode:self];
    for (UITouch *touch in touches) {
        CGPoint position = [touch locationInNode:self];
        SKNode *node = [self nodeAtPoint:position];
        if ([node.name isEqualToString:@"missileButton"]) {
            TEMissileButtonNode *button = (TEMissileButtonNode *)node;
            button.isPressed = YES;
        }
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint position = [touch locationInNode:self];
        SKNode *node = [self nodeAtPoint:position];
        if ([node.name isEqualToString:@"missileButton"]) {
            TEMissileButtonNode *button = (TEMissileButtonNode *)node;
            button.isPressed = NO;
        }
    }
}
Inside the update method I am calling this method to check if the touch has ended:
-(void)changeMissileButton {
    if (self.missileButton.isPressed) {
        [self.missileButton addMoreMissileButtons];
        [self.missileButton setTexture:[SKTexture textureWithImageNamed:@"missileButtonPressed"]];
    } else {
        [self.missileButton setTexture:[SKTexture textureWithImageNamed:@"missileButtonDeselected"]];
        [self.missileButton hideMissileButtons];
    }
}
The issue is that the touch doesn't get registered at times. Sometimes it works the way I want: when I touch it, its texture changes, and when I remove my finger, the texture reverts back to the old one. But most of the time the button doesn't react to my touch. Am I missing something?
Use touchesBegan, touchesMoved, touchesEnded.
touchesBegan - check if the button (SKSpriteNode) contains the touch. If so, change your button texture.
touchesMoved - if the button does not contain the touch, change the texture back.
touchesEnded - if the touch stayed within the button during touchesMoved, change the texture back.
That is the logic for efficiently accomplishing what you are trying to do (see the sketch below). Yes, it is annoying that Sprite Kit decided to abandon buttons.
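A minimal sketch of that logic, assuming the button lives in a missileButton property that is a direct child of the scene and a BOOL ivar tracks whether the press is still inside it (the _touchIsOnButton ivar name is illustrative; the texture names come from the question):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint position = [[touches anyObject] locationInNode:self];
    if ([self.missileButton containsPoint:position]) {
        _touchIsOnButton = YES;
        [self.missileButton setTexture:[SKTexture textureWithImageNamed:@"missileButtonPressed"]];
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint position = [[touches anyObject] locationInNode:self];
    if (_touchIsOnButton && ![self.missileButton containsPoint:position]) {
        // Finger slid off the button: revert the texture and cancel the press.
        _touchIsOnButton = NO;
        [self.missileButton setTexture:[SKTexture textureWithImageNamed:@"missileButtonDeselected"]];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (_touchIsOnButton) {
        // Touch stayed within the button: treat it as a tap, then revert.
        _touchIsOnButton = NO;
        [self.missileButton setTexture:[SKTexture textureWithImageNamed:@"missileButtonDeselected"]];
    }
}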

Unable to apply impulse twice

I'm trying to create objects and apply an impulse to them.
There is no problem the first time, but it looks like I have to wait until the last one is removed in order to apply an impulse to a new object. The object is created, but the impulse is not applied:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touchlocation = [[touches anyObject] locationInNode:self];
    MySpriteNode *object = [MySpriteNode node];
    object.position = touchlocation;
    [self addChild:object];
    [object.physicsBody applyImpulse:CGVectorMake(0.0, 9.0)];
    [self performSelector:@selector(removeObject:) withObject:object afterDelay:1.5];
}
It seems the problem comes from creating the physicsBody like this:
[SKPhysicsBody bodyWithTexture:texture size:object.size];
The first time the impulse is really strong, and the following ones are really soft, so it looks like it doesn't work if you compare them with the first impulse.
I really hope it's a bug in the iOS 8 beta.
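The question doesn't show how MySpriteNode configures its body; a sketch of the setup being described might look like the following inside the node's initializer (the texture name is an illustrative assumption):
// Illustrative setup, not shown in the question.
SKTexture *texture = [SKTexture textureWithImageNamed:@"object"];
self.physicsBody = [SKPhysicsBody bodyWithTexture:texture size:self.size];
self.physicsBody.dynamic = YES;
// Swapping in a simpler body is one way to check whether bodyWithTexture:size:
// is what makes the later impulses appear weak:
// self.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:self.size];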

Cocos2d: creating movement when a button is held

I am working on an iPhone game involving a spaceship moving from left to right on the screen. I want the ship to move only while the buttons are pressed. Here is my current code that creates movement, but it doesn't stop when the button is no longer pressed.
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGSize winSize = [CCDirector sharedDirector].winSize;
    NSSet *allTouches = [event allTouches];
    UITouch *touch = [allTouches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];
    if (CGRectContainsPoint([_paddle2 boundingBox], location)) {
        int bottomOfScreenX = 0 + _Paddle1.contentSize.width / 2;
        id action = [CCMoveTo actionWithDuration:5 position:ccp(bottomOfScreenX, winSize.height / 3)];
        [_starShip runAction:action];
        [action setTag:1];
    }
}
-(void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    [_starShip stopActionByTag:1];
}
I believe it has to do with your use of "ccTouchesBegan" along with "ccTouchEnded". Notice "touches" versus "touch": they need to be consistent. ccTouchesBegan handles multiple touch events, while ccTouchBegan is meant for a single touch event. Since it appears you are dealing with a single touch event, you do not really need ccTouchesBegan; switch it to ccTouchBegan and you should be fine.
The problem with this code is that the touch-ended and touch-began handlers take different argument types: the began handler takes an NSSet while the ended handler takes a UITouch. Either declare the ended handler as
-(void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
or declare the began handler as
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event;
The second option requires a slight tweak to the logic; a sketch of it follows below.
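A minimal sketch of the second option using cocos2d's targeted touch delegate (the _paddle2, _Paddle1, and _starShip ivars come from the question; the BOOL return value tells the dispatcher whether this layer claims the touch):
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGSize winSize = [CCDirector sharedDirector].winSize;
    CGPoint location = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];

    if (CGRectContainsPoint([_paddle2 boundingBox], location)) {
        int bottomOfScreenX = 0 + _Paddle1.contentSize.width / 2;
        id action = [CCMoveTo actionWithDuration:5 position:ccp(bottomOfScreenX, winSize.height / 3)];
        [action setTag:1];
        [_starShip runAction:action];
        return YES;   // claim this touch so ccTouchEnded is called for it
    }
    return NO;
}

-(void) ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    [_starShip stopActionByTag:1];
}
For these callbacks to fire, the layer also has to register as a targeted delegate with the touch dispatcher (for example in registerWithTouchDispatcher), which is assumed here rather than shown in the question.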

Knob rotation gesture recognizer

I'm trying to create a gesture recognizer able to detect the rotation of 4 fingers (similar to when you rotate a volume knob).
The main idea was to create a subclass of UIRotationGestureRecognizer and override its methods. In -touchesBegan I detect the number of touches; if the number is lower than 4, the state of the gesture is set to failed. After that I pass the location points to an algorithm that finds the diameter of their convex hull. If you think about it, your fingers are the vertices, and I just need to find the two vertices with the maximum distance between them. Having obtained these two points, I keep references to them as ivars and pass them to the superclass as if this were a simple rotation with just two fingers.
It doesn't work:
the detection of the touches seems pretty hard
-touchesMoved is called very rarely
when it is called, it hangs most of the time
Can someone help me?
Here is the code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (touches.count < 4) {
        // FAIL
        self.state = UIGestureRecognizerStateFailed;
        return;
    }
    // Find the diameter of the convex hull
    NSArray *touchesArray = [touches allObjects];
    NSMutableArray *pointsArray = @[].mutableCopy;
    for (UITouch *touch in touchesArray) {
        [pointsArray addObject:[NSValue valueWithCGPoint:[touch locationInView:touch.view]]];
    }
    DiameterType convexHullDiameter = getDiameterFromPoints(pointsArray);
    CGPoint firstPoint = convexHullDiameter.firstPoint;
    CGPoint secondPoint = convexHullDiameter.secondPoint;
    for (UITouch *touch in touchesArray) {
        if (CGPointEqualToPoint([touch locationInView:touch.view], firstPoint)) {
            self.fistTouch = touch;
        }
        else if (CGPointEqualToPoint([touch locationInView:touch.view], secondPoint)) {
            self.secondTouch = touch;
        }
    }
    // Calculating the rotation center as a mid point between the diameter vertices
    CGPoint rotationCenter = (CGPoint) {
        .x = (convexHullDiameter.firstPoint.x + convexHullDiameter.secondPoint.x) / 2,
        .y = (convexHullDiameter.firstPoint.y + convexHullDiameter.secondPoint.y) / 2
    };
    self.rotationCenter = rotationCenter;
    // Passing touches to super as a fake rotation gesture
    NSSet *touchesSet = [[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil];
    [super touchesBegan:touchesSet withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (touches.count < 4) {
        self.state = UIGestureRecognizerStateFailed;
        return;
    }
    [super touchesMoved:[[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil] withEvent:event];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:[[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil] withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:[[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil] withEvent:event];
}
The reason initial detection is hard is that all the touches may not start at the same time. touchesBegan will likely be called multiple times as separate touches land on the screen. You can use the event parameter to query all of the current touches with event.allTouches. So your current approach to triggering failure will not work: do not set the state to failed if touches.count < 4; instead, just return if event.allTouches.count < 4. You could use a timer to set the state to failed if the fourth touch does not happen within a certain time of the first (see the sketch below).
touchesMoved likely has problems because the touches in the event object do not match up with those in the set that you pass to super.
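A minimal sketch of that change in touchesBegan (the failure timer is only hinted at; the delay value and selector name would be up to you):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Touches can land one at a time, so count all touches on the event,
    // not just the ones delivered in this call.
    if (event.allTouches.count < 4) {
        // Don't fail yet; just wait for the remaining fingers
        // (optionally start a timer here that fails the gesture if
        // the fourth finger never arrives).
        return;
    }
    NSArray *touchesArray = [event.allTouches allObjects];
    // ... continue with the convex-hull / diameter logic on touchesArray ...
}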
If you think about it, your fingers are the vertices and I just need to find the two vertices with the max distance.
I don't think this will work in practice, even if you are able to trick the UIGestureRecognizer.
This is how I would implement the algorithm the 'correct' way:
Remember the 'old' touches.
When you're given 'new' touches, try to match each finger to a previous touch. If you can't, fail.
Compute the center of the 'new' + 'old' touches.
For each of the 4 fingers identified two steps ago, compute the angle traveled in radians, approximated as the length of new(i) - old(i) divided by the distance to the center.
If any angle is too big (> 0.5), fail. This guarantees that the approximation is valid.
Now compute the average of the 4 angles.
Congratulations, you now have the rotation angle (measured in radians). A sketch of this computation follows below.
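A sketch of that computation, assuming the matched 'old' and 'new' points are kept in parallel arrays of NSValue-wrapped CGPoints (the function names, the small-angle approximation, and the sign convention via the cross product are illustrative choices on top of the steps above):
#import <UIKit/UIKit.h>
#import <math.h>

static CGPoint centerOfPoints(NSArray *points) {
    CGPoint c = CGPointZero;
    for (NSValue *v in points) {
        CGPoint p = v.CGPointValue;
        c.x += p.x; c.y += p.y;
    }
    c.x /= points.count; c.y /= points.count;
    return c;
}

// Returns the average rotation angle in radians, or NAN if the approximation breaks down.
static CGFloat rotationAngle(NSArray *oldPoints, NSArray *newPoints) {
    NSMutableArray *all = [oldPoints mutableCopy];
    [all addObjectsFromArray:newPoints];
    CGPoint center = centerOfPoints(all);

    CGFloat sum = 0;
    for (NSUInteger i = 0; i < oldPoints.count; i++) {
        CGPoint o = [oldPoints[i] CGPointValue];
        CGPoint n = [newPoints[i] CGPointValue];
        CGFloat radius = hypot(o.x - center.x, o.y - center.y);
        CGFloat arc = hypot(n.x - o.x, n.y - o.y);
        CGFloat angle = arc / radius;      // small-angle approximation
        if (angle > 0.5) return NAN;       // too big: approximation invalid, fail
        // Sign taken from the z component of the cross product (center->old) x (old->new).
        CGFloat cross = (o.x - center.x) * (n.y - o.y) - (o.y - center.y) * (n.x - o.x);
        sum += (cross < 0) ? -angle : angle;
    }
    return sum / oldPoints.count;
}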
I would put this in a comment if I had enough rep.
[super touchesMoved:[[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil] withEvent:event];
You're using something called fistTouch, which doesn't sound like what you want. My guess is you want firstTouch.
Additionally, there are possible collisions between gestures that may be overriding each other. Did you know there is a system-wide 4-finger zoom-out gesture in iOS 7? Also, a 4-finger zoom-in during an app will close it.

Get all touches on screen

I'm having a small issue. I'd like to receive all touches on the screen and, for each one, spawn a new video. The problem is that once a video is placed, it intercepts the touch points. I have tried various values for locationInView but without any luck so far. Am I looking in the right place?
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint pointOnScreen = [[touches anyObject] locationInView:self.view];
    C4Movie *player = [C4Movie movieNamed:@"inception.mov"];
    player.shouldAutoplay = YES;
    player.loops = YES;
    player.center = pointOnScreen;
    [self.canvas addMovie:player];
}
@end
Try setting the userInteractionEnabled property of each video screen (assuming it is held in some sort of UIView) to NO - that way, touch events will pass through it and continue to be received by your handler.
Yes, you're looking in the right place, and Chris has it right about user interaction. You should try:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint pointOnScreen = [[touches anyObject] locationInView:self.view];
    C4Movie *player = [C4Movie movieNamed:@"inception.mov"];
    player.shouldAutoplay = YES;
    player.loops = YES;
    player.center = pointOnScreen;
    player.userInteractionEnabled = NO;
    [self.canvas addMovie:player];
}
However, you're going to run into an issue with adding videos. Unfortunately, iOS and the hardware only let you have 4 video pipelines running at one time, so you'll hit that limit pretty quickly.
If you want to add things to the screen as you touch and drag your finger, then you could also do the above code inside of the following method:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // work your magic here
}
