I'm currently working on a game where the main character rides a ship, and when an enemy is parallel to the ship it drops a tube. My main problem is that the tube is bigger than the ship, so it is visible from behind while it is going down or up. Please note that the image on top of the tube (the ship) is a transparent image. Thanks!
You can clip draw regions in Cocos2d without too much effort. If you add this code to the tube object, you can define a suitable region in which to draw it. Anything outside of this rectangle doesn't get drawn.
-(void) visit
{
    if (!self.visible)
        return;

    glEnable(GL_SCISSOR_TEST);

    CGRect thisClipRegion = _clipRegion;
    thisClipRegion = CC_RECT_POINTS_TO_PIXELS(thisClipRegion);
    glScissor(thisClipRegion.origin.x, thisClipRegion.origin.y, thisClipRegion.size.width, thisClipRegion.size.height);

    [super visit];

    glDisable(GL_SCISSOR_TEST);
}
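The clip rectangle itself has to come from somewhere; here is a minimal sketch of how you might back it with an ivar and size it so the tube is only drawn below the ship (the `Tube` class name and the `ship` sprite are assumptions for illustration, not from the question):

```objc
// Interface for the tube node, backing the _clipRegion ivar used in visit.
@interface Tube : CCSprite {
    CGRect _clipRegion;
}
@property (nonatomic, assign) CGRect clipRegion;
@end

// When setting up the scene: only draw the tube below the ship's bottom edge.
CGSize winSize = [[CCDirector sharedDirector] winSize];
CGRect shipBox = ship.boundingBox; // 'ship' is the hypothetical ship sprite
tube.clipRegion = CGRectMake(0, 0, winSize.width, shipBox.origin.y);
```

Note that glScissor works in screen pixels, which is why the rect is run through CC_RECT_POINTS_TO_PIXELS before use.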
I think you have to manage two tube images: one big, and one small that fits your ship.
You have to change the tube image when the tube is dropped.
To change the tube image, use this code:
CCTexture2D *tex = [[CCTextureCache sharedTextureCache] addImage:@"blast.png"];
[player setTexture:tex];
Here, player is a CCSprite:
CCSprite *player;
I'm trying to use an SKEmitterNode to create a shader, kind of like in Pokemon when you are in a cave:
http://www.serebii.net/pokearth/maps/johto-hgss/38-route31.png
Here is the code I have so far:
NSString *burstPath =
[[NSBundle mainBundle] pathForResource:@"MyParticle" ofType:@"sks"];
SKNode *area = [[SKNode alloc] init];
SKSpriteNode *background = [SKSpriteNode spriteNodeWithColor:[SKColor blackColor] size:CGSizeMake(self.frame.size.width, self.frame.size.height)];
background.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
SKEmitterNode *burstNode =
[NSKeyedUnarchiver unarchiveObjectWithFile:burstPath];
burstNode.position = CGPointMake(CGRectGetMidX(self.frame),CGRectGetMidY(self.frame));
burstNode.particleBlendMode = SKBlendModeSubtract;
[area addChild:background];
[area addChild:burstNode];
[self addChild:area];
Here is the SKEmitterNode: http://postimg.org/image/60zflqjzt/
I've had two ideas.
The first one was to create a rectangular SKSpriteNode and subtract the SKEmitterNode from it. That way, we have a black rectangle with a "hole" in the center that we can see through.
The second one was to add the rectangular SKSpriteNode and the SKEmitterNode to another SKNode (area), then set the particleBlendMode of the SKEmitterNode, and finally set the alpha of the SKNode (area) as a function of the color. For example, if the color of a pixel is black, the alpha value of that pixel will be 1.0; if another pixel is white, that pixel's alpha value will be 0.0.
This question is in some ways a possible duplicate of How to create an alpha mask in iOS using sprite kit, but since no good answer has been given there, I assume it isn't a problem.
Thank you very much.
These are not the nodes you are looking for! ;)
Particles can't be used to make a fog of war; even if you could make them behave well enough to generate one, it would be prohibitively slow.
Based on the linked screenshot you really only need an image with a "hole" in it, a transparent area. The image should be screen-sized and just cover up the borders to whichever degree you need it. This will be a non-revealing fog of war, or rather just the effect of darkness surrounding the player.
A true fog of war implementation where you uncover the world's area typically uses a pattern, in its simplest form it would just be removing (fading out) rectangular black sprites.
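In its simplest form, that pattern could be sketched like this (the tile size, `zPosition`, and node name are assumptions for illustration, not something from the question):

```objc
// Cover the scene with a grid of black tiles that can be faded out
// individually as the player uncovers the map.
CGFloat tileSize = 32.0; // assumed tile size
for (CGFloat x = 0; x < self.size.width; x += tileSize) {
    for (CGFloat y = 0; y < self.size.height; y += tileSize) {
        SKSpriteNode *fog = [SKSpriteNode spriteNodeWithColor:[SKColor blackColor]
                                                         size:CGSizeMake(tileSize, tileSize)];
        fog.position = CGPointMake(x + tileSize / 2, y + tileSize / 2);
        fog.zPosition = 100;   // draw above the world
        fog.name = @"fog";
        [self addChild:fog];
    }
}

// Later, to reveal a tile the player has uncovered:
// [fogTile runAction:[SKAction fadeOutWithDuration:0.5]];
```

A coarse grid like this is cheap to draw and trivial to persist, which is why it is the usual starting point for fog of war.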
Now, with the powerful devices of this era (iPhone 12), it is possible to use SKEmitterNode without losing too many frames per second.
You must build an SKS (SpriteKit Particle File) with this image:
Then, set your vars like this picture:
So, go to your code and add your particle with something like this example:
let fog = SKEmitterNode(fileNamed: "fog")! // fileNamed: returns an optional
fog.zPosition = 6
self.addChild(fog)
fog.position.y = self.frame.midY
fog.particlePositionRange.dx = self.size.width * 2.5
The reason for the question is: is it better to composite a texture at runtime (and make a sprite from that), or just use multiple sprites?
Example: Say you have a small image source for a repeating pattern, where you need many sprites to fill the view (as in a background).
An SKTexture is just image data. An SKSpriteNode is a display object.
The quick answer is no, you cannot draw an SKTexture to the screen without an SKSpriteNode.
This answer goes over that limitation: How do you set a texture to tile in Sprite Kit
However, I wanted to answer to give you an option to achieve your ultimate goal.
What you could do is use an SKNode as a container for however many SKSpriteNodes you need to create your background. Then using the SKView method textureFromNode you can create one single SKTexture from that SKNode, that you can use to create a single SKSpriteNode for your background.
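A rough sketch of that approach (the tile image name and the tile count are assumptions for illustration):

```objc
// Build the background out of individual tile sprites inside a container node.
SKNode *container = [[SKNode alloc] init];
for (int i = 0; i < 10; i++) {
    SKSpriteNode *tile = [SKSpriteNode spriteNodeWithImageNamed:@"tile"]; // assumed image
    tile.position = CGPointMake(i * tile.size.width, 0);
    [container addChild:tile];
}

// Flatten the container into a single texture, then a single sprite.
// Note: self.view is only available once the scene has been presented.
SKTexture *combined = [self.view textureFromNode:container];
SKSpriteNode *background = [SKSpriteNode spriteNodeWithTexture:combined];
[self addChild:background];
```

The win here is that after the one-time flattening, the background is a single draw instead of one per tile.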
Hopefully an upcoming version of SpriteKit for iOS 8 will have some better tiling options.
Update
Also, in doing some research tonight, since I had a need for this same functionality, I found this:
http://spritekitlessons.wordpress.com/2014/02/07/tile-a-background-image-with-sprite-kit/
which does something similar to what I was pondering. Gonna copy the code here, in case that page ends up gone:
CGSize coverageSize = CGSizeMake(2000,2000); //the size of the entire image you want tiled
CGRect textureSize = CGRectMake(0, 0, 100, 100); //the size of the tile.
CGImageRef backgroundCGImage = [UIImage imageNamed:@"image_to_tile"].CGImage; //change the string to your image name
UIGraphicsBeginImageContext(CGSizeMake(coverageSize.width, coverageSize.height));
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextDrawTiledImage(context, textureSize, backgroundCGImage);
UIImage *tiledBackground = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
SKTexture *backgroundTexture = [SKTexture textureWithCGImage:tiledBackground.CGImage];
SKSpriteNode *backgroundTiles = [SKSpriteNode spriteNodeWithTexture:backgroundTexture];
backgroundTiles.yScale = -1; //upon closer inspection, I noticed my source tile was flipped vertically, so this just flipped it back.
backgroundTiles.position = CGPointMake(0,0);
[self addChild:backgroundTiles];
I am trying to create a conveyor belt effect using SpriteKit like so
My first reflex would be to create a conveyor belt image bigger than the screen and then move it repeatedly forever with actions. But this does not seem right, because it depends on the screen size.
Is there any better way to do this?
Also, obviously, I want to put things (which would move independently) on the conveyor belt, so the node is an SKNode with child sprite nodes that move.
Update: I would like the conveyor belt to move "visually", so the lines move in a direction, giving the impression of movement.
Apply a physicsBody to all the sprites which you need to move on the conveyor belt, and set their affectedByGravity property to NO.
In this example, I am assuming that the spriteNode representing your conveyor belt is called conveyor. Also, all the sprite nodes which need to be moved have the string "moveable" as their name property.
Then, in your -update: method,
-(void)update:(CFTimeInterval)currentTime
{
[self enumerateChildNodesWithName:@"moveable" usingBlock:^(SKNode *node, BOOL *stop) {
if ([node intersectsNode:conveyor])
{
[node.physicsBody applyForce:CGVectorMake(-1, 0)];
//edit the vector to get the right force. This will be too fast.
}
}];
}
After this, just add the desired sprites at the correct positions and you will see them moving by themselves.
For the animation, it would be better to use an array of textures which you can loop on the sprite.
Alternatively, you can add and remove a series of small sprites with a sectional image and move them like you do the sprites which are travelling on the conveyor.
@akashg has pointed out a solution for moving objects across the conveyor belt; my answer covers how to make the conveyor belt itself look as if it is moving.
One suggestion, and my initial intuition, was to place a rectangle larger than the screen on the scene and move it repeatedly. On reflection I think this is not a nice solution, because if we wanted to place a conveyor belt in the middle, so that we see both its ends, this would not be possible without an extra clipping mask.
The ideal solution would be to tile the SKTexture on the SKSpriteNode and just offset this texture, but this does not seem to be possible with Sprite Kit (no tiling mechanism).
So basically what I'm doing is creating subtextures from a texture that looks like [tile][tile] (a repeatable tile, twice), and I show these subtextures one after the other to create an animation.
Here is the code:
- (SKSpriteNode *) newConveyor
{
    SKTexture *conveyorTexture = [SKTexture textureWithImageNamed:@"testTextureDouble"];
    SKTexture *halfConveyorTexture = [SKTexture textureWithRect:CGRectMake(0.5, 0.0, 0.5, 1.0) inTexture:conveyorTexture];
    SKSpriteNode *conveyor = [SKSpriteNode spriteNodeWithTexture:halfConveyorTexture size:CGSizeMake(conveyorTexture.size.width / 2, conveyorTexture.size.height)];
    NSArray *textureArray = [self horizontalTextureArrayForTexture:conveyorTexture];
    SKAction *moveAction = [SKAction animateWithTextures:textureArray timePerFrame:0.01 resize:NO restore:YES];
    [conveyor runAction:[SKAction repeatActionForever:moveAction]];
    return conveyor;
}
- (NSArray *) horizontalTextureArrayForTexture:(SKTexture *)texture
{
    CGFloat deltaOnePixel = 1.0 / texture.size.width;
    int countSubtextures = texture.size.width / 2;
    NSMutableArray *textureArray = [[NSMutableArray alloc] initWithCapacity:countSubtextures];
    CGFloat offset = 0;
    for (int i = 0; i < countSubtextures; i++)
    {
        offset = i * deltaOnePixel;
        SKTexture *subTexture = [SKTexture textureWithRect:CGRectMake(offset, 0.0, 0.5, 1.0) inTexture:texture];
        [textureArray addObject:subTexture];
    }
    return [NSArray arrayWithArray:textureArray];
}
Now, this is still not ideal, because it is necessary to make the two-tile image manually. We could also edit an SKTexture with a CIFilter transform, which could potentially be used to create this two-tile texture.
Apart from this, I think this solution is better because it does not depend on the size of the screen and is memory efficient; but in order to use it on the whole screen I would have to create more SKSpriteNode objects that share the same moveAction, since tiling is not possible with Sprite Kit according to this source:
How do you set a texture to tile in Sprite Kit.
I will try to update the code to make it possible to tile by using multiple SKSpriteNode objects.
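For reference, tiling the screen with several of those segments could look something like this (the anchor point and positioning are assumptions; each segment runs its own copy of the animation):

```objc
// Lay identical conveyor segments side by side until the screen is covered.
CGFloat x = 0;
while (x < self.size.width) {
    SKSpriteNode *segment = [self newConveyor];
    segment.anchorPoint = CGPointMake(0, 0.5); // left-aligned for easy tiling
    segment.position = CGPointMake(x, self.size.height / 2);
    [self addChild:segment];
    x += segment.size.width;
}
```

Since all segments animate through the same texture array at the same rate, they should stay visually in step.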
I have been facing an unknown error since yesterday. I am creating CCSprites and CCMenuItemImages, but they show a black background instead of the background image. Following is my code; I know it's fine because I have used it many times before.
[[CCSpriteFrameCache sharedSpriteFrameCache] addSpriteFramesWithFile:@"challenge_screen.plist"];
CCSprite *bg = [CCSprite spriteWithFile:@"ads.png"];
[bg setPosition:background.position];
// [bg setContentSize:CGSizeMake(100, 100)];
[self addChild:bg z:1000];
//CGSize windowSize = [[CCDirector sharedDirector] winSize];
CCMenuItemImage *coinMenuItem = [[CCMenuItemImage alloc] initWithNormalSprite:[CCSprite spriteWithSpriteFrame:[[CCSpriteFrameCache sharedSpriteFrameCache] spriteFrameByName:@"coin.png"]] selectedSprite:nil disabledSprite:nil block:^(id sender)
{
NSLog(@"I am Tapped");
}];
coinMenuItem.position = ccp(100, 100);
CCMenu *mainMenu = [CCMenu menuWithItems:coinMenuItem, nil];
mainMenu.position = CGPointZero;
[self addChild:mainMenu];
Attached is a screenshot.
Thanks in advance.
I am guessing that you are loading this sprite sheet (challenge_screen.plist and the associated texture file, which frequently is challenge_screen.png or challenge_screen.pvr.*) in a color mode that doesn't have transparency.
First, make sure that the associated texture file shows transparency itself. Maybe something messed with this particular texture.
Once you have checked that, if the associated texture is a .PNG, then you have to set the texture loading format in code, like this. You have to set the texture format before loading the texture itself (the texture loads as a side effect of adding the SpriteFrames to the cache):
[CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_RGBA4444];
[[CCSpriteFrameCache sharedSpriteFrameCache] addSpriteFramesWithFile:@"challenge_screen.plist"];
...
You can also try the kCCTexture2DPixelFormat_RGBA8888 mode if RGBA4444 produces banding with your graphics and if you are good regarding free memory.
On the other hand, if the texture is a PVR.*, then the format in which the texture loads is embedded in the file, and setting the texture format in code doesn't make a difference. You will then need to regenerate your sprite sheet using the appropriate format (through TexturePacker or similar).
Is your background in the sprite sheet? If so, try:
CCSprite *bg = [CCSprite spriteWithSpriteFrameName:@"ads.png"];
If it is the menu item and you know the code works, it must be an asset issue.
I am trying to create a painting feature in an iPad app for iOS. I have managed to get the colour to appear through touch, but I would like to recreate the multiply functionality of Photoshop, so the underlying black and white image continues to show through the colour. I started doing it with opacity, but going over the same spot repeatedly eventually hides the underlying image. I am using cocos2d; here is the sample code.
in header
CCSprite *background;
CCRenderTexture *target;
CCSprite *brush;
in init method:
background = [CCSprite spriteWithFile:@"background.png"];
background.position = ccp(self.size.width/2, self.size.height/2);
[self addChild: background z:-1];
target = [[CCRenderTexture alloc] initWithWidth:self.size.width height:self.size.height pixelFormat:kCCTexture2DPixelFormat_RGBA8888];
[target setPosition:ccp(self.size.width/2, self.size.height/2)];
brush = [[CCSprite spriteWithSpriteFrameName:@"brush_spot.png"] retain];
[brush setColor:ccRED];
in -(void) ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event method:
[target begin];
[brush setPosition:<CALCULATED POSITION>];
[brush visit];
[target end];
I have tried using different blend functions on the brush but nothing has managed to create the look I want. I did get the correct effect when adding a sprite directly on top of the background and setting its blend function to
[sprite setBlendFunc:(ccBlendFunc) { GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA }];
but when I try to use this blend function on the brush nothing appears on the screen.
Thanks
Iain
The blend function blends whatever is on top with whatever is behind it. When you render to a texture, you blend with whatever the texture is blanked to, which I guess is something with alpha 0. So you should draw to the texture using full opacity.
When you have filled the texture with color and attached it to a sprite, you can render it again and blend it with your background sprite. Then you can blend using multiply.
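As a sketch of that two-pass idea, using the question's `target` and `brush` objects (the white clear and the `paintLayer` sprite are assumptions; GL_DST_COLOR, GL_ZERO is the classic multiply blend):

```objc
// One-time setup: clear the paint layer to opaque white, so unpainted
// areas leave the background unchanged when multiplied (white * color = color).
[target beginWithClear:1.0 g:1.0 b:1.0 a:1.0];
[target end];

// On each touch move: paint into the render texture with an opaque brush.
[target begin];
[brush visit];
[target end];

// Show the painted texture as a sprite, multiplied over the background.
CCSprite *paintLayer = [CCSprite spriteWithTexture:target.sprite.texture];
paintLayer.position = ccp(self.size.width / 2, self.size.height / 2);
[paintLayer setBlendFunc:(ccBlendFunc){ GL_DST_COLOR, GL_ZERO }];
[self addChild:paintLayer z:1];
```

The key point is that the multiply happens only once, between the finished paint layer and the background, so repeated brush strokes no longer accumulate against the background.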