I am trying to create a background that flows with the game. However, the image isn't continuous; there is a gap between each repetition of the image. I want the image to loop seamlessly.
Here is the code that creates the sprite:
CCSprite *sprite = [CCSprite spriteWithFile:@"Image.png" rect:CGRectMake(0, 0, 960, 640)];
ccTexParams tp = {GL_NEAREST, GL_NEAREST, GL_REPEAT, GL_REPEAT};
[sprite.texture setTexParameters:&tp];
sprite.anchorPoint = ccp(1.0f/8.0f, 0);
sprite.position = ccp(screenW/8, 0);
And here is the method that updates the sprite's offset:
- (void) setOffsetX:(float)offsetX {
    if (_offsetX != offsetX) {
        _offsetX = offsetX;
        CGSize size = _sprite.textureRect.size;
        _sprite.textureRect = CGRectMake(_offsetX, 0, size.width, size.height);
    }
}
Any help, please?
Your image width needs to be a power of two, i.e. the width has to be 64, 128, 256, 512, etc., if you want it to repeat.
The gap you are seeing is where OpenGL has padded your texture with empty space to make it a power of two.
After trying it a few times, the best way is to ensure that the sprite dimensions are a power of 2. That way you can scale the layer and everything remains fine. If you don't plan on scaling the layer, then you can use sprites of any size and use this:
[[CCDirector sharedDirector] setProjection:kCCDirectorProjection2D];
http://ak.net84.net/iphone/gap-between-sprites-when-moving-in-cocos2d/
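For reference, here is a minimal sketch of the GL_REPEAT approach from the question, assuming the artwork has been resized (or padded) to a power-of-two size; the file name Background512.png, the _bg CCSprite ivar, and the scroll speed are illustrative, not taken from the original code:

// Minimal sketch (cocos2d 1.x-style API). Assumes the image was resized to a
// power-of-two size such as 512x512 so GL_REPEAT can wrap it cleanly.
- (void)onEnter {
    [super onEnter];

    _bg = [CCSprite spriteWithFile:@"Background512.png"
                              rect:CGRectMake(0, 0, 960, 640)];

    // GL_REPEAT only wraps without a seam when the texture is a power of two.
    ccTexParams tp = {GL_NEAREST, GL_NEAREST, GL_REPEAT, GL_REPEAT};
    [_bg.texture setTexParameters:&tp];

    _bg.anchorPoint = ccp(0, 0);
    [self addChild:_bg];
    [self scheduleUpdate];
}

- (void)update:(ccTime)dt {
    // Shift the texture rect each frame; with GL_REPEAT the image wraps
    // around instead of leaving empty space at the seam.
    CGRect rect = _bg.textureRect;
    rect.origin.x += 60.0f * dt;   // 60 points per second, illustrative speed
    _bg.textureRect = rect;
}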
I've never had this happen before and can't figure out what's going on. I suspect it might be auto-layout, but I don't see how. I have a "Compass" view that has several subviews it manages itself (not part of auto layout). Here's an example of their setup:
- (ITMView *) compass {
    if (!_compass){
        _compass = [ITMView new];
        _compass.backgroundColor = [UIColor blueColor];
        _compass.layer.anchorPoint = CGPointMake(.5, .5);
        _compass.translatesAutoresizingMaskIntoConstraints = NO;
        _compass.frame = self.bounds;
        __weak ITMCompassView *_self = self;
        _compass.onDraw = ^(CGContextRef ref, CGRect frame) { [_self drawCompassWithFrame:frame]; };
        [self addSubview:_compass];
    }
    return _compass;
}
I need to rotate the compass in response to heading changes:
- (void) setCurrentHeading:(double)currentHeading{
    _currentHeading = fmod(currentHeading, 360);
    double rad = (M_PI / 180) * _currentHeading;
    self.compass.transform = CGAffineTransformMakeRotation(rad);
}
The problem is that it looks like it's rotating in 3D rather than just around the z-axis, for some reason:
I'm not manipulating layer transforms on any other views. Does anyone have any idea why this is occurring?
Update
I checked the transform for all superviews. Every superview has an identity transform.
I logged the transform of the compass view before and after it was set for the first time. Before it was set, the transform was at identity, which is expected. After I set the transform to rotate 242.81 degrees (4.24 rad) I get:
[
-0.47700124155378526, -0.87890262006444575,
0.87890262006444575, -0.47700124155378526,
0, 0
]
Update 2
I checked CATransform3DIsAffine and it always returns YES. I double checked the layer transform and for a rotation of 159.7 (degrees) I get:
[
-0.935, 0.356, 0, 0,
-0.356, -0.935, 0, 0,
0, 0, 1, 0,
0, 0, 0, 1
]
That looks correct to me.
All of the transforms are correct but it's still not displaying correctly on screen.
Update 3
I removed my drawing code from the view and set the view background to blue. The view is definitely being rotated, squeezed, or something:
Some things to note:
The view displays correctly at 90, 180, 270 & 0 degrees.
The view disappears (turned on edge) at 45, 135, 225 & 315 degrees.
The view looks like it's being rotated in 3D as it progresses from 0 to 360 degrees.
I'm not sure why @matt withdrew his answer, but he was correct: the compass view had its frame reset every time I made a rotation, in the layoutSubviews method of my containing superview. I wasn't expecting this, thinking that a rotation wouldn't trigger layoutSubviews. The frame never changed, but the applied transform distorted the result as the frame was re-applied to the view. What threw me was that the result really looked like the view was being rotated in 3D, which led me down that particular rabbit hole. At least I know what to look for now.
Something I want to point out: the apparent 3D rotation was very particular. It rotated around each diagonal combination of {x, y} sequentially between each 90° quadrant of the unit circle. This makes sense if you think about how the frame would distort over those periods.
The solution is simple enough: store and remove the transform before setting the subview frame, then reapply it. However, because the rotation is applied very, very frequently (inside an animation block, no less) I added an escape to help reduce the load:
- (void) layoutSubviews{
    [super layoutSubviews];
    if (!CGRectEqualToRect(_lastLayout, self.bounds)){
        CGRect frame = SquareRectAndPosition(self.bounds, CGRectXCenter, CGRectYCenter);

        CGAffineTransform t;

        t = self.compass.transform;
        self.compass.transform = CGAffineTransformIdentity;
        self.compass.frame = frame;
        self.compass.transform = t;

        t = self.target.transform;
        self.target.transform = CGAffineTransformIdentity;
        self.target.frame = frame;
        self.target.transform = t;
    }
    _lastLayout = self.bounds;
}
I am working on a Sprite Kit project in which, in some cases, I need to display only half of an existing image.
I tried making the frame of the sprite smaller, but it just stretches the image.
Is there any possibility to use a mask or something in order to display only half of the sprite/image/texture?
So, in order to show only half of the image/texture/sprite, you need to use an SKCropNode. The sensible approach is to crop the sprite starting from its middle, not just to crop with a predefined size; this can be achieved by setting the mask node's position.
1) create an SKSpriteNode with that texture/image:
// Obj-C
SKSpriteNode *skelet = [SKSpriteNode spriteNodeWithImageNamed:imageName];
// Swift
let skelet = SKSpriteNode(imageNamed: imageName)
2) create the SKCropNode:
// Obj-C
SKCropNode * cropNode = [SKCropNode node];
// Swift
let cropNode = SKCropNode()
3) create the mask
// Obj-C
SKSpriteNode *mask = [SKSpriteNode spriteNodeWithColor:[UIColor blackColor] size:CGSizeMake(skelet.frame.size.width/2, skelet.frame.size.height)];
// Swift
let mask = SKSpriteNode(color: .black, size: CGSize(width: skelet.frame.size.width/2, height: skelet.frame.size.height))
Then set the mask position to the half you need (you need half of the skelet -> set the mask position to half "of the half of the skelet"):
// Obj-C
mask.position = CGPointMake(skelet.frame.size.width/4, 0);
// Swift
mask.position = CGPoint(x: skelet.frame.size.width/4, y: 0)
The division by 4 is because the center of the mask must not sit at the center of the skelet node, but must be moved by half of a half of the skelet's width (remember that the mask node has a default anchorPoint of (0.5, 0.5), so the zero point corresponds to the center of the skelet node).
4) add the needed elements to the crop node
// Obj-C
[cropNode addChild:skelet];
[cropNode setMaskNode:mask];
[self addChild:cropNode];
// Swift
cropNode.addChild(skelet)
cropNode.maskNode = mask
self.addChild(cropNode)
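Putting the four steps together, here is a minimal Objective-C sketch; the helper name rightHalfNodeWithImageNamed: is illustrative, not part of SpriteKit:

// Sketch: combines steps 1-4 and returns a node showing only the right half of the image.
- (SKCropNode *)rightHalfNodeWithImageNamed:(NSString *)imageName {
    SKSpriteNode *skelet = [SKSpriteNode spriteNodeWithImageNamed:imageName];
    SKCropNode *cropNode = [SKCropNode node];

    // The mask covers half the sprite's width and its full height.
    SKSpriteNode *mask = [SKSpriteNode spriteNodeWithColor:[UIColor blackColor]
                                                      size:CGSizeMake(skelet.frame.size.width / 2,
                                                                      skelet.frame.size.height)];

    // Example numbers: for a 200pt-wide image the mask is 100pt wide and centered at x = 50,
    // so it covers x in [0, 100] relative to the skelet's center -- the right half.
    mask.position = CGPointMake(skelet.frame.size.width / 4, 0);

    [cropNode addChild:skelet];
    cropNode.maskNode = mask;
    return cropNode;
}

The caller then adds the returned node to the scene, which corresponds to the [self addChild:cropNode] line in step 4.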
I want to draw a simple square the size of the full screen with the glDrawArrays method in cocos2d. When retina is disabled everything draws as expected, but when it's enabled everything is half as big as it should be (it seems the coordinate system in glDrawArrays is in pixels, not points).
Other draw functions work as expected, but since I am drawing complicated shapes I have to use glDrawArrays because it is much faster.
Any ideas how to solve this?
-(void) draw
{
    CGPoint box[4];
    CGPoint boxTex[4];

    CGSize winSize = [[CCDirector sharedDirector] winSize];
    //float boxSize = winSize.width;

    box[0] = ccp(0, winSize.height);             // top left
    box[1] = ccp(0, 0);                          // bottom left
    box[2] = ccp(winSize.width, winSize.height); // top right
    box[3] = ccp(winSize.width, 0);              // bottom right

    boxTex[0] = ccp(0, 1);
    boxTex[1] = ccp(0, 0);
    boxTex[2] = ccp(1, 1);
    boxTex[3] = ccp(1, 0);

    // textured background
    glBindTexture(GL_TEXTURE_2D, self.sprite.texture.name);
    glVertexPointer(2, GL_FLOAT, 0, box);
    glTexCoordPointer(2, GL_FLOAT, 0, boxTex);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}
Yes, the drawing is done in pixels, so in order to handle proper rendering on the retina display as well, you need to multiply your vertices by CC_CONTENT_SCALE_FACTOR():
for (int i = 0; i < 4; i++)
    box[i] = ccpMult(box[i], CC_CONTENT_SCALE_FACTOR());
CC_CONTENT_SCALE_FACTOR returns 2 on retina devices instead of 1, so using it should take care of the scaling.
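For context, here is a sketch of how that scaling slots into the draw method from the question, just before the GL calls; the texture coordinates are left untouched because they are already normalized to the 0..1 range:

// Scale only the vertex positions; boxTex stays in the 0..1 range.
for (int i = 0; i < 4; i++)
    box[i] = ccpMult(box[i], CC_CONTENT_SCALE_FACTOR());

glBindTexture(GL_TEXTURE_2D, self.sprite.texture.name);
glVertexPointer(2, GL_FLOAT, 0, box);
glTexCoordPointer(2, GL_FLOAT, 0, boxTex);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);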
I'm new to cocos2d and I want to draw lines in it, which I tried to implement from here.
I have a problem with the frame set in it. I set the background image with the code below:
CCSprite* background = [CCSprite spriteWithFile:imgPath rect:frame];
where imgPath is the path of the image file used for the CCSprite and frame is the view bounds. The CCSprite's frame is fine, and then I added:
[background addChild: [LineDrawingClass node]];
Then I added a CCRenderTexture instance to the LineDrawingClass with the following code snippets:
renderTexture.anchorPoint = ccp(0, 0);
renderTexture.position = ccp(self.width * 0.5f, self.height * 0.5f);
Then I added renderTexture to LineDrawingClass.
What I get is the CCSprite's background set to the correct frame with no problem, but the renderTexture's frame sits some five pixels below the CCSprite.
I also set the anchor point to
renderTexture.anchorPoint = ccp(0.5f, 0.5f);
but the offset in the renderTexture's origin.y remains.
Please see the attached image for reference. Can someone point out the mistake so that the renderTexture's frame sits exactly over the CCSprite's frame (it is currently offset by 5px in origin.y)?
Try setting the height of the texture a bit larger; say, if 480 is your height, set it to 580 or greater, whatever matches your requirements:
renderTexture.position = ccp(self.width * 0.5f, (self.height + 100) * 0.5f);
It's because of some orientation problem.
I have recently switched from using separate resource files to using a texture atlas. I experimented with replacing a few [CCSprite spriteWithFile] calls with [CCSprite spriteWithSpriteFrameName]. This works fine except for one thing: the CCSprite's texture dimensions are incorrect. Here is my code:
CGSize screenSize = [[CCDirector sharedDirector]winSize];
CCSprite * leftArrow = [CCSprite spriteWithSpriteFrameName:@"smallLeftArrow.png"];
CGSize arrowSize = [leftArrow texture].contentSizeInPixels;
CCSprite * selectedLA = [CCSprite spriteWithSpriteFrameName:#"smallLeftArrow.png"];
selectedLA.opacity = 100;
CCMenuItem * leftArrowItem = [CCMenuItemSprite itemFromNormalSprite:leftArrow selectedSprite:selectedLA target:[HelloWorldLayer sharedHelloWorldLayer] selector:@selector(doLeft)];
leftArrowItem.position = ccp(- arrowSize.width, arrowSize.height);
CCLOG(@"arrow.width = %f arrow.height %f", arrowSize.width, arrowSize.height);
This is the output on the debugger:
2011-08-29 18:27:04.239 Zero Gravity Combat[454:307] arrow.width = 1024.000000 arrow.height 1024.000000
The size of the entire texture atlas is 1024 x 1024. I am using TexturePacker to create the texture atlas. Is there a way to fix this, or do I need to create a texture manually to determine the dimensions?
You obviously want to get the size of the arrow sprite. That's simple: leftArrow.contentSize.
What you've done is get the size of the underlying texture, which is 1024x1024. The texture's size is not always the same as the sprite's size.
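In code, the corrected lookup would be something along these lines:

// Sketch: contentSize is the size of the sprite frame in points,
// not the size of the whole texture atlas.
CCSprite *leftArrow = [CCSprite spriteWithSpriteFrameName:@"smallLeftArrow.png"];
CGSize arrowSize = leftArrow.contentSize;
CCLOG(@"arrow.width = %f arrow.height = %f", arrowSize.width, arrowSize.height);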