I seem to have run into a strange problem with CCParticleSystem on the iPhone.
I have a laser that is being shot across the screen from left to right. I added a particle effect to give the laser more of a railgun look, using the "emmas sharing" particle effect from Particle Designer.
Here is the code to send the laser across the screen:
-(void)fireLaserCannonAddon
{
    if( _ship == nil || _ship.dead ) return;

    CGSize winSize = [CCDirector sharedDirector].winSize;

    // Grab the next pooled laser sprite and place it at the ship.
    shipLaserCannon = [_laserCannonArray nextSprite];
    [shipLaserCannon stopAllActions];
    shipLaserCannon.position = ccpAdd(_ship.position, ccp(shipLaserCannon.contentSize.width / 2, -shipLaserCannon.contentSize.height));
    [shipLaserCannon revive];

    // Grab the next pooled particle system and (re)start it.
    CCParticleSystemQuad *laserEffect = [_laserEffect nextParticleSystem];
    [laserEffect resetSystem];

    // Move the laser across the screen, then hide it and stop the effect.
    [shipLaserCannon runAction:[CCSequence actions:
        [CCMoveBy actionWithDuration:0.5 position:ccp(winSize.width, 0)],
        [CCCallFuncN actionWithTarget:self selector:@selector(invisNode:)],
        [CCCallFunc actionWithTarget:self selector:@selector(endLaserEffects)],
        nil]];
}
And the code to set the particle system effect to the laser's position:
-(void)updateLaserEffects:(ccTime)dt
{
    // Keep every active effect pinned to the laser's current position.
    for( CCParticleSystemQuad *laserEffect in _laserEffect.array )
    {
        laserEffect.position = shipLaserCannon.position;
    }
}

-(void)endLaserEffects
{
    for( CCParticleSystemQuad *laserEffect in _laserEffect.array )
    {
        [laserEffect stopSystem];
    }
}
If you open up the "emmas sharing" effect in Particle Designer, the effect is the same as when you click and drag across the screen. This works perfectly on the iPad and the iPad simulator. However, on my iPhone 3GS and in the iPhone simulator (both SD and retina), the emitted particles seem to be "carried" along with the laser. It's not identical to setting the PositionType to kCCPositionTypeGrouped (where the emitted particles stay in that circle shape), but it's a mix between kCCPositionTypeGrouped and kCCPositionTypeFree: the particles emit off the laser, but they are also dragged a bit behind it instead of staying where they were emitted, like they do in the Particle Designer simulator and on the regular iPad. It looks as if the laser is creating its own layer with the particle effect on it, with that "layer" lagging behind it.
I thought that maybe the laser was moving too fast, but even when slowed down, it had the same effect.
This "bug" also creates another small problem, since it's being "carried" with the laser, when the laser is off the screen and then taken out, the remnants of the last emitting particles are visible on the bottom left of the screen, since I'm sure its because the emitted particles are still following the position.x of the laser (which it shouldn't be doing, only the base of it is supposed to) and since the laser is gone, it defaults to it's default set position. However, I do not have this problem on the iPad / iPad simulator.
BTW, this isn't limited to just the "emmas sharing" particle effect; it seems to do the same for all the other effects.
Has anyone else ever had similar issues with using CCParticleSystems on a moving object for the iPhone?
Any helpful input is greatly appreciated!
OK, so after some messing around, I found out what was causing all this.
I originally had the CCParticleSystem's scale set to 1.0 (the original scale) for the iPad and 0.5 for the iPhone. I changed the scale for the iPhone to 1.0 and everything worked like it should, just a lot bigger, but it worked. I really didn't want to have two different particle files for the same effect just because of screen size, so I figured I'd just scale up to 2.0 for the iPad while leaving 1.0 on the iPhone. Lo and behold, the iPad now suffered the same weird-looking effect I had seen on the iPhone, but much more extreme.
Looks like I don't have much of a choice now but to keep two different files for the same effect, but I'm relieved I found out what was causing this and can save a few hairs from leaving prematurely.
I think scaling an effect is fine as long as it's not dynamically following an object, like in my case.
I don't know if this would be considered a bug, since I'm sure it comes down to the math cocos2d uses internally, which the scaling throws off.
TL;DR: Scaling a particle effect up or down causes this weird behaviour when the effect is following an object's position. Don't re-scale particle effects that dynamically follow an object's position; if the effect just sits in one spot, it's fine.
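For anyone curious, a minimal sketch of the setup that triggers it, assuming standard cocos2d CCNode / CCParticleSystem properties (the 0.5 / 2.0 values are the ones from my description above):
// Any node-level scale other than 1.0 on an emitter that is repositioned
// every frame (see updateLaserEffects: above) produced the dragging artifact.
CCParticleSystemQuad *laserEffect = [_laserEffect nextParticleSystem];
laserEffect.positionType = kCCPositionTypeFree;   // particles should stay where they were emitted
laserEffect.scale = (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) ? 2.0f : 1.0f;
// ^ with 1.0 on both devices the effect behaved correctly, just at one size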
I have an issue with particle systems which, in rare cases, are occluded incorrectly. The particle system you see in the screenshots is a sphere (with an invisible material, material transparency = 0.0) that emits particles from its surface, at around 250 particles per second - no magic - and the particle system works as it should 99% of the time.
You can also see a floor (which is an SCNPlane) that is very large, around 100m x 100m. The occlusion happens when the camera is flying by and the viewing angle changes a little bit, because the camera moves smoothly. Depending on the camera angle it can happen - as you see in the second image - that the particle system is partially occluded in a wrong manner, as if it were behind the horizon - but it is not - it hovers 2m above the floor and has a radius of 1m.
Has anyone run into a similar issue? Is there something that could be done to make this render correctly in all cases (from all viewing angles)?
Sometimes the particle system even disappears completely, e.g. when the camera looks down on it directly from 20m above.
(The scene uses physically based rendering in SceneKit - the background is a simple skybox.)
You asked if anyone ran into a similar issue?
I can answer yes!
Depending on the point of view (camera position), and the object on which the SCNParticleSystem is attached, I'm getting weird occlusions of the emitted particles.
I have no SCNPlane, but I have a large SCNSphere around the scene showing a 360 video. If I remove the sphere, the bug doesn't occur anymore.
It might be a regression in iOS 14.x and macOS 11.2, as the same application running under iOS 13.6.1 doesn't show the problem!
In case it helps somebody: I had a similar problem and spent a while trying different particle system settings.
One workaround was to increase the "Rendering order" of the node that contains the particles, but the particles still disappear if you change the camera orientation.
By chance, I discovered that the bug happens when I add one specific node to the scene. The one difference I found is that this node had a material with the transparency mode "Dual layer", which I had used to make a transparent texture.
I changed the mode to "Default" and it helped.
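In case it helps, here is roughly what the two workarounds above look like in code - a sketch only, where particleNode and offendingMaterial are placeholder names, and SCNTransparencyModeDefault requires iOS 11 / macOS 10.13 or later:
// Workaround 1: render the particle node after the large transparent geometry.
particleNode.renderingOrder = 100;

// Workaround 2 (the one that helped me): avoid the "Dual layer" transparency
// mode on the material that triggers the bad occlusion.
offendingMaterial.transparencyMode = SCNTransparencyModeDefault;   // instead of SCNTransparencyModeDualLayer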
[Screenshot: Xcode screen]
I'm playing around with SpriteKit in Xcode 6, iOS 8 beta 5. Everything is laid out and working perfectly in the iPhone 4S simulator; however, when switching to the 5S, the elements at the bottom of the screen are cut off.
It was my understanding that the bottom-left corner of the iPhone screen should be CGPoint(0, 0), but after checking by printing the touch coordinates to the console, the lowest point I could tap in the left corner was around (5, 44). Is there something wrong in my scene setup that's causing this?
No changes have been made to the GameViewController file, and the problem persists even after I strip down the GameScene file.
Can anyone at least point me in the right direction with this?
Adding the following code will fix your problem (code is in Swift):
scene.scaleMode = SKSceneScaleMode.ResizeFill
Now if you want to know why this fixes your problem, what your problem actually is, and how to handle multiple resolutions – I suggest you continue reading.
There are three things that can impact the position of nodes in your scene.
1) Anchor Point
Make sure your scene's anchor point is set to (0,0), bottom left. By default the scene's anchor point is (0,0), so I'm assuming that is not causing the issue.
2) Size
Check the size of your scene. I typically make my scene size match the size of the device (i.e. iPad, iPhone 4-inch, iPhone 3.5-inch), then I place another layer in the scene for storing my nodes. This lets me do a scrolling effect for devices with smaller resolutions, but it depends on your game of course. My guess is that your scene size might be set to 320 x 480, which could be causing the positioning problems on your iPhone 5s.
3) Scale Mode
The scale mode has a huge effect on the positioning of nodes in your scene. Make sure you set the scale mode to something that makes sense for your game. The scale mode kicks in when your scene size does not match the size of the view, so its purpose is to tell Sprite Kit how to deal with that situation. My guess is that you have the scene size set to 320 x 480 and the scene is being scaled to match the iPhone 5 view, which will cause positioning problems identical to what you described. Below are the various scale modes you can set for your scene.
SKSceneScaleMode.AspectFill
The scaling factor of each dimension is calculated and the larger of the two is chosen. Each axis of the scene is scaled by the same scaling factor. This guarantees that the entire area of the view is filled, but may cause parts of the scene to be cropped.
SKSceneScaleMode.AspectFit
The scaling factor of each dimension is calculated and the smaller of the two is chosen. Each axis of the scene is scaled by the same scaling factor. This guarantees that the entire scene is visible, but may require letterboxing in the view.
SKSceneScaleMode.Fill
Each axis of the scene is scaled independently so that each axis in the scene exactly maps to the length of that axis in the view.
SKSceneScaleMode.ResizeFill
The scene is not scaled to match the view. Instead, the scene is automatically resized so that its dimensions always match those of the view.
Conclusion
It looks like you want to remove the scaling of your scene so that positions in the scene match the actual positions in the view. You can either set your scene's size to match the view size, in which case no scaling will take place, or set your scene's scale mode to ResizeFill, which will always make the scene's size match your view's size and won't scale anything. In general I would stay away from any scaling and instead adjust the interface and the scene size to best suit each device. You may also want to add zoom and/or scrolling to give devices with smaller resolutions the same field of view.
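If it helps, here is a rough Objective-C sketch of the first option (making the scene size match the view size so no scaling occurs). It assumes you create the scene programmatically rather than from a .sks file, and GameScene is the class from your project:
- (void)viewWillLayoutSubviews {
    [super viewWillLayoutSubviews];

    SKView *skView = (SKView *)self.view;
    if (skView.scene == nil) {
        // The view has its final size at this point, so the scene can match it exactly.
        GameScene *scene = [GameScene sceneWithSize:skView.bounds.size];
        scene.scaleMode = SKSceneScaleModeResizeFill;   // keeps scene size == view size from then on
        [skView presentScene:scene];
    }
}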
But what if I want to scale my scene?
If however you need to scale your scene, but you still want positions to be relative to the view (i.e. You want (0,0) to be the bottom left of screen even when scene is cutoff) then see my answer here
Additional Info
See answer here for sample code showing how I layout nodes dynamically.
See answer here for more details about scaling to support multiple devices.
If you want to preserve the size of your scene (usually desired when you work with a fixed size and coordinate system), you might want to add padding to either side of your scene. This removes the letterboxing and preserves all the physics and dynamics of your app on any platform.
I created a small Framework to help with this:
https://github.com/Tokuriku/tokuriku-framework-stash
Just:
Download the ZIP file for the Repository
Open the "SceneSizer" sub-folder
Drag the SceneSizer.framework "lego block" in your project
Make sure that the Framework is Embedded and not just Linked
Import the Framework somewhere in your code: import SceneSizer
And you're done, you can now call the sizer Class with:
SceneSizer.calculateSceneSize(#initialSize: CGSize, desiredWidth: CGFloat, desiredHeight: CGFloat) -> CGSize
Just in case, try doing CMD+1; it worked for me. Some of the elements were cut off because they were simply not displayed in the Simulator - I stress this, it is just a simulator feature (and a bug if you ask me; I wasted hours trying to solve this). The CMD+2 and CMD+3 views can sometimes hide parts of the scene.
The screenshot below was taken from the 3.5-inch simulator.
These are a bunch of UIButtons, with the border created programmatically like:
btn.layer.cornerRadius = btn.frame.size.width / 2;
I don't know why, but now all fonts and UIButtons in the app are pixelated; almost everything is pixelated.
I checked every setting in Xcode.
I tried cleaning the project, then cleaned the DerivedData folder.
I tried building the app on another machine.
I tried the app on a real device. Same problem.
Nothing has worked yet.
An easy way to get pixellation on retina devices is rasterizing layers without setting the right rasterizationScale.
view.layer.shouldRasterize = YES;
view.layer.rasterizationScale = UIScreen.mainScreen.scale;
Without the second line, stuff will look fine for non-retina devices but it'll look awful on retina devices.
...hard to say if that's the problem you're seeing without more of the code, but it's a common enough bug that it merits posting.
Whether or not you should actually be rasterizing is a separate question...there are performance tradeoffs of which to be aware.
It could be that the resulting frame isn't aligned on an even integer, i.e. moving/changing the width is causing the frame to be something like (100.5, 50.0, 50.0, 50.0). When you are drawing on a half-pixel boundary, some of the drawing routines will blur things to try and make them appear in the correct place. I would print out the frame after the animation and check:
NSLog(#"%#", NSStringFromCGRect(yourButton.frame));
If you see any non-integer values, use one of the floor() functions to modify the resulting frame in order to snap it to a boundary.
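For example, a minimal sketch of snapping the frame back onto whole points after the animation (yourButton is the same hypothetical view as in the NSLog line above):
// Round the origin down to whole points so drawing stays on pixel boundaries.
CGRect frame = yourButton.frame;
frame.origin.x = floor(frame.origin.x);
frame.origin.y = floor(frame.origin.y);
yourButton.frame = frame;

// Or, in most cases, simply:
yourButton.frame = CGRectIntegral(yourButton.frame);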
I had the same problem with my UILabel (when I changed its frame).
Before using the floor() method:
And after:
Using OpenAL, one can set the distance model:
alDistanceModel(AL_LINEAR_DISTANCE_CLAMPED);
And the position of a sound effect:
float globalRefDistance = 125.0f;
float globalMaxDistance = 1250.0f;
ALfloat alPos[] = {pos.x, pos.y, 0.0f};
alSourcefv(soundId, AL_POSITION, alPos);
alSourcef(soundId, AL_REFERENCE_DISTANCE, globalRefDistance);
alSourcef(soundId, AL_MAX_DISTANCE, globalMaxDistance);
This attenuates and pans the sound nicely, except when the listener is close to the source and steps back and forth to its left and right. In that case, the panning flips quickly from left to right; there isn't really a spot where the sound plays panned in the center.
How can I define a range/window/cone where OpenAL puts a 3d sound right in the center, without panning?
I want to be able to walk up to the sound from the left, hearing it gradually fade in from the left channel. Then be in both channels for awhile. Then gradually fade out in the right channel.
I've tried messing with setting the sound to be directional, but it doesn't seem to do the trick:
alPos[0] = 0.0f; alPos[1] = 0.0f; alPos[2] = 1.0f;
alSourcefv(soundId, AL_DIRECTION, alPos);
alSourcef(soundId, AL_CONE_INNER_ANGLE, 180.0f);
alSourcef(soundId, AL_CONE_OUTER_ANGLE, 240.0f);
Instead of using OpenAL's AL_POSITION, I ended up tracking the listener position and all sound positions by hand, then manually applying volume rolloff and panning each tick.
This allows a certain window/width/cone of space where an effect is panned fully center. It sounds much better.
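A rough sketch of the per-tick math, assuming 2D positions and placeholder names (srcPos, listenerPos, centerWindow, refDist, maxDist) that are not part of OpenAL:
// Distance and linear rolloff, computed by hand each tick.
float dx = srcPos.x - listenerPos.x;
float dy = srcPos.y - listenerPos.y;
float dist = sqrtf(dx * dx + dy * dy);
float gain = 1.0f - (dist - refDist) / (maxDist - refDist);
gain = fmaxf(0.0f, fminf(1.0f, gain));

// Pan stays at 0 (dead center) inside the window, then ramps towards +/-1.
float pan = 0.0f;
if (fabsf(dx) > centerWindow) {
    pan = fminf(1.0f, (fabsf(dx) - centerWindow) / (maxDist - centerWindow));
    if (dx < 0.0f) pan = -pan;
}

// Hand the results back to OpenAL: gain directly, pan as a unit vector relative
// to the listener (at unit distance the built-in attenuation is effectively neutral
// with the default reference distance).
alSourcei(soundId, AL_SOURCE_RELATIVE, AL_TRUE);
alSourcef(soundId, AL_GAIN, gain);
ALfloat relPos[] = { pan, 0.0f, -sqrtf(fmaxf(0.0f, 1.0f - pan * pan)) };
alSourcefv(soundId, AL_POSITION, relPos);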
Also note that I switched from AL_LINEAR_DISTANCE_CLAMPED back to the default inverse clamped mode. For some reason the linear clamped mode was causing any effect positioned with a negative X value to pan much too quickly, regardless of reference or maximum distance. This only happened on Mac builds, so I think the OpenAL Mac implementation has a panning bug when using the linear clamped model.
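For reference, switching back to the default model is just the stock OpenAL inverse clamped constant:
alDistanceModel(AL_INVERSE_DISTANCE_CLAMPED);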
I wrote some code for a Mac (Cocoa) app, where the animations are smooth and lovely. However, I copied it as-is to the iOS project and ran it on "The New iPad", the resolutionary thingy, and for some reason the layer animation is jagged. The layer suddenly moves a few pixels up, then stops for a second, then suddenly pops up a few more pixels; it is like a very low frame rate animation.
Interestingly, running the same code on the old iPad 1 gave smooth results!! x( .. which made me realize it's a retina display problem.
I am doing a faint (slow) animation, not moving the layers around much (for example, moving layer.position.x by 10 pixels over a period of 9.1 sec - which hints that the layer is not interpolating the sub-pixel positions).
I tried increasing the speed (reducing the duration) by a factor of four, and it animated without problems. :/ But faint (slow) animations have problems.
Any Ideas?
If my question is vague, this might help:
Move a layer 10 pixels over a 10 sec duration. On the iPad, it looks great (60 FPS presumably).
On the new iPad (retina), it is as if it's running at 10 FPS (or something like that)!!
To animate properly on Retina screens, an extra line of code was missing .. >.<
if ([subLayer respondsToSelector:@selector(setContentsScale:)]) {
    subLayer.contentsScale = [[UIScreen mainScreen] scale];
}
From this awesome guy: Retina display core graphics font quality
(I don't think I should close the question, since this title is more appropriate, as it addresses all retina issues with core animation).