I'd like to mask a model with a cube. I have two SCNNodes in my scene, a cube and a GameModel, and they partially intersect. How can I achieve a mask effect with the cube? Some kind of SceneKit shader modifier, I guess?
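One direction that might work, sketched here with heavy assumptions (the `boxMin`/`boxMax` uniform names are my own, the Metal renderer is assumed, and the cube must stay axis-aligned in world space): pass the cube's world-space bounds into a surface shader modifier and make every fragment of the model that falls inside them transparent.

```swift
import SceneKit

// Sketch only: hide the model's fragments inside the cube's world-space box.
// `boxMin`/`boxMax` are hypothetical custom uniforms bound below via KVC.
let surfaceModifier = """
float3 boxMin;
float3 boxMax;
#pragma transparent
#pragma body
// _surface.position is in view space; convert it to world space.
float4 worldPos = scn_frame.inverseViewTransform * float4(_surface.position, 1.0);
if (all(worldPos.xyz > boxMin) && all(worldPos.xyz < boxMax)) {
    _surface.transparent = float4(1.0); // masked out
}
"""
modelNode.geometry?.shaderModifiers = [.surface: surfaceModifier]

// Bind the cube's bounds (converted to world space) as the custom uniforms.
let (minV, maxV) = cubeNode.boundingBox
modelNode.geometry?.setValue(NSValue(scnVector3: cubeNode.convertPosition(minV, to: nil)),
                             forKey: "boxMin")
modelNode.geometry?.setValue(NSValue(scnVector3: cubeNode.convertPosition(maxV, to: nil)),
                             forKey: "boxMax")
```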
I'm looking for the most efficient way of drawing a 2D background in Metal. This requires rendering a textured rectangle.
The basic geometry example shows how to draw a triangle. Is there an easy, non-bloated way to draw a rectangle (a polygon with four corners)?
The Basic Texturing sample draws a textured rectangle.
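If you just want the geometry without pulling in a whole sample, the usual trick is to describe the rectangle as four vertices and draw them as a triangle strip, so no index buffer is needed. A minimal Swift sketch, assuming `device` and `renderEncoder` come from your existing Metal setup and that your vertex function reads the position from xy and the texture coordinate from zw:

```swift
import Metal
import simd

// A quad as a 4-vertex triangle strip: position (x, y) + texture coord (u, v).
// Strip order: bottom-left, bottom-right, top-left, top-right.
let quadVertices: [SIMD4<Float>] = [
    SIMD4(-1, -1, 0, 1),
    SIMD4( 1, -1, 1, 1),
    SIMD4(-1,  1, 0, 0),
    SIMD4( 1,  1, 1, 0),
]

let vertexBuffer = device.makeBuffer(bytes: quadVertices,
                                     length: MemoryLayout<SIMD4<Float>>.stride * quadVertices.count,
                                     options: [])
renderEncoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
```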
My question is related to the SceneKit framework.
I have a geometry shader modifier that performs a deformation involving a rotation of the whole geometry. In my specific case it is important to deform and then rotate each vertex inside the shader. Everything works great, but after the deformation finishes I can no longer interact with the node, since it is now located at a different orientation and position: the geometry was deformed and flipped. As I understand it, I need to tell the host node that the geometry was updated by the shader.
My question is: how do I sync transformation changes between the geometry shader modifier and the scene graph, i.e. the transformation matrix of the host SCNNode instance?
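As far as I know, a shader modifier only changes what gets rasterized; the scene graph's model of the node (hit-testing, physics, bounding volumes) never sees the deformation, so there is nothing to sync back automatically. One workaround sketch, assuming the rotation the shader applies is known on the CPU side, is to apply that same rotation to the host node:

```swift
import SceneKit

// Hypothetical: `shaderRotation` mirrors the rotation the modifier performs.
let shaderRotation = simd_quatf(angle: .pi, axis: SIMD3<Float>(0, 0, 1))

// Apply the same rotation to the node so hit-testing and physics
// line up with what is actually drawn...
node.simdOrientation = shaderRotation * node.simdOrientation

// ...and compensate for it inside the shader (e.g. via a custom uniform),
// so the rotation is not applied twice.
```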
So the premise of my problem is that I apply an alpha mask to a circular CCSprite with a fragment shader. I then want to rotate the CCSprite while keeping the alpha mask in the same position. If I apply the mask and then rotate the sprite, the mask rotates with the sprite. If I rotate the sprite and then apply the mask, the mask ends up in the rotated position, because I base it on the sprite's texture, which never rotates.
Is there a way I can get the CCTexture2D to rotate so that the mask stays in place while the sprite is rotated, or do I need to apply a custom vertex shader to the mask or the CCSprite texture? If custom, what would the matrix transformations look like?
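A sketch of the math involved, assuming the mask is sampled in the sprite's texture space (Swift is used here only to illustrate; the sign convention depends on cocos2d's coordinate system): rotate the mask's sampling coordinate by the negative of the sprite's angle around the texture center, so the mask cancels out the sprite's rotation.

```swift
import Foundation
import simd

// Counter-rotate a texture coordinate around the texture center so the mask
// appears fixed on screen while the sprite rotates by `angle` (radians).
// Flip the sign of `angle` if the mask rotates the wrong way in practice.
func maskUV(_ uv: SIMD2<Float>, spriteAngle angle: Float) -> SIMD2<Float> {
    let center = SIMD2<Float>(0.5, 0.5)
    let c = cos(-angle)
    let s = sin(-angle)
    let p = uv - center
    // Standard 2x2 rotation matrix applied to the centered coordinate.
    return SIMD2<Float>(c * p.x - s * p.y, s * p.x + c * p.y) + center
}
```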
How can a particle emitter be set to produce shapes (solid shapes/Core Graphics?) instead of texture images (PNGs)?
In the documentation under Manipulating the Particle Emitter, it looks like only a texture image can be set.
Only an SKTexture can be used, but that doesn't stop you from using an SKShapeNode to create the shape and then creating an SKTexture from it via the SKView method textureFromNode:.
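A minimal sketch of that approach in Swift, using texture(from:), the Swift spelling of textureFromNode::

```swift
import SpriteKit

// Build the shape you want the particles to use.
let shape = SKShapeNode(circleOfRadius: 12)
shape.fillColor = .orange
shape.strokeColor = .clear

// Snapshot the shape into a texture; `view` is the SKView presenting the scene.
if let particleTexture = view.texture(from: shape) {
    let emitter = SKEmitterNode()
    emitter.particleTexture = particleTexture
    emitter.particleBirthRate = 40
    emitter.particleLifetime = 2
    scene.addChild(emitter)
}
```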
I am building a game in XNA, but this question could apply to most 3D frameworks.
My scene contains several point lights. Each light has a position, intensity, color, and radius. The scene also contains several objects that can be lit by the point lights.
I have successfully used a simple forward renderer to light these objects with up to 8 point lights at a time.
However, I also want to try some other effects, such as lit objects where the light slowly fades away each frame.
I want to use a multipass approach:
Each lit object has a texture and a lightmap (a render texture)
Then, each frame and for each object:
Clear the lightmap (or fade it)
Draw each light to the lightmap, with falloff etc.
Render the object using texture * lightmap in the pixel shader
My problem is related to texture coordinates. The lightmap should use the same UV mapping as the texture map, so that when I draw the objects, the color is:
tex2D(TextureSampler, uv) * tex2D(LightMapSampler, uv)
Or, put another way: in the lightmap pixel shader I need to compute the world position of each lightmap pixel, in order to compute its distance to each light.
What coordinate transforms are required to achieve this?
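One common answer, sketched under the assumption that the mesh has a unique, non-overlapping UV layout: rasterize the lightmap pass in texture space. The vertex shader positions each vertex in clip space by its UV coordinate and passes the world-space position along as an interpolant, so the pixel shader knows the world position of every lightmap texel and can measure its distance to each light:

```latex
% Lightmap pass, per vertex (W is the object's world matrix):
\mathbf{p}_{\mathrm{clip}}  = \left(\, 2u - 1,\; 1 - 2v,\; 0,\; 1 \,\right),
\qquad
\mathbf{p}_{\mathrm{world}} = \mathbf{W}\,\mathbf{p}_{\mathrm{local}}
```

The final pass then renders the mesh with its usual World-View-Projection transform and samples both maps with the same uv, exactly as in the tex2D line above.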