Strange problem with ID3DXSprite Draw method - directx

I use the ID3DXSprite interface to draw GUI controls in my app. I have a 512x512 texture containing all the controls and call sprite->Draw(), passing the exact RECT of each control. Everything works fine except for a strange bug on only one(!) machine.
Normally, the control looks like this:
And on that strange machine:
Moreover, some controls look fine, but many of them look like this one, with corrupted edges and ... well, you can see the difference :(
The second machine has an Intel(R) G41 Express Chipset video adapter.
Please, if anyone has ANY idea why this can happen - help!
Regards, Anthony.

It looks to me like you have mipmaps in the sprite's texture and the card is choosing the wrong mipmap level. Set the mipmap level explicitly to 1 and see if that helps.
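For reference, one way to do that when the atlas is loaded through D3DX (the file name and device pointer below are placeholders for whatever you actually use) is to request exactly one mip level at creation time:
// Sketch only: load the 512x512 GUI atlas with a single mip level, so the
// driver can never pick a downscaled mip when the sprite RECTs are drawn.
LPDIRECT3DTEXTURE9 guiTexture = NULL;
HRESULT hr = D3DXCreateTextureFromFileEx(
    pd3dDevice,                  // your IDirect3DDevice9*
    L"gui_atlas.png",            // placeholder file name
    D3DX_DEFAULT, D3DX_DEFAULT,  // keep the source width/height
    1,                           // MipLevels = 1: no mip chain
    0,                           // Usage
    D3DFMT_UNKNOWN,              // keep the source format
    D3DPOOL_MANAGED,
    D3DX_FILTER_NONE,            // no resampling on load
    D3DX_FILTER_NONE,            // no mip filtering (there are no mips)
    0, NULL, NULL,
    &guiTexture);
If the texture is created some other way, the equivalent is simply to make sure only one mip level exists for the GUI atlas, then check whether the corruption on that machine goes away.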

Related

Stage3D iOS Antialiasing on AIR 24

With the AIR 24 release we are now able to set anti-aliasing on Stage3D, but there are some issues with it. Can anybody explain how to use it the right way without changing the entire project's code?
The issue I have is that anti-aliasing works great and there are no more jagged edges, but there are rendering issues: I guess some texture normals are being inverted, and when using an Occlusion Material there are some jagged material shadows...
The next thing I notice is when drawing a wireframe globe with line segments: the lines are visible on the globe all the time, no matter whether you put an object in front of it or not.
So intersecting line segments with other materials doesn't work at all, and the lines stay on the screen forever.
Please help if you know any trick that fixes these issues.
Thanks
Just to add some more information: the issue seems to happen when shareContext = true. Without Starling there is antialiasing and the lineSegments are rendered at the correct depth. It would be interesting to see whether it works with a shared context other than Starling, to isolate the issue. If I find an answer I will come back and post it; it would be nice to get this working. Any idea of the performance hit on mobile of having a second instance of Away3D? Layering that way might be a dirty workaround.
EDIT:
The anti-aliasing problem with the line segments only occurs with a shared context. The View3D class does not seem to have its antiAlias value set anywhere, and when I forced it to a value of 2, all hell broke loose.
EDIT #2:
Meshes appear above the line segments; Sprite3D objects do not.

Graphics on an iOS device are all distorted (Unity)

Right now I'm developing an application for iOS with Unity3D, and I have no idea what's happening with my graphics. All textures are absolutely distorted, and I've been struggling with this for a long time with no success.
I believe Unity compresses textures before converting the project to iOS, but I really can't figure out how to change this setting or turn it off. I'd appreciate any help. Here's a screenshot of my Texture Settings; I've set it up according to a Ray Wenderlich lesson.
P.S.: I've tried using "Point" for Filter Mode, but it made the texture even worse.
I would suggest disabling mipmaps to see if this solves your problem, and making sure you have the best quality settings in Edit > Player Settings > Quality.
As an aside, how large is your button image? You're overriding the max size to be 2048 for what seems to be a button. That sounds a little excessive.
By "distorted" you mean "blurred"? Try to uncheck "Generate Mip Maps".

Strange rendering behavior with transparent texture in WebGL

I've been writing a little planet generator using Haxe + Away3D, deploying to HTML5/WebGL, but I'm having a strange issue when rendering my clouds. I have the planet mesh, and then a slightly larger cloud mesh at the same position.
I'm using a perlin noise function to generate the planetary features and the cloud formations, writing them to a bitmap and applying the bitmap as the texture. Now, strangely, when I deploy this to iOS or C++/OSX, it renders exactly how I wanted it to:
Now, when I deploy to WebGL, it generates an identical diffuse map, but renders as:
(The above was at a much lower resolution, due to how often I was reloading the page. The problem persisted at higher resolutions.)
The clouds are there, and the edges look alright: wispy and translucent. But the inside is opaque and seems to be rendered differently (each pixel is the same color; only the alpha channel changes).
I realize this likely has something to do with how the code is ultimately compiled/generated by Haxe, but I'm hoping it's something simple like a render setting or blend mode I'm not setting. Since I'm not even sure exactly what is happening, I don't know where to look.
Here's the diffuse map being produced. I overlaid it on red so the clouds would be viewable.
BitmapData.perlinNoise does not work on html5.
You should implement it yourself, or you could use a pre-rendered image.
public function perlinNoise (baseX:Float, baseY:Float, numOctaves:UInt, randomSeed:Int, stitch:Bool, fractalNoise:Bool, channelOptions:UInt = 7, grayScale:Bool = false, offsets:Array<Point> = null):Void {
	openfl.Lib.notImplemented ("BitmapData.perlinNoise");
}
https://github.com/openfl/openfl/blob/c072a98a3c6699f4d334dacd783be947db9cf63a/openfl/display/BitmapData.hx
Also, WebGL-Inspector is very useful for debugging WebGL apps. Have you used it?
http://benvanik.github.io/WebGL-Inspector/
Well then, did you upload that image from a ByteArray?
Lime once allowed accessing a ByteArray with the array index operator, even though that shouldn't work on js. This has been fixed in the latest version of Lime to avoid mistakes.
I used the __get and __set methods instead of [] to access a byte array.
Away3D itself might also be the cause of this issue, because the backend code is generated from different source files depending on the target you use.
For example, the byteArrayOffset parameter of Texture.uploadFromByteArray is supported on html5 but not on native.
If Away3D is the cause of the problem, I'm not sure for now which part of the code is responsible.
EDIT: I've also experienced a problem with OpenFL's latest WebGL backend (I think legacy OpenFL doesn't have this problem). OpenFL's sprite renderer was changing colorMask (and possibly other OpenGL render states) without my knowledge! This occurred because my code and OpenFL's sprite renderer were actually using the same OpenGL context. I got rid of the problem by manually disabling OpenFL's sprite renderer.
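If you need to keep the shared context, another option is to defensively re-set the GL state you depend on right before your own rendering each frame. A rough, untested sketch of the idea, written here against plain OpenGL ES 2.0 (the same state flags exist in WebGL; under OpenFL you would issue the equivalent calls through its GL wrapper):
#include <GLES2/gl2.h>

// Re-assert the render states a co-resident 2D sprite renderer is most
// likely to have changed behind your back on the shared context.
static void restoreSceneRenderState(void)
{
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);    // write RGBA again
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_TRUE);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  // ordinary alpha blending
    glDisable(GL_SCISSOR_TEST);
}
Which states actually need restoring depends on what the other renderer touches, so treat the list above as a starting point only.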

WebGL with CocoonJS - Duplicate triangle

I have very simple test code that draws a moving triangle. In Chrome it works fine; however, on an Android 4.1 device with the CocoonJS 1.4.1 launcher (which funnels the WebGL calls to OpenGL ES 2.0), a random effect pops up:
The triangle is drawn, but quite often a slightly translated triangle is also drawn (more precisely, the extra triangle seems to be a replica of a previously drawn one, but the offset is not consistently the same). The triangle needs to move at some minimal speed for the effect to show (or maybe the replica is just hidden when movement is slow). The tint of the replica seems to be a bit different (even though the fragment shader color is constant), but that might be some alpha magic.
Other CocoonJS WebGL demos work fine on the device; however, they don't exhibit fast movement. OpenGL benchmarks run fine.
Drawing multiple triangles shows the same effect. Even though gl.clear is used, it seems like part of a previous buffer shines through. Have you seen anything similar? Any ideas?
Thank you so much for your report; this is a known bug and we are working to solve the issue.
By the way, changing the setInterval time like this: setInterval(loop, 16); should solve the problem temporarily.

Clear single viewport in DirectX 10

I am preparing to start on a C++ DirectX 10 application that will consist of multiple "panels" displaying different types of information. I have had some success experimenting with multiple viewports on one RenderTargetView. However, I cannot find a definitive answer regarding how to clear a single viewport at a time. These panels (viewports) in my application will overlap in some areas, so I would like to be able to draw them from "bottom to top", clearing each viewport as I go so that the drawing from lower panels doesn't show through on the higher ones. In DirectX 9, it seems there was a Clear() method on the device object that would clear only the currently set viewport. DirectX 10 uses ClearRenderTargetView(), which clears the entire render target, and I cannot find any other option that is equivalent to the DirectX 9 behavior.
Is there a way in DirectX 10 to clear only a viewport/rectangle within the drawing area? One person speculated that the only way may be to draw a quad in that space. It seems that another possibility would be to have a separate RenderTargetView for each panel, but I would like to avoid that since it requires other redundant resources, such as separate depth/stencil buffers (unless that is a misunderstanding on my part).
Any help will be greatly appreciated! Thanks!
I would recommend using one render target per "viewport", and compositing them together using quads for the final view. I know of no way to scissor a clear in DX 10.
Also, according to the article here, "An array of render-target views may be passed into ID3D10Device::OMSetRenderTargets, however all of those render-target views will correspond to a single depth stencil view."
Hope this helps.
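A rough sketch of the per-panel render target setup (the panel size, format, and transparent clear colour below are placeholders, and error handling is omitted):
// One of these per panel: an offscreen texture that can be rendered into
// and later sampled when compositing the final view.
D3D10_TEXTURE2D_DESC desc = {};
desc.Width            = panelWidth;    // placeholder panel size
desc.Height           = panelHeight;
desc.MipLevels        = 1;
desc.ArraySize        = 1;
desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
desc.SampleDesc.Count = 1;
desc.Usage            = D3D10_USAGE_DEFAULT;
desc.BindFlags        = D3D10_BIND_RENDER_TARGET | D3D10_BIND_SHADER_RESOURCE;

ID3D10Texture2D*          panelTex = NULL;
ID3D10RenderTargetView*   panelRTV = NULL;
ID3D10ShaderResourceView* panelSRV = NULL;
device->CreateTexture2D(&desc, NULL, &panelTex);
device->CreateRenderTargetView(panelTex, NULL, &panelRTV);
device->CreateShaderResourceView(panelTex, NULL, &panelSRV);

// Each frame, per panel: clear and draw only that panel's content.
float clearColor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
device->ClearRenderTargetView(panelRTV, clearColor);
device->OMSetRenderTargets(1, &panelRTV, NULL);  // or a matching depth-stencil view

// Finally, bind the swap chain's render target and draw one textured quad
// per panel (sampling panelSRV), bottom to top, with alpha blending enabled.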
Could you not just create a shader, together with the appropriate blend-state settings and a square mesh (or other shape of mesh), and use it to clear the area you want to clear? I haven't tried this, but I think it can be done.
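If you go that route, the blend state for such a "clear quad" could look roughly like this (untested sketch; the quad geometry and the solid-colour shader are left out):
// Opaque write: the quad simply overwrites whatever was in its area.
D3D10_BLEND_DESC bd = {};
bd.BlendEnable[0]           = FALSE;               // no blending at all
bd.SrcBlend                 = D3D10_BLEND_ONE;
bd.DestBlend                = D3D10_BLEND_ZERO;
bd.BlendOp                  = D3D10_BLEND_OP_ADD;
bd.SrcBlendAlpha            = D3D10_BLEND_ONE;
bd.DestBlendAlpha           = D3D10_BLEND_ZERO;
bd.BlendOpAlpha             = D3D10_BLEND_OP_ADD;
bd.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;

ID3D10BlendState* clearQuadBlend = NULL;
device->CreateBlendState(&bd, &clearQuadBlend);

float blendFactor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
device->OMSetBlendState(clearQuadBlend, blendFactor, 0xFFFFFFFF);
// Then set the panel's viewport and draw a full-viewport quad in the clear
// colour; depth could be reset the same way with a depth-stencil state whose
// DepthFunc is D3D10_COMPARISON_ALWAYS and depth writes enabled.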
