Screenshot on iPhone 6s:
Unity3D version: 5.3.5f1
Texture format: TrueColor (not compressed)
iOS device: iPhone 6s, iOS 9.3.4
This issue only occurred on our trunk branch recently; other branches do not have it.
What we are sure about:
The shaders are correct.
The meshes and the texture coordinates (UVs) of each vertex are correct.
What we can also be sure of from the Unity Profiler:
Memory in use is not too large (compared with the other branches).
By using GPU Frame Capture, I found that the texture is incorrect in the captured frame:
GPU Frame Capture:
Has anyone ever solved this issue?
Related
I'm using OpenGL ES to display CVPixelBuffers on iOS. The OpenGL pipeline uses the fast texture upload APIs (CVOpenGLESTextureCache*). When running my app on the actual device the display is great, but on the simulator it's not the same (I understand that those APIs don't work on the simulator).
I noticed that, when using the simulator, the pixel format is kCVPixelFormatType_422YpCbCr8. I'm trying to extract the Y and UV components and use glTexImage2D to upload them, but I'm getting some incorrect results. For now I'm concentrating on the Y component only, and the result looks as if the image were half the expected width and duplicated, if that makes sense.
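For reference, kCVPixelFormatType_422YpCbCr8 is '2vuy', which interleaves bytes as Cb Y'0 Cr Y'1 (two bytes per luma sample), so uploading the raw buffer as GL_LUMINANCE at the full width reads chroma bytes as luma; that is consistent with the "half width, duplicated" look. Below is a minimal sketch of de-interleaving the Y plane before upload. It assumes that byte order, that pixelBuffer (a placeholder name) is already locked with CVPixelBufferLockBaseAddress, and it is not a definitive implementation:

    #import <CoreVideo/CoreVideo.h>
    #import <OpenGLES/ES1/gl.h>
    #include <stdint.h>
    #include <stdlib.h>

    // De-interleave the Y component of a '2vuy' buffer (Cb Y'0 Cr Y'1).
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t stride = CVPixelBufferGetBytesPerRow(pixelBuffer);
    const uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);

    uint8_t *yPlane = malloc(width * height);
    for (size_t row = 0; row < height; row++) {
        const uint8_t *src = base + row * stride;
        for (size_t col = 0; col < width; col++) {
            yPlane[row * width + col] = src[col * 2 + 1];  // Y sits at odd offsets
        }
    }
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // rows are tightly packed now
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, (GLsizei)width, (GLsizei)height,
                 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, yPlane);
    free(yPlane);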
I would like to know from someone who has successfully displayed YUV422 video frames on the iOS simulator whether I'm on the right track, and/or whether I can get some pointers on how to solve my problem.
Thanks!
I am developing an iOS app and an Android app using Adobe AIR and Flash CS6. The app contains lots of animations. Since bitmap images do not give good quality, I have kept the images in vector form only. It runs fine on Android devices, but when I publish it on an iOS device many animations lag. How can I solve this without affecting the quality of my animations? I am using AIR SDK version 4.0 and the GPU rendering method. Any help would be appreciated.
There might be a few things you could try:
use TweenMax/TweenLite for your animations, as the GreenSock library is optimized for performance
try setting cacheAsBitmap to true on the vector you're animating
convert vectors to cached bitmap data (http://esdot.ca/site/2012/fast-rendering-in-air-cached-spritesheets)
try to see if using "direct" mode for rendering yields better performance; from what I've experienced, GPU mode is not well suited for vectors
In my photo editing iOS app I want to scale down large images before sending them into my filter pipeline. I'm using the CILanczosScaleTransform filter of Apple's Core Image framework for that.
Now for some images I get a black screen in the result. I enabled the "OpenGL ES Error" breakpoint in Xcode and found that a GL_INVALID_VALUE error is thrown inside Core Image, specifically when it's trying to create an intermediate texture for the Lanczos filter.
After countless experiments I found that it only happens if the resulting image would have a width greater than 2048 pixels.
So, for example, images taken with the built-in camera in landscape mode and scaled down to 4 MP (2369x1769) cause this error. Portrait images of the same size (height > 2048) work without problems.
If I use the CIAffineTransform filter instead, I don't get that error. But I'd prefer to use Lanczos since it yields better results.
I tested on an iPad Mini (non-retina, iOS 7.1) and an iPhone 5s (iOS 7.0.6) with the same result.
Any ideas on what is causing this issue? I searched online but was not able to find any documented or undocumented restrictions concerning image sizes for this filter.
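Until the root cause is clear, a hedged workaround sketch: use Lanczos only while the target width stays at or below 2048, and fall back to a plain affine scale otherwise. inputImage and targetSize are placeholder names, and the 2048 cutoff is simply the empirical limit observed above:

    CGFloat scale = targetSize.width / inputImage.extent.size.width;
    CIImage *scaledImage;
    if (targetSize.width <= 2048.0) {
        // Lanczos gives the better downscaling quality.
        CIFilter *lanczos = [CIFilter filterWithName:@"CILanczosScaleTransform"];
        [lanczos setValue:inputImage forKey:kCIInputImageKey];
        [lanczos setValue:@(scale) forKey:kCIInputScaleKey];
        [lanczos setValue:@(1.0) forKey:kCIInputAspectRatioKey];
        scaledImage = lanczos.outputImage;
    } else {
        // Wider targets trip the GL_INVALID_VALUE error, so use a
        // plain affine scale for those instead.
        scaledImage = [inputImage imageByApplyingTransform:
                          CGAffineTransformMakeScale(scale, scale)];
    }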
I am a Cocos2d game developer. I am developing a game using retina-display images.
I have created texture files with and without the -hd suffix using Zwoptex.
I have added those Zwoptex plist texture files in the app delegate, like [[CCSpriteFrameCache sharedSpriteFrameCache] addSpriteFramesWithFile:@"Background.plist"];.
I have enabled retina display with [director enableRetinaDisplay:YES];.
I have used the PNG files from the plist wherever I want, using CCSprite *background = [CCSprite spriteWithSpriteFrameName:@"sample.png"];.
All the PNG files I have included are high-resolution images in both sizes, 960x640 and 480x320. But for no apparent reason the images look blurry and fuzzy when I run the game in the simulator or on the iPhone. Can anyone please help me solve this issue?
(The following image was posted as an example in a comment.)
cocos2d applies anti-aliasing to sprites by default. You need to turn that off:
[background.texture setAliasTexParameters];
Hope this helps.
The screenshot you posted (I took the liberty of adding it to your question) shows that it was taken from the iPhone Simulator and not the iPhone (Retina) Simulator. Therefore it will not use the HD images.
With the iPhone Simulator running go to the Hardware -> Device menu and select iPhone (Retina) as the device. Then restart your app.
Note also that the iPhone Simulator will only render the game with a color depth of 16 bits, regardless of the settings in cocos2d or on your Mac. The iOS Simulator renderer is limited to 16-bit rendering for performance reasons (it only uses software rendering, no hardware acceleration). Only by looking at the game on an actual device can you make judgement calls about image quality.
To test whether the game is actually loading the HD assets or for some reason just loads the SD images, try running it without the SD images. If the game tries to load the SD images, it will cause an error. If not, it is loading the HD images and the "blur issue" has a different cause. You could also log which files are loaded by adding an NSLog statement to the CCFileUtils class method fullPathFromRelativePath, which performs the file-name changes to load -hd images whenever possible (see the sketch below).
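For example (a hedged sketch; fullPathFromRelativePath is a class method on CCFileUtils in cocos2d 1.x, but the exact class and method names can vary between versions):

    // Somewhere in your own code, check which file the engine resolves:
    NSString *resolved = [CCFileUtils fullPathFromRelativePath:@"sample.png"];
    NSLog(@"cocos2d resolved path: %@", resolved);
    // On a Retina device or the Retina simulator this should log a
    // path containing "-hd"; if it doesn't, the SD asset is being used.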
You'll find that even a minuscule amount of scaling or rotation applied to a sprite can make it look blurred, so check whether you happen to do that. Any change in blend modes (using ccBlendFunc) could also cause blurred images. Also check that your images are fully opaque (opacity == 255). A quick checklist in code follows below.
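A hedged sketch of those checks (cocos2d 1.x API; background is the sprite from the question):

    background.scale = 1.0f;     // any non-1.0 scale can blur a sprite
    background.rotation = 0.0f;  // so can even a slight rotation
    background.opacity = 255;    // make sure the sprite is fully opaque
    // If you set a custom ccBlendFunc anywhere, try removing it as well.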
I'm currently working on an OpenGL ES 1.1 app for the iPad.
It's running at the full 768x1024 iPad resolution, with textures, polygons, and the works,
but only at about 30 FPS! (Not fast enough for my purposes.)
I'm pretty sure it's not my code, because when I lowered the resolution the FPS increased, eventually reaching the normal 60 at iPod touch resolution.
Is anyone else encountering this FPS slowdown?
Should I render at a reduced size and then scale up? Also, would upgrading to OpenGL ES 2.0 increase speed?
Any guidance is much appreciated!
The iPad has the exact same GPU as the iPhone 3GS, so you would probably expect worse fullscreen performance on the iPad due to having to push 5 times as many pixels.
If this is the case, then using scaling is probably the best solution. After all, even console developers have to do it!
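To illustrate the scaling approach under OpenGL ES 1.1, here is a rough sketch using the OES framebuffer extension, not a drop-in implementation; the 512x1024 size is an arbitrary power-of-two choice, since ES 1.1 render-target textures are safest at power-of-two dimensions:

    #import <OpenGLES/ES1/gl.h>
    #import <OpenGLES/ES1/glext.h>

    // One-time setup: a reduced-size color texture to render into.
    GLuint fbo, colorTex;
    glGenTextures(1, &colorTex);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 1024, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    glGenFramebuffersOES(1, &fbo);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo);
    glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                              GL_TEXTURE_2D, colorTex, 0);

    // Each frame: render the scene at the reduced size...
    glViewport(0, 0, 512, 1024);
    // ...draw the scene here, then rebind the screen framebuffer and
    // draw colorTex as a fullscreen textured quad at 768x1024.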
I had the same problem when porting an iPhone game to the iPad. A few optimizations raised the FPS from 5-6 to 20+:
using VBOs (see the sketch after this list)
reducing per-fragment operations as much as possible (fog, blending, multi-texturing)
moving some operations to the CPU (lights, for example)
using multi-texturing instead of multiple passes with blending
improving the culling algorithm (now we have a better CPU)
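To expand on the first point, a minimal VBO sketch under OpenGL ES 1.1; vertices and vertexCount stand in for your own mesh data, and the Vertex layout is an assumption:

    #import <OpenGLES/ES1/gl.h>
    #include <stddef.h>  // offsetof

    typedef struct {
        GLfloat position[3];
        GLfloat texCoord[2];
    } Vertex;

    // One-time setup: copy the vertex data into a GPU-side buffer.
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, vertexCount * sizeof(Vertex), vertices,
                 GL_STATIC_DRAW);

    // Per frame: point the fixed-function arrays at offsets into the VBO
    // instead of client memory, so the data stays on the GPU.
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, sizeof(Vertex),
                    (const GLvoid *)offsetof(Vertex, position));
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex),
                      (const GLvoid *)offsetof(Vertex, texCoord));
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);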