I am constantly updating some of the scene's textures with new images.
The problem is that uploading is synchronous and texImage2D takes ~100 ms. It takes that long even if the texture is not used while rendering the next frame, or if rendering is switched off entirely.
I am wondering: is there any way to upload texture data asynchronously?
Additional conditions:
I should mention that there is an old texture which can stay active until the upload of the new one to the GPU has finished.
The solution is to use texSubImage2D and upload the image to the GPU in small portions, as sketched below. Once the upload has finished, activate your new texture and delete the old one.
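A minimal sketch of that approach, assuming a 2048x2048 RGBA image that is already decoded into a raw byte buffer (the function names and the 64-rows-per-frame chunk size are my own; in WebGL the same two calls are gl.texImage2D with null data and gl.texSubImage2D):

    #include <GLES2/gl2.h>
    #include <stddef.h>

    #define TEX_W 2048
    #define TEX_H 2048
    #define ROWS_PER_FRAME 64  /* tune so one chunk stays well under a frame */

    static GLuint new_tex;
    static int rows_done;

    void begin_streaming_upload(void)
    {
        glGenTextures(1, &new_tex);
        glBindTexture(GL_TEXTURE_2D, new_tex);
        /* Allocate storage only; no pixel data is uploaded yet. */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, TEX_W, TEX_H, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        rows_done = 0;
    }

    /* Call once per frame with the full decoded image; returns 1 when complete. */
    int continue_streaming_upload(const unsigned char *pixels)
    {
        int rows = TEX_H - rows_done;
        if (rows > ROWS_PER_FRAME) rows = ROWS_PER_FRAME;
        glBindTexture(GL_TEXTURE_2D, new_tex);
        glTexSubImage2D(GL_TEXTURE_2D, 0,
                        0, rows_done,              /* x, y offset into the texture */
                        TEX_W, rows,               /* size of this chunk           */
                        GL_RGBA, GL_UNSIGNED_BYTE,
                        pixels + (size_t)rows_done * TEX_W * 4);
        rows_done += rows;
        return rows_done == TEX_H;  /* then swap in new_tex and delete the old one */
    }

The old texture keeps being rendered until the upload reports completion, which is exactly the condition described in the question.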
is there any way to upload texture data asynchronously?
No, not in WebGL 1.0. There might be in WebGL 2.0, but that's not out yet.
Some things you might try:
Make it smaller
What are you uploading? Video? Can you make it smaller?
Have you tried different formats?
WebGL converts from whatever format the image is stored in to the format you request. For example, if you load a .jpg the browser might decode it as an RGB image. If you then upload it with gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, it has to convert the image to RGBA before uploading (more time).
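For instance, if the decoded data is already RGB, requesting GL_RGB avoids that conversion pass (a hedged C-style sketch; width, height and rgbPixels are placeholder names, and the WebGL equivalents are gl.RGB and gl.UNSIGNED_BYTE):

    /* Source and requested format match, so no RGB -> RGBA conversion is needed. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, rgbPixels);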
Do you have UNPACK_FLIP_Y set to true?
If so, WebGL has to flip your image before uploading it.
Do you have UNPACK_COLORSPACE_CONVERSION_WEBGL set to BROWSER_DEFAULT_WEBGL?
If not, WebGL may have to re-decompress your image.
Do you have UNPACK_PREMULTIPLY_ALPHA_WEBGL set to false or true?
Depending on how the browser normally stores images, it might have to convert the image to the format you're requesting.
Images have to be decompressed
Are you sure your time is spent in "uploading" vs "decompressing"? If you switch to uploading a TypedArray of the same dimensions, does it speed up?
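One hedged way to run that test, sketched in C-style GL (in WebGL you would pass a Uint8Array to gl.texImage2D instead): upload an already-decoded buffer of the same dimensions and time just the transfer.

    #include <GLES2/gl2.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Time the pure upload of pre-decoded RGBA bytes (no image decode involved). */
    void time_raw_upload(int w, int h)
    {
        unsigned char *buf = calloc((size_t)w * h, 4);
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, buf);
        glFinish();  /* force the driver to finish the transfer before timing */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        printf("upload took %.1f ms\n",
               (t1.tv_sec - t0.tv_sec) * 1e3 + (t1.tv_nsec - t0.tv_nsec) / 1e6);
        glDeleteTextures(1, &tex);
        free(buf);
    }

If this is much faster than the call with an image source, the time is going into decoding/conversion rather than uploading.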
I'm using Unity3D (4.3.1) and NGUI to create a 2D iOS (iPad) app. I also need to use a lot of full-screen images (about 100 images sized 2048x1536), for a Gallery, for example.
Right now I import them as GUI type, with an iPhone override at max size 2048 and compression quality: normal, and I display them with a UITexture using the Unlit/Transparent shader.
However, after about 40 images, Xcode terminates the app with a memory error. So the question is: what image type, and which import settings, do I need to make them work?
I'm using an iPad 3 as a test device with Xcode 5.1.1. I'll be thankful for any help!
Also I need to use a lot of full screen images (about 100 images with size 2048x1536), for Gallery for example.
I think your 2048x2048 images use a very large amount of memory. A 2048x2048 texture at 4 bytes per pixel uses 2048 x 2048 x 4 = 16 MB of memory, so this case needs about 1600 MB! A normal application shouldn't go much over about 200 MB.
So I think you need to reduce memory use (the worked numbers after this list make the arithmetic concrete):
Remember that this texture is going to be expanded to 2048x2048 by Unity ( http://www.opengl.org/wiki/NPOT_Texture ). So if you reduce the file to 1500x1000, your application still uses a 2048x2048 texture in memory. But if you can reduce it to 1024x1024, do it: a 1024 texture uses just 4 MB of memory.
If you can use texture compression, use it. PVRTC 4-bit ( https://docs.unity3d.com/Documentation/Manual/ReducingFilesize.html ) compression makes the file size 1/8 of true color, and the in-memory size shrinks by the same factor (about 2 MB instead of 16 MB for a 2048x2048 texture).
If your application doesn't display all the images at once, load them dynamically and use thumbnails.
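To make the arithmetic above concrete, a small worked example (plain C, numbers only):

    #include <stdio.h>

    /* Uncompressed RGBA32 is 4 bytes per pixel; PVRTC 4-bit is half a byte. */
    int main(void)
    {
        long w = 2048, h = 2048, count = 100;
        long rgba   = w * h * 4;  /* 16 MB per image */
        long pvrtc4 = w * h / 2;  /*  2 MB per image */
        printf("RGBA32: %ld MB each, %ld MB total for %ld images\n",
               rgba >> 20, (rgba >> 20) * count, count);
        printf("PVRTC4: %ld MB each\n", pvrtc4 >> 20);
        printf("1024x1024 RGBA32: %ld MB each\n", (1024L * 1024 * 4) >> 20);
        return 0;
    }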
Good luck:D
If you want to make a gallery-like app to render photos, maybe you can try a different approach:
create two large editable textures and fill their texels with image data (they must be editable, otherwise you will not have write access to their image data).
if you still have memory issues, or if you want to use less memory, you can use several smaller textures as tiles and render parts of the image into each smaller texture. Remember to configure the texture borders correctly (or avoid using border texels) to prevent wrapping problems; a sketch follows this list.
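A hedged GL-style sketch of that border point: clamp each tile's wrap mode so edge texels never bleed across from the opposite side (tileTex, tileW/tileH and tilePixels are placeholder names):

    glBindTexture(GL_TEXTURE_2D, tileTex);
    /* Clamp so sampling at a tile edge never wraps around to the far edge. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tileW, tileH, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, tilePixels);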
The best way is to use a smaller texture. On an iPad you will need a magnifying glass to really appreciate the difference between 1024x1024 and larger textures. Remember that an iPad screen is smaller (7"~10") than a computer's, and with filtering enabled it is really hard to tell the difference.
If you still need to manage such a large texture for some other reason (zooming or similar), I recommend one of the following approaches:
split the texture into layers with an alpha channel (transparency): usually backgrounds can be rendered at lower resolutions.
also split the texture into blocks: most textures have repeating patterns.
use compression.
Always avoid using such large textures if possible.
Can Texture Packer's pvr.ccz files be used in a non-cocos2d app? I'd like to use them with Core Animation. Is this possible?
Out of the box? No.
You can write your own .pvr.ccz texture loader, or adapt the .pvr.ccz texture loading code from cocos2d. The key part is really just the compression format (ccz), which uses the zlib compression provided by the iOS/Mac SDK. After inflating the compressed file, you end up with a regular PVR texture that you can use in Core Animation; a sketch of the inflate step follows.
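A hedged C sketch of that inflate step, modeled on cocos2d's ZipUtils (the exact header layout below is my recollection of the CCZ format; verify it against ZipUtils.c before relying on it):

    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>
    #include <zlib.h>
    #include <arpa/inet.h>  /* ntohl: CCZ header fields are big-endian */

    typedef struct {
        char     sig[4];       /* "CCZ!"                         */
        uint16_t compression;  /* 0 == zlib                      */
        uint16_t version;
        uint32_t reserved;
        uint32_t len;          /* uncompressed size, big-endian  */
    } CCZHeader;

    /* Returns a malloc'ed buffer holding the raw .pvr data, or NULL. */
    unsigned char *ccz_inflate(const unsigned char *ccz, size_t ccz_size,
                               size_t *out_size)
    {
        if (ccz_size < sizeof(CCZHeader) || memcmp(ccz, "CCZ!", 4) != 0)
            return NULL;
        const CCZHeader *h = (const CCZHeader *)ccz;
        uLongf len = ntohl(h->len);
        unsigned char *pvr = malloc(len);
        if (!pvr) return NULL;
        /* The zlib stream starts right after the 16-byte header. */
        if (uncompress(pvr, &len, ccz + sizeof(CCZHeader),
                       ccz_size - sizeof(CCZHeader)) != Z_OK) {
            free(pvr);
            return NULL;
        }
        *out_size = len;
        return pvr;  /* feed this PVR blob to your texture loader */
    }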
I want to load 4-channel texture data from a file in iOS, so I consider the texture as a (continuous) map
[0,1]x[0,1] -> [0,1]x[0,1]x[0,1]x[0,1]
If I use the .png file format, Xcode/iOS treats the file as an image, and so multiplies each RGB component by alpha (premultiplied alpha), corrupting my data. How should I solve this? Possible solutions are:
use two textures with components rgb (3-channel)
postdivide alpha
use another file format
Of these, I consider the best solution to be using another file format. The GL-compressed file format (PVRTC?) is not platform independent outside Apple/PowerVR hardware and seems to be low quality (4 bits per pixel) (reference).
EDIT:
If my own answer below is true, it is not possible to get the 4-channel data of PNGs in iOS. Since OpenGL is about creating images rather than presenting them, it should be possible to load 4-channel data in some way. PNG is a file format for images (compression depends on all 4 channels, but compression of one channel is independent of the others), so one may argue that I should use another file format. So which other compressed file format, one that is easy to read/integrate in iOS, should I use?
UPDATE: "combinatorial" mentioned a way to load 4-channel non-premultiplied textures, so I had to give him the correct answer. However, that solution had some restrictions I didn't like. My next question is then "Access raw 4-channel data from png files in iOS" :)
I think it is a bad library design not making it possible to read 4 channel png data. I don't like systems trying to be smarter than myself.
Since you considered PVRTC, using GLKit could be an option. It includes GLKTextureLoader, which allows you to load textures without premultiplying alpha. Use, for example:
+ (GLKTextureInfo *)textureWithContentsOfFile:(NSString *)fileName options:(NSDictionary *)textureOperations error:(NSError **)outError
and passing an options dictionary containing:
GLKTextureLoaderApplyPremultiplication = NO
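A short usage sketch in Objective-C (error handling mostly elided; the file name is a placeholder, and a current EAGL context is required):

    #import <GLKit/GLKit.h>

    NSError *error = nil;
    NSString *path = [[NSBundle mainBundle] pathForResource:@"myTexture"
                                                     ofType:@"png"];
    GLKTextureInfo *info =
        [GLKTextureLoader textureWithContentsOfFile:path
                                            options:@{ GLKTextureLoaderApplyPremultiplication : @NO }
                                              error:&error];
    if (info) {
        glBindTexture(info.target, info.name);  // straight-alpha texture, ready to use
    }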
You can simply request that Xcode not 'compress' your PNG files. Click your project in the top left, select the 'Build Settings', find 'Compress PNG Files' and set the option to 'No'.
As for your other options: postdividing isn't a bad solution, but obviously you'll lose overall precision, and I believe both TIFF and BMP are also supported. PVRTC is PowerVR specific, so it's not Apple-specific, but it's also not entirely platform independent, and it is specifically designed as a lossy compression that's trivial to decompress with little effort on the GPU. You'd generally increase your texture resolution to compensate for the low bit-per-pixel count.
You should use libpng to load PNG data without premultiplied colors.
It is written in C and compiles for iOS.
I've had similar problems on Android and also had to use a third-party library to load PNG files with non-premultiplied colors. A sketch using libpng follows.
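A minimal sketch using libpng's simplified API (available in libpng 1.6+), which hands back straight-alpha RGBA bytes:

    #include <png.h>
    #include <stdlib.h>
    #include <string.h>

    /* Returns a malloc'ed, tightly packed RGBA buffer, or NULL on failure. */
    unsigned char *load_png_rgba(const char *path, int *w, int *h)
    {
        png_image image;
        memset(&image, 0, sizeof image);
        image.version = PNG_IMAGE_VERSION;

        if (!png_image_begin_read_from_file(&image, path))
            return NULL;

        image.format = PNG_FORMAT_RGBA;  /* straight alpha, 8 bits per channel */
        unsigned char *pixels = malloc(PNG_IMAGE_SIZE(image));
        if (!pixels || !png_image_finish_read(&image, NULL, pixels,
                                              0 /* packed rows */, NULL)) {
            free(pixels);
            png_image_free(&image);
            return NULL;
        }
        *w = (int)image.width;
        *h = (int)image.height;
        return pixels;  /* ready for glTexImage2D(..., GL_RGBA, GL_UNSIGNED_BYTE, pixels) */
    }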
This is an attempt to answer my own question.
It is not possible to load non-premultiplied .png files.
The option kCGImageAlphaLast is a valid option, but it does not give a valid combination for CGBitmapContextCreate (reference). It is, however, a valid option for CGImageRefs.
What the Xcode build setting COMPRESS_PNG_FILES mentioned above does is convert .png files into some other file format and also multiply the RGB channels by alpha (reference). I was hoping that disabling this option would make it possible to reach the channel data in my actual .png files, but I am not sure this is possible. The following example is an attempt to access the .png data at a low level, as a CGImageRef:
    void test_cgimage(const char* path)
    {
        CGDataProviderRef dataProvider = CGDataProviderCreateWithFilename(path);
        CGImageRef cg_image = CGImageCreateWithPNGDataProvider(dataProvider, NULL, NO,
                                                               kCGRenderingIntentDefault);
        CGImageAlphaInfo info = CGImageGetAlphaInfo(cg_image);
        switch (info)
        {
            case kCGImageAlphaNone:               printf("kCGImageAlphaNone\n"); break;
            case kCGImageAlphaPremultipliedLast:  printf("kCGImageAlphaPremultipliedLast\n"); break;
            case kCGImageAlphaPremultipliedFirst: printf("kCGImageAlphaPremultipliedFirst\n"); break;
            case kCGImageAlphaLast:               printf("kCGImageAlphaLast\n"); break;
            case kCGImageAlphaFirst:              printf("kCGImageAlphaFirst\n"); break;
            case kCGImageAlphaNoneSkipLast:       printf("kCGImageAlphaNoneSkipLast\n"); break;
            case kCGImageAlphaNoneSkipFirst:      printf("kCGImageAlphaNoneSkipFirst\n"); break;
            default: break;
        }
    }
which gives "kCGImageAlphaPremultipliedLast" with COMPRESS_PNG_FILES disabled. So I think iOS always convert .png files, even at run-time.
There is a better solution, faster (about 3x~5x) and cross-platform:
    #define STB_IMAGE_IMPLEMENTATION
    #include "stb_image.h"

    // Force 4-channel RGBA output and flip vertically: stb_image returns the
    // first pixel as top-left, while OpenGL assumes the first row is bottom-left.
    bool flipy = true;
    int width, height, nrChannels;
    stbi_set_flip_vertically_on_load(flipy);
    unsigned char *imageData = stbi_load(path.c_str(), &width, &height, &nrChannels, 4);

    // Load the straight-alpha data into your texture:
    // glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);

    stbi_image_free(imageData);  // let stb_image release its own buffer, not plain free()
Not 100% what you want, but I got around the problem using this approach: put the alpha channel into a separate black-and-white PNG and save the original PNG without alpha, so the space taken is about the same. Then, in my texture loader, I load both images and combine them into one texture, roughly as sketched below.
I know this is only a workaround, but at least it gives the correct result. And yes, it is very annoying that iOS does not allow you to load textures from PNG without premultiplied alpha.
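The recombine step is straightforward; a sketch (buffer names are mine: rgb is the 3-channel image, alpha the 1-channel mask, both w*h pixels, out receives straight RGBA):

    void combine_rgb_alpha(const unsigned char *rgb, const unsigned char *alpha,
                           unsigned char *out, int w, int h)
    {
        for (int i = 0; i < w * h; ++i) {
            out[4*i + 0] = rgb[3*i + 0];
            out[4*i + 1] = rgb[3*i + 1];
            out[4*i + 2] = rgb[3*i + 2];
            out[4*i + 3] = alpha[i];  /* alpha comes from the B&W png */
        }
    }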
Hello people of the wasteland :),
Brief: there is a problem with the GL_RGB internal texture format on the iOS platform.
In my application I try to save some memory by using GL_RGB instead of GL_RGBA as the internal format.
I'm using the following piece of code to achieve this; nothing else is changed.
    glTexImage2D(_textureTargetType,
                 0,
                 GL_RGB,            // pixel internalFormat
                 texWidth,          // image width
                 texHeight,         // image height
                 0,                 // border
                 GL_RGBA,           // pixel format
                 GL_UNSIGNED_BYTE,  // pixel data type
                 bitmapData);
On Mac OS these changes went fluently, no problems. But on iOS, particularly 4.3 (OpenGL ES 2.0), it gives me GL_INVALID_OPERATION every time I try to render textured polygons with this texture. As nothing except this format changed, I think the problem is an incompatibility between the GL_RGB internal format and OpenGL ES 2.0. This is just my guess; I'm no guru.
It doesn't work in the simulator or on an iPod touch 4th gen.
Thank you for any reasonable suggestion.
According to the documentation, "internalformat must match format. No conversion between formats is supported during texture image processing." See the Khronos website. Desktop OpenGL does not have this limitation, so this code works on Mac OS but not with the more limited OpenGL ES on iOS devices. A corrected call is sketched below.
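So on ES 2.0 either keep GL_RGBA for both parameters, or repack the bitmap to 3 bytes per pixel and upload it as GL_RGB for both. A hedged sketch of the latter (rgbBitmapData is a placeholder for the repacked buffer):

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  /* RGB rows are rarely 4-byte aligned */
    glTexImage2D(_textureTargetType,
                 0,
                 GL_RGB,            // internalFormat ...
                 texWidth,
                 texHeight,
                 0,                 // border
                 GL_RGB,            // ... must equal format on ES 2.0
                 GL_UNSIGNED_BYTE,
                 rgbBitmapData);    // repacked 3-bytes-per-pixel data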