I have just started learning OpenGL, and I have read about rendering and texture images. Can anyone provide a simple example of loading an image as a texture?
OpenGL itself doesn't provide an API to generate a texture from an image file. We used to have to read the image file and convert it to bitmap data ourselves before using it as a texture in OpenGL, but the GLKTextureLoader class in GLKit generates a texture from an image for us automatically.
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"mushroom" ofType:@"png"];
GLKTextureInfo *textureInfo =
    [GLKTextureLoader textureWithContentsOfFile:filePath options:nil error:nil];
if (textureInfo) {
    NSLog(@"Texture loaded successfully. name = %u size = (%zu x %zu)",
          textureInfo.name, textureInfo.width, textureInfo.height);
}
The return value of textureWithContentsOfFile:options:error: is an instance of the GLKTextureInfo class. A GLKTextureInfo object contains information about a texture, such as its width and height. There is a property named 'name' (documented as 'glName', which appears to be a misprint). We use the value of this property to specify the texture in a GLKBaseEffect, as sketched below.
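A minimal sketch of that wiring, assuming a GLKBaseEffect instance named effect already exists:

effect.texture2d0.name = textureInfo.name;   // GLuint from GLKTextureInfo
effect.texture2d0.enabled = GL_TRUE;
effect.texture2d0.target = GLKTextureTarget2D;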
GLKTextureLoader is only a class for loading textures; it does not manage their lifetime, so we have to release the texture ourselves. Call glDeleteTextures with the value of the 'name' property of the GLKTextureInfo object.
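For example (glDeleteTextures takes a pointer to an array of names, so copy the name into a local variable first):

GLuint textureName = textureInfo.name;
glDeleteTextures(1, &textureName);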
Found this nice discussion in a blog. Hope this helps.. :)
I use two methods to load a black-and-white image texture in Metal, as shown below:
// 1.
mtltexture01 = [textureLoader newTextureWithName:@"texture02"
                                     scaleFactor:1.0
                                          bundle:nil
                                         options:textureLoaderOptions
                                           error:&error];
// 2.
UIImage *img = [UIImage imageNamed:@"texture02"];
mtltexture01 = [textureLoader newTextureWithCGImage:img.CGImage options:textureLoaderOptions error:&error];
but both crash. The error log is:
"Error Domain=MTKTextureLoaderErrorDomain Code=0 "Image decoding
failed" UserInfo={NSLocalizedDescription=Image decoding failed,
MTKTextureLoaderErrorKey=Image decoding failed}",
How do I fix this issue? Also, if I load a colorful image into Metal, it runs fine.
- (id<MTLTexture>)textureWithName:(NSString *)imgname usingDevice:(id<MTLDevice>)device {
    MTKTextureLoader *textureLoader = [[MTKTextureLoader alloc] initWithDevice:device];
    NSDictionary *textureLoaderOptions = @{
        MTKTextureLoaderOptionTextureUsage : @(MTLTextureUsageShaderRead),
        MTKTextureLoaderOptionTextureStorageMode : @(MTLStorageModePrivate)
    };
    return [textureLoader newTextureWithName:imgname
                                 scaleFactor:1.0
                                      bundle:nil
                                     options:textureLoaderOptions
                                       error:nil];
}
and in your Metal configuration:
id<MTLTexture> mtltexture01;
mtltexture01 = [self textureWithName:@"texture02" usingDevice:device];
Keep in mind that texture02 is a file name, and the file needs to be available in your app's assets. You can store the image as an MTLTexture in your asset catalog in Xcode, so the conversion is done at build time.
Also check that the image has proper opacity and is in PNG, JPEG, or TIFF format. There is a known loader issue when a texture contains only gray/black colors and 0% transparency is used.
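One possible workaround for that case (a hedged sketch, not a confirmed fix; the method name textureFromGrayscaleImage:device:error: is hypothetical): redraw the gray-only image into a plain RGBA bitmap with Core Graphics before handing it to the loader, so MTKTextureLoader never has to decode the grayscale PNG itself.

- (id<MTLTexture>)textureFromGrayscaleImage:(UIImage *)image
                                     device:(id<MTLDevice>)device
                                      error:(NSError **)error {
    // Redraw the gray-only image into an RGBA bitmap first,
    // so the loader receives a pixel format it is known to decode.
    CGImageRef src = image.CGImage;
    size_t width = CGImageGetWidth(src);
    size_t height = CGImageGetHeight(src);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                             colorSpace, kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), src);
    CGImageRef rgbaImage = CGBitmapContextCreateImage(ctx);

    MTKTextureLoader *loader = [[MTKTextureLoader alloc] initWithDevice:device];
    id<MTLTexture> texture = [loader newTextureWithCGImage:rgbaImage options:nil error:error];

    CGImageRelease(rgbaImage);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    return texture;
}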
Later, integrate your texture into the renderEncoder/commandEncoder.
(Screenshot: Xcode Assets Texture Configurator)
I am trying to create a CIFilter using filterWithCVPixelBuffer, and it is returning nil.
This is what I'm trying to do:
CFDictionaryRef options = CMCopyDictionaryOfAttachments(nil, photo.pixelBuffer, kCMAttachmentMode_ShouldPropagate);
CIFilter * ciFilter = [CIFilter filterWithCVPixelBuffer:photo.pixelBuffer properties:(__bridge NSDictionary*)options options:nil];
photo is an instance of AVCapturePhoto given to the delegate.
I am using iOS 12 and running the code on an iPhone 7.
The problem was in the properties NSDictionary. I should have simply passed photo.metadata.
So the function call would look like:
CIFilter *ciFilter = [CIFilter filterWithCVPixelBuffer:photo.pixelBuffer properties:photo.metadata options:nil];
Of course, you can also pass an NSDictionary containing the desired CIRAWFilterOption values.
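For reference, here is a hedged sketch of the corrected call inside the iOS 11+ capture delegate (the kCIInputBoostKey option is only an illustration, and photo.pixelBuffer is only non-nil for RAW captures):

- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error {
    if (photo.pixelBuffer == NULL) {
        return; // pixelBuffer is nil for non-RAW captures
    }
    CIFilter *rawFilter = [CIFilter filterWithCVPixelBuffer:photo.pixelBuffer
                                                 properties:photo.metadata
                                                    options:@{ kCIInputBoostKey : @0.5 }];
    CIImage *processed = rawFilter.outputImage;
    NSLog(@"processed extent: %@", NSStringFromCGRect(processed.extent));
}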
I think I found the answer in the documentation in the header file:
Returns a CIFilter that will in turn return a properly processed CIImage as "outputImage".
Note that when using this initializer, you should pass in a CVPixelBufferRef with one of the following Raw pixel format types kCVPixelFormatType_14Bayer_GRBG, kCVPixelFormatType_14Bayer_RGGB, kCVPixelFormatType_14Bayer_BGGR, kCVPixelFormatType_14Bayer_GBRG as well as the root properties attachment from the CMSampleBufferRef.
So I guess this method should be used when you have a CMSampleBuffer, and it is not feasible when coming from an AVCapturePhoto.
In my current SKScene, I use UIGraphicsGetImageFromCurrentImageContext to generate an image from the current graphics on screen.
However, I would like to generate an image from a scene that is not on screen, one I have created but have not displayed. Is this possible?
The reason for doing this is to create a custom image for users to share when they achieve a high score, which is similar to, but not the same as, my main scene.
Here is a method that captures the contents of a node as PNG data. Be aware that current SpriteKit seems to have a memory leak when accessing the CGImage property, so use this in DEBUG mode.
+ (NSData *)captureNodeAsPNG:(SKSpriteNode *)node skView:(SKView *)skView
{
    NSData *data = nil;
    @autoreleasepool {
        SKTexture *captureTexture = [skView textureFromNode:node];
        CGImageRef cgImageRef = captureTexture.CGImage;
        NSLog(@"capture texture from node as pixels : %d x %d",
              (int)CGImageGetWidth(cgImageRef), (int)CGImageGetHeight(cgImageRef));
        UIImage *capImg = [UIImage imageWithCGImage:cgImageRef];
        data = UIImagePNGRepresentation(capImg);
    }
    return data;
}
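A brief usage example (hypothetical names: NodeCapture is whatever class declares the method, and spriteNode is an existing node in your SKView):

NSData *pngData = [NodeCapture captureNodeAsPNG:spriteNode skView:self.view];
NSString *docs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
[pngData writeToFile:[docs stringByAppendingPathComponent:@"capture.png"] atomically:YES];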
I've now filed a bug for the issue below. Does anyone have a good workaround?
I am trying to save an SKTexture to file and load it back again, but I don't succeed. The following code snippet can be copied into GameScene.m in the Xcode starter project.
I use textureFromNode in generateTexture, and that seems to be the root cause of my problem. If I instead use a texture from a sprite, the code works and two spaceships are visible.
This code worked in iOS 8, but it stopped working in Xcode 7 and iOS 9. I just want to verify that this is a bug before I file a bug report. My worry is that I'm doing something wrong with NSKeyedArchiver.
It happens both in simulator and on device.
#import "GameScene.h"

@implementation GameScene

// Generates a texture
- (SKTexture *)generateTexture
{
    SKScene *scene = [[SKScene alloc] initWithSize:CGSizeMake(100, 100)];
    SKShapeNode *shapeNode = [SKShapeNode shapeNodeWithRectOfSize:CGSizeMake(50, 50)];
    shapeNode.position = CGPointMake(50, 50);
    shapeNode.strokeColor = SKColor.redColor;
    shapeNode.lineWidth = 10;
    [scene addChild:shapeNode];
    SKTexture *texture = [self.view textureFromNode:scene];
    //SKTexture *texture = [SKSpriteNode spriteNodeWithImageNamed:@"Spaceship"].texture; // This works!
    return texture;
}

// Just generate a path
- (NSString *)fullDocumentsPath
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *yourFileName = [documentsDirectory stringByAppendingPathComponent:@"fileName"];
    return yourFileName;
}

- (void)didMoveToView:(SKView *)view
{
    self.scaleMode = SKSceneScaleModeResizeFill;

    // Verify that the generateTexture method indeed produces a valid texture.
    SKSpriteNode *s1 = [SKSpriteNode spriteNodeWithTexture:[self generateTexture]];
    s1.position = CGPointMake(100, 100);
    [self addChild:s1];

    // Start with saving the texture.
    NSString *fullName = [self fullDocumentsPath];
    NSError *error = nil;
    NSFileManager *fileMgr = [NSFileManager defaultManager];
    if ([fileMgr fileExistsAtPath:fullName])
    {
        [fileMgr removeItemAtPath:fullName error:&error];
        assert(error == nil);
    }
    NSDictionary *dict1 = [NSDictionary dictionaryWithObject:[self generateTexture] forKey:@"object"];
    BOOL ok = [NSKeyedArchiver archiveRootObject:dict1 toFile:fullName];
    assert(ok);

    // Read back the texture and place it in a sprite. This sprite is not shown. Why?
    NSData *data = [NSData dataWithContentsOfFile:fullName];
    NSDictionary *dict2 = [NSKeyedUnarchiver unarchiveObjectWithData:data];
    SKTexture *loadedTexture = [dict2 objectForKey:@"object"];
    SKSpriteNode *s2 = [SKSpriteNode spriteNodeWithTexture:loadedTexture];
    NSLog(@"t(%f, %f)", loadedTexture.size.width, loadedTexture.size.height); // Size of sprite & texture is zero. Why?
    s2.position = CGPointMake(200, 100);
    [self addChild:s2];
}

@end
Update for Yudong:
This might be a more relevant example: imagine that the scene consists of 4 layers with lots of sprites. When gameplay is over, I want to store a thumbnail image of the final scene of the match. The image will be used as a texture on a button; pressing that button will start a replay movie of the match. There will be lots of buttons with images of old games, so I need to store each image in a file.
- (SKTexture *)generateTexture
{
    SKScene *scene = [[SKScene alloc] initWithSize:CGSizeMake(100, 100)];
    SKSpriteNode *ship = [SKSpriteNode spriteNodeWithImageNamed:@"Spaceship"];
    ship.position = CGPointMake(50, 50);
    [scene addChild:ship];
    SKTexture *texture = [self.view textureFromNode:scene];
    NSLog(@"texture: %@", texture);
    return texture;
}
The solution/workaround:
Inspired by Russell's code, I did the following. It works!
CGImageRef cgImg = texture.CGImage;
SKTexture *newText = [SKTexture textureWithCGImage:cgImg];
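Put together, a sketch of the save path with the workaround applied (it reuses generateTexture and fullDocumentsPath from the question code):

SKTexture *texture = [self generateTexture];
// Round-trip through CGImage so the texture data is actually realized in memory.
SKTexture *archivableTexture = [SKTexture textureWithCGImage:texture.CGImage];
NSDictionary *dict = @{ @"object" : archivableTexture };
BOOL ok = [NSKeyedArchiver archiveRootObject:dict toFile:[self fullDocumentsPath]];
NSLog(@"archived: %d", ok);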
I've done a lot of experimenting/hacking with SKTextures. My game relies heavily on SKTextures and is written in Swift. Specifically, I've had many problems with textureFromNode and textureFromNode:crop:, and with creating SKPhysicsBodies from textures. These methods worked fine in iOS 8, but Apple completely broke them when they released iOS 9.0: the textures came back as nil, and those nil textures broke the SKPhysicsBodies built from them.
I recently worked on serialization/deserialization of SKTextures.
Some key ideas/clues you might investigate are:
Run iOS 9.2. Apple Staff mentioned that a lot of issues have been fixed. https://forums.developer.apple.com/thread/17463 I've found iOS 9.2 helps with SKTextures but didn't solve every issue, especially the serialization issues.
Try PrefersOpenGL (set it to "YES" as a Boolean custom property in your config). Here is a post about PrefersOpenGL in the Apple Dev Forums by Apple Staff: https://forums.developer.apple.com/thread/19683 I've observed that iOS 9.x seems to use Metal by default rather than OpenGL. I've found PrefersOpenGL helps with SKTexture issues but still doesn't make my SKShaders (written in GLSL) work.
When I tried to serialize/deserialize nodes with SKTextures on iOS 9.2, I got white boxes instead of visible textures. I was inspired by the Apple SKTexture docs, which say: "The texture data is loaded when:
The size method on the texture object is called.
Another method is called that requires the texture’s size, such as creating a new SKSpriteNode object that uses the texture object.
One of the preload methods is called (See Preloading the Texture Data.)
The texture data is prepared for rendering when:
A sprite or particle that uses the texture is part of a node tree that is being rendered."
...I hacked together a workaround that creates a secondary texture from the CGImage() call:
// iOS 9.2 workaround for white boxes on serialization
let img = texture!.CGImage()
let uimg = UIImage(CGImage: img)
let ntex = SKTexture(image: uimg)
let sprite = SKSpriteNode(texture: ntex, size: texture!.size())
So now my SKSpriteNodes created this way seem to serialize/deserialize fine. BTW, just invoking size() or creating an SKSpriteNode with the original texture does not seem to be enough to reify the texture into memory.
You didn't ask about textureFromNode:crop:, but I'm adding my observations anyway in case they help: in iOS 8 this method worked (although the crop parameters were very tricky and seemed to require normalization with UIScreen.mainScreen().scale). In iOS 9.0, this method didn't work at all (it returned nil). In iOS 9.2 it works again (it returns a non-nil texture); however, subsequent creation of nodes from the texture no longer needs the size normalization. Furthermore, to make serialization/deserialization work, I found you ultimately have to apply #3 above.
I hope this helps you. I imagine I've struggled more than most with SKTextures since my app is so dependent on them.
I tested your code in Xcode 7 and found that the texture returned by generateTexture was null. That's the reason you can't load anything from the file; in fact, you haven't even saved anything.
Try to use NSLog to log the description of your texture or sprite. E.g. add this line in generateTexture:
NSLog(@"texture: %@", texture);
What you will get in console:
texture: <SKTexture> '(null)' (300 x 300)
And the same for s1 and dict1 in your code:
s1: <SKSpriteNode> name:'(null)' texture:[<SKTexture> '(null)' (300 x 300)]
position:{100, 100} scale:{1.00, 1.00} size:{100, 100}
anchor:{0.5, 0.5} rotation:0.00

dict1: {
    object = "<SKTexture> '(null)' (300 x 300)";
}
You may do these tests on both iOS 8 and iOS 9 and you will probably get different results.
I'm not sure why you add the SKShapeNode to a scene and then save the texture from the scene. One workaround is to set the texture on your SKShapeNode directly, and your code should work fine:
shapeNode.fillTexture = [SKTexture textureWithImageNamed:@"Spaceship"];
SKTexture *texture = shapeNode.fillTexture;
return texture;
Update:
It's quite annoying that textureFromNode doesn't work as expected in iOS 9. I tried to solve it by trial and error, but with no luck in the end. That's why I asked whether you would consider taking a snapshot of the whole screen and setting it as your thumbnail. Here's the progress I made today; I hope it inspires you.
I created a scene containing an SKLabelNode and an SKSpriteNode in didMoveToView. After I clicked anywhere on the screen, snapshot was invoked and the down-scaled screenshot was saved in the documents folder. I used the code here:
- (UIImage *)snapshot
{
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0.5);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return snapshotImage;
}
Since the thumbnail is saved as a UIImage, loading it back as the sprite's texture is straightforward. A sample project demonstrates the whole process, and it works on both iOS 8 and 9.
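A sketch of that save/load round trip (the file name thumbnail.png is just an example):

UIImage *thumbnail = [self snapshot];
NSString *docs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
NSString *path = [docs stringByAppendingPathComponent:@"thumbnail.png"];
[UIImagePNGRepresentation(thumbnail) writeToFile:path atomically:YES];
// Later, when building the replay button:
SKTexture *buttonTexture = [SKTexture textureWithImage:[UIImage imageWithContentsOfFile:path]];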
I am making a game in cocos2d. I check the level of the sprite and update its texture accordingly.
At the start, the HD image is placed correctly, but when the game starts and I move the sprite, the HD image is replaced with the normal one. I am replacing the texture with this code:
int value = [self.weapon.weaponLevel intValue];
NSString *path = [[NSBundle mainBundle] pathForResource:@"name" ofType:@"png"];
UIImage *imageView = [[UIImage alloc] initWithContentsOfFile:path];
CCTexture2D *frame = [[CCTexture2D alloc] initWithImage:imageView];
[self.player setTexture:frame];
Can anyone please help me out here? Thanks and regards.
There's no need to go through NSBundle and UIImage when you can use CCTextureCache, which handles all the path resolution and HD/SD detection internally.
Plus, as the name implies, it caches the texture so that subsequent uses of the same sprite file are faster.
Here is what you need:
CCTexture2D *frame = [[CCTextureCache sharedTextureCache] addImage:@"name.png"];
[self.player setTexture:frame];