SKTransition transitionWithCIFilter:duration: not animating - iOS

I've been trying to create a page-turn transition between scenes in SpriteKit like this:
+ (SKTransition *)pageTurnTransition {
    float w = 768.0;
    float h = 1024.0;
    CIVector *extent = [CIVector vectorWithX:0 Y:0 Z:w W:h];
    CIImage *shadingImage = [[CIImage alloc] initWithColor:[CIColor colorWithRed:0.2 green:0.2 blue:0.2]];
    CIImage *blankImage = [[CIImage alloc] initWithColor:[CIColor colorWithRed:1 green:1 blue:1]];
    CIFilter *pageCurlFilter = [CIFilter filterWithName:@"CIPageCurlTransition"
                                          keysAndValues:
                                @"inputExtent", extent,
                                @"inputShadingImage", shadingImage,
                                @"inputBacksideImage", blankImage,
                                @"inputAngle", [NSNumber numberWithFloat:-0.2 * M_PI],
                                @"inputRadius", [NSNumber numberWithFloat:70],
                                nil];
    return [SKTransition transitionWithCIFilter:pageCurlFilter duration:1];
}
This is how I call the transition:
SKScene *spaceshipScene = [[SpaceshipScene alloc] initWithSize:self.size];
[[self view] presentScene:spaceshipScene transition:[HelloScene pageTurnTransition]];
The problem is that the transition doesn't animate: it just stays on the original scene for the specified duration of the transition (1 second), then abruptly shows the next scene.
Does anybody know what I'm doing wrong? I've tested this in the iPad simulator on iOS 7.1.

I tried your code and could reproduce your problem.
It seems to be an issue with the page-curl transition. If you use one of the built-in transitions, it works:
return [SKTransition doorsCloseVerticalWithDuration:2];
Are you sure the shading image doesn't need to be transparent?
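As a workaround until the CIFilter route works, here is a minimal sketch of presenting the scene with that built-in transition instead (SpaceshipScene and self.size are taken from the question's code):

// Workaround sketch: swap the CIFilter transition for a built-in one
SKScene *spaceshipScene = [[SpaceshipScene alloc] initWithSize:self.size];
SKTransition *fallback = [SKTransition doorsCloseVerticalWithDuration:2];
[[self view] presentScene:spaceshipScene transition:fallback];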

Related

Frosted glass effect in SpriteKit?

I'm trying to add a frosted glass effect for nodes in my game. For example: http://bit.ly/1vNMvAG
What is the right way to do that?
I must be missing something, as I would just stay with SpriteKit, as suggested in the original comments, but maybe I'll get schooled and learn something new. :-)
EDIT: I simplified everything; now you can just move the crop mask by exposing it as a property and changing its position dynamically as you go. Obviously you could jazz it up with various sprite layers.
SKSpriteNode *bgImage = [SKSpriteNode spriteNodeWithImageNamed:@"bgImage.png"];
bgImage.anchorPoint = CGPointZero;
[self addChild:bgImage];

// cropMaskNode is an ivar/property so its position can be changed later
cropMaskNode = [SKSpriteNode spriteNodeWithImageNamed:@"clippingImage.png"];
SKCropNode *cropNode = [SKCropNode node];
SKSpriteNode *bgInsideOfCrop = [SKSpriteNode spriteNodeWithImageNamed:@"bgImage.png"];
bgInsideOfCrop.anchorPoint = CGPointZero;

SKEffectNode *effectNode = [SKEffectNode node];
effectNode.filter = [self blurFilter];
effectNode.shouldEnableEffects = YES;
effectNode.shouldCenterFilter = YES;
effectNode.shouldRasterize = YES;

[self addChild:cropNode];
[effectNode addChild:bgInsideOfCrop];
[cropNode addChild:effectNode];
cropNode.maskNode = cropMaskNode;
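The blurFilter helper referenced above isn't defined in this answer. A minimal sketch of what it could look like, assuming a CIGaussianBlur with an arbitrary radius of 10 (the same pattern appears in the emitter question below):

// Hypothetical helper: the answer calls [self blurFilter] without defining it
- (CIFilter *)blurFilter
{
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:@10.0 forKey:kCIInputRadiusKey];
    return filter;
}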
This effect is only available in iOS 7 and later, to my knowledge. Engin Kurutepe has posted an implementation on GitHub:
- (void)willMoveToSuperview:(UIView *)newSuperview
{
    [super willMoveToSuperview:newSuperview];
    if (newSuperview == nil) {
        return;
    }
    // Snapshot the superview's hierarchy, then crop out this view's frame
    UIGraphicsBeginImageContextWithOptions(newSuperview.bounds.size, YES, 0.0);
    [newSuperview drawViewHierarchyInRect:newSuperview.bounds afterScreenUpdates:YES];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGImageRef croppedCGImage = CGImageCreateWithImageInRect(img.CGImage, self.frame);
    UIImage *croppedImage = [UIImage imageWithCGImage:croppedCGImage];
    CGImageRelease(croppedCGImage); // release the CGImage to avoid leaking it

    self.backgroundImage = [croppedImage applyBlurWithRadius:11
                                                   tintColor:[UIColor colorWithWhite:1 alpha:0.3]
                                       saturationDeltaFactor:1.8
                                                   maskImage:nil];
}
(The applyBlurWithRadius:tintColor:saturationDeltaFactor:maskImage: method comes from Apple's UIImage+ImageEffects category, distributed with the WWDC 2013 sample code.)
Code sample reference

Is it possible to apply a filter to an SKEmitterNode?

I have a scene with an SKEmitterNode, which works fine when added directly to the scene.
However, I want to add an SKEffectNode to the scene to blur the emitter's particles.
This is how it looks:
@implementation MyScene

- (id)initWithSize:(CGSize)size {
    if (self = [super initWithSize:size]) {
        /* Setup your scene here */
        self.backgroundColor = [SKColor colorWithRed:1.0 green:1.0 blue:1.0 alpha:1.0];

        SKEffectNode *blurNode = [[SKEffectNode alloc] init];
        CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
        [blurFilter setValue:@10.0 forKey:kCIInputRadiusKey];
        blurNode.filter = blurFilter;
        blurNode.shouldEnableEffects = YES;

        NSString *bokehPath = [[NSBundle mainBundle] pathForResource:@"Bokeh" ofType:@"sks"];
        SKEmitterNode *bokeh = [NSKeyedUnarchiver unarchiveObjectWithFile:bokehPath];
        [blurNode addChild:bokeh];
        [self addChild:blurNode];
    }
    return self;
}
This results in a blank screen.
Since SKScene is itself a subclass of SKEffectNode, I also tried setting a CIFilter on the scene directly:
- (id)initWithSize:(CGSize)size {
    if (self = [super initWithSize:size]) {
        /* Setup your scene here */
        self.backgroundColor = [SKColor colorWithRed:1.0 green:1.0 blue:1.0 alpha:1.0];

        CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
        [blurFilter setValue:@10.0 forKey:kCIInputRadiusKey];

        NSString *bokehPath = [[NSBundle mainBundle] pathForResource:@"Bokeh" ofType:@"sks"];
        SKEmitterNode *bokeh = [NSKeyedUnarchiver unarchiveObjectWithFile:bokehPath];
        [self addChild:bokeh];
        self.filter = blurFilter;
    }
    return self;
}
Same result.
It's possible to achieve this effect if you apply the filter to the scene directly, as in your second snippet. Add the following line to it:
self.shouldEnableEffects = YES;
If you still see a blank screen, try using bright-coloured particles on a black background instead of your white one. But expect the framerate to drop considerably; you might be better off using a blurred texture for the particles themselves in the Particle Editor.
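For reference, a sketch of the second snippet with the fix applied (the black background is the suggestion above, not part of the original code):

- (id)initWithSize:(CGSize)size {
    if (self = [super initWithSize:size]) {
        // a dark background keeps bright blurred particles visible
        self.backgroundColor = [SKColor blackColor];

        CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
        [blurFilter setValue:@10.0 forKey:kCIInputRadiusKey];

        NSString *bokehPath = [[NSBundle mainBundle] pathForResource:@"Bokeh" ofType:@"sks"];
        SKEmitterNode *bokeh = [NSKeyedUnarchiver unarchiveObjectWithFile:bokehPath];
        [self addChild:bokeh];

        self.filter = blurFilter;
        self.shouldEnableEffects = YES; // the missing line
    }
    return self;
}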

Having an issue with SKEffectNode

I am following Apple's SpriteKit documentation and trying to use SKEffectNode, but the effect is not applied. Here is my code:
SKEffectNode *lightingNode = [[SKEffectNode alloc] init];
SKTexture *texture = [SKTexture textureWithImage:[UIImage imageNamed:@"Spaceship"]];
SKSpriteNode *light = [SKSpriteNode spriteNodeWithTexture:texture];
self.filter = [self blurFilter];
lightingNode.position = self.view.center;
lightingNode.blendMode = SKBlendModeAdd;
[lightingNode addChild:light];
[self addChild:lightingNode];

// applying blur
- (CIFilter *)blurFilter
{
    CIFilter *filter = [CIFilter filterWithName:@"CIBoxBlur"];
    [filter setDefaults];
    [filter setValue:[NSNumber numberWithFloat:20] forKey:@"inputRadius"];
    return filter;
}
When I run the app, it just shows the spaceship without any blur effect.
It looks like the @"CIBoxBlur" filter no longer exists, at least in iOS 7.0. You can use @"CIGaussianBlur" instead. You can see the full list of available filters by running:
NSArray *filters = [CIFilter filterNamesInCategories:nil];
for (NSString *filterName in filters)
{
    NSLog(@"Filter: %@", filterName);
}
I don't see anywhere that you're setting the filter property of the effect node you've created, nor where you're setting its shouldEnableEffects property to YES.
As implied by the aforelinked documentation, both of those properties need a meaningful value if your effect node is to apply a filter.
I used your code with the following changes, and it all tested great:
// switch this
lightingNode.position
// to this
light.position

// add this
lightingNode.shouldEnableEffects = YES;

// change
CIBoxBlur
// to
CIGaussianBlur
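Combining both answers, a sketch of the corrected setup (the filter is set on the effect node, per the first answer, rather than on the scene, and blurFilter should use CIGaussianBlur):

SKEffectNode *lightingNode = [[SKEffectNode alloc] init];
SKTexture *texture = [SKTexture textureWithImage:[UIImage imageNamed:@"Spaceship"]];
SKSpriteNode *light = [SKSpriteNode spriteNodeWithTexture:texture];

lightingNode.filter = [self blurFilter];  // set the filter on the effect node itself
lightingNode.shouldEnableEffects = YES;   // required for the filter to run
light.position = self.view.center;        // position the sprite, not the effect node
lightingNode.blendMode = SKBlendModeAdd;

[lightingNode addChild:light];
[self addChild:lightingNode];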

Sequencing images using Core Animation, receiving memory warnings

I am receiving memory warnings when animating roughly 100 images, so I tried to use Core Animation instead, but that gives me the same problem. This is because I don't know how to use replaceSublayer in my current code:
UIView *upwardView = [[UIView alloc] init];
[upwardView setFrame:CGRectMake(0, 0, 1024, 768)];
[self.view addSubview:upwardView];

NSArray *animationImages = [NSArray arrayWithObjects:[UIImage imageNamed:@"001.png"], [UIImage imageNamed:@"001.png"], [UIImage imageNamed:@"002.png"], [UIImage imageNamed:@"003.png"], ..., nil];

CAKeyframeAnimation *animationSequence = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
animationSequence.calculationMode = kCAAnimationLinear;
animationSequence.autoreverses = YES;
animationSequence.duration = 5.00;
animationSequence.repeatCount = HUGE_VALF;

NSMutableArray *animationSequenceArray = [[NSMutableArray alloc] init];
for (UIImage *image in animationImages)
{
    [animationSequenceArray addObject:(id)image.CGImage];
}

CALayer *layer = [upwardView layer];
animationSequence.values = animationSequenceArray;
[layer addAnimation:animationSequence forKey:@"contents"];
I guess you need to add a few more lines. Replace the last three lines with the following (keep the animationSequence.values assignment, which the animation still needs):
// Prepare a dedicated CALayer for the animation
animationSequence.values = animationSequenceArray;
CALayer *layer = [CALayer layer];
layer.frame = upwardView.bounds; // frame is in the parent layer's coordinate space
layer.masksToBounds = YES;
[layer addAnimation:animationSequence forKey:@"contents"];
[upwardView.layer addSublayer:layer]; // add the CALayer to your desired view
For a detailed implementation, check this reference.
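Since the question also mentions replaceSublayer, here is a minimal sketch of how it could be used to swap the animation layer for a new one later on (oldLayer is a hypothetical reference to the previously added layer):

// Hypothetical swap: replace the previously added animation layer
CALayer *newLayer = [CALayer layer];
newLayer.frame = upwardView.bounds;
[upwardView.layer replaceSublayer:oldLayer with:newLayer];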

How can I use the found face coordinates (Core Image)?

UIImage *image = [UIImage imageNamed:@"face.png"];
UIImageView *testImage = [[UIImageView alloc] initWithImage:image];
[testImage setTransform:CGAffineTransformMakeScale(1, -1)];
[[[UIApplication sharedApplication] delegate].window setTransform:CGAffineTransformMakeScale(1, -1)];
[testImage setFrame:CGRectMake(0, 0, testImage.image.size.width, testImage.image.size.height)];
[self.view addSubview:testImage];

CIImage *ciimage = [CIImage imageWithCGImage:image.CGImage];
NSDictionary *opts = [NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy];
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:opts];
NSArray *features = [detector featuresInImage:ciimage];

for (CIFaceFeature *faceFeature in features)
{
    CGFloat faceWidth = faceFeature.bounds.size.width;
    UIView *faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];
    faceView.layer.borderWidth = 1;
    faceView.layer.borderColor = [[UIColor redColor] CGColor];
    [self.view addSubview:faceView];
}
How can I find the face coordinates?
I tried using faceFeature.bounds.origin.x and faceFeature.bounds.origin.y, but sometimes they are not the correct coordinates.
Edit (2016/04/10): This is my problem on iOS: the x,y coordinates come out flipped compared to what I get in C#.
Here's the basic idea behind it: CIDetector lets you extract the positions of the left eye, right eye, and mouth from the image. From those points we can do some basic math to create a rectangle that spans them, e.g.:
for (CIFaceFeature *faceFeature in features)
{
    CGPoint lefteye = faceFeature.leftEyePosition;
    CGPoint righteye = faceFeature.rightEyePosition;
    CGPoint mouth = faceFeature.mouthPosition;

    // Face rectangle spanning the eyes and mouth
    CGRect faceRectangle = CGRectMake(lefteye.x, lefteye.y, righteye.x - lefteye.x, mouth.y - righteye.y);
    // Face center
    CGPoint faceCenter = CGPointMake(faceRectangle.origin.x + (faceRectangle.size.width / 2), faceRectangle.origin.y + (faceRectangle.size.height / 2));

    UIView *faceView = [[UIView alloc] initWithFrame:faceRectangle];
    faceView.layer.borderWidth = 1;
    faceView.layer.borderColor = [[UIColor redColor] CGColor];
    [self.view addSubview:faceView];
}
Keep in mind, I'm not at a computer right now to test this part for you, but I believe the coordinates output by the detector are relative to the resolution of the input image. This can cause inaccuracy when applying the created rect to an on-screen view, which uses iOS's points coordinate system with a top-left origin. That said, all you should have to do is run the newly created rectangle through a convertRect-style conversion to get the proper coordinates.
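A minimal sketch of such a conversion, assuming the image is displayed at 1:1 scale so only the y-axis needs flipping (imageHeight is a hypothetical variable holding the source image's height):

// Flip from Core Image's bottom-left origin to UIKit's top-left origin
CGRect flipped = faceRectangle;
flipped.origin.y = imageHeight - faceRectangle.origin.y - faceRectangle.size.height;
UIView *flippedFaceView = [[UIView alloc] initWithFrame:flipped];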
