I'm drawing many circles using UIBezierPath and CAShapeLayer. The issue is that the edges of the circles are not rendered properly and look jagged, like a saw, as in the screenshot below:
I tried to fix the problem by forcing rasterization with this snippet, but it did not work:
shapeLayer.shouldRasterize = true
shapeLayer.rasterizationScale = UIScreen.main.scale
Thanks to #PunetSharma: in fact, I only had to increase the scale for Retina devices, as explained in this link, to:
shapeLayer.rasterizationScale = 2.0 * UIScreen.main.scale
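For context, a minimal sketch of how the fix fits into the circle-drawing code; the radius, colors and line width are placeholder values, not from the original project:

import UIKit

// A minimal sketch: builds one circle layer and rasterizes it at twice the screen scale.
func makeCircleLayer(center: CGPoint, radius: CGFloat) -> CAShapeLayer {
    let path = UIBezierPath(arcCenter: center,
                            radius: radius,
                            startAngle: 0,
                            endAngle: 2 * CGFloat.pi,
                            clockwise: true)

    let shapeLayer = CAShapeLayer()
    shapeLayer.path = path.cgPath
    shapeLayer.fillColor = UIColor.clear.cgColor
    shapeLayer.strokeColor = UIColor.blue.cgColor
    shapeLayer.lineWidth = 2

    // Rasterize at twice the screen scale so the curved edge stays smooth on Retina displays.
    shapeLayer.shouldRasterize = true
    shapeLayer.rasterizationScale = 2.0 * UIScreen.main.scale
    return shapeLayer
}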
The use case is to detect something in an image and zoom/distort it outward, as if you were looking at it through a magnifying glass.
Now I know the points where to zoom in but I need a CIFilter which can do the same.
I tried CIHoleDistortion but it did not work. The documentation seems fine to me and it should work, but it only creates a black hole, with the area around it distorted.
let distortion = CIFilter(name: "CIHoleDistortion")! // CIFilter(name:) returns an optional
distortion.setValue(sourceImage, forKey: kCIInputImageKey)
distortion.setValue(CIVector(cgPoint: CGPoint(x: 200, y: 200)), forKey: "inputCenter")
distortion.setValue(NSNumber(value: 100), forKey: "inputRadius")
Here are the test results:
How about a CIBumpDistortion?
Another (and maybe better) option is to use the CIGlassLozenge filter:
Set inputRefraction to something slightly larger than 1 (e.g. 1.06)
Align both inputPoint0 and inputPoint1 with the coordinate of the zoom point
Play with the radius to get the desired effect. The attached example uses radius 1000.
If you want to play with this filter (and any other out of 250 CIFilters) in real time check this app out: https://apps.apple.com/us/app/filter-magic/id1594986951
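A minimal sketch of those settings in code; the zoom point, radius and refraction values are just numbers to experiment with, not taken from the original project:

import CoreImage
import CoreGraphics

// Collapses the lozenge into a circle by placing both endpoints on the zoom point.
func magnify(_ sourceImage: CIImage, at point: CGPoint) -> CIImage? {
    guard let lozenge = CIFilter(name: "CIGlassLozenge") else { return nil }
    lozenge.setValue(sourceImage, forKey: kCIInputImageKey)
    lozenge.setValue(CIVector(cgPoint: point), forKey: "inputPoint0")
    lozenge.setValue(CIVector(cgPoint: point), forKey: "inputPoint1")
    lozenge.setValue(1000, forKey: "inputRadius")      // play with this, as above
    lozenge.setValue(1.06, forKey: "inputRefraction")  // slightly larger than 1
    return lozenge.outputImage
}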
CIBumpDistortion works, as mentioned by Frank above, but the scale parameter actually behaves as follows: scale < 0 gives a concave (pinch) effect; scale > 0 gives a convex (bulge) effect.
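For reference, a hedged sketch of CIBumpDistortion used as a magnifier; the center, radius and scale values are placeholders to tune:

import CoreImage
import CoreGraphics

// scale > 0 bulges outward (convex, magnifying-glass look); scale < 0 pinches inward (concave).
func bump(_ sourceImage: CIImage, at point: CGPoint, radius: Double = 150, scale: Double = 0.5) -> CIImage? {
    guard let filter = CIFilter(name: "CIBumpDistortion") else { return nil }
    filter.setValue(sourceImage, forKey: kCIInputImageKey)
    filter.setValue(CIVector(cgPoint: point), forKey: "inputCenter")
    filter.setValue(radius, forKey: "inputRadius")
    filter.setValue(scale, forKey: "inputScale")
    return filter.outputImage
}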
I am using SpriteKit to draw a graph (with the ability to zoom in and pan around).
When I use an SKCropNode to crop the grid of my graph it doesn't crop the desired area. It crops less, no matter if I use a rectangular SKShapeNode or a SKSpriteNode (with image) as .maskNode.
Here is my code:
// GRID
let grid = SKCropNode()
graphViewModel.graphScene.addChild(grid)

let ratio: CGFloat = 1000 / 500
let width = graphViewModel.sceneSize.width * 0.95
let newSize = CGSize(width: width, height: width / ratio)
let origin = CGPoint(x: -newSize.width / 2.0, y: 0.0)

let rectangularMask = SKShapeNode(rect: CGRect(origin: origin, size: newSize))
rectangularMask.fillColor = UIColor.lightGray
rectangularMask.zPosition = -10.0 // So it appears behind the grid; doesn't affect the cropping

grid.maskNode = rectangularMask
graphViewModel.graphScene.addChild(rectangularMask)
Here are two screenshots to illustrate what I mean:
This is the graph with its grid not being cropped.
This is the graph with the maskNode set.
The light gray area is the actual rectangularMask, and the grid is being cut off a lot less than it ought to be.
My scene is scaled so I can zoom in without pixelating.
When I disable zooming (setting the scene's size to the view's size) then the bug disappears. Unfortunately I need zooming without any pixel artefacts.
Maybe someone has an idea of how to fix this issue. It might also be a SpriteKit bug.
I would like to rotate a UIView that is defined in the storyboard. I used
CGAffineTransform(rotationAngle: self.degreesToRadians(5))
where degreesToRadians is a function that converts degrees to radians. The rotation works perfectly, but the only problem is that the UIView is not rendered smoothly (the edge of the view looks jagged, like a saw), as in the screenshot below:
I had the same problem, and solved it by adding a transparent border to the view, like this:
customView.layer.borderWidth = 1
customView.layer.borderColor = UIColor.clear.cgColor // important that it is clear
customView.layer.shouldRasterize = true
customView.layer.rasterizationScale = UIScreen.main.scale
Add that code after your rotation code.
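Put together, the order looks roughly like this (customView and the 5° angle just mirror the question; this is a sketch, not a drop-in for your project):

// 1. Rotate the view.
customView.transform = CGAffineTransform(rotationAngle: 5 * CGFloat.pi / 180)

// 2. Then smooth the jagged edge with a clear 1 pt border plus rasterization.
customView.layer.borderWidth = 1
customView.layer.borderColor = UIColor.clear.cgColor // must be clear
customView.layer.shouldRasterize = true
customView.layer.rasterizationScale = UIScreen.main.scale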
With the code:
Without the code:
I am attempting to create an effect on an image using GPUImage. I am adding a vignette to an image to produce an Instagram-inspired filter. Currently I am using a GPUImageVignetteFilter to achieve this. The filter works, but I am looking for a way to either decrease the opacity of this filter or blend it, similar to a Photoshop effect. Current code:
let sourceImage = GPUImagePicture(image: UIImage(named: "Nothing.png"))
let vignetteFilter = GPUImageVignetteFilter()
vignetteFilter.vignetteColor = GPUVector3(one: 77.0 / 255.0, two: 3.0 / 255.0, three: 188.0 / 255.0)
vignetteFilter.vignetteStart = 0
vignetteFilter.vignetteEnd = 1.2
sourceImage?.addTarget(vignetteFilter)
vignetteFilter.useNextFrameForImageCapture()
sourceImage?.processImage()
let newImage = vignetteFilter.imageFromCurrentFramebuffer()
Current Effect:
Desired Effect:
Original Photo:
Any help would be appreciated!
For anyone looking into adding vignettes with alpha: it is not currently supported in mainline GPUImage. There is a fork by Drew Wilson (https://github.com/drewwilson/GPUImage) which adds a vignetteAlpha property to the filter. This worked like a charm. Hopefully it will be added to the main branch in the future!
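If you go with that fork, the pipeline from the question would change by roughly one line. A hedged sketch follows; vignetteAlpha is the fork's property (not part of mainline GPUImage) and 0.5 is just an example value:

let sourceImage = GPUImagePicture(image: UIImage(named: "Nothing.png"))
let vignetteFilter = GPUImageVignetteFilter()
vignetteFilter.vignetteColor = GPUVector3(one: 77.0 / 255.0, two: 3.0 / 255.0, three: 188.0 / 255.0)
vignetteFilter.vignetteStart = 0
vignetteFilter.vignetteEnd = 1.2
vignetteFilter.vignetteAlpha = 0.5 // added by the fork; lowers the vignette's opacity

sourceImage?.addTarget(vignetteFilter)
vignetteFilter.useNextFrameForImageCapture()
sourceImage?.processImage()
let newImage = vignetteFilter.imageFromCurrentFramebuffer()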
I'm looking to add multiple drop shadows with different opacities to a view. The specs for the shadows are as follows:
Y-offset of 4 with blur radius of 1
Y-offset of 10 with blur radius of 10
Y-offset of 2 with blur radius of 4
Blur radius of 1, spread of 1 (no offsets, will probably have to be 4 different shadows)
I can get all this working just fine using CALayers. Here's the code I have working for that (please note that I haven't bothered to set shadowPath yet, and won't until I get the multiple shadows thing working):
layer.cornerRadius = 4
layer.masksToBounds = false
layer.shouldRasterize = true
let layer2 = CALayer(layer: layer), layer3 = CALayer(layer: layer), layer4 = CALayer(layer: layer)

layer.shadowOffset = CGSize(width: 0, height: 4)
layer.shadowRadius = 1

layer2.shadowOffset = CGSize(width: 0, height: 10)
layer2.shadowRadius = 10
layer2.shadowColor = UIColor.black.cgColor
layer2.shouldRasterize = true // Evidently not copied during initialization from self.layer

layer3.shadowOffset = CGSize(width: 0, height: 2)
layer3.shadowRadius = 4
layer3.shouldRasterize = true

layer4.shadowOffset = CGSize(width: 0, height: 1)
layer4.shadowRadius = 1
layer4.shadowOpacity = 0.1
layer4.shouldRasterize = true

layer.addSublayer(layer2)
layer.addSublayer(layer3)
layer.addSublayer(layer4)
(While this code is in Swift, I trust that it looks familiar enough to most Cocoa/Objective-C developers for it to make sense. Just know that layer is equivalent to self.layer in this context.)
The problem, however, arises when I attempt to use different opacities for each shadow. The shadowOpacity property of layer ends up being applied to all of its sublayers. This is a problem, as I need each of them to have its own shadow opacity. I have tried setting each layer's shadow opacity to its correct value (0.04, 0.12, etc.), but then the opacity of 0.04 on layer is applied to all sublayers. So I tried setting layer.shadowOpacity to 1.0, but that made all the shadows solid black. I also tried to be clever and do layer2.shadowColor = UIColor(red: 0, green: 0, blue: 0, alpha: 0.12).cgColor, but it was just rendered as full black with no transparency.
I suppose it makes some sort of sense that the layers should all have the same shadow opacity. But what's a way to get this working, varying opacities and all (doesn't have to utilize CALayer if it's easier another way)?
Please don't answer with "just use an image": no matter how sane that may be, I'm trying to avoid it. Just humor me.
Thanks.
EDIT: As per request, here's what I'm after:
The key thing that needs to be added is setting the layers' shadowPath. By default, the shadow is drawn around the layer's visible content, but in your code neither backgroundColor nor bounds is set for the layers, so the layers are effectively empty.
Assuming you have a UIView subclass, you can make it work by adding something like this:
override func layoutSubviews() {
    super.layoutSubviews()
    layer.sublayers?.forEach { sublayer in
        sublayer.shadowPath = UIBezierPath(rect: bounds).cgPath
    }
}
I tested this approach on a view with multiple shadows and it worked as expected once shadowPath was defined for the shadow layers. Different shadow colors and opacities worked as well, but keep in mind that upper layers in the hierarchy will overlap the layers behind them, so if the front layer has a thick shadow, the other shadows can be hidden by it.
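For completeness, here is a hedged sketch (inside a UIView subclass) of building each shadow as its own sublayer with its own opacity; the offsets, radii and the 0.04/0.12 opacities come from the question, everything else is a placeholder:

// Call this once bounds are known (e.g. from layoutSubviews), so frame and shadowPath are valid.
func addShadowSublayer(offsetY: CGFloat, blur: CGFloat, opacity: Float) {
    let shadowLayer = CALayer()
    shadowLayer.frame = bounds
    shadowLayer.shadowColor = UIColor.black.cgColor
    shadowLayer.shadowOffset = CGSize(width: 0, height: offsetY)
    shadowLayer.shadowRadius = blur
    shadowLayer.shadowOpacity = opacity
    // With shadowPath set, the shadow is drawn around this path even though the layer has no contents.
    shadowLayer.shadowPath = UIBezierPath(roundedRect: bounds, cornerRadius: layer.cornerRadius).cgPath
    layer.insertSublayer(shadowLayer, at: 0) // keep the shadow layers behind any real content
}

// For example, the three offset shadows from the spec, each with its own opacity:
addShadowSublayer(offsetY: 4, blur: 1, opacity: 0.04)
addShadowSublayer(offsetY: 10, blur: 10, opacity: 0.12)
addShadowSublayer(offsetY: 2, blur: 4, opacity: 0.08) // placeholder opacity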
What about adding the alpha to the shadow color instead of the layer shadow opacity?
i.e. instead of
layer.shadowColor = UIColor.black.cgColor
layer.shadowOpacity = 0.5
do
layer.shadowColor = UIColor.black.withAlphaComponent(0.5).cgColor
for each layer.
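Applied to the layers from the question, that looks something like the following; note each layer also needs shadowOpacity set to 1 so only the color's alpha controls the transparency (0.04 and 0.12 come from the question, the other values are placeholders):

[layer, layer2, layer3, layer4].forEach { $0.shadowOpacity = 1 }

layer.shadowColor  = UIColor.black.withAlphaComponent(0.04).cgColor
layer2.shadowColor = UIColor.black.withAlphaComponent(0.12).cgColor
layer3.shadowColor = UIColor.black.withAlphaComponent(0.08).cgColor // placeholder
layer4.shadowColor = UIColor.black.withAlphaComponent(0.10).cgColor // placeholder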