Gaussian blur on full screen - iOS

I want to blur the whole screen of my iOS app, and I can't use UIBlurEffect because I want to be able to control the blurriness. So I'm trying to use CIGaussianBlur, but I'm having trouble with the edges of the screen.
I'm taking a screenshot of the screen, and then running it through a CIFilter with CIGaussianBlur, converting the CIImage back to UIImage, and adding the new blurred image on top of the screen.
Here's my code:
let layer = UIApplication.sharedApplication().keyWindow?.layer
UIGraphicsBeginImageContext(view.frame.size)
layer!.renderInContext(UIGraphicsGetCurrentContext()!)
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
let blurRadius = 5
var ciimage: CIImage = CIImage(image: screenshot)!
var filter: CIFilter = CIFilter(name:"CIGaussianBlur")!
filter.setDefaults()
filter.setValue(ciimage, forKey: kCIInputImageKey)
filter.setValue(blurRadius, forKey: kCIInputRadiusKey)
let ciContext = CIContext(options: nil)
let result = filter.valueForKey(kCIOutputImageKey) as! CIImage!
let cgImage = ciContext.createCGImage(result, fromRect: view.frame)
let finalImage = UIImage(CGImage: cgImage)
let blurImageView = UIImageView(frame: view.frame)
blurImageView.image = finalImage
blurImageView.sizeToFit()
blurImageView.contentMode = .ScaleAspectFit
blurImageView.center = view.center
view.addSubview(blurImageView)
Here is what I see:
It looks almost right except at the edges. The blurriness seems to taper off within one blur radius of the edge. I tried playing with the context size but couldn't seem to make it work.
How can I make the blur go all the way to the edges?

This happens because the Gaussian blur filter samples pixels outside the edges of the image. Since there are no pixels there, you get this edge artifact. You can use the "CIAffineClamp" filter to "extend" your image infinitely in all directions.
Please see this answer https://stackoverflow.com/a/18138742/762779
I tried running your code with chained 'CIAffineClamp -> CIGaussianBlur' filters and got good results.
let layer = UIApplication.sharedApplication().keyWindow?.layer
UIGraphicsBeginImageContext(view.frame.size)
layer!.renderInContext(UIGraphicsGetCurrentContext()!)
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
let blurRadius = 5
let ciimage: CIImage = CIImage(image: screenshot)!
// Added "CIAffineClamp" filter
let affineClampFilter = CIFilter(name: "CIAffineClamp")!
affineClampFilter.setDefaults()
affineClampFilter.setValue(ciimage, forKey: kCIInputImageKey)
let resultClamp = affineClampFilter.valueForKey(kCIOutputImageKey)
// resultClamp is used as input for "CIGaussianBlur" filter
let filter: CIFilter = CIFilter(name:"CIGaussianBlur")!
filter.setDefaults()
filter.setValue(resultClamp, forKey: kCIInputImageKey)
filter.setValue(blurRadius, forKey: kCIInputRadiusKey)
let ciContext = CIContext(options: nil)
let result = filter.valueForKey(kCIOutputImageKey) as! CIImage
let cgImage = ciContext.createCGImage(result, fromRect: ciimage.extent) // changed from view.frame to ciimage.extent
let finalImage = UIImage(CGImage: cgImage)
let blurImageView = UIImageView(frame: view.frame)
blurImageView.image = finalImage
blurImageView.sizeToFit()
blurImageView.contentMode = .ScaleAspectFit
blurImageView.center = view.center
view.addSubview(blurImageView)
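As an aside (not part of the original answer): on iOS 11+ the same clamp -> blur -> crop chain can be written more concisely with CIImage's clampedToExtent(), applyingFilter(_:parameters:), and cropped(to:) helpers. A minimal sketch, assuming `screenshot` is the UIImage captured above:

```swift
import UIKit
import CoreImage

// Sketch: the same CIAffineClamp -> CIGaussianBlur -> crop chain in modern Swift.
// `screenshot` is assumed to be the UIImage rendered from the window layer.
func fullScreenBlur(of screenshot: UIImage, radius: Double) -> UIImage? {
    guard let input = CIImage(image: screenshot) else { return nil }
    let blurred = input
        .clampedToExtent()                          // extend edge pixels infinitely
        .applyingFilter("CIGaussianBlur",
                        parameters: [kCIInputRadiusKey: radius])
        .cropped(to: input.extent)                  // clip back to the original bounds
    let context = CIContext(options: nil)
    guard let cgImage = context.createCGImage(blurred, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```

The crop back to `input.extent` matters: after clamping, the blurred image's extent is infinite, so you must cut it down before rendering.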

Related

How to show the image in a way that the pixels are shown cascading from top to bottom?

I need to display a UIImage on the splash/launch screen such that the pixels look like they are cascading from top to bottom until the full image is shown. I do not have a snapshot of such an animation, but I have created the pixelated image as below:
guard let currentCGImage = UIImage(named: "back")?.cgImage else {
    return
}
let currentCIImage = CIImage(cgImage: currentCGImage)
let filter = CIFilter(name: "CIPixellate")
filter?.setValue(currentCIImage, forKey: kCIInputImageKey)
filter?.setValue(12, forKey: kCIInputScaleKey)
guard let outputImage = filter?.outputImage else { return }
let context = CIContext()
if let cgimg = context.createCGImage(outputImage, from: outputImage.extent) {
    let processedImage = UIImage(cgImage: cgimg)
    self.imgView.image = processedImage
}
Does anybody have any idea how to achieve such an animation?
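One possible approach (my own sketch, under assumptions, not from this thread): animate CIPixellate's inputScale down toward 1 on a timer, re-rendering each tick, so the image resolves from coarse blocks to full detail. Note this resolves the whole image at once rather than strictly top to bottom; a true top-to-bottom cascade would additionally need a moving mask. The starting scale, step size, and interval below are placeholder values:

```swift
import UIKit
import CoreImage

// Hypothetical sketch: step CIPixellate's inputScale down over time so the
// image appears to resolve from coarse pixel blocks into the full picture.
final class PixellateAnimator {
    private let context = CIContext()
    private let input: CIImage
    private var scale: CGFloat = 60          // starting block size (placeholder)
    private var timer: Timer?

    init?(image: UIImage) {
        guard let ciImage = CIImage(image: image) else { return nil }
        self.input = ciImage
    }

    func start(updating imageView: UIImageView) {
        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] t in
            guard let self = self else { return }
            self.scale = max(1, self.scale - 2)  // step toward full resolution
            let output = self.input.applyingFilter("CIPixellate",
                                                   parameters: [kCIInputScaleKey: self.scale])
            if let cg = self.context.createCGImage(output, from: self.input.extent) {
                imageView.image = UIImage(cgImage: cg)
            }
            if self.scale <= 1 { t.invalidate() }
        }
    }
}
```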

Applying CIGaussianBlur to UIImage not working properly

I want a blur effect to UIImage as slider value changes.
I am using the CIGaussianBlur filter to blur the image.
The code is as follows
func applyBlurFilter(aCIImage: CIImage, val: CGFloat) -> UIImage {
    let clampFilter = CIFilter(name: "CIAffineClamp")
    clampFilter?.setDefaults()
    clampFilter?.setValue(aCIImage, forKey: kCIInputImageKey)
    let blurFilter = CIFilter(name: "CIGaussianBlur")
    blurFilter?.setValue(clampFilter?.outputImage, forKey: kCIInputImageKey)
    blurFilter?.setValue(val, forKey: kCIInputRadiusKey)
    let rect = aCIImage.extent
    if let output = blurFilter?.outputImage {
        if let cgimg = self.context.createCGImage(output, from: rect) {
            let processedImage = UIImage(cgImage: cgimg)
            return processedImage
        }
    }
    return image ?? self.image
}
Note: I've also tried the below code using CICrop filter
func applyBlurFilter(beginImage: CIImage, value: Float) -> UIImage? {
    let currentFilter = CIFilter(name: "CIGaussianBlur")
    currentFilter?.setValue(beginImage, forKey: kCIInputImageKey)
    currentFilter?.setValue(value, forKey: kCIInputRadiusKey)
    let cropFilter = CIFilter(name: "CICrop")
    cropFilter?.setValue(currentFilter!.outputImage, forKey: kCIInputImageKey)
    cropFilter?.setValue(CIVector(cgRect: beginImage.extent), forKey: "inputRectangle")
    let output = cropFilter?.outputImage
    let context = CIContext(options: nil)
    let cgimg = context.createCGImage(output!, from: beginImage.extent)
    let processedImage = UIImage(cgImage: cgimg!)
    return processedImage
}
The code works perfectly with some images, but with bigger images, when the blur filter is applied, the image's right edge becomes transparent, which I don't want.
Note: I am running this on device
What am I doing wrong here, I have no idea
The image whose right edge becomes transparent
Result after applying GaussianBlur to the above image
Thanks!!
Well, you're doing something wrong somewhere. The absolute best advice I can give you in your career is to create a small test project to experiment with when you have such an issue - I've done this for 15 years in the Apple world, and it's been of enormous help.
I created a project here so you don't have to (this time). I downloaded the image, placed it in an image view, and it looked perfect (as expected). I then used your code (except that I had to create a context and guess at radius values) and ran it. The image looks perfect with a blur of 0, 5, 10, and 25.
Obviously the issue is something else you are doing. What I suggest is that you keep adding to the test project until you can find what step is the problem (context? other image processing?)
This is the entirety of my code:
class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let im1 = UIImage(named: "Image.jpg")!
        let cim = CIImage(image: im1)!
        let im2 = applyBlurFilter(aCIImage: cim, val: 25)
        let iv = UIImageView(image: im2)
        iv.contentMode = .scaleToFill
        self.view.addSubview(iv)
    }

    func applyBlurFilter(aCIImage: CIImage, val: CGFloat) -> UIImage {
        let clampFilter = CIFilter(name: "CIAffineClamp")
        clampFilter?.setDefaults()
        clampFilter?.setValue(aCIImage, forKey: kCIInputImageKey)
        let blurFilter = CIFilter(name: "CIGaussianBlur")
        blurFilter?.setValue(clampFilter?.outputImage, forKey: kCIInputImageKey)
        blurFilter?.setValue(val, forKey: kCIInputRadiusKey)
        let rect = aCIImage.extent
        if let output = blurFilter?.outputImage {
            let context = CIContext(options: nil)
            if let cgimg = context.createCGImage(output, from: rect) {
                let processedImage = UIImage(cgImage: cgimg)
                return processedImage
            }
        }
        fatalError()
    }
}

After applying CIFilter image got bigger no matter what I do

I have read up on the latest know-how about applying a CIFilter to a UIImage and displaying it back. However, my image still comes out bigger than the original, and I don't know why. Maybe I'm missing some scaling factor?
This is my screen with plain UIImageView and some image. The only constraints are center X/Y to superview.
Screenshot before applying filter
However I've added this code in viewDidLoad():
let ciContext = CIContext(options: nil)
let coreImage = CIImage(image: ledImageView.image!)
let filter = CIFilter(name: "CIExposureAdjust")
filter!.setValue(coreImage, forKey: kCIInputImageKey)
filter!.setValue(1.5, forKey: kCIInputEVKey)
let filteredImageData = filter?.outputImage as! CIImage
let filteredImageRef = ciContext.createCGImage(filteredImageData, from: filteredImageData.extent)
ledImageView.image = UIImage(cgImage: filteredImageRef!)
I get a result other than expected (yes, the filter is applied but size is broken). What did I do wrong?
Screenshot after applying filter
I found both the root cause of the issue and the solution. Apparently the final UIImage was lacking the scale and imageOrientation of the source. The original (source) image had scale == 3.0, while the image after processing came back with scale == 1.0.
Here is the proper source code for this:
let ciContext = CIContext(options: nil)
let coreImage = CIImage(image: sourceImageView.image!)
let srcScale = sourceImageView.image!.scale // <-- keep this value
let srcOrientation = sourceImageView.image!.imageOrientation // <-- keep this value
let filter = CIFilter(name: "CIExposureAdjust")
filter!.setValue(coreImage, forKey: kCIInputImageKey)
filter!.setValue(1.5, forKey: kCIInputEVKey)
let filteredImageData = filter?.outputImage as! CIImage
let filteredImageRef = ciContext.createCGImage(filteredImageData, from: filteredImageData.extent)
// use this constructor with scale/orientation values
ledImageView.image = UIImage(cgImage: filteredImageRef!, scale: srcScale, orientation: srcOrientation)
Now the result is as below :)
Fixed image on ViewController
That's strange.
What are the extents of the input and output images? Do they match?
You could try this
// crop the output image to the input image's extent
let croppedImage = filteredImageData.cropped(to: coreImage.extent)
let result = UIImage(ciImage: croppedImage)

Swift 3.0 CIEdgeWork filter not working

I am trying to apply the CIEdgeWork filter to my inputImage and place the filtered image into myImage (which is a UIImageView). I'm not getting any result (just a blank screen). This same style of code works with other filters like CIEdges. Anyone know what I'm doing wrong? Testing on iOS 10 devices.
let context = CIContext(options: nil)
if let edgeWorkFilter = CIFilter(name: "CIEdgeWork") {
    let beginImage = CIImage(image: inputImage)
    edgeWorkFilter.setValue(beginImage, forKey: kCIInputImageKey)
    edgeWorkFilter.setValue(3.0, forKey: kCIInputRadiusKey)
    if let output = edgeWorkFilter.outputImage {
        if let cgimg = context.createCGImage(output, from: output.extent) {
            let processedImage = UIImage(cgImage: cgimg)
            myImage.image = processedImage
        }
    }
}
Found the solution: set the background color of the image view to something other than white.
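For context (an assumption consistent with this fix, not stated in the thread): CIEdgeWork appears to render its strokes in white on a transparent background, so over a white view nothing is visible. A minimal sketch of the fix, where `myImage` is the UIImageView from the question:

```swift
// CIEdgeWork's output seems to be white-on-clear, so give the image view
// a contrasting backdrop before assigning the filtered image.
myImage.backgroundColor = .black   // anything other than white works
```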

UIImageView contentMode not working after blur effect application

I'm attempting to set the image property of a UIImageView to an image I'm blurring with CoreImage. The code works perfectly with an unfiltered image, but when I set the background image to the filtered image, contentMode appears to stop working for the UIImageView -- instead of aspect filling, the image becomes vertically stretched. In addition to setting contentMode in code, I also set it on the storyboard but the result was the same.
I'm using Swift 2 / Xcode 7.
func updateBackgroundImage(image: UIImage) {
backgroundImage.contentMode = .ScaleAspectFill
backgroundImage.layer.masksToBounds = true
backgroundImage.image = blurImage(image)
}
func blurImage(image: UIImage) -> UIImage {
let imageToBlur = CIImage(image: image)!
let blurfilter = CIFilter(name: "CIGaussianBlur")!
blurfilter.setValue(10, forKey: kCIInputRadiusKey)
blurfilter.setValue(imageToBlur, forKey: "inputImage")
let resultImage = blurfilter.valueForKey("outputImage") as! CIImage
let croppedImage: CIImage = resultImage.imageByCroppingToRect(CGRectMake(0, 0, imageToBlur.extent.size.width, imageToBlur.extent.size.height))
let blurredImage = UIImage(CIImage: croppedImage)
return blurredImage
}
Why is filtering with CIImage causing my image to ignore contentMode and how do I fix the issue?
This happens because UIImage(CIImage:) creates an image with no underlying CGImage bitmap, which UIImageView does not render or scale reliably. The solution is to replace your line:
let blurredImage = UIImage(CIImage: croppedImage)
with these 2 lines:
let context = CIContext(options: nil)
let blurredImage = UIImage(CGImage: context.createCGImage(croppedImage, fromRect: croppedImage.extent))
So your full blurImage function would look like this:
func blurImage(image: UIImage) -> UIImage {
    let imageToBlur = CIImage(image: image)!
    let blurfilter = CIFilter(name: "CIGaussianBlur")!
    blurfilter.setValue(10, forKey: kCIInputRadiusKey)
    blurfilter.setValue(imageToBlur, forKey: "inputImage")
    let resultImage = blurfilter.valueForKey("outputImage") as! CIImage
    let croppedImage: CIImage = resultImage.imageByCroppingToRect(CGRectMake(0, 0, imageToBlur.extent.size.width, imageToBlur.extent.size.height))
    let context = CIContext(options: nil)
    let blurredImage = UIImage(CGImage: context.createCGImage(croppedImage, fromRect: croppedImage.extent))
    return blurredImage
}
