Is it possible to replicate the "White on Black" accessibility feature that the iPhone has from within an application?
I am trying to set up a toggle so that, when turned on, the application inverts all of its colors, essentially creating a "night mode".
You could use Core Image. There is a filter called CIColorInvert which inverts the colors of an image. You can find out more about the available filters in Apple's Core Image Filter Reference.
If you want some help getting started with Core Image, you could check out this tutorial by Ray Wenderlich: Beginning Core Image in iOS 5 Tutorial.
Hope this helps!
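A minimal sketch of how that could look, assuming you convert a `UIImage` round-trip through Core Image (the helper name is made up; hook it to your toggle however suits your app):

```swift
import UIKit
import CoreImage

// Invert the colors of a UIImage with the built-in CIColorInvert filter.
func inverted(_ image: UIImage) -> UIImage? {
    guard let ciImage = CIImage(image: image),
          let filter = CIFilter(name: "CIColorInvert") else { return nil }
    filter.setValue(ciImage, forKey: kCIInputImageKey)
    guard let output = filter.outputImage,
          let cgImage = CIContext().createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}

// Usage, e.g. in your toggle handler:
// imageView.image = nightModeOn ? inverted(original) : original
```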
I am looking to create a colour splash effect in Swift for my iOS app.
I was considering working with UIBezierPath to create some random splash shapes so that I could fill colour into those paths. Beyond that, my searching hasn't turned up any other approaches.
Could anyone help me out and point me in the right direction?
I would try the Diamond-square algorithm or a lazy flood fill.
Also take a look at other computer graphics algorithms:
https://en.wikipedia.org/wiki/Category:Computer_graphics_algorithms
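If the UIBezierPath idea from the question is enough for your needs, here is one rough sketch: walk around a circle and jitter the radius at each step to get an irregular blob. The function name, point count, and jitter range are all illustrative, not from the answer above:

```swift
import UIKit

// Build a random "splash" blob by jittering points around a circle.
func randomSplashPath(center: CGPoint, baseRadius: CGFloat, points: Int = 12) -> UIBezierPath {
    let path = UIBezierPath()
    for i in 0..<points {
        let angle = CGFloat(i) / CGFloat(points) * 2 * .pi
        let radius = baseRadius * CGFloat.random(in: 0.6...1.4) // jitter makes the edge irregular
        let point = CGPoint(x: center.x + cos(angle) * radius,
                            y: center.y + sin(angle) * radius)
        if i == 0 { path.move(to: point) } else { path.addLine(to: point) }
    }
    path.close()
    return path
}

// Usage inside draw(_:) of a UIView subclass:
// UIColor.red.setFill()
// randomSplashPath(center: CGPoint(x: 100, y: 100), baseRadius: 40).fill()
```

Replacing `addLine(to:)` with `addQuadCurve(to:controlPoint:)` would give smoother, more liquid-looking edges.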
The first image was taken with my phone. The one under it is a Photoshopped version of the first, where I adjusted some levels. Any idea how I can do this in Swift?
Welcome!
Since iOS 13, Core Image provides a CIDocumentEnhancer filter that does essentially what Notes does when scanning a document.
If you want more control, you can also use a simple CIColorControls filter to adjust contrast and brightness yourself.
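A sketch of that second approach; the contrast and brightness values below are guesses to tune by eye, not values from the answer:

```swift
import UIKit
import CoreImage

// "Levels-like" adjustment using CIColorControls.
func enhanceDocument(_ image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CIColorControls") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(1.5, forKey: kCIInputContrastKey)   // boost contrast
    filter.setValue(0.1, forKey: kCIInputBrightnessKey) // lift brightness slightly
    filter.setValue(0.0, forKey: kCIInputSaturationKey) // optional: desaturate to grayscale
    guard let output = filter.outputImage,
          let cgImage = CIContext().createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```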
Let's say I have an image with a few colors.
I would like to replace programmatically a specific existing color by a new one.
(something simple, no need to support gradients, like I saw elsewhere).
E.g. I have an image showing a green circle and I want to display it as a red circle (every pixel initially defined with a given (R, G, B) is displayed with a new (R, G, B)).
Any idea how to do that with the Apple iOS SDK (or open source)?
And by the way, what would be the best image file format to make this easier (PNG, JPG, ...)?
Thanks!
You should be able to do this using Core Image filters. The CIColorCube filter lets you map a source color range to destination colors, so you should be able to define a source color range and map it to different colors.
That's one CI filter I never figured out how to use, however. If you search for "Color Cube" in the Xcode help system, there is sample code that does a chroma-key effect, knocking out green shades to transparent. You should be able to adapt that to your needs.
I have a project on Github called CIFilterTest that shows how to use Core Image filters to process images. It's written as a general-purpose system that lets you try a wide variety of filters that use a standard set of parameters (points, colors, 1 or 2 source images, and floating-point values.) I never did take the time to generate the 3D color mapping "cube" that the color cube filter needs as input, so it doesn't allow you to use that particular filter. You'll have to look at the color Cube sample code in the Xcode docs to generate inputs for the Color Cube filter, but my sample app should help a great deal with the basic setup for doing CI based image processing.
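As a starting point, here is a hedged sketch of building a CIColorCube lookup table that swaps greenish pixels to red, loosely adapted from the idea in Apple's chroma-key sample. The "is green" test is deliberately crude and the thresholds are made up; a real mapping would work in HSV:

```swift
import UIKit
import CoreImage

// Remap greenish pixels to red using a CIColorCube lookup table.
func greenToRed(_ image: UIImage) -> UIImage? {
    let size = 32 // cube dimension
    var cube = [Float](repeating: 0, count: size * size * size * 4)
    var offset = 0
    for b in 0..<size {
        for g in 0..<size {
            for r in 0..<size {
                let rf = Float(r) / Float(size - 1)
                let gf = Float(g) / Float(size - 1)
                let bf = Float(b) / Float(size - 1)
                // Crude test: the green channel clearly dominates the others.
                let isGreen = gf > 0.4 && gf > rf * 1.5 && gf > bf * 1.5
                cube[offset]     = isGreen ? gf : rf // swap green into red
                cube[offset + 1] = isGreen ? rf : gf
                cube[offset + 2] = bf
                cube[offset + 3] = 1 // alpha unchanged
                offset += 4
            }
        }
    }
    let data = Data(bytes: cube, count: cube.count * MemoryLayout<Float>.size)
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CIColorCube") else { return nil }
    filter.setValue(size, forKey: "inputCubeDimension")
    filter.setValue(data, forKey: "inputCubeData")
    filter.setValue(input, forKey: kCIInputImageKey)
    guard let output = filter.outputImage,
          let cgImage = CIContext().createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```

As for file format: prefer PNG, since it is lossless; JPEG compression artifacts smear exact pixel colors, which makes per-color replacement unreliable.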
I answered a similar question here:
Replace particular color of image in iOS
In short: I would suggest using a Core Image filter.
I want to replace a particular color of an image with another, user-selected color, while maintaining the gradient effect of the original color. For example, see the attached images.
I have tried to do this with Core Graphics and succeeded in replacing the color, but the replacement color does not maintain the gradient effect of the original color in the image.
Can someone help me with this? Is Core Graphics the right way to do it?
Thanks in advance.
After struggling with almost the same problem (but with NSImage), I made a category for replacing colors in an NSImage that uses the CIColorCube filter:
https://github.com/braginets/NSImage-replace-color
It was inspired by this code for UIImage (which also uses CIColorCube):
https://github.com/vhbit/ColorCubeSample
I do a lot of color transfer/blending/replacement/swapping between images in my projects and have found the following publications, both by Erik Reinhard, very useful:
Color Transfer Between Images
Real-Time Color Blending of Rendered and Captured Video
Unfortunately I can't post any source code (or images) right now because the results are being submitted to an upcoming conference, but I have implemented variations of the above algorithms with very pleasing results. I'm sure with some tweaks (and a bit of patience) you might be able to get what you're after!
EDIT:
Furthermore, the real challenge will lie in separating the different picture elements (e.g. isolating the wall). This is not unlike Photoshop's magic wand tool which obviously requires a lot of processing power and complex algorithms (and is still not perfect).
Does anyone know how to replicate CIDarkenBlendMode (a Core Image filter) on iOS? I need to simulate old paper by combining two images.
Why do you need to replicate CIDarkenBlendMode? It has been a supported Core Image filter since iOS 5.0, so you can use it directly.
If you don't want to use Core Image for this, my GPUImage framework also has a darken blend mode in its GPUImageDarkenBlendFilter.
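Using the built-in filter directly might look like the sketch below (the image names and helper are placeholders):

```swift
import UIKit
import CoreImage

// Blend a photo over a paper texture with CIDarkenBlendMode:
// the darker of the two pixels wins, so the paper grain shows through.
func darkenBlend(foreground: UIImage, background: UIImage) -> UIImage? {
    guard let fg = CIImage(image: foreground),
          let bg = CIImage(image: background),
          let filter = CIFilter(name: "CIDarkenBlendMode") else { return nil }
    filter.setValue(fg, forKey: kCIInputImageKey)
    filter.setValue(bg, forKey: kCIInputBackgroundImageKey)
    guard let output = filter.outputImage,
          let cgImage = CIContext().createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}

// Usage:
// let aged = darkenBlend(foreground: photo, background: paperTexture)
```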