UIButton image saturation changes - iOS

In the storyboard I have a button with an image (not a background image):
But when I launch the app, the image saturation changes =(
How can I resolve this issue? I want to keep the image brightness and saturation the same as in the storyboard. Thanks for any answers.

There are two possible explanations:
Be sure your image is in RGB mode and not CMYK. iOS devices do not handle CMYK images correctly. (In Photoshop, check the "Image" -> "Mode" menu.)
(This is the most likely explanation.) This is a result of the limited color range of different screens. Different monitors and screens differ in the range of colors they are capable of displaying. So in your case, your computer's monitor may be able to display a wider range of greens than the iPad (or vice versa). We have encountered this most often with greens and oranges. You'll just have to experiment to see which color gets you closest to where you want to be.
Here is a very good explanation of this phenomenon:
http://www.cambridgeincolour.com/tutorials/color-spaces.htm
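If you want to check at run time which color model the loaded image actually has, a minimal sketch (the asset name "buttonImage" is only an assumed placeholder) could look like this:

    import UIKit

    // Sketch: verify that a bundled image was decoded as RGB rather than CMYK.
    // "buttonImage" is an assumed asset name used only for illustration.
    if let image = UIImage(named: "buttonImage"),
       let model = image.cgImage?.colorSpace?.model {
        switch model {
        case .rgb:
            print("Image is RGB - nothing to fix here")
        case .cmyk:
            print("Image is CMYK - re-export it as RGB")
        default:
            print("Other color model: \(model.rawValue)")
        }
    }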

Related

Color gamut in Xcode 8

Today I installed Xcode 8 (beta) and have been exploring Storyboards. We can now set the background and tintColor for different traits. That's good news.
But with a trait collection (for example, any height x any width) there is also a selection of gamuts.
Here is the screenshot
When I searched for "gamut," I found that it is related to color.
I have tried different combinations of gamuts, but I am unable to see any difference.
The documentation is also not helpful.
The question is: how do developers benefit from this feature?
Developers can benefit from it because it gives much greater control over your app's color profile. You can explicitly assign the color to display depending on the device's gamut.
A solid understanding of gamut is key here. Devices will distort "untagged" colors, that is, colors outside of their gamut. The P3 gamut has a more extensive range of displayable colors than the sRGB gamut. This graph should give you a good idea of exactly how much more extensive it is:
So if you create your designs on a monitor with a P3 gamut, say a Cinema Display, your colors may display differently on a device with an sRGB gamut. However, it's entirely possible there will be no change in the color if you pick a color that falls within both gamuts.
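As a small illustration (a sketch, not from the original answer; the component values are placeholders), you can create a wide-gamut color explicitly and branch on the device's display gamut at run time (iOS 10 and later):

    import UIKit

    // Sketch: pick a color depending on the device's display gamut.
    // The component values below are illustrative placeholders.
    func brandRed(for traits: UITraitCollection) -> UIColor {
        switch traits.displayGamut {
        case .P3:
            // Wide-gamut displays can show this more saturated red.
            return UIColor(displayP3Red: 1.0, green: 0.1, blue: 0.1, alpha: 1.0)
        default:
            // Fallback; this initializer is interpreted as (extended) sRGB on iOS 10+.
            return UIColor(red: 0.9, green: 0.15, blue: 0.15, alpha: 1.0)
        }
    }

Asset catalog colors with separate sRGB and P3 variants do the same thing declaratively, which appears to be what the gamut selection in the storyboard is for.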

iOS change specific color brightness/lightness in UIImage

I have been trying to recreate a filter effect from GIMP (a Photoshop-like editor) in an iPhone application.
Here is an example which shows the tool:
https://drive.google.com/file/d/0B5dHxpdDwpPeYnk5WU81X0RQTlU/view?usp=sharing
https://drive.google.com/file/d/0B5dHxpdDwpPeUF8zMFptY3RNUmM/view?usp=sharing
So from what I understand, the lightness changes the brightness of each specific color (which is not achieved by changing the brightness of the image in general).
Now the question is: how do I do that to a UIImage?
I tried changing the HSL/HSB of the whole image, but that doesn't give me the right effect.
Thank you for your time
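One possible direction (a sketch, not a confirmed answer; the function name, hue range, and brightness delta are assumptions for illustration) is a CIColorCube lookup table that only shifts the brightness of pixels whose hue falls inside a target range and leaves every other color untouched:

    import UIKit
    import CoreImage

    // Sketch: adjust brightness only for pixels whose hue lies in [hueMin, hueMax] (0...1).
    func adjustBrightness(of image: UIImage,
                          hueMin: CGFloat, hueMax: CGFloat,
                          brightnessDelta: CGFloat) -> UIImage? {
        let dimension = 32
        var cube = [Float]()
        cube.reserveCapacity(dimension * dimension * dimension * 4)

        // Build the lookup table: blue varies slowest, red fastest (CIColorCube layout).
        for b in 0..<dimension {
            for g in 0..<dimension {
                for r in 0..<dimension {
                    let red   = CGFloat(r) / CGFloat(dimension - 1)
                    let green = CGFloat(g) / CGFloat(dimension - 1)
                    let blue  = CGFloat(b) / CGFloat(dimension - 1)

                    var hue: CGFloat = 0, sat: CGFloat = 0, bri: CGFloat = 0, alpha: CGFloat = 0
                    UIColor(red: red, green: green, blue: blue, alpha: 1)
                        .getHue(&hue, saturation: &sat, brightness: &bri, alpha: &alpha)

                    // Only touch the brightness of colors inside the target hue range.
                    if hue >= hueMin && hue <= hueMax {
                        bri = min(max(bri + brightnessDelta, 0), 1)
                    }
                    var nr: CGFloat = 0, ng: CGFloat = 0, nb: CGFloat = 0
                    UIColor(hue: hue, saturation: sat, brightness: bri, alpha: 1)
                        .getRed(&nr, green: &ng, blue: &nb, alpha: &alpha)
                    cube.append(contentsOf: [Float(nr), Float(ng), Float(nb), 1])
                }
            }
        }

        let data = cube.withUnsafeBufferPointer { Data(buffer: $0) }
        guard let input = CIImage(image: image),
              let filter = CIFilter(name: "CIColorCube") else { return nil }
        filter.setValue(dimension, forKey: "inputCubeDimension")
        filter.setValue(data, forKey: "inputCubeData")
        filter.setValue(input, forKey: kCIInputImageKey)

        guard let output = filter.outputImage,
              let cgImage = CIContext().createCGImage(output, from: output.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }

For example, adjustBrightness(of: photo, hueMin: 0.55, hueMax: 0.75, brightnessDelta: -0.2) would darken only the blue-ish range; a larger cube dimension gives smoother transitions at the cost of a bigger table.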

Color selected part of image taken from camera on touch

My requirement is to fill a specific color on a specific area of an image, where the image is taken with the iPhone camera or from the photo gallery. For example, I could take a picture of myself wearing a blue shirt, and the app should allow me to change the color of the shirt to red.
Exactly the functionality of the "Paint bucket" tool in Photoshop.
I found a couple of approaches:
1) Using MASKS with prepared images
color selected part of image on touch
Fill color on specific portion of image?
Scanline Flood Fill Algorithm
https://github.com/Chintan-Dave/UIImageScanlineFloodfill
2) Using GLPaint (actually, this is NOT the solution I am after)
My question is,
Is it possible to color a specific area of an image WITHOUT having masks, or by generating masks for the image at run time?
The Scanline Flood Fill Algorithm does that to a certain level, but when it comes to real-world images (like selfies), will it work correctly?
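For reference, the core of the flood-fill idea the question mentions looks roughly like the sketch below (a tolerance-based fill over an RGBA bitmap; the helper names, the default tolerance, and the seed handling are assumptions, not code from the linked repository). On camera photos the tolerance has to be fairly generous, and soft edges, shadows, and noise are exactly what makes the result look wrong on selfies:

    import UIKit

    // Sketch: tolerance-based flood fill over an RGBA bitmap.
    // The seed point is expected in bitmap pixel coordinates.
    func floodFill(_ image: UIImage, from seed: (x: Int, y: Int),
                   with fill: (r: UInt8, g: UInt8, b: UInt8),
                   tolerance: Int = 40) -> UIImage? {
        guard let cg = image.cgImage else { return nil }
        let width = cg.width, height = cg.height

        guard let ctx = CGContext(data: nil, width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: 0,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return nil }
        ctx.draw(cg, in: CGRect(x: 0, y: 0, width: width, height: height))
        guard let base = ctx.data?.assumingMemoryBound(to: UInt8.self) else { return nil }
        let bytesPerRow = ctx.bytesPerRow

        func offset(_ x: Int, _ y: Int) -> Int { y * bytesPerRow + x * 4 }
        let s = offset(seed.x, seed.y)
        let target = (base[s], base[s + 1], base[s + 2])

        // A pixel belongs to the region if each channel is within tolerance of the seed color.
        func matches(_ o: Int) -> Bool {
            abs(Int(base[o]) - Int(target.0)) <= tolerance &&
            abs(Int(base[o + 1]) - Int(target.1)) <= tolerance &&
            abs(Int(base[o + 2]) - Int(target.2)) <= tolerance
        }

        var stack = [(seed.x, seed.y)]
        var visited = Set<Int>()
        while let (x, y) = stack.popLast() {
            guard x >= 0, x < width, y >= 0, y < height else { continue }
            let o = offset(x, y)
            if visited.contains(o) || !matches(o) { continue }
            visited.insert(o)
            base[o] = fill.r; base[o + 1] = fill.g; base[o + 2] = fill.b
            stack.append(contentsOf: [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
        }

        guard let outCG = ctx.makeImage() else { return nil }
        return UIImage(cgImage: outCG, scale: image.scale, orientation: image.imageOrientation)
    }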

CMYK vs sRGB - which one is better

I have two images, one with the CMYK color model and the other with sRGB. I would like to find out which color model is better to use for image processing like resizing, cropping, color filling, etc.
Thanks in advance!
The CMYK color space is used for print; (s)RGB is used for screens (web, monitors, TVs, etc.). If one were to open a CMYK document in a viewer/program that doesn't support the color profile (which is not uncommon, since CMYK isn't as widely supported as RGB), the colors would appear extremely over-saturated. If you are altering the images for use on the web, or in an application, I would highly recommend that you use some variant of RGB.
In short, neither is really better than the other in general, it all depends on where you will be using the images (apples and oranges, comes to mind). CMYK is better for print, and (s)RGB is better for screens.
UPDATE in response to OP's comment:
Just to be clear (forgive me if you already know this): color spaces/profiles do not affect the resolution of an image; they only affect how the colors are handled/encoded. Resolution is only affected by file dimensions, DPI/PPI (dots/pixels per inch), and compression.
UPDATE 2 in response to OP's comment:
I'm not familiar with ImageMagick, but in general I can tell you that I've converted thousands of documents from CMYK to RGB (and vice versa) and never noticed any degradation in quality when the file is viewed in a program that supports the color profile. The only exception is when converting to CMYK FROM RGB: it is possible to lose a bit of vibrancy, due to the fact that CMYK is a smaller color space. Like I said before, if by "quality" you mean "resolution," the color profile won't affect it; the image won't suddenly lose clarity when switching color profiles. Let me know if you have any other questions.
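On iOS, a practical equivalent of that CMYK-to-RGB conversion (a minimal sketch, not part of the answer above; the function name is an assumed placeholder) is simply redrawing the decoded image into an sRGB bitmap context:

    import UIKit

    // Sketch: redraw an image into an sRGB bitmap so the working copy is RGB,
    // regardless of the color model the source file used.
    func convertedToSRGB(_ image: UIImage) -> UIImage? {
        guard let cg = image.cgImage,
              let srgb = CGColorSpace(name: CGColorSpace.sRGB),
              let ctx = CGContext(data: nil, width: cg.width, height: cg.height,
                                  bitsPerComponent: 8, bytesPerRow: 0,
                                  space: srgb,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return nil }
        ctx.draw(cg, in: CGRect(x: 0, y: 0, width: cg.width, height: cg.height))
        return ctx.makeImage().map { UIImage(cgImage: $0) }
    }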
Neither is "better" for these purposes. In general you should use the color space/model that best aligns with your output device, like CMYK for paper and sRGB for screen, but for manipulations (such as resize, crop) they are the same.
The best is sRGB because it is larger and the colors are better preserved. Adobe RGB is even better. But remember that if you want to distribute your graphic image, you have to know what kind of device will be used to print/view it. You may also use Adobe RGB, but when you print it you'll be sad, because many colors of Adobe RGB cannot be reproduced on paper.
Cheers

Wrong color in Interface Builder

The problem: I set a color in Interface Builder by setting the RGB channels, then I take a screenshot of the working window, open it in Photoshop, and check the color I set in Interface Builder with the color picker. The result: the RGB values are different from the ones I set.
Video:
http://www.youtube.com/watch?v=ASLfnYHPbqM
The strangest part begins at the 45th second, when I try to use Interface Builder's own color picker. It shows RGB values that are different from the ones I set in that very same Interface Builder, but the values from IB's color picker do coincide with the values from Photoshop's color picker.
Apple thinks it is much more important that colors look the same everywhere than that colors have the same RGB values everywhere. See, the same RGB values will not look the same on different screens, because every screen has different display characteristics.
So when you take a screenshot, Apple does not just store an RGB value for every pixel in the image; they also store the display characteristics of your monitor inside the image file. What is that good for? If someone else opens your screenshot, the system can look at the monitor characteristics of the person who created it, compare them to the characteristics of the monitor of the person who wants to view it, and then calculate how it must adapt the RGB values in the image so that the image looks the same on the current monitor. If it just displayed the RGB values without doing any of this, the image colors might look wrong (in some cases only slightly, in some cases more, and if the user has a very bad monitor, in some cases even radically wrong).
So the system sends different RGB values to the graphics adapter, because RGB values by themselves do not really describe a color. RGB values together with a monitor profile do describe a color, and it is not the RGB values that are important, the color is important. If I make something red, I want it to be the same shade of red on every monitor. I don't want it to be a darker red on one monitor, a lighter red on another, and a red that is almost pink on a third.
The problem with Photoshop is that it has its own built-in color correction mechanism. Photoshop usually works in sRGB (the standard RGB color space) or Adobe RGB (an extended color space Adobe invented). When you load an image that is not in the desired color space, Photoshop will transform the color space of the image, and every color space transform causes the RGB values to change. Please note that the images displayed within Photoshop are still color corrected according to your current monitor characteristics; it's just that the RGB values you manipulate within Photoshop are in another color space, and when the file is stored back, Photoshop will either transform the values back or keep them and embed a new color profile into the image file.
The times when RGB alone was used to describe colors have been over for many years. Today RGB is rather meaningless on its own; only when combined with a color profile does it become really meaningful in describing an actual color.
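As a small illustration of that point (a sketch with placeholder component values, not part of the original answer): the same numbers attached to two different profiles describe two different colors, and converting between profiles changes the numbers while preserving the color:

    import UIKit

    // Sketch: identical components, different color spaces, therefore different colors.
    let components: [CGFloat] = [1.0, 0.2, 0.2, 1.0] // placeholder values
    let srgb = CGColorSpace(name: CGColorSpace.sRGB)!
    let p3 = CGColorSpace(name: CGColorSpace.displayP3)!

    let colorInSRGB = CGColor(colorSpace: srgb, components: components)!
    let colorInP3 = CGColor(colorSpace: p3, components: components)!

    print(colorInSRGB.components ?? [])   // the sRGB numbers stay as given
    // Converting the P3 color into sRGB changes the component numbers,
    // because the actual color is preserved, not the numbers.
    if let converted = colorInP3.converted(to: srgb, intent: .defaultIntent, options: nil) {
        print(converted.components ?? [])
    }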
If you want a screenshot without a real color profile embedded, do the following:
Open "System Preferences"
Go to "Displays"
Go to "Color"
Select "Generic RGB Profile"
Make your screenshot
Change your profile back
It may be necessary to quit Xcode before you change the profile (and restart it after you change it back), since I'm not sure whether changing the profile has an immediate effect. It certainly has an immediate effect on how things are displayed on your monitor, but if you want to choose a color in Xcode by selecting specific RGB values, I'm not sure whether a profile change takes immediate effect there as well (you can try it without restarting; if that does not work, repeat it with a restart).
This may still cause incorrect colors in Photoshop, though, since Photoshop may still convert the colors to sRGB or Adobe RGB. So instead of selecting "Generic RGB Profile", you may want to select the "sRGB" profile (depending on your OS X version, it may also be named "sRGB IEC61966-2.1"). This way the image is already in the sRGB color space. You only have to convince Photoshop to keep it that way (and not convert it to Adobe RGB), and then you will really see the same RGB values in Xcode and Photoshop.
Notice how changing the color profile makes your display look quite different? You think that color change is dramatic? Trust me, that is nothing compared to how much different monitors may sometimes change the colors. Maybe you can now understand why color correction is so important. So the question is: why are the RGB values so important to you in the first place? Does it really matter that the RGB values are the same, as long as the displayed color is the same?
This is the guide that works; just follow these steps and you can set UIColors programmatically and have them match the colors from a screenshot.
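As a small follow-up sketch (the component values are placeholders for whatever you read from your sRGB screenshot): on iOS 10 and later the plain UIColor(red:green:blue:alpha:) initializer is interpreted in the extended sRGB color space, so building the color from the sRGB values you measured keeps Xcode and Photoshop in agreement:

    import UIKit

    // Sketch: 52/120/246 stand in for the sRGB values read from the screenshot in Photoshop.
    let pickedColor = UIColor(red: 52.0 / 255.0,
                              green: 120.0 / 255.0,
                              blue: 246.0 / 255.0,
                              alpha: 1.0) // interpreted as (extended) sRGB on iOS 10+

    let view = UIView()
    view.backgroundColor = pickedColor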
