I was trying to sample the default colors used by the Reminders app in iOS 7 and use them in my own app.
What I do is use Digital Color Meter (in Generic RGB mode) to sample the RGB values, then put those values into my app to construct a UIColor. I then tweak the values until Digital Color Meter reports the same set of RGB values for my app as it does for the iOS 7 Reminders app.
However, it seems I cannot reconstruct some of the colors using UIColor. For example, when reconstructing the blue color, I already have 0 for the R value in my UIColor, yet Digital Color Meter never reads a value smaller than 36, which is still larger than the R value of 30 it reads for the blue in the Reminders app.
Does UIColor not cover the entire gamut of colors? Does Apple use some special hidden API to access a wider range of colors?
I'm not entirely sure what the problem is here. I took a screenshot of the Reminders app, and measured the blue color as 26, 173, 248 in Decimal mode. Then I went to my app's storyboard and set the view background color to 26, 173, 248 using the sliders. I ran my app on the phone and took a screenshot of it. Digital Color Meter reads 26/173/248 on my app as I would expect.
If you're using UIColor, you can use Percentage mode on Digital Color Meter to give you the correct values. For me, that blue color reads as 10%, 68%, 97%, so I used this code:
[UIColor colorWithRed:.1f green:.68f blue:.97f alpha:1];
I set that as my view background color in code, and when running it I can't see any difference between it and the background I set using the sliders in the storyboard.
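If you prefer to work from Decimal mode readings instead of percentages, a minimal sketch (using the 26/173/248 values measured above) is simply to divide each channel by 255:
// 26, 173, 248 as read in Digital Color Meter's Decimal mode
[UIColor colorWithRed:26.0f/255.0f green:173.0f/255.0f blue:248.0f/255.0f alpha:1.0f];
Both forms describe the same color; the percentage values are just the 0-255 readings divided by 255 and rounded.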
Related
I'm facing quite a weird color issue with Interface Builder in Xcode.
I've set the background of my view in my view controller to #1F242C:
So far so good.
Then I create a UITableViewCell in a nib file where I set the background to the exact same color #1F242C:
Now when I start the app, the background of the UITableViewCell is suddenly different from the view controller's view background, even though the RGB values and the opacity match.
When I analyze a screenshot in Photoshop:
The view's background is: #29303A
and the cell's background is: #1F242C
Why are they different colors despite the RGB values being the same? I don't change the colors programmatically in my source code.
Here's how the different colors look:
This is an issue caused by iOS 10 and the extended color gamut of newer device screens.
Click the wheel to the right of "RGB Sliders" in your color picker and choose sRGB IEC61966-2.1.
You will see the RGB values change at this point. Put them back to the RGB values you previously entered. That should be it!
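If you also set this color in code, a minimal sketch (assuming the #1F242C value from the question) that stays in the sRGB interpretation used by +colorWithRed:green:blue:alpha: would be:
// #1F242C expressed as sRGB components
[UIColor colorWithRed:0x1F/255.0f green:0x24/255.0f blue:0x2C/255.0f alpha:1.0f];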
I have some colored images and text of the same color on an iPhone app screen. The images show as expected, but when I use the same RGB color code on the text, it appears different in the finished app (mostly a little darker). Why is Xcode changing the color I tell it to use? How do I get the text color to exactly match the color codes in the images?
Update: Here's a screenshot. I use the color picker inside Xcode to pick the color from the image, so it shouldn't matter which color space I'm using (I'm using the same one for both text and image), yet it still looks different.
This is a very common problem with mismatched colors:
1) Try using Digital Color Meter (the native Mac application) to compare the values.
2) Try turning off the text shadow on your label, if you are using one.
3) If none of that helps, there is no single right way to resolve this; if you want the text color to match the image, you have to correct it manually in your code or in Interface Builder (see the sketch below).
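For point 3, a minimal sketch of correcting the color manually in code (the label outlet and the sampled RGB values here are only placeholders):
// Hypothetical values sampled from the image asset; replace with your own.
self.titleLabel.textColor = [UIColor colorWithRed:221.0f/255.0f green:86.0f/255.0f blue:59.0f/255.0f alpha:1.0f];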
I cannot find an article or document describing which color space should be used for RGB values when initializing an instance of the UIColor class.
The article Getting the right colors in your iOS app says we should use Generic RGB. On the other hand, I have found several posts saying that we should definitely use sRGB on iOS.
It seems the default color space is sRGB, as stated in the CGColorSpace Reference.
There is a new color space called "Display P3" used on the iPad Pro and iPhone 7. The profiles of existing image resources have to be converted to Display P3. In the Digital Color Meter app, the P3 profile has to be selected in order to get "Display P3" RGB values. See the screenshot.
TL;DR
iOS uses Extended sRGB as its default.
Detailed Explanation
This can easily be verified via the debugger in Xcode:
put a breakpoint anywhere in your code,
when lldb comes up, type po UIColor.red,
the above command will return UIExtendedSRGBColorSpace 1 0 0 1 as of iOS 11, Xcode 9.2.
This color space can use the same sRGB values without a color change but at the same time permits RGB values to go negative as well as larger than 1.0 to express colors outside the sRGB color space (i.e. the extended sRGB color space).
There is a WWDC video, Working with Wide Color, from 2016 that explains this phenomenon nicely (go to 8:35).
So if you use sRGB colors in your app, you should be good to go even on devices with wide-gamut (extended sRGB) displays. Of course, if you need one of those specific colors outside the sRGB gamut, you will need to use the extended sRGB color space.
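As a rough illustration of that extended range (these particular component values are made up, chosen only to fall outside 0.0-1.0):
// When linked against iOS 10 or later, components outside 0.0-1.0 are kept
// and interpreted in the extended sRGB color space rather than clamped.
UIColor *wideRed = [UIColor colorWithRed:1.25f green:-0.05f blue:0.0f alpha:1.0f];
NSLog(@"%@", wideRed); // prints something like "UIExtendedSRGBColorSpace 1.25 -0.05 0 1"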
I wrote a blog post that explains this in detail:
http://www.vsanthanam.com/writing/2017/7/6/colors-in-ios-ensuring-consistency-between-designs-interface-builder-and-uicolor
Essentially, while Sketch uses a custom color picker that assumes sRGB color components, Xcode's color picker does not.
UIColor's most common factory method, +colorWithRed:green:blue:alpha:, assumes the sRGB color space (extended sRGB when linked against iOS 10 or later), but it also has another factory method, +colorWithDisplayP3Red:green:blue:alpha:, which takes Display P3 RGBA components but stores them internally as translated values referenced to the sRGB color space.
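A quick sketch of the difference between the two factory methods (the component values are arbitrary; the logged output is approximate):
// The same nominal components, interpreted in two different color spaces.
UIColor *srgbGreen = [UIColor colorWithRed:0.0f green:1.0f blue:0.0f alpha:1.0f]; // (extended) sRGB
UIColor *p3Green = [UIColor colorWithDisplayP3Red:0.0f green:1.0f blue:0.0f alpha:1.0f]; // Display P3
// The P3 color is stored as extended sRGB components, so logging it shows
// values outside 0.0-1.0, roughly "UIExtendedSRGBColorSpace -0.51 1.02 -0.31 1".
NSLog(@"%@", p3Green);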
As always, take a look at Apple's documentation on UIColor for more info:
https://developer.apple.com/documentation/uikit/uicolor?changes=latest_minor
Interestingly enough, UIColor's AppKit counterpart, NSColor, offers much more flexibility, with more available factory methods (including one designed with a custom color space specifically to mimic UIColor's +colorWithRed:green:blue:alpha: behavior). I'm not sure why UIColor is more limited.
I need to change the color of the sliced icons used in my app to different colors. I don't want to keep adding more and more images, i.e. one for each color. I want to change only the color of the image, not other properties such as shape, size, layers, or opacity. Is that possible?
If you construct your icons correctly, you can use the Hue Adjust CIFilter (CIHueAdjust). You just need to change the inputAngle to get the color you want. For example, if your icon is mostly blue, you can make a mostly yellow one by setting inputAngle to π radians (the filter's angle is specified in radians), i.e. a 180° hue rotation.
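A minimal sketch of that approach with Core Image (the asset name "blueIcon" is a placeholder):
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

// Shift the hue of an icon by pi radians (a 180 degree hue rotation).
UIImage *icon = [UIImage imageNamed:@"blueIcon"]; // placeholder asset name
CIImage *input = [CIImage imageWithCGImage:icon.CGImage];
CIFilter *hueAdjust = [CIFilter filterWithName:@"CIHueAdjust"];
[hueAdjust setValue:input forKey:kCIInputImageKey];
[hueAdjust setValue:@(M_PI) forKey:kCIInputAngleKey];
CIImage *output = hueAdjust.outputImage;
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];
UIImage *recoloredIcon = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
Only the hue is rotated; the icon's shape and transparency are preserved.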
The problem: I set a color in Interface Builder by entering RGB channel values, then take a screenshot of the running window, open it in Photoshop, and check the color I set in Interface Builder with the color picker. The result: the RGB values are different from the ones I set.
Video:
http://www.youtube.com/watch?v=ASLfnYHPbqM
The strangest part starts at the 45th second, when I try to use Interface Builder's own color picker. It shows RGB values different from the ones I originally entered and set, but the values from Interface Builder's color picker do match the values from Photoshop's color picker.
Apple thinks it is much more important that colors look the same everywhere than that colors have the same RGB values everywhere. See, the same RGB values will not look the same on different screens, because every screen has different display characteristics.
So when you take a screenshot, Apple does not just store an RGB value for every pixel in the image; it also stores the display characteristics of your monitor inside the image file. What is that good for? If someone else opens your screenshot, the system can look at the monitor characteristics of the person who created it, compare them to the characteristics of the monitor it is being viewed on, and then calculate how it must adapt the RGB values so that the image looks the same on the current monitor. If it just displayed the RGB values without doing any of this, the image colors might look wrong (in some cases only slightly, in some cases more, and if the user has a very bad monitor, in some cases even radically wrong).
So the system sends different RGB values to the graphics adapter, because RGB values by themselves do not really describe a color. RGB values together with a monitor profile do describe a color, and it is not the RGB values that are important; the color is. If I make something red, I want it to be the same shade of red on every monitor. I don't want it to be a darker red on one monitor, a lighter red on another monitor, and a red that is almost pink on a third monitor.
The problem with Photoshop is that it has its own built-in color correction mechanism. Photoshop usually works in sRGB (the standard RGB color space) or Adobe RGB (an extended color space invented by Adobe). When you load an image that is not in the desired color space, Photoshop will transform the image's color space, and every color space transform causes the RGB values to change. Please note that the images displayed within Photoshop are still color corrected according to your current monitor's color characteristics; it is just the RGB values you manipulate within Photoshop that are in another color space, and when the file is stored back, Photoshop will either transform the values back or keep them and embed a new color profile into the image file.
The days when RGB alone was used to describe colors have been over for many years. Today RGB is rather meaningless on its own; only when combined with a color profile does it become really meaningful in describing an actual color.
If you want a screenshot without a real color profile embedded, do the following:
Open "System Preferences"
Go to "Displays"
Go to "Color"
Select "Generic RGB Profile"
Make your screenshot
Change your profile back
It may be necessary to quit Xcode before you change the profile (and restart it after you change it back), since I'm not sure whether changing the profile has immediate effect. It certainly has an immediate effect on how things are displayed on your monitor, but if you want to choose a color in Xcode by entering specific RGB values, I'm not sure whether a profile change takes immediate effect there as well (you can give it a try without restarting; if that does not work, repeat the steps with a restart).
This may still cause incorrect colors in Photoshop, though, since Photoshop may still convert the colors to sRGB or Adobe RGB. So instead of selecting "Generic RGB Profile", you may want to select the "sRGB" profile (depending on your OS X version, it may be named "sRGB IEC61966-2.1"). This way the image is already in the sRGB color space. You then only have to convince Photoshop to keep it that way (and not convert it to Adobe RGB), and then you will really see the same RGB values in Xcode and Photoshop.
Notice how changing the color profile makes your display look quite different? You think that color change is dramatic? Trust me, that is nothing compared to how much different monitors may sometimes change the colors. Maybe now you can understand why color correction is so important. So the question is, why are the RGB values so important to you in the first place? Does it really matter that the RGB values are the same, as long as the displayed color is the same?
This is the guide that works; just follow these steps and you can set UIColors programmatically and have them match the colors from a screenshot.