I'd like to create an NSObject subclass which would hold all my values for a UIView. The problem: what's the right way to do that?
Using "extern" and class method combination?
Using "extern" and #define combination?
Using only #define on class methods?
UI objects (e.g. UIColor) can't be initialized using the "extern const" approach.
Writing a class method for each value seems like too much.
Macros are plain (no syntax coloring, etc.) and are declared only in the header file.
Isn't there a better solution that would hold all my ints, floats, and colors in the same place, and which is not a macro?
Why not macros?
For UIColor you can use:
#define RGBA_COLOR(r, g, b, a) [UIColor colorWithRed:((r)/255.0) green:((g)/255.0) blue:((b)/255.0) alpha:(a)]
#define MY_GREEN_COLOR RGBA_COLOR(60, 192, 174, 1.0)
And then you can use MY_GREEN_COLOR without problems:
UIColor *color = MY_GREEN_COLOR;
And the same works for ints, floats and so on:
#define MY_INT 83
I usually have a "Globals.h" file with all this stuff.
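If you do want to avoid macros, one common alternative is a constants file pair: extern const for the plain numeric values, plus a class method for the object values, since objects can't be compile-time constants. A minimal sketch, with all names invented for illustration:

// MYStyle.h
#import <UIKit/UIKit.h>

extern const CGFloat MYSideMargin;   // plain values work fine as extern const

@interface MYStyle : NSObject
+ (UIColor *)themeGreenColor;        // object values need a method instead
@end

// MYStyle.m
const CGFloat MYSideMargin = 10.0;

@implementation MYStyle
+ (UIColor *)themeGreenColor {
    // Built lazily, exactly once; this is what extern const can't do for objects.
    static UIColor *color;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        color = [UIColor colorWithRed:60/255.0 green:192/255.0 blue:174/255.0 alpha:1.0];
    });
    return color;
}
@end

You get code completion and syntax coloring, at the cost of slightly more typing than the #define approach.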
I have used typedef NS_ENUM to reorganise data constants in old code. Using an approach found here, every typedef is declared in a single .h file that can be imported into any class in the project. The content of the .h file is wrapped in an include guard. This works nicely for int values.
MYCharacterType.h
#ifndef MYCharacterType_h
#define MYCharacterType_h
typedef NS_ENUM(NSInteger, MARGIN)
{
    MARGIN_Top = 10,
    MARGIN_Side = 10,
    MARGIN_PanelBaseLine = 1
};
...
#endif /* MYCharacterType_h */
But Xcode complains when I try to include float values:
"Non-integral type 'NSNumber' is an invalid underlying type"
e.g.
typedef NS_ENUM(NSNumber, LINE_WIDTH) {
    LINE_WIDTH_Large = 1.5,
    LINE_WIDTH_Medium = 1.0,
    LINE_WIDTH_Small = 0.5,
    LINE_WIDTH_Hairline = 0.25
};
I get the same message whether I use NSValue or NSNumber, so I suspect typedef NS_ENUM is not the way to define float constants (or at least not the way I am using it).
The approach in this answer would only let me do what I have already organised in one file, but it does not offer a way to organise float values in the same file. Could someone please explain how to do this so all values are defined in one .h file regardless of their type? Thanks
SOLUTION
This was answered by rmaddy after I approached the question differently.
You can define several enums in one .h file; just add them one after another:
typedef NS_ENUM(NSInteger, MARGIN)
{
    MARGIN_Top = 10,
    MARGIN_Side = 10,
    MARGIN_PanelBaseLine = 1
};

typedef NS_ENUM(long, ENUM_2)
{
    ENUM_2_1 = 10,
    ENUM_2_2 = 20,
    ENUM_2_3 = 30,
};

typedef NS_ENUM(long, ENUM_3)
{
    ENUM_3_1 = 10,
    ENUM_3_2 = 20,
    ENUM_3_3 = 30,
};
// And so on as many as you want
As for your second question: enums can only use integral data types such as int, long, long long, unsigned int, short and so on. You can't use any non-integral type like float or double, nor any Objective-C object type.
You can map enum cases to float values like this: https://stackoverflow.com/a/8867169/1825618
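The idea, roughly sketched (the names here are illustrative, not taken from the linked answer): keep an integral enum for the names, and translate each case to its CGFloat value in one small function, so everything still lives in the same .h file.

#import <UIKit/UIKit.h>

typedef NS_ENUM(NSInteger, LineWidth) {
    LineWidthLarge,
    LineWidthMedium,
    LineWidthSmall,
    LineWidthHairline
};

static inline CGFloat LineWidthValue(LineWidth width) {
    switch (width) {
        case LineWidthLarge:    return 1.5;
        case LineWidthMedium:   return 1.0;
        case LineWidthSmall:    return 0.5;
        case LineWidthHairline: return 0.25;
    }
    return 1.0; // fallback for unexpected raw values
}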
How do I get a constant #define value in a Swift class?
I have created a Constant.h file in my project, where I created two constant values for the screen width and height.
Constant.h
#define SCREEN_WIDTH_SWIFT UIScreen.main.bounds.size.width
#define SCREEN_HEIGHT_SWIFT UIScreen.main.bounds.size.height
Now I want to access SCREEN_HEIGHT_SWIFT in my ViewController.swift class:
NSFontAttributeName: UIFont(name: "GillSans-Light", size: SCREEN_HEIGHT_SWIFT/40.5)!, NSForegroundColorAttributeName: UIColor.white
#define creates a C style compiler macro, not a constant. Swift doesn't support C compiler macros. You will need to use an actual constant.
Instead of using #define, use let and assign the values with =:
let SCREEN_WIDTH_SWIFT = UIScreen.main.bounds.size.width
let SCREEN_HEIGHT_SWIFT = UIScreen.main.bounds.size.height
Simply create a separate class to maintain all your constants.
Let's say AppConstats is your class; then create your constants like this:
static let SCREEN_WIDTH = UIScreen.main.bounds.size.width
Now wherever you want to access your constant, simply use it like this:
AppConstats.SCREEN_WIDTH
Can we get a hex color string from a UIImage?
In the method below, if I pass [UIColor redColor] it works, but if I pass
#define THEME_COLOR [UIColor colorWithPatternImage:[UIImage imageNamed:@"commonImg.png"]]
then it does not work.
+ (NSString *)hexValuesFromUIColor:(UIColor *)color {
    // Grayscale colors only have two components (white, alpha);
    // expand them to RGBA first.
    if (CGColorGetNumberOfComponents(color.CGColor) < 4) {
        const CGFloat *components = CGColorGetComponents(color.CGColor);
        color = [UIColor colorWithRed:components[0]
                                green:components[0]
                                 blue:components[0]
                                alpha:components[1]];
    }
    // Pattern colors (and other non-RGB color spaces) have no RGB
    // components to read, so fall back to white.
    if (CGColorSpaceGetModel(CGColorGetColorSpace(color.CGColor)) != kCGColorSpaceModelRGB) {
        return [NSString stringWithFormat:@"#FFFFFF"];
    }
    const CGFloat *components = CGColorGetComponents(color.CGColor);
    return [NSString stringWithFormat:@"#%02X%02X%02X",
            (int)(components[0] * 255.0),
            (int)(components[1] * 255.0),
            (int)(components[2] * 255.0)];
}
Are there any other methods that can directly get a hex color from a UIImage?
You can't access the raw data directly, but by getting the CGImage of this image you can access it. Reference Link
You can't do it directly from the UIImage, but you can render the image into a bitmap context, with a memory buffer you supply, then test the memory directly. That sounds more complex than it really is, but may still be more complex than you wanted to hear.
If you have Erica Sadun's iPhone Developer's Cookbook there's good coverage of it from page 54. I'd recommend the book overall, so worth getting that if you don't have it.
I arrived at almost exactly the same code independently, but hit one bug that looks like it may be in Sadun's code too. In the pointInside method the point and size values are floats and are multiplied together as floats before being cast to an int. This is fine if your coordinates are discrete values, but in my case I was supplying sub-pixel values, so the formula broke down. The fix is easy once you've identified the problem, of course - just cast each coordinate to an int before multiplying - so, in Sadun's case it would be:
long startByte = (((int)point.y * (int)size.width) + (int)point.x) * 4;
Also, Sadun's code, as well as my own, is only interested in alpha values, so we use 8-bit pixels that hold the alpha value only. Changing the CGBitmapContextCreate call should allow you to get actual colour values too (obviously, if you have more than 8 bits per pixel you will have to factor that into your pointInside formula too).
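For completeness, here is a rough sketch of the render-into-a-buffer approach for colour rather than just alpha; the helper name is invented, and it assumes a @1x image with the point given in top-left UIKit coordinates:

#import <UIKit/UIKit.h>

static NSString *HexStringForPixel(UIImage *image, CGPoint point) {
    // Draw the single pixel of interest into a 1x1 RGBA buffer.
    unsigned char pixel[4] = {0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
        kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    // Offset the draw so the requested pixel lands at (0, 0); Core Graphics
    // uses a bottom-left origin, hence the flipped y component.
    CGRect drawRect = CGRectMake(-point.x,
                                 -(image.size.height - 1 - point.y),
                                 image.size.width,
                                 image.size.height);
    CGContextDrawImage(context, drawRect, image.CGImage);
    CGContextRelease(context);

    // pixel[] now holds premultiplied RGBA; for fully opaque pixels this
    // is the plain colour value.
    return [NSString stringWithFormat:@"#%02X%02X%02X", pixel[0], pixel[1], pixel[2]];
}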
I'm working on a subclass of SKLabelNode allowing line-breaks through the use of the new-line character \n. It is coming along nicely and I am currently in the process of making sure it works for OSX as well as iOS before preparing a podspec.
On the surface it seems to be working, but one of my tests is failing when building for OSX, despite passing for iOS. It is somehow NSColorSpace related but this is uncharted territory for me. This is the method in the class that copies the color to the subnodes:
- (void)setFontColor:(SKColor *)fontColor {
    [super setFontColor:fontColor];
    self.propertyStateholderNode.fontColor = fontColor;
    for (SKLabelNode *subNode in self.subNodes) {
        subNode.fontColor = fontColor;
    }
    _fontColor = fontColor;
}
and this is the test that is failing (I've removed assertions for other properties, which pass on both platforms):
- (void)testThatSubnodesInheritsPropertiesFromParent {
    NORLabelNode *threeLineNode = [self nodeWithThreeSubNodes];
    threeLineNode.fontColor = [SKColor greenColor];
    for (SKLabelNode *subNode in threeLineNode.subNodes) {
        XCTAssertEqualObjects(threeLineNode.fontColor, subNode.fontColor, @"The subnodes should have the same fontColor as the parent.");
    }
}
Details of the failing test are as follows:
((threeLineNode.fontColor) equal to (subNode.fontColor)) failed: ("NSCalibratedRGBColorSpace 0 1 0 1") is not equal to ("NSDeviceRGBColorSpace 0 0.976895 0 1") ...
It is not clear to me at all how the nodes wind up with different colorspaces...
Since we know that the color spaces are different, I would try converting the colors to the same color space before comparing, using -colorUsingColorSpace: (which takes an NSColorSpace *). You can resolve them to one known color space, or convert one to the other's color space. Again, this only happens on the Mac, which is also the only place NSColorSpace is available.
A post on the CocoaBuilder mailing list talks about comparing the component values, but that is not ideal because the components alone do not distinguish RGB from grayscale.
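A minimal sketch of that comparison, reusing the names from the failing test (macOS only; the 0.05 tolerance is an arbitrary choice):

// Resolve both colors into one known color space before comparing.
NSColorSpace *space = [NSColorSpace genericRGBColorSpace];
NSColor *expected = [threeLineNode.fontColor colorUsingColorSpace:space];
NSColor *actual = [subNode.fontColor colorUsingColorSpace:space];

// Even in a common space the components can still differ slightly
// (0.976895 versus 1 in the failure above), so compare with a tolerance.
XCTAssertEqualWithAccuracy(expected.greenComponent, actual.greenComponent, 0.05);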
According to the Sprite Kit Function Reference SKColor is
A macro that expands to a platform-specific color class.
#if TARGET_OS_IPHONE
#define SKColor UIColor
#else
#define SKColor NSColor
#endif
As we can see, UIColor and NSColor have different color space convenience methods. For example, if you use colorWithWhite:alpha: you will get a color "in the device gray colorspace", and if you use colorWithHue:saturation:brightness:alpha: you will get a color "in the device RGB colorspace".
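To illustrate (macOS, with values invented for the example): these two conveniences produce colors in different color spaces, and NSColor equality compares the color space as well as the components, as the test failure above shows.

NSColor *gray = [NSColor colorWithDeviceWhite:0.5 alpha:1.0];                      // device gray color space
NSColor *rgb = [NSColor colorWithCalibratedRed:0.5 green:0.5 blue:0.5 alpha:1.0];  // calibrated RGB
// Convert with -colorUsingColorSpace: before comparing if you need
// a meaningful equality check across spaces.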
I have different sizes of pages. I want to use an enum to select the size of a page, something like this:
typedef NS_ENUM(CGSize, SizeType) {   // does not compile: CGSize is not an integer type
    MAXSIZE = CGSizeMake(640, 1196),
    MIDIUMSIZE = CGSizeMake(320, 590),
    MINSIZE = CGSizeMake(160, 280)
};
Is it possible? If not, what's the best way to do this? I need these shared values across my whole application.
An enum in C (and therefore in Objective-C) is a set of integer values, and that's why you cannot have CGSize values as members of it.
Instead, use constants. The best option is to look at what Apple does and mimic it.
If you take a look at CGGeometry.h you will find the definitions of various constants.
For instance, CGSizeZero is defined as
CG_EXTERN const CGSize CGSizeZero
CG_AVAILABLE_STARTING(__MAC_10_0, __IPHONE_2_0);
You can then do something similar by declaring a constant in your header
CG_EXTERN const CGSize kMaxSize;
and then defining it in the implementation
const CGSize kMaxSize = (CGSize){ 640, 1196 };
As a bonus you can also define a type synonym for CGSize, for instance:
typedef CGSize MySizeType;
and then use it for declaring both constants and variables, e.g.
CG_EXTERN const MySizeType kMaxSize;
...
@property (nonatomic) MySizeType aSize;
That doesn't change a thing from a technical point of view, but it's semantically nicer and it basically achieves the same purpose as a typedef enum (which is precisely to provide a convenient synonym for int).
As per the other answers, enums are basically integers, not structs.
You can just #define the values in a constants file:
#define MAXSIZE CGSizeMake(640, 1196)
#define MIDIUMSIZE CGSizeMake(320, 590)
#define MINSIZE CGSizeMake(160, 280)
though you might want to rename them for easier mnemonics, readability and auto-completion purposes, like:
#define PURPOSE_SIZE_MAX ...
#define PURPOSE_SIZE_MED ...
...
You cannot.
The enum type is a C type, and its underlying type must be an integer type; each member must have that same type as well.
You can use char, BOOL, int, uint, NSInteger and so on.
For constant floating-point values you will need to declare them one by one.
Structs also need to be defined one by one.
You can not use an enum for this. In Objective-C, enum is inherited from C, so it is implicitly converted to int.
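One workaround (a sketch with invented names, not from the answers above): keep an integral enum for the size names, and map each case to its CGSize in a small inline function, so call sites still read like an enum.

#import <UIKit/UIKit.h>

typedef NS_ENUM(NSInteger, PageSizeType) {
    PageSizeTypeMax,
    PageSizeTypeMedium,
    PageSizeTypeMin
};

static inline CGSize PageSizeForType(PageSizeType type) {
    switch (type) {
        case PageSizeTypeMax:    return CGSizeMake(640, 1196);
        case PageSizeTypeMedium: return CGSizeMake(320, 590);
        case PageSizeTypeMin:    return CGSizeMake(160, 280);
    }
    return CGSizeZero; // fallback for unexpected raw values
}

A call site then looks like: CGSize size = PageSizeForType(PageSizeTypeMedium);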