I understand that when we declare an enum like the one below, the values default to type "int":
enum{
category0,
category1
};
However, I am running into issues now that iOS supports 64-bit. The solution I am considering, to avoid changing a lot of my code, is to make these enum values default to "NSInteger" instead of "int". As I understand it, the system decides whether NSInteger is an int or a long depending on whether it is running on 32-bit or 64-bit.
I am having difficulty understanding enums, so I would appreciate your help on this.
Thanks
If I use typedef as suggested by the comments:
typedef NS_ENUM(NSInteger, Category){
category0,
category1
};
how should I use it? Normally, when I compare it with, say, a table view's indexPath.section, I do this:
if(indexpath.section == category0){
...
}
If I declare that "Category", do I need to use it? Sorry, I don't quite understand typedef.
Try
typedef enum{
category0,
category1
} Category;
or
typedef NS_ENUM(NSInteger, Category) {
category0,
category1
};
You can also explicitly assign integer values to your enum members:
typedef NS_ENUM(NSInteger, Category) {
category0 = 0,
category1 = 1,
category42 = 42
};
Then you can use them the same as an int:
if(indexpath.section == category0){
...
}
Related
I have started learning iOS development.
I want to use enum in my sample project.
I've declared the enum in sample.h as follows. I hope I've declared it correctly.
typedef enum{s=1,m,t,w,th,f,sa} days;
I want to use this in viewController.m. In viewController.h, I've imported sample.h.
I want to use the enum with a name like "days.sa". But in most of the code I found on Google, they say to create an instance variable in "sample.h" like
@interface Sample : NSObject
{
days d;
}
To use it that way, I would need to create and use an instance, but I don't want that.
I want to use it like
days.d or days.sa or days.th
How do I do that? This needs to be usable across the whole project. And
how do I create the enum as a class-level value instead of an instance variable?
In the enum you've created, s, m etc. are now available globally (i.e. to anything that imports sample.h). If you want the integer corresponding to Saturday, for example, it's just sa, not days.sa. I think you're confusing enums with structures.
For this reason, it's better to use more verbose names in your enum. Something like:
typedef enum
{
WeekdaySunday = 1,
WeekdayMonday,
WeekdayTuesday,
WeekdayWednesday,
WeekdayThursday,
WeekdayFriday,
WeekdaySaturday
} Weekday;
so e.g. WeekdayMonday is now just another way of writing 2 in your app, but it makes your code more readable and pre-defines the possible legal values for a variable of type Weekday.
The above is fine, but for better compiler support and to ensure the size of a Weekday, I'd recommend using NS_ENUM:
typedef NS_ENUM(NSInteger, Weekday)
{
WeekdaySunday = 1,
WeekdayMonday,
WeekdayTuesday,
WeekdayWednesday,
WeekdayThursday,
WeekdayFriday,
WeekdaySaturday
};
Hey, you use an enum like this; here is an example.
In the .h file, define the enum:
typedef enum{s=1,m,t,w,th,f,sa} days;
In the .m file, work with the enum values like this:
days d1 = f;
switch (d1) {
    case m:
    case t:
        NSLog(@"You like Tuesday");
        break;
    case w:
    case th:
        break;
    case f:
        NSLog(@"You like Friday");
        break;
    case sa:
        NSLog(@"You like Saturday");
        break;
    case s:
        NSLog(@"You like Sunday");
        break;
    default:
        break;
}
#import <Foundation/Foundation.h>
typedef enum{
s=1,m,t,w,th,f,sa
} days;
@interface weekday : NSObject
@property (nonatomic, assign) days day;
@end
@implementation weekday
@end
int main(int argc, const char * argv[])
{
@autoreleasepool {
weekday *sunDay=[[weekday alloc]init];
sunDay.day=s;
NSLog(@"Today is %d", sunDay.day);
}
return 0;
}
Creating an enum in Enumerations.h:
typedef enum
{
Atype = 1,
Btype,
Ctype,
Dtype,
Etype,
}type;
Wherever you want to use this enum, just import Enumerations.h, and you can use Atype without creating a type instance.
You can simply use NSLog(@"%@", @(Atype)).
I preprocessed the following code with clang in Xcode 5.
typedef NS_ENUM(NSInteger, MyStyle) {
MyStyleDefault,
MyStyleCustom
};
typedef NS_OPTIONS(NSInteger, MyOption) {
MyOption1 = 1 << 0,
MyOption2 = 1 << 1,
};
And got this.
typedef enum MyStyle : NSInteger MyStyle; enum MyStyle : NSInteger {
MyStyleDefault,
MyStyleCustom
};
typedef enum MyOption : NSInteger MyOption; enum MyOption : NSInteger {
MyOption1 = 1 << 0,
MyOption2 = 1 << 1,
};
I know NS_OPTIONS is for a bitmask, but is there any technical difference?
Or is this just a naming convention?
EDIT
According to the definition of NS_OPTIONS, it's probably for compiler compatibility (especially for the C++ compiler).
// In CFAvailability.h
// Enums and Options
#if (__cplusplus && __cplusplus >= 201103L && (__has_extension(cxx_strong_enums) || __has_feature(objc_fixed_enum))) || (!__cplusplus && __has_feature(objc_fixed_enum))
#define CF_ENUM(_type, _name) enum _name : _type _name; enum _name : _type
#if (__cplusplus)
#define CF_OPTIONS(_type, _name) _type _name; enum : _type
#else
#define CF_OPTIONS(_type, _name) enum _name : _type _name; enum _name : _type
#endif
#else
#define CF_ENUM(_type, _name) _type _name; enum
#define CF_OPTIONS(_type, _name) _type _name; enum
#endif
The __cplusplus value in clang is 199711, so I can't test exactly what this is for, though.
There's a basic difference between an enum and a bitmask (option). You use an enum to list exclusive states. A bitmask is used when several properties can apply at the same time.
In both cases you use integers, but you look at them differently. With an enum you look at the numerical value, with bitmasks you look at the individual bits.
typedef NS_ENUM(NSInteger, MyStyle) {
MyStyleDefault,
MyStyleCustom
};
This will only represent two states. You can check it simply by testing for equality:
switch (style){
case MyStyleDefault:
// int is 0
break;
case MyStyleCustom:
// int is 1
break;
}
A bitmask, by contrast, can represent several states at once. You check the individual bits with bitwise operators:
typedef NS_OPTIONS(NSInteger, MyOption) {
MyOption1 = 1 << 0, // bits: 0001
MyOption2 = 1 << 1, // bits: 0010
};
if (option & MyOption1){ // last bit is 1
// bits are 0001 or 0011
}
if (option & MyOption2){ // second to last bit is 1
// bits are 0010 or 0011
}
if ((option & MyOption1) && (option & MyOption2)){ // last two bits are 1
// bits are 0011
}
tl;dr An enum gives names to numbers. A bitmask gives names to bits.
The only major difference is that using the appropriate macro allows Code Sense (Xcode's code completion) to do type checking and code completion better. For example, NS_OPTIONS allows the compiler to make sure all the enums you | together are of the same type.
For further reading see: http://nshipster.com/ns_enum-ns_options/
Edit:
Now that Swift is coming, using NS_ENUM/NS_OPTIONS is highly recommended so that the enum can be correctly bridged to a Swift enum.
The only difference is to let developers using the values know whether it makes sense to use them in an OR'ed bitmask.
The compiler doesn't care which one you use though :)
I copied my answer from the question Objective-C Enumeration, NS_ENUM & NS_OPTIONS.
Since the user who asked that question hasn't been active for a long time, maybe you can suggest my answer for people who search and end up here.
BELOW IS THE ANSWER COPIED:
There is a difference between the two, beyond the different kinds of enumeration they suggest.
When compiled in Objective-C++ mode, they generate different code:
This is the original code:
typedef NS_OPTIONS(NSUInteger, MyOptionType) {
MyOptionType1 = 1 << 0,
MyOptionType2 = 1 << 1,
};
typedef NS_ENUM(NSUInteger, MyEnumType) {
MyEnumType1 = 1 << 0,
MyEnumType2 = 1 << 1,
};
This is the code after macro expansion when compiling as Objective-C:
typedef enum MyOptionType : NSUInteger MyOptionType; enum MyOptionType : NSUInteger {
MyOptionType1 = 1 << 0,
MyOptionType2 = 1 << 1,
};
typedef enum MyEnumType : NSUInteger MyEnumType; enum MyEnumType : NSUInteger {
MyEnumType1 = 1 << 0,
MyEnumType2 = 1 << 1,
};
This is the code after macro expansion when compiling as Objective-C++:
typedef NSUInteger MyOptionType; enum : NSUInteger {
MyOptionType1 = 1 << 0,
MyOptionType2 = 1 << 1,
};
typedef enum MyEnumType : NSUInteger MyEnumType; enum MyEnumType : NSUInteger {
MyEnumType1 = 1 << 0,
MyEnumType2 = 1 << 1,
};
See how NS_OPTIONS differs between the two modes?
HERE IS THE REASON:
C++11 added a new feature: you can declare the underlying type of your enumeration. Before that, the type holding the enumeration was chosen by the compiler according to the largest enumerator value.
So in C++11, since you can decide the size of your enumeration yourself, you can forward-declare enums without actually defining them, like this:
// forward declare MyEnumType
enum MyEnumType : NSInteger;
// use MyEnumType
enum MyEnumType aVar;
// actually define MyEnumType somewhere else
enum MyEnumType : NSInteger {
    MyEnumType1 = 1 << 1,
    MyEnumType2 = 1 << 2,
};
This feature is handy, and Objective-C adopted it, but it introduces a problem with bitwise calculations like this:
enum MyEnumType aVar = MyEnumType1 | MyEnumType2;
This code won't compile as C++/Objective-C++: aVar is of type MyEnumType, but MyEnumType1 | MyEnumType2 evaluates to the underlying integer type (NSInteger), and C++ forbids implicitly converting an integer back to an enum, so the assignment can't be performed without a type cast.
This is where we need NS_OPTIONS: under C++, NS_OPTIONS falls back to a pre-C++11-style enum, so there is no distinct MyEnumType type at all; MyEnumType is just another name for NSInteger, and code like
enum MyEnumType aVar = MyEnumType1 | MyEnumType2;
will compile, since it is assigning NSInteger to NSInteger.
This question uses CLLocationCoordinate2D as an example, but this applies to other structs as well, such as CGPoint (although ones like those are usually automatically included).
I want to use CLLocationCoordinate2D as a return value in a class method. If it were an object you could write the following at the top and it would be fine, as long as the .m file had a reference to CoreLocation.h
@class ClassName;
Is there an equivalent way of telling the compiler not to worry about the struct without re-declaring it or importing the header file into the class' header file?
I do not want to import CoreLocation.h into the header file, since every file that imports that header would then inherit CoreLocation.
Thanks
I'm not totally getting why you don't want to import CoreLocation, but CLLocationCoordinate2D is declared in CoreLocation.h. I'm not aware of anything like @class for structs, and I don't think it exists, since structs are C types.
What you can do is create your own class that wraps CLLocationCoordinate2D, or return it as an NSValue, or (why not?) a dictionary.
The easiest way to do this is to just use an object instead of the struct; then you can use the @class keyword. In this case, the CLLocation object works just fine. Alternatively, you can often use an NSDictionary in place of a struct, but an object is a bit easier to manage.
You return a struct like any other type. But you should be aware that when returning a struct you are returning a copy of the internal value as a temporary variable on the stack, unlike an Objective-C object, where you actually return a pointer.
The type you return MUST be a complete type. That means in your method declaration you need the definition of the struct; in your case, that means you need to include the header.
For example:
typedef struct MyStruct {
int a;
int b;
} MyStruct;
@interface MyClass : NSObject
+ (MyStruct) theStruct;
@end
@implementation MyClass
+ (MyStruct) theStruct {
    MyStruct s = {.a = 1, .b = 2};
    return s;
}
@end
int main(int argc, const char * argv[])
{
@autoreleasepool {
MyStruct s1 = [MyClass theStruct];
s1.a = 100;
s1.b = 100;
NSLog(@"s1 = {%d, %d}", s1.a, s1.b);
NSLog(@"MyStruct.theStruct = {%d, %d}", [MyClass theStruct].a, [MyClass theStruct].b);
[MyClass theStruct].a = 0; // has no effect!
}
return 0;
}
Prints:
s1 = {100, 100}
MyStruct.theStruct = {1, 2}
There is no straightforward way of doing that with a single keyword.
Here you can find why it is not straightforward; it states that it is not possible, which is somewhat true but not completely:
Forward declare a struct in Objective-C
And here is a workaround:
Forward declare structs in Objective C
Hope this will help.
I'd like to learn bit masking. As far as I understand, it is a means to store several binary flags in one variable.
If that assumption is true, I figured I could do something like this:
typedef NSUInteger Traits;
enum
{
TraitsCharacterHonest = 0,
TraitsCharacterOptimistic = 1,
TraitsCharacterPolite = 4,
TraitsCharacterDevious = 8,
TraitsPhysicalTall = 16,
TraitsPhysicalBeautiful = 32,
TraitsPhysicalFat = 64,
TraitsPhysicalBigEyes = 128,
TraitsPhysicalRedHair = 256,
};
#import <Foundation/Foundation.h>
@interface Person : NSObject
@property (strong, nonatomic) NSString *name;
@property (assign, nonatomic) Traits *traits;
@end
Question 1: how do I assign multiple traits to one person?
Question 2: do I have to keep giving ever-increasing numbers to the enum items, or is there a way to indicate this automatically?
Ultimately I want to achieve something like this:
Person *john = [[Person alloc] init];
//here code that assigns john three traits: TraitsCharacterHonest,
//TraitsCharacterOptimistic and TraitsPhysicalBeautiful.
If I understand it correctly, the value of john.traits should be 100011, reading from the right, with each place representing a particular enum value/trait: 0 meaning not having it and 1 meaning having it.
Can you please advise on the syntax and explain any aspects as needed?
I'd recommend changing a few things:
The enum values can be changed to one-left-shifted constants, which makes them a little easier to write, in my opinion.
You don't need to typedef to NSUInteger; you can declare an enum type directly using typedef enum.
And, as other people have mentioned, your property shouldn't be a pointer to a Traits value.
My code would look like this:
typedef enum
{
TraitsCharacterHonest = 1 << 0,
TraitsCharacterOptimistic = 1 << 1,
TraitsCharacterPolite = 1 << 2,
TraitsCharacterDevious = 1 << 3,
TraitsPhysicalTall = 1 << 4,
TraitsPhysicalBeautiful = 1 << 5,
TraitsPhysicalFat = 1 << 6,
TraitsPhysicalBigEyes = 1 << 7,
TraitsPhysicalRedHair = 1 << 8
} Traits;
#import <Foundation/Foundation.h>
@interface Person : NSObject
@property (strong, nonatomic) NSString *name;
@property (assign, nonatomic) Traits traits;
@end
Setting John's traits will look like this:
Person *john = [[Person alloc] init];
john.traits = TraitsCharacterHonest | TraitsCharacterOptimistic | TraitsPhysicalBeautiful;
However, while bit-fields are useful to learn, they're a real pain to debug. If you want to print this character's traits now, you'll have to write code like this:
NSMutableString *result = [NSMutableString string];
if (self.traits & TraitsCharacterHonest)
{
    [result appendString: @"Honest, "];
}
if (self.traits & TraitsCharacterOptimistic)
{
    [result appendString: @"Optimistic, "];
}
if (self.traits & TraitsCharacterPolite)
{
    [result appendString: @"Polite, "];
}
// etc...
Additionally, the syntax for operations like removing a trait is confusing; you'll have to use & with a NOT-ed constant:
// remove 'Tall' trait
john.traits = john.traits & ~TraitsPhysicalTall
If you can (and performance isn't too much of an issue), I'd prefer using a higher-level feature. Perhaps an NSSet with string constants? e.g.
__unused static NSString *TraitsCharacterHonest = @"TraitsCharacterHonest";
__unused static NSString *TraitsCharacterOptimistic = @"TraitsCharacterOptimistic";
__unused static NSString *TraitsCharacterPolite = @"TraitsCharacterPolite";
// etc...
@interface Person : NSObject
@property (strong, nonatomic) NSString *name;
@property (assign, nonatomic) NSMutableSet *traits;
@end
Then you can do:
// adding
[john.traits addObject: TraitsCharacterHonest];
// checking
[john.traits containsObject: TraitsCharacterHonest];
// removing
[john.traits removeObject: TraitsCharacterHonest];
Makes more sense to me. What's more, you can print the description of the traits directly with
NSLog(@"John's traits: %@", john.traits);
and you'll get reasonable output.
One issue you can run into is that the number of set members you can represent with a bit mask is capped by the number of bits in the underlying data type. For instance, an unsigned long of 32 bits has room for only 32 distinct members. If you need a 33rd, you are out of luck unless you move to a 64-bit unsigned integer.
One workaround for this is to use an array of bytes. With this approach you have to specify your bit membership as two pieces of data, the offset to the byte and the bit mask to use for the specific bit.
I have also seen people use byte arrays for single membership so that rather than one bit used, the entire byte is used. It can be a waste of memory but then it may be that it is more flexible and useful and the amount of memory wasted is not a problem.
To use an array of bytes to hold the set of bits, you might represent each member of the set as an unsigned long in which the least significant byte is the bit mask and the remaining three bytes are an unsigned byte offset into the array. You would then do something like the following:
int getBitSet (unsigned char *bArray, unsigned long ulItem)
{
unsigned long ulByteOffset = ((ulItem >> 8) & 0x00ffffff);
unsigned char ucByteMask = (ulItem & 0x000000ff);
return (*(bArray + ulByteOffset) & ucByteMask);
}
int setBitSet (unsigned char *bArray, unsigned long ulItem, unsigned long ulNewValue)
{
unsigned char oldValue;
unsigned long ulByteOffset = ((ulItem >> 8) & 0x00ffffff);
unsigned char ucByteMask = (ulItem & 0x000000ff);
oldValue = *(bArray + ulByteOffset) & ucByteMask;
if (ulNewValue) {
*(bArray + ulByteOffset) |= ucByteMask; // set bit
} else {
*(bArray + ulByteOffset) &= ~ucByteMask; // clear bit
}
return oldValue;
}
You could then have a set of functions or macros to get and set the bits. With C++ you could create your own class for this functionality and provide various kinds of logical operations as well, so that you can create sets of various kinds and perform logical operations on them.
In iOS 6 or above, Mac OS X 10.8 and above
You can do:
typedef NS_OPTIONS(NSUInteger, Traits) {
    TraitsCharacterHonest     = 1 << 0,
    TraitsCharacterOptimistic = 1 << 1,
    TraitsCharacterPolite     = 1 << 2,
    TraitsCharacterDevious    = 1 << 3,
    TraitsPhysicalTall        = 1 << 4,
    TraitsPhysicalBeautiful   = 1 << 5,
    TraitsPhysicalFat         = 1 << 6,
    TraitsPhysicalBigEyes     = 1 << 7,
    TraitsPhysicalRedHair     = 1 << 8
};
Note the explicit 1 << n values: without them the members would default to 0, 1, 2, ... and could not be combined as independent flags.
For more info, refer to http://nshipster.com/ns_enum-ns_options/
Your major issue here is making traits a pointer. Drop the pointer, and do it like you would in C:
john.traits |= TraitsCharacterHonest | TraitsCharacterOptimistic | TraitsPhysicalBeautiful;
Remember that you only need pointers in a couple of situations in Objective-C:
When you are dealing with actual objects (derived from NSObject).
When you need to pass a primitive by reference (e.g. an int * argument to a function that returns a count), in which case you take the address of a local variable, and that pointer is not saved by the function.
When you need an array of primitive types, dynamically allocated on the heap (e.g. using malloc and friends).
Otherwise, just use a stack-allocated primitive type, as you can do a lot of things with it.
First of all change:
...
typedef NSUInteger Traits;
enum
{
TraitsCharacterHonest = 0, // can't be 0
...
};
...
@property (assign, nonatomic) Traits *traits; // no need to create a pointer to a primitive type
to:
...
typedef NSUInteger Traits;
enum
{
TraitsCharacterHonest = 1,
...
};
...
@property (assign, nonatomic) Traits traits;
For assigning, you should do the following:
john.traits |= TraitsCharacterHonest | TraitsCharacterDevious;
Bitwise operations in Objective-C are the same as in the C language.
Check this tutorial: Bitwise Operators in C and C++: A Tutorial
Assuming:
1 << 8 is the same as 100000000 in binary:
john.traits = TraitsCharacterHonest | TraitsCharacterOptimistic | TraitsPhysicalBeautiful;
is exactly the same as:
john.traits = 000000001 | 000000010 | 000100000;
and the result is:
john.traits = 000100011
Now, when you want to check conditional:
if (self.traits & TraitsCharacterHonest) { ... }
it is equivalent to:
if (000100011 & 000000001) { ... }
and result of that is:
if (000000001) { ... }
And this is actually 1; any non-zero value is true, so the whole conditional is true. Enjoy :-)
iOS 5.0 SDK
I have a method that takes a parameter of a type that I defined. Let's call it 'Places'. The type was defined as the following:
typedef enum {
kBar = 0,
kRestaurant = 1,
kCafe = 2
} Places;
My method takes a parameter of type Places.
Based on the Places value passed in, I append the type to the URL:
ex: http://www.domain.com/place=1
However, the URL parameter cannot be a number; it has to be a string.
ex: http://www.domain.com/place=restaurant
I know enums cannot be strings, so I am trying to figure out the right approach for this. Do I keep a plist and then read the plist into a dictionary? Is there another way?
I would do something like:
typedef enum {
    PlaceTypeBar = 0,
    PlaceTypeRestaurant = 1,
    PlaceTypeCafe = 2
} PlaceType;
@interface PlaceTypeHelper : NSObject
+ (NSString *) stringForPlace:(PlaceType)place;
@end
@implementation PlaceTypeHelper
+ (NSString *) stringForPlace:(PlaceType)place {
    NSArray *places = [NSArray arrayWithObjects:@"Bar", @"Restaurant", @"Cafe", nil];
    return [places objectAtIndex:(NSUInteger)place];
}
@end
Heads up, I haven't tested the code yet.
There are a lot of different approaches you could take. Here's what I might do myself.
Assuming there's a finite and known number of values, you can write a simple function which returns the string for a given type:
NSString *StringForPlaceType(PlaceType thePlace) {
    switch (thePlace) {
        case kBar:
            return @"Bar";
        case kRestaurant:
            return @"Restaurant";
        case kCafe:
            return @"Cafe";
        default:
            return nil; // unknown value
    }
}
No need for an object or class unless you want the flexibility of dynamic values and such.