Get class information from ObjCPropertyDecl - clang

I'm having some trouble extracting the class information from a clang ObjCPropertyDecl node.
Example Objective-C file:
#import <Foundation/Foundation.h>
@interface Test: NSObject
@property (nonatomic, strong, nullable) NSObject *test;
@property (nonatomic, strong, nullable) NSArray<NSObject *> *test1;
@end

@implementation Test
@end
Dumping the AST gives me this:
...
|-ObjCInterfaceDecl 0x104bbf8e0 <test.m:7:1, line:12:2> line:7:12 Test
| |-super ObjCInterface 0x103962618 'NSObject'
| |-ObjCImplementation 0x104bbff40 'Test'
| |-ObjCPropertyDecl 0x104bbfa20 <line:9:1, col:51> col:51 test 'NSObject * _Nullable':'NSObject *' readwrite nonatomic strong
| |-ObjCPropertyDecl 0x104bbfbe0 <line:10:1, col:62> col:62 test1 'NSArray<NSObject *> * _Nullable':'NSArray<NSObject *> *' readwrite nonatomic strong
| |-ObjCMethodDecl 0x104bbfc50 <line:9:51> col:51 implicit - test 'NSObject * _Nullable':'NSObject *'
| |-ObjCMethodDecl 0x104bbfcd8 <col:51> col:51 implicit - setTest: 'void'
| | `-ParmVarDecl 0x104bbfd60 <col:51> col:51 test 'NSObject * _Nullable':'NSObject *'
| |-ObjCMethodDecl 0x104bbfdc8 <line:10:62> col:62 implicit - test1 'NSArray<NSObject *> * _Nullable':'NSArray<NSObject *> *'
| `-ObjCMethodDecl 0x104bbfe50 <col:62> col:62 implicit - setTest1: 'void'
| `-ParmVarDecl 0x104bbfed8 <col:62> col:62 test1 'NSArray<NSObject *> * _Nullable':'NSArray<NSObject *> *'
...
Previously, I had an OCLint rule that relied on checking the type to see if it was an NSArray, so I'd use an ASTVisitor and this code:
string propertyType = node->getType().getAsString();
// compare to "NSArray *"
Note that both the nullable keyword and the generics in the code sample above change the qualified type (see the AST dump).
My question is: Is there a way I can get only the Objective-C class type from an ObjCPropertyDecl e.g. NSArray * or NSString * without any of the extra sugar?
I've tried getSplitDesugaredType(), and that works well for removing the nullability qualifiers, but it doesn't remove the generics.
Edit:
My current thinking is that I might be able to get the Type from the SplitQualType, cast it to ObjCObjectPointerType, and get the ObjCObjectType, which might have the information I want. I'll try implementing that tomorrow.

Ok, after much digging around, I found an acceptable solution to this:
std::string propertyType(clang::ObjCPropertyDecl *d) {
    QualType T = d->getType();
    if (auto TypePtr = T.getTypePtr()) {
        if (TypePtr->isObjCObjectPointerType()) {
            if (auto InterfaceType = TypePtr->getAsObjCInterfacePointerType()) {
                return InterfaceType->getObjectType()->getBaseType().getAsString();
            }
        }
    }
    return "";
}
Examples:
@property (nonatomic, strong, nullable) NSObject *test;
Returns: NSObject
@property (nonatomic, strong, nullable) NSArray<NSObject *> *test1;
Returns: NSArray
Note: This will only return a value for object pointer types; properties with scalar types like NSInteger will return an empty string.

Related

Sending 'double' to parameter of incompatible type 'id _Nullable'

So I am unable to understand this error.
I have an object with these properties:
@property (nonatomic, readwrite) double width;
@property (nonatomic, readwrite) double height;
When I am creating an NSDictionary and adding them to it:
if (config.width) {
    properties[@"width"] = config.width;
}
if (config.height) {
    properties[@"height"] = config.height;
}
I am getting the following error:
Sending 'double' to parameter of incompatible type 'id _Nullable'
If I am correct, type id is equivalent to Swift's Any?, which would include double. Also, I am only setting the value if it exists?
If I am correct, type id is equivalent to Swift's Any?, which would include double
But you're not correct. id is equivalent to any object (an NSObject subclass); double is not an object, it's a primitive.
Just in case anyone wants to know how I was able to fix it:
properties[@"height"] = [NSNumber numberWithDouble:config.height];
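Equivalently, the boxing literal @( ) wraps a primitive in an NSNumber. A minimal sketch of the corrected snippet, assuming the config and properties names from the question:
if (config.width) {
    properties[@"width"] = @(config.width);   // same as [NSNumber numberWithDouble:config.width]
}
if (config.height) {
    properties[@"height"] = @(config.height);
}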

#import "Project-Swift.h" comes some strange issue

The project mixes Objective-C and Swift.
The project was created by me in Objective-C, but almost all of it is written in Swift. In AppDelegate.m I want to use a Swift class, so I added #import "myProjectName-Swift.h" to AppDelegate.m.
Unfortunately, my project then reports some errors inside myProjectName-Swift.h:
The code is below:
SWIFT_CLASS_PROPERTY(@property (nonatomic, class, readonly, copy) NSString * _Nonnull ;)
+ (NSString * _Nonnull);
SWIFT_CLASS_PROPERTY(@property (nonatomic, class, readonly, copy) NSString * _Nonnull ;)
+ (NSString * _Nonnull);
SWIFT_CLASS_PROPERTY(@property (nonatomic, class, readonly, strong) UIColor * _Nonnull APP_COLOR;)
+ (UIColor * _Nonnull)APP_COLOR;
The errors are:
Expected ';' at end of declaration list.
Cannot declare variable inside @interface or @protocol
Expected member name or ';' after declaration specifiers
Expected ']'
Property requires fields to be named
Missing '[' at start of message send expression
And so on...
Attention:
In my global Swift file, I declared variables with Chinese-character names like this:
class Global: NSObject {
    static let 手机号码不能为空 = "手机号码不能为空"
}
EDIT - 1
Because I think the error points at the static let line, I commented that line out, and the project no longer shows the error. Or if I delete that declaration, the error also goes away.
Objective-C doesn't support Unicode in identifiers (variable names, class names, method names, etc.), only in string literals ("🏋"). Only a subset of ASCII is allowed: identifiers must start with an underscore or letter [_a-zA-Z], and may additionally contain digits [0-9] after the first character.
Example:
class MyClass : NSObject {
    func 😞() { }
}
This will break when you compile it in a mixed project. Instead you need to give it a valid Objective-C name, like so:
class MyClass : NSObject {
    @objc(unhappyFace)
    func 😞() { }
}
which will compile (and you need to access it as unhappyFace on the ObjC side).
Or in your example you need to modify your Global object:
class Global: NSObject {
    @objc(emptyNumberNotAllowed)
    static let 手机号码不能为空 = "手机号码不能为空"
    ... etc ...
}

Protocol subclassing and conformance

Given
@protocol Response <NSObject>
@end

@protocol DataResponse <Response>
@end

@interface ListUsersResponse : NSObject <DataResponse>
@end

@interface RequestExecutor : NSObject
+ (id<Response>)execute:(id<Request>)request receptor:(Class)model;
@end
ListUsersRequest * request = [self buildListUsersRequest];
ListUsersResponse * result = [RequestExecutor execute:request receptor:[ListUsersResponse class]];
I get an Initializing 'ListUsersResponse *__strong' with an expression of incompatible type 'id<Response>' error. Why is that? Can't the compiler detect protocol conformance? How do I solve it?
This warning appears because your RequestExecutor's execute: method returns an object of the general id<Response> type. The variable, however, is declared as ListUsersResponse *, so the compiler expects a more specific type (and it has no way of knowing whether that downcast is correct).
You can get rid of the warning by either:
a) Declaring your variable with the id<Response> type instead of the ListUsersResponse * type, like:
id<Response> result = [RequestExecutor execute:request receptor:[ListUsersResponse class]];
Or
b) Casting on the fly (if you are sure that at that point the object will be of the appropriate class):
ListUsersResponse *result = (ListUsersResponse *)[RequestExecutor execute:request receptor:[ListUsersResponse class]];
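If you choose (b) but want a runtime safety net, a possible variant (my own sketch, not part of the original answer) checks the class before downcasting:
id<Response> raw = [RequestExecutor execute:request receptor:[ListUsersResponse class]];
ListUsersResponse *result = nil;
// Only downcast when the returned object really is a ListUsersResponse.
if ([raw isKindOfClass:[ListUsersResponse class]]) {
    result = (ListUsersResponse *)raw;
}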

Bitmasking in Objective C

I'd like to learn bit masking. As far as I understand, it is a means to store multiple binary values in a single variable.
If the above assumption is true, I figured I could do something like this:
typedef NSUInteger Traits;
enum
{
    TraitsCharacterHonest = 0,
    TraitsCharacterOptimistic = 1,
    TraitsCharacterPolite = 4,
    TraitsCharacterDevious = 8,
    TraitsPhysicalTall = 16,
    TraitsPhysicalBeautiful = 32,
    TraitsPhysicalFat = 64,
    TraitsPhysicalBigEyes = 128,
    TraitsPhysicalRedHair = 256,
};
#import <Foundation/Foundation.h>
@interface Person : NSObject
@property (strong, nonatomic) NSString *name;
@property (assign, nonatomic) Traits *traits;
@end
Question 1: how do I assign multiple traits to one person?
Question 2: do I have to keep assigning ever-increasing numbers to the enum items, or is there a way to have that happen automatically?
Ultimately I want to achieve something like this:
Person *john = [[Person alloc] init];
//here code that assigns john three traits: TraitsCharacterHonest,
//TraitsCharacterOptimistic and TraitsPhysicalBeautiful.
If I understand it correctly, the value of john.traits should be 100011 in binary, reading from the right, with each place representing a particular enum value / trait: 0 meaning not having it and 1 meaning having it.
Can you please advise on the syntax and explain particular aspects where needed?
I'd recommend changing a few things:
The enum values can be changed to ones that are left-shifted by increasing amounts, which makes them a little easier to write, in my opinion.
You don't need to typedef to NSUInteger; you can declare an enum type directly using typedef enum.
And, as other people have mentioned, your property shouldn't be a pointer to a Traits type.
My code would look like this:
typedef enum
{
    TraitsCharacterHonest     = 1 << 0,
    TraitsCharacterOptimistic = 1 << 1,
    TraitsCharacterPolite     = 1 << 2,
    TraitsCharacterDevious    = 1 << 3,
    TraitsPhysicalTall        = 1 << 4,
    TraitsPhysicalBeautiful   = 1 << 5,
    TraitsPhysicalFat         = 1 << 6,
    TraitsPhysicalBigEyes     = 1 << 7,
    TraitsPhysicalRedHair     = 1 << 8
} Traits;
#import <Foundation/Foundation.h>
@interface Person : NSObject
@property (strong, nonatomic) NSString *name;
@property (assign, nonatomic) Traits traits;
@end
Setting John's traits will look like this:
Person *john = [[Person alloc] init];
john.traits = TraitsCharacterHonest | TraitsCharacterOptimistic | TraitsPhysicalBeautiful;
However, while bit masks are useful to learn, they're a real pain to debug. If you want to go and print this character's traits now, you'll have to write code like this:
NSMutableString *result = [NSMutableString string];
if (self.traits & TraitsCharacterHonest)
{
    [result appendString: @"Honest, "];
}
if (self.traits & TraitsCharacterOptimistic)
{
    [result appendString: @"Optimistic, "];
}
if (self.traits & TraitsCharacterPolite)
{
    [result appendString: @"Polite, "];
}
// etc...
Additionally, the syntax for operations like removing a trait is confusing; you'll have to use & with a NOT-ed constant:
// remove 'Tall' trait
john.traits = john.traits & ~TraitsPhysicalTall;
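If the raw operator syntax bothers you, one option (my own sketch, not from the original answer; the hasTrait:/addTrait:/removeTrait: names are hypothetical) is to wrap the operators in small helper methods on Person:
@implementation Person

// YES if the given trait bit is set.
- (BOOL)hasTrait:(Traits)trait {
    return (self.traits & trait) != 0;
}

// Turn a trait bit on.
- (void)addTrait:(Traits)trait {
    self.traits |= trait;
}

// Turn a trait bit off.
- (void)removeTrait:(Traits)trait {
    self.traits &= ~trait;
}

@end
With matching declarations in Person's @interface, removing the 'Tall' trait becomes [john removeTrait:TraitsPhysicalTall].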
If you can (and performance isn't too much of an issue), I'd prefer using a higher-level feature. Perhaps an NSSet with string constants? e.g.
__unused static NSString *TraitsCharacterHonest = @"TraitsCharacterHonest";
__unused static NSString *TraitsCharacterOptimistic = @"TraitsCharacterOptimistic";
__unused static NSString *TraitsCharacterPolite = @"TraitsCharacterPolite";
// etc...
@interface Person : NSObject
@property (strong, nonatomic) NSString *name;
@property (strong, nonatomic) NSMutableSet *traits;
@end
Then you can do:
// adding
[john.traits addObject: TraitsCharacterHonest];
// checking
[john.traits containsObject: TraitsCharacterHonest];
// removing
[john.traits removeObject: TraitsCharacterHonest];
Makes more sense to me. What's more, you can print the description of the traits directly with
NSLog(@"John's traits: %@", john.traits);
and you'll get reasonable output.
One issue you can run into is that the number of members representable by a bit mask is capped by the number of bits in the underlying data type. For instance, a 32-bit unsigned long has room for only 32 disjoint or different members; if you need to add a 33rd, you are out of luck unless you go to a 64-bit unsigned integer.
One workaround for this is to use an array of bytes. With this approach you specify bit membership as two pieces of data: the offset of the byte and the bit mask to use for the specific bit within that byte.
I have also seen people use byte arrays for single membership, so that the entire byte rather than one bit is used per member. It can be a waste of memory, but it may be more flexible and useful, and the amount of memory wasted may not be a problem.
For using an array of bytes to hold the set of bits, you might consider representing each member of the set as an unsigned long in which the least significant byte is the bit mask and the remaining bytes hold an unsigned 3-byte offset into the byte array. You would then do something like the following:
int getBitSet (unsigned char *bArray, unsigned long ulItem)
{
    unsigned long ulByteOffset = ((ulItem >> 8) & 0x00ffffff);
    unsigned char ucByteMask = (ulItem & 0x000000ff);
    return (*(bArray + ulByteOffset) & ucByteMask);
}

int setBitSet (unsigned char *bArray, unsigned long ulItem, unsigned long ulNewValue)
{
    unsigned char oldValue;
    unsigned long ulByteOffset = ((ulItem >> 8) & 0x00ffffff);
    unsigned char ucByteMask = (ulItem & 0x000000ff);

    oldValue = *(bArray + ulByteOffset) & ucByteMask;
    if (ulNewValue) {
        *(bArray + ulByteOffset) |= ucByteMask;   // set bit
    } else {
        *(bArray + ulByteOffset) &= ~ucByteMask;  // clear bit
    }
    return oldValue;
}
You could then have a set of functions to get and set the bits, or you could use macros. With C++ you can create your own class for this functionality and provide various logical operations as well, so that you can create sets of various kinds and then perform logical operations on the sets.
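To make the offset-plus-mask encoding concrete, here is a small usage sketch of my own (the MEMBER macro and the values in main are illustrative, not part of the answer above):
#include <string.h>

// Prototypes for the functions defined above.
int getBitSet(unsigned char *bArray, unsigned long ulItem);
int setBitSet(unsigned char *bArray, unsigned long ulItem, unsigned long ulNewValue);

// Hypothetical helper: pack a byte offset (upper 3 bytes) and a
// single-bit mask (low byte) into one unsigned long member constant.
#define MEMBER(byteOffset, bit) (((unsigned long)(byteOffset) << 8) | (1UL << (bit)))

int main(void)
{
    unsigned char flags[16];
    memset(flags, 0, sizeof flags);            // start with the empty set

    // Abstract member #42 lives in byte 42 / 8 = 5, bit 42 % 8 = 2.
    unsigned long member42 = MEMBER(5, 2);

    setBitSet(flags, member42, 1);             // add the member to the set
    int present = getBitSet(flags, member42);  // nonzero means present
    setBitSet(flags, member42, 0);             // remove it again

    return present ? 0 : 1;
}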
In iOS 6 and above, or Mac OS X 10.8 and above, you can do:
typedef NS_OPTIONS(NSUInteger, Traits) {
    TraitsCharacterHonest     = 1 << 0,
    TraitsCharacterOptimistic = 1 << 1,
    TraitsCharacterPolite     = 1 << 2,
    TraitsCharacterDevious    = 1 << 3,
    TraitsPhysicalTall        = 1 << 4,
    TraitsPhysicalBeautiful   = 1 << 5,
    TraitsPhysicalFat         = 1 << 6,
    TraitsPhysicalBigEyes     = 1 << 7,
    TraitsPhysicalRedHair     = 1 << 8
};
For more info, refer to http://nshipster.com/ns_enum-ns_options/
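With explicit bit values as above, usage looks the same as with the plain enum; a quick sketch:
Traits traits = TraitsCharacterHonest | TraitsPhysicalTall; // combine two flags
if (traits & TraitsPhysicalTall) {
    NSLog(@"tall");                                         // runs: the flag is set
}
traits &= ~TraitsPhysicalTall;                              // clear a flag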
Your major issue here is making traits a pointer. Drop the pointer, and do it like you would in C:
john.traits |= TraitsCharacterHonest | TraitsCharacterOptimistic | TraitsPhysicalBeautiful;
Remember that you only need pointers in a couple of situations in Objective-C:
When you are dealing with actual objects (derived from NSObject)
When you need to pass a primitive by reference (e.g. an int * argument through which a function returns a count), in which case you take the address of a local variable, and that pointer is not saved by the function.
When you need an array of primitive types, dynamically allocated on the heap (e.g. using malloc & friends).
Otherwise, just use a stack-allocated primitive type, as you can do a lot of things with it.
First of all, change:
...
typedef NSUInteger Traits;
enum
{
    TraitsCharacterHonest = 0, // can't be 0
    ...
};
...
@property (assign, nonatomic) Traits *traits; // no need to create a pointer to a primitive type
to:
...
typedef NSUInteger Traits;
enum
{
    TraitsCharacterHonest = 1,
    ...
};
...
@property (assign, nonatomic) Traits traits;
For assigning, you should do the following:
john.traits |= TraitsCharacterHonest | TraitsCharacterDevious;
Bitwise operations in Objective-C are the same as in the C language.
Check this tutorial Bitwise Operators in C and C++: A Tutorial
Assuming 1 << 8 is the same as binary 100000000:
john.traits = TraitsCharacterHonest | TraitsCharacterOptimistic | TraitsPhysicalBeautiful;
is exactly the same as (writing the values in binary):
john.traits = 000000001 | 000000010 | 000100000;
and the result is:
john.traits = 000100011
Now, when you want to check a conditional:
if (self.traits & TraitsCharacterHonest) { ... }
it is equivalent to:
if (000100011 & 000000001) { ... }
and result of that is:
if (000000001) { ... }
And this is actually 1; since any nonzero value is true, the whole conditional is true. Enjoy :-)

Incompatible integer to pointer conversion assigning to 'int *' from 'int'

I have yet another pesky warning I would like gone. Basically, I have an int declared like this: @property (nonatomic, assign) int *myInt; and set like this: myInt = 0;. It is also synthesized in the implementation file. I am getting a warning on the line where I set the int's value, and it says Incompatible integer to pointer conversion assigning to 'int *' from 'int'. What should I do to fix this?
There's a big hint in the error message!
In C and Objective C, an int is a primitive data type. You've written int *, which means "a pointer to an int", whereas it looks like you just wanted an int.
So change your property to this:
@property (nonatomic, assign) int myInt;
For more info, google "C pointers" and you'll find information like this: http://pw1.netcom.com/~tjensen/ptr/pointers.htm
Yes, simply remove the asterisk (*) from the declaration.
For example, declare BOOL isChecked instead of BOOL *isChecked.
int *myInt is a pointer to an int, not an int.
As others have told you, just remove the * and you will have a regular int.
Another way to resolve this is to simply not declare a pointer (drop the asterisk):
@interface ViewController : UIViewController {
    NSUInteger myObjcInt; // an unsigned (>= 0) integer, not a pointer
}
