Sizeof const char* wrong value - iOS

NSString *lang = @"en";
const char* ar = [lang UTF8String];
int size_of_array = (sizeof ar) / (sizeof ar[0]);
Here size_of_array is equal to 4, because (sizeof ar) = 4 and sizeof ar[0] = 1.
Why? I think size_of_array should be 2.

sizeof ar will get the size of the type char *, which is a pointer and so takes 4 bytes in memory on a 32-bit platform (8 bytes on 64-bit). You want to get the length of the string, so use the function strlen instead of sizeof ar.
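A minimal sketch of the difference, using the same variables as the question:

NSString *lang = @"en";
const char *ar = [lang UTF8String];
// sizeof measures the pointer; strlen counts characters up to the null terminator
NSLog(@"sizeof: %zu, strlen: %zu", sizeof ar, strlen(ar)); // e.g. "sizeof: 4, strlen: 2" on 32-bit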

It isn't clear what you are trying to do.
Note that sizeof ar gives you the size of the pointer variable ar itself, not the length of the string it points to, and sizeof ar[0] gives you the size of a single element, i.e. one char. So you are dividing the size of a pointer (4 bytes here) by the size of one char (1 byte), which is why you get 4.
Are you trying to determine the memory size of the UTF-8 string you get back from lang?
If so, then you want
int size_of_array = strlen(ar) + 1;
That will give you the length of the string you get back, including the null terminator.

Related

Converting NSStrings to C chars and calling a C function from Objective-C

I'm in an Objective-C method with various NSStrings that I want to pass to a C function. The C function requires a struct object be malloc'd so that it can be passed in - this struct contains char fields. So the struct is defined like this:
struct libannotate_baseManual {
char *la_bm_code; // The base code for this manual (pointer to malloc'd memory)
char *la_bm_effectiveRevisionId; // The currently effective revision ID (pointer to malloc'd memory or null if none effective)
char **la_bm_revisionId; // The null-terminated list of revision IDs in the library for this manual (pointer to malloc'd array of pointers to malloc'd memory)
};
This struct is then used in the following C function definition:
void libannotate_setManualLibrary(struct libannotate_baseManual **library) { ..
So that's the function I need to call from Objective-C.
So I have various NSStrings that I basically want to pass in there, to represent the char fields - la_bm_code, la_bm_effectiveRevisionId, la_bm_revisionId. I could convert those to const char *s by using [NSString UTF8String], but I need char *s, not const char *s.
Also I need to do suitable malloc's for these fields, though apparently I don't need to worry about freeing the memory afterwards. C is not my strong point, though I know Objective-C well.
strdup() is your friend here, as it both malloc()s and strcpy()s for you in one simple step. Its memory is also released using free(), and it does your const char * to char * conversion for you!
NSString *code = ..., *effectiveRevId = ..., *revId = ...;
struct libannotate_baseManual *abm = malloc(sizeof(struct libannotate_baseManual));
abm->la_bm_code = strdup([code UTF8String]);
abm->la_bm_effectiveRevisionId = strdup([effectiveRevId UTF8String]);
const unsigned numRevIds = 1;
abm->la_bm_revisionId = malloc(sizeof(char *) * (numRevIds + 1)); // null-terminated array of char *
abm->la_bm_revisionId[0] = strdup([revId UTF8String]);
abm->la_bm_revisionId[1] = NULL;
const unsigned numAbms = 1;
struct libannotate_baseManual **abms = malloc(sizeof(struct libannotate_baseManual *) * (numAbms + 1));
abms[0] = abm;
abms[1] = NULL;
libannotate_setManualLibrary(abms);
Good luck, you'll need it. It's one of the worst interfaces I've ever seen.

Objective-C how to convert a keystroke to ASCII character code?

I need to find a way to convert an arbitrary character typed by a user into an ASCII representation to be sent to a network service. My current approach is to create a lookup dictionary and send the corresponding code. After creating this dictionary, I see that it is hard to maintain and determine if it is complete:
__asciiKeycodes[@"F1"] = @(112);
__asciiKeycodes[@"F2"] = @(113);
__asciiKeycodes[@"F3"] = @(114);
//...
__asciiKeycodes[@"a"] = @(97);
__asciiKeycodes[@"b"] = @(98);
__asciiKeycodes[@"c"] = @(99);
Is there a better way to get ASCII character code from an arbitrary key typed by a user (using standard 104 keyboard)?
Objective-C has the base C primitive data types, so there is a little trick you can do: store the keystroke in a char, then cast it to an int. The default conversion in C from a char to an int yields that char's ASCII value. Here's a quick example.
char character = 'a';
NSLog(@"a = %d", (int)character);
console output = a = 97
To go the other way around, cast an int to a char:
int asciiValue = 97;
NSLog(@"97 = %c", (char)asciiValue);
console output = 97 = a
Alternatively, you can do a direct conversion within initialization of your int or char and store it in a variable.
char asciiToCharOf97 = (char)97; //Stores 'a' in asciiToCharOf97
int charToAsciiOfA = (int)'a'; //Stores 97 in charToAsciiOfA
This seems to work for most keyboard keys, not sure about function keys and return key.
NSString* input = @"abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890!@#$%^&*()_+[]\\{}|;':\"\\,./<>?~ ";
for (int i = 0; i < input.length; i++)
{
NSLog(@"Found (at %i): %i", i, [input characterAtIndex:i]);
}
Use a stringWithFormat: call and pass the int values.
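For example, a quick sketch of that idea using the standard %c and %d format specifiers (variable names are just for illustration):

int asciiCode = (int)'a'; // 97
NSString *character = [NSString stringWithFormat:@"%c", asciiCode]; // @"a"
NSString *codeString = [NSString stringWithFormat:@"%d", asciiCode]; // @"97"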

Setting Integer value in Objective-C

I have recently started programming in iOS. I am going through a code snippet that declares the following variables:
int rc = 0X00;
sqlite3_stmt *pStmt = 0X00;
FMStatement *stat = 0X00;
BOOL abc = 0X00;
What does this mean? I read somewhere that setting a reference variable to 0X00 means setting it to NULL (in C). But what does it mean to set a BOOL variable and an int variable to 0X00?
I suggest you read up about the basics of programming languages, specifically C programming with pointers. Objective-C is a superset of C and follows many similar rules.
But to your question:
The 0x in front of the literal values in the code (0x00) specifies that the value is interpreted as hexadecimal rather than decimal. But 0x00 (hex) is the same as 0 (dec).
int rc = 0x00; //same as int rc = 0;
int is a primitive type in both Obj-C and C that specifies an integer; effectively you are initializing the variable. In C you should always initialize variables, otherwise they hold whatever garbage value happens to be at that memory location.
Therefore, examine this code:
int a;
int b = 0;
//a is NOT equal to b!
In C, the variable 'a' has not been initialized, and therefore it is not safe to assume that it will be initialized to 0. Always initialize your variables.
If you did a printf or an NSLog of the variable 'a', you would typically see some huge number that doesn't make sense (the exact behaviour is compiler dependent).
The same can be said for a BOOL, although setting a BOOL to 0 is the same as setting it to NO:
BOOL flag = 0; //The same as saying BOOL flag = NO;
Now for the final part of your code:
FMStatement *stat = 0X00;
Often in Objective-C if you are dealing with pointers and objects you need to initialise the pointer to point at some memory address. The actual memory address is usually determined by the stack/heap and you don't need to worry about that. But you do need to ensure that the pointer isn't pointing to the wrong location (known as a garbage pointer).
To do this, we simply set our pointer to nil. eg:
FMStatement *stat = nil; //This pointer is now safe, although memory still hasn't been allocated for it yet
This is usually taken care of for you though when you immediately allocate the memory for an object, therefore in this case you don't need to worry about initializing the pointer to nil:
FMStatement *stat = [[FMStatement alloc]init];
Like I said, I recommend you read about basic C programming - allocations, pointers, datatypes, initializing and so on. Once you have a grasp of this, move to Objective-C, which builds on top of it with the object-oriented features.
Good luck.
0X00 is simply 0 in hexadecimal notation. So,
int rc = 0X00;
is the same as
int rc = 0;
Same for BOOL variables, where 0 is the same as NO. Using 0X00 is odd -- it'd make more sense to use 0 or NO where appropriate, and use nil for the pointers.
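For illustration, the same declarations from the question written the more conventional way:

int rc = 0;
sqlite3_stmt *pStmt = NULL; // plain C pointer: use NULL
FMStatement *stat = nil; // Objective-C object pointer: use nil
BOOL abc = NO;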

NSData Packet Interpretation

I have a fairly complex issue regarding the interpretation of packets in an app that I am making. A host app sends a packet to client apps with the following structure:
[Header of 10 bytes][peerID of selected client of variable byte length][empty byte][peerID of a client of variable byte length][empty byte][int of 4 bytes][peerID of client of variable byte length][empty byte][int of 4 bytes]
Here is a sample packet that is produced under this structure:
434e4c50 00000000 006a3134 31303837 34393634 00313233 38313638 35383900 000003e8 31343130 38373439 36340000 0003e8
Converted it looks like this:
CNLP j1410874964 1238168589 Ë1410874964 Ë
"CNLP j" is the packet header of 10 bytes. "1410874964" is the peerID of the selected client. "1238168589" is the peerID of another client. " Ë" has an int value of 1000. "1410874964" is the peerID of the other client (in this case, the selected client). " Ë" also has an int value of 1000. Basically, in this packet I am communicating 2 things - who the selected client is and the int value associated with each client.
My problem exists on the interpretation side (client side). To interpret this particular type of packet, I use the following method:
+ (NSMutableDictionary *)infoFromData:(NSData *)data atOffset:(size_t) offset
{
size_t count;
NSMutableDictionary *info = [NSMutableDictionary dictionaryWithCapacity:8];
while (offset < [data length])
{
NSString *peerID = [data cnl_stringAtOffset:offset bytesRead:&count];
offset += count;
NSNumber *number = [NSNumber numberWithInteger:[data cnl_int32AtOffset:offset]];
offset += 4;
[info setObject:number forKey:peerID];
}
return info;
}
Typically, each of these packets range between 49 and 51 bytes. "offset" is set in a previous method to reflect the byte number after the packet header plus the empty byte after the selected player (in the case of the above packet, 21). "count" is initialized with a value of 1. In the case of this particular example, length is 51. The following method is passed the above arguments:
- (NSString *)cnl_stringAtOffset:(size_t)offset bytesRead:(size_t *)amount
{
const char *charBytes = (const char *)[self bytes];
NSString *string = [NSString stringWithUTF8String:charBytes + offset];
*amount = strlen(charBytes + offset) + 1;
return string;
}
This method is supposed to read through a variable length string in the packet, set the offset to the byte immediately after the empty byte pad behind the peerID string, and return the string that was read. "amount" is then set to the number of bytes the method read through for the string (this becomes the new value of count after returning to the first method). "offset" and "count" are then added together to become the new "offset" - where interpretation of the int portion of the packet will begin. The above arguments are passed to the following method:
- (int)cnl_int32AtOffset:(size_t)offset
{
const int *intBytes = (const int *)[self bytes];
return ntohl(intBytes[offset / 4]);
}
This method is intended to return the 32 bit (4 byte) int value read at the current offset value of the packet. I believe that the problem exists in this method when the offset is a number that is not divisible by 4. In this case, the first int value of 1000 was correctly interpreted, and 32 was returned as the offset during the first iteration of the while loop. However, during the second iteration, the int value interpreted was 909377536 (obtained from reading bytes 36340000 in the packet instead of bytes 000003E8). This was likely due to the fact that the offset during this iteration was set to 47 (not divisible by 4). After interpreting the 32 bit int in the category above, 4 is added to the offset in the first method to account for a 4 byte (32 bit) int. If my intuition about an offset not divisible by four is correct, any suggestions to get around this problem are greatly appreciated. I have been looking for a way to solve this problem for quite some time and perhaps fresh eyes may help. Thanks for any help!!!
The unportable version (undefined behaviour for many reasons):
return ntohl(*(const int *)([self bytes]+offset));
A semi-portable version is somewhat trickier, but in C99 it appears that you can assume int32_t is "the usual" two's complement representation (no trap representations, no padding bits), thus:
// The cast is necessary to prevent arithmetic on void* which is nonstandard.
const uint8_t * p = (const uint8_t *)[self bytes]+offset;
// The casts ensure the result type is big enough to hold the shifted value.
// We use uint32_t to prevent UB when shifting into the sign bit.
uint32_t n = ((uint32_t)p[0]<<24) | ((uint32_t)p[1]<<16) | ((uint32_t)p[2]<<8) | ((uint32_t)p[3]);
// Jump through some hoops to prevent UB on "negative" numbers.
// An equivalent to the third expression is -(int32_t)~n-1.
// A good compiler should be able to optimize this into nothing.
return (n <= INT32_MAX) ? (int32_t)n : -(int32_t)(UINT32_MAX-n)-1;
This won't work on architectures without 8-bit bytes, but such architectures probably have different conventions for how things are passed over the network.
A good compiler should be able to optimize this into a single (possibly byte-swapped) load on suitable architectures.
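Applied to the category method from the question, the portable version might look like this (a sketch; it assumes the caller guarantees offset + 4 <= [self length], which the loop in infoFromData:atOffset: should):

- (int)cnl_int32AtOffset:(size_t)offset
{
// Read 4 bytes big-endian, one byte at a time, so any offset works, aligned or not
const uint8_t *p = (const uint8_t *)[self bytes] + offset;
uint32_t n = ((uint32_t)p[0]<<24) | ((uint32_t)p[1]<<16) | ((uint32_t)p[2]<<8) | (uint32_t)p[3];
return (n <= INT32_MAX) ? (int32_t)n : -(int32_t)(UINT32_MAX-n)-1;
}

Unlike the intBytes[offset / 4] version, this reads the int correctly at offset 47 as well as at offset 32.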

Find Character String In Binary Data

I have a binary file I've loaded using an NSData object. Is there a way to locate a sequence of characters, 'abcd' for example, within that binary data and return the offset without converting the entire file to a string? Seems like it should be a simple answer, but I'm not sure how to do it. Any ideas?
I'm doing this on iOS 3 so I don't have -rangeOfData:options:range: available.
I'm going to award this one to Sixteen Otto for suggesting strstr. I went and found the source code for the C function strstr and rewrote it to work on a fixed length Byte array--which incidentally is different from a char array as it is not null terminated. Here is the code I ended up with:
- (Byte*)offsetOfBytes:(Byte*)bytes inBuffer:(const Byte*)buffer ofLength:(int)len
{
// cp walks the haystack (buffer); bytes is the null-terminated pattern we search for
Byte *cp = (Byte*)buffer;
Byte *s1, *s2;
if ( !*bytes ) // empty pattern matches at the start, as strstr does
return (Byte*)buffer;
int i = 0;
for (i = 0; i < len; ++i)
{
s1 = cp;
s2 = bytes;
while ( *s1 && *s2 && !(*s1-*s2) )
s1++, s2++;
if (!*s2) // walked off the end of the pattern: every byte matched
return cp;
cp++;
}
return NULL;
}
This returns a pointer to the first occurrence of bytes, the thing I'm looking for, in buffer, the byte array that should contain bytes.
I call it like this:
// data is the NSData object
const Byte *bytes = [data bytes];
Byte* index = [self offsetOfBytes:tag inBuffer:bytes ofLength:[data length]];
Convert your substring to an NSData object, and search for those bytes in the larger NSData using rangeOfData:options:range:. Make sure that the string encodings match!
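A minimal sketch of that approach (rangeOfData:options:range: is the real Foundation API; fileData stands for the NSData you loaded):

NSData *needle = [@"abcd" dataUsingEncoding:NSUTF8StringEncoding];
NSRange r = [fileData rangeOfData:needle options:0 range:NSMakeRange(0, [fileData length])];
if (r.location != NSNotFound)
NSLog(@"found at offset %lu", (unsigned long)r.location);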
On iPhone, where that isn't available, you may have to do this yourself. The C function strstr() will give you a pointer to the first occurrence of a pattern within the buffer (as long as neither contain nulls!), but not the index. Here's a function that should do the job (but no promises, since I haven't tried actually running it...):
- (NSUInteger)indexOfData:(NSData*)needle inData:(NSData*)haystack
{
const uint8_t* needleBytes = [needle bytes]; // use a concrete byte type; indexing a void* is not valid C
const uint8_t* haystackBytes = [haystack bytes];
// walk the length of the buffer, looking for a byte that matches the start
// of the pattern; we can skip (|needle|-1) bytes at the end, since we can't
// have a match that's shorter than needle itself
for (NSUInteger i=0; i + [needle length] <= [haystack length]; i++)
{
// walk needle's bytes while they still match the bytes of haystack
// starting at i; if we walk off the end of needle, we found a match
NSUInteger j=0;
while (j < [needle length] && needleBytes[j] == haystackBytes[i+j])
{
j++;
}
if (j == [needle length])
{
return i;
}
}
return NSNotFound;
}
This runs in something like O(nm), where n is the buffer length, and m is the size of the substring. It's written to work with NSData for two reasons: 1) that's what you seem to have in hand, and 2) those objects already encapsulate both the actual bytes, and the length of the buffer.
If you're using Snow Leopard, a convenient way is the new -rangeOfData:options:range: method in NSData that returns the range of the first occurrence of a piece of data. Otherwise, you can access the NSData's contents yourself using its -bytes method to perform your own search.
I had the same problem.
I solved it the other way round, compared to the suggestions.
First, I reformat the data (assume your NSData is stored in the variable rawFile) with:
NSString *ascii = [[NSString alloc] initWithData:rawFile encoding:NSASCIIStringEncoding];
Now you can easily do string searches like 'abcd' or whatever you want by using the NSScanner class and passing the ascii string to the scanner. Maybe this is not really efficient, but it works until the -rangeOfData: method becomes available on iPhone as well.
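A sketch of that search (scanUpToString:intoString: and scanLocation are the standard NSScanner API; note the ASCII conversion above can return nil if the data contains non-ASCII bytes):

NSScanner *scanner = [NSScanner scannerWithString:ascii];
[scanner setCharactersToBeSkipped:nil]; // don't silently skip whitespace
[scanner scanUpToString:@"abcd" intoString:NULL]; // advances to the match, or to the end if none
if (![scanner isAtEnd])
NSLog(@"found at offset %lu", (unsigned long)[scanner scanLocation]);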
