NSString to char* with greek characters - ios

I am using the following code to store the data of a string in a char*.
NSString *hotelName = [components[2] stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
hotelInfo->hotelName = malloc(sizeof(char) * hotelName.length + 1);
strncpy(hotelInfo->hotelName, [hotelName UTF8String], hotelName.length + 1);
NSLog(#"HOTEL NAME: %s",hotelInfo->hotelName);
The problem is that the Greek characters are printed strangely. I have also tried to use another encoding (e.g. NSWindowsCP1253StringEncoding - it crashes).
I even tried this:
hotelInfo->hotelName = (const char *)[hotelName cStringUsingEncoding:NSUnicodeStringEncoding];
but it also produces strange characters.
What do I miss?
EDIT:
After some suggestions I tried the following:
if ([hotelName canBeConvertedToEncoding:NSWindowsCP1253StringEncoding]){
    const char *cHotelName = (const char *)[hotelName cStringUsingEncoding:NSWindowsCP1253StringEncoding];
    int bufSize = strlen(cHotelName) + 1;
    if (bufSize > 0){
        hotelInfo->hotelName = malloc(sizeof(char) * bufSize);
        strncpy(hotelInfo->hotelName, [hotelName UTF8String], bufSize);
        NSLog(@"HOTEL NAME: %s", hotelInfo->hotelName);
    }
}else{
    NSLog(@"String cannot be encoded! Sorry! %@", hotelName);
    for (NSInteger charIdx = 0; charIdx < hotelName.length; charIdx++){
        // Do something with character at index charIdx, for example:
        char x[hotelName.length];
        NSLog(@"%C", [hotelName characterAtIndex:charIdx]);
        x[charIdx] = [hotelName characterAtIndex:charIdx];
        NSLog(@"%s", x);
        if (charIdx == hotelName.length - 1)
            hotelInfo->hotelName = x;
    }
    NSLog(@"HOTEL NAME: %s", hotelInfo->hotelName);
}
But still nothing!

First of all, it is not guaranteed that any NSString can be represented as a C character array (a so-called C string). The reason is that a given C-string encoding can only represent a limited set of characters. You should check whether the string can be converted by calling canBeConvertedToEncoding:.
Secondly, the malloc and strncpy calls must be based on the length of the C string, not on the length of the NSString. So you should first get the C string from the NSString, then get its length (strlen), and use this value in the function calls:
const char *cHotelName = (const char *)[hotelName cStringUsingEncoding:NSWindowsCP1253StringEncoding];
int bufSize = strlen(cHotelName) + 1;
hotelInfo->hotelName = malloc(sizeof(char) * bufSize);
strncpy(hotelInfo->hotelName, cHotelName, bufSize);
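One more note: the code in the question's EDIT still passes [hotelName UTF8String] to strncpy while sizing the buffer from the CP1253 C string; the source must be cHotelName itself. The same strlen-based pattern also works with UTF-8, which covers Greek text. A minimal sketch, assuming hotelInfo->hotelName is a plain char * owned by the struct:
const char *cHotelName = [hotelName UTF8String];           // UTF-8 can represent Greek characters
if (cHotelName != NULL) {
    size_t bufSize = strlen(cHotelName) + 1;               // length of the C string, not of the NSString
    hotelInfo->hotelName = malloc(bufSize);
    if (hotelInfo->hotelName != NULL) {
        memcpy(hotelInfo->hotelName, cHotelName, bufSize); // copies the terminating '\0' as well
    }
}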

Drupal and Objective C Base 64 mismatch

I have this sequence of bytes (printed from HTML, so apologies for the ugly format):
193<br/>250<br/>194<br/>129<br/>62<br/>60<br/>12<br/>171<br/>199<br/>96<br/>13<br/>125<br/>166<br/>175<br/>80<br/>85<br/>137<br/>29<br/>15<br/>189<br/>33<br/>231<br/>237<br/>98<br/>165<br/>35<br/>75<br/>250<br/>181<br/>150<br/>35<br/>175<br/>129<br/>174<br/>13<br/>13<br/>121<br/>229<br/>30<br/>173<br/>112<br/>210<br/>2<br/>165<br/>110<br/>113<br/>141<br/>166<br/>102<br/>105<br/>33<br/>82<br/>220<br/>233<br/>118<br/>36<br/>73<br/>88<br/>196<br/>152<br/>15<br/>231<br/>164<br/>119<br/>
When I use the Drupal function _password_base64_encode I get the following base64 string:
/fjk/u1DAgulUpETay8IJZM5DoP6briMZCmGuLfZXwOUiqE1tJi5h0bo0IePlpcdaZK6GlRuqFGGMFAaDQCdr/
But when I use this sequence of bytes in my iOS application with the code:
NSString *base64Encoded = [hash base64EncodedStringWithOptions:0];
I get:
wfrCgT48DKvHYA19pq9QVYkdD70h5+1ipSNL
Why this behavior?
Thanks
Ported the function _password_base64_encode to iOS:
- (NSString*)drupalBase64PasswordEncode:(NSData*)data {
    NSUInteger count = [data length];
    int i = 0;
    NSString *itTo64String = @"./0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";
    const char *itTo64 = [itTo64String cStringUsingEncoding:NSUTF8StringEncoding];
    char *input = [data bytes];
    NSMutableString *output = [[NSMutableString alloc] init];
    do {
        unsigned char value = (unsigned char)input[i++];
        int value2;
        unsigned char toInsert = itTo64[value & 0x3f];
        [output appendString:[NSString stringWithFormat:@"%c", toInsert]];
        if (i < count) {
            value2 = value | ((unsigned char)input[i] << 8);
        }
        toInsert = itTo64[(value2 >> 6) & 0x3f];
        [output appendString:[NSString stringWithFormat:@"%c", toInsert]];
        if (i++ >= count) {
            break;
        }
        if (i < count) {
            value2 = value2 | ((unsigned char)input[i] << 16);
        }
        toInsert = itTo64[(value2 >> 12) & 0x3F];
        [output appendString:[NSString stringWithFormat:@"%c", toInsert]];
        if (i++ >= count) {
            break;
        }
        toInsert = itTo64[(value2 >> 18) & 0x3F];
        [output appendString:[NSString stringWithFormat:@"%c", toInsert]];
    } while (i < count);
    return output;
}
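For comparison, here is a hedged sketch of a closer port of Drupal's PHP original: it accumulates into a single unsigned int with |=, whereas the version above keeps a separate value2 that is left uninitialized when the input ends mid-group. DrupalBase64Encode is just an illustrative name:
// Sketch of a closer port of Drupal's _password_base64_encode.
// Accumulates into one unsigned int, as the PHP original does with $value |= ...
static NSString *DrupalBase64Encode(NSData *data) {
    static const char itoa64[] =
        "./0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";
    const unsigned char *input = [data bytes];
    NSUInteger count = [data length];
    NSMutableString *output = [NSMutableString string];
    NSUInteger i = 0;
    do {
        unsigned int value = input[i++];
        [output appendFormat:@"%c", itoa64[value & 0x3f]];
        if (i < count) {
            value |= (unsigned int)input[i] << 8;
        }
        [output appendFormat:@"%c", itoa64[(value >> 6) & 0x3f]];
        if (i++ >= count) {
            break;
        }
        if (i < count) {
            value |= (unsigned int)input[i] << 16;
        }
        [output appendFormat:@"%c", itoa64[(value >> 12) & 0x3f]];
        if (i++ >= count) {
            break;
        }
        [output appendFormat:@"%c", itoa64[(value >> 18) & 0x3f]];
    } while (i < count);
    return output;
}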
The most likely explanation is that you're encoding junk along with your intended string - can you dump the actual string that gets encoded, such as
$string = ' ... ';
var_dump($string);
$base64_string = _password_base64_encode($string);
var_dump($base64_string);
It is most likely that you are including different characters (line breaks, newlines in a different format, etc.) than you intended when encoding.
In addition, you might want to encode the same string with PHP's native base64_encode function and compare the results.

Convert a multi-byte unicode byte array into an NSString, using a partial buffer

In Objective C is there a way to convert a multi-byte unicode byte array into an NSString, where it will allow the conversion to succeed even if the array data is a partial buffer (not on a complete character boundary)?
The application of this is when receiving byte buffers in a stream, and you want to parse the string version of the data buffer (but there is more data to come, and your buffer data doesn't have complete multi-byte unicode).
NSString's initWithData:encoding: method does not work for this purpose, as shown here...
Test code:
- (void)test {
    char myArray[] = {'f', 'o', 'o', (char) 0xc3, (char) 0x97, 'b', 'a', 'r'};
    size_t sizeOfMyArray = sizeof(myArray);
    [self dump:myArray sizeOfMyArray:sizeOfMyArray];
    [self dump:myArray sizeOfMyArray:sizeOfMyArray - 1];
    [self dump:myArray sizeOfMyArray:sizeOfMyArray - 2];
    [self dump:myArray sizeOfMyArray:sizeOfMyArray - 3];
    [self dump:myArray sizeOfMyArray:sizeOfMyArray - 4];
    [self dump:myArray sizeOfMyArray:sizeOfMyArray - 5];
}

- (void)dump:(char[])myArray sizeOfMyArray:(size_t)sourceLength {
    NSString *string = [[NSString alloc] initWithData:[NSData dataWithBytes:myArray length:sourceLength] encoding:NSUTF8StringEncoding];
    NSLog(@"sourceLength: %lu bytes, string.length: %i bytes, string :'%@'", sourceLength, string.length, string);
}
Output:
sourceLength: 8 bytes, string.length: 7 bytes, string :'foo×bar'
sourceLength: 7 bytes, string.length: 6 bytes, string :'foo×ba'
sourceLength: 6 bytes, string.length: 5 bytes, string :'foo×b'
sourceLength: 5 bytes, string.length: 4 bytes, string :'foo×'
sourceLength: 4 bytes, string.length: 0 bytes, string :'(null)'
sourceLength: 3 bytes, string.length: 3 bytes, string :'foo'
As can be seen, converting the "sourceLength: 4 bytes" byte array fails, and returns (null). This is because the UTF-8 unicode '×' character (0xc3 0x97) is only partially included.
Ideally there would be a function that I can use that would return the correct NSString, and tell me how many bytes are "left over".
You largely have your own answer. If the initWithData:encoding: method returns nil, then you know the buffer has a partial (invalid) character at the end.
Modify dump to return an int. Then have it attempt to create the NSString in a loop. Each time you get nil, reduce the length and try again. Once you get a valid NSString, return the difference between the used length and the passed length.
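A minimal sketch of that loop (the function name and the leftover out-parameter are illustrative, not part of any API):
// Returns the longest decodable prefix; on return, *leftover is the number
// of trailing bytes that were not part of a complete UTF-8 character.
static NSString *DecodePartialUTF8(const void *bytes, NSUInteger length, NSUInteger *leftover)
{
    // Try progressively shorter prefixes until one decodes as valid UTF-8.
    for (NSUInteger used = length; used > 0; used--) {
        NSString *string = [[NSString alloc] initWithBytes:bytes
                                                    length:used
                                                  encoding:NSUTF8StringEncoding];
        if (string != nil) {
            if (leftover) *leftover = length - used;   // bytes of a partial trailing character
            return string;
        }
    }
    if (leftover) *leftover = length;                  // nothing decodable yet
    return @"";
}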
I had this problem before and forgot about it for a while; this was an opportunity to solve it. The code below was written using the information from the UTF-8 page on Wikipedia. It is a category on NSData.
It checks the data from the end, and only the last four bytes, because the OP said the data can be gigabytes in size. Otherwise, with UTF-8 it is simpler to run through the bytes from the beginning.
/*
Return the range of a valid utf-8 encoded text by
removing partial trailing multi-byte char.
It assumes that all the bytes are valid UTF-8 encoded chars,
e.g. it doesn't raise a flag if a continuation byte is preceded
by a single-byte char.
*/
- (NSRange)rangeOfUTF8WithoutPartialTrailingMultibytes
{
    NSRange validRange = {0, 0};
    NSUInteger trailLength = MIN([self length], 4U);
    unsigned char trail[4];
    [self getBytes:&trail
             range:NSMakeRange([self length] - trailLength, trailLength)];
    unsigned multibyteCount = 0;
    for (NSInteger i = trailLength - 1; i >= 0; i--) {
        if (isUTF8SingleByte(trail[i])) {
            validRange = NSMakeRange(0, [self length] - trailLength + i + 1);
            break;
        }
        if (isUTF8ContinuationByte(trail[i])) {
            multibyteCount++;
            continue;
        }
        if (isUTF8StartByte(trail[i])) {
            multibyteCount++;
            if (multibyteCount == lengthForUTF8StartByte(trail[i])) {
                validRange = NSMakeRange(0, [self length] - trailLength + i + multibyteCount);
            }
            else {
                validRange = NSMakeRange(0, [self length] - trailLength + i);
            }
            break;
        }
    }
    return validRange;
}
Here are the static functions used in the method:
static BOOL isUTF8SingleByte(const unsigned char c)
{
    return c <= 0x7f;
}

static BOOL isUTF8ContinuationByte(const unsigned char c)
{
    return (c >= 0x80) && (c <= 0xbf);
}

static BOOL isUTF8StartByte(const unsigned char c)
{
    return (c >= 0xc2) && (c <= 0xf4);
}

static BOOL isUTF8InvalidByte(const unsigned char c)
{
    return (c == 0xc0) || (c == 0xc1) || (c > 0xf4);
}

static unsigned lengthForUTF8StartByte(const unsigned char c)
{
    if ((c >= 0xc2) && (c <= 0xdf)) {
        return 2;
    }
    else if ((c >= 0xe0) && (c <= 0xef)) {
        return 3;
    }
    else if ((c >= 0xf0) && (c <= 0xf4)) {
        return 4;
    }
    return 1;
}
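A possible usage, applied to the partial buffer from the question ("foo" plus the first byte of '×') — a sketch assuming the category above is compiled in:
char bytes[] = { 'f', 'o', 'o', (char)0xc3 };               // ends in the middle of '×' (0xc3 0x97)
NSData *received = [NSData dataWithBytes:bytes length:sizeof(bytes)];
NSRange valid = [received rangeOfUTF8WithoutPartialTrailingMultibytes];
NSString *decoded = [[NSString alloc] initWithData:[received subdataWithRange:valid]
                                          encoding:NSUTF8StringEncoding];   // @"foo"
NSUInteger leftover = [received length] - NSMaxRange(valid);                // 1 byte (0xc3) carried over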
Here is my inefficient implementation, which I don't consider to be a correct answer. I'll leave it here in case others find it useful (and in the hope that someone else will give a better answer than this!)
It's in a category on NSMutableData...
/**
 * Removes the biggest string possible from this NSMutableData, leaving any remainder unicode half-characters behind.
 *
 * NOTE: This is a very inefficient implementation, it may require multiple parsing of the complete NSMutableData buffer,
 * it is especially inefficient when the data buffer does not contain a valid string encoding, as all lengths will be
 * attempted.
 */
- (NSString *)removeMaximumStringUsingEncoding:(NSStringEncoding)encoding {
    if (self.length > 0) {
        // Quick test for the case where the whole buffer can be used (is common case, and doesn't require NSData manipulation).
        NSString *result = [[NSString alloc] initWithData:self encoding:encoding];
        if (result != nil) {
            self.length = 0; // Simple case, we used the whole buffer.
            return result;
        }
        // Try to find the largest subData that is a valid string.
        for (NSUInteger subDataLength = self.length - 1; subDataLength > 0; subDataLength--) {
            NSRange subDataRange = NSMakeRange(0, subDataLength);
            result = [[NSString alloc] initWithData:[self subdataWithRange:subDataRange] encoding:encoding];
            if (result != nil) {
                // Delete the bytes we used from our buffer, leave the remainder.
                [self replaceBytesInRange:subDataRange withBytes:NULL length:0];
                return result;
            }
        }
    }
    return @"";
}
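Usage with the same partial buffer might look like this (a sketch; streamBuffer stands in for whatever NSMutableData accumulates the incoming bytes):
char myArray[] = { 'f', 'o', 'o', (char)0xc3 };             // ends mid-character (first byte of '×')
NSMutableData *streamBuffer = [NSMutableData dataWithBytes:myArray length:sizeof(myArray)];
NSString *chunk = [streamBuffer removeMaximumStringUsingEncoding:NSUTF8StringEncoding];
// chunk is @"foo"; the lone 0xc3 byte stays in streamBuffer until more data arrives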

adler32 checksum in objective c [closed]

I am working on an app which sends data to a server with user location info. The server accepts this data based on a checksum calculation, which is written in Java.
Here is the code written in Java:
private static final String CHECKSUM_CONS = "1217278743473774374";

private static String createChecksum(double lat, double lon) {
    int latLon = (int) ((lat + lon) * 1E6);
    String checkSumStr = CHECKSUM_CONS + latLon;
    byte buffer[] = checkSumStr.getBytes();
    ByteArrayInputStream bais = new ByteArrayInputStream(buffer);
    CheckedInputStream cis = new CheckedInputStream(bais, new Adler32());
    byte readBuffer[] = new byte[50];
    long value = 0;
    try {
        while (cis.read(readBuffer) >= 0) {
            value = cis.getChecksum().getValue();
        }
    } catch (Exception e) {
        LOGGER.log(Level.SEVERE, e.getMessage(), e);
    }
    return String.valueOf(value);
}
I tried looking for help to find out how to write an Objective-C equivalent of this. The above function uses Adler-32 and I don't have any clue about that. Please help.
Thanks for your time.
The answers shown here by @achievelimitless and @user3275097 are incorrect.
First off, signed integers should not be used. The modulo operator on negative numbers is defined differently in different languages, and should be avoided when possible. Simply use unsigned integers instead.
Second, the loops will quickly overflow the 16-bit accumulators, which will give the wrong answer. The modulo operations can be deferred, but they must be done before overflow. You can calculate how many loops you can do safely by assuming that all of the input bytes are 255.
Third, because of the second point, you should not use 16-bit types. You should use at least 32-bit types to avoid having to do the modulo operation very often. You still need to limit the number of loops, but the number gets much bigger. For 32-bit unsigned types, the maximum number of loops is 5552. So the basic code looks like:
#define MOD 65521
#define MAX 5552

unsigned long adler32(unsigned char *buf, size_t len)
{
    unsigned long a = 1, b = 0;
    size_t n;

    while (len) {
        n = len > MAX ? MAX : len;
        len -= n;
        do {
            a += *buf++;
            b += a;
        } while (--n);
        a %= MOD;
        b %= MOD;
    }
    return a | (b << 16);
}
As noted by @Sulthan, you should simply use the adler32() function provided in zlib, which is already there on Mac OS X and iOS.
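For the original use case, a hedged sketch of the zlib route might look like this (link against libz; LocationChecksum is an illustrative name, and the constant and formula mirror the Java code in the question):
#include <zlib.h>

static NSString *LocationChecksum(double lat, double lon)
{
    int latLon = (int)((lat + lon) * 1E6);
    NSString *checkSumStr = [NSString stringWithFormat:@"1217278743473774374%d", latLon];
    const char *bytes = [checkSumStr UTF8String];
    uLong value = adler32(0L, Z_NULL, 0);   // returns the initial value 1, matching Java's new Adler32()
    value = adler32(value, (const Bytef *)bytes, (uInt)strlen(bytes));
    return [NSString stringWithFormat:@"%lu", value];
}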
On the basis of the definition of the Adler-32 checksum as given on Wikipedia,
an Objective-C implementation would look like this:
static NSNumber * adlerChecksumof(NSString *str)
{
    NSMutableData *data = [[NSMutableData alloc] init];
    unsigned char whole_byte;
    char byte_chars[3] = {'\0','\0','\0'};
    for (int i = 0; i < ([str length] / 2); i++)
    {
        byte_chars[0] = [str characterAtIndex:i*2];
        byte_chars[1] = [str characterAtIndex:i*2+1];
        whole_byte = strtol(byte_chars, NULL, 16);
        [data appendBytes:&whole_byte length:1];
    }

    int16_t a = 1;
    int16_t b = 0;
    Byte *dataBytes = (Byte *)[data bytes];
    for (int i = 0; i < [data length]; i++)
    {
        a += dataBytes[i];
        b += a;
    }
    a %= 65521;
    b %= 65521;
    int32_t adlerChecksum = b*65536 + a;
    return @(adlerChecksum);
}
Here str would be your string as mentioned in your question.
So when you want to calculate the checksum of some string, just do this:
NSNumber *calculatedChkSm = adlerChecksumof(@"1217278743473774374");
Please let me know if more info is needed.

Checking the length of an array of characters in Objective-C

I'm translating a small Java library for use in an Objective-C application I'm writing.
char[] chars = sentence.toCharArray();
int i = 0;
while (i < chars.length) { ... }
Where sentence is an NSString.
I'd like to translate the above Java code to Objective-C. Here's what I have so far:
// trims sentence off white space
sentence = [sentence stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
const char *chars = [sentence UTF8String];
How do I translate the above while condition? I'm not sure how I'm supposed to check the length of the string after it has been converted to a character array.
Your Objective-C string already holds a measure of its length, it's just a matter of retrieving it:
// trims sentence off white space
sentence = [sentence stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
NSUInteger length = sentence.length;
const char *chars = [sentence UTF8String];
But I would like to point out that even if you didn't know the length, you could use the C strlen function:
// trims sentence off white space
sentence = [sentence stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
const char *chars = [sentence UTF8String];
size_t length = strlen(chars);
Even though there is already an accepted answer, I want to warn against using strlen(), even though in this case it might work without any problem. There are differences between NSString and C strings.
A. -length (NSString) and strlen() have different semantics:
NSString is not(!) \0-terminated, but length based. It can store \0 characters. It is very easy to get different lengths if there is a \0 character in the string instance:
NSString *sentence = @"Amin\0Negm";
NSLog( @"length %ld", [sentence length]); // 9
const char *chars = [sentence cStringUsingEncoding:NSUTF8StringEncoding];
size_t length = strlen(chars);
NSLog(@"strlen %ld", (long)length); // 4
length 9
strlen 4
But -UTF8String and even the used -cStringUsingEncoding: (both NSString) copy out the whole string stored in the string instance. (I think in the case of -cStringUsingEncoding: this is misleading, because standard string functions like strlen() always use the first \0 as the termination of the string.)
B. In UTF-8 a character can take multiple bytes. A char in C is one byte. (Byte here not in the meaning of 8 bits, but of the smallest addressable unit.)
NSString *sentence = @"Αmin Negm";
NSLog( @"length %ld", [sentence length]);
const char *chars = [sentence UTF8String];
size_t length = strlen(chars);
NSLog(@"strlen %ld", (long)length);
length 9
strlen 10
WTF happened here? The 'A' of 'Αmin' is not a Latin capital letter A but a Greek capital letter Alpha. In UTF-8 this takes two bytes, and for plain C's strlen these are two characters!
NSLog(#"%x-%x %x-%x", 'A', 'm', (unsigned char)*chars, (unsigned char)*(chars+1) );
41-6d ce-91
The first two numbers are the codes for 'A', 'm', the second two numbers are the UTF8 code for greek capital letter Alpha (CE 91).
I do not think, that it is a good idea to simply change from NSString to char * without good reason and a complete understanding of the problems. If you do not expect such characters, use NSASCIIStringEncoding. If you expect such characters check your code again and again … or read C.
C. C supports wide characters. This is similar to Mac OS's unichar, but typed wchar_t. There are string functions for wchar_t in wchar.h.
NSString *sentence = @"Αmin Negm";
NSLog( @"length %ld", [sentence length]);
wchar_t wchars[128]; // take care of the size
wchar_t *wchar = wchars;
for (NSUInteger index = 0; index < [sentence length]; index++)
{
    *wchar++ = [sentence characterAtIndex:index];
}
*wchar = '\0';
NSLog(@"widestrlen %ld", wcslen(wchars));
length 9
widestrlen 9
D. Obviously you want to iterate through the string. The common pattern in pure C is not to use an index and compare it to the length, and definitely not to call strlen() in every loop, because that is expensive. (C strings are not length based, so the whole string has to be scanned over and over.) You simply increment the pointer to the next char:
char letter;
while ( (letter = *chars++) ) {…}
or
do
{
    // *chars points to the actual char
} while (*chars++);
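For example, applied to the trimmed sentence (a sketch; note that it walks UTF-8 bytes, not user-visible characters):
const char *chars = [sentence UTF8String];
char letter;
while ((letter = *chars++)) {
    printf("%02x ", (unsigned char)letter);   // one UTF-8 byte per iteration
}
printf("\n");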
int length = sizeof(chars) / sizeof(char)
might work for a real array, but not here: chars is just a pointer, so sizeof(chars) gives the size of the pointer, which has nothing to do with sentence.length.

BIGNUM strange behavior in a calculation loop

I'm trying to implement a basic routine to perform some calculation on BIGNUM(s) and I've found a strange behavior. The functions are as follows
unsigned char *char_array_as_hex(unsigned char *chr_a, int len)
{
    unsigned char *chr_s = (unsigned char *)malloc(len * 2);
    char buffer[5];
    for (int i = 0; i < len; i++)
    {
        sprintf(buffer, "%02X", chr_a[i]);
        chr_s[(2 * i) + 0] = buffer[0];
        chr_s[(2 * i) + 1] = buffer[1];
    }
    return chr_s;
}
and
char *big_number_as_decimal_from_hex_array(unsigned char *chr_a, int len, BN_CTX *bn_ctx)
{
    unsigned char *hex_s = char_array_as_hex(chr_a, len);
    BIGNUM *big_number = BN_CTX_get(bn_ctx);
    BN_hex2bn(&big_number, (char *)hex_s);
    char *big_number_as_decimal = BN_bn2dec(big_number);
    free(hex_s);
    BN_free(big_number);
    return big_number_as_decimal;
}
and
void test_compute_prime256v1()
{
    BN_CTX *bn_ctx = BN_CTX_new();
    BN_CTX_start(bn_ctx);

    unsigned char seed_a[20] = {
        0xC4,0x9D,0x36,0x08,0x86,0xE7,0x04,0x93,0x6A,0x66, /* seed */
        0x78,0xE1,0x13,0x9D,0x26,0xB7,0x81,0x9F,0x7E,0x90
    };
    printf("s = %s\n", big_number_as_decimal_from_hex_array(seed_a, 20, bn_ctx));

    unsigned char p_a[32] = {
        0xFF,0xFF,0xFF,0xFF,0x00,0x00,0x00,0x01,0x00,0x00, /* p */
        0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,
        0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,
        0xFF,0xFF
    };
    printf("p = %s\n", big_number_as_decimal_from_hex_array(p_a, 32, bn_ctx));

    BN_CTX_end(bn_ctx);
    BN_CTX_free(bn_ctx);
}
Then I call test_compute_prime256v1 from an Objective-C method. If I call it once, or multiple times with a reasonable delay between each call, it produces the correct result; however, when I call the function in a loop it produces different, incorrect values:
- (IBAction)btnOK_Clicked:(id)sender
{
    for (int i = 1; i < 10; i++)
    {
        printf("i = %d\n", i);
        test_compute_prime256v1();
    }
}
and a sample output was
i = 1
s = 1122468115042657169822351801880191947498376363664
p = 115792089210356248762697446949407573530086143415290314195533631308867097853951
i = 2
s = 1122468115042657169822351801880191947498376363664
p = 966134380529368896499052403318808180610643774633026536153469502543482958881555881553276...
i = 3
s = 1122468115042657169822351801880191947498376363664
p = 115792089210356248762697446949407573530086143415290314195533631308867097853951
Note: some numbers are trimmed to fit. I have followed the suggestion given here.
Am I missing something? Is there a mistake somewhere?
Can anyone help?
Thanks
EDITED:
I made some modifications to the code but the issue still exists. I changed big_number_as_decimal_from_hex_array as follows:
char *big_number_as_decimal_from_hex_array_ex(unsigned char *chr_a, int len)
{
    BN_CTX *bn_ctx = BN_CTX_new();
    BN_CTX_start(bn_ctx);
    unsigned char *hex_s = char_array_as_hex(chr_a, len);
    BIGNUM *big_number = BN_CTX_get(bn_ctx);
    BN_hex2bn(&big_number, (char *)hex_s);
    char *big_number_as_decimal = BN_bn2dec(big_number);
    free(hex_s);
    BN_free(big_number);
    BN_CTX_end(bn_ctx);
    BN_CTX_free(bn_ctx);
    return big_number_as_decimal;
}
and also
char *big_number_as_decimal_from_hex_array_ex_2(unsigned char *chr_a, int len)
{
    BN_CTX *bn_ctx = BN_CTX_new();
    unsigned char *hex_s = char_array_as_hex(chr_a, len);
    BIGNUM *big_number = BN_CTX_get(bn_ctx);
    BN_hex2bn(&big_number, (char *)hex_s);
    char *big_number_as_decimal = BN_bn2dec(big_number);
    free(hex_s);
    BN_free(big_number);
    BN_CTX_free(bn_ctx);
    return big_number_as_decimal;
}
I modified the test_compute_prime256v1 as
void test_compute_prime256v1_ex()
{
    unsigned char seed_a[20] = {...};
    printf("s = %s\n", big_number_as_decimal_from_hex_array_ex(seed_a, 20));
    unsigned char p_a[32] = {...};
    printf("p = %s\n", big_number_as_decimal_from_hex_array_ex(p_a, 32));

    // or

    unsigned char seed_a[20] = {...};
    printf("s = %s\n", big_number_as_decimal_from_hex_array_ex_2(seed_a, 20));
    unsigned char p_a[32] = {...};
    printf("p = %s\n", big_number_as_decimal_from_hex_array_ex_2(p_a, 32));
}
but the code produces the same incorrect result in a looped calculation
BN_hex2bn(&big_number, (char *)hex_s); expects a C string as its second argument, i.e. a '\0'-terminated one, since it has no other way to know the size of your string. char_array_as_hex never writes that terminating '\0', so BN_hex2bn reads past the end of the buffer into whatever happens to be in memory, which is why the results differ between runs.
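A hedged sketch of the corresponding fix: allocate one extra byte in char_array_as_hex and terminate the string (everything else stays as it was):
unsigned char *char_array_as_hex(unsigned char *chr_a, int len)
{
    unsigned char *chr_s = (unsigned char *)malloc(len * 2 + 1);   // +1 for the terminating '\0'
    char buffer[5];
    for (int i = 0; i < len; i++)
    {
        sprintf(buffer, "%02X", chr_a[i]);
        chr_s[(2 * i) + 0] = buffer[0];
        chr_s[(2 * i) + 1] = buffer[1];
    }
    chr_s[len * 2] = '\0';                                         // BN_hex2bn can now find the end
    return chr_s;
}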
