Base64 encoding of UIImage doesn't match - iOS

I have a UIImage and I want to encode it using Base64. I then send the string to our server.
Our server decodes it using atob(), but it can't do so properly.
After debugging, we found that the result of encoding/decoding with btoa()/atob() does not match NSData's base64EncodedStringWithOptions: when I convert the UIImage to NSData and then encode it.
What's weird is that they do match when I read the image directly as NSData using dataWithContentsOfFile: instead of converting the UIImage to NSData with UIImagePNGRepresentation().
My problem is that I'm supposed to use an image picker that returns a UIImage. I don't want to write the image to a file and then read it back as NSData; that's not efficient. Is there a way to solve this?

Try this for base64 encoding:
+ (NSString *)base64forData:(NSData *)theData
{
    const uint8_t *input = (const uint8_t *)[theData bytes];
    NSInteger length = [theData length];

    static char table[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";

    // Output is always a multiple of 4 characters: 4 output bytes per 3 input bytes.
    NSMutableData *data = [NSMutableData dataWithLength:((length + 2) / 3) * 4];
    uint8_t *output = (uint8_t *)data.mutableBytes;

    NSInteger i;
    for (i = 0; i < length; i += 3) {
        // Pack up to three input bytes into a 24-bit value.
        NSInteger value = 0;
        NSInteger j;
        for (j = i; j < (i + 3); j++) {
            value <<= 8;
            if (j < length) {
                value |= (0xFF & input[j]);
            }
        }

        // Emit four 6-bit groups, padding with '=' where the input ran out.
        NSInteger theIndex = (i / 3) * 4;
        output[theIndex + 0] = table[(value >> 18) & 0x3F];
        output[theIndex + 1] = table[(value >> 12) & 0x3F];
        output[theIndex + 2] = (i + 1) < length ? table[(value >> 6) & 0x3F] : '=';
        output[theIndex + 3] = (i + 2) < length ? table[(value >> 0) & 0x3F] : '=';
    }

    return [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
}
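For context, here is roughly how the helper above could be wired up to the UIImage coming back from the image picker, so nothing is ever written to disk (a minimal sketch; the delegate method and the UIImagePickerControllerOriginalImage key are the standard UIImagePickerController ones, and base64forData: is the helper defined above, assumed to live on the same class):

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *picked = info[UIImagePickerControllerOriginalImage];

    // Convert the in-memory UIImage to PNG data without touching the file system.
    NSData *pngData = UIImagePNGRepresentation(picked);

    // Encode with the helper above; on iOS 7+ this could also be
    // [pngData base64EncodedStringWithOptions:0].
    NSString *encoded = [[self class] base64forData:pngData];

    // ... send encoded to the server, where atob() should be able to decode it ...
    [picker dismissViewControllerAnimated:YES completion:nil];
}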

Related

extracting bits from NSData bytes

I would like to extract all of the bits from the following NSData bytes:
status data bytes: <0011...
The result turns out to be all zeros (0000 0000 0000 0000). Could you please tell me how to do this?
NSData *aData = [valueData subdataWithRange:NSMakeRange(0, 2)]; // 16-bit status
status = [self bitsToInt:aData];
NSString *aString = [NSString stringWithFormat:@"%d", status];
int value = [aString intValue];
NSLog(@"status value: %d", value);
unsigned thbit0 = (1 << 0) & value;
unsigned thbit1 = (1 << 1) & value;
unsigned thbit2 = (1 << 2) & value;
unsigned thbit3 = (1 << 3) & value;
unsigned thbit4 = (1 << 4) & value;
unsigned thbit5 = (1 << 5) & value;
unsigned thbit6 = (1 << 6) & value;
unsigned thbit7 = (1 << 7) & value;
unsigned thbit8 = (1 << 8) & value;
unsigned thbit9 = (1 << 9) & value;
unsigned thbit10 = (1 << 10) & value;
unsigned thbit11 = (1 << 11) & value;
unsigned thbit12 = (1 << 12) & value;
..
- (int)bitsToInt:(NSData *)valueDa {
    uint8_t *bytePtr = (uint8_t *)[valueDa bytes];
    int high = bytePtr[1] >= 0 ? bytePtr[1] : 256 + bytePtr[1];
    int low = bytePtr[0] >= 0 ? bytePtr[0] : 256 + bytePtr[0];
    return low | (high << 8);
}
You could try working with a bit string instead of integer values and extra bit extraction.
Here is a simple decoder:
- (NSString *)getBitsFromData:(NSData *)data
{
    NSMutableString *result = [NSMutableString string];
    const uint8_t *bytes = [data bytes];
    for (NSUInteger i = 0; i < data.length; i++)
    {
        uint8_t byte = bytes[i];
        for (int j = 0; j < 8; j++)
        {
            [result appendString:(((byte >> j) & 1) ? @"1" : @"0")];
        }
    }
    return result;
}
Test:
NSString *test = @"test";
NSLog(@"%@", [self getBitsFromData:[test dataUsingEncoding:NSUTF8StringEncoding]]);
Result:
2015-07-29 11:37:31.768 Test[18342:9947704] 00101110101001101100111000101110
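Note that the decoder above emits the bits of each byte least-significant-bit first, which is why "test" (0x74 0x65 0x73 0x74) comes out as 00101110 10100110 11001110 00101110; if you want the conventional most-significant-bit-first order, run the inner loop from j = 7 down to 0 instead.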

Drupal and Objective C Base 64 mismatch

I have this sequence of bytes (originally printed from HTML, so apologies for the ugly format):
193 250 194 129 62 60 12 171 199 96 13 125 166 175 80 85 137 29 15 189 33 231 237 98 165 35 75 250 181 150 35 175 129 174 13 13 121 229 30 173 112 210 2 165 110 113 141 166 102 105 33 82 220 233 118 36 73 88 196 152 15 231 164 119
When I use the Drupal function _password_base64_encode I get the following base64 string:
/fjk/u1DAgulUpETay8IJZM5DoP6briMZCmGuLfZXwOUiqE1tJi5h0bo0IePlpcdaZK6GlRuqFGGMFAaDQCdr/
But when I use this sequence of bytes in my iOS application with the code:
NSString *base64Encoded = [hash base64EncodedStringWithOptions:0];
I get:
wfrCgT48DKvHYA19pq9QVYkdD70h5+1ipSNL
Why this behavior?
Thanks
Ported the function _password_base64_encode to iOS:
- (NSString *)drupalBase64PasswordEncode:(NSData *)data {
    NSUInteger count = [data length];
    int i = 0;
    NSString *itTo64String = @"./0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";
    const char *itTo64 = [itTo64String cStringUsingEncoding:NSUTF8StringEncoding];
    const char *input = [data bytes];
    NSMutableString *output = [[NSMutableString alloc] init];
    do {
        unsigned char value = (unsigned char)input[i++];
        // Mirror the PHP code: $value keeps accumulating, so start from the first byte
        // rather than leaving value2 uninitialized when the input runs out early.
        int value2 = value;
        unsigned char toInsert = itTo64[value & 0x3f];
        [output appendFormat:@"%c", toInsert];
        if (i < count) {
            value2 |= ((unsigned char)input[i] << 8);
        }
        toInsert = itTo64[(value2 >> 6) & 0x3f];
        [output appendFormat:@"%c", toInsert];
        if (i++ >= count) {
            break;
        }
        if (i < count) {
            value2 |= ((unsigned char)input[i] << 16);
        }
        toInsert = itTo64[(value2 >> 12) & 0x3F];
        [output appendFormat:@"%c", toInsert];
        if (i++ >= count) {
            break;
        }
        toInsert = itTo64[(value2 >> 18) & 0x3F];
        [output appendFormat:@"%c", toInsert];
    } while (i < count);
    return output;
}
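A quick way to sanity-check the port (a minimal sketch; the byte values are just the first few from the sequence in the question, and drupalBase64PasswordEncode: is the method above):

const unsigned char rawBytes[] = {193, 250, 194, 129, 62, 60, 12, 171};
NSData *hash = [NSData dataWithBytes:rawBytes length:sizeof(rawBytes)];

// Drupal-style encoding (./0-9A-Za-z alphabet, as in itTo64String above)
// next to standard Base64, to make the difference visible.
NSString *drupalEncoded = [self drupalBase64PasswordEncode:hash];
NSString *standardEncoded = [hash base64EncodedStringWithOptions:0];   // iOS 7+
NSLog(@"drupal: %@  standard: %@", drupalEncoded, standardEncoded);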
The most likely explanation is that you're encoding junk along with your intended string. Can you dump the actual string that gets encoded? For example:
$string = ' ... ';
var_dump($string);
$base64_string = _password_base64_encode($string);
var_dump($base64_string);
It is most likely that you are including different characters (line breaks, newlines in a different format, etc.) than you intended when encoding.
In addition, you might want to run the same input through PHP's native base64_encode function and compare the results.
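On the iOS side, the equivalent check is to log the exact bytes that go into the encoder, so they can be compared value-for-value with the list printed from Drupal (a minimal sketch; hash is the NSData from the question):

const unsigned char *raw = [hash bytes];
NSMutableString *dump = [NSMutableString string];
for (NSUInteger i = 0; i < [hash length]; i++) {
    // Decimal, to match the 193 250 194 ... list posted above.
    [dump appendFormat:@"%d ", raw[i]];
}
NSLog(@"bytes being encoded: %@", dump);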

Convert uint8_t to NSString

Just started learning Objective-C and was trying to convert a byte array into a UTF-8 NSString, but I keep getting nil/null.
Here is the abbreviated code sample.
enum {
    TMessageType_CALL = 1,
    TMessageType_REPLY = 2,
    TMessageType_EXCEPTION = 3,
    TMessageType_ONEWAY = 4
};
int32_t VERSION_1 = 0x80010000;
int value = VERSION_1 | TMessageType_CALL;
uint8_t buff[4];
buff[0] = 0xFF & (value >> 24);
buff[1] = 0xFF & (value >> 16);
buff[2] = 0xFF & (value >> 8);
buff[3] = 0xFF & value;
// Convert buff to NSString with offset = 0, length = 4
I tried the following.
NSString *t = [[NSString alloc] initWithBytes:buff length:4 encoding:NSUTF8StringEncoding];
NSString *t1 = [NSString stringWithUTF8String:(char *)buff];
But both t and t1 return nil.
What is the right API to convert it correctly?
This conversion needs to be generic across writeI32(), writeI64(), writeString(), and writeDouble(). Here is the code for the rest:
- (void) writeI16: (short) value
{
    uint8_t buff[2];
    buff[0] = 0xff & (value >> 8);
    buff[1] = 0xff & value;
    [mTransport write: buff offset: 0 length: 2];
}

- (void) writeI64: (int64_t) value
{
    uint8_t buff[8];
    buff[0] = 0xFF & (value >> 56);
    buff[1] = 0xFF & (value >> 48);
    buff[2] = 0xFF & (value >> 40);
    buff[3] = 0xFF & (value >> 32);
    buff[4] = 0xFF & (value >> 24);
    buff[5] = 0xFF & (value >> 16);
    buff[6] = 0xFF & (value >> 8);
    buff[7] = 0xFF & value;
    [mTransport write: buff offset: 0 length: 8];
}

- (void) writeDouble: (double) value
{
    // spit out IEEE 754 bits - FIXME - will this get us in trouble on
    // PowerPC?
    [self writeI64: *((int64_t *) &value)];
}

- (void) writeString: (NSString *) value
{
    if (value != nil) {
        const char * utf8Bytes = [value UTF8String];
        size_t length = strlen(utf8Bytes);
        [self writeI32: length];
        [mTransport write: (uint8_t *) utf8Bytes offset: 0 length: length];
    } else {
        // instead of crashing when we get null, let's write out a zero
        // length string
        [self writeI32: 0];
    }
}
buff is an array of unsigned chars, so you could use this:
NSString *t = [NSString stringWithFormat:@"%s", buff];
As an alternative, you can get each character explicitly:
NSMutableString *t = [NSMutableString stringWithCapacity:4];
for (NSUInteger i = 0; i < 4; ++i)
    [t appendFormat:@"%c", buff[i]];
NSLog(@"%@", t);
The first option does a conversion to a valid string. The second option gives you each character, regardless of any terminating characters ('\0').
I'm not sure what useful information this will give you, but there you have it.
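For what it's worth, the reason both attempts in the question return nil is that the four bytes produced there (0x80 0x01 0x00 0x01) are not valid UTF-8 (0x80 cannot start a character) and contain an embedded NUL. If the goal is just to hand the raw bytes to the transport rather than display them, wrapping them in NSData avoids the string-encoding question entirely; a minimal sketch, reusing mTransport and buff from the code above:

// Keep the frame header as raw bytes instead of forcing it through a text encoding.
NSData *header = [NSData dataWithBytes:buff length:4];
NSLog(@"header bytes: %@", header);   // prints the hex bytes, e.g. <80010001>

// Or write it straight to the transport, as the other write* methods already do.
[mTransport write: buff offset: 0 length: 4];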

How to minimize the length of the Base64 string from the NSData of an image?

I convert an image to NSData and the NSData to a Base64 string using
NSData *imagedata = UIImageJPEGRepresentation(imageView.image, 0.1f);
NSString *c = [NSString base64StringFromData:imagedata];
The function for the string conversion:
+ (NSString *)base64forData:(NSData *)theData {
    const uint8_t *input = (const uint8_t *)[theData bytes];
    NSInteger length = [theData length];
    static char table[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";
    NSMutableData *data = [NSMutableData dataWithLength:((length + 2) / 3) * 4];
    uint8_t *output = (uint8_t *)data.mutableBytes;
    NSInteger i;
    for (i = 0; i < length; i += 3) {
        NSInteger value = 0;
        NSInteger j;
        for (j = i; j < (i + 3); j++) {
            value <<= 8;
            if (j < length) {
                value |= (0xFF & input[j]);
            }
        }
        NSInteger theIndex = (i / 3) * 4;
        output[theIndex + 0] = table[(value >> 18) & 0x3F];
        output[theIndex + 1] = table[(value >> 12) & 0x3F];
        output[theIndex + 2] = (i + 1) < length ? table[(value >> 6) & 0x3F] : '=';
        output[theIndex + 3] = (i + 2) < length ? table[(value >> 0) & 0x3F] : '=';
    }
    return [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
}
But the resulting Base64 string is too long; its length is above 300,000. That is, with
int len = c.length;
the value of len is above 300,000. The image is 3 to 4 MB, and in fact I already compress it with a quality of 0.1f:
NSData *imagedata = UIImageJPEGRepresentation(iivv.image, 0.1f);
How do I minimize the length? Is there any other code for Base64 conversion from NSData?
Base64 will always have larger space requirements than the original data, because it does not use all the bits in each byte. This is done intentionally to make sure that the high bits do not cause problems when being handed from one system to another, so in effect it trades space for transmission safety.
It is called Base64 because it only uses 6 bits (2^6 = 64) per output byte, therefore effectively taking up 4 bytes where the original data only had 3. Or put another way: the size will increase by about 33%.
The Base64 encoder of course does not care about what the bytes you feed into it represent, so you are free to compress the heck out of your data, as long as it is still in its own format (e.g. create a PNG or JPG out of uncompressed image data), and then encode that as Base64.
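In practice, the only way to get a meaningfully shorter Base64 string is to make the NSData itself smaller before encoding, for example by scaling the image down in addition to lowering the JPEG quality. A rough sketch (the target size is an arbitrary example):

// Draw the image into a smaller bitmap first; a smaller JPEG means a
// proportionally smaller Base64 string.
CGSize targetSize = CGSizeMake(640, 480);   // arbitrary example size
UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0);
[imageView.image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

NSData *imagedata = UIImageJPEGRepresentation(scaled, 0.5f);
NSString *c = [NSString base64StringFromData:imagedata];   // same encoder as before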

How to convert NSData that is encrypted with AES256 to Base64?

I am trying to convert an NSData object that has been encrypted with AES256 encryption to a Base64 NSData object. I am under the impression that I cannot directly convert an AES256-encrypted NSData object to an NSString, and that I must first convert it to Base64.
So how would I convert an NSData object to a Base64 data object? And as a bonus, I need to convert the Base64 data object to an NSString.
I found this method, but I am not sure how to convert my NSData object to Base64 using the method below.
- (NSString *)base64forData:(NSData *)theData {
    const uint8_t *input = (const uint8_t *)[theData bytes];
    NSInteger length = [theData length];
    static char table[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";
    NSMutableData *data = [NSMutableData dataWithLength:((length + 2) / 3) * 4];
    uint8_t *output = (uint8_t *)data.mutableBytes;
    NSInteger i;
    for (i = 0; i < length; i += 3) {
        NSInteger value = 0;
        NSInteger j;
        for (j = i; j < (i + 3); j++) {
            value <<= 8;
            if (j < length) {
                value |= (0xFF & input[j]);
            }
        }
        NSInteger theIndex = (i / 3) * 4;
        output[theIndex + 0] = table[(value >> 18) & 0x3F];
        output[theIndex + 1] = table[(value >> 12) & 0x3F];
        output[theIndex + 2] = (i + 1) < length ? table[(value >> 6) & 0x3F] : '=';
        output[theIndex + 3] = (i + 2) < length ? table[(value >> 0) & 0x3F] : '=';
    }
    return [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
}
The method that you provided in your post should work (CocoaDev has a relevant discussion).
Here is how you would use it:
NSString *b64 = [self base64forData:myNsData];
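Alternatively, if the deployment target is iOS 7 or later, NSData has this built in, including the reverse direction for the decryption side; a minimal sketch (encryptedData stands in for the AES256-encrypted NSData from the question):

// NSData -> Base64 NSString (safe to store or send as text)
NSString *b64String = [encryptedData base64EncodedStringWithOptions:0];

// Base64 NSString -> NSData (to recover the encrypted bytes before decrypting)
NSData *roundTripped = [[NSData alloc] initWithBase64EncodedString:b64String options:0];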
