Convert a hex string into base64? (iOS)

I'm trying to convert a hex string into base64 on iPhone. The hex is:
5289be07c5c7edcc18f3a02c7b81c110b8637f8b2ddbc29cdabcbd7e394c1695
But I can't seem to get the base64 version of this, which is:
Uom+B8XH7cwY86Ase4HBELhjf4st28Kc2ry9fjlMFpU=
How would I get this base64 string?

Since you are using SHA256 (I also use it), I assume you have an NSData output before you convert it to a hex string. Take that data and use the NSData category method shown in this question (the top-voted answer, which has the code inlined).
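For reference, the conversion is simply hex → raw bytes → Base64. A minimal sketch in Ruby (the language used in the other answers on this page, not the asker's Objective-C) that reproduces the expected output:

require 'base64'

hex = '5289be07c5c7edcc18f3a02c7b81c110b8637f8b2ddbc29cdabcbd7e394c1695'

# Decode the hex string to its 32 raw bytes, then Base64-encode those bytes
Base64.strict_encode64([hex].pack('H*'))
# => "Uom+B8XH7cwY86Ase4HBELhjf4st28Kc2ry9fjlMFpU="

Encoding the 64 hex characters themselves, rather than the 32 bytes they represent, would produce a longer, different string; that is the usual cause of a mismatch here.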

Related

Storing RSA encrypted data as hexadecimal

I am using OpenSSL::PKey::RSA to encrypt/decrypt a string of data using a private key. I am storing the encrypted data in a column in a table as a string. I have gotten this implementation to work no problem using Base64.encode64 and Base64.decode64. However, I do not want to store the encrypted data as base 64, I would like to store it as hexadecimal in a string.
I'm currently using the following to store the encrypted data:
encrypted_data = pk.private_encrypt(plain_data).unpack('H*').first
This results in encrypted_data equaling a string like the one below, which easily stores in my database.
d70db8c36d6ccbadd1cca1263ff140df24e0112f636ac9ea92c28f27e443496c
My problem has come in the changing of this hexadecimal string back to the binary string that is needed to decrypt the data. I've tried several different approaches and none seem to work.
What is the best/easiest way to decrypt this hexadecimal string?
The opposite of unpack is pack, which is what you're looking for to get this hex string back to binary. Like so:
[encrypted_data].pack('H*')
pack is a method on Array, not String, so wrap the hex string in an array (mirroring the array that unpack('H*') returned) before calling it.
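A minimal round-trip sketch, where pk stands in for the question's OpenSSL::PKey::RSA private key (the freshly generated key here is just for illustration):

require 'openssl'

pk = OpenSSL::PKey::RSA.new(2048)   # stand-in for the question's existing private key
plain_data = 'secret'

# Store: encrypt, then unpack the binary ciphertext into a hex string
encrypted_data = pk.private_encrypt(plain_data).unpack('H*').first

# Retrieve: pack the hex string back into binary, then decrypt
binary_ciphertext = [encrypted_data].pack('H*')
pk.public_decrypt(binary_ciphertext)   # => "secret"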

Base64 encode: Three different outputs from different tools?

I am trying to verify an OAuth signature generated in code against a "known reputable source". All my steps are verified correct except the last, wherein a 'base signature string' is HMAC-SHA1 hashed against a secret key and then base64 encoded.
I have confirmed that my hash value is the same as the one expected by the algorithm. I then found that my Base64 encoding was not. To determine why my encoding differed, I wanted to check the encoder I was using.
Here is the (hash) string being base64 encoded:
203ebb13a65cccaae5cb1b9d5af51fe41f534357
Here is the base64 encode that results in my code:
MjAzZWJiMTNhNjVjY2NhYWU1Y2IxYjlkNWFmNTFmZTQxZjUzNDM1Nw==
According to http://www.motobit.com/util/base64-decoder-encoder.asp, that is the correct result.
But according to http://www.online-convert.com/result/096d7b00138f3726daee5f6ddb107a62 (provided with the secret and base string, not the hash), a different Base64 string should have been output. Note that its hash output matches my correct hash despite the difference in Base64.
Finally, the "official" tester (http://hueniverse.com/oauth/guide/authentication/) outputs yet a third Base64 string from the same hash.
I have no idea what I'm doing wrong, and the fact that these tools output different results makes me wonder whether there is really a single Base64 encoding or whether they are actually using different algorithms. Perhaps the fact that it's for OAuth will help you identify the answer.
Thanks for any leads from the wise.
OK, in this case the first website was making the same "mistake" I was (in my case it was a mistake, the first website may just be making an unstated assumption).
That mistake is whether the hash is interpreted as a string of text (which gets Base64-encoded character by character) or as a series of hexadecimal byte values (which get decoded to binary and then Base64-encoded). In the former case the resulting encoding is longer than the original string, while in the latter it is shorter. This is not only empirically true; representing binary data compactly is one of the ideas behind the standard in the first place.
The second website, working from (as stated) "hex" data, got the correct answer.
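In Ruby, the two interpretations look like this; the first reproduces my (wrong) output above, while the second treats the digest as hex bytes, which is what OAuth expects:

require 'base64'

hex = '203ebb13a65cccaae5cb1b9d5af51fe41f534357'

# Interpretation 1: the hex digest as a plain ASCII string (40 bytes in, 56 chars out)
Base64.strict_encode64(hex)
# => "MjAzZWJiMTNhNjVjY2NhYWU1Y2IxYjlkNWFmNTFmZTQxZjUzNDM1Nw=="

# Interpretation 2: the hex digest decoded to its 20 raw bytes (20 bytes in, 28 chars out)
Base64.strict_encode64([hex].pack('H*'))
# => "ID67E6ZczKrlyxudWvUf5B9TQ1c="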
As another check, an online encoder such as https://base64-encode.org can convert arbitrary input (including images) to a Base64 string.

What's the character set of SHA1?

I need to know what characters SHA1 will generate for me.
Is it possible to know the character set of SHA1? Or, if it's configurable, what is its default character set?
Thank you.
SHA-1 doesn't generate text, it generates a binary hash (like most digests), so it doesn't have a charset (or care about the input's charset for that matter).
You can represent it as text if you want (a hex string representation and Base64 are both popular), especially if you need to transfer it over the network or display it to users. That encoding is up to you.
I'm fairly sure it's just binary data rather than any character encoding. You could then encode that in Base64 if you like.
The SHA1 hash algorithm takes a stream of bytes as input and calculates a 160-bit digest. Command-line versions output the digest as a hexadecimal string. No charsets involved.
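A short Ruby illustration of the point: the digest itself is raw bytes, and hex or Base64 are just text representations applied afterwards:

require 'digest'
require 'base64'

raw = Digest::SHA1.digest('hello')   # 20 raw bytes, not text in any charset
raw.bytesize                         # => 20
Digest::SHA1.hexdigest('hello')      # => "aaf4c61ddcc5e8a2dabede0f3b482cd9aea9434d"
Base64.strict_encode64(raw)          # => "qvTGHdzF6KLavt4PO0gs2a6pQ00="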

Encoded characters in NSString

I'm using the TBXML framework to parse some XML, but am having problems with the returned string values. The problem is that the returned values contain parts such as "&#163;" instead of "£", etc. Is there a convenient way to simply convert all of these into the correct characters so that they can be displayed in a UILabel?
Thanks
Maybe this can help you further:
HTML character decoding in Objective-C / Cocoa Touch
You can maybe use HTML entity decoding to produce your currency character.
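As a sketch of the idea in Ruby (the language used elsewhere on this page; the linked question shows the Objective-C approach), decoding the numeric entity recovers the character:

require 'cgi'

# "&#163;" is the numeric HTML entity for the pound sign
CGI.unescapeHTML('Price: &#163;5')   # => "Price: £5"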

How to decode image data in JSON format in Rails 3?

In Rails 3, I want to post bitmap image data in JSON format to the server, so I take the following steps.
1. On the client, translate the bitmap image data to a string.
2. Encode the string in JSON format and post it to the server.
3. On the server, decode the bitmap image data from the JSON.
Now the problem is: bitmap image data contains many 0 bytes and other unprintable bytes, and after JSON encoding, a 0 byte is translated to \u0000, a newline byte to \u000a, and so on.
On the server end, I use ActiveResource::Formats::JsonFormat.decode to decode the JSON string, but the method stops when it meets \u0000. For example,
the JSON string "\u0066\u0066\u0000\u0066\u0066" will be decoded as "ff", and the remaining three bytes are silently discarded.
So how do I resolve this problem? Should I write a function to decode the JSON string myself?
You should really be POSTing that data as binary in a multipart form.
If you must encode it into a string, use base64.
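A minimal sketch of the Base64 route in Ruby (the file name and JSON field name are made up for illustration):

require 'base64'
require 'json'

image_bytes = File.binread('photo.bmp')   # raw binary; may contain 0x00 bytes

# Client side: Base64 turns arbitrary binary into JSON-safe ASCII
payload = { 'image' => Base64.encode64(image_bytes) }.to_json

# Server side: parse the JSON, then decode the Base64 back to binary
decoded = Base64.decode64(JSON.parse(payload)['image'])
decoded == image_bytes   # => true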
