Decode base64 Data to a UTF-8 String in Swift - iOS

A library saves a token as Data in UserDefaults.
// loading the token
if let token = UserDefaults.standard.object(forKey: "token") as? Data {
    let base64Encoded = token.base64EncodedData
    let base64EncodedString = token.base64EncodedString()
}
When I print that value in the console I get: Y2E2N2Y5NTItNDVkOC00YzZkLWFkZDMtZGRiMjc5NGE3YWI2OjdmZDU1ZTAyLWExMjEtNGQ1ZC05N2MzLWM5OWY4NTg5NTIzNg== as Data.
When I copy this value from the console output and paste it into https://www.base64decode.org/, I get the expected result: ca67f952-45d8-4c6d-add3-ddb2794a7ab6:7fd55e02-a121-4d5d-97c3-c99f85895236
My problem is that I can't convert that base64Encoded value to a String in my code.
When I use base64EncodedString() I do get a string, but I can't figure out what format it is: PFsYdirs21247rRG/XcWOVGUGUcCCTzXz7VFAnunLJU= (base64EncodedString() is a method from Apple: https://developer.apple.com/documentation/foundation/nsdata/1413546-base64encodedstring)
When I decode this value as UTF-8 on that website I get: <[v*]Fw9QG
When I try to convert my base64Encoded (Data, not String) to a string like this:
if let toString = String(data: base64Encoded, encoding: String.Encoding.utf8) {
    print("test \(toString)")
}
the compiler throws an error: Cannot convert value of type '(Data.Base64EncodingOptions) -> Data' to expected argument type 'Data'
Here I found a solution for decoding when the value is a String rather than Data:
https://stackoverflow.com/a/31859383/4420355
So I'm quite confused about these results. Long story short: I have base64-encoded Data (Y2E2N2Y5NTItNDVkOC00YzZkLWFkZDMtZGRiMjc5NGE3YWI2OjdmZDU1ZTAyLWExMjEtNGQ1ZC05N2MzLWM5OWY4NTg5NTIzNg==) and want to convert it to a UTF-8 string, but I can't manage it.

Related

Decode a JSON object escaped in a String

I get a response from an API (unfortunately, I cannot change it) that looks something like this (just an example):
As bytes => "{\"key\":\"value\"}"
The starting and ending quotes and the escaped quotes are all part of the response. I'm currently solving it in a really ugly way that looks like this:
// (...) Receiving response
guard var responseString = String(bytes: data, encoding: .utf8) else {
    print("Response wasn't just a string, great!") // unfortunately, this never happens
    return
}
responseString = responseString.trimmingCharacters(in: .whitespacesAndNewlines) // make sure it is trimmed
responseString = String(responseString.dropFirst()) // drop the quote at the start
responseString = String(responseString.dropLast()) // drop the quote at the end
responseString = responseString.replacingOccurrences(of: "\\\"", with: "\"") // convert all \" to " (and hope nothing else is escaped <<< this is really bad!!!)
let responseDataToDecode = responseString.data(using: .utf8)!
// (...) decoding with JSONDecoder
Is there a way to automatically unescape the string and use the JSON object that is contained in it?
If it's double-encoded, then you just need to double-decode. If I understand correctly, the incoming data is like this:
let str = #""{\"key\":\"value\"}""#
// "{\\"key\\":\\"value\\"}"
The first byte is ", the second byte is {, the third byte is \, the fourth byte is ".
That's a JSON-encoded string. So decode that as a string (there was a time when this didn't work because it's a "fragment," but it works fine currently, at least in all my tests):
let decoder = JSONDecoder()
let string = try! decoder.decode(String.self, from: Data(str.utf8)) // {"key":"value"}
And then decode that as your type ([String:String] just for example):
let result = try! decoder.decode([String:String].self, from: Data(string.utf8))
// ["key": "value"]
(IMO this kind of double-encoding is fine, BTW, and I'm not sure why there are so many comments against it. Serializing an arbitrary object makes a lot more sense in many cases than forcing the schema to deal with an arbitrary structure. As long as it's cleanly encoded, I don't see any problem here.)
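The two decode steps above can be folded into one small generic helper; this is my own sketch (the Payload type is a made-up example), assuming fragment decoding works as the answer describes:

```swift
import Foundation

// Hypothetical payload type, standing in for whatever the API returns.
struct Payload: Decodable {
    let key: String
}

// Double-decode: first the outer JSON string fragment, then the object inside it.
func decodeDoubleEncoded<T: Decodable>(_ type: T.Type, from data: Data) throws -> T {
    let decoder = JSONDecoder()
    let inner = try decoder.decode(String.self, from: data)    // unescapes the outer string
    return try decoder.decode(T.self, from: Data(inner.utf8))  // decodes the real object
}

let raw = Data(#""{\"key\":\"value\"}""#.utf8)
let payload = try decodeDoubleEncoded(Payload.self, from: raw)
print(payload.key) // value
```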
But there's a prior step: you need an official, documented statement of what exactly the format of your data is. It looks like someone took some data, turned it into JSON data, interpreted that data as a string, and then converted the string to a JSON fragment. It's not difficult to decode the JSON fragment to get a string, turn the string into data, and decode that data as JSON (starting with JSONSerialization and .allowFragments, probably the only time you should use .allowFragments in your life). Doing it without swearing is hard.
But first you want in writing that this is actually the format. Because I would bet that whoever is responsible for that data format will eventually change it without telling you and break your code.

Emojis showing up as question marks in app made using swift(iOS), java(android), ruby(server), mongodb(database)

I've been working on a chat application in which users can send emojis. I take whatever the user enters in the UITextField, put it in an NSDictionary, and send it to the server as JSON. On the server the message is read as a string in Ruby and then stored in MongoDB. When the other client makes the get-messages API call, the emojis show up as a box or a '?'.
P.S.: only emojis with a 5-digit code point show up like that, e.g. \u1F602,
but emojis with a 4-digit code point show up fine, e.g. \u2764.
I don't know whether the problem is in the client, the server, or the database, so I don't know which code to add here. Please ask in the comments for the code you need and I'll post it.
It feels like the problem is on the server, since it occurs on both Android and iOS devices.
I've been banging my head on this for more than a month now. Would love it if someone could help.
Thanks
----EDIT----
I understand that in Ruby \u{1F602} works, but I don't know how to make the clients send it in that format. I'm just taking whatever the user types in the UITextField (iOS) or EditText (Android) and sending it as-is.
Is there a way I can make that change in the client, or fix it on the server somehow?
In iOS,
To encode emojis to Unicode escapes, use the code below:
let msg: String = "😂😂"

extension String {
    var encodeEmoji: String {
        if let encodeStr = NSString(cString: self.cString(using: .nonLossyASCII)!, encoding: String.Encoding.utf8.rawValue) {
            return encodeStr as String
        }
        return self
    }
}

let msgdata: String = msg.encodeEmoji
Send the encoded string to the server.
To decode Unicode escapes back to emojis, use the code below.
The response you get from the server contains the Unicode escapes; decode them to emoji like this:

extension String {
    var decodeEmoji: String {
        let data = self.data(using: String.Encoding.utf8)
        let decodedStr = NSString(data: data!, encoding: String.Encoding.nonLossyASCII.rawValue)
        if let str = decodedStr {
            return str as String
        }
        return self
    }
}

let decodedstring = "Your Unicode String".decodeEmoji
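A crash-safe variant of these two extensions (my rewrite, avoiding the force unwraps) that round-trips an emoji through its \uXXXX escaped form:

```swift
import Foundation

extension String {
    // Escape non-ASCII characters (including emoji) into \uXXXX sequences.
    var encodeEmoji: String {
        guard let bytes = self.data(using: .nonLossyASCII),
              let escaped = String(data: bytes, encoding: .utf8) else { return self }
        return escaped
    }
    // Reverse: turn \uXXXX sequences back into the original characters.
    var decodeEmoji: String {
        guard let bytes = self.data(using: .utf8),
              let unescaped = String(data: bytes, encoding: .nonLossyASCII) else { return self }
        return unescaped
    }
}

let original = "😂"
let escaped = original.encodeEmoji   // e.g. \ud83d\ude02 — plain ASCII, safe to ship
let restored = escaped.decodeEmoji   // back to the emoji
print(restored)
```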
If anyone is looking for an answer, this is how I fixed it.
For Android:
In your Gradle file, add the following dependency:
compile 'org.apache.commons:commons-lang3:3.6'
Then, while sending a message to the server, encode the text like this:
String msg = "User message here with emoji";
msg = StringEscapeUtils.escapeJava(msg);
And to decode the message after receiving it from the server, use:
String msg = "User message here with emoji";
msg = StringEscapeUtils.unescapeJava(msg);
For iOS:
(Using @Ankit Chauhan's answer)
let msg: String = "😂😂"

extension String {
    var encodeEmoji: String {
        if let encodeStr = NSString(cString: self.cString(using: .nonLossyASCII)!, encoding: String.Encoding.utf8.rawValue) {
            return encodeStr as String
        }
        return self
    }
}

let msgdata: String = msg.encodeEmoji
And to decode, use this:
extension String {
    var decodeEmoji: String {
        let data = self.data(using: String.Encoding.utf8)
        let decodedStr = NSString(data: data!, encoding: String.Encoding.nonLossyASCII.rawValue)
        if let str = decodedStr {
            return str as String
        }
        return self
    }
}

let decodedstring = "Your Unicode String".decodeEmoji
The simplified notation without curly brackets assumes there are exactly four hex digits following \u. For code points that need more digits (such as emoji outside the BMP) one should use the complete expression:
▶ "\u{1F602}"
#⇒ "😂"
For Android I had to do 2 things.
1. Use escapeJava when I send a message to the server and unescapeJava when I receive one:
'org.apache.commons:commons-lang3' // deprecated, don't use
'org.apache.commons:commons-text:1.2'
StringEscapeUtils.escapeJava(message)
StringEscapeUtils.unescapeJava(message)
2. Use the EmojiCompat library from Google:
https://developer.android.com/guide/topics/ui/look-and-feel/emoji-compat.html

How to convert a string into JSON using SwiftyJSON

The string to convert:
[{"description": "Hi","id":2,"img":"hi.png"},{"description": "pet","id":10,"img":"pet.png"},{"description": "Hello! :D","id":12,"img":"hello.png"}]
The code to convert the string:
var json = JSON(stringLiteral: stringJSON)
The string is converted to JSON, but when I try to count how many blocks are inside it (expected answer = 3), I get 0:
print(json.count)
Console output: 0
What am I missing? Help is very much appreciated.
Actually, there was a built-in function in SwiftyJSON called parse:
/**
 Create a JSON from JSON string
 - parameter string: Normal json string like '{"a":"b"}'
 - returns: The created JSON
 */
public static func parse(string: String) -> JSON {
    return string.dataUsingEncoding(NSUTF8StringEncoding)
        .flatMap({ JSON(data: $0) }) ?? JSON(NSNull())
}
Note that
var json = JSON.parse(stringJSON)
has now changed to
var json = JSON.init(parseJSON: stringJSON)
I fixed it this way.
I will use the variable "string" as the variable that contains the JSON.
1. Encode the string as NSData like this:
var encodedString: NSData = (string as NSString).dataUsingEncoding(NSUTF8StringEncoding)!
2. Un-encode the encoded string (this may sound a little weird, hehe):
var finalJSON = JSON(data: encodedString)
Then you can do whatever you like with this JSON, like get the number of sections in it (this was the real question) with finalJSON.count, or print(finalJSON[0]), or whatever you like.
There is a built-in parser in SwiftyJSON:
let json = JSON.init(parseJSON: responseString)
Don't forget to import SwiftyJSON!
I use it as follows:
let yourString = NSMutableString()
let dataToConvert = yourString.data(using: String.Encoding.utf8.rawValue)
let json = JSON(data: dataToConvert!)
print("\nYour string: " + String(describing: json))
Swift 4:
let json = string.data(using: String.Encoding.utf8).flatMap({ try? JSON(data: $0) }) ?? JSON(NSNull())
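If SwiftyJSON isn't a hard requirement, the string from the question also parses with plain Codable; a sketch with a hypothetical Item type matching the question's fields:

```swift
import Foundation

// Hypothetical model mirroring the keys in the question's JSON.
struct Item: Decodable {
    let description: String
    let id: Int
    let img: String
}

let stringJSON = """
[{"description": "Hi","id":2,"img":"hi.png"},{"description": "pet","id":10,"img":"pet.png"},{"description": "Hello! :D","id":12,"img":"hello.png"}]
"""

let items = try JSONDecoder().decode([Item].self, from: Data(stringJSON.utf8))
print(items.count) // 3
```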

Parse UTF8 json file in swift not work correctly with SwiftyJson

I tried to load a UTF-8 JSON file in Swift and received the response as NSData. Then I converted it to NSString with:
let responseString = NSString(data: myNSdata!, encoding: NSUTF8StringEncoding)
let json = JSON(responseString)
print(json["now"].string)
print(json.string)
The first print shows nil but the second print shows the file correctly, except there is a \ character before each ", and I think this is the reason. Please help me find a solution.
No need to make a string to init JSON(), do it with the data:
let json = JSON(data: myNSdata!)

Cannot convert empty array of arrays to JSON with Swift 2

What is wrong with this piece of code (which was inspired by this example)? It currently prints the JSON string "(<5b5d>, 4)" instead of the expected "[]".
var tags: [[String]] = []
// tags to be added later ...
do {
    let data = try NSJSONSerialization.dataWithJSONObject(tags, options: [])
    let json = String(data: data, encoding: NSUTF8StringEncoding)
    print("\(json)")
}
catch {
    fatalError("\(error)")
}
Short answer: The creation of the JSON data is correct. The problem is in the conversion of the data to a string; what you want is the NSString method:
let json = NSString(data: data, encoding: NSUTF8StringEncoding) as! String
which produces the expected [].
Slightly longer answer:
Your code
let json = String(data: data, encoding: NSUTF8StringEncoding)
calls the String init method
/// Initialize `self` with the textual representation of `instance`.
/// ...
init<T>(_ instance: T)
and the result is the textual representation of the tuple
(data: data, encoding: NSUTF8StringEncoding):
(<5b5d>, 4)
Actually you can call String() with arbitrary arguments
let s = String(foo: 1, bar: "baz")
print(s) // (1, "baz")
in Swift 2. This does not compile in Swift 1.2, so I am not sure whether this is intended. I have posted a question about it in the Apple Developer Forums:
String init accepts arbitrary arguments.
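For reference, in Swift 3 and later, String gained a genuine failable init(data:encoding:), so the original call now does what the asker intended; a minimal sketch:

```swift
import Foundation

let tags: [[String]] = []
let data = try JSONSerialization.data(withJSONObject: tags, options: [])
// In Swift 3+ this is a real initializer returning String?, not a tuple print.
if let json = String(data: data, encoding: .utf8) {
    print(json) // []
}
```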
