How to convert a Unicode hex string into its ASCII equivalent - C++Builder

I hope everything is going well.
I have this UnicodeString:
353135313531353135313531
And I want to transform it into another unicodestring with this content:
515151515151
In other words, convert a hex representation into its ASCII interpretation.
It is very straightforward to do this in C, but the idea is to work with C++ Builder.
This is what I have been trying to do:
String hex_to_ascii(const String& hex_str) {
    String ascii_str = "";
    for (int i = 1; i <= hex_str.Length(); i += 2) {
        String hex_char = hex_str.SubString(i, 2);
        int ascii_char = hex_char.ToInt();
        // ascii_str += String().sprintf(_D("%c"), ascii_char);
        ascii_str.Insert(ascii_char, ascii_str.Length() + 1);
    }
    return ascii_str;
}
But no luck so far.
I know there is a method called ToHex. I've been trying to find documentation about it because it's related to what I am trying to do, so the library that provides it probably also has something close to what I need.
If you know how to do this, or where I can read about the ToHex method, please let me know. Thank you for reading.

The code you have is very close; it just needs some minor tweaks.
Most importantly, String::ToInt() WILL NOT decode hex, as you are expecting. It will convert "35" to an integer with a value of decimal 35 (NOT hex 0x35, decimal 53), and will convert "31" to an integer with a value of decimal 31 (NOT hex 0x31, decimal 49), etc.
You need to instead use Sysutils::StrToInt() with a 0x hex prefix prepended to the string value.
Try this:
String hex_to_ascii(const String& hex_str) {
    String ascii_str;
    for (int i = 1; i <= hex_str.Length(); i += 2) {
        String hex_char = _D("0x") + hex_str.SubString(i, 2);
        int ascii_char = StrToInt(hex_char);
        ascii_str += static_cast<Char>(ascii_char);
    }
    return ascii_str;
}
Alternatively, you can use HexToBin() to decode the hex into a byte array, and then construct a UnicodeString from those bytes, e.g.:
String hex_to_ascii(const String& hex_str) {
    TBytes bytes;
    bytes.Length = hex_str.Length() / 2;
    HexToBin(hex_str.c_str(), &bytes[0], bytes.Length);
    return String((char*)&bytes[0], bytes.Length);
    // Alternatively:
    // return TEncoding::Default->GetString(bytes);
}

Related

Flutter and Dart - how can I format an integer date in one line of code

How can I format an integer such as 20190331 to output as "2019-03-31", without first converting it to a date and without splitting it using substring? The date is stored as an integer in SQLite (SQFlite). I would like to do it in one line, such as this (pseudo code):
String sDate = fmt("####'-'##'-'##", map["DueDate"]);
I know that I could do it like this:
String sDate = map["DueDate"].toString();
sDate = sDate.substring(0,4)+'-'+sDate.substring(4,6)+'-'+sDate.substring(6,8);
That, however, is two lines of code, and Visual Studio Code turns it into eight lines when formatted; I like to keep my code compact.
Write a function called fmt and call it as in your pseudo code.
String fmt(String f, int i) {
  StringBuffer sb = StringBuffer();
  RuneIterator format = RuneIterator(f);
  RuneIterator input = RuneIterator(i.toString());
  while (format.moveNext()) {
    var currentAsString = format.currentAsString;
    if (currentAsString == '#') {
      // consume one digit of the input for each '#' placeholder
      input.moveNext();
      sb.write(input.currentAsString);
    } else {
      // copy literal characters (such as '-') straight through
      sb.write(currentAsString);
    }
  }
  return sb.toString();
}
Then it takes one line:
print(fmt('####-##-##', 20190331)); // 2019-03-31

Swift 3 - How to convert memory of Int32 as four characters

I want to convert an Int32 to a string consisting of four C-style, 1-byte wide characters (probably closely related to this but in Swift 3).
The use for this is that many API functions of Core Audio return an OSStatus (really an Int32), which can often be interpreted as string consisting of four C-style characters.
func interpretAsString(possibleMsg: Int32) -> String {
    // Blackbox
}
Actually a "four character code" is usually an unsigned 32-bit
value:
public typealias FourCharCode = UInt32
public typealias OSType = FourCharCode
The four bytes (from the MSB to the LSB) each define one character.
Here is a simple Swift 3 function to convert the integer to a string, inspired by the various C/Objective-C/Swift 1+2 solutions in iOS/C: Convert "integer" into four character string:
func fourCCToString(_ value: FourCharCode) -> String {
    let utf16 = [
        UInt16((value >> 24) & 0xFF),
        UInt16((value >> 16) & 0xFF),
        UInt16((value >> 8) & 0xFF),
        UInt16(value & 0xFF) ]
    return String(utf16CodeUnits: utf16, count: 4)
}
Example:
print(fourCCToString(0x48454C4F)) // HELO
I have chosen an array with the UTF-16 code points as intermediate storage because that can directly be used to create a string.
If you really need it for a signed 32-bit integer then you can call
fourCCToString(FourCharCode(bitPattern: i32value))
or define a similar function taking an Int32 parameter.
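For instance, a minimal sketch of such an overload (my addition, not part of the original answer) could simply forward to the UInt32 version:
func fourCCToString(_ value: Int32) -> String {
    // Reinterpret the signed bit pattern as a FourCharCode (UInt32)
    // and reuse the implementation above.
    return fourCCToString(FourCharCode(bitPattern: value))
}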
As Tim Vermeulen suggested below, the UTF-16 array can also be created with map:
let utf16 = stride(from: 24, through: 0, by: -8).map {
    UInt16((value >> $0) & 0xFF)
}
or
let utf16 = [24, 16, 8, 0].map { UInt16((value >> $0) & 0xFF) }
Unless the function is performance critical for your application, pick what you feel most familiar with (otherwise measure and compare).
I haven't tested this code, but try this:
func interpretAsString(possibleMsg: Int32) -> String {
    // Work on the raw bit pattern so the shifts and masks stay within UInt32
    let value = UInt32(bitPattern: possibleMsg)
    var result = String()
    result.append(Character(UnicodeScalar((value >> 24) & 0xFF)!))
    result.append(Character(UnicodeScalar((value >> 16) & 0xFF)!))
    result.append(Character(UnicodeScalar((value >> 8) & 0xFF)!))
    result.append(Character(UnicodeScalar(value & 0xFF)!))
    return result
}
This may be an old question, but since it was asked in the context of Core Audio, I just wanted to share a variant I was playing with.
For Core Audio, where some (but not all?) OSStatus/Int32 values are defined using four characters, some code from Apple's old Core Audio Utility Classes can provide inspiration (very similar to the linked question).
From CAXException.h:
class CAX4CCStringNoQuote {
public:
    CAX4CCStringNoQuote(OSStatus error) {
        // see if it appears to be a 4-char-code
        UInt32 beErr = CFSwapInt32HostToBig(error);
        char *str = mStr;
        memcpy(str, &beErr, 4);
        if (isprint(str[0]) && isprint(str[1]) && isprint(str[2]) && isprint(str[3])) {
            str[4] = '\0';
        } else if (error > -200000 && error < 200000)
            // no, format it as an integer
            snprintf(str, sizeof(mStr), "%d", (int)error);
        else
            snprintf(str, sizeof(mStr), "0x%x", (int)error);
    }
    const char *get() const { return mStr; }
    operator const char *() const { return mStr; }
private:
    char mStr[16];
};
In Swift 5, one rough translation (without the hex representation for large values) might be:
private func osStatusToString(_ value: OSStatus) -> String {
    let data = withUnsafeBytes(of: value.bigEndian, { Data($0) })
    // If all bytes are printable characters, we treat it like characters of a string
    if data.allSatisfy({ 0x20 <= $0 && $0 <= 0x7e }) {
        return String(data: data, encoding: .ascii)!
    } else {
        return String(value)
    }
}
Note that the Data initializer is making a copy of the bytes, though it may be possible to avoid that if desired.
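For example, a rough sketch that avoids the copy (my addition, assuming the same printable-ASCII check is wanted) can decode the buffer in place inside withUnsafeBytes:
private func osStatusToStringNoCopy(_ value: OSStatus) -> String {
    return withUnsafeBytes(of: value.bigEndian) { buffer -> String in
        // Same printable-ASCII test as above, but on the raw buffer,
        // so no intermediate Data is allocated.
        if buffer.allSatisfy({ 0x20 <= $0 && $0 <= 0x7e }) {
            return String(decoding: buffer, as: UTF8.self)
        }
        return String(value)
    }
}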
Of course, with Core Audio we encounter four character codes with both Int32 and UInt32 types. I haven't done generics with Swift before, but one way to handle them in a single function could be:
private func stringifyErrorCode<T: FixedWidthInteger>(_ value: T) -> String {
    let data = withUnsafeBytes(of: value.bigEndian, { Data($0) })
    // If all bytes are printable characters, we treat it like characters of a string
    if data.allSatisfy({ 0x20 <= $0 && $0 <= 0x7e }) {
        return String(data: data, encoding: .ascii)!
    } else {
        return String(value, radix: 10)
    }
}
This may not be suitable for general-purpose handling of four character codes (I've seen other answers that support characters in the MacOS Roman encoding rather than the ASCII used in the example above; there's likely some history there I'm not aware of), but it may be reasonable for Core Audio status/selector codes.
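If the MacOS Roman behavior were wanted instead, a minimal sketch (my guess at what those other answers do, not verified against them) would only swap the decoding step:
import Foundation

// Decode the four big-endian bytes as MacOS Roman rather than ASCII.
func fourCCMacRoman(_ value: UInt32) -> String? {
    let data = withUnsafeBytes(of: value.bigEndian) { Data($0) }
    return String(data: data, encoding: .macOSRoman)
}

print(fourCCMacRoman(0x48454C4F) ?? "?") // HELO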

How to use client.read() to read string between two characters?

In my ESP8266 WiFi project I'm getting characters from a website through a GET request. The current code is this:
while (client.available()) {
    String line = client.readStringUntil('\r');
    Serial.print(line);
}
To get a string between particular characters, how do I edit this?
Put this snippet after the read-string operation, change the divider delimiters below to yours, and gatheredStr will be your desired string:
char firstDivider = 'X';
char secondDivider = 'Y';
int firstDividerIndex = line.indexOf(firstDivider);
int secondDividerIndex = line.indexOf(secondDivider);
// Start one character past the first divider so it is excluded from the
// result; substring()'s end index is already exclusive.
String gatheredStr = line.substring(firstDividerIndex + 1, secondDividerIndex);

How to express Strings in Swift using Unicode hexadecimal values (UTF-16)

I want to write a Unicode string using hexadecimal values in Swift. I have read the documentation for String and Character so I know that I can use special Unicode characters directly in strings like the following:
var variableString = "Cat‼🐱" // "Cat" + Double Exclamation + cat emoji
But I would like to do it using the Unicode code points. The docs (and this question) show it for characters, but are not very clear about how to do it for strings.
(Note: Although the answer seems obvious to me now, it wasn't obvious at all just a short time ago. I am answering my own question below as a means of learning how to do this and also to help myself understand Unicode terminology and how Swift Characters and Strings work.)
Character
The Swift syntax for forming a hexadecimal code point is
\u{n}
where n is a hexadecimal number up to 8 digits long. The valid range for a Unicode scalar is U+0 to U+D7FF and U+E000 to U+10FFFF inclusive. (The U+D800 to U+DFFF range is for surrogate pairs, which are not scalars themselves, but are used in UTF-16 for encoding the higher value scalars.)
Examples:
// The following forms are equivalent. They all produce "C".
let char1: Character = "\u{43}"
let char2: Character = "\u{0043}"
let char3: Character = "\u{00000043}"
// Higher value Unicode scalars are done similarly
let char4: Character = "\u{203C}" // ‼ (DOUBLE EXCLAMATION MARK character)
let char5: Character = "\u{1F431}" // 🐱 (cat emoji)
// Characters can be made up of multiple scalars
let char7: Character = "\u{65}\u{301}" // é = "e" + accent mark
let char8: Character = "\u{65}\u{301}\u{20DD}" // é⃝ = "e" + accent mark + circle
Notes:
Leading zeros can be added or omitted
Characters are known as extended grapheme clusters. Even when they are composed of multiple scalars, they are still considered a single character. What is key is that they appear to be a single character (grapheme) to the user.
TODO: How to convert surrogate pair to Unicode scalar in Swift
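One way to do that conversion (my addition; the note above leaves it as a TODO) is the standard UTF-16 formula: scalar = 0x10000 + (high - 0xD800) * 0x400 + (low - 0xDC00). For example:
// Combine the UTF-16 surrogate pair for the cat emoji (U+1F431).
let high: UInt16 = 0xD83D // high (lead) surrogate
let low: UInt16 = 0xDC31  // low (trail) surrogate
let scalarValue = 0x10000 + ((UInt32(high) - 0xD800) << 10) + (UInt32(low) - 0xDC00)
if let scalar = UnicodeScalar(scalarValue) {
    print(Character(scalar)) // 🐱
}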
String
Strings are composed of characters. See the following examples for some ways to form them using hexadecimal code points.
Examples:
var string1 = "\u{0043}\u{0061}\u{0074}\u{203C}\u{1F431}" // Cat‼🐱
// pass an array of characters to a String initializer
let catCharacters: [Character] = ["\u{0043}", "\u{0061}", "\u{0074}", "\u{203C}", "\u{1F431}"] // ["C", "a", "t", "‼", "🐱"]
let string2 = String(catCharacters) // Cat‼🐱
Converting Hex Values at Runtime
At runtime you can convert hexadecimal or Int values into a Character or String by first converting it to a UnicodeScalar.
Examples:
// hex values
let value0: UInt8 = 0x43 // 67
let value1: UInt16 = 0x203C // 8252
let value2: UInt32 = 0x1F431 // 128049
// convert hex to UnicodeScalar
let scalar0 = UnicodeScalar(value0)
// make sure that UInt16 and UInt32 form valid Unicode values
guard
    let scalar1 = UnicodeScalar(value1),
    let scalar2 = UnicodeScalar(value2)
else {
    return
}
// convert to Character
let character0 = Character(scalar0) // C
let character1 = Character(scalar1) // ‼
let character2 = Character(scalar2) // 🐱
// convert to String
let string0 = String(scalar0) // C
let string1 = String(scalar1) // ‼
let string2 = String(scalar2) // 🐱
// convert hex array to String
let myHexArray = [0x43, 0x61, 0x74, 0x203C, 0x1F431] // an Int array
var myString = ""
for hexValue in myHexArray {
    if let scalar = UnicodeScalar(hexValue) {
        myString.append(Character(scalar))
    }
}
print(myString) // Cat‼🐱
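A more compact alternative (my addition, not from the original answer) does the same thing with compactMap:
let myString2 = myHexArray
    .compactMap { UnicodeScalar($0) } // drop any invalid scalar values
    .map { String($0) }
    .joined()
print(myString2) // Cat‼🐱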
Further reading
Strings and Characters docs
Glossary of Unicode Terms
Strings in Swift
Working with Unicode code points in Swift
From your hex (such as "0x1F52D") to an actual emoji:
let c = 0x1F602
The next step would be getting a UInt32 from your hex value:
let intEmoji = UnicodeScalar(c)!.value
From this you can do something like:
titleLabel.text = String(UnicodeScalar(intEmoji)!)
Here you have "😂".
It works with ranges of hexadecimal values too:
let emojiRanges = [
    0x1F600...0x1F636,
    0x1F645...0x1F64F,
    0x1F910...0x1F91F,
    0x1F30D...0x1F52D
]
var data = [UInt32]() // collect the scalar values here
for range in emojiRanges {
    for i in range {
        let c = UnicodeScalar(i)!.value
        data.append(c)
    }
}
to get multiple UInt32 values from your hex ranges, for example.

strtoul() Function - Swift

I'm trying to create a Swift iOS program that converts a number into dec, bin, and hex numbers. I've come across the strtoul function but don't quite understand how to use it; would someone be able to explain it? Thanks!
The strtoul function is pretty simple to use. You will also need String(_:radix:) to convert in the other direction. You can create extensions such as hexaToDecimal or binaryToDecimal as follows:
Usage of String(_:radix:):
extension Int {
    var toBinary: String {
        return String(self, radix: 2)
    }
    var toHexa: String {
        return String(self, radix: 16)
    }
}
Usage of strtoul():
extension String {
    var hexaToDecimal: Int {
        return Int(strtoul(self, nil, 16))
    }
    var hexaToBinary: String {
        return hexaToDecimal.toBinary
    }
    var binaryToDecimal: Int {
        return Int(strtoul(self, nil, 2))
    }
    var binaryToHexa: String {
        return binaryToDecimal.toHexa
    }
}
Testing
let myBinFromInt = 255.toBinary // "11111111"
let myhexaFromInt = 255.toHexa // "ff"
let myIntFromHexa = "ff".hexaToDecimal // 255
let myBinFromHexa = "ff".hexaToBinary // "11111111"
let myIntFromBin = "11111111".binaryToDecimal // 255
let myHexaFromBin = "11111111".binaryToHexa // "ff"
The strtoul() function converts the string in str to an unsigned long value. The conversion is done according to the given base, which must be between 2 and 36 inclusive, or be the special value 0.
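As a quick sketch of that special base 0 (my addition, following the C behavior that strtoul inherits), the radix is inferred from the string's prefix:
import Foundation

let fromHex = strtoul("0xff", nil, 0) // 255 ("0x" prefix selects base 16)
let fromOct = strtoul("017", nil, 0)  // 15 (leading "0" selects base 8)
let fromDec = strtoul("42", nil, 0)   // 42 (otherwise base 10)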
Really, it sounds like you want to use NSString.
From what it sounds like, you want to convert an unsigned integer to decimal, hex and binary.
For example, if you had an integer n:
var st = NSString(format:"%2X", n)
would convert the integer to hexadecimal and store it in the variable st.
//NSString(format:"%2X", 10) would give you 'A' as 10 is A in hex
//NSString(format:"%2X", 17) would give you 11 as 17 is 11 in hex
Binary (note: NSString format specifiers have no binary conversion, so use String(_:radix:) here instead):
var st = String(n, radix: 2)
Decimal (2 decimal places; %f expects a floating-point value):
var st = NSString(format:"%.02f", Double(n))
