Does golang provide htonl/htons? - network-programming

In C programming, when I want to send an integer across the network, I need to use htonl() or htons() to convert the integer from host byte order to network byte order before sending it.
But in Go, I have checked the net package and can't find similar functions like htons/htonl. So how should I send an integer in Go? Do I need to implement htons/htonl myself?

Network byte order is just big endian, so you can use the encoding/binary package to perform the encoding.
For example:
data := make([]byte, 6)
binary.BigEndian.PutUint16(data, 0x1011)
binary.BigEndian.PutUint32(data[2:6], 0x12131415)
Alternatively, if you are writing to an io.Writer, the binary.Write() function from the same package may be more convenient (again, using the binary.BigEndian value as the order argument).
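For instance, a minimal sketch using binary.Write (the bytes.Buffer destination is just for illustration; it could be a net.Conn or any other io.Writer):
package main

import (
    "bytes"
    "encoding/binary"
    "fmt"
)

func main() {
    var buf bytes.Buffer
    // binary.Write encodes fixed-size values into the writer using the given byte order.
    if err := binary.Write(&buf, binary.BigEndian, uint16(0x1011)); err != nil {
        fmt.Println("write error:", err)
    }
    if err := binary.Write(&buf, binary.BigEndian, uint32(0x12131415)); err != nil {
        fmt.Println("write error:", err)
    }
    fmt.Printf("% x\n", buf.Bytes()) // 10 11 12 13 14 15
}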

I think what you're after is ByteOrder in encoding/binary.
A ByteOrder specifies how to convert byte sequences into 16-, 32-, or 64-bit unsigned integers.
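For instance, decoding is the mirror image of the PutUintXX calls shown in the other answer; a small sketch with made-up wire bytes:
package main

import (
    "encoding/binary"
    "fmt"
)

func main() {
    // Six bytes as they might arrive off the wire, in network (big-endian) order.
    data := []byte{0x10, 0x11, 0x12, 0x13, 0x14, 0x15}

    // A ByteOrder such as binary.BigEndian decodes byte sequences back into integers.
    u16 := binary.BigEndian.Uint16(data[0:2])
    u32 := binary.BigEndian.Uint32(data[2:6])

    fmt.Printf("0x%04x 0x%08x\n", u16, u32) // 0x1011 0x12131415
}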

These helpers just convert integers between big-endian and little-endian. Note that they swap bytes unconditionally, which is correct only on a little-endian host (true of the common platforms Go targets); on a big-endian host, htons/ntohs would be no-ops.
// NetToHostShort converts a 16-bit integer from network to host byte order, aka "ntohs".
func NetToHostShort(i uint16) uint16 {
    data := make([]byte, 2)
    binary.BigEndian.PutUint16(data, i)
    return binary.LittleEndian.Uint16(data)
}

// NetToHostLong converts a 32-bit integer from network to host byte order, aka "ntohl".
func NetToHostLong(i uint32) uint32 {
    data := make([]byte, 4)
    binary.BigEndian.PutUint32(data, i)
    return binary.LittleEndian.Uint32(data)
}

// HostToNetShort converts a 16-bit integer from host to network byte order, aka "htons".
func HostToNetShort(i uint16) uint16 {
    b := make([]byte, 2)
    binary.LittleEndian.PutUint16(b, i)
    return binary.BigEndian.Uint16(b)
}

// HostToNetLong converts a 32-bit integer from host to network byte order, aka "htonl".
func HostToNetLong(i uint32) uint32 {
    b := make([]byte, 4)
    binary.LittleEndian.PutUint32(b, i)
    return binary.BigEndian.Uint32(b)
}
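A quick usage sketch, assuming the helpers above are in the same package and the host is little-endian:
fmt.Printf("0x%04x\n", HostToNetShort(0x1011))    // 0x1110
fmt.Printf("0x%08x\n", HostToNetLong(0x12131415)) // 0x15141312
fmt.Printf("0x%04x\n", NetToHostShort(0x1110))    // 0x1011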

Related

Is there a Dart function to convert List<int> to Double?

I'm getting a Bluetooth characteristic from a Bluetooth controller with flutter_blue as a List<int>. This characteristic contains the weight measurement of a Bluetooth scale. Is there a function to convert this list of ints to a double?
I tried to find some background on float representations by reading the IEEE 754 standard. There is the Dart library typed_data, but I am new to Dart and have no experience with this library.
Example:
I have this List: [191, 100, 29, 173], which is coming from a Bluetooth controller as an IEEE 754 representation of a float value.
Now I believe I have to convert each int to hex and concatenate these values: bf 64 1d ad
The next thing I need to do is convert this to a double, but I cannot find a function to convert hex to a double, only int.parse("0xbf641dad").
I guess you mean to convert the list of ints to a list of floats, not to a single float, right?
First, Dart has no type called float; instead it has double. To convert, you can use the map() function:
var listInt = [1, 2, 3];
var listDouble = listInt.map((i) => i.toDouble()).toList();
Had this same issue.
I think you mean: how do you turn the four bytes into a float32?
I needed to do the same thing.
You will want to do something like this:
First take the List<int> value as a byte buffer, then take the byte data, then you can use the getFloat32 function.
ByteBuffer buffer = new Int8List.fromList(value_in).buffer;
ByteData byteData = new ByteData.view(buffer);
double result = byteData.getFloat32(0);
Just be a little aware that the order of the bytes from the Bluetooth device may be back to front, as the convention varies.
Also, you will need the typed_data library:
import 'dart:typed_data'; // for data formatting
I'm trying to go in the other direction now... for the obvious reason.
You may need this (the hex codec comes from package:convert):
double parseHexString(String hexStr, bool littleEndian) {
  if (hexStr.length % 2 != 0) {
    return 0;
  }
  if (littleEndian) {
    List<int> bytes = hex.decode(hexStr).reversed.toList();
    hexStr = hex.encode(bytes);
  }
  var byteConvert = ByteData(12);
  byteConvert.setInt64(0, int.parse(hexStr, radix: 16));
  return byteConvert.getFloat64(0);
}
And a demo:
double lat = parseHexString("0000004069665E40", true);
// lat = 121.60017395019531
If you already have:
int.parse("0xbf641dad")
I don't see why this wouldn't work:
int.parse("0xbf641dad").toDouble();
I used dart:typed_data for this:
import 'dart:typed_data';
List<int> intList = [191, 100, 29, 173];
double asFloat = ByteData.view(Uint8List.fromList(List.from(intList)).buffer).getFloat32(0, Endian.little);
Library reference: https://api.dart.dev/be/136883/dart-typed_data/dart-typed_data-library.html
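As an aside for readers of the Go questions on this page, the same four-byte IEEE 754 decode can be sketched in Go with encoding/binary and math.Float32frombits; whether the scale sends the bytes little- or big-endian is an assumption you have to check against the device:
package main

import (
    "encoding/binary"
    "fmt"
    "math"
)

func main() {
    raw := []byte{191, 100, 29, 173}

    // Reassemble the four bytes into a uint32, then reinterpret the bits as an IEEE 754 float32.
    bits := binary.LittleEndian.Uint32(raw)
    fmt.Println(math.Float32frombits(bits))
}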

Convert string to base64 byte array in swift and java give different value

In the case of Android everything works perfectly. I want to implement the same feature in iOS too, but I am getting different values. Please check the description and outputs below.
In the Java/Android case:
I tried to decode the Base64 string to a byte array in Java like this:
byte[] data1 = Base64.decode(balance, Base64.DEFAULT);
Output: (screenshot of the resulting byte array, shown as signed values)
In the Swift 3/iOS case:
I tried to decode the Base64 string to a byte array in Swift like this:
let data:Data = Data(base64Encoded: balance, options: NSData.Base64DecodingOptions(rawValue: 0))!
let data1:Array = (data.bytes)
Output: (screenshot of the resulting byte array, shown as unsigned values)
Finally solved:
This is due to signed versus unsigned integers (0 to 255 versus -128 to 127). Here we need to convert the UInt8 array to an Int8 array, and the problem is solved.
let intArray = data1.map { Int8(bitPattern: $0) }
In no case should you try to compare data on two systems the way you just did. That goes for all types, but especially for raw data.
Raw data are NOT presentable without additional context, which means any system that does present them may choose how: raw data may represent text in UTF-8 or ASCII, a JPEG or PNG image, raw RGB pixel data, an audio sample, or whatever. In your case one system is showing them as a list of signed 8-bit integers while the other uses 8-bit unsigned integers for the same thing. Another system might, for instance, show you a hex string, which would look completely different.
As @Larme already mentioned, these look the same; it is safe to assume that one system uses signed and the other unsigned values. So to convert from signed (Android) to unsigned (iOS) you convert negative values as unsigned = 256 + signed, so for instance -55 => 256 + (-55) = 201.
If you really need to compare data, it is best to save them to a file as raw data, transfer that file to the other system, and compare the native raw data against the file to check whether there really is a difference.
EDIT (from comment):
Printing raw data as a string is a problem, but there are a few ways. The thing is that many bytes are not printable as strings: some are whitespace or reserved codes, and mostly the problem is that a value of 0 usually means the end of a string, yet it may appear in the middle of your byte sequence.
So you already have two ways of printing byte by byte, showing the corresponding Int8 or UInt8 values. As described in the comment, converting directly to a string may not work as easily as
let string = String(data: data, encoding: .utf8) // Will return nil for strange strings
One way of converting data to string may be to convert each byte into a corresponding character. Check this code:
let characterSequence = data.map { UnicodeScalar($0) } // Create an array of characters from bytes
let stringArray = characterSequence.map { String($0) } // Create an array of strings from array of characters
let myString = stringArray.reduce("", { $0 + $1 }) // Convert an array of strings to a single string
let myString2 = data.reduce("", { $0 + String(UnicodeScalar($1)) }) // Same thing in a single line
Then to test it I used:
let data = Data(bytes: Array(0...255)) // Generates with byte values of 0, 1, 2... up to 255
let myString2 = data.reduce("", { $0 + String(UnicodeScalar($1)) })
print(myString2)
The printing result is:
!"#$%&'()*+,-./0123456789:;<=>?#ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~ ¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏÐÑÒÓÔÕÖ×ØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõö÷øùúûüýþ
Then another popular way is using a hex string. It can be displayed as:
let hexString = data.reduce("", { $0 + String(format: "%02hhx",$1) })
print(hexString)
And with the same data as before the result is:
000102030405060708090a0b0c0d0e0f101112131415161718191a1b1c1d1e1f202122232425262728292a2b2c2d2e2f303132333435363738393a3b3c3d3e3f404142434445464748494a4b4c4d4e4f505152535455565758595a5b5c5d5e5f606162636465666768696a6b6c6d6e6f707172737475767778797a7b7c7d7e7f808182838485868788898a8b8c8d8e8f909192939495969798999a9b9c9d9e9fa0a1a2a3a4a5a6a7a8a9aaabacadaeafb0b1b2b3b4b5b6b7b8b9babbbcbdbebfc0c1c2c3c4c5c6c7c8c9cacbcccdcecfd0d1d2d3d4d5d6d7d8d9dadbdcdddedfe0e1e2e3e4e5e6e7e8e9eaebecedeeeff0f1f2f3f4f5f6f7f8f9fafbfcfdfeff
I hope this is enough, but in general you can do pretty much anything with an array of bytes to display it. For instance, you could create an image treating the bytes as RGB with 8 bits per component, if that made sense. It might sound silly, but if you are looking for patterns it can be quite a clever approach.
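As an aside for Go readers, the two ideas above (the same bit pattern viewed as signed or unsigned, and a hex string as an unambiguous view of raw bytes) can be sketched like this:
package main

import (
    "encoding/hex"
    "fmt"
)

func main() {
    // The same 8-bit pattern as signed and as unsigned: -55 and 256 + (-55) = 201.
    b := int8(-55)
    fmt.Println(uint8(b)) // 201

    // Hex-encode raw bytes for display.
    data := []byte{0x00, 0x01, 0xfe, 0xff}
    fmt.Println(hex.EncodeToString(data)) // 0001feff
}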

Converting a hexadecimal string to a decimal integer

I'm writing a Rust program that reads off of an I2C bus and saves the data. When I read the I2C bus, I get hex values like 0x11, 0x22, etc.
Right now, I can only handle this as a string and save it as is. Is there a way I can parse this into an integer? Is there any built-in function for it?
In most cases, you want to parse more than one hex byte at once. In those cases, use the hex crate.
parse this into an integer
You want to use from_str_radix. It's implemented on the integer types.
use std::i64;

fn main() {
    let z = i64::from_str_radix("1f", 16);
    println!("{:?}", z);
}
If your strings actually have the 0x prefix, then you will need to skip over them. The best way to do that is via trim_start_matches or strip_prefix:
use std::i64;

fn main() {
    let raw = "0x1f";
    let without_prefix = raw.trim_start_matches("0x");
    let z = i64::from_str_radix(without_prefix, 16);
    println!("{:?}", z);
}
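As an aside for Go readers, the Go standard library covers the same task with strconv.ParseInt; a small sketch (base 0 infers the base from a 0x prefix):
package main

import (
    "fmt"
    "strconv"
)

func main() {
    // Explicit base 16, prefix already stripped.
    n, err := strconv.ParseInt("1f", 16, 64)
    fmt.Println(n, err) // 31 <nil>

    // Base 0 lets ParseInt pick up the base from the "0x" prefix.
    m, err := strconv.ParseInt("0x1f", 0, 64)
    fmt.Println(m, err) // 31 <nil>
}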

golang convert iso8859-1 to utf8

I am trying to convert an ISO 8859-1 encoded string to UTF-8.
The following function works with my test data, which contains German umlauts, but I'm not quite sure what source encoding the rune(b) cast assumes. Is it assuming some kind of default encoding, e.g. ISO 8859-1, or is there any way to tell it what encoding to use?
func toUtf8(iso8859_1_buf []byte) string {
    var buf = bytes.NewBuffer(make([]byte, len(iso8859_1_buf)*4))
    for _, b := range iso8859_1_buf {
        r := rune(b)
        buf.WriteRune(r)
    }
    return string(buf.Bytes())
}
rune is an alias for int32, and when it comes to encoding, a rune is assumed to hold a Unicode code point. So the value b in rune(b) should be a Unicode value. For 0x00 - 0xFF this value is identical to Latin-1, so you don't have to worry about it.
Then you need to encode the runes into UTF-8. But this encoding is simply done by converting a []rune to string.
This is an example of your function without using the bytes package:
func toUtf8(iso8859_1_buf []byte) string {
    buf := make([]rune, len(iso8859_1_buf))
    for i, b := range iso8859_1_buf {
        buf[i] = rune(b)
    }
    return string(buf)
}
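A quick check with German umlauts, whose ISO 8859-1 codes are below 0x100 (so the identity mapping applies); this assumes the toUtf8 above:
// 0xE4, 0xF6, 0xFC are 'ä', 'ö', 'ü' in ISO 8859-1.
fmt.Println(toUtf8([]byte{0xE4, 0xF6, 0xFC})) // prints: äöü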
The effect of
r := rune(expression)
is:
Declare variable r with type rune (alias for int32).
Initialize variable r with the value of expression.
No (re)encoding is involved, and choosing a particular one would only be possible by explicitly writing/handling that re-encoding in code. Luckily, in this case no re-encoding is necessary: Unicode incorporated the ISO 8859-1 codes as its first 256 code points, in much the same way as it incorporated ASCII.
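If you do want to state the source encoding explicitly (and support encodings other than Latin-1), one option is the golang.org/x/text/encoding/charmap package; a sketch:
package main

import (
    "fmt"

    "golang.org/x/text/encoding/charmap"
)

func main() {
    iso := []byte{0xE4, 0xF6, 0xFC} // "äöü" in ISO 8859-1

    // Decode explicitly from ISO 8859-1 into UTF-8.
    utf8Bytes, err := charmap.ISO8859_1.NewDecoder().Bytes(iso)
    if err != nil {
        fmt.Println("decode error:", err)
        return
    }
    fmt.Println(string(utf8Bytes)) // äöü
}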

Go array slice from function return statement

I have the following functions:
func (c *Class) A() [4]byte
func B(x []byte)
I want to call
B(c.A()[:])
but I get this error:
cannot take the address of c.(*Class).A()
How do I properly get a slice of an array returned by a function in Go?
The value of c.A(), the return value from a method, is not addressable.
Address operators
For an operand x of type T, the address operation &x generates a
pointer of type *T to x. The operand must be addressable, that is,
either a variable, pointer indirection, or slice indexing operation;
or a field selector of an addressable struct operand; or an array
indexing operation of an addressable array. As an exception to the
addressability requirement, x may also be a composite literal.
Slices
If the sliced operand is a string or slice, the result of the slice
operation is a string or slice of the same type. If the sliced operand
is an array, it must be addressable and the result of the slice
operation is a slice with the same element type as the array.
Make the value of c.A(), an array, addressable for the slice operation [:]. For example, assign the value to a variable; a variable is addressable.
For example,
package main

import "fmt"

type Class struct{}

func (c *Class) A() [4]byte { return [4]byte{0, 1, 2, 3} }

func B(x []byte) { fmt.Println("x", x) }

func main() {
    var c Class
    // B(c.A()[:]) // cannot take the address of c.A()
    xa := c.A()
    B(xa[:])
}
Output:
x [0 1 2 3]
Have you tried sticking the array in a local variable first?
ary := c.A()
B(ary[:])
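One thing to keep in mind with this fix: the slice ary[:] shares the local variable's backing array, so writes through the slice are visible in ary (and only in that local copy, since c.A() returned the array by value). A tiny sketch, reusing c from the example above:
ary := c.A()
s := ary[:]
s[0] = 42
fmt.Println(ary[0]) // 42: the slice aliases the local array variable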
