How to decode GSM-TCAP messages using asn1c generated code - parsing

I am using the C code generated by asn1c from the TCAP protocol specification (i.e., the corresponding ASN.1 files).
I can successfully encode TCAP packets with the generated code.
However, trying to decode the related byte streams fails.
Sample code follows.
// A real byte stream of a TCAP message:
unsigned char packet_bytes[] = {
0x62, 0x43, 0x48, 0x04, 0x00, 0x18, 0x02, 0x78,
0x6b, 0x1a, 0x28, 0x18, 0x06, 0x07, 0x00, 0x11,
0x86, 0x05, 0x01, 0x01, 0x01, 0xa0, 0x0d, 0x60,
0x0b, 0xa1, 0x09, 0x06, 0x07, 0x04, 0x00, 0x00,
0x01, 0x00, 0x14, 0x03, 0x6c, 0x1f, 0xa1, 0x1d,
0x02, 0x01, 0x00, 0x02, 0x01, 0x2d, 0x30, 0x15,
0x80, 0x07, 0x91, 0x64, 0x21, 0x92, 0x05, 0x31,
0x74, 0x81, 0x01, 0x01, 0x82, 0x07, 0x91, 0x64,
0x21, 0x00, 0x00, 0x90, 0x02
};
// Initializing ...
TCAP_TCMessage_t _pdu, *pdu = &_pdu;
memset(pdu, 0, sizeof(*pdu));
// Decoding:
asn_dec_rval_t dec_ret = ber_decode(NULL, &asn_DEF_TCAP_TCMessage, (void **) &pdu, packet_bytes, sizeof(packet_bytes));
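For completeness, this is how the decode result can be inspected (a minimal sketch; asn_fprint is the generic printer from the asn1c runtime):
// Checking the decode result:
if (dec_ret.code != RC_OK) {
    fprintf(stderr, "Decoding failed after %zu bytes\n", dec_ret.consumed);
} else {
    asn_fprint(stdout, &asn_DEF_TCAP_TCMessage, pdu);
}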
While the message type ("Begin", in this case) is correctly detected, the other parameters are not parsed.
Using other encoding rules, i.e., aper_decode() and uper_decode(), also fails.
I would be thankful if anyone could describe how to use the auto-generated C code for decoding (parsing) a byte string of TCAP messages.

@Vasil, thank you very much for your answer.
Which asn1c are you using (git commit id) and where do you get it
from, as there are quite a lot of forks out there?
I use mouse07410's branch.
How do you know that Begin is correctly detected?
From the present field of the pdu variable, which ber_decode fills in (you can see the pdu type in the sample code).
From the "Wireshark" output for this byte stream, I know that the correct type of the message is Begin.
You could try compiling with -DASN_EMIT_DEBUG=1 in CFLAGS (or
-DEMIT_ASN_DEBUG=1 depending on the asn1c version you are using) to get some more debug messages.
Thanks for providing the hint; it was helpful.
The problem was related to the ASN.1 files I was using.
I used the osmocom ASN.1 files and compiled them with
ASN=../asn
asn1c $ASN/DialoguePDUs.asn $ASN/tcap.asn $ASN/UnidialoguePDUs.asn
in which DialoguePortion is defined as follows (note that the first, standard definition is commented out):
--DialoguePortion ::= [APPLICATION 11] EXPLICIT EXTERNAL
-- WS adaptation
DialoguePortion ::= [APPLICATION 11] IMPLICIT DialogueOC
DialogueOC ::= OCTET STRING
To be able to decode TCAP messages,
one needs the former (commented-out) definition, as in the standard, i.e., DialoguePortion should be defined as
DialoguePortion ::= [APPLICATION 11] EXPLICIT EXTERNAL
After switching back to this standard definition in the ASN.1 file
and recompiling, the problem was solved.
P.S.: This question is also related to my problem.

I am using the C code generated by asn1c from the TCAP protocol specification
Which asn1c are you using (git commit id) and where do you get it from, as there are quite a lot of forks out there?
While the message type ("Begin", in this case) is correctly detected, the other parameters are not parsed.
How do you know that Begin is correctly detected?
Using other encoding rules, i.e., aper_decode() and uper_decode(), also fails.
There is no point in trying other encodings as they are not binary compatible.
I would be thankful if anyone could describe how to use the auto-generated C code for decoding (parsing) a byte string of TCAP messages.
You are using it correctly and probably there is a bug somewhere in the BER decoder.
You could try compiling with -DASN_EMIT_DEBUG=1 in CFLAGS (or -DEMIT_ASN_DEBUG=1 depending on the asn1c version you are using) to get some more debug messages.

Related

CRC: exploring encoding principles (reverse)

We are looking for a way to determine the CRC-16 result for a specific hex input.
Hello. I am currently working on estimating the CRC-16 result of specific hex data.
I know the input hex values and the CRC-16 algorithm, but no matter how I combine the hex values, the result does not match, so I am asking here.
The input hex values are 0x170, 0xA, 0x00, 0x31.
The CRC-16 algorithm used is CRC-16-CCITT XMODEM (Poly = 0x1021, Init = 0x0000).
The expected result is 0x6121 (or, byte-swapped, 0x2161).
I suspect that 0x0170 and 0xA are merged and split in some way before being fed into the CRC-16 (for example, combined into 0x017A and then split into 0x01 and 0x7A).
I have tried feeding the bytes in order (0x01, 0x70, 0x0A, 0x00, 0x31), in reverse order (0x31, 0x00, 0x0A, 0x70, 0x01), and in various other arrangements, but the expected result never comes out.
Can you tell me how to find an input sequence or combination of the hex data that produces the expected checksum?
Waiting for your reply.
Thank you.
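For reference, this is the standard bit-by-bit CRC-16/XMODEM routine (Poly = 0x1021, Init = 0x0000) the combinations are being tested against; a minimal C sketch:
#include <stdint.h>
#include <stddef.h>

uint16_t crc16_xmodem(const uint8_t *data, size_t len) {
    uint16_t crc = 0x0000;                  /* Init = 0x0000 */
    for (size_t i = 0; i < len; i++) {
        crc ^= (uint16_t)data[i] << 8;      /* feed the next byte, MSB first */
        for (int b = 0; b < 8; b++)
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
    }
    return crc;                             /* no reflection, no final XOR */
}
With this routine, candidate byte orderings such as {0x01, 0x70, 0x0A, 0x00, 0x31} can be checked programmatically against the expected 0x6121 (or 0x2161).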

Checksum/CRC reverse engineering of Microsoft NDIS packet

I am trying to decode a 42-byte packet which seems to include a 2-byte CRC/checksum field.
This is a Microsoft NDIS packet (type IPX) which is sent in HLK (WHQL) tests.
I have decoded most parts of the NDIS header, but I can't seem to figure out the CRC/checksum algorithm.
Sample of a 45-byte packet (just to explain the decoded fields):
unsigned char packet_bytes[] = {
0x02, 0xe4, 0x55, 0xee, 0x12, 0x56, 0x02, 0x93,
0x19, 0x40, 0x89, 0x00, 0x00, 0x1f, 0xaa, 0xaa,
0x03, 0x00, 0x00, 0x00, 0x81, 0x37, 0x4e, 0x44,
0x49, 0x53, 0x01, 0x49, 0x03, 0x00, 0x98, 0xd4,
0x58, 0x55, 0x25, 0xf5, 0x39, 0x00, 0x14, 0x00,
0x00, 0x00, 0x49, 0x4a, 0x4b
};
Raw: 02e455ee1256029319408900001faaaa0300000081374e4449530149030098d4585525f5390014000000494a4b
Decoded fields:
802.2 ethernet header: (Wireshark decoding)
02e455ee1256 : Destination
029319408900 : Source
001f : Length
Logical_link Control: (Wireshark decoding)
aa : DSAP
aa : SSAP
03 : Control
000000 : Organization
8137 : Type (Netware IPX/SPX)
NDIS header: (my estimate of the NDIS decoded fields)
4e444953 : NDIS ascii String ("NDIS")
01 : Unknown
49 : payload counter start (first byte of payload, with increasing value afterwards)
0300 : Payload length ( = 0003)
98d4 : test identification number (equal on all packets of the same test)
5855 : Assumed to be checksum
25f53900 : Packet counter ( = 0039f525, Increases gradually per packet)
14000000 : Payload offset ( = 00000014), offset from start of NDIS header to start of payload
494a4b : Payload (3 bytes of increasing counter 49,4a,4b)
To try to understand the checksum algorithm with as few packet bytes as possible,
I captured packets of the minimal size (42 bytes).
Those packets include the headers above but no payload at all.
I tried to reverse engineer them using the reveng CRC decoder, which failed to find any known CRC algorithm.
Sample 42-byte packets:
02e455ee1256029319408900001caaaa0300000081374e444953016b000098d495262502000014000000
02e455ee1256029319408900001caaaa0300000081374e44495301a2000098d481ef3802000014000000
02e455ee1256029319408900001caaaa0300000081374e4449530152000098d47f3f3b02000014000000
02e455ee1256029319408900001caaaa0300000081374e44495301d0000098d476c14302000014000000
02e455ee1256029319408900001caaaa0300000081374e44495301f7000098d4539a6602000014000000
02e455ee1256029319408900001caaaa0300000081374e44495301b6000098d444db7502000014000000
02e455ee1256029319408900001caaaa0300000081374e44495301a6000098d431eb8802000014000000
02e455ee1256029319408900001caaaa0300000081374e444953016a000098d40627b402000014000000
Reverse engineering the CRC:
reveng.exe -w 16 -s 02e455ee1256029319408900001caaaa0300000081374e444953016b000098d495262502000014000000 02e455ee1256029319408900001caaaa0300000081374e44495301a2000098d481ef3802000014000000 02e455ee1256029319408900001caaaa0300000081374e4449530152000098d47f3f3b02000014000000 02e455ee1256029319408900001caaaa0300000081374e44495301d0000098d476c14302000014000000 02e455ee1256029319408900001caaaa0300000081374e44495301f7000098d4539a6602000014000000 02e455ee1256029319408900001caaaa0300000081374e44495301b6000098d444db7502000014000000 02e455ee1256029319408900001caaaa0300000081374e44495301a6000098d431eb8802000014000000 02e455ee1256029319408900001caaaa0300000081374e444953016a000098d40627b402000014000000
reveng.exe: no models found
I also tried reverse engineering only the NDIS header part:
4e444953016b000098d495262502000014000000
4e44495301a2000098d481ef3802000014000000
4e4449530152000098d47f3f3b02000014000000
4e44495301d0000098d476c14302000014000000
4e44495301f7000098d4539a6602000014000000
4e44495301b6000098d444db7502000014000000
4e44495301a6000098d431eb8802000014000000
4e444953016a000098d40627b402000014000000
reveng.exe -w 16 -s 4e444953016b000098d495262502000014000000 4e44495301a2000098d481ef3802000014000000 4e4449530152000098d47f3f3b02000014000000 4e44495301d0000098d476c14302000014000000 4e44495301f7000098d4539a6602000014000000 4e44495301b6000098d444db7502000014000000 4e44495301a6000098d431eb8802000014000000 4e444953016a000098d40627b402000014000000
reveng.exe: no models found
Any help would be appreciated.
This seems to be the Internet Checksum, described in RFC 1071, calculated over the NDIS header part of the packet.
In short, you need to add up all of the header contents (except the 16-bit checksum field itself) as 16-bit values, then add the carries (if any) to the least significant 16 bits of the result (thus forming the one's complement sum), and finally calculate the one's complement of this sum by inverting all bits.
For the example packet you listed, the manual calculation steps would be the following.
Given the whole packet:
02e455ee1256029319408900001faaaa0300000081374e4449530149030098d4585525f5390014000000494a4b
Extract the NDIS header part only, without the payload:
4e4449530149030098d4585525f5390014000000
Split into 16-bit values:
4e44
4953
0149
0300
98d4
5855
25f5
3900
1400
0000
Substitute the checksum field with zeroes:
4e44
4953
0149
0300
98d4
0000
25f5
3900
1400
0000
Add all those 16-bit values together:
1A7A9
Here, the 16 least significant bits are A7A9 and the arithmetic carry is 1. So, add these together (as 16-bit words), to form the so-called one's complement sum:
0001
+ A7A9
= A7AA
Now, invert all bits (apply the bitwise NOT operation), to get the one's complement:
~ A7AA
= 5855
Place this checksum back into the field we temporarily zeroed out:
4e44
4953
0149
0300
98d4
5855
25f5
3900
1400
0000
If you only want to check the checksum, do the following.
First, take the original NDIS header (as 16-bit values):
4e44
4953
0149
0300
98d4
5855
25f5
3900
1400
0000
Then sum all of this up:
1FFFE
Again, add the carry to the 16-bit LSB part:
0001
+ FFFE
= FFFF
If all the bits of the result are 1 (i.e., if the result is FFFF), the check is successful.
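The same computation in C, as a minimal sketch (assuming the 20-byte NDIS header is passed as a big-endian byte buffer; the function name is illustrative):
#include <stdint.h>
#include <stddef.h>

/* RFC 1071 Internet checksum over big-endian 16-bit words. */
uint16_t internet_checksum(const uint8_t *data, size_t len) {
    uint32_t sum = 0;
    for (size_t i = 0; i + 1 < len; i += 2)
        sum += ((uint32_t)data[i] << 8) | data[i + 1];
    if (len & 1)                      /* pad a trailing odd byte with zero */
        sum += (uint32_t)data[len - 1] << 8;
    while (sum >> 16)                 /* fold carries into the low 16 bits */
        sum = (sum & 0xFFFF) + (sum >> 16);
    return (uint16_t)~sum;            /* invert the one's complement sum */
}
To generate, zero the checksum field first and store the returned value; to verify, run it over the header with the checksum in place and expect 0 (i.e., an all-ones one's complement sum, as in the manual check above).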

Export an elliptic curve key from iOS to work with OpenSSL

I have a private/public key pair generated and stored inside Secure Enclave.
It is a 256-bit elliptic curve key (the only key type that can be stored in Secure Enclave).
I use SecKeyCreateWithData and SecKeyCopyExternalRepresentation to import/export the public key between iOS devices, and it works.
However, the exported key doesn't seem to work with OpenSSL.
It always shows 'unable to load Key' with this command.
openssl ec -pubin -in public_key_file -text
What's the right way to export the key so I can use it with OpenSSL?
To work with OpenSSL, you need subject public key info (SPKI), either DER or PEM format.
SPKI contains essential information, for example the key type, the curve parameters, and the key value.
SecKeyCopyExternalRepresentation only returns the raw key binary, which is only the key-value part.
You have to create the SPKI from that raw key value. The normal way to do this is to read https://www.rfc-editor.org/rfc/rfc5480 and encode the ASN.1 structure into binary DER format.
But here is a shortcut.
Secure Enclave supports only one key type: a 256-bit EC key on secp256r1 (equivalent to prime256v1 in OpenSSL).
The SPKI in DER format is binary-encoded data; for example:
3059301306072a8648ce3d020106082a8648ce3d03010703420004fad2e70b0f70f0bf80d7f7cbe8dd4237ca9e59357647e7a7cb90d71a71f6b57869069bcdd24272932c6bdd51895fe2180ea0748c737adecc1cefa3a02022164d
It always consists of two parts:
fixed schema header 3059301306072a8648ce3d020106082a8648ce3d030107034200
raw key value 04.......
You can create the SPKI by combining these two parts:
spki = fixed_schema_header + SecKeyCopyExternalRepresentation(...)
func createSubjectPublicKeyInfo(rawPublicKeyData: Data) -> Data {
    let secp256r1Header = Data([
        0x30, 0x59, 0x30, 0x13, 0x06, 0x07, 0x2a, 0x86, 0x48, 0xce, 0x3d, 0x02, 0x01, 0x06, 0x08, 0x2a,
        0x86, 0x48, 0xce, 0x3d, 0x03, 0x01, 0x07, 0x03, 0x42, 0x00
    ])
    return secp256r1Header + rawPublicKeyData
}
// Usage
let rawPublicKeyData = SecKeyCopyExternalRepresentation(...)!
let publicKeyDER = createSubjectPublicKeyInfo(rawPublicKeyData: rawPublicKeyData)
write(publicKeyDER, to: "public_key.der")
// Test with OpenSSL
// openssl ec -pubin -in public_key.der -text -inform der
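If you also want to sanity-check the resulting DER from C, OpenSSL's d2i_PUBKEY parses an SPKI directly (a minimal sketch; error handling omitted, and the file name is taken from the example above):
#include <openssl/evp.h>
#include <openssl/x509.h>
#include <stdio.h>

int main(void) {
    FILE *f = fopen("public_key.der", "rb");
    unsigned char buf[256];
    size_t n = fread(buf, 1, sizeof buf, f);
    fclose(f);
    const unsigned char *p = buf;                  /* d2i_* advances this pointer */
    EVP_PKEY *pkey = d2i_PUBKEY(NULL, &p, (long)n);
    puts(pkey ? "loaded EC public key" : "failed to parse SPKI");
    EVP_PKEY_free(pkey);
    return 0;
}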

Is there a golang library similar to Python's construct? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 4 years ago.
I really love the Python construct module's declarative syntax for defining bi-directional (binary|text) parsers/builders.
I've recently started focusing on golang and was wondering if anyone has seen (or might be the esteemed author of) a similar library for golang.
If you've never used the construct module, you basically build a declarative tree of Python objects that you can feed Python object trees and get binary blobs out, or parse binary blobs into Python object trees.
A simple example from the construct webpage:
>>> PascalString2 = ExprAdapter(PascalString,
... encoder = lambda obj, ctx: Container(length = len(obj), data = obj),
... decoder = lambda obj, ctx: obj.data
... )
>>> PascalString2.parse("\x05hello")
'hello'
>>> PascalString2.build("i'm a long string")
"\x11i'm a long string"
A slightly more complex example from the source that shows a hard drive MBR parser:
mbr = Struct("mbr",
HexDumpAdapter(Bytes("bootloader_code", 446)),
Array(4,
Struct("partitions",
Enum(Byte("state"),
INACTIVE = 0x00,
ACTIVE = 0x80,
),
BitStruct("beginning",
Octet("head"),
Bits("sect", 6),
Bits("cyl", 10),
),
Enum(UBInt8("type"),
Nothing = 0x00,
FAT12 = 0x01,
XENIX_ROOT = 0x02,
XENIX_USR = 0x03,
FAT16_old = 0x04,
Extended_DOS = 0x05,
FAT16 = 0x06,
FAT32 = 0x0b,
FAT32_LBA = 0x0c,
NTFS = 0x07,
LINUX_SWAP = 0x82,
LINUX_NATIVE = 0x83,
_default_ = Pass,
),
BitStruct("ending",
Octet("head"),
Bits("sect", 6),
Bits("cyl", 10),
),
UBInt32("sector_offset"), # offset from MBR in sectors
UBInt32("size"), # in sectors
)
),
Const("signature", b"\x55\xAA"),
)
There's a TCP/IP stack example that really shows how powerful the construct model is, with the ability to have bite-sized blocks of definitions that you combine into a single parser/generator.
I know there are PEG / EBNF parser generators, but I was hoping for something a little prettier to look at.
This isn't the same as Python's Construct package, but there is a version of Yacc for Go:
https://golang.org/cmd/yacc/
Yacc's grammar is similar to EBNF, so it may not meet your criteria, but it's widely used and understood, so I think it's worth mentioning.

Reading temperature and fan speeds on chipset without WMI support in Windows, from Delphi

I have searched and searched but found nothing about how to read sensor information from the Nuvoton NCT6776F chip in Delphi (I am using XE2). I am guessing I need some assembly somewhere, but I can't find anything on how to even begin. Here are the register details of the chip.
Bus Type = ISAIO
One NCT6776F
Nuvoton NCT6776F, IndexReg=A35, DataReg=A36
=============================================================
Fan1 Fan Speed, Bank 6, Offset 0x30, 0x31 RPM = 1350000/(Data=HighByte[12:5], LowByte
[4:0])
Fan2 Fan Speed, Bank 6, Offset 0x32, 0x33 RPM = 1350000/(Data=HighByte[12:5], LowByte
[4:0])
Fan3 Fan Speed, Bank 6, Offset 0x34, 0x35 RPM = 1350000/(Data=HighByte[12:5], LowByte
[4:0])
CPU Voltage, Bank 0, Offset 0x20 Voltage = Data* 0.008
VCCSA Voltage, Bank 0, Offset 0x21 Voltage = Data* 0.008
+3.3V Voltage, Bank 0, Offset 0x22 Voltage = Data* 0.016
Gfx Voltage, Bank 0, Offset 0x24 Voltage = Data* 0.008
+5V Voltage, Bank 0, Offset 0x25 Voltage = Data* 0.008/ (10./40.)
+12V Voltage, Bank 0, Offset 0x26 Voltage = Data* 0.008/ (10./66.2)
3.3VSB Voltage, Bank 5, Offset 0x50 Voltage = Data* 0.016
VBAT Voltage, Bank 5, Offset 0x51 Voltage = Data* 0.016
CPU Temperature, Bank 7, Offset 0x17, 0x18 PECI Count = (Data=HighByte,LowByte<15:6>,
highest bit as sign bit)
High: PECI Count > -15; Medium: -40 < PECI Count <= -15; Low: PECI Count <= -40
System Temperature, Bank 0, Offset 0x27 Temperature = Data
Peripheral Temperature, Bank 1, Offset 0x50 Temperature = Data
Chassis Intrusion, Bank 0, Offset 0x42, BitMask 0x10 1 = Bad, 0 = Good
(Clear Bit: Bank 0, Offset 0x46, BitMask 0x80)
Power Supply Failure, NCT6776F, Logical Device 0x0B, CRF7h, BitMask 0x01 0 = Good, 1
= Bad
If anyone has any idea how I can read these addresses and get the required information, I would be very grateful. If anyone could post some example code, that would be even better. What I am in fact trying to do is add a temperature-sensor gauge to my server software for monitoring purposes. I need to integrate the data directly and not use a third-party application, due to the nature of the application I am building.
Thanks.
Alex.
Based on the information on the lm-sensors wiki, the device is accessed using the LPC bus. There is a dedicated GPLed Linux driver that can be downloaded to access the device under Linux. I would not look at this source if I were planning an implementation myself, because of the possibility of tainting any proprietary code written to access the device.
In order to perform peripheral I/O using Delphi (as in the inb/outb instructions or their equivalent), you should look at the question "how to write to I/O ports in Windows XP".
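To make the access pattern concrete, here is a C sketch of the index/data register protocol implied by the table above (port_write/port_read are hypothetical stand-ins for whatever ring-0 port-I/O primitives your helper driver exposes, and the bank-select register 0x4E is an assumption based on how Nuvoton Super I/O chips are commonly banked):
#include <stdint.h>

#define INDEX_REG   0xA35  /* from the register details above */
#define DATA_REG    0xA36
#define BANK_SELECT 0x4E   /* assumed Nuvoton bank-select register */

/* Hypothetical port-I/O primitives supplied by a kernel-mode helper. */
extern void    port_write(uint16_t port, uint8_t value);
extern uint8_t port_read(uint16_t port);

static uint8_t read_nct_reg(uint8_t bank, uint8_t offset) {
    port_write(INDEX_REG, BANK_SELECT);  /* select the bank... */
    port_write(DATA_REG, bank);
    port_write(INDEX_REG, offset);       /* ...then read the register in it */
    return port_read(DATA_REG);
}

/* Fan1: Bank 6, offsets 0x30 (bits 12:5) and 0x31 (bits 4:0);
   RPM = 1350000 / count, per the table above. */
static unsigned fan1_rpm(void) {
    unsigned count = ((unsigned)read_nct_reg(6, 0x30) << 5)
                   | (read_nct_reg(6, 0x31) & 0x1F);
    return count ? 1350000u / count : 0;
}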
