Finding the Checksum Algorithm in this Serial Communication

Got a brainteaser for you! :)
I have been working on figuring out this serial communication between a microcontroller and a PC for several months and cannot seem to crack it. The link is 9600, N81, and I get no framing errors with my capture tool.
Hopefully someone out there can see these samples and it will click with them.
Sample 1
337795, IN,0xF0,0x5,0x62,0x0,0x0,0xA2,0xDE,0xF3,0x75,0xF3, Bytes = 9
337862,OUT,0xF0,0x5,0x63,0x0,0x1,0x1,0x2C,0x92,0xF3,0xF0,0xF3, Bytes = 10
338923, IN,0xF0,0x5,0x63,0x0,0x0,0x7E,0x84,0xF3,0x75,0xF3, Bytes = 9
338990,OUT,0xF0,0x5,0x64,0x0,0x1,0x1,0xD,0xC5,0xF3,0xF0,0xF3, Bytes = 10
340051, IN,0xF0,0x5,0x64,0x0,0x0,0x7B,0x8,0xF3,0x75,0xF3, Bytes = 9
340118,OUT,0xF0,0x5,0x65,0x0,0x1,0x1,0xB6,0xD9,0xF3,0xF0,0xF3, Bytes = 10
340499, IN,0xF0,0x0,0x65,0x5,0x3,0x1,0x1,0x0,0xAB,0xD3,0xF3,0x54,0xF3, Bytes = 12
340572,OUT,0xF0,0x5,0x66,0x0,0x1,0x1,0x7B,0xFC,0xF3,0xF0,0x5,0x66,0x3,0x4,0x4,0x1,0x8,0x3,0x2F,0x9E,0xF3,0xF3, Bytes = 21
340665, IN,0xF0,0x5,0x66,0x0,0x0,0xC3,0xBD,0xF3,0xAB,0xF3, Bytes = 9
340731,OUT,0xF0,0x5,0x67,0x0,0x1,0x1,0xC0,0xE0,0xF3,0xF0,0xF3, Bytes = 10
341794, IN,0xF0,0x5,0x67,0x0,0x0,0x1F,0xE7,0xF3,0xAB,0xF3, Bytes = 9
341860,OUT,0xF0,0x5,0x68,0x0,0x1,0x1,0x39,0x52,0xF3,0xF0,0xF3, Bytes = 10
342923, IN,0xF0,0x5,0x68,0x0,0x0,0xD8,0xAD,0xF3,0xAB,0xF3, Bytes = 9
342989,OUT,0xF0,0x5,0x69,0x0,0x1,0x1,0x82,0x4E,0xF3,0xF0,0xF3, Bytes = 10
344052, IN,0xF0,0x5,0x69,0x0,0x0,0x4,0xF7,0xF3,0xAB,0xF3, Bytes = 9
344118,OUT,0xF0,0x5,0x6A,0x0,0x1,0x1,0x4F,0x6B,0xF3,0xF0,0xF3, Bytes = 10
345180, IN,0xF0,0x5,0x6A,0x0,0x0,0x60,0x18,0xF3,0xAB,0xF3, Bytes = 9
345246,OUT,0xF0,0x5,0x6B,0x0,0x1,0x1,0xF4,0x77,0xF3,0xF0,0xF3, Bytes = 10
345627, IN,0xF0,0x0,0x6B,0x5,0x3,0x1,0x1,0x0,0x9,0xEA,0xF3,0x54,0xF3, Bytes = 12
345700,OUT,0xF0,0x5,0x6C,0x0,0x1,0x1,0xD5,0x20,0xF3,0xF0,0x5,0x6C,0x3,0x4,0x4,0x1,0x8,0x3,0x78,0x77,0xF3,0xF3, Bytes = 21
Sample 2
371435, IN,0xF0,0x5,0x8C,0x0,0x0,0x18,0xC7,0xF3,0x1A,0xF3, Bytes = 9
371502,OUT,0xF0,0x5,0x8D,0x0,0x1,0x1,0xE4,0x88,0xF3,0xF0,0xF3, Bytes = 10
372563, IN,0xF0,0x5,0x8D,0x0,0x0,0xC4,0x9D,0xF3,0x1A,0xF3, Bytes = 9
372630,OUT,0xF0,0x5,0x8E,0x0,0x1,0x1,0x29,0xAD,0xF3,0xF0,0xF3, Bytes = 10
373692, IN,0xF0,0x5,0x8E,0x0,0x0,0xA0,0x72,0xF3,0x1A,0xF3, Bytes = 9
373758,OUT,0xF0,0x5,0x8F,0x0,0x1,0x1,0x92,0xB1,0xF3,0xF0,0xF3, Bytes = 10
374820, IN,0xF0,0x5,0x8F,0x0,0x0,0x7C,0x28,0xF3,0x1A,0xF3, Bytes = 9
374887,OUT,0xF0,0x5,0x90,0x0,0x1,0x1,0xCA,0xC0,0xF3,0xF0,0xF3, Bytes = 10
375949, IN,0xF0,0x5,0x90,0x0,0x0,0x2E,0xE7,0xF3,0x1A,0xF3, Bytes = 9
376015,OUT,0xF0,0x5,0x91,0x0,0x1,0x1,0x71,0xDC,0xF3,0xF0,0xF3, Bytes = 10
376396, IN,0xF0,0x0,0x91,0x5,0x3,0x1,0x1,0x0,0xA4,0x3,0xF3,0xFD,0xF3, Bytes = 12
376469,OUT,0xF0,0x5,0x92,0x0,0x1,0x1,0xBC,0xF9,0xF3,0xF0,0x5,0x92,0x3,0x4,0x4,0x1,0x8,0x3,0x8,0x66,0xF3,0xF3, Bytes = 21
376562, IN,0xF0,0x5,0x92,0x0,0x0,0x96,0x52,0xF3,0xA4,0xF3, Bytes = 9
376628,OUT,0xF0,0x5,0x93,0x0,0x1,0x1,0x7,0xE5,0xF3,0xF0,0xF3, Bytes = 10
377692, IN,0xF0,0x5,0x93,0x0,0x0,0x4A,0x8,0xF3,0xA4,0xF3, Bytes = 9
377758,OUT,0xF0,0x5,0x94,0x0,0x1,0x1,0x26,0xB2,0xF3,0xF0,0xF3, Bytes = 10
378820, IN,0xF0,0x5,0x94,0x0,0x0,0x4F,0x84,0xF3,0xA4,0xF3, Bytes = 9
378887,OUT,0xF0,0x5,0x95,0x0,0x1,0x1,0x9D,0xAE,0xF3,0xF0,0xF3, Bytes = 10
379949, IN,0xF0,0x5,0x95,0x0,0x0,0x93,0xDE,0xF3,0xA4,0xF3, Bytes = 9
380015,OUT,0xF0,0x5,0x96,0x0,0x1,0x1,0x50,0x8B,0xF3,0xF0,0xF3, Bytes = 10
381077, IN,0xF0,0x5,0x96,0x0,0x0,0xF7,0x31,0xF3,0xA4,0xF3, Bytes = 9
381144,OUT,0xF0,0x5,0x97,0x0,0x1,0x1,0xEB,0x97,0xF3,0xF0,0xF3, Bytes = 10
381523, IN,0xF0,0x0,0x97,0x5,0x3,0x1,0x1,0x0,0x5E,0x1B,0x5B,0xF3,0xF3, Bytes = 12
I have more, if desired.
Some observations I've been able to make:
First, the numbers at the beginning are cumulative timestamps. The 33* values correspond to roughly 33 seconds, which suggests each tick is about 100 µs (337795 × 100 µs ≈ 33.8 s) rather than nanoseconds.
Beginning byte 0xF0
Ending byte 0xF3
Sequence byte: the third byte of every packet increments by 1.
It seems the destination device (noted as OUT) increments first and the host (noted as IN) follows.
For the most part OUT packets are 10 bytes and IN packets are 9 bytes.
There are times when this is not true and the packet can be 21 bytes.
In the 21-byte strings I do notice an end byte immediately followed by a begin byte partway through, but the third byte (sequence #) does not increment, so they look like two frames sent back to back under the same sequence number.
I am not sure how to interpret these longer packets.
This is a point to point communication, there are no other devices connected between these 2.
It is very chatty.
During my testing and probing, my test leads slipped between a couple of pins and killed the microcontroller. Thinking it would make a great project (and it is), I am attempting to recreate the functions of the original microcontroller. I have pretty much done that, with the exception of the communication: figuring out what the two sides are saying and how the checksum works. I assume the checksum sits just before the end byte.
Thank you!

The checksum is a reversed CRC-CCITT, calculated over the packet bytes with the begin (0xF0) and end (0xF3) bytes omitted.
Thanks to this website, where I pasted in the bytes and found it: https://www.scadacore.com/tools/programming-calculators/online-checksum-calculator/
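For reference, here is a minimal sketch of that reversed (LSB-first) CRC-CCITT, assuming the usual Kermit-style parameters (polynomial 0x8408, i.e. 0x1021 reflected; initial value 0x0000; CRC appended low byte first). It reproduces the 0xA2 0xDE pair from the first IN packet of Sample 1:

package main

import "fmt"

// crc16Kermit computes a reflected (LSB-first) CRC-CCITT, commonly
// called CRC-16/KERMIT: polynomial 0x8408, initial value 0x0000.
func crc16Kermit(data []byte) uint16 {
    var crc uint16
    for _, b := range data {
        crc ^= uint16(b)
        for i := 0; i < 8; i++ {
            if crc&1 != 0 {
                crc = (crc >> 1) ^ 0x8408
            } else {
                crc >>= 1
            }
        }
    }
    return crc
}

func main() {
    // First IN packet of Sample 1 with the framing stripped:
    // 0xF0, [0x05, 0x62, 0x00, 0x00], 0xA2, 0xDE, 0xF3, ...
    payload := []byte{0x05, 0x62, 0x00, 0x00}
    crc := crc16Kermit(payload)
    // The CRC is transmitted low byte first.
    fmt.Printf("%02X %02X\n", byte(crc), byte(crc>>8)) // prints: A2 DE
}

The same parameters also reproduce the 0x2C 0x92 pair in the first OUT packet (payload 0x05 0x63 0x00 0x01 0x01), so the two-byte field before the end byte is indeed this CRC.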

Related

Decode UDP message with LUA

I'm relatively new to lua and programming in general (self taught), so please be gentle!
Anyway, I wrote a lua script to read a UDP message from a game. The structure of the message is:
DATAxXXXXaaaaBBBBccccDDDDeeeeFFFFggggHHHH
DATAx = "DATA" is a 4-letter ID and x is a control character
XXXX = integer identifying the group of the data (groups are known)
aaaa...HHHH = 8 single-precision floating point numbers
Those last eight numbers are the ones I need to decode.
If I print the message as received, it's something like:
DATA*{V???A?A?...etc.
Using string.byte(), I'm getting a stream of bytes like this (I have "formatted" the bytes to reflect the structure above):
68 65 84 65/42/20 0 0 0/237 222 28 66/189 59 182 65/107 42 41 65/33 173 79 63/0 0 128 63/146 41 41 65/0 0 30 66/0 0 184 65
The first 5 bytes are of course the DATA*. The next 4 are the 20th group of data. The bytes after that are the ones I need to decode, and they correspond to these values:
237 222 28 66 = 39.218
189 59 182 65 = 22.779
107 42 41 65 = 10.573
33 173 79 63 = 0.8114
0 0 128 63 = 1.0000
146 41 41 65 = 10.573
0 0 30 66 = 39.500
0 0 184 65 = 23.000
I've found C# code that does the decoding with BitConverter.ToSingle(), but I haven't found anything similar for Lua.
Any idea?
What Lua version do you have?
This code works in Lua 5.3
local str = "DATA*\20\0\0\0\237\222\28\66\189\59\182\65..."
-- Read two float values starting from position 10 in the string
print(string.unpack("<ff", str, 10)) --> 39.217700958252 22.779169082642 18
-- 18 (third returned value) is the next position in the string
For Lua 5.1 you have to write a special function (or borrow one from François Perrad's git repo):
local function binary_to_float(str, pos)
  -- little-endian: b1 is the lowest byte, b4 holds the sign and high exponent bits
  local b1, b2, b3, b4 = str:byte(pos, pos + 3)
  local sign = b4 > 0x7F and -1 or 1
  local expo = (b4 % 0x80) * 2 + math.floor(b3 / 0x80)
  local mant = ((b3 % 0x80) * 0x100 + b2) * 0x100 + b1
  local n
  if expo == 0 then
    -- zeros and denormals: no implicit leading 1, exponent fixed at -126
    n = sign * (mant / 0x800000) * 2.0^(-126)
  elseif expo == 0xFF then
    -- infinities (mant == 0) and NaNs
    n = (mant == 0 and sign or 0) / 0
  else
    n = sign * (1 + mant / 0x800000) * 2.0^(expo - 0x7F)
  end
  return n
end
local str = "DATA*\20\0\0\0\237\222\28\66\189\59\182\65..."
print(binary_to_float(str, 10)) --> 39.217700958252
print(binary_to_float(str, 14)) --> 22.779169082642
It’s little-endian byte-order of IEEE-754 single-precision binary:
E.g., 0 0 128 63 is:
00111111 10000000 00000000 00000000
(63) (128) (0) (0)
Why that equals 1 requires that you understand the very basics of IEEE-754 representation, namely its use of an exponent and mantissa. See here to start.
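Concretely, reversing the received bytes 0 0 128 63 gives 0x3F800000: sign bit 0, exponent field 01111111₂ = 127, fraction 0, so the value is (1 + 0) × 2^(127 - 127) = 1.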
See Egor's answer above for how to use string.unpack() in Lua 5.3 and one possible implementation you could use in earlier versions.

UITextView – can't set underline or strikethrough attributes on text?

I'm getting runtime failures whenever I try to set underline or strikethrough attributes on the attributedText of a UITextView. Other properties, like font and background color have no problems. Here's a code snippet. All I have is a test project with a single UITextView. I'm on iOS 11.
class ViewController: UIViewController {
    @IBOutlet weak var myTextView: UITextView!
    var myMutableString = NSMutableAttributedString(string: "")

    override func viewDidLoad() {
        super.viewDidLoad()
        let string = "The cow jumped over the moon" as NSString
        myMutableString = NSMutableAttributedString(string: string as String)
        let underlineAttribute = [NSAttributedStringKey.underlineStyle: NSUnderlineStyle.styleSingle]
        myMutableString.addAttributes(underlineAttribute, range: string.range(of: "cow"))
        myTextView.attributedText = myMutableString
    }
}
This results in the text disappearing and some errors printed:
2017-11-05 19:43:34.708830-0800 UITextView_Test[11771:907014] -[_SwiftValue _getValue:forType:]: unrecognized selector sent to instance 0x60c00004eb50
2017-11-05 19:43:34.709351-0800 UITextView_Test[11771:907014] <NSATSTypesetter: 0x60800016be80>: Exception -[_SwiftValue _getValue:forType:]: unrecognized selector sent to instance 0x60c00004eb50 raised during typesetting layout manager <NSLayoutManager: 0x6080001fe400>
    1 containers, text backing has 28 characters
    Currently holding 28 glyphs.
    Glyph tree contents: 28 characters, 28 glyphs, 1 nodes, 64 node bytes, 64 storage bytes, 128 total bytes, 4.57 bytes per character, 4.57 bytes per glyph
    Layout tree contents: 28 characters, 28 glyphs, 0 laid glyphs, 0 laid line fragments, 1 nodes, 64 node bytes, 0 storage bytes, 64 total bytes, 2.29 bytes per character, 2.29 bytes per glyph, 0.00 laid glyphs per laid line fragment, 0.00 bytes per laid line fragment, glyph range {0 28}. Ignoring...
(the same exception and layout-manager dump are logged a second time)
Your underline style is invalid. The attribute value needs to be the raw integer (which bridges to NSNumber); passing the Swift enum itself stores it as a _SwiftValue box that the text system cannot read, which is what produces the unrecognized selector errors above. Change this line:
let underlineAttribute =
    [NSAttributedStringKey.underlineStyle: NSUnderlineStyle.styleSingle]
to this:
let underlineAttribute =
    [NSAttributedStringKey.underlineStyle: NSUnderlineStyle.styleSingle.rawValue]
Result: (screenshot of the rendered text with "cow" underlined)

Understand NAL Unit of h.264 stream

NAL Units start code: 00 00 00 01 X Y
X = IDR Picture NAL Units (25, 45, 65)
X = Non IDR Picture NAL Units (01, 21, 41, 61) ; 01 = b-frames, 41 = p-frames
What does 61 mean?
"01 = b-frames, 41 = p-frames" this is incorrect
Specification is available online for free: http://www.itu.int/rec/T-REC-H.264
Similar question was here just a few days ago: Non IDR Picture NAL Units - 0x21 and 0x61 meaning
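To check this yourself, here is a small sketch (in Go, purely for illustration) that splits the header byte into its fields:

package main

import "fmt"

func main() {
    // The byte after the 00 00 00 01 start code is the NAL unit header:
    // forbidden_zero_bit (1 bit) | nal_ref_idc (2 bits) | nal_unit_type (5 bits)
    for _, b := range []byte{0x01, 0x21, 0x41, 0x61, 0x65} {
        refIdc := (b >> 5) & 0x03
        nalType := b & 0x1F
        fmt.Printf("0x%02X: nal_ref_idc=%d, nal_unit_type=%d\n", b, refIdc, nalType)
    }
    // 0x01, 0x21, 0x41 and 0x61 all decode to nal_unit_type 1 (non-IDR slice),
    // differing only in nal_ref_idc; 0x65 decodes to nal_unit_type 5 (IDR slice).
}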

Golang append memory allocation VS. STL push_back memory allocation

I compared Go's append function with the STL's vector::push_back and found that they use different memory allocation strategies, which confused me. The code is as follows:
// CPP STL code
#include <cstdio>
#include <cstdlib>
#include <vector>
using std::vector;

void getAlloc() {
    vector<double> arr;
    int s = 9999999;
    int precap = arr.capacity();
    for (int i = 0; i < s; i++) {
        if (precap < i) {
            arr.push_back(rand() % 12580 * 1.0);
            precap = arr.capacity();
            printf("%d %p\n", precap, (void *)&arr[0]);
        } else {
            arr.push_back(rand() % 12580 * 1.0);
        }
    }
    printf("\n");
}
// Golang code
import (
    "log"
    "math/rand"
)

func getAlloc() {
    arr := []float64{}
    size := 9999999
    pre := cap(arr)
    for i := 0; i < size; i++ {
        if pre < i {
            arr = append(arr, rand.NormFloat64())
            pre = cap(arr)
            log.Printf("%d %p\n", pre, &arr)
        } else {
            arr = append(arr, rand.NormFloat64())
        }
    }
}
But the address printed on the Go side never changes, no matter how large the slice grows, and this really confused me.
By the way, the memory allocation strategy differs between the two implementations (STL vs. Go), I mean the way the capacity expands. Is there any advantage or disadvantage to either? Here is the simplified output of the code above [capacity and the printed address]:
Golang CPP STL
2 0xc0800386c0 2 004B19C0
4 0xc0800386c0 4 004AE9B8
8 0xc0800386c0 6 004B29E0
16 0xc0800386c0 9 004B2A18
32 0xc0800386c0 13 004B2A68
64 0xc0800386c0 19 004B2AD8
128 0xc0800386c0 28 004B29E0
256 0xc0800386c0 42 004B2AC8
512 0xc0800386c0 63 004B2C20
1024 0xc0800386c0 94 004B2E20
1280 0xc0800386c0 141 004B3118
1600 0xc0800386c0 211 004B29E0
2000 0xc0800386c0 316 004B3080
2500 0xc0800386c0 474 004B3A68
3125 0xc0800386c0 711 004B5FD0
3906 0xc0800386c0 1066 004B7610
4882 0xc0800386c0 1599 004B9768
6102 0xc0800386c0 2398 004BC968
7627 0xc0800386c0 3597 004C1460
9533 0xc0800386c0 5395 004B5FD0
11916 0xc0800386c0 8092 004C0870
14895 0xc0800386c0 12138 004D0558
18618 0xc0800386c0 18207 004E80B0
23272 0xc0800386c0 27310 0050B9B0
29090 0xc0800386c0 40965 004B5FD0
36362 0xc0800386c0 61447 00590048
45452 0xc0800386c0 92170 003B0020
56815 0xc0800386c0 138255 00690020
71018 0xc0800386c0 207382 007A0020
....
UPDATE:
See the comments for Go's memory allocation strategy: in the output above the capacity doubles up to 1024 elements and then grows by roughly 25% per reallocation.
For the STL, the strategy depends on the implementation; the one above grows by roughly 50% per reallocation. See this post for further information.
Your Go and C++ code fragments are not equivalent. In the C++ function, you are printing the address of the first element in the vector, while in the Go example you are printing the address of the slice itself.
Like a C++ std::vector, a Go slice is a small data type that holds a pointer to an underlying array that holds the data. That data structure has the same address throughout the function. If you want the address of the first element in the slice, you can use the same syntax as in C++: &arr[0].
You're getting the pointer to the slice header, not the actual backing array. You can think of the slice header as a struct like
type SliceHeader struct {
    len, cap     int
    backingArray unsafe.Pointer
}
When you append and the backing array is reallocated, the pointer backingArray will likely be changed (not necessarily, but probably). However, the location of the struct holding the length, cap, and pointer to the backing array doesn't change -- it's still on the stack right where you declared it. Try printing &arr[0] instead of &arr and you should see behavior closer to what you expect.
This is pretty much the same behavior as std::vector, incidentally. Think of a slice as closer to a vector than a magic dynamic array.
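To see the difference directly, here is a minimal sketch (names are mine) that prints both the slice header's address and the backing array's address each time the slice reaches its capacity:

package main

import "fmt"

func main() {
    var arr []float64
    for i := 0; i < 2000; i++ {
        arr = append(arr, float64(i))
        if len(arr) == cap(arr) { // the next append will reallocate
            // &arr is the slice header (stable for the whole function);
            // &arr[0] is the backing array, which moves when it is reallocated.
            fmt.Printf("len=%4d cap=%4d header=%p first=%p\n",
                len(arr), cap(arr), &arr, &arr[0])
        }
    }
}

The header address stays constant while the first-element address changes at each growth step, which matches the answers above.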
