UITextView – can't set underline or strikethrough attributes on text? - ios

I'm getting runtime failures whenever I try to set underline or strikethrough attributes on the attributedText of a UITextView. Other properties, like font and background color, have no problems. Here's a code snippet. All I have is a test project with a single UITextView. I'm on iOS 11.
class ViewController: UIViewController {
    @IBOutlet weak var myTextView: UITextView!
    var myMutableString = NSMutableAttributedString(string: "")

    override func viewDidLoad() {
        super.viewDidLoad()
        let string = "The cow jumped over the moon" as NSString
        myMutableString = NSMutableAttributedString(string: string as String)
        let underlineAttribute = [NSAttributedStringKey.underlineStyle: NSUnderlineStyle.styleSingle]
        myMutableString.addAttributes(underlineAttribute, range: string.range(of: "cow"))
        myTextView.attributedText = myMutableString
    }
}
This results in the text disappearing and some errors printed:
2017-11-05 19:43:34.708830-0800 UITextView_Test[11771:907014]
-[_SwiftValue _getValue:forType:]: unrecognized selector sent to instance 0x60c00004eb50 2017-11-05 19:43:34.709351-0800 UITextView_Test[11771:907014] <NSATSTypesetter: 0x60800016be80>: Exception -[_SwiftValue _getValue:forType:]: unrecognized selector sent to instance 0x60c00004eb50 raised during typesetting layout manager <NSLayoutManager: 0x6080001fe400>
1 containers, text backing has 28 characters
Currently holding 28 glyphs.
Glyph tree contents: 28 characters, 28 glyphs, 1 nodes, 64 node bytes, 64 storage bytes, 128 total bytes, 4.57 bytes per character,
4.57 bytes per glyph
Layout tree contents: 28 characters, 28 glyphs, 0 laid glyphs, 0 laid line fragments, 1 nodes, 64 node bytes, 0 storage bytes, 64 total bytes, 2.29 bytes per character, 2.29 bytes per glyph, 0.00 laid glyphs per laid line fragment, 0.00 bytes per laid line fragment , glyph range {0 28}. Ignoring... 2017-11-05 19:43:34.709827-0800 UITextView_Test[11771:907014] -[_SwiftValue _getValue:forType:]: unrecognized selector sent to instance 0x60c00004eb50 2017-11-05 19:43:34.710014-0800 UITextView_Test[11771:907014] <NSATSTypesetter: 0x60800016be80>: Exception -[_SwiftValue _getValue:forType:]: unrecognized selector sent to instance 0x60c00004eb50 raised during typesetting layout manager <NSLayoutManager: 0x6080001fe400>
1 containers, text backing has 28 characters
Currently holding 28 glyphs.
Glyph tree contents: 28 characters, 28 glyphs, 1 nodes, 64 node bytes, 64 storage bytes, 128 total bytes, 4.57 bytes per character,
4.57 bytes per glyph
Layout tree contents: 28 characters, 28 glyphs, 0 laid glyphs, 0 laid line fragments, 1 nodes, 64 node bytes, 0 storage bytes, 64 total bytes, 2.29 bytes per character, 2.29 bytes per glyph, 0.00 laid glyphs per laid line fragment, 0.00 bytes per laid line fragment , glyph range {0 28}. Ignoring...

Your underline style is invalid. The attribute value must be an object that bridges to NSNumber, and the NSUnderlineStyle enum itself does not; it gets boxed as a _SwiftValue, which is what produces the unrecognized-selector errors. Change this line:
let underlineAttribute = [NSAttributedStringKey.underlineStyle: NSUnderlineStyle.styleSingle]
to this:
let underlineAttribute = [NSAttributedStringKey.underlineStyle: NSUnderlineStyle.styleSingle.rawValue]

Related

Finding the Checksum Algorithm in this Serial Communication

Got a brainteaser for you! :)
I have been trying to figure out this serial communication between a microcontroller and a PC for several months and cannot seem to crack it. It's 9600,N,8,1 and I get no frame errors with my capture tool.
Hopefully someone out there can see these samples and it will click with them.
Sample 1
337795, IN,0xF0,0x5,0x62,0x0,0x0,0xA2,0xDE,0xF3,0x75,0xF3, Bytes = 9
337862,OUT,0xF0,0x5,0x63,0x0,0x1,0x1,0x2C,0x92,0xF3,0xF0,0xF3, Bytes = 10
338923, IN,0xF0,0x5,0x63,0x0,0x0,0x7E,0x84,0xF3,0x75,0xF3, Bytes = 9
338990,OUT,0xF0,0x5,0x64,0x0,0x1,0x1,0xD,0xC5,0xF3,0xF0,0xF3, Bytes = 10
340051, IN,0xF0,0x5,0x64,0x0,0x0,0x7B,0x8,0xF3,0x75,0xF3, Bytes = 9
340118,OUT,0xF0,0x5,0x65,0x0,0x1,0x1,0xB6,0xD9,0xF3,0xF0,0xF3, Bytes = 10
340499, IN,0xF0,0x0,0x65,0x5,0x3,0x1,0x1,0x0,0xAB,0xD3,0xF3,0x54,0xF3, Bytes = 12
340572,OUT,0xF0,0x5,0x66,0x0,0x1,0x1,0x7B,0xFC,0xF3,0xF0,0x5,0x66,0x3,0x4,0x4,0x1,0x8,0x3,0x2F,0x9E,0xF3,0xF3, Bytes = 21
340665, IN,0xF0,0x5,0x66,0x0,0x0,0xC3,0xBD,0xF3,0xAB,0xF3, Bytes = 9
340731,OUT,0xF0,0x5,0x67,0x0,0x1,0x1,0xC0,0xE0,0xF3,0xF0,0xF3, Bytes = 10
341794, IN,0xF0,0x5,0x67,0x0,0x0,0x1F,0xE7,0xF3,0xAB,0xF3, Bytes = 9
341860,OUT,0xF0,0x5,0x68,0x0,0x1,0x1,0x39,0x52,0xF3,0xF0,0xF3, Bytes = 10
342923, IN,0xF0,0x5,0x68,0x0,0x0,0xD8,0xAD,0xF3,0xAB,0xF3, Bytes = 9
342989,OUT,0xF0,0x5,0x69,0x0,0x1,0x1,0x82,0x4E,0xF3,0xF0,0xF3, Bytes = 10
344052, IN,0xF0,0x5,0x69,0x0,0x0,0x4,0xF7,0xF3,0xAB,0xF3, Bytes = 9
344118,OUT,0xF0,0x5,0x6A,0x0,0x1,0x1,0x4F,0x6B,0xF3,0xF0,0xF3, Bytes = 10
345180, IN,0xF0,0x5,0x6A,0x0,0x0,0x60,0x18,0xF3,0xAB,0xF3, Bytes = 9
345246,OUT,0xF0,0x5,0x6B,0x0,0x1,0x1,0xF4,0x77,0xF3,0xF0,0xF3, Bytes = 10
345627, IN,0xF0,0x0,0x6B,0x5,0x3,0x1,0x1,0x0,0x9,0xEA,0xF3,0x54,0xF3, Bytes = 12
345700,OUT,0xF0,0x5,0x6C,0x0,0x1,0x1,0xD5,0x20,0xF3,0xF0,0x5,0x6C,0x3,0x4,0x4,0x1,0x8,0x3,0x78,0x77,0xF3,0xF3, Bytes = 21
Sample 2
371435, IN,0xF0,0x5,0x8C,0x0,0x0,0x18,0xC7,0xF3,0x1A,0xF3, Bytes = 9
371502,OUT,0xF0,0x5,0x8D,0x0,0x1,0x1,0xE4,0x88,0xF3,0xF0,0xF3, Bytes = 10
372563, IN,0xF0,0x5,0x8D,0x0,0x0,0xC4,0x9D,0xF3,0x1A,0xF3, Bytes = 9
372630,OUT,0xF0,0x5,0x8E,0x0,0x1,0x1,0x29,0xAD,0xF3,0xF0,0xF3, Bytes = 10
373692, IN,0xF0,0x5,0x8E,0x0,0x0,0xA0,0x72,0xF3,0x1A,0xF3, Bytes = 9
373758,OUT,0xF0,0x5,0x8F,0x0,0x1,0x1,0x92,0xB1,0xF3,0xF0,0xF3, Bytes = 10
374820, IN,0xF0,0x5,0x8F,0x0,0x0,0x7C,0x28,0xF3,0x1A,0xF3, Bytes = 9
374887,OUT,0xF0,0x5,0x90,0x0,0x1,0x1,0xCA,0xC0,0xF3,0xF0,0xF3, Bytes = 10
375949, IN,0xF0,0x5,0x90,0x0,0x0,0x2E,0xE7,0xF3,0x1A,0xF3, Bytes = 9
376015,OUT,0xF0,0x5,0x91,0x0,0x1,0x1,0x71,0xDC,0xF3,0xF0,0xF3, Bytes = 10
376396, IN,0xF0,0x0,0x91,0x5,0x3,0x1,0x1,0x0,0xA4,0x3,0xF3,0xFD,0xF3, Bytes = 12
376469,OUT,0xF0,0x5,0x92,0x0,0x1,0x1,0xBC,0xF9,0xF3,0xF0,0x5,0x92,0x3,0x4,0x4,0x1,0x8,0x3,0x8,0x66,0xF3,0xF3, Bytes = 21
376562, IN,0xF0,0x5,0x92,0x0,0x0,0x96,0x52,0xF3,0xA4,0xF3, Bytes = 9
376628,OUT,0xF0,0x5,0x93,0x0,0x1,0x1,0x7,0xE5,0xF3,0xF0,0xF3, Bytes = 10
377692, IN,0xF0,0x5,0x93,0x0,0x0,0x4A,0x8,0xF3,0xA4,0xF3, Bytes = 9
377758,OUT,0xF0,0x5,0x94,0x0,0x1,0x1,0x26,0xB2,0xF3,0xF0,0xF3, Bytes = 10
378820, IN,0xF0,0x5,0x94,0x0,0x0,0x4F,0x84,0xF3,0xA4,0xF3, Bytes = 9
378887,OUT,0xF0,0x5,0x95,0x0,0x1,0x1,0x9D,0xAE,0xF3,0xF0,0xF3, Bytes = 10
379949, IN,0xF0,0x5,0x95,0x0,0x0,0x93,0xDE,0xF3,0xA4,0xF3, Bytes = 9
380015,OUT,0xF0,0x5,0x96,0x0,0x1,0x1,0x50,0x8B,0xF3,0xF0,0xF3, Bytes = 10
381077, IN,0xF0,0x5,0x96,0x0,0x0,0xF7,0x31,0xF3,0xA4,0xF3, Bytes = 9
381144,OUT,0xF0,0x5,0x97,0x0,0x1,0x1,0xEB,0x97,0xF3,0xF0,0xF3, Bytes = 10
381523, IN,0xF0,0x0,0x97,0x5,0x3,0x1,0x1,0x0,0x5E,0x1B,0x5B,0xF3,0xF3, Bytes = 12
I have more, if desired.
Some observations I've been able to see -
First, the numbers at the beginning are aggregate timings, in nanoseconds I think. The 33* values are 33 seconds.
Beginning byte 0xF0
Ending byte 0xF3
Sequence byte, the third byte of every packet increments by 1
It seems the destination device (noted as OUT) increments first and the host (noted as IN) follows.
For the most part, OUT packets are 10 bytes and IN packets are 9 bytes. There are times when this is not true and the packet can be 21 bytes.
Although I notice a begin byte and an end byte next to each other in the 21-byte string, the third byte (sequence #) does not increment.
I am not sure how to understand these longer packets.
This is a point to point communication, there are no other devices connected between these 2.
It is very chatty.
During my testing and probing, my test leads slipped between a couple of pins and killed the microcontroller. Thinking it would make a great project (and it is), I am attempting to recreate the functions of the original microcontroller. This I have pretty much done, with the exception of the communication: figuring out what the devices are saying, and the checksum. I assume the second-to-last byte is the checksum.
Thank you!
Omitting the begin (0xF0) and end (0xF3) bytes, the checksum is a reversed CRC-CCITT over the remaining bytes.
I found it thanks to this website, where I pasted in the bytes: https://www.scadacore.com/tools/programming-calculators/online-checksum-calculator/
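For offline verification, "reversed CRC-CCITT" here is assumed to be the reflected CRC-16/KERMIT variant (polynomial 0x1021 reflected to 0x8408, initial value 0x0000); exactly which bytes of each frame the checksum covers, and its byte order within the frame, should still be confirmed against the captures. A minimal sketch:

```python
def crc16_kermit(data: bytes) -> int:
    """Reflected CRC-CCITT (CRC-16/KERMIT): poly 0x1021 reversed to 0x8408, init 0x0000."""
    crc = 0x0000
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0x8408  # reflected polynomial
            else:
                crc >>= 1
    return crc

# Standard check value for this variant:
print(hex(crc16_kermit(b"123456789")))  # 0x2189
```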

Decode UDP message with Lua

I'm relatively new to lua and programming in general (self taught), so please be gentle!
Anyway, I wrote a lua script to read a UDP message from a game. The structure of the message is:
DATAxXXXXaaaaBBBBccccDDDDeeeeFFFFggggHHHH
DATAx = 4-letter ID, where x is a control character
XXXX = integer identifying the group of the data (the groups are known)
aaaa...HHHH = eight single-precision floating-point numbers
Those eight numbers are what I need to decode.
If I print the message as received, it's something like:
DATA*{V???A?A?...etc.
Using string.byte(), I'm getting a stream of bytes like this (I have "formatted" the bytes to reflect the structure above):
68 65 84 65/42/20 0 0 0/237 222 28 66/189 59 182 65/107 42 41 65/33 173 79 63/0 0 128 63/146 41 41 65/0 0 30 66/0 0 184 65
The first 5 bytes are of course the DATA*. The next 4 are the 20th group of data. The bytes after that are the ones I need to decode, and they correspond to these values:
237 222 28 66 = 39.218
189 59 182 65 = 22.779
107 42 41 65 = 10.573
33 173 79 63 = 0.8114
0 0 128 63 = 1.0000
146 41 41 65 = 10.573
0 0 30 66 = 39.500
0 0 184 65 = 23.000
I've found C# code that does the decode with BitConverter.ToSingle(), but I haven't found anything similar for Lua.
Any ideas?
What Lua version do you have?
This code works in Lua 5.3
local str = "DATA*\20\0\0\0\237\222\28\66\189\59\182\65..."
-- Read two float values starting from position 10 in the string
print(string.unpack("<ff", str, 10)) --> 39.217700958252 22.779169082642 18
-- 18 (third returned value) is the next position in the string
For Lua 5.1 you have to write a special function (or adapt one from François Perrad's git repo):
local function binary_to_float(str, pos)
  -- little-endian IEEE-754 single: b1 is the least significant byte
  local b1, b2, b3, b4 = str:byte(pos, pos + 3)
  local sign = b4 > 0x7F and -1 or 1
  local expo = (b4 % 0x80) * 2 + math.floor(b3 / 0x80)  -- 8-bit exponent
  local mant = ((b3 % 0x80) * 0x100 + b2) * 0x100 + b1  -- 23-bit mantissa
  local n
  if mant + expo == 0 then
    n = sign * 0.0                       -- signed zero
  elseif expo == 0xFF then
    n = (mant == 0 and sign or 0) / 0    -- +/-inf or NaN
  else
    n = sign * (1 + mant / 0x800000) * 2.0 ^ (expo - 0x7F)
  end
  return n
end
local str = "DATA*\20\0\0\0\237\222\28\66\189\59\182\65..."
print(binary_to_float(str, 10)) --> 39.217700958252
print(binary_to_float(str, 14)) --> 22.779169082642
It’s little-endian byte-order of IEEE-754 single-precision binary:
E.g., 0 0 128 63 is:
00111111 10000000 00000000 00000000
(63) (128) (0) (0)
Seeing why that equals 1 requires understanding the very basics of IEEE-754 representation, namely its use of an exponent and a mantissa. See here to start.
See @Egor's answer above for how to use string.unpack() in Lua 5.3, and one possible implementation you could use in earlier versions.
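As a cross-check outside Lua, the same little-endian single-precision decode can be done with Python's struct module, using the first float's bytes from the question:

```python
import struct

# bytes "237 222 28 66" from the capture, read as a little-endian float32
raw = bytes([237, 222, 28, 66])
(value,) = struct.unpack("<f", raw)
print(round(value, 4))  # 39.2177
```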


decodingTCAP message - dialoguePortion

I'm writing a simulator (for learning purposes) for the complete M3UA-SCCP-TCAP-MAP stack (over SCTP). So far the M3UA and SCCP stacks are OK.
M3UA: based on RFC 4666 (Sept 2006)
SCCP: based on ITU-T Q.711-Q.716
TCAP: based on ITU-T Q.771-Q.775
But upon decoding the TCAP part, I got lost on the dialoguePortion.
TCAP is ASN.1 encoded, so everything is tag + len + data.
Wireshark decodes it differently than my decoder does.
The message is:
62434804102f00676b1e281c060700118605010101a011600f80020780a1090607040000010005036c1ba1190201010201163011800590896734f283010086059062859107
Basically, my message BER-decodes as follows.
Note the format: hex(tag) + ( BER tag split into CLS PC TAG, in decimal ) + hex(data)
62 ( 64 32 2 )
  48 ( 64 0 8 ) 102f0067
  6b ( 64 32 11 )
    28 ( 0 32 8 )
      06 ( 0 0 6 ) 00118605010101  OID=0.0.17.773.1.1.1
      a0 ( 128 32 0 )
        60 ( 64 32 0 )
          80 ( 128 0 0 ) 0780
          a1 ( 128 32 1 )
            06 ( 0 0 6 ) 04000001000503  OID=0.4.0.0.1.0.5.3
  6c ( 64 32 12 )
  ...
So I can see a begin[2] message containing otid[8], dialoguePortion[11] and componentPortion[12].
The otid and componentPortion are decoded correctly, but not the dialoguePortion.
The ASN.1 for the dialoguePortion does not mention any of these codes.
Even more confusing, Wireshark decodes it differently (the oid id-as-dialogue is NOT inside the dialoguePortion, but appears as a field after the otid, which is NOT as described in the ITU-T documentation, or not as I'm understanding it).
Wireshark decodes it as:
Transaction Capabilities Application Part
  begin
    Source Transaction ID
      otid: 102f0067
    oid: 0.0.17.773.1.1.1 (id-as-dialogue)
    dialogueRequest
      Padding: 7
      protocol-version: 80 (version1)
        1... .... = version1: True
      application-context-name: 0.4.0.0.1.0.5.3 (locationInfoRetrievalContext-v3)
    components: 1 item
    ...
I can't find any reference for Padding in the DialoguePDU ASN.1.
Can someone point me in the right direction?
I would like to know how to properly decode this message
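The tag + len + data walk the question describes can be sketched as a minimal BER TLV reader (the helper name is mine; it handles only the low-tag-number form seen in this trace):

```python
def parse_tlv(data: bytes, pos: int = 0):
    """Parse one BER TLV (low-tag-number form; short or long definite length)."""
    tag = data[pos]
    cls, constructed, num = tag & 0xC0, bool(tag & 0x20), tag & 0x1F
    length = data[pos + 1]
    pos += 2
    if length & 0x80:                      # long form: low 7 bits = number of length octets
        n = length & 0x7F
        length = int.from_bytes(data[pos:pos + n], "big")
        pos += n
    value = data[pos:pos + length]
    return (cls, constructed, num), value, pos + length

# otid from the message above: tag 0x48 = application(64)/primitive/8, length 4
tag, value, end = parse_tlv(bytes.fromhex("4804102f0067"))
print(tag, value.hex())  # (64, False, 8) 102f0067
```

Recursing into `value` whenever the constructed bit is set walks the whole begin/dialoguePortion/componentPortion tree.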
DialoguePDU format should be simple in this case:
dialogue-as-id OBJECT IDENTIFIER ::= {itu-t recommendation q 773 as(1) dialogue-as(1) version1(1)}

DialoguePDU ::= CHOICE {
  dialogueRequest  AARQ-apdu,
  dialogueResponse AARE-apdu,
  dialogueAbort    ABRT-apdu
}

AARQ-apdu ::= [APPLICATION 0] IMPLICIT SEQUENCE {
  protocol-version         [0] IMPLICIT BIT STRING {version1(0)} DEFAULT {version1},
  application-context-name [1] OBJECT IDENTIFIER,
  user-information         [30] IMPLICIT SEQUENCE OF EXTERNAL OPTIONAL
}
Wireshark is still wrong :-). But then, that is just display: it shows the values correctly, only in the wrong section, probably to make the decoding easier.
What I was missing was the definition of EXTERNAL [8]. The dialoguePortion is declared as EXTERNAL, so now everything makes sense.
For your message, my very own decoder says:
begin [APPLICATION 2] (x67)
  otid [APPLICATION 8] (x4) = 102f0067h
  dialoguePortion [APPLICATION 11] (x30)
    EXTERNAL (x28)
      direct-reference [OBJECT IDENTIFIER] (x7) = 00118605010101h
      encoding:single-ASN1-type [0] (x17)
        dialogueRequest [APPLICATION 0] (x15)
          protocol-version [0] (x2) = 80 {version1 (0)} spare bits = 7
          application-context-name [1] (x9)
            OBJECT IDENTIFIER (x7) = 04000001000503h
  components [APPLICATION 12] (x27)
    invoke [1] (x25)
      invokeID [INTEGER] (x1) = 1d (01h)
      operationCode [INTEGER] (x1) = (22) SendRoutingInfo
      parameter [SEQUENCE] (x17)
        msisdn [0] (x5) = 90896734f2h
          Nature of Address: international number (1)
          Numbering Plan Indicator: unknown (0)
          signal = 9876432
        interrogationType [3] (x1) = (0) basicCall
        gmsc-Address [6] (x5) = 9062859107h
          Nature of Address: international number (1)
          Numbering Plan Indicator: unknown (0)
          signal = 26581970
Now, Wireshark's "Padding: 7" and my "spare bits = 7" both refer to the protocol-version field, defined in Q.773 as:
AARQ-apdu ::= [APPLICATION 0] IMPLICIT SEQUENCE {
  protocol-version         [0] IMPLICIT BIT STRING {version1(0)} DEFAULT {version1},
  application-context-name [1] OBJECT IDENTIFIER,
  user-information         [30] IMPLICIT SEQUENCE OF EXTERNAL OPTIONAL
}
The BIT STRING definition assigns a name to just the leading bit (version1); the remaining 7 bits are not given a name, and Wireshark considers them padding.
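The padding / spare-bits bookkeeping both tools report is plain BER BIT STRING encoding: the first content octet gives the number of unused trailing bits in the last octet. A small sketch (function name is mine):

```python
def decode_bit_string(content: bytes):
    """Decode a BER BIT STRING value: first octet = count of unused trailing bits."""
    unused = content[0]
    bits = []
    for octet in content[1:]:
        for i in range(8):
            bits.append((octet >> (7 - i)) & 1)
    return bits[:len(bits) - unused], unused

# protocol-version content octets 07 80 -> 7 unused bits, leading bit version1(0) set
bits, unused = decode_bit_string(bytes([0x07, 0x80]))
print(bits, unused)  # [1] 7
```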

Getting a 2D histogram of a grayscale image in Julia

Using the Images package, I can open up a color image and convert it to grayscale:
using Images
img_gld = imread("...path to some color jpg...")
img_gld_gs = convert(Image{Gray},img_gld)
#change from floats to Array of values between 0 and 255:
img_gld_gs = reinterpret(Uint8,data(img_gld_gs))
Now I've got a 1920X1080 array of Uint8's:
julia> img_gld_gs
1920x1080 Array{Uint8,2}
Now I want to get a histogram of the 2D array of Uint8 values:
julia> hist(img_gld_gs)
(0.0:50.0:300.0,
6x1080 Array{Int64,2}:
1302 1288 1293 1302 1297 1300 1257 1234 … 12 13 13 12 13 15 14
618 632 627 618 623 620 663 686 189 187 187 188 185 183 183
0 0 0 0 0 0 0 0 9 9 8 7 8 7 7
0 0 0 0 0 0 0 0 10 12 9 7 13 7 9
0 0 0 0 0 0 0 0 1238 1230 1236 1235 1230 1240 1234
0 0 0 0 0 0 0 0 … 462 469 467 471 471 468 473)
But, instead of 6x1080, I'd like 256 slots in the histogram to show total number of times each value has appeared. I tried:
julia> hist(img_gld_gs,256)
But that gives:
(2.0:1.0:252.0,
250x1080 Array{Int64,2}:
So instead of a 256x1080 Array, it's 250x1080. Is there any way to force it to have 256 bins (without resorting to writing my own hist function)? I want to be able to compare different images and I want the histogram for each image to have the same number of bins.
Assuming you want a histogram for the entire image (rather than one per row), you might want
hist(vec(img_gld_gs), -1:255)
which first converts the image to a 1-dimensional vector. (You can also use img_gld_gs[:], but that copies the data.)
Also note the range here: the hist function uses a left-open interval, so it will omit counting zeros unless you use something smaller than 0.
hist also accepts a vector (or range) as an optional argument that specifies the edge boundaries, so
hist(img_gld_gs, 0:256)
should work.
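For readers more familiar with NumPy, the fixed-edges idea translates directly; the random array below is just a stand-in for the real image:

```python
import numpy as np

# stand-in for the 1920x1080 Uint8 grayscale image from the question
img = np.random.randint(0, 256, size=(1920, 1080), dtype=np.uint8)
counts, edges = np.histogram(img, bins=np.arange(257))  # edges 0,1,...,256 -> exactly 256 bins
print(counts.shape)  # (256,)
```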
