Swift InputStream status does not match any possible enum value (iOS)

I have an InputStream in my iOS app (Swift 4) and check its status a few seconds after opening it. The returned value is 7:
print(String(self.inputStream!.streamStatus.rawValue)) // prints 7
But there is no corresponding case with raw value 7 in the enum described in the docs: https://developer.apple.com/documentation/foundation/stream.status
How can the value be 7? How can I get a textual representation of the status?
My guess was that 7 might be a combination of other values, obtained by interpreting 7 in binary and checking which bits are set. In that case the lowest three bits are 1 (decimal 1, 2, and 4), corresponding to opening = 1, open = 2, and writing = 4. But this seems strange to me.
EDIT: As Martin said in the comments, 7 is .error.
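For a textual representation, a switch over Stream.Status works; here is a minimal sketch (the helper name is mine):
import Foundation

// Map Stream.Status to a readable name. The raw values follow the
// documented order of the cases; .error is 7, matching the question.
func describe(_ status: Stream.Status) -> String {
    switch status {
    case .notOpen: return "notOpen (0)"
    case .opening: return "opening (1)"
    case .open:    return "open (2)"
    case .reading: return "reading (3)"
    case .writing: return "writing (4)"
    case .atEnd:   return "atEnd (5)"
    case .closed:  return "closed (6)"
    case .error:   return "error (7)"
    @unknown default: return "unknown (\(status.rawValue))"
    }
}

// print(describe(inputStream.streamStatus)) // "error (7)"
When the status is .error, the stream's streamError property holds the underlying error.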


NodeMCU Lua integer max value is 2^31

Lua 5.1.4 on SDK 3.0.1-dev(fce080e)
Trying to use node.dsleepMax() and it returns a much smaller number than expected (147324921). Then I tried to manually set the sleep time in node.dsleep to the 32-bit max value (4294967295) and it only slept for around 30 minutes.
Tried the following:
sleeptime = 4294967295
>
=print(sleeptime)
2147483647
which is 2^31 - 1.
I also did a loop adding to a variable, and it becomes negative when it reaches 2^31.
Questions:
Why is the variable wrapping at 2^31?
Isn't node.dsleep supposed to accept a 64-bit value with SDK 2.1 and above?
Regards,
Cesar
You already got some feedback regarding int vs. float. As for dsleep, the documentation doesn't explicitly state that it accepts 64-bit values, but that's indeed what happens, as per https://github.com/nodemcu/nodemcu-firmware/pull/2358 (since April 2018).
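The wrap at 2^31 is ordinary two's-complement overflow of a 32-bit signed integer, which the integer build of NodeMCU uses for Lua numbers. A minimal Swift sketch of the same arithmetic (Swift used purely for illustration):
// Parsing 4294967295 into a 32-bit signed int saturates at Int32.max,
// matching the REPL output 2147483647; arithmetic past that wraps negative.
let literal: UInt64 = 4_294_967_295
let clamped = Int32(clamping: literal)   // 2147483647, i.e. 2^31 - 1
let wrapped = Int32.max &+ 1             // -2147483648 (&+ wraps instead of trapping)
print(clamped, wrapped)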

No '...' candidates produce 'Range<String.Index>'

While converting an old iOS app to Swift 3.0 I hit the following issue:
The code is:
cutRange = numberString.index(numberString.startIndex, offsetBy:2)...numberString.index(numberString.startIndex, offsetBy:5)
The error message I get is:
No '...' candidates produce the expected contextual result type 'Range<String.Index>' (aka 'Range<String.CharacterView.Index>')
I have seen a few posts related to the subject, but was not very satisfied.
So what is the simplest way to solve this problem?
In Swift 3, two range operators generate different results:
closed range operator ... -> ClosedRange (by default)
(half open) range operator ..< -> Range (by default)
So, assuming your cutRange is declared as Range<String.Index>, you need to use the half-open range operator ..<:
cutRange = numberString.index(numberString.startIndex, offsetBy:2)..<numberString.index(numberString.startIndex, offsetBy:6)
(Please do not miss that the last offset changes to 6: a closed range through offset 5 covers the same characters as a half-open range up to offset 6.)
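A quick usage sketch (the sample string is hypothetical); in current Swift, subscripting with the range yields a Substring:
let numberString = "0123456789"
let cutRange: Range<String.Index> =
    numberString.index(numberString.startIndex, offsetBy: 2)
    ..< numberString.index(numberString.startIndex, offsetBy: 6)
print(numberString[cutRange]) // "2345" (offsets 2, 3, 4, 5)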

Irregularities in Gforth's conversion to doubles

I'm fairly confused about how the s>d and d>s functions work in Forth.
From what I've read, typing 16.0 will put 160 0 on the stack (since it takes up two cells) and d. will show 160.
Now, if I enter 16 s>d I would expect the stack to be 160 0 and d. to show 160 like in the previous example. However, the stack is 16 0 and d. is 16.
Am I entering doubles incorrectly? Is s>d not as simple as "convert a single-cell value into a double-cell value"? Is there any reason for this irregularity? Any clues would be much appreciated.
Gforth interprets all of these the same: 1.60, 16.0, and 160., i.e. 160 converted to a double number. Whereas 16 s>d converts 16 to a double number.
ANS Forth only mandates that when the text interpreter processes a number that is immediately followed by a decimal point and is not found as a definition name, it shall convert it to a double-cell number. But Gforth goes beyond that: http://www.complang.tuwien.ac.at/forth/gforth/Docs-html/Number-Conversion.html#Number-Conversion
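The key point is that the decimal point only changes how the digits are parsed (its position does not affect the value), while s>d simply sign-extends one cell into two. A rough Swift model of 32-bit cells, purely to illustrate the two paths:
// A double-cell number modeled as (low, high) cells.
typealias DoubleCell = (low: UInt32, high: Int32)

// s>d : sign-extend a single cell into two cells.
func sToD(_ n: Int32) -> DoubleCell {
    (low: UInt32(bitPattern: n), high: n < 0 ? -1 : 0)
}

// Typing 16.0 parses the digits 1, 6, 0 as 160, then stores it as a double:
let fromLiteral = sToD(160)  // (160, 0) — what `16.0` pushes
let fromSToD    = sToD(16)   // (16, 0)  — what `16 s>d` pushes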

Understanding call x"91"

Can someone please help me understand CALL X"91" function 11 and function 12 with a simple example? I have tried to search and couldn't understand it. Right now I am using this code in COBOL under a UNIX environment. Does this call work in a Windows environment as well?
http://opencobol.add1tocobol.com/#what-are-the-xf4-xf5-and-x91-routines
The CALL's X"F4", X"F5", X"91" are from MF.
You can find them in the online MF docs under Library Routines.
F4/F5 are for packing/unpacking bits from/to bytes.
91 is a multi-use call; the implemented subfunctions are get/set COBOL switches (11, 12) and get number of call parameters (16).
Use
CALL X"F4" USING
BYTE-VAR
ARRAY-VAR
RETURNING STATUS-VAR
to pack the last bit of each byte in the 8-byte ARRAY-VAR into the corresponding bits of the 1-byte BYTE-VAR.
The X"F5" routine does the reverse: it takes the eight bits of the byte and moves them to the corresponding occurrences within the array.
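In other words, F4 packs eight low bits into one byte and F5 unpacks them again. Here is a rough Swift sketch of that bit manipulation (illustrative only, not the MF API; the bit ordering is an assumption, so check the MF docs for the exact mapping):
// Pack the low bit of each of 8 bytes into one byte (the X"F4" idea) and
// unpack a byte back into 8 bytes holding 0 or 1 (the X"F5" idea).
func pack(_ array: [UInt8]) -> UInt8 {
    precondition(array.count == 8)
    return array.enumerated().reduce(0) { acc, pair in
        acc | ((pair.element & 1) << (7 - pair.offset))
    }
}

func unpack(_ byte: UInt8) -> [UInt8] {
    (0..<8).map { (byte >> (7 - $0)) & 1 }
}

// pack([1, 0, 1, 1, 0, 0, 0, 1]) == 0b10110001
// unpack(0b10110001) == [1, 0, 1, 1, 0, 0, 0, 1]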
X"91" is a multi-function routine.
CALL X"91" USING
RESULT-VAR
FUNCTION-NUM
PARAMETER-VAR
RETURNING STATUS-VAR
As mentioned by Roger, OpenCOBOL supports FUNCTION-NUM of 11, 12 and 16.
11 and 12 get and set the on/off status of the eight run-time OpenCOBOL switches definable in the SPECIAL-NAMES paragraph. 16 returns the number of call parameters given to the current module.
X"91" is a general library routine; for a complete list of these, see the MF documentation.
That documentation also specifies what function 11 and function 12 do: they set/read the COBOL run-time switches 0-7 and the internal debugging mode switch.
Other than via these library routines, you can also read the switches one by one from COBOL and set "some" of them via the SET statement.

iOS SecItemCopyMatching RSA public key format?

I'm trying to extract a 1024-bit RSA public key from an already generated key pair (two SecKeyRefs), in order to send it over the wire. All I need is a plain (modulus, exponent) pair, which should take up exactly 131 bytes (128 for the modulus and 3 for the exponent).
However, when I fetch the key info as an NSData object, I get 140 bytes instead of 131. Here's an example result:
<30818902 818100d7 514f320d eacf48e1 eb64d8f9 4d212f77 10dd3b48 ba38c5a6
ed6ba693 35bb97f5 a53163eb b403727b 91c34fc8 cba51239 3ab04f97 dab37736
0377cdc3 417f68eb 9e351239 47c1f98f f4274e05 0d5ce1e9 e2071d1b 69a7cac4
4e258765 6c249077 dba22ae6 fc55f0cf 834f260a 14ac2e9f 070d17aa 1edd8db1
0cd7fd4c c2f0d302 03010001>
After retrying the key generation a couple of times and comparing the resulting NSData objects, the bytes that remain the same for all keys are the first 7:
<30818902 818100>
The last three bytes look like the exponent (65537, a common value). There are also two bytes between the "modulus" and the exponent:
<0203>
Can someone with more crypto experience help me identify what encoding this is? DER? How do I properly decode the modulus and exponent?
I tried manually stripping out the modulus and exponent using
NSData* modulus = [keyBits subdataWithRange:(NSRange){ 7, 128 }];
NSData* exponent = [keyBits subdataWithRange:(NSRange){ 7 + 128 + 2, 3 }];
but I get errors when trying to decrypt data which the remote host encoded using that "key".
EDIT:
Here's a gist of the solution I ended up using to unpack the RSA blob: https://gist.github.com/vl4dimir/6079882
Assuming you want the solution to work under iOS, please have a look at this thread. The post confirms that the encoding is DER and shows how to extract the exponent and modulus from the NSData object you started with.
There is another solution that won't work on iOS, but will work on desktop systems (including Mac OS X) that have OpenSSL installed, in this thread. Even if you are looking for an iOS-only solution, you can still use it to verify that your code is working correctly.
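For reference, the blob is a DER-encoded PKCS#1 RSAPublicKey, i.e. SEQUENCE { INTEGER modulus, INTEGER exponent }. The header bytes 30 81 89 are the SEQUENCE tag and its 137-byte length, 02 81 81 introduces the 129-byte modulus INTEGER (a leading 00 sign byte plus 128 value bytes), and 02 03 01 00 01 is the exponent 65537. A minimal Swift sketch of a parser for exactly this two-integer layout (not a general DER parser; it assumes well-formed input):
import Foundation

// Parse a PKCS#1 RSAPublicKey: SEQUENCE { INTEGER modulus, INTEGER exponent }.
func parseRSAPublicKey(_ der: Data) -> (modulus: Data, exponent: Data)? {
    var i = der.startIndex

    // DER length: one byte < 0x80, or 0x80 | N followed by N length bytes.
    func readLength() -> Int? {
        guard i < der.endIndex else { return nil }
        let first = der[i]; i += 1
        if first < 0x80 { return Int(first) }
        var length = 0
        for _ in 0..<Int(first & 0x7F) {
            guard i < der.endIndex else { return nil }
            length = (length << 8) | Int(der[i]); i += 1
        }
        return length
    }

    // Read an INTEGER, stripping the leading 0x00 sign-padding byte if present.
    func readInteger() -> Data? {
        guard i < der.endIndex, der[i] == 0x02 else { return nil } // INTEGER tag
        i += 1
        guard let len = readLength(), i + len <= der.endIndex else { return nil }
        var value = der[i ..< i + len]; i += len
        if value.first == 0x00 { value = value.dropFirst() }
        return Data(value)
    }

    guard i < der.endIndex, der[i] == 0x30 else { return nil } // SEQUENCE tag
    i += 1
    guard readLength() != nil,
          let modulus = readInteger(),
          let exponent = readInteger() else { return nil }
    return (modulus, exponent)
}
With the 140-byte blob above, this returns the 128-byte modulus and the 3-byte exponent 01 00 01 (65537).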
