How to store a long in a Swift array?

I have a couple of user IDs I want to send in an array, but I can't figure out the correct Swift 3 syntax for creating an array of very long integers. I tried casting, the # prefix, and using as AnyObject, but that did not work.
let idArray = [10211420262370680, 10211420262370680]
Error: integer literal overflows when stored into int
What is the correct way to create an array with such long integers?

Try this instead:
let idArray: [UInt64] = [10_211_420_262_370_680, ...]
As a back-of-the-envelope calculation, every 10 bits buys you about 3 decimal digits. For instance, UInt32 maxes out around 4_000_000_000, and so on.
By the way, the underscores _ above are just syntactic sugar for big number literals ;-)
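If you want to sanity-check that rule of thumb, printing the type bounds is enough (a tiny sketch, not part of the original answer):
print(UInt32.max) // 4294967295, roughly 4 * 10^9
print(Int64.max)  // 9223372036854775807, roughly 9.2 * 10^18
print(UInt64.max) // 18446744073709551615, roughly 1.8 * 10^19
Your 17-digit IDs fit comfortably in either 64-bit type.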

An array of signed longs:
let signed64BitIntegerArray: [Int64] = [-10211420262370680, 10211420262370680]
An array of unsigned longs:
let unsigned64BitIntegerArray: [UInt64] = [10211420262370680, 10211420262370680]

If you need C interop/FFI, use CLong or CUnsignedLong.
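For example, something like this should work (a sketch assuming a 64-bit Apple platform, where CLong is a typealias for Int and CUnsignedLong for UInt; the array names are made up):
let cLongIDs: [CLong] = [10_211_420_262_370_680]          // CLong == Int on 64-bit Apple platforms
let cULongIDs: [CUnsignedLong] = [10_211_420_262_370_680] // CUnsignedLong == UInt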

Related

Converting a string to a base64 byte array in Swift and Java gives different values

In the case of Android everything works perfectly. I want to implement the same feature in iOS too, but I get different values. Please check the description below (the outputs were attached as images).
In the Java/Android case:
I decode the base64 string to a byte array in Java like this:
byte[] data1 = Base64.decode(balance, Base64.DEFAULT);
Output: (shown as an image in the original post)
In the Swift 3 / iOS case:
I tried to decode the base64 string to a byte array in Swift like this:
let data:Data = Data(base64Encoded: balance, options: NSData.Base64DecodingOptions(rawValue: 0))!
let data1: [UInt8] = Array(data) // Data is a collection of UInt8 in Swift 3
Output: (shown as an image in the original post)
Finally solved:
This is due to signed vs unsigned integers: UInt8 ranges from 0 to 255, while Int8 ranges from -128 to 127. Converting the UInt8 array to an Int8 array solves the problem.
let intArray = data1.map { Int8(bitPattern: $0) }
In no case should you try to compare data on two systems the way you just did. That goes for all types, but especially for raw data.
Raw data is NOT presentable without additional context, which means any system that does present it may choose how to do so (raw data may represent some text in UTF-8 or ASCII, a JPEG or PNG image, raw RGB pixel data, an audio sample, or whatever). In your case one system shows it as a list of signed 8-bit integers while the other uses unsigned 8-bit integers for the same thing. Another system might, for instance, show you a hex string, which would look completely different.
As @Larme already mentioned, these look the same because it is safe to assume that one system uses signed and the other unsigned values. So to convert from signed (Android) to unsigned (iOS) you need to convert negative values as unsigned = 256 + signed, so for instance -55 => 256 + (-55) = 201.
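In Swift that conversion is just the bitPattern initializers; a minimal sketch of the -55 example (variable names are only for illustration):
let signedByte: Int8 = -55
let unsignedByte = UInt8(bitPattern: signedByte) // 201, i.e. 256 + (-55)
let backAgain = Int8(bitPattern: unsignedByte)   // -55 again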
If you really need to compare data in your case, it is best to save it into a file as raw data, transfer that file to the other system, and compare the native raw data with the bytes in the file to check whether there really is a difference.
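A minimal sketch of that approach, assuming data is the Data value from the question and iOS 10+ for temporaryDirectory (the file name is arbitrary):
let url = FileManager.default.temporaryDirectory.appendingPathComponent("payload.bin")
try data.write(to: url) // move this file to the other system and compare it byte for byte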
EDIT (from comment):
Printing raw data as a string is a problem, but there are a few ways. The thing is that many bytes are not printable as strings; they may be whitespace or reserved codes, and most notably the value 0 means end-of-string in most cases, yet it may appear in the middle of your byte sequence.
So you already have two ways of printing byte by byte: showing the corresponding Int8 or UInt8 values. As described in the comment, converting directly to a string may not work as easily as
let string = String(data: data, encoding: .utf8) // Will return nil for data that isn't valid UTF-8
One way of converting data to a string is to convert each byte into a corresponding character. Check this code:
let characterSequence = data.map { UnicodeScalar($0) } // Create an array of Unicode scalars from the bytes
let stringArray = characterSequence.map { String($0) } // Create an array of single-character strings from the scalars
let myString = stringArray.reduce("", { $0 + $1 }) // Concatenate the array of strings into a single string
let myString2 = data.reduce("", { $0 + String(UnicodeScalar($1)) }) // Same thing in a single line
Then to test it I used:
let data = Data(bytes: Array(0...255)) // Generates data with byte values 0, 1, 2 ... up to 255
let myString2 = data.reduce("", { $0 + String(UnicodeScalar($1)) })
print(myString2)
The printing result is:
!"#$%&'()*+,-./0123456789:;<=>?#ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~ ¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏÐÑÒÓÔÕÖ×ØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõö÷øùúûüýþ
Then another popular way is using a hex string. It can be displayed as:
let hexString = data.reduce("", { $0 + String(format: "%02hhx",$1) })
print(hexString)
And with the same data as before the result is:
000102030405060708090a0b0c0d0e0f101112131415161718191a1b1c1d1e1f202122232425262728292a2b2c2d2e2f303132333435363738393a3b3c3d3e3f404142434445464748494a4b4c4d4e4f505152535455565758595a5b5c5d5e5f606162636465666768696a6b6c6d6e6f707172737475767778797a7b7c7d7e7f808182838485868788898a8b8c8d8e8f909192939495969798999a9b9c9d9e9fa0a1a2a3a4a5a6a7a8a9aaabacadaeafb0b1b2b3b4b5b6b7b8b9babbbcbdbebfc0c1c2c3c4c5c6c7c8c9cacbcccdcecfd0d1d2d3d4d5d6d7d8d9dadbdcdddedfe0e1e2e3e4e5e6e7e8e9eaebecedeeeff0f1f2f3f4f5f6f7f8f9fafbfcfdfeff
I hope this is enough, but in general you can do pretty much anything with an array of bytes and display it somehow. For instance, you could create an image treating the bytes as RGB with 8 bits per component, if that made sense for your data. It might sound silly, but if you are looking for patterns it can be quite a witty solution.

How to get substring from user input?

I wrote code to get the characters the user enters in a text field and do math with them, like this:
@IBOutlet weak internal var textMeli: UITextField!
var myChar = textMeli.text
var numb = [myChar[0]*3 , myChar[1]*7]
but this is wrong.
textMeli.text is a String.
myChar is a String.
You can't access a Character from a String using bracket notation.
Take a look at the documentation for the String structure.
You'll see that you can access the string's characters through the characters property. This will return a collection of Characters. Initialize a new array with the collection and you can then use bracket notation.
let string = "Foo"
let character = Array(string.characters)[0]
character will be of type Character.
You'll then need to convert the Character to some sort of number type (Float, Int, Double, etc.) to use multiplication.
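For that step, a small sketch (the conversion goes through String, since Int has no Character initializer in Swift 3; the names are made up):
let digitCharacter: Character = "7"
let digitValue = Int(String(digitCharacter)) // Optional(7); nil if the character isn't a number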
Type is important in programming. Make sure you are keeping track so you know what function and properties you can use.
Off the soap box. It looks like you're trying to take a string and convert it into a number. I would skip the step of using characters. Have two text fields, one to accept the first number (as a String) and the other to accept the second number (as a String). Use a number formatter to convert your string to a number. A number formatter will return you an NSNumber. Check out the documentation and you'll see that you can "convert" the NSNumber to any number type you want. Then you can use multiplication.
Something like this:
let firstNumberTextField: UITextField!
let secondNumberTextField: UITextField!
let numberFormatter = NumberFormatter()
let firstNumber = numberFormatter.number(from: firstNumberTextField.text!)!
let secondNumber = numberFormatter.number(from: secondNumberTextField.text!)!
let firstInt = firstNumber.intValue // or doubleValue, floatValue, etc. - whatever type of number you need
let secondInt = secondNumber.intValue
let product = firstInt * secondInt
Dealing with Swift strings is kind of tricky because of the way they deal with Unicode and "grapheme clusters". You can't index into String objects using array syntax like that.
Swift also doesn't treat characters as interchangeable with 8 bit ints like C does, so you can't do math on characters like you're trying to do. You have to take a String and cast it to an Int type.
You could create an extension to the String class that WOULD let you use integer subscripts of strings:
extension String {
    subscript (index: Int) -> String {
        let first = self.startIndex
        let startIndex = self.index(first, offsetBy: index)
        let nextIndex = self.index(first, offsetBy: index + 1)
        return self[startIndex ..< nextIndex]
    }
}
And then:
let inputString = textMeli.text ?? ""
let firstVal = Int(inputString[0])! // force-unwrapped; crashes if the character isn't a digit
let secondVal = Int(inputString[2])!
and
let result = firstVal * 3 + secondVal * 7
Note that the subscript extension above is inefficient and would be a bad way to do any sort of "heavy lifting" string parsing. Each use of square-bracket indexing is O(n), so traversing an entire string this way approaches O(n^2) performance, which is very bad.
The code above also lacks range checking or error handling. It will crash if you pass it a subscript out of range.
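If crashing is a concern, a bounds-checked variant could return an optional instead (a sketch under the same Swift 3 assumptions; the safe label is not part of the original extension):
extension String {
    subscript (safe index: Int) -> String? {
        guard index >= 0, index < self.characters.count else { return nil } // reject out-of-range indices
        let start = self.index(self.startIndex, offsetBy: index)
        return String(self[start])
    }
}
With that, inputString[safe: 5] returns nil instead of crashing on short input.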
Note that it's very strange to take multiple characters as input and then do math on the individual characters as if they were separate values. This seems like a really bad user interface.
Why don't you step back from the details and tell us what you are trying to do at a higher level?

Swift: Errors when using different integer sizes

I've been trying out Swift, since it's obviously the direction that Apple wants us to go in.
However, I've been really annoyed with the fact that you can't seem to add integers of different sizes:
var a: Int64 = 1500
var b: Int32 = 12349
var c = a + b
if a < b { ... }
The yielded error is "Could not find an overload for '+' that accepts the supplied arguments" — obviously because they are distinct types. None of the class methods seem to be of any help in up/down-converting integers.
The same situation applies with any of the type aliases, obviously (CInt + CLong).
I can see a lot of real-world situations where it is immensely practical to be able to do integer arithmetic, let alone comparisons or bitwise operations, on two disparately-sized integers.
How to solve this? Explicit casting with the as operator doesn't seem to work. The Swift language book isn't much help either as it doesn't really discuss this scenario.
The Swift language book does discuss this scenario in the chapter “Numeric Type Conversion”:
let twoThousand: UInt16 = 2_000
let one: UInt8 = 1
let twoThousandAndOne = twoThousand + UInt16(one)
Because both sides of the addition are now of type UInt16, the addition is allowed. The output constant (twoThousandAndOne) is inferred to be of type UInt16, because it is the sum of two UInt16 values.
Applied to your example:
let a: Int64 = 1500
let b: Int32 = 12349
let c = a + Int64(b)
print("The value of c is \(c)")
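The same explicit widening handles the if a < b comparison from your snippet (a sketch):
if a < Int64(b) {
    print("a is less than b")
}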

How can I convert a string to a char array in ActionScript 3?

How do you convert a string into a char array in ActionScript 3.0?
I tried the below code but I get an error:
var temp:ByteArray = new ByteArray();
temp = input.toCharArray();
From the error, I understand that the toCharArray() function cannot be applied to a string (i.e., in my case, input). Please help me out. I am a beginner.
I am not sure if this helps your purpose but you can use String#split():
If you use an empty string ("") as a delimiter, each character in the string is placed as an element in the array.
var array:Array = "split".split("");
Now you can get individual elements using index
array[0] == 's' ; array[1] == 'p' ....
Depending on what you need to do with it, the individual characters can also be accessed with string.charAt(index), without splitting them into an array.

How does ActionScript's writeInt work with Integers?

I wanted to know: how does writeInt treat a 32-bit unsigned or signed integer passed to it?
It is easy to understand how it works with a hexadecimal number. Util.Print will print the corresponding ASCII characters.
0x41424344 will be broken down into four 1-byte characters: A, B, C and D.
It seems like it's different when a decimal integer is passed to writeInt.
for instance,
var test: ByteArray = new ByteArray();
test.writeInt(0x41424344); // prints ABCD
test.writeInt(2590463591); // prints gVg
test.writeInt(1119885898); // prints BÀJ
I am unclear how the Util.Print function treats the integers written into the ByteArray by writeInt.
The characters gVg do not appear to correspond to the integer 2590463591.
According to the definition of writeInt here:
http://livedocs.adobe.com/livecycle/es/sdkHelp/common/langref/flash/utils/ByteArray.html#writeInt%28%29
It states that it works with a 32-bit signed integer.
If someone can elaborate over how it translates the integers to characters, it would be helpful.
EDIT: And how does it handle negative integers?
For instance,
test.writeInt(-11338743); // prints ÿRü
So,
-11338743 = 0xFF52FC09
is that correct?
Thanks.
If you interpret the encoded bytes as ASCII (writeInt writes the four bytes of the value most-significant byte first, since ByteArray defaults to big-endian):
dec hex ascii
1094861636 = 0x41424344 = ABCD
2590463591 = 0x9A675667 = gVg
1119885898 = 0x42C01A4A = BÀJ
Also, note that signed and unsigned ints are written with different functions:
var test:ByteArray = new ByteArray();
test.writeInt(0x41424344);
test.writeUnsignedInt(0x41424344);
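As for the EDIT about negative integers: yes, that reading is correct. writeInt stores the 32-bit two's-complement pattern, and a quick check by hand gives
2^32 - 11338743 = 4294967296 - 11338743 = 4283628553 = 0xFF52FC09
so the four bytes are 0xFF, 0x52, 0xFC and 0x09, which render as ÿ, R, ü and an unprintable tab character, hence ÿRü.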
