Given an array of ones and zeroes, convert the equivalent binary value to an integer.
E.g., [0, 0, 0, 1] is treated as 0001, which is the binary representation of 1.
Testing: [0, 0, 0, 1] ==> 1
Testing: [0, 0, 1, 0] ==> 2
Testing: [0, 1, 0, 1] ==> 5
Testing: [1, 0, 0, 1] ==> 9
Testing: [0, 0, 1, 0] ==> 2
Testing: [0, 1, 1, 0] ==> 6
Testing: [1, 1, 1, 1] ==> 15
Testing: [1, 0, 1, 1] ==> 11
For four-element arrays I solved everything with the code below:
int binaryArrayToNumber(List<int> arr) {
  return (arr[0] * 8) +
      (arr[1] * 4) +
      (arr[2] * 2) +
      (arr[3] * 1);
}
How do I make this work automatically for larger arrays?
A traditional approach would be to iterate over the binary digits and to perform bitshifts and ORs:
int binaryArrayToNumber(List<int> digits) {
  var result = 0;
  for (var digit in digits) {
    result <<= 1;    // Shift the accumulated bits left by one.
    result |= digit; // OR in the new lowest bit.
  }
  return result;
}
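For example, for the test case [1, 0, 1, 1] ==> 11, the running result steps through 1, 2, 5, 11: each shift doubles the accumulated value and the OR appends the next bit.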
A more concise, less error-prone (but less efficient) approach would be to convert the list of digits to a String and then parse it:
int binaryArrayToNumber(List<int> digits) =>
    int.parse(digits.join(""), radix: 2);
It's also good to learn the methods on the core data types, like List.fold:
int binaryArrayToNumber(List<int> digits) =>
    digits.fold(0, (prev, cur) => 2 * prev + cur);
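For comparison (my own sketch, not part of the original answers, which are in Dart), the same fold pattern reads almost identically in Swift:

let digits = [1, 0, 1, 1]
let value = digits.reduce(0) { $0 * 2 + $1 } // 11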
How do I sort an integer array based on the count of duplicate values? Here, values with fewer duplicates should come first.
Input: [5, 2, 1, 2, 4, 4, 1, 1, 2, 3, 3, 6]
Output: [5, 6, 4, 4, 3, 3, 2, 2, 2, 1, 1, 1]
Using Martin's comment, here is another approach, which aims to reduce the number of loops and conditions we write ourselves by using some functions provided by Swift.
// Input
let numbers = [1, 1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 6]

// Count the occurrences, based on Martin's comment
let countDict = numbers.reduce(into: [:], { $0[$1, default: 0] += 1 })

// Get a sorted array of the countDict keys, sorted by value, which
// is the number of occurrences
let sortedKeys = countDict.keys
    .sorted { countDict[$0, default: 0] < countDict[$1, default: 0] }

// Initialize an empty array to hold the final sorted numbers
var sortedNumbers: [Int] = []

// Add the elements into sortedNumbers in their desired order
for key in sortedKeys {
    sortedNumbers.append(contentsOf: repeatElement(key, count: countDict[key, default: 0]))
}

// Prints [5, 6, 4, 4, 3, 3, 1, 1, 1, 2, 2, 2] based on the above input
print(sortedNumbers)
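As an aside (my own condensation, not part of the original answer), the final loop can be folded into a single flatMap over the sorted keys:

let sortedNumbers = countDict.keys
    .sorted { countDict[$0, default: 0] < countDict[$1, default: 0] }
    .flatMap { Array(repeating: $0, count: countDict[$0, default: 0]) }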
let numbers = [5, 2, 1, 2, 4, 4, 1, 1, 2, 3, 3, 6]
let sortedNumber = numbers.sorted()
print("Input: ", sortedNumber)

// Count how many times each value occurs.
var dict = [Int: Int]()
for item in sortedNumber {
    let isExist = dict.contains(where: { $0.key == item })
    if !isExist {
        dict[item] = 1
    } else {
        if let value = dict[item] {
            dict[item] = value + 1
        }
    }
}

// Sort the (value, count) pairs by count, ascending.
var finalArray = [Int]()
let sortedArray = dict.sorted { (first, second) -> Bool in
    return first.value < second.value
}

// Emit each value as many times as it occurred.
for d in sortedArray {
    for _ in 1...d.value {
        finalArray.append(d.key)
    }
}
print("Output: ", finalArray)
Input: [1, 1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 6]
Output: [5, 6, 4, 4, 3, 3, 2, 2, 2, 1, 1, 1]
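A caveat that applies to both answers above: when two values occur the same number of times, their relative order depends on dictionary iteration order, which is unspecified in Swift. A sketch of a deterministic variant of the second answer's sort, assuming ties should break by ascending value:

let sortedArray = dict.sorted {
    ($0.value, $0.key) < ($1.value, $1.key) // compare counts first, then break ties by key
}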
In Swift 3, how can we calculate the sum of every 5 elements in an array of Int?
For example, we have the array [1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 12, 23]:
1 + 2 + 3 + 4 + 5 = 15
6 + 7 + 8 + 9 + 0 = 30
12 + 23 + 0 + 0 + 0 = 35
The result would be something like [15, 30, 35].
Here is my solution in a playground:
//: Playground - noun: a place where people can play
import UIKit

var arr = [1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3]
let chunkSize = 5

// Split the array into chunks of at most chunkSize elements.
let chunks = stride(from: 0, to: arr.count, by: chunkSize).map {
    Array(arr[$0..<min($0 + chunkSize, arr.count)])
}
print(chunks)

// Sum each chunk.
let summ = chunks.map { $0.reduce(0, { $0 + $1 }) }
print(summ)
OUTPUT:
[[1, 1, 1, 1, 1], [2, 2, 2, 2, 2], [3, 3, 3, 3, 3]]
[5, 10, 15]
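As a quick check (not part of the original answer), the same approach applied to the array from the question produces the expected result. No zero-padding is needed, because a shorter final chunk simply contributes fewer terms to its sum:

let input = [1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 12, 23]
let sums = stride(from: 0, to: input.count, by: 5).map {
    input[$0..<min($0 + 5, input.count)].reduce(0, +)
}
print(sums) // [15, 30, 35]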
Take a look at:
Finding sum of elements in Swift array
Let's say we can mark each weekday as active or inactive. I need to pass the system an integer saying which days I marked active or inactive, and to build this integer I use an array like [1, 1, 1, 1, 1, 1, 1]. In this array every day of the week is marked active, which in hexadecimal is 0000007F.
If I use [0, 0, 0, 0, 0, 0, 1], it means hexadecimal 00000001. So my question is: how do I create the hexadecimal value from the array and then get an integer from it? In the 0000007F case it should be 127.
I assume that it should be something like this:
let array = [1, 1, 1, 1, 1, 1, 1]
let hexadecimal = array.toHexadecimal
let intNumber = hexadecimal.toInt
print(intNumber) // prints 127
Also, I guess it can be, for example, an array of ints like [0, 1, 1, 1, 1, 0, 1], which means that Monday and Wednesday through Saturday (inclusive) are active days.
You can use the reduce method to sum up your binary array (inspired by this answer) and the String(radix:) initializer to convert your integer to a hexadecimal string:
Swift 3 • Xcode 8 or later
let binaryArray = [1, 1, 1, 1, 1, 1, 1]
let integerValue = binaryArray.reduce(0, { $0 * 2 + $1 })
let hexaString = String(integerValue, radix: 16) // "7f"
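Note that integerValue is already the integer the question asks for. To go the other way, from the hexadecimal string back to an integer, the standard Int(_:radix:) failable initializer can be used:

print(integerValue)                        // 127
let backToInt = Int(hexaString, radix: 16) // Optional(127)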
I was trying to get the nullity and kernel of a matrix over the complex field in Maxima.
I get strange results, though.
I can define a matrix A:
M : matrix([0, 1, 1, 0], [-1, 0, 0, 1], [0, 0, 0, 1], [0, 0, -1, 0]);
A : M + %i * ident(4);
... for reference, it looks like this:
[ %i   1   1   0 ]
[ -1  %i   0   1 ]
[  0   0  %i   1 ]
[  0   0  -1  %i ]
If I then compute the nullity with nullity(A), I get 3.
If I compute the rank with rank(A), I also get 3.
And if I compute the nullspace with nullspace(A), I get:
span([-1, %i, 0, 0], [-%i, -1, 0, 0], [2*%i, 2, 0, 0])
But this is pretty weird, because -%i * second(...) is [-1, %i, 0, 0], which is the first vector.
And indeed, when I do NullSpace[{{i, 1, 1, 0}, {-1, i, 0, 1}, {0, 0, i, 1}, {0, 0, -1, i}}] in Mathematica, I get that the nullspace has basis [%i, 1, 0, 0] and is 1-dimensional (not 3-dimensional).
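In fact, all three vectors Maxima returns are scalar multiples of one another: [-1, %i, 0, 0] = -%i * [-%i, -1, 0, 0] and [2*%i, 2, 0, 0] = -2 * [-%i, -1, 0, 0]. So they span only a one-dimensional space, and by rank-nullity a 4x4 matrix of rank 3 must have nullity 4 - 3 = 1, not 3.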
What am I doing wrong?
You are doing everything right, as far as I can tell. The problem is a bug in Maxima, which I have reported: https://sourceforge.net/p/maxima/bugs/3158/
I don't see any simple way to work around it. I am working on fixing the bug.
Disclaimer: I'm a beginner.
I have a 16-element array limited to 0s and 1s. I'm trying to create a new array that contains only the index values of the 1s in the original array.
I currently have:
one_pos = []
image_flat.each do |x|
  if x == 1
    p = image_flat.index(x)
    one_pos << p
    image_flat.at(p).replace(0)
  end
end
The image_flat array is [0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
With the code above, one_pos returns [3, 3] rather than the [3, 5] that I'd expect.
Where am I going wrong?
Where am I going wrong?

When you call
image_flat.index(x)
it only returns the index of the first occurrence of x in the image_flat array. Both times your block finds a 1, index(1) locates the first 1 in the array, at position 3, which is why one_pos ends up as [3, 3].
I guess there are better solutions, like this one:
image_flat.each_with_index do |v, i|
  one_pos << i if v == 1
end
Try using each_with_index (http://apidock.com/ruby/Enumerable/each_with_index) on your array.
image_flat = [0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
one_pos = []
image_flat.each_with_index do |value, index|
  if value == 1
    one_pos << index
  end
end
I think this is the most elegant solution here:
image_flat.each_index.select { |i| image_flat[i] == 1 }
Here is a solution that does not reach outside the enumerable block, although it does require chaining:
image_flat.each_with_index.select { |im, i| im == 1 }.map { |arr| arr[1] }
It's chained and requires an additional pass, so Gena Shumilkin's answer will probably be more optimal for larger arrays.
This was what I originally thought Gena Shumilkin was trying to reach, until I realized that solution used each_index instead of each_with_index.