This question already has answers here:
How to generate a random number in Swift?
(26 answers)
Closed 7 years ago.
I was wondering if this is the best way to generate a random number in Swift.
This is what I have thought of thus far:
var randomNumber:Int = random() * 100 + 1 //What value does this return?
Is this a viable way to produce a 1 - 100 range in Swift using the random() function? I am not sure it is; I have not dealt with random numbers much in Swift 2. In Java, the equivalent would be Math.random() * 100 + 1.
I'm curious to know what the equivalent would be in Swift 2. Thanks in advance!
let random = Int(arc4random_uniform(100) + 1)
The + 1 is needed because arc4random_uniform(n) generates a random number between 0 and n - 1.
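As an aside, since Swift 4.2 the standard library has a built-in random API, so arc4random_uniform is no longer needed; a minimal sketch of the 1...100 case:
// Int.random(in:) draws uniformly from the given range, both ends inclusive.
let random = Int.random(in: 1...100)
print(random)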
This question already has answers here:
How to know if a number is odd or even in Swift?
(5 answers)
Closed 11 months ago.
I'm absolutely new to development and trying to learn Swift.
Right now I know how to generate a random number, and my next step is:
I'm trying to understand how to check whether my random number (e.g. 127) can be divided by 2 without a remainder.
I have no idea how to do it.
There is a dedicated API in the Standard Library for exactly this purpose, isMultiple(of:):
let random = Int.random(in: 0..<100)
let isEven = random.isMultiple(of: 2)
You can also use the % (remainder) operator. For example:
let randomNumber = Int.random(in: 0..<100)
if randomNumber % 2 == 0 {
    print("\(randomNumber) is even")
} else {
    print("\(randomNumber) is odd")
}
This question already has answers here:
Is floating point math broken?
(31 answers)
Closed 2 years ago.
Can someone explain to me why the following sum gives a wrong result in Dart?
final double result = 90071992547409.9 + 0.01;
print(result);
It prints the number 90071992547409.92
According to Dart documentation:
Dart doubles are 64-bit floating-point numbers as specified in the IEEE 754 standard.
It's because of floating-point arithmetic. In your case (checked with an IEEE 754 converter):
90071992547409.9 = 90071992547409.90625 ~= 90071992547409.91
0.01 = 0.01000000000000000020816681711721685132943093776702880859375 ~= 0.01
90071992547409.91 + 0.01 = 90071992547409.92
The best solution in Dart is to use the decimal package.
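A minimal sketch of that approach, assuming the decimal package has been added to pubspec.yaml:
import 'package:decimal/decimal.dart';

void main() {
  // Decimal stores the value exactly as written, so no binary rounding occurs.
  final result = Decimal.parse('90071992547409.9') + Decimal.parse('0.01');
  print(result); // 90071992547409.91
}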
This question already has answers here:
How to convert a double to an int in Dart?
(11 answers)
How do you round a double in Dart to a given degree of precision AFTER the decimal point?
(28 answers)
Closed 3 years ago.
I want to round a double.
double x = 5.56753;
x.toStringAsFixed(2);
When I run this, it gives 5.57000.
But I want to get 5.57. How do I get it?
The num class contains the function round():
double numberToRound = 5.56753;
print(numberToRound.round());
// prints 6
If you want decimals:
double n = double.parse(numberToRound.toStringAsFixed(2));
print(n);
// prints 5.57
See the suggestion in the comments.
For rounding doubles, check out: https://api.dartlang.org/stable/2.4.0/dart-core/double/round.html
Rounding alone won't work in your case because the docs say:
Returns the integer closest to this.
So it will give 6 instead of 5.57.
Your solution:
double x = 5.56753;
String roundedX = x.toStringAsFixed(2);
print(roundedX); // prints 5.57
This question already has answers here:
How to properly format currency on ios
(8 answers)
How to input currency format on a text field (from right to left) using Swift?
(9 answers)
Closed 4 years ago.
I have values like 100, 1220, 10015; basically, the last 2 digits are cents, and I need to convert them to a dollar (currency) format similar to:
1.00, 12.20, 100.15
Can somebody suggest a quick implementation?
let a = 1011
let b = Double(a) / 100 // 10.11
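If an actual currency string is the goal rather than a bare Double, here is a minimal sketch using Foundation's NumberFormatter (the helper name dollarString(fromCents:) and the fixed en_US locale are illustrative assumptions):
import Foundation

// Hypothetical helper: divide the cent amount by 100, then let
// NumberFormatter handle the currency presentation.
func dollarString(fromCents cents: Int) -> String {
    let dollars = Double(cents) / 100
    let formatter = NumberFormatter()
    formatter.numberStyle = .currency
    formatter.locale = Locale(identifier: "en_US")
    return formatter.string(from: NSNumber(value: dollars)) ?? "\(dollars)"
}

print(dollarString(fromCents: 10015)) // $100.15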
This question already has answers here:
Swift 3 for loop with increment
(5 answers)
Replacement for C-style loop in Swift 2.2
(4 answers)
Express for loops in swift with dynamic range
(2 answers)
Closed 5 years ago.
I am teaching myself to program and I came across this piece of Swift 2.0 code.
someFunction {
    for var i = N; i >= 1; i -= 1 {
        //...
    }
}
This is a "C-Style" code apparently. What exactly is happening in this control flow? Are we starting from N, and subtracting 1 until we get to equal/greater than 1?
Or does the i >= 1 mean that the iteration count must ALWAYS be greater than or equal to one?
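For reference: the loop initializes i to N once, runs the body while the condition i >= 1 holds, and applies i -= 1 after each pass, so it visits N, N-1, ..., 1 and stops as soon as i drops to 0. C-style for loops were removed in Swift 3; a minimal sketch of the modern equivalent using stride(from:through:by:), assuming N is an Int:
let N = 5 // assumed value for illustration
// through: makes the lower bound 1 inclusive, matching the i >= 1 condition.
for i in stride(from: N, through: 1, by: -1) {
    print(i) // prints 5, 4, 3, 2, 1
}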