I was wondering how to round a UISlider value to an Int value. Right now it is displaying floats. I am using Swift 3 at the moment.
@IBOutlet var slidermove: UISlider!

@IBAction func SliderVal(_ sender: AnyObject) {
    print(slidermove)
    let x = round(slidermove)
}
round() does not work here; it gives an error. Thanks to anyone who helps.
You should round its value instead of the UISlider itself:
let x = roundf(slider.value) // x is Float
If you need an Int instead of Float, just wrap it with Int initializer:
let x = Int(round(slider.value)) // x is Int
To round a Float to the nearest integer and get the result as
an Int, the (perhaps not so well-known) function lrintf()
can be used. Example:
let x = Float(12.6)
let i = lrintf(x)
print(i) // 13
In your case:
let i = lrintf(slider.value) // `i` is an `Int`
Firstly, thanks for all upcoming answers.
I am new to Swift programming and I am testing out many things. I am trying to build a BMI app, and I am using print() to check the values of my variables.
I cannot understand why the imc value is 0. Did I miss something? What is the logic? I tried hard-coding the quotient as 90/32400 (or x/ySquare) with the same result: quotien is still 0.
import UIKit

class ViewController: UIViewController {

    @IBOutlet weak var weightTextField: UITextField!
    @IBOutlet weak var heightTextfield: UITextField!
    @IBOutlet weak var resultLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    @IBAction func calculateButton(_ sender: Any) {
        imcCalculator()
    }

    func imcCalculator() {
        let myHeight = Int(heightTextfield.text!)
        let myWeight = Int(weightTextField.text!)
        let x = myWeight!
        let y = myHeight!
        //let imc = Int(Int(x / pow(y, 2)) * 10000)
        let ySquare = y * y
        let quotien = 90 / 32400
        let imc = (x / ySquare) * 10000
        if imc > 25 {
            resultLabel.text = " Your BMI is \(imc) . You are overweight"
        } else if imc > 18 {
            resultLabel.text = " Your BMI is \(imc) . You have a normal weight"
        } else {
            resultLabel.text = "Your BMI is \(imc) . You are underweight"
        }
        print(x)
        print(y)
        print(ySquare)
        print(quotien)
        print(imc)
    }
}
What's happening is called truncation, or integer division. The result of integer division differs from language to language, and as stated by the Swift docs:
For integer types, any remainder of the division is discarded
That's why let quotien = 90 / 32400 will give 0 as a result.
I would suggest you use Doubles instead, and your code might look like this:
func imcCalculator() {
    // sample values: in your app, read them from the text fields instead
    guard let myHeight = Double("6"), let myWeight = Double("70") else {
        fatalError("Error in the text fields")
    }
    let x = myWeight
    let y = myHeight
    //let imc = Int(Int(x / pow(y, 2)) * 10000)
    let ySquare: Double = y * y
    let quotien: Double = 90.0 / 32400.0
    let imc: Double = (x / ySquare) * 10000
    let imcString: String = String(format: "Your BMI is %.2f", imc)
    if imc > 25 {
        resultLabel.text = imcString + " . You are overweight"
    } else if imc < 25 && imc > 18 {
        resultLabel.text = imcString + " . You have a normal weight"
    } else {
        resultLabel.text = imcString + " . You are underweight"
    }
    print("x =", x)
    print("y =", y)
    print("ySquare =", ySquare)
    print("quotien =", quotien)
    print("imc =", imc)
}
The point is: arithmetic operations between elements of a certain type give results of the same type.
Thus, when dividing, for example, 1 by 2, you should expect the result to be an integer too. By convention, the result of the division is the integer part of the quotient: 1 divided by 2 (in real-number division) gives 0.5, and the integer part of that is 0.
On the other hand, 1.0/2.0 is 0.5, since both the dividend and divisor are inferred to be Doubles. If you don't add the .0 after at least one of them, the fractional part is discarded.
You can try this in a playground:
3/10 //0
3.0/10.0 //0.3
3.0/10 //0.3
3/10.0 //0.3
As noted by @Martin R, the result of integer division differs from the quotient of Euclidean division when the dividend (numerator) is negative, since truncation always rounds toward zero. Here is what is meant by that:
In integer division: (-3)/10 equals 0
In Euclidean division: The quotient of (-3)/10 is -1
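A quick playground check makes the distinction concrete; the floored (Euclidean) quotient can be recovered by flooring a floating-point division:

```swift
import Foundation

let truncated = (-3) / 10              // 0: Swift truncates toward zero
let floored = Int(floor(-3.0 / 10.0))  // -1: the floored (Euclidean) quotient
print(truncated, floored)              // 0 -1
```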
So basically I am looking to choose one of the 4 different coloured balls at random to come into the scene; each ball already has an animation, physics properties, and movement & spacing that I have coded. I am not sure exactly how to make the array and then choose from it at random, so that one ball chosen at random comes into the scene.
To make it more clear what I'm asking here's some code (I only use two balls in this code so you don't have to read as much):
var moveandremove = SKAction() // this is in my ballScene.swift
The spawn runBlock is inside didMoveToView:
let spawn = SKAction.runBlock({
    () in
    self.allballs()
})

let delay = SKAction.waitForDuration(2.0)
let SpawnDelay = SKAction.sequence([spawn, delay])
let spawndelayforever = SKAction.repeatActionForever(SpawnDelay)
self.runAction(spawndelayforever)

let distance = CGFloat(brnball.frame.width * 20 + brnball.frame.width)
let moveball = SKAction.moveByX(-distance, y: 0, duration: NSTimeInterval(0.003 * distance))
let removeball = SKAction.removeFromParent()
moveandremove = SKAction.sequence([moveball])
}
func allballs() {
    TextureAtlasblk = SKTextureAtlas(named: "blkball")
    for i in 1...TextureAtlasblk.textureNames.count {
        let Name = "blkball_\(i)"
        blkarray.append(SKTexture(imageNamed: Name))
    }
    blkball = SKSpriteNode(imageNamed: "blkball_1")
    blkball.position = CGPoint(x: CGRectGetMidX(self.frame) + 100, y: CGRectGetMidY(self.frame))
    blkball.zPosition = 7
    blkball.setScale(0.1)
    self.addChild(blkball)
    blkball.runAction(SKAction.repeatActionForever(SKAction.animateWithTextures(blkarray, timePerFrame: 0.2)))

    //brownball
    TextureAtlasbrn = SKTextureAtlas(named: "brnball")
    for i in 1...TextureAtlasbrn.textureNames.count {
        let Name = "brnball_\(i)"
        brnarray.append(SKTexture(imageNamed: Name))
    }
    brnball = SKSpriteNode(imageNamed: "brnball_1")
    brnball.position = CGPoint(x: CGRectGetMidX(self.frame) + 50, y: CGRectGetMidY(self.frame))
    brnball.zPosition = 7
    brnball.setScale(0.1)
    self.addChild(brnball)
    brnball.runAction(SKAction.repeatActionForever(SKAction.animateWithTextures(brnarray, timePerFrame: 0.2)))
Here is my terrible starting point at trying to make an array to choose from each ball (this is inside my allballs() function):
var ballarray: NSMutableArray = [blkball, brnball, yelball, bluball]
runAction(moveandremove)
I am new to Swift and pretty hopeless; it would be awesome if someone could help me out :)
Thanks
It's hard for me to find the array that you're talking about in your code. But nevertheless, I can still show you how.
Let's say we have an [Int]:
let ints = [10, 50, 95, 48, 77]
And we want to get a randomly chosen element of that array.
As you may already know, you use the subscript operator with the index of the element to access an element in the array, e.g. ints[2] returns 95. So if you give a random index to the subscript, a random item in the array will be returned!
Let's see how can we generate a random number.
The arc4random_uniform function returns a uniformly distributed random number between 0 and one less than the parameter. Note that this function takes a UInt32 as a parameter and the return value is of the same type, so you need to do some casting:
let randomNumber = Int(arc4random_uniform(UInt32(ints.count)))
With randomNumber, we can access a random element in the array:
let randomItem = ints[randomNumber]
Try to apply this technique to your situation.
Here's a generic method to do this as well:
func randomItemInArray<T>(array: [T]) -> T? {
    if array.isEmpty {
        return nil
    }
    let randomNumber = Int(arc4random_uniform(UInt32(array.count)))
    return array[randomNumber]
}
Note that if the array passed in is empty, it returns nil.
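For example, assuming the function above is in scope, a call site with a sample [Int] would look like this:

```swift
let ints = [10, 50, 95, 48, 77]
if let item = randomItemInArray(array: ints) {
    print(item) // one of the five elements, chosen uniformly at random
}
```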
You could make and extension for Array that returns a random element.
extension Array {
    func randomElement() -> Element {
        // arc4random_uniform(n) already returns a value in 0..<n,
        // so pass count, not count - 1 (which would skip the last element)
        let i = Int(arc4random_uniform(UInt32(count)))
        return self[i]
    }
}
You could take that a step further and allow a function to be applied directly to a random element.
mutating func randomElement(perform: (Element) -> Element) {
    let i = Int(arc4random_uniform(UInt32(count)))
    self[i] = perform(self[i])
}
You can use this function when using an array of reference types.
func randomElement(perform: (Element) -> ()) {
    let i = Int(arc4random_uniform(UInt32(count)))
    perform(self[i])
}
I don't know why I got this error
Cannot assign to value: 'x' is a 'let' constant
with this code :
func swap(x: AnyObject, y: AnyObject) {
    let tmp = x
    x = y
    y = tmp
}
Function parameters are constants in Swift. In previous versions you could make them mutable by marking them with var, but this is going away in Swift 3.
If you want to actually swap the passed values, you have to declare them inout:
func swap(x: inout AnyObject, y: inout AnyObject) {
    let tmp = x
    x = y
    y = tmp
}
or just...
func swap(x: inout AnyObject, y: inout AnyObject) {
    (x, y) = (y, x)
}
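inout arguments are passed with &. A minimal sketch of a call site (the string values here are just placeholders):

```swift
import Foundation

var a: AnyObject = "first" as NSString
var b: AnyObject = "second" as NSString
swap(&a, &b)   // a is now "second", b is "first"
```

Note that the standard library already provides a generic swap(_:_:) that does exactly this for any type, so in practice you rarely need to write your own.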
I wrote a function:
func rms16(buffer: Int, bufferSize: Int) -> Float
{
    let sum: Float = 0.0
    let mySize: Int = bufferSize / sizeof(CShort)
    var buffer_short: Int = buffer
    for var i = 0; i < mySize; i++ {
        sum += buffer_short[i] * 2
    }
    let sqrt1: Float = sqrtf(sum / Float(mySize))
    return (sqrt1) / Float(mySize)
}
Above function in a for loop it show me error like this:
Type Int has no subscript members
Can anyone tell me how I can fix it?
buffer_short is an integer variable, not an array of integers.
You need to change the first argument of your function to an array of integers, like:
func rms16(buffer: [Int], bufferSize: Int) -> Float
{
    var sum: Float = 0.0   // must be var, since it is modified in the loop
    let mySize: Int = bufferSize / sizeof(CShort)
    let buffer_short = buffer
    for var i = 0; i < mySize; i++
    {
        sum += Float(buffer_short[i] * 2)
    }
    let sqrt1: Float = sqrtf(sum / Float(mySize))
    return (sqrt1) / Float(mySize)
}
Your comments indicate that the first parameter should be a pointer to a character buffer. The corresponding Swift type is UnsafePointer<CChar>. Note also that you are multiplying by 2
instead of squaring in your Swift code, and that sum must be
declared as a variable.
So your Swift function would be
func rms16(buffer: UnsafePointer<CChar>, bufferSize: Int) -> Float
{
    var sum: Float = 0.0
    let mySize = bufferSize / sizeof(CShort)
    // Create a pointer to a buffer of `CShort` elements from the given pointer:
    let buffer_short = UnsafeBufferPointer<CShort>(start: UnsafePointer(buffer), count: mySize)
    for var i = 0; i < mySize; i++ {
        sum += pow(Float(buffer_short[i]), 2)
    }
    let sqrt1 = sqrtf(sum / Float(mySize))
    return (sqrt1) / Float(mySize)
}
The summation loop can be reduced (!) to:
let sum = buffer_short.reduce(0.0, combine: { $0 + pow(Float($1), 2)} )
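To see that the reduce expression computes the same sum of squares, you can try it on a plain array of CShort values (here written with current trailing-closure syntax):

```swift
import Foundation

let samples: [CShort] = [3, -4]
let sum = samples.reduce(Float(0.0)) { $0 + pow(Float($1), 2) }
print(sum) // 25.0, i.e. 9 + 16
```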
I can't multiply currentValue variable value.
Code:
@IBAction func PlusMinus()
{
    let v = 0
    command = nil
    let currentValue = v
    let v = v*(-1)
    displayLabel!.text = m
}
What is wrong?
You can see screenshot :
http://cl.ly/image/3c2e0V0m021H
You are redefining a constant with the same name 'v'. Also, you are using several instance variables that are not shown; please copy all relevant code into your question.
@IBAction func PlusMinus()
{
    let v = 0
    command = nil
    let currentValue = v
    let v = v*(-1) // you've already defined a constant named 'v'
    displayLabel!.text = m
}
let declares a constant. Reassigning it is like writing 2 = 2 * (-1), which makes no sense. Use var for values that change.
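A minimal fix uses a single var instead of redeclaring constants:

```swift
var v = 5
v = v * (-1)  // fine: v is a variable
print(v)      // -5
```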