Swift ScrollView - contentSize can't take a variable Int - ios

Trying to set a scrollView's contentSize and I've run across this issue (Xcode 6.4)...
Both of these work perfectly:
scrollView.contentSize = CGSize(width:self.view.frame.width, height:1000)
scrollView.contentSize = CGSizeMake(self.view.frame.width, 1000)
Once a let (or var) gets involved, these do not work:
let testing = 1000
scrollView.contentSize = CGSize(width:self.view.frame.width, height:testing)
Error: Cannot find an initializer for type 'CGSize' that accepts an argument list of type '(width: CGFloat, height: Int)'
let testing = 1000
scrollView.contentSize = CGSizeMake(self.view.frame.width, testing)
Error: Cannot invoke 'CGSizeMake' with an argument list of type '(CGFloat, Int)'

Change the let statement to the following:
let testing:CGFloat = 1000
You need to do this because CGSizeMake (and the CGSize initializer) takes CGFloat parameters, and Swift will not implicitly convert an Int variable to CGFloat. In this case it is probably easiest to declare testing as a CGFloat in the first place. Alternatively, you can convert it at the call site:
let testing = 1000
scrollView.contentSize = CGSizeMake(self.view.frame.width, CGFloat(testing))
So that both arguments end up as CGFloats.

A numeric literal doesn't have a fixed type; its type is inferred from the context in which the compiler evaluates it.
A variable, on the other hand, gets a concrete type when it is declared, and for an integer literal with no annotation the inferred type is Int.
let testing : CGFloat = 1000.0
scrollView.contentSize = CGSize(width:self.view.frame.width, height:testing)
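To make the difference concrete, here is a minimal sketch (the hard-coded 320 stands in for self.view.frame.width):
import CoreGraphics
let literalWorks = CGSize(width: 320, height: 1000)           // 1000 is a literal, inferred as CGFloat here
let testing = 1000                                             // no annotation, so testing is an Int
// CGSize(width: 320, height: testing)                         // error: expects a CGFloat, not an Int
let converted = CGSize(width: 320, height: CGFloat(testing))   // explicit conversion compiles
let annotated: CGFloat = 1000                                  // or annotate the variable up front
let alsoFine = CGSize(width: 320, height: annotated)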

Why does the compiler claim CGRect has no width member?

Note that I'm not trying to set the value in a CGRect. I'm mystified as to why the compiler is issuing this claim:
let widthFactor = 0.8
let oldWidth = wholeFrameView.frame.width
let newWidth = wholeFrameView.frame.width * widthFactor // Value of type '(CGRect) -> CGRect' has no member 'width'
let newWidth2 = wholeFrameView.frame.width * 0.8 // This is fine.
width is a CGFloat, while your multiplier is a Double. Explicitly declare the type of your multiplier:
let widthFactor: CGFloat = 0.8
All the dimensions of a CGRect are of type CGFloat, not Double, and because Swift is especially strict about types, you can't multiply a CGFloat by a Double.
The interesting thing, though, is that both CGFloat and Double conform to ExpressibleByFloatLiteral. This means that 0.8, a "float literal", can be interpreted as either a Double or a CGFloat. Without any other context, it's always a Double, because of how the compiler is designed. Note that this only applies to float literals like 3.14, 3e8, etc., and not to variables.
So the expression wholeFrameView.frame.width * 0.8 is valid because the compiler sees that width is a CGFloat, so it treats 0.8 as a CGFloat as well. No problems.
On the other hand, when you declare the variable widthFactor, it is automatically given the type Double, because there isn't any other context on that line to suggest to the compiler that you want it to be any other type.
This can be fixed by directly telling the compiler that you want widthFactor to be a CGFloat:
let widthFactor: CGFloat = 0.8
Because, as others have noted, you can't multiply a Double and a CGFloat, the compiler doesn't know what you're intending.
So, instead of giving you an error about the frame property, which is what you think it's doing, it's actually making its best guess* and giving you an error about a frame method. No such method has a width member, so what it tells you is true.
*Of course, its best guess is not good enough, hence a human being coming here to ask a question about it. So please file a bug!
Stepping onto my soapbox: This confusion would be avoided if Apple hadn't named the method the thing it returns. The convention to prefix all such methods with get solves the problem. Some convention is important in any language with first-class functions, to disambiguate between properties and methods.
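Here is a short sketch of the distinction described above (a plain CGFloat width stands in for wholeFrameView.frame.width):
import CoreGraphics
let width: CGFloat = 300
let fine = width * 0.8           // the literal 0.8 is inferred as CGFloat from context
let factor = 0.8                 // no context here, so factor defaults to Double
// let broken = width * factor   // error: CGFloat and Double can't be multiplied directly
let cgFactor: CGFloat = 0.8      // annotate the variable...
let alsoFine = width * cgFactor  // ...and the multiplication compiles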
You need widthFactor to end up as a CGFloat; you can also go through frame.size.width explicitly. Try:
let newWidth = wholeFrameView.frame.size.width * CGFloat(widthFactor)

Avoid redundant dereferencing in iOS

Suppose I want to change the size of a view uiView: UIView to w, h. I can do it like this:
uiView.frame.size.width = w
uiView.frame.size.height = h
In other systems I can avoid repeating the dereferencing (which wastes both code size and performance) by keeping a reference in a variable (using Swift syntax):
let ref = uiView.frame.size
ref.width = w
ref.height = h
This however doesn't work in iOS, where CGSize is a structure and therefore is copied when assigned to another value.
Is there a way to avoid the redundant dereferencing (something like the with(uiView.frame.size){...} construct available in some languages)?
I don't think there is a way to do exactly that, because the frame is a value-copied structure. You could set the frame directly as Reiner Melian suggests, but to me that seems even longer and dereferences at least as many times as your approach.
There is a way to make this simpler using an extension, but behind the scenes it will still be doing the same dereferencing:
extension UIView {
    var width: CGFloat {
        get {
            return self.frame.size.width
        }
        set {
            self.frame.size.width = newValue
        }
    }
    var height: CGFloat {
        get {
            return self.frame.size.height
        }
        set {
            self.frame.size.height = newValue
        }
    }
}
And then you could use:
uiView.width = w
uiView.height = h
on any UIView instance.
This is even simpler:
uiView.frame.size = CGSize(width: w, height: h)
As I understand it, the right-hand side is a temporary value that is released as soon as its contents have been copied into the frame structure.
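If you really want the with(uiView.frame.size){...} shape from the question, a small generic helper with an inout closure parameter gets close. This is only a sketch; with is not an existing UIKit or standard-library function:
import UIKit
// Hypothetical helper: copy the value, let the closure mutate the copy, return it.
func with<T>(_ value: T, _ mutate: (inout T) -> Void) -> T {
    var copy = value
    mutate(&copy)
    return copy
}
let uiView = UIView(frame: CGRect(x: 0, y: 0, width: 10, height: 10))
let w: CGFloat = 320, h: CGFloat = 200
// CGSize is still copied (it's a value type), but the uiView.frame.size chain
// is written out only twice instead of four times.
uiView.frame.size = with(uiView.frame.size) { size in
    size.width = w
    size.height = h
}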

How to set a frame dynamically in Swift?

I am trying to add an image view as a subview of a scroll view. For that I am setting a different frame depending on the screen size.
This is my code:
let screensize: CGRect = UIScreen.mainScreen().bounds
let screenwidth = screensize.width
var frames: CGRect
for var index = 0; index < arrBirds.count; index++
{
    if(screenwidth == 320)
    {
        frames.origin.x = CGFloat(index) * (self.view.frame.size.width);
        frames.origin.y = 0;
        frames.size = scrollview.frame.size;
    }
    else if(screenwidth == 375)
    {
        frames.origin.x = CGFloat(index) * (self.view.frame.size.width);
        frames.origin.y = 0;
        frames.size = scrollview.frame.size;
    }
    else
    {
        frames.origin.x = CGFloat(index) * (self.view.frame.size.width);
        frames.origin.y = 0;
        frames.size = scrollview.frame.size;
    }
    imageView = UIImageView(frame: frames)
    imageView.image = UIImage(named: arrBirds[index] as! String)
    scrollview.addSubview(imageView)
}
But I am getting an error on this line:
frames.origin.x = CGFloat(index) * (self.view.frame.size.width) // error: struct 'frames' must be completely initialized before a member is stored to
You have not created an instance of the struct CGRect yet. Before reading or setting any of its members, you need to create an instance first and only after that set its individual members.
Change the line var frames: CGRect to:
var frames = CGRectZero
According to the Apple docs:
Classes and structures must set all of their stored properties to an appropriate initial value by the time an instance of that class or structure is created.
Your frames object isn't initialized; instead you should write:
var frames = CGRectZero
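Putting it together, here is a sketch of the corrected loop using the names from the question (CGRect.zero is the modern spelling of CGRectZero, and the image view is declared locally for simplicity):
var frames = CGRect.zero   // the struct is now fully initialized before its members are touched
for index in 0..<arrBirds.count {
    frames.origin.x = CGFloat(index) * self.view.frame.size.width
    frames.origin.y = 0
    frames.size = scrollview.frame.size
    let imageView = UIImageView(frame: frames)
    imageView.image = UIImage(named: arrBirds[index] as! String)
    scrollview.addSubview(imageView)
}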

Using variables in CGRectMake, Swift, UIkit

For an app I'm making I need to use variables to change the size and position of objects (labels). I've tried var example = CGRectMake(0, 0, 0, 100), hoping it would ignore the zeros (not really thinking it would, though). I then tried:
var example = 100
Label1.frame = CGRectMake(20, 20, 50, example)
I changed the syntax a bit, adding "" and replacing CGRectMake with CGRect, etc., but nothing worked... I don't get what I'm doing wrong here... Help!
Below is the new syntax used since Swift 3.
CGRect(x: 0, y: 0, width: 100, height: 100)
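Applied to the snippet from the question (assuming the same Label1 and example names), that looks like:
let example: CGFloat = 100
Label1.frame = CGRect(x: 20, y: 20, width: 50, height: example)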
CGRectMake takes CGFloats for all of its arguments. Your sample code should work fine if you specify that example is supposed to be a CGFloat, using a type annotation:
// v~~~~ add this...
var example: CGFloat = 100
Label1.frame = CGRectMake(20, 20, 50, example)
Otherwise, Swift infers the type of example to be Int, and the call to CGRectMake fails because it can't take an Int as a parameter...
So, there are many ways to skin the cat. It all depends on what your needs and requirements are (maybe you could elaborate a bit on what you are trying to achieve?). But one way to do it would be to set a variable when something happens and then update the frame of the label. For example, you could add a tap gesture recognizer to your view and update your label like so:
let myLabel = UILabel()

override func viewDidLoad() {
    super.viewDidLoad()
    let tapGestRecog = UITapGestureRecognizer(target: self, action: "handleTap:")
    self.view.addGestureRecognizer(tapGestRecog)
}

func handleTap(sender: UIGestureRecognizer) {
    let newXposition = sender.locationInView(self.view).x
    let newYposition = sender.locationInView(self.view).y
    myLabel.frame = CGRectMake(newXposition, newYposition, 200, 200)
}
This is just an example, and a very crude way of doing it. There are many other ways of doing it, but it hopefully gives you an idea of how to achieve it.
Swift allows syntax that Objective-C does not: you can assign directly into a nested struct property.
var example: CGFloat = 100
label.frame.size.height = example
In Objective-C you would have to do it differently:
CGRect frame = label.frame; //Create a temporary rect to hold the frame value
frame.size.height = example;
label.frame = frame;

Set font size equal to an integer?

In Swift, I have a line that says var timerFontSize = 85. I want to have the line timerLabel.font = UIFont(name: "HelveticaNeue-Ultralight", size: timerFontSize), although that doesn't work. It only lets me type in a number, not pass it a variable. The reason I don't want to just type in a number is that I have a timer, where every second the font variable drops by 1.
How can I set the font size equal to an integer?
Simply convert the number to a CGFloat:
UIFont(name: "HelveticaNeue-Ultralight", size: CGFloat(timerFontSize))
Using this line:
var timerFontSize = 85
You're implicitly setting your variable to type Int, while the UIFont initializer takes a CGFloat for its size argument.
You could cast this variable as a CGFloat when you call the method, or you could explicitly create the variable as a CGFloat type:
var timerFontSize: CGFloat = 85
It's worth noting that
var timerFontSize = 85.0
creates timerFontSize as a Double, and
var timerFontSize: Float = 85.0
creates timerFontSize as a Float (Swift has no f literal suffix).
CGFloat is backed by Float or Double depending on whether the device has a 32- or 64-bit processor, but it is still its own type, and Swift won't implicitly convert a Double or Float variable to it. So you don't want either of these options; declare the variable as CGFloat (or convert it at the call site) and the code will compile on both architectures.
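For the timer use case mentioned in the question, a minimal sketch of the callback that would run each second (timerLabel and the timer wiring are assumed to exist elsewhere):
var timerFontSize: CGFloat = 85   // CGFloat, so it can be passed straight to UIFont
func shrinkFont() {
    timerFontSize -= 1            // drop the size by one point each tick
    timerLabel.font = UIFont(name: "HelveticaNeue-Ultralight", size: timerFontSize)
}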

Resources