Why does NULL need typecasting with blocks? - ios

Consider this scenario:
@property (nonatomic, copy) UIImage * (^bgImageBlock)(void);
Definition of the bgImageBlock block variable:
objDrawing.bgImageBlock = ^(){
    return (UIImage *)NULL;
};
bgImageBlock has the return type UIImage *, so if I return NULL like this:
objDrawing.bgImageBlock = ^(){
    return NULL;
};
it gives a compile-time error: Incompatible block pointer type assigning to UIImage.
Whereas if I take a plain UIImage variable and assign it NULL, it is absolutely fine. So why, in the case of blocks, can't it accept NULL without a typecast?

The compiler infers the block return type from the kind of object you return, if you don't explicitly tell it. So, in this case:
objDrawing.bgImageBlock = ^(){
    return NULL;
};
...I assume it's seeing the NULL and inferring that the return type of your block is void *, the type of NULL. This doesn't match your property type, so the compiler complains. (Note that the error is about the "block pointer type"; it's a mismatch between the declaration of your block property and the block you're trying to store in it.)
If you explicitly tell it your type, you should be able to use your simple NULL return:
objDrawing.bgImageBlock = ^UIImage*(){
    return NULL;
};
That the block's return type is inferred from its return statements is documented in the Clang documentation:
The return type is optional and is inferred from the return statements. If the return statements return a value, they all must return a value of the same type. If there is no value returned the inferred type of the Block is void; otherwise it is the type of the return statement value.
This inference of the return type, and the fact that the compiler checks the pointer types for you and produces the error you're getting if there's a mismatch, is mentioned in the 2013 WWDC session "Advances in Objective-C" (see the part titled "return type inference").
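For illustration, a small sketch of my own (not from the question): the inferred return type comes straight from what the block body returns, so a mismatched declared block type produces the same kind of error.
// Inferred block type: int (^)(void), because the body returns an int literal.
int (^answer)(void) = ^{ return 42; };

// Error: incompatible block pointer types; the block is still inferred to
// return int, not NSString *, regardless of the variable's declared type.
// NSString * (^wrong)(void) = ^{ return 42; };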

Your block is expected to return the type UIImage *. However, NULL is a bare pointer; the compiler does not recognize it as pointing to anything of type UIImage * (even if that "anything" is nothing). Bam, error.
With the cast, however, you are casting a pointer. The cast does not actually do anything at runtime; it just changes the pointed-to type, which affects how the compiler treats the resulting value. The actual pointer value remains the same through the cast.
While the actual pointer remains the same (although the value being pointed to is nothing), it now has the type the block is expected to return. All is well. :)

NULL is of type void *, so the block is inferred to have return type void *, and non-Objective-C-object pointer types (like void *) are not compatible with Objective-C-object pointer types (like UIImage *). Instead, you should use nil, which is also a null pointer, but of type id, which is compatible with all Objective-C-object pointer types.
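A minimal sketch of that nil-based suggestion, keeping the explicit return type from the first answer so the result doesn't hinge on how the SDK happens to define nil:
objDrawing.bgImageBlock = ^UIImage * (void) {
    return nil; // nil is the null object pointer, assignable to UIImage *
};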

Related

Printing a variable out of scope does not yield an error

I had this weird problem in Dart. Consider the following code:
class Number {
  int num = 10;
}
Here, I created a little class with an int field named num.
When I try to print it using the main() function outside the class, like:
main() {
  print(num);
}
I get the output:
num
Which is weird, since I expected an error. If I were to print an undefined variable, as in print(foo);, I would get an error, which is expected.
What I find even more interesting is the runtimeType of a variable whose value is num.
var temp = num;
print(temp.runtimeType);
The above code prints _Type, when I expected it to be int.
Can somebody please clarify this?
The name num is a type declared in dart:core. It's the supertype of int and double.
When you do print(num); outside the scope where your int num; variable is declared, the num refers to that type from dart:core, which is always imported and therefore in scope.
Dart type names can be used as expressions, they evaluate to a Type object.
So, you are printing a Type object for the type num, which prints "num", and the run-time type of that object, which is again a Type object, which prints _Type because that's the actual internal type of the Type object instance.
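A short sketch putting the pieces together (same Number class as in the question):
class Number {
  int num = 10;
}

main() {
  print(num);             // prints "num": the dart:core type, not the field
  print(num.runtimeType); // prints "_Type": the runtime type of a Type object
  print(Number().num);    // prints 10: the field, reached through an instance
}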

How does F# know that BitArray elements are bool while enumerating in a seq builder?

seq {
    for bit in BitArray(10) do
        yield bit
}
bit is of type bool. I checked with ILSpy, and there's an explicit cast added in one of the generated closures.
BitArray implements only plain (not generic) IEnumerable. How does F# know that it's a bool?
According to the F# 4.1 specification's Section 6.5.6 Sequence Iteration Expressions, F# does casting even for a non-generic IEnumerable if the IEnumerable has an Item property with a non-object type (highlighting mine):
An expression of the following form is a sequence iteration
expression:
for pat in expr1 do expr2 done
The type of pat is the same as the return type of the Current property on the enumerator value. However,
if the Current property has return type obj and the collection type ty
has an Item property with a more specific (non-object) return type
ty2, type ty2 is used instead, and a dynamic cast is inserted to
convert v.Current to ty2.
If we look at the source code for BitArray, we see that it does indeed have an Item property with type bool:
public bool this[int index] {
    get {
        return Get(index);
    }
    set {
        Set(index, value);
    }
}
Thus, F# will explicitly cast to bool while iterating.
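For comparison, a hedged sketch of the same iteration next to the explicit cast you would write yourself for a plain non-generic IEnumerable (Seq.cast):
open System.Collections

let bits = BitArray(10)

// bit is inferred as bool: BitArray's Item property has type bool,
// so the compiler inserts the dynamic cast for us.
let fromSeqBuilder = seq { for bit in bits do yield bit }

// Roughly equivalent explicit version: cast the non-generic sequence by hand.
let explicitCast = bits |> Seq.cast<bool>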

Default values for arguments when the argument is null?

Is there any way to get this to return "default" without writing out special functions to check the argument and set it?
void main() {
  Thing stuff = Thing(text: null);
  print(stuff.text);
}

class Thing {
  String text;
  Thing({this.text: "default"});
}
I have a map coming in from Firebase, and sometimes values will be null; I'd like my class to use its default values when it is passed null.
Thing({text}) : this.text = text ?? 'default';
You will need to add this small snippet because default values in constructors only work if there is no value specified.
The ?? null-aware operator will only use the 'default' value if the value that is being passed is actually null (which will also be the case if no value is specified).
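Putting it together, a sketch in pre-null-safety Dart matching the question's style:
class Thing {
  String text;
  Thing({text}) : this.text = text ?? 'default';
}

main() {
  print(Thing(text: null).text); // default
  print(Thing().text);           // default
  print(Thing(text: 'hi').text); // hi
}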

Why does `if (var = null)` compile in Dart?

I recently came across this question: How do I solve the 'Failed assertion: boolean expression must not be null' exception in Flutter,
where the problem comes from code that should be invalid but gets treated as valid.
The code can be summarized as:
int stuff;
if (stuff = null) { // = instead of ==
}
But why does this code compile, when the following does not?
int stuff;
if (stuff = 42) {
}
With the following compile error:
Conditions must have a static type of 'bool'.
So, for consistency, I'd expect if (stuff = null) to give the same error.
null is a valid value for a bool variable in Dart, at least until Dart supports non-nullable types.
bool foo = null;
or just
bool foo;
is valid.
Therefore, in the first case there is nothing wrong from a static analysis point of view.
In the 2nd case the expression has type int because of the assigned value, and an int is known not to be a valid boolean value.
bool foo = 42;
is invalid.
When you say var stuff; with no initial value, stuff gets a static type of dynamic. Since dynamic might be a bool, it's legal to assign null to a variable of type dynamic, and it's legal to use a possibly-null bool in a conditional, so the compiler doesn't flag this. When you say int stuff;, the compiler knows that stuff could not be a bool. The reported error in that case is caused by the static type of stuff, not the assignment of null.
Edit: Got the real answer from someone who knows how to read the spec.
The static type of an assignment expression is the static type of the right-hand side of the assignment. So the expression stuff = null has static type Null, which is assignable to bool.
The reasoning is that the value of an assignment is its right-hand side, so it makes sense to also use its type. This allows expressions like:
int foo;
num bar;
foo = bar = 1;
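A hedged illustration of that rule, in pre-null-safety Dart as in the question: the condition only type-checks when the static type of the right-hand side is assignable to bool.
main() {
  int stuff;

  if (stuff = null) {} // compiles: `stuff = null` has static type Null,
                       // which is assignable to bool (it still fails at runtime)

  // if (stuff = 42) {} // rejected: `stuff = 42` has static type int,
  //                    // "Conditions must have a static type of 'bool'"
}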
Commonly, an assignment operation returns the value that it assigns.
int a = 0;
print(a = 3); // prints 3
So, when you write stuff = null, the expression 'stuff = null' returns null. The if statement needs a boolean, and null is assignable to bool (pre null safety), so
if (null) {}
is valid.
When you write stuff = 42, the expression 'stuff = 42' returns 42. The if statement needs a boolean, and 42 is not a bool, so
if (42) {}
is not valid.

Swift optional chaining doesn't work in closure

My code looks like this. My class has an optional var
var currentBottle : BottleLayer?
BottleLayer has a method jiggle().
This code, using optional chaining, compiles fine in my class:
self.currentBottle?.jiggle()
Now I want to construct a closure that uses that same code:
let clos = {() -> () in self.currentBottle?.jiggle()}
But I get a compile error:
Could not find member 'jiggle'
As a workaround I can use forced unwrapping:
let clos = {() -> () in self.currentBottle!.jiggle()}
or of course I can use full-fledged optional binding, but I'd rather not. I do recognize that optional chaining is just syntactical sugar, but it is hard to see why this syntactical sugar would stop working just because it's in a handler (though there may, of course, be a reason - but it's a surprise in any case).
Perhaps someone else has banged into this and has thoughts about it? Thanks.
This is NOT a bug. It's simply that your closure type is wrong.
The correct type should return an optional Void to reflect the optional chaining:
let clos = { ()->()? in currentBottle?.jiggle() }
The problem in detail:
You declare your closure as a closure that returns Void (namely ->()).
But do remember that, as every time you use optional chaining, the return type of the whole expression is optional, because your closure can either return Void if currentBottle does exist… or nil if it doesn't!
So the correct syntax is to make your closure return a Void? (or ()?) instead of a simple Void
class BottleLayer {
    func jiggle() { println("Jiggle Jiggle") }
}
var currentBottle: BottleLayer?
currentBottle?.jiggle() // OK
let clos = { Void->Void? in currentBottle?.jiggle() } // Also OK
let clos = { () -> ()? in currentBottle?.jiggle() } // Still OK (Void and () are synonyms)
Note: if you had let Swift infer the correct type for you instead of explicitly forcing it, it would have fixed the issue for you:
// Even better: type automatically inferred as ()->()? — also known as Void->Void?
let clos = { currentBottle?.jiggle() }
[EDIT]
Additional trick: directly assign the optional chaining to a variable
You can even assign the function directly to a variable, like so:
let clos2 = currentBottle?.jiggle // no parenthesis, we don't want to call the function, just refer to it
Note that the type of clos2 (not explicitly specified here, and thus inferred automatically by Swift) is in this case not Void->Void? (a function that returns either nil or Void) as in the previous case, but (Void->Void)?, which is the type for "an optional function of type Void->Void".
This means that clos2 itself is "either nil or is a function returning Void". To use it, you could once again use optional chaining, simply like that:
clos2?()
This will evaluate to nil and do nothing if clos2 is itself nil (likely because currentBottle is itself nil)… and execute the closure — thus the currentBottle!.jiggle() code — and return Void if clos2 is non-nil (likely because currentBottle itself is non-nil).
The return type of clos2?() itself is indeed Void?, as it returns either nil or Void.
Drawing the distinction between Void and Void? may seem pointless (after all, the jiggle function does not return anything in either case), but it lets you do powerful stuff like testing the Void? in an if statement to check whether the call actually happened (and returned Void, namely nothing) or didn't happen (and returned nil):
if clos2?() { println("The jiggle function got called after all!") }
[EDIT2] As you (@matt) pointed out yourself, this other alternative has one other major difference: it evaluates currentBottle?.jiggle at the time that expression is assigned to clos2. So if currentBottle is nil at that time, clos2 will be nil… even if currentBottle gets a non-nil value later.
Conversely, clos is assigned the closure itself, and the optional chaining is only evaluated each time clos is called, so it will evaluate to nil if currentBottle is nil… but will evaluate to non-nil and will call jiggle() if we call clos() at a later time, at which point currentBottle may have become non-nil.
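A small sketch of that timing difference, reusing the names from the snippets above:
var currentBottle: BottleLayer?

let clos = { currentBottle?.jiggle() }  // chain is evaluated each time clos is called
let clos2 = currentBottle?.jiggle       // chain is evaluated right now, so clos2 is nil

currentBottle = BottleLayer()

clos()   // prints "Jiggle Jiggle": currentBottle is looked up at call time
clos2?() // does nothing: clos2 captured nil before the assignment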
let closure = { () -> () in
    self.currentBottle?.jiggle()
    return
}
Otherwise the compiler thinks the result of that statement should be returned from the closure and it realizes there is a mismatch between () (return type) and the optional returned by the statement (Optional<Void>). By adding an explicit return, the compiler will know that we don't want to return anything.
The error message is wrong, of course.
Okay, another approach. This is because closures in Swift have Implicit Returns from Single-Expression Closures. Because of the optional chain, your closure has a return type of Void?, so:
let clos = {() -> Void? in self.currentBottle?.jiggle()}
Letting the closure type be inferred seems to work as well.
let clos2 = { currentBottle?.jiggle() }
clos2() // does a jiggle
But I'm pretty sure this is just the compiler assigning a type of () -> ()?
