I have a UInt8? variable named seconds that I need to pass to a function accepting Int?.
This leads to the error Cannot convert value of type 'UInt8?' to expected argument type 'Int?'.
This means I have to cast, and so I tried the obvious:
Int?(seconds)
But this results in: UInt8? is not convertible to Int.
Of course I could do:
(seconds == nil) ? nil : Int(seconds!)
But WTF, does it really have to be so contrived?
Your type is Optional<UInt8>. A UInt8 can always be converted to an Int with the function Int.init. But since it's wrapped in an Optional, you'll have to map that function over the optional, yielding a new value of type Optional<Int>:
seconds.map(Int.init)
Optional.map(_:) and its companion flatMap often make working with optionals a lot easier.
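For instance, here is a minimal sketch of both; the schedule(after:) function is just a hypothetical stand-in for the Int?-taking function in the question:

func schedule(after delay: Int?) { /* hypothetical function that accepts an Int? */ }

let seconds: UInt8? = 30

// map transforms the wrapped value and rewraps the result, giving Optional<Int>.
let delay = seconds.map(Int.init)    // Optional(30); a nil input stays nil
schedule(after: delay)

// flatMap is the right choice when the transform itself returns an optional,
// because it avoids a nested Optional<Optional<Int>>.
let text: String? = "42"
let parsed = text.flatMap { Int($0) }    // Int("42") returns Int?, so parsed is Int?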
I am facing this problem when trying to build code in ANSI C; I was practicing writing in it and dealing with its rules, and the compiler reports an invalid type conversion that I don't know how to resolve.
This is the line that causes the error; it is a call through a pointer to a function:
((CanIf_FuncTypeCanSpecial)(entry->CanIfUserRxIndication))(
    entry->CanIfCanRxPduHrhRef->CanIfCanControllerHrhIdRef,
    entry->CanIfCanRxPduId,
    CanSduPtr,
    CanDlc,
    CanId);
This is how entry->CanIfUserRxIndication is declared: void *CanIfUserRxIndication;
And this is how CanIf_FuncTypeCanSpecial is declared:
typedef void (*CanIf_FuncTypeCanSpecial)
(uint8 channel, PduIdType pduId, const uint8 *sduPtr, uint8 dlc, Can_IdType canId);
Every parameter of the cast's function type matches the type of the corresponding argument except the first one, entry->CanIfCanRxPduHrhRef->CanIfCanControllerHrhIdRef, which is an enum rather than a uint8.
You can find the code on GitHub.
The MISRA rule check also tells me this:
#1398-D (MISRA-C:2004 11.1/R) Conversions shall not be performed between a pointer to a function and any type other than an integral type
I tried converting the enum to uint8 so that all of the arguments match what CanIf_FuncTypeCanSpecial takes, but nothing changed.
If I understand correctly, you are trying to cast a function pointer to a function pointer type whose parameter types differ from those of the function it actually points to. You could cast each argument yourself and call the function directly, but function pointers may be stored and used anywhere in the program; at those call sites the code has no way of knowing what conversions to apply (and the mismatched types may even differ in size), which is why such a conversion is illegal.
When trying to build the sample project of BonMot, which contains the following code:
let theCFMutableString = NSMutableString(string: myString) as CFMutableString
CFStringTransform(theCFMutableString, UnsafeMutablePointer<CFRange>(nil), kCFStringTransformToUnicodeName, false)
I get this error on the CFStringTransform line:
Ambiguous use of 'init'
The Xcode 8 project uses Swift 3
In Swift 2, pointer types conformed to NilLiteralConvertible, allowing a non-optional pointer type to represent a null pointer. Therefore when you did
UnsafeMutablePointer<CFRange>(nil)
the compiler was actually using the init(_ other: COpaquePointer) initialiser of UnsafeMutablePointer, as COpaquePointer is NilLiteralConvertible and can therefore represent a null pointer.
However in Swift 3 (SE-0055), pointer types no longer conform to ExpressibleByNilLiteral. Rather than allowing a non-optional pointer type to represent a null pointer, this is now simply done with optionals, where nil means a null pointer.
Therefore you can just pass nil directly into the range parameter of CFStringTransform, as it expects an UnsafeMutablePointer<CFRange>!:
CFStringTransform(theCFMutableString, nil, kCFStringTransformToUnicodeName, false)
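Put together, a minimal Swift 3 sketch of the fix; the input string here is just an assumed example:

import Foundation

let myString = "café"    // assumed sample input
let theCFMutableString = NSMutableString(string: myString) as CFMutableString

// Passing nil for the range transforms the entire string in place.
CFStringTransform(theCFMutableString, nil, kCFStringTransformToUnicodeName, false)

// The accented character should be replaced by its Unicode name,
// e.g. "caf\N{LATIN SMALL LETTER E WITH ACUTE}".
print(theCFMutableString)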
In the Xcode 8 release version, I found some strange behavior.
Here is the code:
let implicitlyUnwrappedOptionalString: String! = "implicitlyUnwrappedOptionalString"
let foo = implicitlyUnwrappedOptionalString
print(implicitlyUnwrappedOptionalString)
print(foo)
and here is the result:
implicitlyUnwrappedOptionalString
Optional("implicitlyUnwrappedOptionalString")
This shows that when I assign an implicitly unwrapped optional to a variable without an explicit type, the type is inferred as an ordinary optional, not the type it originally had, i.e. an implicitly unwrapped optional.
My Xcode has been updated to 8. Can anyone verify the behavior in Xcode 7.x?
Is the change due to the Swift version or to Xcode?
This is a consequence of SE-0054 Abolish ImplicitlyUnwrappedOptional type which has been implemented in Swift 3. Extract from that proposal (emphasis added):
However, the appearance of ! at the end of a property or variable declaration's type no longer indicates that the declaration has IUO type; rather, it indicates that (1) the declaration has optional type, and (2) the declaration has an attribute indicating that its value may be implicitly forced. ...
If the expression can be explicitly type checked with a strong optional type, it will be. However, the type checker will fall back to forcing the optional if necessary. The effect of this behavior is that the result of any expression that refers to a value declared as T! will either have type T or type T?. For example, in the following code:
let x: Int! = 5
let y = x
let z = x + 0
… x is declared as an IUO, but because the initializer for y type checks correctly as an optional, y will be bound as type Int?. However, the initializer for z does not type check with x declared as an optional (there's no overload of + that takes an optional), so the compiler forces the optional and type checks the initializer as Int.
In your case, the assignment
let foo = implicitlyUnwrappedOptionalString
makes foo a strong optional, as in the example let y = x
from the proposal.
You could make foo an IUO by adding an explicit type annotation
let foo: String! = implicitlyUnwrappedOptionalString
but generally you should try to get rid of IUOs in your code,
as stated in the same proposal:
Except for a few specific scenarios, optionals are always the safer bet, and we’d like to encourage people to use them instead of IUOs.
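A quick way to check this yourself, using the proposal's own example; the printed type names are what the proposal implies, so treat this as a sketch:

let x: Int! = 5

let y = x              // type checks as a strong optional
print(type(of: y))     // Optional<Int>

let z = x + 0          // no `+` overload takes an optional, so x is forced
print(type(of: z))     // Int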
I know the correct way to initialize an NSNumber is NSNumber *a = @1;
and when I declare NSNumber *a = 1;, I get the error
Implicit conversion of 'int' to 'NSNumber *' is disallowed with ARC
But I don't know why, when I declare NSNumber *a = 0;, there is no error.
In my case, I have written some functions in an NSNumber category,
and then:
If the value of the NSNumber is @0, I can use the category function normally.
If the value of the NSNumber is 0, I can still call the category function and no error occurs, but when the app runs, the function is never actually called.
The value 0 is synonymous with nil or NULL, which are valid values for a pointer.
It's a bit of compatibility with C that leads to this inconsistent behavior.
History
In the C language, there is no special keyword to represent a pointer that points to nothing (a null pointer). Instead, the value 0 (zero) was chosen to represent such a pointer. To make code more understandable, a preprocessor macro was introduced to represent this value: NULL. Because it is a macro, the C compiler itself never sees the symbol; it only sees a 0 (zero).
This means that 0 (zero) is a special value when assigned to pointers. Even though it is an integer, the compiler accepts the assignment without complaining of a type conversion, implicit or otherwise.
To keep compatibility with C, Objective-C allows assigning a literal 0 to any pointer. It is treated by the compiler as identical to assigning nil.
0 is a null pointer constant. A null pointer constant can be assigned to any pointer variable and sets it to NULL or nil. This was the case in C for the last 45 years at least and is also the case in Objective-C. Same as NSNumber* a = nil.
You can think of 0 as nil or NULL, which can be assigned to an object pointer, whereas 1 is an integer and cannot be assigned to an object pointer.
Objective-C silently ignores method calls on object pointers with value 0 (i.e. nil). That's why nothing happens when you call one of your NSNumber category methods on a pointer to which you assigned the value 0.
A nil value is the safest way to initialize an object pointer if you don’t have another value to use, because it’s perfectly acceptable in Objective-C to send a message to nil. If you do send a message to nil, obviously nothing happens.
Note: If you expect a return value from a message sent to nil, the return value will be nil for object return types, 0 for numeric types, and NO for BOOL types. Returned structures have all members initialized to zero.
From the latest Apple documentation, Working with nil.
I noticed some interesting behavior when trying to access values of optional properties in optional classes vs. trying to access the values of optional values in optional dictionaries.
It seems that in the former case you only need to unwrap once to access the value. However, in the latter case you have to unwrap twice to access the value. I was wondering why this is, and was hoping someone could provide me with some insight!
Below is an example accessing the value of an optional property in an optional class
class Cat {
    var food: Food?
}
class Food {
    var amount: Int?
}
var meowzer = Cat()
meowzer.food = Food()
meowzer.food?.amount = 10
var catFoodAmt = meowzer.food?.amount
print("\(catFoodAmt)")
if let catFoodCount = meowzer.food?.amount {
    print("\(catFoodCount)")
}
The result of the first print statement is:
Optional(10)
The result of the second print statement (after unwrapping) is:
10
Below is an example of accessing the value of an optional value in an optional dictionary
var dog: [String: Int?]?
dog = ["treat_count": 10]
var dogTreatAmt = dog?["treat_count"]
print("\(dogTreatAmt)")
if let dogTreatCount = dog?["treat_count"], let dogTreatCountFinal = dogTreatCount {
    print("\(dogTreatCount)")
    print("\(dogTreatCountFinal)")
}
The result of the first print statement is:
Optional(Optional(10))
The result of the second print statement (after unwrapping once) is:
Optional(10)
The result of the third print statement (after unwrapping twice) is:
10
Why do I need to unwrap twice to access the desired value in the second case but not the first?
My guess is it has to do with the fact that if I had used a key other than "treat_count" (like "count", for example), then the value for that key would have been nil. However, I haven't been able to find an iOS "rule" or a better explanation of why this is. Any help would be much appreciated.
The difference is that Cat.food?.amount returns Int? while Dictionary<String, Int?>.subscript(String) returns Int??.
This is because Dictionary.subscript<Key> returns Element?, and Element here is Int?. Thus you're getting one extra level of Optional.
Optional chaining just removes an extra Optional wrapping at the point that it's used. It doesn't collapse all Optionals. In one case you have two, collapsed to 1 (for one ?) and in the second you have three, collapsed to 2 (for one ?).
As vadian suggests, this would be a crazy Dictionary, so it's unlikely to come up in good Swift, but it can be reasoned about. If it weren't like this, you would lose information.
It actually makes sense: simply count the number of ?s :).
In your first example, the property is an optional, whereas in your second example you have an optional dictionary whose keys are strings and whose values are optional integers.
So if you want to access the value in the second example, you have to unwrap the dictionary, then unwrap whatever is stored in that dictionary for the given key.
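A small sketch of what those unwraps look like step by step, reusing the question's dog dictionary:

var dog: [String: Int?]? = ["treat_count": 10]

// First unwrap the optional dictionary, then the optional subscript result,
// then the optional value stored for the key.
if let treats = dog,
   let storedValue = treats["treat_count"],
   let treatCount = storedValue {
    print(treatCount)   // 10
}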
This is a good question. Let's look at the type definition of a Dictionary:
struct Dictionary<Key : Hashable, Value>
And the signature of the subscript function:
public subscript (key: Key) -> Value?
Which can also be thought of as:
public subscript (key: Key) -> Optional<Value>
Armed with this, let's look at your dictionary, which is typed as:
Dictionary<String, Int?>
Or more explicitly
Dictionary<String, Optional<Int>>
Where Value is of type Optional<Int>, and Key is of type String
So if we substitute in the key, the subscript definition is:
subscript (key: String) -> Optional<Value>
And if we substitute in the Value, we get what you are seeing:
subscript (key: String) -> Optional<Optional<Int>>
Now, let's break down your code to make everything fit together:
var dogTreatAmt = dog?["treat_count"]
Since dog is optional, no matter what you call on it, the result will also be wrapped in an Optional, which we will temporarily think of as Optional<FunctionReturn>.
Now let's take a look at FunctionReturn, which is the subscript function. We have already determined that this will return Optional<Optional<Int>>.
Which means that you're really getting back Optional<Optional<Optional<Int>>>; however, as noted by Rob Napier,
"Optional chaining removes an extra Optional wrapping at the point that it's used".