Can someone explain to me what is wrong with this statement?
var someString = "Welcome"
someString.append("!")
However, this works when I replace the code with:
var someString = "Welcome"
let exclamationMark : Character = "!"
someString.append(exclamationMark)
Thanks in advance
In Swift there is no character literal (such as 'c' in C-derived languages); there are only string literals.
You then have two methods defined on String: append, to append a single Character, and extend, to append a whole String. So this works:
var someString = "Welcome"
someString.extend("!")
If you really want to use append, you can force a one-character string literal to be turned into a Character, either by calling Character's initializer:
someString.append(Character("!"))
or by using a type conversion:
someString.append("!" as Character)
or by using a type annotation as you did with an extra variable:
let exclamationMark: Character = "!"
someString.append(exclamationMark)
String has two overloaded versions of append(_:):
mutating func append(x: UnicodeScalar)
mutating func append(c: Character)
and both Character and UnicodeScalar conform to UnicodeScalarLiteralConvertible:
enum Character : ExtendedGraphemeClusterLiteralConvertible, Equatable, Hashable, Comparable {
    /// Create an instance initialized to `value`.
    init(unicodeScalarLiteral value: Character)
}

struct UnicodeScalar : UnicodeScalarLiteralConvertible {
    /// Create an instance initialized to `value`.
    init(unicodeScalarLiteral value: UnicodeScalar)
}
"!" in this case is a UnicodeScalarLiteral. So, the compiler can not determine "!" is Character or UnicodeScalar, and which append(_:) method should be invoked. That's why you must specify it explicitly.
You can see that "!" can be a UnicodeScalar literal with this code:
struct MyScalar: UnicodeScalarLiteralConvertible {
    init(unicodeScalarLiteral value: UnicodeScalar) {
        println("unicode \(value)")
    }
}
"!" as MyScalar // -> prints "unicode !"
My text field takes 11 characters, and I need to remove the first character and then pass the rest as a parameter. I tried this code:
var dropFirst: String?
if emailPhoneTextField.text?.count == 11 {
    dropFirst = emailPhoneTextField.text?.dropFirst()
    emailPhoneTextField.text = dropFirst
}
I receive this error:
Cannot assign value of type 'String.SubSequence?' (aka 'Optional<Substring>') to type 'String?'
dropFirst() returns a SubSequence, so you can't assign it directly to the text field's text property, which accepts an optional string (String?). So replace
dropFirst = emailPhoneTextField.text?.dropFirst()
With
dropFirst = String(emailPhoneTextField.text!.dropFirst())
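Or, if you'd rather avoid the force unwrap, a sketch of the whole check rewritten with optional binding (assuming the same text field as in the question):

if let text = emailPhoneTextField.text, text.count == 11 {
    emailPhoneTextField.text = String(text.dropFirst())
}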
You can also create a String extension that removes the first character in place, converting the SubSequence back to a String:
extension String {
    mutating func removeFirstChar() {
        self = String(self.dropFirst())
    }
}
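which could then be used like this (a sketch assuming the same text field as above):

var text = emailPhoneTextField.text ?? ""
if text.count == 11 {
    text.removeFirstChar()
    emailPhoneTextField.text = text
}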
In Swift 3, I have written a custom prefix operator § which I use in a function that takes a String and returns a LocalizedString struct (holding key and value).
public prefix func §(key: String) -> LocalizedString {
    return LocalizedString(key: key)
}

public struct LocalizedString {
    public var key: String
    public var value: String

    public init(key: String) {
        let translated = translate(using: key) // assume we have this
        self.key = key
        self.value = translated ?? "!!\(key)!!"
    }
}
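(A hypothetical stand-in for the assumed translate(using:) helper, just so the snippet above is self-contained; the real app looks the key up in strings downloaded from the backend:)

let downloadedStrings: [String: String] = ["MyKey": "My translated value"]

func translate(using key: String) -> String? {
    return downloadedStrings[key]
}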
(Yes I know about the awesome L10n enum in SwiftGen, but we are downloading our strings from our backend, and this question is more about how to work with custom operators)
But what if we want to get the translated value from the result of the § operator (i.e. the value property of the resulting LocalizedString)?
let translation = §"MyKey".value // Compile error "Value of type 'String' has no member 'value'"
We can of course easily fix this compile error by wrapping it in parentheses: (§"MyKey").value. But suppose we do not want to do that. Is it possible to set the precedence of custom operators in relation to the 'dot' (member access)?
Yes I know that only infix operators may declare precedence, but it would make sense to somehow work with precedence in order to achieve what I want:
precedencegroup Localization { higherThan: DotPrecedence } // There is no such group as "Dot"
prefix operator §: Localization
The idea is to make the Swift compiler evaluate §"MyKey" first and understand that the result is not a String but in fact a LocalizedString (struct).
It feels unlikely that this would be impossible. What am I missing?
The . is not an operator like all the other ones defined in the standard library; it is provided by the compiler instead. Its grammar is given by Explicit Member Expressions.
Having a higher precedence than the . is not something the compiler should let you do, as member access is such a fundamental operation. Imagine what would happen if the compiler enabled such a thing:
-"Test".characters.count
If you could have a higher precedence than ., the compiler would have to check all the possibilities:
(-"Test").characters.count // func -(s: String) -> String
(-("Test".characters)).count // func -(s: String.CharacterView) -> String.CharacterView
-("Test".characters.count) // func -(s: Int) -> Int
Which would:
- potentially increase the compile time a lot,
- be ambiguous,
- possibly change the behaviour of existing code when overloads are added.
What I suggest is abandoning the idea of a new operator; it only adds cognitive load by squashing some specific behaviour into a single obscure character. This is how I'd do it:
extension String {
    var translatedString: String {
        // translate(using:) returns an optional, so fall back to the
        // same "!!key!!" marker used by LocalizedString
        return translate(using: self) ?? "!!\(self)!!"
    }
}

"MyKey".translatedString
Or if you want to use your LocalizedString:
extension String {
    var localized: LocalizedString {
        return LocalizedString(key: self)
    }
}
"MyKey".localized.value
These versions are much easier to comprehend.
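For comparison, a small sketch of the two call sites (assuming both the operator and the extension above are defined):

// With the operator, parentheses are needed to reach `value`:
let viaOperator = (§"MyKey").value
// With the extension, the chain reads straight through:
let viaExtension = "MyKey".localized.value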
Why are implicitly unwrapped optionals not unwrapped when using string interpolation in Swift 3?
Example:
Running the following code in the playground
var str: String!
str = "Hello"
print("The following should not be printed as an optional: \(str)")
produces this output:
The following should not be printed as an optional: Optional("Hello")
Of course I can concatenate strings with the + operator but I'm using string interpolation pretty much everywhere in my app which now doesn't work anymore due to this (bug?).
Is this even a bug or did they intentionally change this behaviour with Swift 3?
As per SE-0054, ImplicitlyUnwrappedOptional<T> is no longer a distinct type; there is only Optional<T> now.
Declarations are still allowed to be annotated as implicitly unwrapped optionals T!, but doing so just adds a hidden attribute to inform the compiler that their value may be force unwrapped in contexts that demand their unwrapped type T; their actual type is now T?.
So you can think of this declaration:
var str: String!
as actually looking like this:
#_implicitlyUnwrapped // this attribute name is fictitious
var str: String?
Only the compiler sees this #_implicitlyUnwrapped attribute, but what it allows for is the implicit unwrapping of str's value in contexts that demand a String (its unwrapped type):
// `str` cannot be type-checked as a strong optional, so the compiler will
// implicitly force unwrap it (causing a crash in this case)
let x: String = str
// We're accessing a member on the unwrapped type of `str`, so it'll also be
// implicitly force unwrapped here
print(str.count)
But in all other cases where str can be type-checked as a strong optional, it will be:
// `x` is inferred to be a `String?` (because we really are assigning a `String?`)
let x = str
let y: Any = str // `str` is implicitly coerced from `String?` to `Any`
print(str) // Same as the previous example, as `print` takes an `Any` parameter.
And the compiler will always prefer treating it as such over force unwrapping.
As the proposal says (emphasis mine):
If the expression can be explicitly type checked with a strong optional type, it will be. However, the type checker will fall back to forcing the optional if necessary. The effect of this behavior is that the result of any expression that refers to a value declared as T! will either have type T or type T?.
When it comes to string interpolation, under the hood the compiler uses this initialiser from the _ExpressibleByStringInterpolation protocol in order to evaluate a string interpolation segment:
/// Creates an instance containing the appropriate representation for the
/// given value.
///
/// Do not call this initializer directly. It is used by the compiler for
/// each string interpolation segment when you use string interpolation. For
/// example:
///
/// let s = "\(5) x \(2) = \(5 * 2)"
/// print(s)
/// // Prints "5 x 2 = 10"
///
/// This initializer is called five times when processing the string literal
/// in the example above; once each for the following: the integer `5`, the
/// string `" x "`, the integer `2`, the string `" = "`, and the result of
/// the expression `5 * 2`.
///
/// - Parameter expr: The expression to represent.
init<T>(stringInterpolationSegment expr: T)
Therefore when implicitly called by your code:
var str: String!
str = "Hello"
print("The following should not be printed as an optional: \(str)")
As str's actual type is String?, by default that's what the compiler will infer the generic placeholder T to be. Therefore the value of str won't be force unwrapped, and you'll end up seeing the description for an optional.
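Roughly speaking, under the Swift 3/4 model the interpolation above desugars into something like the following simplified sketch, which makes that inference visible:

// Simplified sketch of what the compiler generates for the interpolated string
let message = String(stringInterpolation:
    String(stringInterpolationSegment: "The following should not be printed as an optional: "),
    String(stringInterpolationSegment: str) // T is inferred as String?, so the optional's description is used
)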
If you wish for an IUO to be force unwrapped when used in string interpolation, you can simply use the force unwrap operator !:
var str: String!
str = "Hello"
print("The following should not be printed as an optional: \(str!)")
or you can coerce to its non-optional type (in this case String) in order to force the compiler to implicitly force unwrap it for you:
print("The following should not be printed as an optional: \(str as String)")
both of which, of course, will crash if str is nil.
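Alternatively, if crashing on nil is not acceptable, nil coalescing gives the interpolation a non-optional value (the "<nil>" fallback here is just an arbitrary placeholder):

print("The following should not be printed as an optional: \(str ?? "<nil>")")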
This question was inspired by a similar question on how to decrement the ASCII value of a character in Swift: How to decrement a Character that contains a digit?
The UnicodeScalar struct provides an interface to represent a Unicode scalar value in Swift. I am trying to extend UnicodeScalar and create an initializer that takes in a Swift Character and returns a UnicodeScalar value of that Character.
Creating a UnicodeScalar from a String is very easy. There is a private initializer on _NSSimpleObjCType (whose raw value is a UnicodeScalar) which takes the first Unicode scalar of a String and passes it to the rawValue initializer.
NSObjCRuntime.swift
extension _NSSimpleObjCType {
    init?(_ v: UInt8) {
        self.init(rawValue: UnicodeScalar(v))
    }

    init?(_ v: String?) {
        if let rawValue = v?.unicodeScalars.first {
            self.init(rawValue: rawValue)
        } else {
            return nil
        }
    }
}
The result:
let stringScalar = UnicodeScalar("8") // 56 (U+0038)
From a Character literal this is more verbose. I need to manually convert the Character to a String, get its unicodeScalars, and return the first scalar in the array:
let c = Character("8")
let str = String(c)
if let scalar = str.unicodeScalars.first { // 56 (U+0038)
// ...
}
My attempt at an init in an extension:
extension UnicodeScalar {
    init?(_ v: Character?) {
        if let rawValue = String(v).unicodeScalars.first {
            self.init(rawValue)
        } else {
            return nil
        }
    }
}
Testing this new init with "8" returns U+004F ("O"), when I would expect it to return "8" (U+0038).
let testChar = Character("8")
let scalar = UnicodeScalar(testChar) // U+004F
I've also tried calling the String initializer with self.init(String(v)) in the init above, but I get the error: Cannot invoke 'UnicodeScalar.init' with an argument list of type '(String)'. I know that shouldn't be the case, since a String initializer is provided by the extension on _NSSimpleObjCType.
Where did my initializer fail?
Your init method

init?(_ v: Character?)

takes an optional Character as its parameter, therefore String(v) produces the string Optional("8"), whose first Unicode scalar is the capital "O" (U+004F). Changing the parameter type to a non-optional Character (or unwrapping the parameter) should solve the problem.
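Based on that, a corrected sketch of the extension with a non-optional Character parameter could look like this:

extension UnicodeScalar {
    init?(_ c: Character) {
        // With a non-optional Character, String(c) is "8" rather than Optional("8")
        guard let scalar = String(c).unicodeScalars.first else { return nil }
        self.init(scalar)
    }
}

let scalar = UnicodeScalar(Character("8")) // U+0038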
In some languages, like C# for example, you can create a string in the following way:
"String {0} formatted {1} "
And then format it with string.Format by passing in the values to insert.
That declaration is good because you don't have to know what type its parameters are when you create the string.
I tried to find a similar approach in Swift, but what I found was something like the following format:
"String %d formatted %d"
which requires you to format the string with String(format:...), passing in the parameters. This is not good, because you would also have to know the parameter types when declaring the string.
Is there a similar approach in Swift where I wouldn't have to know the parameter types?
Use this one:
let printfOutput = String(format: "%@ %2.2d", "string", 2)
It's the same as printf or the Obj-C formatting.
You can also mix it in this way:
let parm = "string"
let printfOutput = String(format:"\(parm) %2.2d", 2)
Edit: Thanks to MartinR (he knows it all ;-)
Be careful when mixing string interpolation and formatting. String(format:"\(parm) %2.2d", 2) will crash if parm contains a percent character. In (Objective-)C, the clang compiler will warn you if a format string is not a string literal.
This gives some room for hacking:
let format = "%@"
let data = "String"
let s = String(format: "\(format)", data) // prints "String"
In contrast to (Objective-)C, where clang can check a literal format string at compile time, Swift does not check the format string at all and just interprets it at runtime.
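For example, a small sketch of the pitfall described above (the value is hypothetical):

let parm = "100%"                                // a value that happens to contain a percent sign
// String(format: "\(parm) %2.2d", 2)            // risky: the "%" inside parm is parsed as a format specifier
let safe = String(format: "%@ %2.2d", parm, 2)   // "100% 02" – pass the value as an argument instead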
In Swift, a type can conform to the CustomStringConvertible protocol to control the textual description used when it appears inside strings, for example in string interpolation like this:
"Integer value \(intVal) and double value \(doubleVal)"
Once you understand CustomStringConvertible, you can create your own function to fulfill your needs. The following function formats the string based on the given arguments and prints it. It uses {} as a placeholder for each argument, but you can change it to anything you want.
import Foundation

func printWithArgs(string: String, argumentPlaceHolder: String = "{}", args: CustomStringConvertible...) {
    var formattedString = string
    // Get the range of the first argument placeholder
    var nextPlaceholderIndex = string.range(of: argumentPlaceHolder)
    // Index of the next argument to use
    var nextArgIndex = 0
    // Keep replacing the next placeholder as long as there are more placeholders and unused arguments
    while nextPlaceholderIndex != nil && nextArgIndex < args.count {
        // Replace the argument placeholder with the argument's description
        formattedString = formattedString.replacingOccurrences(of: argumentPlaceHolder, with: args[nextArgIndex].description, options: .caseInsensitive, range: nextPlaceholderIndex)
        // Get the range of the next argument placeholder
        nextPlaceholderIndex = formattedString.range(of: argumentPlaceHolder)
        nextArgIndex += 1
    }
    print(formattedString)
}
printWithArgs(string: "First arg: {}, second arg: {}, third arg: {}", args: "foo", 4.12, 100)
// Prints: First arg: foo, second arg: 4.12, third arg: 100
Using a custom implementation gives you more control and lets you tweak the behavior. For example, you could modify this code to display the same argument multiple times using placeholders like {1} and {2}, fill the arguments in reverse order, and so on.
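As a rough sketch of that idea (not part of the original function), a hypothetical numbered-placeholder variant could look like this:

import Foundation

func formatWithNumberedArgs(_ string: String, args: CustomStringConvertible...) -> String {
    var result = string
    for (index, arg) in args.enumerated() {
        // "{1}" refers to the first argument, "{2}" to the second, and so on;
        // the same placeholder may appear any number of times.
        result = result.replacingOccurrences(of: "{\(index + 1)}", with: arg.description)
    }
    return result
}

print(formatWithNumberedArgs("{1} and {2}, then {1} again", args: "foo", 42))
// Prints: foo and 42, then foo again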
For more information about string interpolation in Swift: https://docs.swift.org/swift-book/LanguageGuide/StringsAndCharacters.html#//apple_ref/doc/uid/TP40014097-CH7-ID292