My text field takes 11 characters, and I need to remove the first character and then pass the result as a parameter. I tried this code:
var dropFirst: String?
if emailPhoneTextField.text?.count == 11 {
    dropFirst = emailPhoneTextField.text?.dropFirst()
    emailPhoneTextField.text = dropFirst
}
I receive this error:
Cannot assign value of type 'String.SubSequence?' (aka 'Optional<Substring>') to type 'String?'
dropFirst() returns a SubSequence (Substring), so you can't assign it directly to a String? variable or to the text field's text property, which is also String?. Wrap it in a String initializer instead: replace
dropFirst = emailPhoneTextField.text?.dropFirst()
With
dropFirst = String(emailPhoneTextField.text!.dropFirst())
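The force unwrap is safe here because of the count check, but it becomes fragile if the code moves. If you prefer, here is a sketch using optional binding instead; `fieldText` is a hypothetical stand-in for `emailPhoneTextField.text`:

```swift
// fieldText stands in for emailPhoneTextField.text (hypothetical stand-in)
var fieldText: String? = "01234567890" // 11 characters

if let text = fieldText, text.count == 11 {
    // dropFirst() yields a Substring; wrap it in String before assigning back
    fieldText = String(text.dropFirst())
}
// fieldText is now "1234567890"
```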
Alternatively, create a String extension that removes the first character in place:
extension String {
    mutating func removeFirstChar() {
        self = String(self.dropFirst())
    }
}
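A quick usage sketch of the extension, with a plain String standing in for the text field's text:

```swift
extension String {
    mutating func removeFirstChar() {
        // dropFirst() returns a Substring, so re-wrap it as String
        self = String(self.dropFirst())
    }
}

var number = "01234567890" // 11 characters
if number.count == 11 {
    number.removeFirstChar()
}
print(number) // prints "1234567890"
```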
I am using a text field to search for a value in an array, but I get the error below. What should I do to convert the data type? I am using Swift 5.
Value of optional type 'UITextField?' must be unwrapped to refer to member 'text' of wrapped base type 'UITextField'
for i in 0..<allUser.count {
    if allUser[i].name.contains(textArea.text) { // here is the error
        print(allUser[i].name)
    }
}
You get this error because textArea itself is an optional (UITextField?), so it must be unwrapped before you can refer to its text property; text is also optional, and contains expects a non-optional String. First unwrap the text, then loop over the filtered matches and print them:
if let text = textArea?.text {
    for user in allUser.filter({ $0.name.contains(text) }) {
        print(user.name)
    }
}
You should safely unwrap the text using an if-let statement. Note that textArea itself is optional, so use optional chaining, and unwrap once outside the loop:

if let textValue = textArea?.text {
    for i in 0..<allUser.count {
        if allUser[i].name.contains(textValue) {
            print(allUser[i].name)
        }
    }
}
The error is about the text field itself, not its text property. You can even write
if let textField = textArea {
    let text = textField.text!
    for user in allUser where user.name.contains(text) {
        print(user.name)
    }
}
because the text property of a UITextField won’t be nil. However, in this case this is preferable:
if let text = textArea?.text {
    for user in allUser where user.name.contains(text) {
        print(user.name)
    }
}
I have an Int variable defined like so:
var otpNo = Int()
An integer value has been assigned to this variable, and I have passed it to another view controller. In that view controller I want to assign the value to a text field, but I am not able to do so. I have tried this:
Int(myTextField.text!) = otpNo
But I am getting this error message:
Cannot assign to value: function call returns immutable value
You can assign the text in either of these ways:
myTextField.text = String(format: "%d", otpNo)
myTextField.text = "\(otpNo)"
You need to set the value like so:
myTextField.text = "\(otpNo)"
A UITextField's text property accepts String?. You need to convert your Int into a String, rather than the other way around as you were trying in your question. Int(myTextField.text!) is a function call that returns a value; you can't assign anything to the result of a function call, which is exactly what the compiler error says.

myTextField.text = String(describing: otpNo)

This converts the Int to a String and assigns it to the text property of that text field.
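For a non-optional Int, all the conversions suggested above produce the same string; a quick check (String(format:) needs Foundation):

```swift
import Foundation

let otpNo = 4321

let viaInit = String(otpNo)                    // plain initializer
let viaInterpolation = "\(otpNo)"              // string interpolation
let viaDescribing = String(describing: otpNo)  // describing initializer
let viaFormat = String(format: "%d", otpNo)    // printf-style format

// all four yield "4321"
print(viaInit, viaInterpolation, viaDescribing, viaFormat)
```

Note that String(describing:) applied to an Optional would include the Optional(...) wrapper in the output, so unwrap the value first in that case.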
This question was inspired by a similar question on how to decrement the ASCII value of a character in Swift: How to decrement a Character that contains a digit?
The UnicodeScalar struct provides an interface to represent a Unicode scalar value in Swift. I am trying to extend UnicodeScalar and create an initializer that takes in a Swift Character and returns a UnicodeScalar value of that Character.
Creating a UnicodeScalar from a String is very easy. There is a private initializer on _NSSimpleObjCType (whose rawValue is a UnicodeScalar) which takes the first character of a String and passes it to the rawValue constructor.
NSObjCRuntime.swift
extension _NSSimpleObjCType {
    init?(_ v: UInt8) {
        self.init(rawValue: UnicodeScalar(v))
    }

    init?(_ v: String?) {
        if let rawValue = v?.unicodeScalars.first {
            self.init(rawValue: rawValue)
        } else {
            return nil
        }
    }
}
The result:
let stringScalar = UnicodeScalar("8") // 56 (U+0038)
From a Character literal this is more verbose. I need to manually convert the Character to a String, get its unicodeScalars, and return the first scalar in the array:
let c = Character("8")
let str = String(c)
if let scalar = str.unicodeScalars.first { // 56 (U+0038)
    // ...
}
My attempt at an init in an extension:
extension UnicodeScalar {
    init?(_ v: Character?) {
        if let rawValue = String(v).unicodeScalars.first {
            self.init(rawValue)
        } else {
            return nil
        }
    }
}
Testing this new init with "8" returns U+004F ("O"), when I would expect it to return "8" (U+0038).
let testChar = Character("8")
let scalar = UnicodeScalar(testChar) // U+004F
I've also tried calling the String initializer with self.init(String(v)) in the init above, but I get the error Cannot invoke 'UnicodeScalar.init' with an argument list of type '(String)'. I know this shouldn't be the case, since a String initializer is provided by the extension on _NSSimpleObjCType.
Where did my initializer fail?
Your init method

init?(_ v: Character?)

takes an optional Character as parameter, therefore the string interpolation String(v) returns the string

Optional("8")

with a capital "O" as its first Unicode scalar. Changing the parameter type to a non-optional (or unwrapping the parameter) should solve the problem.
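A sketch of that fix in current Swift, using a labeled parameter (a hypothetical name, chosen to avoid colliding with UnicodeScalar's existing initializers):

```swift
extension UnicodeScalar {
    // Take a non-optional Character, so String(c) never contains an Optional wrapper
    init?(fromCharacter c: Character) {
        guard let scalar = String(c).unicodeScalars.first else { return nil }
        self = scalar
    }
}

let scalar = UnicodeScalar(fromCharacter: "8")
// scalar?.value == 56, i.e. U+0038, as expected
```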
So I can't figure this out, am I supposed to change the '.text' to something else or do I have to go about converting the string into a double?
Here is the code
if item != nil {
    // the errors I keep getting for each one are
    unitCost.text = item?.unitCost // cannot assign value of type 'NSNumber?' to a value of type 'String?'
    total.text = item?.total       // cannot assign value of type 'NSNumber?' to a value of type 'String?'
    date.text = item?.date         // cannot assign value of type 'NSDate?' to a value of type 'String?'
}
You are trying to assign an invalid type to the text property. The text property is of type String? as stated by the compiler error. You are trying to assign an NSNumber or NSDate. The expected type is a String or nil and so you must ensure that you provide only those two possibilities. As a result, you need to convert your numbers and dates into strings.
In Swift, there is no need to use format specifiers. Instead, best practice is to use string interpolation for simple types like numbers:
unitCost.text = "\(item?.unitCost!)"
total.text = "\(item?.total!)"
For dates, you can use NSDateFormatter to produce a human-friendly date in a desired format:

let formatter = NSDateFormatter()
formatter.dateStyle = .MediumStyle
date.text = formatter.stringFromDate(item!.date!)
While we're at it, why not use optional binding instead of nil comparison:
if let item = item {
    // Set your properties here
}
Try this:
unitCost.text = String(format: "%d", item?.unitCost?.integerValue ?? 0)
You can add an extension for Double/NSNumber/NSDate
extension Double {
    func toString() -> String {
        return NSNumberFormatter().stringFromNumber(self) ?? ""
    }
}

extension NSNumber {
    func toString() -> String {
        return NSNumberFormatter().stringFromNumber(self) ?? ""
    }
}
var doubleValue: Double?
doubleValue?.toString()
If doubleValue is nil, the optional chain returns nil without calling toString() at all; when it is set but formatting fails, toString() returns an empty string. You can make toString() return String? instead, depending on what you need. Also, the item != nil check is not required in your code, since optional chaining already handles nil.
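For reference, in current Swift NSNumberFormatter is renamed NumberFormatter and stringFromNumber becomes string(from:), so the NSNumber extension looks like this (a sketch with the same fall-back-to-empty-string behavior):

```swift
import Foundation

extension NSNumber {
    func toString() -> String {
        // string(from:) returns String?; fall back to "" as in the original
        return NumberFormatter().string(from: self) ?? ""
    }
}

let cost: NSNumber = 5
print(cost.toString()) // prints "5"
```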
#dbart: "\(item?.unitCost)" displays the optional wrapper as part of the string, like Optional(5) rather than 5; we need to unwrap the value. Check this code:
if let requiredItem = item {
    unitCost.text = requiredItem.unitCost != nil ? "\(requiredItem.unitCost!)" : ""
    total.text = requiredItem.total != nil ? "\(requiredItem.total!)" : ""
    date.text = requiredItem.date != nil ? "\(requiredItem.date!)" : ""
}
Can someone explain to me what is wrong with this statement?
var someString = "Welcome"
someString.append("!")
However, this works when I replace the code with:
var someString = "Welcome"
let exclamationMark : Character = "!"
someString.append(exclamationMark)
Thanks in advance
In Swift, there is no character literal (such as 'c' in C-derived languages); there are only string literals. Then you have two functions defined on String: append, to append a single Character, and extend, to append a whole String. So this works:
var someString = "Welcome"
someString.extend("!")
If you really want to use append, you can force a one-char String literal to be turned into a Character either by calling Character's constructor:
someString.append(Character("!"))
or by using a type conversion:
someString.append("!" as Character)
or by using a type annotation as you did with an extra variable:
let exclamationMark: Character = "!"
someString.append(exclamationMark)
String has two overloaded append(_:) methods:

mutating func append(x: UnicodeScalar)
mutating func append(c: Character)

and both Character and UnicodeScalar conform to UnicodeScalarLiteralConvertible:
enum Character : ExtendedGraphemeClusterLiteralConvertible, Equatable, Hashable, Comparable {
    /// Create an instance initialized to `value`.
    init(unicodeScalarLiteral value: Character)
}

struct UnicodeScalar : UnicodeScalarLiteralConvertible {
    /// Create an instance initialized to `value`.
    init(unicodeScalarLiteral value: UnicodeScalar)
}
"!" in this case is a UnicodeScalarLiteral. So, the compiler can not determine "!" is Character or UnicodeScalar, and which append(_:) method should be invoked. That's why you must specify it explicitly.
You can see: "!" can be a UnicodeScalar literal by this code:
struct MyScalar: UnicodeScalarLiteralConvertible {
    init(unicodeScalarLiteral value: UnicodeScalar) {
        println("unicode \(value)")
    }
}

"!" as MyScalar // -> prints "unicode !"