After reading a discussion about using Self in a protocol, I ran an experiment (see the code below). I expected it to fail to compile because, as I understood it, for class B to conform to the Copyable protocol it would need an init(_ instance: B), which I didn't define. But the code actually compiles and runs.
I wonder why? Thanks for any explanation.
import Foundation

protocol Copyable: class {
    init(_ instance: Self)
}

class A: Copyable {
    var x: Int

    init(x: Int) {
        self.x = x
    }

    required init(_ instance: A) {
        self.x = instance.x
    }
}

class B: A {
    var y: Int

    init(x: Int, y: Int) {
        self.y = y
        super.init(x: x)
    }

    required init(_ instance: A) {
        self.y = 1
        super.init(instance)
    }
}

var a = A(x: 1)
var b = B(a)
According to the documentation, Self in this case is A, since A is the type that conforms to the protocol; B conforms only indirectly, as a subclass of A.
So when A conforms to Copyable, you are saying that A and all of its subclasses must have an init(_ instance: A).
In a protocol declaration or a protocol member declaration, the Self type refers to the eventual type that conforms to the protocol.
You can actually test this by removing the required init(_ instance: A): you will get an error even if you have an init(_ instance: B). Since A is the class conforming to the protocol, you must have an initializer whose instance argument is A.
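A related way to see the rule in action (a sketch, not from the thread; it uses AnyObject, the current spelling of the class constraint): if the conforming class is final, Self is pinned to that exact class, and the initializer no longer needs to be required, because no subclass can exist to inherit the requirement.

```swift
import Foundation

protocol Copyable: AnyObject {
    init(_ instance: Self)
}

// With a final class, Self is exactly C, and `required` is unnecessary
// because C can have no subclasses.
final class C: Copyable {
    var x: Int

    init(x: Int) { self.x = x }

    init(_ instance: C) { self.x = instance.x }
}

let c1 = C(x: 1)
let c2 = C(c1)
// c2 is an independent copy with the same x
```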
The only way I've found in Compose is to use accompanist-insets, and that removes window insets, which causes other problems with my app's layout.
The Android way seems to be this, and I could pass that into my Compose app and act accordingly.
Is there another way in Jetpack Compose?
Update
With the new WindowInsets API, it gets easier.
First, for the insets to report correct values, you need to set:
WindowCompat.setDecorFitsSystemWindows(window, false)
Then, to observe the keyboard as state:
@Composable
fun keyboardAsState(): State<Boolean> {
    val isImeVisible = WindowInsets.ime.getBottom(LocalDensity.current) > 0
    return rememberUpdatedState(isImeVisible)
}
Usage example:
val isKeyboardOpen by keyboardAsState() // true or false
PS: I've tried to use WindowInsets.isImeVisible, but it returns true on the first call.
Without an experimental API
If you want it without the experimental API, I found this solution:
enum class Keyboard {
    Opened, Closed
}

@Composable
fun keyboardAsState(): State<Keyboard> {
    val keyboardState = remember { mutableStateOf(Keyboard.Closed) }
    val view = LocalView.current
    DisposableEffect(view) {
        val onGlobalListener = ViewTreeObserver.OnGlobalLayoutListener {
            val rect = Rect()
            view.getWindowVisibleDisplayFrame(rect)
            val screenHeight = view.rootView.height
            val keypadHeight = screenHeight - rect.bottom
            keyboardState.value = if (keypadHeight > screenHeight * 0.15) {
                Keyboard.Opened
            } else {
                Keyboard.Closed
            }
        }
        view.viewTreeObserver.addOnGlobalLayoutListener(onGlobalListener)
        onDispose {
            view.viewTreeObserver.removeOnGlobalLayoutListener(onGlobalListener)
        }
    }
    return keyboardState
}
and to detect/check the value you'll only need this:
val isKeyboardOpen by keyboardAsState() // Keyboard.Opened or Keyboard.Closed
Here is a solution that uses OnGlobalLayoutListener to listen for layout changes and uses the new window-insets APIs to perform the calculations, as recommended by the documentation. You can place this code anywhere inside a @Composable function and handle isKeyboardOpen as you wish. I tested it, and it works on API 21 and above.
val view = LocalView.current
DisposableEffect(view) {
    val listener = ViewTreeObserver.OnGlobalLayoutListener {
        val isKeyboardOpen = ViewCompat.getRootWindowInsets(view)
            ?.isVisible(WindowInsetsCompat.Type.ime()) ?: true
        // ... do anything you want here with `isKeyboardOpen`
    }
    view.viewTreeObserver.addOnGlobalLayoutListener(listener)
    onDispose {
        view.viewTreeObserver.removeOnGlobalLayoutListener(listener)
    }
}
For me, the other solutions didn't work well: the keyboard was always reported as closed.
In the OnGlobalLayoutListener-based answers, the formula used does not seem to behave as it should, and old APIs are used.
In the WindowInsetListener-based answer, since view is not the root view, no window insets would be applied to it. I tried replacing view with view.rootView, and although the keyboard-detection code then worked, passing the root view to setOnApplyWindowInsetsListener replaces any listener set by components, which is obviously unwanted.
I found a way with Android's ViewTreeObserver. It is essentially the Android approach, but it invokes callbacks that can be used from Compose.
import android.graphics.Rect
import android.os.Bundle
import android.view.View
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent

class MainActivity : ComponentActivity() {
    var kbGone = false
    var kbOpened: () -> Unit = {}
    var kbClosed: () -> Unit = {}

    override fun onCreate(state: Bundle?) {
        super.onCreate(state)
        setContent {
            kbClosed = {
                // dismiss the keyboard with LocalFocusManager, for example
            }
            kbOpened = {
                // something
            }
            MyComponent()
        }
        setupKeyboardDetection(findViewById<View>(android.R.id.content))
    }

    fun setupKeyboardDetection(contentView: View) {
        contentView.viewTreeObserver.addOnGlobalLayoutListener {
            val r = Rect()
            contentView.getWindowVisibleDisplayFrame(r)
            val screenHeight = contentView.rootView.height
            val keypadHeight = screenHeight - r.bottom
            if (keypadHeight > screenHeight * 0.15) { // 0.15 ratio is perhaps enough to determine keypad height
                kbGone = false
                kbOpened()
            } else if (!kbGone) {
                kbGone = true
                kbClosed()
            }
        }
    }
}
Whether the keyboard is opening or closing can be inspected with WindowInsets.ime.
Set WindowCompat.setDecorFitsSystemWindows(window, false).
To check whether it's open or closed, use:
WindowInsets.isImeVisible
You can check whether it's going up or down using the bottom offset; however, this is not always reliable, and you need extra steps to determine whether it's opening or closing:
val offsetY = WindowInsets.ime.getBottom(density)
You can compare against a previous value to detect whether it's opening, closing, open, or closed.
https://stackoverflow.com/a/73358604/5457853
When it opens, it returns values such as:
17:40:21.429 I OffsetY: 1017
17:40:21.446 I OffsetY: 38
17:40:21.463 I OffsetY: 222
17:40:21.479 I OffsetY: 438
17:40:21.496 I OffsetY: 586
17:40:21.513 I OffsetY: 685
17:40:21.530 I OffsetY: 764
17:40:21.546 I OffsetY: 825
17:40:21.562 I OffsetY: 869
17:40:21.579 I OffsetY: 907
17:40:21.596 I OffsetY: 937
17:40:21.613 I OffsetY: 960
17:40:21.631 I OffsetY: 979
17:40:21.646 I OffsetY: 994
17:40:21.663 I OffsetY: 1004
17:40:21.679 I OffsetY: 1010
17:40:21.696 I OffsetY: 1014
17:40:21.713 I OffsetY: 1016
17:40:21.730 I OffsetY: 1017
17:40:21.746 I OffsetY: 1017
While closing
17:40:54.276 I OffsetY: 0
17:40:54.288 I OffsetY: 972
17:40:54.303 I OffsetY: 794
17:40:54.320 I OffsetY: 578
17:40:54.337 I OffsetY: 430
17:40:54.354 I OffsetY: 331
17:40:54.371 I OffsetY: 252
17:40:54.387 I OffsetY: 191
17:40:54.404 I OffsetY: 144
17:40:54.421 I OffsetY: 109
17:40:54.437 I OffsetY: 79
17:40:54.454 I OffsetY: 55
17:40:54.471 I OffsetY: 37
17:40:54.487 I OffsetY: 22
17:40:54.504 I OffsetY: 12
17:40:54.521 I OffsetY: 6
17:40:54.538 I OffsetY: 2
17:40:54.555 I OffsetY: 0
17:40:54.571 I OffsetY: 0
Now, with the new WindowInsets API, WindowInsets.isImeVisible can be used. For reference, see this.
In Jetpack compose:
@Composable
fun isKeyboardVisible(): Boolean = WindowInsets.ime.getBottom(LocalDensity.current) > 0
It will return true or false:
true → keyboard visible
false → keyboard not visible
We can also use an OnApplyWindowInsetsListener, something like this:
@Composable
fun keyboardAsState(): State<Boolean> {
    val keyboardState = remember { mutableStateOf(false) }
    val view = LocalView.current
    LaunchedEffect(view) {
        ViewCompat.setOnApplyWindowInsetsListener(view) { _, insets ->
            keyboardState.value = insets.isVisible(WindowInsetsCompat.Type.ime())
            insets
        }
    }
    return keyboardState
}
I'm trying to create a function that calculates and returns compound interest. The variables have different data types. Whenever I run the program, I get the error initializer 'init(_:)' requires that 'Decimal' conform to 'BinaryInteger'. The following is my code:
import Foundation

class Compound {
    var p: Double
    var t: Int
    var r: Double
    var n: Int
    var interest: Double
    var amount: Double

    init(p: Double, t: Int, r: Double, n: Int) {
        self.p = p
        self.t = t
        self.r = r
        self.n = n
    }

    func calculateAmount() -> Double {
        amount = p * Double(pow(Decimal(1 + (r / Double(n))), n * t))
        return amount
    }
}
The Error:
error: initializer 'init(_:)' requires that 'Decimal' conform to 'BinaryInteger'
amount = p * Double(pow(Decimal(1 + (r / Double(n))),n * t))
^
After looking at a similar problem, I also tried the following technique, but I'm still getting the same error:
func calculateAmount() -> Double {
    let gg: Int = n * t
    amount = p * Double(pow(Decimal(1 + (r / Double(n))), Int(truncating: gg as NSNumber)))
    return amount
}
How to solve this?
It would be easier to use the Double function pow(_: Double, _: Double) -> Double instead of the Decimal function pow(_ x: Decimal, _ y: Int) -> Decimal, considering that you want to return a Double:
@discardableResult
func calculateAmount() -> Double {
    amount = p * pow(1 + (r / Double(n)), Double(n) * Double(t))
    return amount
}
I have a bunch of crash logs from iTunes Connect for my iOS Swift app, with the top of the stack trace showing the error message:
protocol witness for Strideable.distance(to : A) -> A.Stride in conformance Int64 + 124
This comes from an innocuous line in my code which looks like the following:
if (var1 - var2 > MyClass.THRESHOLD) {
// Do something
}
var1 and var2 are declared to be of type Int64, while THRESHOLD is:
static let THRESHOLD = 900 * 1000
I have a hunch that this is because THRESHOLD is not declared to be of Int64, though I still don't have a hypothesis as to how this could cause a problem. Also, the bug is not reproducible, so I can't verify.
Any help on what this error message means, and what might be the issue here?
The mixed-type comparison can be the cause of the problem. The subtraction operator is inferred from the types of its operands as
#available(swift, deprecated: 3.0, obsoleted: 4.0, message: "Mixed-type subtraction is deprecated. Please use explicit type conversion.")
public func -<T>(lhs: T, rhs: T) -> T.Stride where T : SignedInteger
with T == Int64 and T.Stride == Int. Your code would cause a warning message with Xcode 8.3.2:
let THRESHOLD = 900 * 1000
let var1: Int64 = 0x100000000
let var2: Int64 = 0
// warning: '-' is deprecated: Mixed-type subtraction is deprecated. Please use explicit type conversion.
if var1 - var2 > THRESHOLD {
    print("foo")
} else {
    print("bar")
}
On a 32-bit device the difference can be too large for an Int
and the above example would abort with a runtime error
* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)
frame #0: 0x0223d428 libswiftCore.dylib`protocol witness for Swift.Strideable.distance (to : A) -> A.Stride in conformance Swift.Int64 : Swift.Strideable in Swift + 72
The solution is to explicitly convert the right-hand side:
if var1 - var2 > Int64(THRESHOLD) { ... }
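Alternatively (a sketch reusing the question's names), giving THRESHOLD an explicit Int64 annotation keeps the whole comparison in Int64 and avoids the mixed-type operator in the first place:

```swift
import Foundation

struct MyClass {
    // The explicit Int64 annotation makes the literal product Int64,
    // so `var1 - var2 > MyClass.THRESHOLD` is a plain Int64 comparison
    // with no Int-typed Stride involved.
    static let THRESHOLD: Int64 = 900 * 1000
}

let var1: Int64 = 0x1_0000_0000  // larger than Int32.max
let var2: Int64 = 0

let exceedsThreshold = var1 - var2 > MyClass.THRESHOLD
```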
I am new to Swift. I have the following code:
class ViewController: UIViewController {
    let var1: Double = 0.0
    let var2: Int = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        let someObject = TestViewController(x: 20, total: 30, taxPact: 40, subtotal: 50)
        var x = 1 + 1.0 /* COMPILER IS FINE WITH ADDING INT AND DOUBLE */
        print("sum is \(var1 + var2)") /* COMPILER COMPLAINS HERE: BINARY OPERATOR + CANNOT BE APPLIED */
    }
}
Why do we see such inconsistent behavior?
The error message is unrelated to string interpolation, this
let var1: Double = 0.0
let var2: Int = 0
var x = var1 + var2 // error: binary operator '+' cannot be applied to operands of type 'Double' and 'Int'
does not compile either, and the reason is that there is no +
operator which adds an Int to a Double and
Swift does not implicitly convert types. You have to convert explicitly,
e.g.
var x = var1 + Double(var2)
print("sum is \(var1 + Double(var2))")
Your other statement
var x = 1 + 1.0
compiles because both Int and Double (and some more types)
conform to the IntegerLiteralConvertible protocol (now called ExpressibleByIntegerLiteral),
so the literal 1 can be both an Int literal
and a Double literal. Here the compiler chooses 1 to be a
Double because that is the only choice for which a suitable
+ operator exists.
I can't understand why this one works:
var arr = [4, 5, 6, 7]
arr.map() {
    x in
    return x + 2
}
while this one does not:
arr.map() {
    x in
    var y = x + 2
    return y
}
with this error:
Playground execution failed: MyPlayground.playground:13:5: error:
cannot invoke 'map' with an argument list of type '((_) -> _)'
arr.map() {
The problem here is the error message. In general, when you see something like cannot invoke ... with ..., it means that the compiler's type inference has just not worked.
In this case, you've run up against one of the limitations of inference within closures. Swift can infer the type of single-statement closures only, not multiple-statement ones. In your first example:
arr.map() {
    x in
    return x + 2
}
There's actually only one statement: return x + 2. However, in the second:
arr.map() {
    x in
    var y = x + 2
    return y
}
There's an assignment statement (var y = x + 2), and then the return. So the error is a little misleading: it doesn't mean you "can't invoke map() with this type of argument", what it means to say is "I can't figure out what type x or y is".
By the way, in single-statement closures, there are two other things that can be inferred. The return statement:
arr.map() {
    x in
    x + 2
}
And the variable name itself:
arr.map() { $0 + 2 }
It all produces the same compiled code, though. So it's really a matter of taste which one you choose. (For instance, while I think the inferred return looks clean and easier to read, I don't like the $0, so I generally always put x in or something, even for very short closures. It's up to you, though, obviously.)
One final thing: since this is all really just syntax stuff, it's worth noting that the () isn't needed either:
arr.map { x in x + 2 }
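As a quick sanity check (a sketch, not from the answer), all of these single-statement forms produce the same array:

```swift
let arr = [4, 5, 6, 7]

let explicit = arr.map { x in return x + 2 }  // explicit return
let inferred = arr.map { x in x + 2 }         // inferred return
let shorthand = arr.map { $0 + 2 }            // shorthand argument name

// all three are [6, 7, 8, 9]
```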
As @MartinR pointed out, the compiler can infer some types from outer context as well:
let b: [Int] = arr.map { x in
    var y = x + 2
    return y
}
Which is worth bearing in mind. (it seems that the "one-statement" rule only applies when there's no other type info available)
Swift can't infer the type in every situation, even though it should see that y = x + 2 means y is an Int too. My guess is that Swift parses the closure in a certain order that makes it unaware of the return type ahead of time in your case.
This works:
arr.map() {
    x -> Int in
    var y = x + 2
    return y
}