I'm using Jetpack Compose and trying to find a way to detect if the keyboard is open.
I've tried the code below, but I get an error stating Unresolved reference: ime. When I apply the recommended imports (the two shown below), the error still remains.
import android.view.WindowInsets
import android.view.WindowInsets.Type.ime
@Composable
fun signInView() {
    val isVisible = WindowInsets.ime.getBottom(LocalDensity.current) > 0
}
How can I resolve this?
The Compose WindowInsets.ime extension lives in androidx.compose.foundation.layout, not in android.view.WindowInsets. Add the dependency for the foundation artifact in the build.gradle file for your app or module:
dependencies {
    implementation "androidx.compose.foundation:foundation:1.3.1"
}

android {
    buildFeatures {
        compose true
    }
    composeOptions {
        kotlinCompilerExtensionVersion = "1.3.2"
    }
    kotlinOptions {
        jvmTarget = "1.8"
    }
}
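With that dependency in place, the imports that make ime resolve (replacing the android.view ones; assuming foundation 1.3.x) are:

import androidx.compose.foundation.layout.WindowInsets
import androidx.compose.foundation.layout.ime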
Example:
@Composable
fun signInView() {
    var isVisible by remember { mutableStateOf(false) }
    val ime = androidx.compose.foundation.layout.WindowInsets.ime
    val navbar = androidx.compose.foundation.layout.WindowInsets.navigationBars
    var keyboardHeightDp by remember { mutableStateOf(0.dp) }
    val localDensity = LocalDensity.current

    LaunchedEffect(localDensity.density) {
        snapshotFlow {
            // keyboard inset minus the navigation bar inset, in pixels
            ime.getBottom(localDensity) - navbar.getBottom(localDensity)
        }.collect {
            val currentKeyboardHeightDp = (it / localDensity.density).dp
            // remember the largest height seen; the keyboard counts as visible
            // while the current height matches that maximum
            keyboardHeightDp = maxOf(currentKeyboardHeightDp, keyboardHeightDp)
            isVisible = currentKeyboardHeightDp == keyboardHeightDp
        }
    }
}
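If all you need is a boolean, newer foundation versions also expose an isImeVisible extension. A minimal sketch (isImeVisible is marked experimental, so the opt-in annotation may differ between versions):

import androidx.compose.foundation.layout.ExperimentalLayoutApi
import androidx.compose.foundation.layout.WindowInsets
import androidx.compose.foundation.layout.isImeVisible
import androidx.compose.runtime.Composable

@OptIn(ExperimentalLayoutApi::class)
@Composable
fun SignInView() {
    // true while the software keyboard takes up space at the bottom of the window
    val isKeyboardOpen = WindowInsets.isImeVisible
}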
Related
In Jetpack/Desktop Compose I want a coroutine to run in response to changes to a SnapshotStateList.
In this example:
import androidx.compose.foundation.layout.Column
import androidx.compose.material.Button
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.LaunchedEffect
import androidx.compose.runtime.mutableStateListOf
import androidx.compose.runtime.remember
@Composable
fun TestMutableList() {
    val list = remember { mutableStateListOf(1, 2, 3) }
    LaunchedEffect(list) {
        println("List was changed.")
    }
    Column {
        Button(onClick = { list[0] = 0 }) {
            Text("Change List")
        }
        list.forEach { Text(it.toString()) }
    }
}
the LaunchedEffect runs on the first composition, and the Composable recomposes when I click the button, so Compose knows that the SnapshotStateList<Int> changed. However, the LaunchedEffect does not run again when I click the button. I understand that this is because its key is the reference to the SnapshotStateList<Int>, and that reference did not change.
How can I have the LaunchedEffect run every time that the list is modified?
You can increment an integer every time you change the list, so the LaunchedEffect is triggered whenever that value changes:
val list = remember { mutableStateListOf(1, 2, 3) }
var changeIndex by remember {
    mutableStateOf(0)
}

LaunchedEffect(list.size, changeIndex) {
    // add an if here if you don't want to trigger when changeIndex is 0
    println("List was changed.")
}

Column {
    Button(onClick = {
        list[0] = 0
        // bump the key so the LaunchedEffect re-runs
        changeIndex++
    }) {
        Text("Change List")
    }
    list.forEach { Text(it.toString()) }
}
I had the same problem and got it working using the list size instead of the list itself.
Like this:
val list = remember { mutableStateListOf(1, 2, 3) }

LaunchedEffect(list.size) {
    println("List was changed.")
}
By converting the SnapshotStateList to an immutable List with toList(), you can achieve this:
@Composable
fun TestMutableList() {
    val list = remember { mutableStateListOf(1, 2, 3) }
    LaunchedEffect(list.toList()) {
        println("List was changed.")
    }
    Column {
        Button(onClick = { list[0] = 0 }) {
            Text("Change List")
        }
        list.forEach { Text(it.toString()) }
    }
}
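Another option, sketched with the same imports as the question plus androidx.compose.runtime.snapshotFlow: keep a single LaunchedEffect and read a snapshot of the list inside snapshotFlow, so the coroutine reacts whenever any element changes, not only when the reference or size changes:

@Composable
fun TestMutableList() {
    val list = remember { mutableStateListOf(1, 2, 3) }
    LaunchedEffect(Unit) {
        // snapshotFlow re-evaluates when any state read inside it changes and
        // emits distinct values (including the initial one); toList() reads every
        // element, so element replacements are emitted too
        snapshotFlow { list.toList() }
            .collect { println("List was changed: $it") }
    }
    Column {
        Button(onClick = { list[0] = 0 }) {
            Text("Change List")
        }
        list.forEach { Text(it.toString()) }
    }
}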
Do you know how to apply Speech Recognition (SpeechRecognizer) in Jetpack Compose?
Something like this, but in Compose.
I followed the steps in this video:
Added these permissions in the manifest:
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
Wrote this code in MainActivity:
class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContent {
            PageUi()
        }
    }
}
@Composable
fun PageUi() {
    val context = LocalContext.current
    val talk by remember { mutableStateOf("Speech text should come here") }

    Column(
        modifier = Modifier.fillMaxSize(),
        horizontalAlignment = Alignment.CenterHorizontally,
        verticalArrangement = Arrangement.Center
    ) {
        Text(
            text = talk,
            style = MaterialTheme.typography.h4,
            modifier = Modifier
                .fillMaxSize(0.85f)
                .padding(16.dp)
                .background(Color.LightGray)
        )
        Button(onClick = { askSpeechInput(context) }) {
            Text(
                text = "Talk", style = MaterialTheme.typography.h3
            )
        }
    }
}
fun askSpeechInput(context: Context) {
    if (!SpeechRecognizer.isRecognitionAvailable(context)) {
        Toast.makeText(context, "Speech not available", Toast.LENGTH_SHORT).show()
    } else {
        val i = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
        i.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
        i.putExtra(RecognizerIntent.EXTRA_LANGUAGE, Locale.getDefault())
        i.putExtra(RecognizerIntent.EXTRA_PROMPT, "Talk")
        //startActivityForResult(MainActivity(),i,102)
    }
}
@Preview(showBackground = true)
@Composable
fun PageShow() {
    PageUi()
}
But I have no idea how to use startActivityForResult in Compose and handle the rest.
And when I test it so far on my phone (or emulator) it always ends up with the toast message!
I am going to explain my own implementation. Here is the general idea first, and then I will explain each step. First you need to ask for permission every time, and if permission is granted you start an intent in order to hear what the user says. What the user says is saved to a variable in a ViewModel. That variable is observed by the composable, so you can get the data from there.
1) Add this to your Manifest:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    package="your.package">

    <!-- Add the uses-permission entries -->
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />

    [...]
    [...]
    [...]

    <!-- Add this above the last line </manifest>, like so: -->
    <queries>
        <intent>
            <action android:name="android.speech.RecognitionService" />
        </intent>
    </queries>

</manifest>
2) Create a ViewModel
class ScreenViewModel : ViewModel() {
    var textFromSpeech: String? by mutableStateOf(null)
}
You need the ViewModel in order to observe the variable from the composable and to keep your code logic there, for a clean architecture.
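To make the "observed by the composable" part concrete, a minimal sketch (the composable name here is hypothetical):

@Composable
fun SpeechTextLabel(vm: ScreenViewModel) {
    // textFromSpeech is backed by mutableStateOf, so this Text recomposes
    // automatically whenever the recognizer writes a new value into it
    Text(text = vm.textFromSpeech ?: "Press Talk and start speaking")
}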
3) Implement asking for permission
In build.gradle add the following:
implementation "com.google.accompanist:accompanist-permissions:$accompanist_version"
Then create a composable like so:
@ExperimentalPermissionsApi
@Composable
fun OpenVoiceWithPermission(
    onDismiss: () -> Unit,
    vm: ScreenViewModel,
    ctxFromScreen: Context,
    finished: () -> Unit
) {
    val voicePermissionState = rememberPermissionState(android.Manifest.permission.RECORD_AUDIO)
    val ctx = LocalContext.current

    fun newIntent(ctx: Context) {
        val intent = Intent()
        intent.action = Settings.ACTION_APPLICATION_DETAILS_SETTINGS
        val uri = Uri.fromParts(
            "package",
            BuildConfig.APPLICATION_ID, null
        )
        intent.data = uri
        intent.flags = Intent.FLAG_ACTIVITY_NEW_TASK
        ctx.startActivity(intent)
    }

    PermissionRequired(
        permissionState = voicePermissionState,
        permissionNotGrantedContent = {
            DialogCustomBox(
                onDismiss = onDismiss,
                dialogBoxState = DialogLogInState.REQUEST_VOICE,
                onRequestPermission = { voicePermissionState.launchPermissionRequest() }
            )
        },
        permissionNotAvailableContent = {
            DialogCustomBox(
                onDismiss = onDismiss,
                dialogBoxState = DialogLogInState.VOICE_OPEN_SYSTEM_SETTINGS,
                onOpenSystemSettings = { newIntent(ctx) }
            )
        }
    ) {
        startSpeechToText(vm, ctxFromScreen, finished = finished)
    }
}
For the DialogCustomBox you can create your own custom dialog as I have done, or use the standard version; this is up to you and outside the scope of this answer. A minimal placeholder is sketched below.
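If you just want something that compiles while you build your own, a minimal placeholder based on the standard Material AlertDialog could look like this (DialogLogInState and the parameter names simply mirror the calls above; everything else is a stand-in):

enum class DialogLogInState { REQUEST_VOICE, VOICE_OPEN_SYSTEM_SETTINGS }

@Composable
fun DialogCustomBox(
    onDismiss: () -> Unit,
    dialogBoxState: DialogLogInState,
    onRequestPermission: () -> Unit = {},
    onOpenSystemSettings: () -> Unit = {}
) {
    AlertDialog(
        onDismissRequest = onDismiss,
        title = { Text("Microphone permission") },
        text = {
            Text(
                when (dialogBoxState) {
                    DialogLogInState.REQUEST_VOICE ->
                        "The app needs the microphone to hear what you say."
                    DialogLogInState.VOICE_OPEN_SYSTEM_SETTINGS ->
                        "Enable the microphone permission in the system settings."
                }
            )
        },
        confirmButton = {
            TextButton(onClick = {
                when (dialogBoxState) {
                    DialogLogInState.REQUEST_VOICE -> onRequestPermission()
                    DialogLogInState.VOICE_OPEN_SYSTEM_SETTINGS -> onOpenSystemSettings()
                }
            }) { Text("OK") }
        },
        dismissButton = {
            TextButton(onClick = onDismiss) { Text("Cancel") }
        }
    )
}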
In OpenVoiceWithPermission above, if permission is granted you move automatically to this piece of code: startSpeechToText(vm, ctxFromScreen, finished = finished), which you have to implement next.
4) Implementing Speech Recognizer
fun startSpeechToText(vm: ScreenViewModel, ctx: Context, finished: () -> Unit) {
    val speechRecognizer = SpeechRecognizer.createSpeechRecognizer(ctx)
    val speechRecognizerIntent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
    speechRecognizerIntent.putExtra(
        RecognizerIntent.EXTRA_LANGUAGE_MODEL,
        RecognizerIntent.LANGUAGE_MODEL_FREE_FORM,
    )

    // Optionally I have added my mother language
    speechRecognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "el_GR")

    speechRecognizer.setRecognitionListener(object : RecognitionListener {
        override fun onReadyForSpeech(bundle: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(v: Float) {}
        override fun onBufferReceived(bytes: ByteArray?) {}
        override fun onEndOfSpeech() {
            finished()
            // changing the color of your mic icon to
            // gray to indicate it is not listening or do something you want
        }
        override fun onError(i: Int) {}
        override fun onResults(bundle: Bundle) {
            val result = bundle.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
            if (result != null) {
                // attaching the output
                // to our viewmodel
                vm.textFromSpeech = result[0]
            }
        }
        override fun onPartialResults(bundle: Bundle) {}
        override fun onEvent(i: Int, bundle: Bundle?) {}
    })
    speechRecognizer.startListening(speechRecognizerIntent)
}
With this implementation it is very customizable and you do not get the pop-up from Google, so you can inform the user that the device is listening in your own way.
5) Call the function from your composable to start listening:
@ExperimentalPermissionsApi
@Composable
fun YourScreen() {
    val ctx = LocalContext.current
    val vm: ScreenViewModel = viewModel()
    var clickToShowPermission by rememberSaveable { mutableStateOf(false) }

    if (clickToShowPermission) {
        OpenVoiceWithPermission(
            onDismiss = { clickToShowPermission = false },
            vm = vm,
            ctxFromScreen = ctx
        ) {
            // Do anything you want when the voice has finished and do
            // not forget to return clickToShowPermission to false!!
            clickToShowPermission = false
        }
    }
}
So in your code, every time you set clickToShowPermission = true, the app starts listening to what the user says.
Use registerForActivityResult(ActivityResultContract, ActivityResultCallback), passing an androidx.activity.result.contract.ActivityResultContracts.StartActivityForResult object for the ActivityResultContract.
Declare the StartActivityForResult launcher with its callback:
val startLauncher = rememberLauncherForActivityResult(
    ActivityResultContracts.StartActivityForResult()
) { result ->
    // handle the ActivityResult here
}
Then start the intent:
startLauncher.launch(intent)
A simple test example:
@Composable
fun TestStartForResult() {
    val content = LocalContext.current
    val startLauncher = rememberLauncherForActivityResult(
        contract = ActivityResultContracts.StartActivityForResult()
    ) {
        Toast.makeText(content, "Result", Toast.LENGTH_SHORT).show()
    }
    Button(onClick = {
        startLauncher.launch(Intent(content, TestActivity::class.java))
    }) {
        Text("start")
    }
}
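Tying this back to the speech question: a sketch (imports omitted, as in the snippets above; the onSpeechResult callback is only for this example) of launching the RecognizerIntent through such a launcher and reading the spoken text from the result:

@Composable
fun SpeechButton(onSpeechResult: (String) -> Unit) {
    val speechLauncher = rememberLauncherForActivityResult(
        ActivityResultContracts.StartActivityForResult()
    ) { result ->
        if (result.resultCode == Activity.RESULT_OK) {
            // The recognizer activity returns its candidates in EXTRA_RESULTS
            result.data
                ?.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS)
                ?.firstOrNull()
                ?.let(onSpeechResult)
        }
    }
    Button(onClick = {
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(
                RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
            )
            putExtra(RecognizerIntent.EXTRA_PROMPT, "Talk")
        }
        speechLauncher.launch(intent)
    }) {
        Text("Talk")
    }
}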
I'd like to make the status bar hidden, and I've managed to do it like so using the Accompanist library:
val systemUiController = rememberSystemUiController()
systemUiController.isStatusBarVisible = false
The issue is that when the app goes to the background and comes back to the foreground, this piece of code is not run and therefore the status bar is shown again. How can I fix that?
Thanks.
You can use OnLifecycleEvent from this answer.
val systemUiController = rememberSystemUiController()

OnLifecycleEvent { _, event ->
    when (event) {
        Lifecycle.Event.ON_RESUME,
        Lifecycle.Event.ON_START -> {
            systemUiController.isStatusBarVisible = false
        }
        else -> Unit
    }
}
OnLifecycleEvent:
@Composable
fun OnLifecycleEvent(onEvent: (owner: LifecycleOwner, event: Lifecycle.Event) -> Unit) {
    val eventHandler = rememberUpdatedState(onEvent)
    val lifecycleOwner = rememberUpdatedState(LocalLifecycleOwner.current)

    DisposableEffect(lifecycleOwner.value) {
        val lifecycle = lifecycleOwner.value.lifecycle
        val observer = LifecycleEventObserver { owner, event ->
            eventHandler.value(owner, event)
        }
        lifecycle.addObserver(observer)
        onDispose {
            lifecycle.removeObserver(observer)
        }
    }
}
Using Jetpack Compose, how do I perform haptic feedback for a click event? I am new to Jetpack Compose.
This is what I tried:
val hapticFeedback = LocalHapticFeedback

@Composable
fun Tab() {
    Row() {
        Icon(imageVector = icon, contentDescription = text)
        if (selected) {
            // I tried both the following ways, none are working.
            hapticFeedback.current.performHapticFeedback(
                HapticFeedbackType(10)
            )
            hapticFeedback.current.performHapticFeedback(HapticFeedbackType.TextHandleMove)
            ....
            Spacer(Modifier.width(12.dp))
            Text(text.uppercase(Locale.getDefault()))
        }
    }
}
I am able to see the text when it gets selected, but I am not getting any subtle vibration feedback.
In version rc-01 of Compose you can use only two types of Haptic Feedback: HapticFeedbackType.LongPress or HapticFeedbackType.TextHandleMove.
val haptic = LocalHapticFeedback.current
val context = LocalContext.current

Row(
    Modifier.clickable {
        haptic.performHapticFeedback(HapticFeedbackType.LongPress)
    }
)
Currently (Compose UI 1.1.0-beta03) only LongPress and TextHandleMove are supported via
val haptic = LocalHapticFeedback.current

Button(onClick = {
    haptic.performHapticFeedback(HapticFeedbackType.LongPress)
}) { ... }
as @nglauber's answer said.
I guess it is because of the multi-platform support for Compose Desktop.
There's another way, however: if you are on Android and do not need Compose Desktop compatibility, it's fairly easy to use the View API directly:
val view = LocalView.current

Button(onClick = {
    view.performHapticFeedback(HapticFeedbackConstants.KEYBOARD_TAP)
}) { ... }
The HapticFeedbackConstants class has a lot of constants.
On my phone (and other devices I've tested on) neither LongPress nor TextHandleMove makes the phone vibrate.
We worked around this before we moved to Compose like this:
import android.content.Context
import android.view.HapticFeedbackConstants
import android.view.View
import android.view.accessibility.AccessibilityManager

fun View.vibrate() = reallyPerformHapticFeedback(HapticFeedbackConstants.VIRTUAL_KEY)

fun View.vibrateStrong() = reallyPerformHapticFeedback(HapticFeedbackConstants.LONG_PRESS)

private fun View.reallyPerformHapticFeedback(feedbackConstant: Int) {
    if (context.isTouchExplorationEnabled()) {
        // Don't mess with a blind person's vibrations
        return
    }
    // Either this needs to be set to true, or android:hapticFeedbackEnabled="true" needs to be set in XML
    isHapticFeedbackEnabled = true

    // Most of the constants are off by default: for example, clicking on a button doesn't cause the phone
    // to vibrate anymore. If we still want this vibration, we have to ignore the global settings for it.
    performHapticFeedback(feedbackConstant, HapticFeedbackConstants.FLAG_IGNORE_GLOBAL_SETTING)
}

private fun Context.isTouchExplorationEnabled(): Boolean {
    // can be null during unit tests
    val accessibilityManager = getSystemService(Context.ACCESSIBILITY_SERVICE) as AccessibilityManager?
    return accessibilityManager?.isTouchExplorationEnabled ?: false
}
For now we still have to use this code and access it from Compose like in Daniele Segato's answer:
@Composable
fun VibratingButton() {
    val view = LocalView.current
    Button(onClick = {
        view.vibrate()
    }) { ... }
}
For me, this worked well for when the user presses down on the button (using Compose 1.2.0-beta01):
val interactionSource = remember { MutableInteractionSource() }
val isPressed by interactionSource.collectIsPressedAsState()
val hapticFeedback = LocalHapticFeedback.current

LaunchedEffect(key1 = isPressed) {
    if (isPressed) {
        hapticFeedback.performHapticFeedback(HapticFeedbackType.LongPress)
    }
}
Add the feedback logic inside the Modifier.clickable { ... } applied to the Row, as in the sketch below.
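A minimal sketch of that suggestion, reusing the identifiers from the question:

val haptic = LocalHapticFeedback.current

Row(
    Modifier.clickable {
        // vibrate first, then handle the tab selection
        haptic.performHapticFeedback(HapticFeedbackType.LongPress)
    }
) {
    Icon(imageVector = icon, contentDescription = text)
    Spacer(Modifier.width(12.dp))
    Text(text.uppercase(Locale.getDefault()))
}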
Does anyone know how to test the navigation drawer with Kakao? I have a simple activity and 2 fragments. I use Jetpack Navigation components and want to test it.
class FormScreen : Screen<FormScreen>() {
    val drawerLayout = KView { withId(R.id.drawer_layout) }
    val navigationView = KNavigationView { withId(R.id.nav_view) }
    val textKK = KTextView { withId(R.id.text_home) }
    val fromTV = KTextView { withId(R.id.progressBar2) }
}
@Test
fun drawNavigationFromTasksToStatistics() {
    // start up Tasks screen
    val activityScenario = ActivityScenario.launch(MainActivity::class.java)
    dataBindingIdlingResource.monitorActivity(activityScenario)

    onScreen<FormScreen> {
        drawerLayout {
            perform {
                navigateTo(R.id.nav_home)
            }
        }
        textKK.isDisplayed()
    }
}