It takes more than a minute to display a voice command on a TextView, and displaying a marker on the map based on the voice command is delayed, in Xamarin.Android
Related
I am developing a mobile app for iOS using Ionic Framework 4 with Cordova/PhoneGap. The app contains sensitive information. One of the requirements is that the screen become hidden/blank whenever the user pauses the app (i.e. when they press the Home button).
Normally iOS takes a snapshot of the current screen whenever the user presses the Home button. This snapshot needs to be blank.
Is there a plugin for Cordova/PhoneGap or Ionic Native which can be used to make the screen blank?
It appears that this cannot be accomplished through JavaScript inside the webview. The pause can be detected but no interactive code works at that point, so it is too late to blank out the screen. See http://docs.phonegap.com/en/2.9.0rc1/cordova_events_events.md.html#pause
The pause can be intercepted earlier by native iOS code, but this would require a Cordova/PhoneGap plugin. I am hoping someone here can recommend a plugin that can detect the pause and hide the screen.
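The pause-event approach described above can be sketched as follows. This is a minimal illustration assuming the standard Cordova pause/resume events; the overlay element id "privacy-overlay" is an assumption for this sketch, and (as noted) on iOS the snapshot is typically taken before this handler can repaint, so a native plugin is still needed:

```javascript
// Minimal sketch: track paused state and try to cover the UI with an
// overlay. The element id "privacy-overlay" is hypothetical.
var isPaused = false;

function onPause() {
  isPaused = true;
  // In practice this runs too late on iOS: the snapshot is already taken.
  var overlay = (typeof document !== "undefined") &&
    document.getElementById("privacy-overlay");
  if (overlay) overlay.style.display = "block";
}

function onResume() {
  isPaused = false;
  var overlay = (typeof document !== "undefined") &&
    document.getElementById("privacy-overlay");
  if (overlay) overlay.style.display = "none";
}

if (typeof document !== "undefined") {
  document.addEventListener("pause", onPause, false);
  document.addEventListener("resume", onResume, false);
}
```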
Up until Android Pie, when I sent multiple Toast messages sequentially, they were all displayed sequentially. On my API 28 emulator, when I send multiple Toasts at the same time, only the last one is shown.
Is this a behavior change in Android P?
I have a UIWebView that loads a web page which consists of an embedded Google Maps map. I want the iOS app to perform an action as long as the user is panning the Google map. (I'm using the standard method of bridging between Javascript and Objective-C code--creating an IFrame and picking it up on the other side as a page load--and that part is working fine.)
The problem I'm having is that the Google Maps API running on Safari on the iPhone does not pick up 'pan' events continuously. It only picks up one event at the very end of panning. (I figured this out using the iPhone simulator web inspector tool available within Safari.) Google Maps running in Desktop Safari, in contrast, picks up a continuous stream of events whenever the user pans the map--which is what I want.
Here's the Javascript code which runs differently between mobile and desktop Safari.
google.maps.event.addListener(map, 'bounds_changed', function () {
  console.log("This browser is noticing panning.");
});
To summarize, in desktop Safari, it prints continuously during a 'bounds changed' action, while in mobile Safari it only prints once at the end of the action.
Is this an issue with Google Maps or with iOS Safari? How can I fix it?
EDIT: While 'bounds_changed' events are not triggered continuously in iOS Safari, a 'drag' event is. However, the bounds of the map (retrieved with map.getBounds()) are not updated until after the drag motion is complete, so listening for 'drag' is not much help if I need the bounds.
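One possible workaround for the stale-bounds problem, assuming map.getCenter() is updated during the drag (this would need to be verified on the device), is to rebuild an approximate bounds box from the current center and the lat/lng span captured the last time getBounds() was reliable. boundsFromCenter below is a hypothetical helper written for this sketch, not part of the Maps API:

```javascript
// Hypothetical helper: approximate the viewport bounds from a center point
// and a previously captured lat/lng span.
function boundsFromCenter(centerLat, centerLng, latSpan, lngSpan) {
  return {
    south: centerLat - latSpan / 2,
    north: centerLat + latSpan / 2,
    west: centerLng - lngSpan / 2,
    east: centerLng + lngSpan / 2
  };
}

// Sketch of wiring it up (assumes the standard Maps JS API events and that
// lastLatSpan/lastLngSpan were saved on the most recent 'idle' event):
// google.maps.event.addListener(map, 'drag', function () {
//   var c = map.getCenter();
//   var approx = boundsFromCenter(c.lat(), c.lng(), lastLatSpan, lastLngSpan);
//   // forward `approx` over the iframe bridge described above
// });
```

This only approximates the true bounds (it ignores projection distortion as latitude changes), but it may be good enough to drive a continuous action while the user pans.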
How is it possible to wake the application from code?
I am writing a simple timer, and when the time runs out it displays a picture on the main activity. But this means that the application must stay on screen the whole time. If the user switches to another app (or simply presses Home), my Activity is no longer visible and I need to bring it back to the screen (switch back to my application), in a way similar to how the standard Android Phone or Timer app pops up.
So there actually are 2 questions:
How to get application on "top" of screen?
How to correctly display application when screen is locked?
For that you would need a service that starts your activity when that timer triggers.
You can take a look at the Android Alarm Clock source code for how to have an Activity Shown even on the lock screen: https://github.com/android/platform_packages_apps_alarmclock/blob/master/src/com/android/alarmclock/AlarmAlertFullScreen.java
Note especially lines 85 to 90, where flags are added so that the activity is allowed to be shown on the lock screen. This should of course work with Mono for Android as well.
There is also a nice answer here to your questions: Wake Android Device up
It should be fairly easy to port to Mono for Android.
I am working on a project that requires converting human voice to text. I heard that iOS 5.1 added this as a new feature.
Can anyone help me with how to integrate this new feature into my application, with a small example?
Thanks in Advance.
saroj.
You don't need to do anything to integrate it into your app - any UITextField or UITextView that the user taps into brings up the keyboard, and this has a microphone icon to the immediate left of the spacebar. The user taps this, does their talking, and taps again. The speech-to-text conversion is done by Apple's servers, so it takes a few seconds, maybe longer on a slow connection. While this is happening, three purple coloured circles are displayed in the text field to denote that speech-to-text is in progress. These are then replaced with the text returned from Apple. Note that if you have no network connection (e.g. wifi off, airplane mode, or just no mobile signal available), the button is removed from the keyboard. So just note that you have no access to speech-to-text when offline.