Why doesn't CLLocationManager deliver events to its handler? - ios

If I create the CLLocationManager instance on the UI thread, the LocationsUpdated event fires; if I create it on another thread, it does not.
Why does this happen?
There is no clue in the Xamarin or Apple documentation that CLLocationManager must be created on the UI thread.
Elsewhere the code calls locationManager.RequestWhenInUseAuthorization();
NSLocationWhenInUseUsageDescription is set in Info.plist.
// Works: the manager is created on the main (UI) thread.
private void CreateLocationManagerWorkingOption ()
{
    ExecuteOnMainThread (() => {
        locationManager = new CLLocationManager ();
    });
    locationManager.LocationsUpdated += (object sender, CLLocationsUpdatedEventArgs e) => {
        OnLocationChanged (locationManager, e.Locations [e.Locations.Length - 1]);
    };
}

// Does not work: the manager is created on an arbitrary worker thread.
private void CreateLocationManagerNotWorkingOption ()
{
    ExecuteOnSomeThread (() => {
        locationManager = new CLLocationManager ();
    });
    locationManager.LocationsUpdated += (object sender, CLLocationsUpdatedEventArgs e) => {
        OnLocationChanged (locationManager, e.Locations [e.Locations.Length - 1]);
    };
}

private void StartTrackingImpl ()
{
    ExecuteOnMainThread (() => locationManager?.StartUpdatingLocation ());
}

I think I know why.
I was experiencing the same problem: I could not get location updates working on three different devices. I tested a lot, but the CLLocationManager event was still never fired. I came across this question and finally figured out why the events were not fired on my devices.
I instantiated CLLocationManager on a thread-pool thread. Thread pools are managed by .NET, so the thread on which I instantiated CLLocationManager was finished after a while, and there was no longer anywhere to deliver the events.
Hope my answer helps!

You can create it and handle its events from any thread that has an active run loop.
From the CLLocationManagerDelegate documentation:
The methods of your delegate object are called from the thread in which you started the corresponding location services. That thread must itself have an active run loop, like the one found in your application’s main thread.
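In practice that usually means doing the construction (and, conveniently, the subscription and start calls) on the main thread. The following is a minimal sketch of that pattern in Xamarin.iOS; the class name, the logging, and bundling everything into one Start method are illustrative, not part of the original question:

using System;
using CoreLocation;
using Foundation;

public class LocationService : NSObject
{
    CLLocationManager locationManager;

    public void Start ()
    {
        // Construct and subscribe on the main thread, which always has an active run loop.
        InvokeOnMainThread (() => {
            locationManager = new CLLocationManager ();
            locationManager.LocationsUpdated += (sender, e) => {
                var latest = e.Locations [e.Locations.Length - 1];
                Console.WriteLine ($"Lat {latest.Coordinate.Latitude}, Lon {latest.Coordinate.Longitude}");
            };
            locationManager.RequestWhenInUseAuthorization ();
            locationManager.StartUpdatingLocation ();
        });
    }
}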

Related

Why is this task await leaving the UI context in Xamarin iOS?

The following has somewhat shaken my async/await-based belief system. Under Xamarin/iOS the code below fails, saying that UI-related work is being done on a non-UI thread. Adding checkpoints shows that the context does in fact switch after the async file write.
My understanding is that, lacking a ConfigureAwait(false), the following should be completely safe. I'm assuming this is a Xamarin nuance I'm unaware of, but it's difficult to see what that could be.
This same code works fine on Android and UWP.
private async void ShareButton_OnClicked(object sender, EventArgs e)
{
    if (!(BindingContext is PhotoViewModel photoViewModel))
    {
        return;
    }

    // in UI context
    var name = photoViewModel.Name ?? "temp.jpg";
    var file = Path.Combine(FileSystem.CacheDirectory, name);
    using (var stream = new FileStream(file, FileMode.Create, FileAccess.Write))
    {
        await stream.WriteAsync(photoViewModel.Data, 0, photoViewModel.Data.Length);
    }

    // not in UI context!
    // calling this causes SIGABRT: UIKit Consistency error
    await Share.RequestAsync(new ShareFileRequest(new ShareFile(file)));
}
// calling this causes SIGABRT: UIKit Consistency error
await Share.RequestAsync(new ShareFileRequest(new ShareFile(file)));
Although there is no problem on Android and UWP, iOS may not tolerate writing it this way. The line above has to be invoked on the UI thread, but it sits in the async method ShareButton_OnClicked, which by that point may have resumed off the UI thread. Try invoking it from the main thread explicitly:
await Device.InvokeOnMainThreadAsync(async () =>
{
    // invoke your code on the main thread
    await Share.RequestAsync(new ShareFileRequest(new ShareFile(file)));
});
This turned out to be a bug in Mono:
https://github.com/mono/mono/issues/16759
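As a quick way to confirm where the continuation actually resumes, a diagnostic along these lines can be dropped in (a sketch only; MainThread and FileSystem are from Xamarin.Essentials, and the byte[] data parameter stands in for photoViewModel.Data):

using System;
using System.IO;
using System.Threading.Tasks;
using Xamarin.Essentials;

static async Task WriteAndCheckContextAsync(byte[] data)
{
    var file = Path.Combine(FileSystem.CacheDirectory, "temp.jpg");

    Console.WriteLine($"Before await, on main thread: {MainThread.IsMainThread}"); // expected: True

    using (var stream = new FileStream(file, FileMode.Create, FileAccess.Write))
    {
        await stream.WriteAsync(data, 0, data.Length);
    }

    // On affected Mono versions this prints False, i.e. the continuation
    // resumed on a thread-pool thread instead of the captured UI context.
    Console.WriteLine($"After await, on main thread: {MainThread.IsMainThread}");
}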

How to get notified when network connection changes in Xamarin Forms iOS

I have a requirement in my app to display an image indicating whether the app is connected to a network or not. I was able to do it using the Connectivity Plugin by James Montemagno, but I want to implement it using the Reachability class. When I implement the Reachability class, the OnChange method never fires: when I turn the wifi on or off, OnChange is never called. Can somebody guide me on how to achieve this?
public static event EventHandler ReachabilityChanged;

static void OnChange(NetworkReachabilityFlags flags)
{
    ReachabilityChanged?.Invoke(null, EventArgs.Empty);
}
Put the code below in your PCL's App():
CrossConnectivity.Current.ConnectivityChanged += (object sender, Plugin.Connectivity.Abstractions.ConnectivityChangedEventArgs e) =>
{
    bool isInternetConnected = e.IsConnected;
};
You can then pass the connectivity status around using MessagingCenter.
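For the MessagingCenter part, a sketch of what passing the status around might look like (the message name "ConnectivityChanged", the App/page sender types, and connectivityImage are just examples, not part of the original answer):

using Xamarin.Forms;

// In App(), inside the ConnectivityChanged handler, broadcast the new status:
MessagingCenter.Send<App, bool> (this, "ConnectivityChanged", e.IsConnected);

// In any page (or view model) that shows the indicator image:
MessagingCenter.Subscribe<App, bool> (this, "ConnectivityChanged", (sender, isConnected) =>
{
    // connectivityImage is a hypothetical Image element on the page
    Device.BeginInvokeOnMainThread (() => connectivityImage.IsVisible = !isConnected);
});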
OK, let me explain how to use the Reachability.cs class.
1) Add this file to your project:
https://github.com/xamarin/ios-samples/blob/master/ReachabilitySample/reachability.cs
2) Change the namespace to the name of your project.
3) Declare these variables in your ViewController:
NetworkStatus remoteHostStatus, internetStatus, localWifiStatus;
4) Add this method to your ViewController. In place of TableView.ReloadData(), put whatever table or item you want to update.
void UpdateStatus (object sender, EventArgs e)
{
    remoteHostStatus = Reachability.RemoteHostStatus ();
    internetStatus = Reachability.InternetConnectionStatus ();
    localWifiStatus = Reachability.LocalWifiConnectionStatus ();
    TableView.ReloadData ();
}
5) In ViewDidLoad, add these two lines:
UpdateStatus (null, null);
Reachability.ReachabilityChanged += UpdateStatus;
For a better understanding of the code, download this sample and run it in Visual Studio: https://developer.xamarin.com/samples/monotouch/ReachabilitySample/
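Putting steps 3 to 5 together, a ViewController wired to Reachability might look roughly like this (a sketch that assumes the reachability.cs file above has been added to the project; the class name and the use of a UITableViewController are illustrative):

using System;
using UIKit;

public class ConnectivityViewController : UITableViewController
{
    // Step 3: fields holding the last known status
    NetworkStatus remoteHostStatus, internetStatus, localWifiStatus;

    public override void ViewDidLoad ()
    {
        base.ViewDidLoad ();

        // Step 5: refresh once, then subscribe for changes
        UpdateStatus (null, EventArgs.Empty);
        Reachability.ReachabilityChanged += UpdateStatus;
    }

    // Step 4: re-query the Reachability helpers and refresh the UI
    void UpdateStatus (object sender, EventArgs e)
    {
        remoteHostStatus = Reachability.RemoteHostStatus ();
        internetStatus = Reachability.InternetConnectionStatus ();
        localWifiStatus = Reachability.LocalWifiConnectionStatus ();

        // The reachability callback may arrive off the UI thread
        InvokeOnMainThread (() => TableView.ReloadData ());
    }
}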

MoveToRegion in xamarin forms maps behaves strangely

I am using a Map control in my app, and I need to set the visible region so that it covers all the pins.
The irony is that the same code doesn't behave the same on both platforms; iOS behaves oddly. The code below yields almost the same visible region on both platforms:
if (Device.OS == TargetPlatform.iOS)
    customMap.MoveToRegion (MapSpan.FromCenterAndRadius (customMap.CustomPins [0].Pin.Position, Distance.FromMiles (0.20)));
if (Device.OS == TargetPlatform.Android)
    customMap.MoveToRegion (MapSpan.FromCenterAndRadius (customMap.CustomPins [0].Pin.Position, Distance.FromMiles (55.0)));
Can anyone explain this? Why do I need to write the code like this?
I have found a workaround; I'm waiting for some explanation before accepting my own answer:
Device.StartTimer(TimeSpan.FromMilliseconds(500), () =>
{
    customMap.MoveToRegion(MapSpan.FromCenterAndRadius(customMap.CustomPins [0].Pin.Position, Distance.FromMiles(55.0)));
    return false;
});
I was running into a problem where MoveToRegion was being delayed (15-30 seconds) when trying to center on the user's current location using the Xamarin Geolocator Plugin, on both iOS and Android. Things work a lot better with Saket Kumar's approach with the 500 ms delay. Here is my code snippet; hope this helps someone.
private void CenterOnMe_Clicked(object sender, EventArgs e)
{
    var locator = CrossGeolocator.Current;

    // Fire-and-forget: resolve the position off the UI thread, then move the
    // map after a short delay so the map is ready to accept the new region.
    Task.Run(async () =>
    {
        var position = await locator.GetPositionAsync(TimeSpan.FromSeconds(10));
        Device.StartTimer(TimeSpan.FromMilliseconds(500), () =>
        {
            AroundMeMap.MoveToRegion(
                MapSpan.FromCenterAndRadius(
                    new Position(position.Latitude, position.Longitude), Distance.FromMiles(1)));
            return false;
        });
    });
}
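Since the original question was about covering all the pins, it is also worth noting that the region can be computed from the pin positions themselves instead of hard-coding a per-platform radius. A rough sketch, inside whatever method moves the map, assuming customMap.CustomPins exposes a Pin.Position as in the question and contains at least two pins:

using System.Linq;
using Xamarin.Forms.Maps;

// Compute a MapSpan that spans every pin, with ~20% padding.
var positions = customMap.CustomPins.Select (p => p.Pin.Position).ToList ();

double minLat = positions.Min (p => p.Latitude);
double maxLat = positions.Max (p => p.Latitude);
double minLon = positions.Min (p => p.Longitude);
double maxLon = positions.Max (p => p.Longitude);

var center = new Position ((minLat + maxLat) / 2, (minLon + maxLon) / 2);
var span = new MapSpan (center,
    (maxLat - minLat) * 1.2,   // latitude degrees
    (maxLon - minLon) * 1.2);  // longitude degrees

// For a single pin, fall back to MapSpan.FromCenterAndRadius instead.
customMap.MoveToRegion (span);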

monotouch remote control events not working

I would like to know if there is a working MonoTouch sample that shows how to receive remote control events, such as those from the headphone buttons.
I have implemented a single-view iPhone app, implemented CanBecomeFirstResponder, called BecomeFirstResponder and also UIApplication.SharedApplication.BeginReceivingRemoteControlEvents(), but I don't get any events.
Here is the code for my SingleViewViewController.
public partial class SingleViewViewController : UIViewController
{
    public SingleViewViewController () : base ("SingleViewViewController", null)
    {
    }

    public override void DidReceiveMemoryWarning ()
    {
        // Releases the view if it doesn't have a superview.
        base.DidReceiveMemoryWarning ();
        // Release any cached data, images, etc that aren't in use.
    }

    public override void ViewDidLoad ()
    {
        base.ViewDidLoad ();
        // Perform any additional setup after loading the view, typically from a nib.
        AVAudioSession audioSession = AVAudioSession.SharedInstance ();
        NSError error;
        audioSession.SetCategory (AVAudioSession.CategoryPlayback, out error);
        audioSession.SetActive (true, out error);
        this.BecomeFirstResponder ();
        UIApplication.SharedApplication.BeginReceivingRemoteControlEvents ();
    }

    public override void ViewDidUnload ()
    {
        base.ViewDidUnload ();
        // Clear any references to subviews of the main view in order to
        // allow the Garbage Collector to collect them sooner.
        //
        // e.g. myOutlet.Dispose (); myOutlet = null;
        ReleaseDesignerOutlets ();
    }

    public override bool ShouldAutorotateToInterfaceOrientation (UIInterfaceOrientation toInterfaceOrientation)
    {
        // Return true for supported orientations
        return (toInterfaceOrientation != UIInterfaceOrientation.PortraitUpsideDown);
    }

    public override bool CanBecomeFirstResponder {
        get {
            return true;
        }
    }

    public override bool CanResignFirstResponder {
        get {
            return false;
        }
    }

    public override void RemoteControlReceived (UIEvent theEvent)
    {
        base.RemoteControlReceived (theEvent);
    }
}
I spent a bit of time on this and I think I have an answer for you. My first faulty assumption was that the volume up and down controls on the remote (headphones) would register, but they don't.
I haven't managed to confirm the following except through trial and error, but it appears that you need to have an AVAudioPlayer playing something, or at least playing something when you start the AVAudioSession. Without playing something, the play/stop event gets passed to the Music app, which handles it instead.
In your code, in the ViewDidLoad method after the call to base, I added:
AVAudioPlayer player = new AVAudioPlayer(new NSUrl("Music/test.m4a", false), null);
player.PrepareToPlay();
player.Play();
If you look at chapter 27 of these samples on GitHub, you'll see an example that plays audio and handles the remote control events:
https://github.com/mattneub/Programming-iOS-Book-Examples
I wasn't able to get remote control events working without the player playing; your example matched lots of Obj-C samples, but I couldn't make it work in Xcode either.
Hope this helps.
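Once events are arriving, the empty RemoteControlReceived override in the question's controller can inspect which button was pressed via the event subtype. A small sketch (the subtype values are UIKit's UIEventSubtype; the logging is just illustrative):

public override void RemoteControlReceived (UIEvent theEvent)
{
    base.RemoteControlReceived (theEvent);

    if (theEvent.Type != UIEventType.RemoteControl)
        return;

    switch (theEvent.Subtype) {
    case UIEventSubtype.RemoteControlTogglePlayPause:
        // headphone button single-click
        Console.WriteLine ("Toggle play/pause");
        break;
    case UIEventSubtype.RemoteControlNextTrack:
        Console.WriteLine ("Next track");
        break;
    case UIEventSubtype.RemoteControlPreviousTrack:
        Console.WriteLine ("Previous track");
        break;
    }
}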

Trouble bringing a Blackberry App to Foreground

I have an app that listens in the background, and when the user clicks "send" it displays a dialog. However, I need to bring my app to the foreground so the user can answer some questions before letting the message go, but I haven't been able to do this. This is the code in my SendListener:
SendListener sl = new SendListener() {
    public boolean sendMessage(Message msg) {
        Dialog myDialog = new Dialog(Dialog.D_OK,
            "message from within SendListener",
            Dialog.OK, Bitmap.getPredefinedBitmap(Bitmap.EXCLAMATION),
            Dialog.GLOBAL_STATUS)
        {
            //Override inHolster to prevent the Dialog from being dismissed
            //when a user holsters their BlackBerry. This can
            //cause a deadlock situation as the Messages
            //application tries to save a draft of the message
            //while the SendListener is waiting for the user to
            //dismiss the Dialog.
            public void inHolster()
            {
            }
        };

        //Obtain the application triggering the SendListener.
        Application currentApp = Application.getApplication();

        //Detect if the application is a UiApplication (has a GUI).
        if (currentApp instanceof UiApplication)
        {
            //The sendMessage method is being triggered from
            //within a UiApplication.
            //Display the dialog using its show method.
            myDialog.show();
            App.requestForeground();
        }
        else
        {
            //The sendMessage method is being triggered from
            //within an application (background application).
            Ui.getUiEngine().pushGlobalScreen(myDialog, 1,
                UiEngine.GLOBAL_MODAL);
        }
        return true;
    }
};
store.addSendListener(sl);
App is an object I created above:
Application App = Application.getApplication();
I have also tried to bring the app to the foreground using its process ID, but so far no luck.
I have managed to achieve something similar to what you're describing, but the difference is that my dialogs are displayed asynchronously, which might actually be easier. So in your case,
the first thing I'd suggest you try is to get the event lock before pushing the screen:
synchronized (Application.getEventLock()) {
    final UiEngine ui = Ui.getUiEngine();
    ui.pushGlobalScreen(theScreen, 1, UiEngine.GLOBAL_MODAL);
}
I would also just create a custom class of type MainScreen and push that instead of a plain Dialog.
public class MySendListener implements SendListener {
    private UiApplication _myApp;

    public MySendListener(UiApplication myApp) {
        _myApp = myApp;
    }

    public boolean sendMessage(Message m) {
        ...
        _myApp.requestForeground();
        return true; // allow the message to be sent
    }
}
Cache your app instance inside your send listener when you construct it, and use that when sendMessage is fired.
Application.getApplication() only gets you the app of the calling thread.
