Xamarin iOS CGImageFromUrl returns null - ios

I am using Xamarin.iOS in Visual Studio on Windows to extract metadata from a photo taken with the iPhone camera.
private void GetMetaData(NSUrl url)
{
    CGImageSource myImageSource;
    myImageSource = CGImageSource.FromUrl(url, null);
    var ns = new NSDictionary();
    var imageProperties = myImageSource.CopyProperties(ns, 0);
    var gps = imageProperties.ObjectForKey(CGImageProperties.GPSDictionary) as NSDictionary;
    var lat = gps[CGImageProperties.GPSLatitude];
    var latref = gps[CGImageProperties.GPSLatitudeRef];
    var lon = gps[CGImageProperties.GPSLongitude];
    var lonref = gps[CGImageProperties.GPSLongitudeRef];
    var loc = String.Format("GPS: {0} {1}, {2} {3}", lat, latref, lon, lonref);
    Console.WriteLine(loc);
}
The URL being passed into the method is: file:///var/mobile/Media/DCIM/100APPLE/IMG_0006.JPG
CGImageSource.FromUrl(url, null) returns null and my app crashes. Can anyone explain how I can fix this?
Edit: this is how I am getting the URL for the image.
protected void Handle_FinishedPickingMedia(object sender, UIImagePickerMediaPickedEventArgs e)
{
    NSUrl url = null;
    try
    {
        void ImageData(PHAsset asset)
        {
            if (asset == null) throw new Exception("PHAsset is null");
            PHImageManager.DefaultManager.RequestImageData(asset, null, (data, dataUti, orientation, info) =>
            {
                //Console.WriteLine(data);
                Console.WriteLine(info);
                url = info.ValueForKey(new NSString("PHImageFileURLKey")) as NSUrl;
                // Call method to get MetaData from Image Url //
                GetMetaData(url);
            });
        }

As I said in your last post (Xamarin iOS camera and photos): if you capture photos from the camera, the event will not return a ReferenceUrl, but you can get the metadata with another key:
protected void Handle_FinishedPickingMedia(object sender, UIImagePickerMediaPickedEventArgs e)
{
    var metadataInfo = e.Info["UIImagePickerControllerMediaMetadata"]; // or e.MediaMetadata;
}
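For example (a sketch, not from the original answer), the values in that dictionary are grouped into the standard ImageIO sub-dictionaries, so you can read a few fields inside Handle_FinishedPickingMedia like this:
// Requires using Foundation; using ImageIO;
var metadata = e.MediaMetadata; // same dictionary as e.Info["UIImagePickerControllerMediaMetadata"]
if (metadata != null)
{
    // Camera data is grouped into sub-dictionaries such as {Exif} and {TIFF}.
    var exif = metadata.ObjectForKey(CGImageProperties.ExifDictionary) as NSDictionary;
    var tiff = metadata.ObjectForKey(CGImageProperties.TIFFDictionary) as NSDictionary;
    var dateTaken = exif?[CGImageProperties.ExifDateTimeOriginal];
    var cameraModel = tiff?[CGImageProperties.TIFFModel];
    Console.WriteLine("Taken: {0}, camera: {1}", dateTaken, cameraModel);
}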
This dictionary holds some basic information (EXIF, TIFF, and similar groups) that may be helpful to you, but it does not contain the GPS data. Because the picture is created by your own app, you should get the GPS coordinates manually with CLLocationManager:
CLLocationManager manager;
manager = new CLLocationManager();
manager.RequestWhenInUseAuthorization();
manager.LocationsUpdated += (locationSender, e) =>
{
    // get current locations
    manager.StopUpdatingLocation();
};
manager.StartUpdatingLocation();
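For completeness, here is a minimal sketch (not part of the original answer) of reading the coordinates inside that handler; CLLocationsUpdatedEventArgs exposes the fixes through e.Locations:
manager.LocationsUpdated += (locationSender, e) =>
{
    // The most recent fix is the last element of e.Locations.
    var location = e.Locations[e.Locations.Length - 1];
    Console.WriteLine("GPS: {0}, {1}", location.Coordinate.Latitude, location.Coordinate.Longitude);
    manager.StopUpdatingLocation();
};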
Please note that you need to add the keys NSLocationWhenInUseUsageDescription and NSLocationAlwaysAndWhenInUseUsageDescription to Info.plist on iOS 11; if you also want to deploy to iOS 10, add NSLocationAlwaysUsageDescription.
Done this way, there is no need to get the metadata from CGImageSource when you pick photos from the camera.

Related

How to EnableShutterSound while taking picture in android xamarin

I have a custom camera implementation, and I would like to play my own sound when a picture is taken. How do I control the shutter sound on the hardware camera so that only my camera sound plays and not the default one?
private async void TakePhotoButtonTapped(object sender, EventArgs e)
{
    Android.Hardware.Camera.CameraInfo info = new Android.Hardware.Camera.CameraInfo();
    // mCameraFacing is CameraID.
    Android.Hardware.Camera.GetCameraInfo(mCameraFacing, info);
    if (info.CanDisableShutterSound)
        camera.EnableShutterSound(false);
    else
        camera.EnableShutterSound(true);
    Bitmap image = textureView.Bitmap;
    using (var imageStream = new MemoryStream())
    {
        await image.CompressAsync(Bitmap.CompressFormat.Jpeg, 100, imageStream);
        image.Recycle();
        imageBytes = imageStream.ToArray();
        AddImages(imageBytes);
        Toast.MakeText(Android.App.Application.Context, _languageCache.Translate("Photo Captured"), ToastLength.Short).Show();
    }
    await Task.Delay(300);
}
You could use the sound-settings code below and call it from the code you use to take the photo.
I use the system ShutterClick sound for reference; you could use your own as well.
private void SoundSettings()
{
    MediaActionSound sound = new MediaActionSound();
    AudioManager meng = (AudioManager)Context.GetSystemService(Context.AudioService);
    int volume = meng.GetStreamVolume(Android.Media.Stream.Notification);
    if (volume != 0)
        sound.Play(MediaActionSoundType.ShutterClick);
}
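If you want a custom sound rather than the system ShutterClick, a minimal sketch (assuming you add your own audio file as a raw resource; Resource.Raw.my_shutter is a made-up name here) is to disable the hardware shutter sound as above and play the sound yourself with MediaPlayer:
private void PlayCustomShutterSound()
{
    // Resource.Raw.my_shutter is a hypothetical raw resource you add to the project.
    var player = Android.Media.MediaPlayer.Create(Android.App.Application.Context, Resource.Raw.my_shutter);
    if (player == null)
        return;
    // Release the player once playback finishes so it is not leaked.
    player.Completion += (s, args) => player.Release();
    player.Start();
}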

Large File upload to ASP.NET Core 3.0 Web API fails due to Request Body Too Large

I have an ASP.NET Core 3.0 Web API endpoint that I have set up to allow me to post large audio files. I have followed the following directions from MS docs to set up the endpoint.
https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.0#kestrel-maximum-request-body-size
When an audio file is uploaded to the endpoint, it is streamed to an Azure Blob Storage container.
My code works as expected locally.
When I push it to my production server in Azure App Service on Linux, the code does not work and errors with
Unhandled exception in request pipeline: System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Request body too large.
Per the advice from the article above, I have incrementally updated the Kestrel configuration with the following:
.ConfigureWebHostDefaults(webBuilder =>
{
    webBuilder.UseKestrel((ctx, options) =>
    {
        var config = ctx.Configuration;
        options.Limits.MaxRequestBodySize = 6000000000;
        options.Limits.MinRequestBodyDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.MinResponseDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.RequestHeadersTimeout =
            TimeSpan.FromMinutes(2);
    }).UseStartup<Startup>();
});
I also configured FormOptions to accept files up to 6000000000 bytes:
services.Configure<FormOptions>(options =>
{
    options.MultipartBodyLengthLimit = 6000000000;
});
And also set up the API controller with the following attributes, per advice from the article
[HttpPost("audio", Name="UploadAudio")]
[DisableFormValueModelBinding]
[GenerateAntiforgeryTokenCookie]
[RequestSizeLimit(6000000000)]
[RequestFormLimits(MultipartBodyLengthLimit = 6000000000)]
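For reference (not part of the original setup), the Kestrel per-request limit can also be cleared from inside the action itself through the request features, provided the body has not been read yet; a minimal sketch:
// Requires: using Microsoft.AspNetCore.Http.Features;
var sizeFeature = HttpContext.Features.Get<IHttpMaxRequestBodySizeFeature>();
if (sizeFeature != null && !sizeFeature.IsReadOnly)
{
    sizeFeature.MaxRequestBodySize = null; // null removes the Kestrel per-request limit
}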
Finally, here is the action itself. This giant block of code is not indicative of how I want the code to be written but I have merged it into one method as part of the debugging exercise.
public async Task<IActionResult> Audio()
{
if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
{
throw new ArgumentException("The media file could not be processed.");
}
string mediaId = string.Empty;
string instructorId = string.Empty;
try
{
// process file first
KeyValueAccumulator formAccumulator = new KeyValueAccumulator();
var streamedFileContent = new byte[0];
var boundary = MultipartRequestHelper.GetBoundary(
MediaTypeHeaderValue.Parse(Request.ContentType),
_defaultFormOptions.MultipartBoundaryLengthLimit
);
var reader = new MultipartReader(boundary, Request.Body);
var section = await reader.ReadNextSectionAsync();
while (section != null)
{
var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(
section.ContentDisposition, out var contentDisposition);
if (hasContentDispositionHeader)
{
if (MultipartRequestHelper
.HasFileContentDisposition(contentDisposition))
{
streamedFileContent =
await FileHelpers.ProcessStreamedFile(section, contentDisposition,
_permittedExtensions, _fileSizeLimit);
}
else if (MultipartRequestHelper
.HasFormDataContentDisposition(contentDisposition))
{
var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name).Value;
var encoding = FileHelpers.GetEncoding(section);
if (encoding == null)
{
return BadRequest($"The request could not be processed: Bad Encoding");
}
using (var streamReader = new StreamReader(
section.Body,
encoding,
detectEncodingFromByteOrderMarks: true,
bufferSize: 1024,
leaveOpen: true))
{
// The value length limit is enforced by
// MultipartBodyLengthLimit
var value = await streamReader.ReadToEndAsync();
if (string.Equals(value, "undefined",
StringComparison.OrdinalIgnoreCase))
{
value = string.Empty;
}
formAccumulator.Append(key, value);
if (formAccumulator.ValueCount >
_defaultFormOptions.ValueCountLimit)
{
return BadRequest($"The request could not be processed: Key Count limit exceeded.");
}
}
}
}
// Drain any remaining section body that hasn't been consumed and
// read the headers for the next section.
section = await reader.ReadNextSectionAsync();
}
var form = formAccumulator;
var file = streamedFileContent;
var results = form.GetResults();
instructorId = results["instructorId"];
string title = results["title"];
string firstName = results["firstName"];
string lastName = results["lastName"];
string durationInMinutes = results["durationInMinutes"];
//mediaId = await AddInstructorAudioMedia(instructorId, firstName, lastName, title, Convert.ToInt32(duration), DateTime.UtcNow, DateTime.UtcNow, file);
string fileExtension = "m4a";
// Generate Container Name - InstructorSpecific
string containerName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{instructorId}";
string contentType = "audio/mp4";
FileType fileType = FileType.audio;
string authorName = $"{firstName} {lastName}";
string authorShortName = $"{firstName[0]}{lastName}";
string description = $"{authorShortName} - {title}";
long duration = (Convert.ToInt32(durationInMinutes) * 60000);
// Generate new filename
string fileName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{Guid.NewGuid()}";
DateTime recordingDate = DateTime.UtcNow;
DateTime uploadDate = DateTime.UtcNow;
long blobSize = long.MinValue;
try
{
// Update file properties in storage
Dictionary<string, string> fileProperties = new Dictionary<string, string>();
fileProperties.Add("ContentType", contentType);
// update file metadata in storage
Dictionary<string, string> metadata = new Dictionary<string, string>();
metadata.Add("author", authorShortName);
metadata.Add("tite", title);
metadata.Add("description", description);
metadata.Add("duration", duration.ToString());
metadata.Add("recordingDate", recordingDate.ToString());
metadata.Add("uploadDate", uploadDate.ToString());
var fileNameWExt = $"{fileName}.{fileExtension}";
var blobContainer = await _cloudStorageService.CreateBlob(containerName, fileNameWExt, "audio");
try
{
MemoryStream fileContent = new MemoryStream(streamedFileContent);
fileContent.Position = 0;
using (fileContent)
{
await blobContainer.UploadFromStreamAsync(fileContent);
}
}
catch (StorageException e)
{
if (e.RequestInformation.HttpStatusCode == 403)
{
return BadRequest(e.Message);
}
else
{
return BadRequest(e.Message);
}
}
try
{
foreach (var key in metadata.Keys.ToList())
{
blobContainer.Metadata.Add(key, metadata[key]);
}
await blobContainer.SetMetadataAsync();
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
blobSize = await StorageUtils.GetBlobSize(blobContainer);
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
Media media = Media.Create(string.Empty, instructorId, authorName, fileName, fileType, fileExtension, recordingDate, uploadDate, ContentDetails.Create(title, description, duration, blobSize, 0, new List<string>()), StateDetails.Create(StatusType.STAGED, DateTime.MinValue, DateTime.UtcNow, DateTime.MaxValue), Manifest.Create(new Dictionary<string, string>()));
// upload to MongoDB
if (media != null)
{
var mapper = new Mapper(_mapperConfiguration);
var dao = mapper.Map<ContentDAO>(media);
try
{
await _db.Content.InsertOneAsync(dao);
}
catch (Exception)
{
mediaId = string.Empty;
}
mediaId = dao.Id.ToString();
}
else
{
// metadata wasn't stored, remove blob
await _cloudStorageService.DeleteBlob(containerName, fileName, "audio");
return BadRequest($"An issue occurred during media upload: rolling back storage change");
}
if (string.IsNullOrEmpty(mediaId))
{
return BadRequest($"Could not add instructor media");
}
}
catch (Exception ex)
{
return BadRequest(ex.Message);
}
var result = new { MediaId = mediaId, InstructorId = instructorId };
return Ok(result);
}
I reiterate, this all works great locally. I do not run it in IISExpress, I run it as a console app.
I submit large audio files via my SPA app and Postman and it works perfectly.
I am deploying this code to an Azure App Service on Linux (as a Basic B1).
Since the code works in my local development environment, I am at a loss as to what my next steps should be. I have refactored this code a few times, but I suspect the problem is environment-related.
I cannot find anything that mentions the tier of the App Service Plan as the culprit, so before I spend more money I wanted to see if anyone here has encountered this challenge and can provide advice.
UPDATE: I attempted upgrading to a Production App Service Plan to see if there was an undocumented gate for incoming traffic. Upgrading didn't work either.
Thanks in advance.
-A
Currently, as of 11/2019, there is a limitation with the Azure App Service for Linux. Its CORS functionality is enabled by default and cannot be disabled, and it has a file size limitation that doesn't appear to be overridden by any of the published Kestrel configurations. The solution is to move the Web API app to an Azure App Service for Windows, where it works as expected.
I am sure there is some way to get around it if you know the magic combination of configurations, server settings, and CLI commands but I need to move on with development.

iOS Xamarin Attaching a picture to the Email Body - Null Exception

This is my first Xamarin.iOS app: it takes a picture, extracts the metadata from the picture, attaches the picture to an email body, and sends the email.
// (1.) Take a photo with the Camera //
partial void BtnCamera_TouchUpInside(UIButton sender)
{
    UIImagePickerController imagePicker = new UIImagePickerController();
    imagePicker.PrefersStatusBarHidden();
    imagePicker.SourceType = UIImagePickerControllerSourceType.Camera;
    // handle saving picture and extracting meta-data from picture //
    imagePicker.FinishedPickingMedia += Handle_FinishedPickingMedia;
    // present //
    PresentViewController(imagePicker, true, () => { });
}
// (2.) Saves the image to the phone and then extracts metadata //
protected void Handle_FinishedPickingMedia(object sender, UIImagePickerMediaPickedEventArgs e)
{
    //NSUrl url = null;
    try
    {
        #region Save Image and Get Meta-data
        // Save Image before processing for meta-data //
        SaveImagetoPhone(e);
        // Get meta-data from saved image //
        GetImageMetaData(e);
// (3.) Save picture to the phone and extract photo url //
private static void SaveImagetoPhone(UIImagePickerMediaPickedEventArgs e)
{
    NSUrl url = null;
    void ImageData(PHAsset asset)
    {
        if (asset == null)
            throw new Exception("PHAsset is null");
        PHImageManager.DefaultManager.RequestImageData(asset, null, (data, dataUti, orientation, info) =>
        {
            url = info.ValueForKey(new NSString("PHImageFileURLKey")) as NSUrl;
            PhotoDataClass._file = url.Path;
        });
    }
    PHAsset phAsset;
    if (e.ReferenceUrl == null)
    {
        e.OriginalImage?.SaveToPhotosAlbum((image, error) =>
        {
            if (error == null)
            {
                var options = new PHFetchOptions
                {
                    FetchLimit = 1,
                    SortDescriptors = new[] { new NSSortDescriptor("creationDate", true) }
                };
                phAsset = PHAsset.FetchAssets(options).LastOrDefault() as PHAsset;
                ImageData(phAsset);
            }
        });
    }
    else
    {
        phAsset = PHAsset.FetchAssets(new[] { e.ReferenceUrl }, null).FirstOrDefault() as PHAsset;
        ImageData(phAsset);
    }
}
At this point I have successfully extracted the URL for the file:
"/var/mobile/Media/DCIM/100APPLE/IMG_0036.JPG"
However, when I run my email function, even though the URL is populated, I get a null exception with no explanation that I can see.
// (4.) Email functionality //
https://developer.xamarin.com/recipes/ios/shared_resources/email/send_an_email/
partial void BtnMessageDone_TouchUpInside(UIButton sender)
{
    MFMailComposeViewController mailController;
    if (MFMailComposeViewController.CanSendMail)
    {
        StringBuilder htmlBodyMail = FormatEmailBody();
        mailController = new MFMailComposeViewController();
        // do mail operations here
        mailController.SetToRecipients(new string[] { "xxx.yy#email.com" });
        mailController.SetSubject("mail test");
        mailController.SetMessageBody(htmlBodyMail.ToString(), false);
        UIImage img = UIImage.FromFile(PhotoDataClass._file);
        mailController.AddAttachmentData(img.AsJPEG(), "image/JPG", "Image.JPG");
        mailController.Finished += (object s, MFComposeResultEventArgs args) =>
        {
            Console.WriteLine(args.Result.ToString());
            args.Controller.DismissViewController(true, null);
        };
        this.PresentViewController(mailController, true, null);
    }
}
The null exception occurs on this line inside the email functionality:
mailController.AddAttachmentData(img.AsJPEG(), "image/JPG", "Image.JPG");
From your code, img will be null after UIImage img = UIImage.FromFile(PhotoDataClass._file);, so the null exception is thrown on the next line.
You should use PHAsset to retrieve the image from the system photo library instead of using the URL directly. Store the asset's LocalIdentifier in your PhotoDataClass class, then fetch the PHAsset through this identifier when you want to use it later.
Modify the ImageData method:
void ImageData(PHAsset asset)
{
    if (asset == null)
        throw new Exception("PHAsset is null");
    PhotoDataClass.AssetIdentifier = asset.LocalIdentifier;
}
Then BtnMessageDone_TouchUpInside method can be:
MFMailComposeViewController mailController;
if (MFMailComposeViewController.CanSendMail)
{
    var results = PHAsset.FetchAssetsUsingLocalIdentifiers(new string[] { PhotoDataClass.AssetIdentifier }, null);
    foreach (PHAsset asset in results)
    {
        if (asset.LocalIdentifier == PhotoDataClass.AssetIdentifier)
        {
            PHImageManager.DefaultManager.RequestImageData(asset, null, (data, dataUti, orientation, info) =>
            {
                StringBuilder htmlBodyMail = FormatEmailBody();
                mailController = new MFMailComposeViewController();
                // do mail operations here
                ...
                UIImage img = UIImage.LoadFromData(data);
                mailController.AddAttachmentData(img.AsJPEG(), "image/JPG", "Image.JPG");
                ...
            });
        }
    }
}
Moreover, regarding "that LastOrDefault() is not what I think it is":
This is because of your NSSortDescriptor. Use SortDescriptors = new[] { new NSSortDescriptor("creationDate", false) }, or phAsset = PHAsset.FetchAssets(options).FirstOrDefault() as PHAsset;, to get the photo you just captured with the camera.
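A minimal sketch of the corrected fetch (creationDate descending, so the first result is the photo you just captured):
var options = new PHFetchOptions
{
    FetchLimit = 1,
    // ascending: false => newest asset first
    SortDescriptors = new[] { new NSSortDescriptor("creationDate", false) }
};
var newestAsset = PHAsset.FetchAssets(options).FirstOrDefault() as PHAsset;
ImageData(newestAsset);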

Xamarin iOS camera and photos

I am taking a picture with the iOS camera and trying to extract metadata from the image. This is my code:-
partial void BtnCamera_TouchUpInside(UIButton sender)
{
    UIImagePickerController imagePicker = new UIImagePickerController();
    imagePicker.PrefersStatusBarHidden();
    imagePicker.SourceType = UIImagePickerControllerSourceType.Camera;
    // handle saving picture and extracting meta-data from picture //
    imagePicker.FinishedPickingMedia += Handle_FinishedPickingMedia;
    // present //
    PresentViewController(imagePicker, true, () => { });
}
protected void Handle_FinishedPickingMedia(object sender, UIImagePickerMediaPickedEventArgs e)
{
    try
    {
        // determine what was selected, video or image
        bool isImage = false;
        switch (e.Info[UIImagePickerController.MediaType].ToString())
        {
            case "public.image":
                isImage = true;
                break;
        }
        // get common info
        NSUrl referenceURL = e.Info[new NSString("UIImagePickerControllerReferenceURL")] as NSUrl;
        if (referenceURL != null)
            Console.WriteLine("Url:" + referenceURL.ToString());
I am able to start the camera and take the picture, but when I tap 'Use Photo' the referenceURL comes back as null. How can I get the URL so that I can extract the GPS coordinates and other attributes of the photo?
I had a tremendous amount of trouble with the URL. It can be a file, it can be a web URL, and it acts differently on every device. My app crashed and burned so many times with my test group. I finally found a way to get the metadata from the image data. There are multiple ways to get the DateTaken, Width and Height, as well as the GPS coordinates. In addition, I needed the camera manufacturer and the model.
string dateTaken = string.Empty;
string lat = string.Empty;
string lon = string.Empty;
string width = string.Empty;
string height = string.Empty;
string mfg = string.Empty;
string model = string.Empty;
PHImageManager.DefaultManager.RequestImageData(asset, options, (data, dataUti, orientation, info) =>
{
    dateTaken = asset.CreationDate.ToString();
    // GPS Coordinates
    var coord = asset.Location?.Coordinate;
    if (coord != null)
    {
        lat = asset.Location?.Coordinate.Latitude.ToString();
        lon = asset.Location?.Coordinate.Longitude.ToString();
    }
    UIImage img = UIImage.LoadFromData(data);
    if (img.CGImage != null)
    {
        width = img.CGImage?.Width.ToString();
        height = img.CGImage?.Height.ToString();
    }
    using (CGImageSource imageSource = CGImageSource.FromData(data, null))
    {
        if (imageSource != null)
        {
            var ns = new NSDictionary();
            var imageProperties = imageSource.CopyProperties(ns, 0);
            if (imageProperties != null)
            {
                width = ReturnStringIfNull(imageProperties[CGImageProperties.PixelWidth]);
                height = ReturnStringIfNull(imageProperties[CGImageProperties.PixelHeight]);
                var tiff = imageProperties.ObjectForKey(CGImageProperties.TIFFDictionary) as NSDictionary;
                if (tiff != null)
                {
                    mfg = ReturnStringIfNull(tiff[CGImageProperties.TIFFMake]);
                    model = ReturnStringIfNull(tiff[CGImageProperties.TIFFModel]);
                    //dateTaken = ReturnStringIfNull(tiff[CGImageProperties.TIFFDateTime]);
                }
            }
        }
    }
});
The little helper function
private string ReturnStringIfNull(NSObject inObj)
{
    if (inObj == null) return String.Empty;
    return inObj.ToString();
}
You can request a PHAsset from the reference Url and that will contain some metadata. You can request the image data to obtain more.
Note: If you need the full EXIF, you need to check that the image is actually on the device (it could be iCloud-based), download it if needed, and then load the image data with the ImageIO framework (lots of SO postings cover this; a sketch follows the code below).
public void ImagePicker_FinishedPickingMedia(object sender, UIImagePickerMediaPickedEventArgs e)
{
    void ImageData(PHAsset asset)
    {
        if (asset == null) throw new Exception("PHAsset is null");
        PHImageManager.DefaultManager.RequestImageData(asset, null, (data, dataUti, orientation, info) =>
        {
            Console.WriteLine(data);
            Console.WriteLine(info);
        });
    }
    PHAsset phAsset;
    if (e.ReferenceUrl == null)
    {
        e.OriginalImage?.SaveToPhotosAlbum((image, error) =>
        {
            if (error == null)
            {
                var options = new PHFetchOptions
                {
                    FetchLimit = 1,
                    SortDescriptors = new[] { new NSSortDescriptor("creationDate", true) }
                };
                phAsset = PHAsset.FetchAssets(options).FirstOrDefault() as PHAsset;
                ImageData(phAsset);
            }
        });
    }
    else
    {
        phAsset = PHAsset.FetchAssets(new[] { e.ReferenceUrl }, null).FirstOrDefault() as PHAsset;
        ImageData(phAsset);
    }
}
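Regarding the iCloud note above, a minimal sketch (not part of the original answer) that lets Photos download the original before handing you the data is to pass a PHImageRequestOptions with NetworkAccessAllowed:
var requestOptions = new PHImageRequestOptions
{
    // Allow Photos to fetch the full-size original from iCloud if it is not on the device.
    NetworkAccessAllowed = true
};
PHImageManager.DefaultManager.RequestImageData(asset, requestOptions, (data, dataUti, orientation, info) =>
{
    // data now holds the original bytes, suitable for CGImageSource.FromData / EXIF parsing.
    Console.WriteLine(data?.Length);
});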
Note: Make sure you have requested runtime photo library authorization (PHPhotoLibrary.RequestAuthorization) and have set the Privacy - Photo Library Usage Description string in your Info.plist to avoid a nasty privacy crash.
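A minimal sketch of that authorization request (call it before presenting the picker):
PHPhotoLibrary.RequestAuthorization(status =>
{
    if (status != PHAuthorizationStatus.Authorized)
        Console.WriteLine("Photo library access was not granted.");
});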

blackberry gps getting current Location Address?

Hi, I am new to BlackBerry application development. I want to get the current GPS location details. I have successfully got the latitude and longitude, but I don't know how to get the current address. Can anyone give me a sample? Thanks in advance.
What you are looking for is called "Reverse Geocoding." RIM has an example of exactly how to do this on the BB platform (well, one way to do it anyway).
You can use the Google Maps geocoding service to resolve your issue; it will return JSON (or XML, according to your choice) if you request it with a latitude and longitude.
The url is: http://maps.google.com/maps/geo?json&ll="+_lattitude+","+_longitude
This returns all the details for the given latitude and longitude in JSON format, and you then have to parse the JSON returned by Google Maps.
You can use the below code:
private void getLocationFromGoogleMaps() {
    try {
        StreamConnection s = null;
        InputStream iStream = null;
        s = (StreamConnection) javax.microedition.io.Connector.open(
                "http://maps.google.com/maps/geo?json&ll=" + _lattitude + "," + _longitude
                + getConnectionStringForGoogleMap()); // &deviceside=false&ConnectionType=mds-public
        HttpConnection con = (HttpConnection) s;
        con.setRequestMethod(HttpConnection.GET);
        con.setRequestProperty("Content-Type", "//text");
        int status = con.getResponseCode();
        if (status == HttpConnection.HTTP_OK) {
            iStream = s.openInputStream();
            int len = (int) con.getLength();
            String result = "";
            int k;
            // Read the response one byte at a time; read() returns -1 at end of stream.
            while ((k = iStream.read()) != -1) {
                result = result + (char) k;
            }
            try {
                JSONObject jsonObjectMapData = new JSONObject(result);
                JSONArray jsonaryPlaceMark = jsonObjectMapData.getJSONArray("Placemark");
                JSONObject address = jsonaryPlaceMark.getJSONObject(0);
                String placeName = address.getString("address");
                if (placeName != null)
                    lblLoc.setText(address.getString("address"));
                else
                    lblLoc.setText("Location information currently unavailable");
            } catch (Exception e) {
                lblLoc.setText("Location information currently unavailable");
            }
        }
    } catch (Exception e) {
        System.out.println(e);
        lblLoc.setText("Location information currently unavailable");
    }
}
Note: I used FacebookBlackBerrySDK-0.3.5-src to parse the JSON; you can use XML also.
