Android Attach Image between text with SpannableStringBuilder - android-edittext

Sorry for my English. I want to attach an image from the gallery and show it in an EditText with SpannableStringBuilder. I can successfully get the image path from the gallery, and after a picture is selected it is displayed. But when I attach a second image, the new picture is not shown and the first attached image turns back into plain text. Can anyone give me a solution? Big thanks.
This is my code:
private void IntentPict() {
Intent intent = new Intent();
intent.setType("image/*");
intent.setAction(Intent.ACTION_GET_CONTENT);
startActivityForResult(Intent.createChooser(intent, "Select File"),MY_INTENT_CLICK);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK){
if (requestCode == MY_INTENT_CLICK){
if (null == data) return;
String selectedImagePath;
Uri selectedImageUri = data.getData();
//there is no problem with get path
selectedImagePath = ImageFilePath.getPath(getApplicationContext(), selectedImageUri);
// pathImg is an ArrayList<String>
pathImg.add(selectedImagePath);
addImageBetweentext(pathImg);
}
}
}
private void addImageBetweentext(ArrayList<String> listPath) {
SpannableStringBuilder ssb = new SpannableStringBuilder();
for(int i=0;i<listPath.size();i++){
String path = listPath.get(i);
Drawable drawable = Drawable.createFromPath(path);
drawable.setBounds(0, 0, 400, 400);
ssb.append(mBodyText+"\n");
ssb.setSpan(new ImageSpan(drawable), ssb.length()-(path.length()+1),
ssb.length()-1, Spannable.SPAN_EXCLUSIVE_EXCLUSIVE);
}
mBodyText.setText(ssb);
}

Here's something that should work. I am loading resource drawables instead, just because it was quicker for me to validate, but I tried to keep as much as possible of your original flow and variable names:
Drawable[] drawables = new Drawable[2];
drawables[0] = getResources().getDrawable(R.drawable.img1);
drawables[1] = getResources().getDrawable(R.drawable.img2);
SpannableStringBuilder ssb = new SpannableStringBuilder();
for (int i = 0; i < drawables.length; i++) {
Drawable drawable = drawables[i];
drawable.setBounds(0, 0, 400, 400);
String newStr = drawable.toString() + "\n";
ssb.append(newStr);
ssb.setSpan(
new ImageSpan(drawable),
ssb.length() - newStr.length(),
ssb.length() - "\n".length(),
Spannable.SPAN_INCLUSIVE_EXCLUSIVE);
}
mBodyText.setText(ssb);
Long story short: each span's start position needs to be the builder's current length minus the length of the string you just appended.
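Applied back to your original path-based flow, the loop would look roughly like this (a sketch only, assuming mBodyText is the EditText from your code and that Drawable.createFromPath can read each path):
private void addImageBetweentext(ArrayList<String> listPath) {
    SpannableStringBuilder ssb = new SpannableStringBuilder();
    for (String path : listPath) {
        Drawable drawable = Drawable.createFromPath(path);
        if (drawable == null) continue; // unreadable file, skip it
        drawable.setBounds(0, 0, 400, 400);
        // Append a placeholder line for this image, then span exactly that placeholder.
        String placeholder = path + "\n";
        ssb.append(placeholder);
        ssb.setSpan(new ImageSpan(drawable),
                ssb.length() - placeholder.length(), // start: where this placeholder begins
                ssb.length() - 1,                    // end: just before the trailing newline
                Spannable.SPAN_EXCLUSIVE_EXCLUSIVE);
    }
    mBodyText.setText(ssb);
}
The key change from your version is that the appended placeholder text and the spanned range are derived from the same string, so the second image's span no longer lands on top of the first one.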

Related

Share file failed because the file names are too long

I used the code below to pop up the system share sheet.
public void Share(List<String> filepath)
{
NSObject[] activityItems = new NSObject[filepath.Count];
int i = 0;
foreach (var item in filepath)
{
activityItems[i] = NSUrl.CreateFileUrl(item, false, null);
i++;
}
var activityController = new UIActivityViewController(activityItems, null);
if (UIDevice.CurrentDevice.UserInterfaceIdiom == UIUserInterfaceIdiom.Phone)
{
// Phone
if (UIApplication.SharedApplication.KeyWindow.RootViewController != null)
UIApplication.SharedApplication.KeyWindow.RootViewController.PresentViewController(
activityController, true, null);
}
else
{
// Tablet
var popup = new UIPopoverController(activityController);
if (UIApplication.SharedApplication.KeyWindow.RootViewController != null)
{
UIView view = UIApplication.SharedApplication.KeyWindow.RootViewController.View;
CGRect rect = new CGRect(view.Frame.Width / 2, view.Frame.Height, 50, 50);
popup.PresentFromRect(rect, view, UIPopoverArrowDirection.Any, true);
}
}
}
Test results:
- Sharing via the Mail app: OK.
- Sharing via AirDrop with a normal file name length: OK.
- Sharing via AirDrop with a long file name: FAILED.
For some reason I cannot shorten the file names, so how can I solve this problem?
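No accepted fix is shown in this thread. One workaround, sketched here purely as an assumption (ShortenForShare is a hypothetical helper, not from the original post), is to copy each file to a temporary location under a shorter name and hand the copies to UIActivityViewController instead:
// Requires System.IO and System.Collections.Generic.
private List<string> ShortenForShare(List<string> filepath)
{
    var shortened = new List<string>();
    int index = 0;
    foreach (var original in filepath)
    {
        // e.g. ".../tmp/share_0.pdf" - a short, predictable name
        string target = Path.Combine(Path.GetTempPath(), "share_" + index++ + Path.GetExtension(original));
        File.Copy(original, target, true);
        shortened.Add(target);
    }
    return shortened;
}
// Usage: Share(ShortenForShare(filepath));
Whether this is acceptable depends on whether the receiving side needs the original file names.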

How to pick an image from the gallery, crop it, and save it as a profile image in Xamarin Android

Hello, please help me with this. I want to select a picture from the gallery, crop it, and save it to a folder.
private void ProfilePic_Click(object sender, EventArgs e)
{
Intent = new Intent();
Intent.SetType("image/*");
Intent.SetAction(Intent.ActionGetContent);
StartActivityForResult(Intent.CreateChooser(Intent, "EZ-Gift Profile Pic"), PickImageId);
}
protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
if ((requestCode == PickImageId) && (resultCode == Result.Ok) && (data != null))
{
Android.Net.Uri uri = data.Data;
//Toast.MakeText(this, path, ToastLength.Long).Show();
Toast.MakeText(this, uri.ToString(), ToastLength.Long).Show();
ProfilePic.SetImageURI(uri);
string path = GetPathToImage(data.Data);
edit = prefs.Edit();
edit.PutString("ProfilePicUri", uri.ToString());
Toast.MakeText(this, uri.ToString(), ToastLength.Long).Show();
Toast.MakeText(this, path, ToastLength.Long).Show();
}
}
private string GetPathToImage(Android.Net.Uri contentURI)
{
ICursor cursor = ContentResolver.Query(contentURI, null, null, null, null);
cursor.MoveToFirst();
string documentId = cursor.GetString(0);
documentId = documentId.Split(':')[1];
cursor.Close();
cursor = ContentResolver.Query(
Android.Provider.MediaStore.Images.Media.ExternalContentUri,
null, MediaStore.Images.Media.InterfaceConsts.Id + " = ? ", new[] { documentId }, null);
cursor.MoveToFirst();
string path = cursor.GetString(cursor.GetColumnIndex(MediaStore.Images.Media.InterfaceConsts.Data));
cursor.Close();
return path;
}
Add the code below to crop the image, and modify your OnActivityResult as shown here. Hope this helps.
protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
if ((requestCode == PickImageId) && (resultCode == Result.Ok) && (data != null))
{
Android.Net.Uri uri = data.Data;
//Toast.MakeText(this, path, ToastLength.Long).Show();
Toast.MakeText(this, uri.ToString(), ToastLength.Long).Show();
// It will crop the image accordingly.
cropPicture(uri);
string path = GetPathToImage(data.Data);
Toast.MakeText(this, uri.ToString(), ToastLength.Long).Show();
Toast.MakeText(this, path, ToastLength.Long).Show();
}
else if(requestCode == CROP_PIC)
{
// Get your cropped image here using Bundle and save the bitmap in particular location.
}
}
private void cropPicture(Android.Net.Uri picUri)
{
try
{
// call the standard crop action intent (the user device may not
// support it)
Intent cropIntent = new Intent("com.android.camera.action.CROP");
// indicate image type and Uri
cropIntent.SetDataAndType(picUri, "image/*");
// set crop properties
cropIntent.PutExtra("crop", "true");
// indicate aspect of desired crop
cropIntent.PutExtra("aspectX", 2);
cropIntent.PutExtra("aspectY", 1);
// indicate output X and Y
cropIntent.PutExtra("outputX", 256);
cropIntent.PutExtra("outputY", 256);
// retrieve data on return
cropIntent.PutExtra("return-data", true);
// start the activity - we handle returning in onActivityResult
StartActivityForResult(cropIntent, CROP_PIC);
}
// respond to users whose devices do not support the crop action
catch (ActivityNotFoundException anfe)
{
Toast toast = Toast
.MakeText(this, "This device doesn't support the crop action!", ToastLength.Long);
toast.Show();
}
}
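The CROP_PIC branch above only has a comment where the saving step goes. A minimal sketch of that branch, assuming the crop activity honors the "return-data" extra and hands back the cropped image as a small Bitmap under the "data" key (profilePicPath below is just an example location, not from the original answer):
else if (requestCode == CROP_PIC && resultCode == Result.Ok && data != null)
{
    // "return-data" was set to true, so the cropped image comes back
    // as a Bitmap inside the extras bundle.
    var extras = data.Extras;
    if (extras != null)
    {
        var cropped = (Android.Graphics.Bitmap)extras.GetParcelable("data");
        ProfilePic.SetImageBitmap(cropped);
        // Persist it wherever the profile picture should live (example path only).
        string profilePicPath = System.IO.Path.Combine(
            Android.OS.Environment.ExternalStorageDirectory.AbsolutePath, "profile_pic.png");
        using (var stream = System.IO.File.Create(profilePicPath))
        {
            cropped.Compress(Android.Graphics.Bitmap.CompressFormat.Png, 100, stream);
        }
    }
}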

How to perform Image comparison in Appium?

I need some help.
I am using Appium and have a scenario where I need to capture an image from the app under test, keep it as a baseline image, then capture another image from the same app and compare the two images.
Can anyone guide me on how to do this, please?
This is how you do it:
@Test(dataProvider = "search")
public void eventsSearch(String data) throws InterruptedException, IOException {
Thread.sleep(2000);
boolean status = false;
//WebElement img = driver.findElementByClassName("android.widget.ImageView");
//take screen shot
File screen = ((TakesScreenshot) driver)
.getScreenshotAs(OutputType.FILE);
// for (String event : events) {
driver.findElementById("your id")
.sendKeys(data);
driver.hideKeyboard();
List<WebElement> list = driver
.findElementsByXPath("//*[@class='android.widget.TextView' and @index='1']");
System.out.println(list.size());
int i = 0;
for (WebElement el : list) {
String eventList = el.getText();
System.out.println("events" + " " + eventList);
if (eventList.equals("gg")) {
status = true;
break;
}
i++;
}
//capture image of searched contact icon
List<WebElement> imageList = driver.findElementsByXPath("//*[@class='android.widget.ImageView' and @index='0']");
System.out.println(imageList.size());
System.out.println(i);
WebElement image = imageList.get(1);
Point point = image.getLocation();
//get element dimension
int width = image.getSize().getWidth();
int height = image.getSize().getHeight();
BufferedImage img = ImageIO.read(screen);
BufferedImage dest = img.getSubimage(point.getX(), point.getY(), width,
height);
ImageIO.write(dest, "png", screen);
File file = new File("Menu.png");
FileUtils.copyFile(screen, file);
//verify images
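// NOTE: as written, both arguments below point to the same file, so the comparison
// always passes; in a real run the first argument would be the previously saved baseline image.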
verifyImage("Menu.png", "Menu.png" );
//Assert.assertTrue(status, "FAIL Event doesn't match" + data);
}
@DataProvider(name = "search")
public Object[][] searchData() {
return new Object[][] { { "gg" } };
}
public void verifyImage(String image1, String image2) throws IOException{
File fileInput = new File(image1);
File fileOutPut = new File(image2);
BufferedImage bufileInput = ImageIO.read(fileInput);
DataBuffer dafileInput = bufileInput.getData().getDataBuffer();
int sizefileInput = dafileInput.getSize();
BufferedImage bufileOutPut = ImageIO.read(fileOutPut);
DataBuffer dafileOutPut = bufileOutPut.getData().getDataBuffer();
int sizefileOutPut = dafileOutPut.getSize();
Boolean matchFlag = true;
if(sizefileInput == sizefileOutPut) {
for(int j=0; j<sizefileInput; j++) {
if(dafileInput.getElem(j) != dafileOutPut.getElem(j)) {
matchFlag = false;
break;
}
}
}
else
matchFlag = false;
Assert.assertTrue(matchFlag, "Images are not same");
}

Printing from Win RT Application (Windows 8.1 App)

I am developing a POS application for Windows 8.1 (Universal App).
Order receipts are printed from the application, and printing itself works.
Printer - EPSON TM-U220D
Input - A grid is created programmatically with dynamic content in the ViewModel; this grid is the input for the printer.
Output - In the print preview (attached pic 1) everything looks good, but when the receipt is actually printed, content is cut off at the end (attached pic 2).
Observations -
If I print a normal text file manually using the print command (right-click the file, then Print), all content is printed perfectly.
If I print that SAME content from the application via the dynamically created grid, a small font size prints fine, but with a slightly bigger font size the content is cut off again.
Tried -
Optimized the grid-creation code by specifying heights.
Questions -
If the preview is fine, why isn't the output?
Has anyone tried using ePOS-Print_SDK_141020E for a Windows Store app?
Generate Dynamic Grid Code
private void AddRows(Grid grid, int count)
{
for (int i = 0; i < count; i++)
{
RowDefinition row = new RowDefinition();
row.Height = GridLength.Auto;
grid.RowDefinitions.Add(row);
}
}
private void AddColumns(Grid grid, int count)
{
for (int i = 0; i < count; i++)
{
grid.ColumnDefinitions.Add(new ColumnDefinition());
}
}
private TextBlock CreateTextBlock(string text, Color color, FontWeight fw, double fs = 10, int thick = 5)
{
if (color == null) color = Colors.Black;
TextBlock txtBlock1 = new TextBlock();
txtBlock1.Text = text;
txtBlock1.FontSize = fs;
txtBlock1.FontWeight = fw;
txtBlock1.Foreground = new SolidColorBrush(color);
txtBlock1.VerticalAlignment = VerticalAlignment.Center;
txtBlock1.Margin = new Thickness(thick);
return txtBlock1;
}
private async Task<Grid> CreateDynamicWPFGrid()
{
Grid ParentGrid = new Grid();
AddRows(ParentGrid, 8);
/* Start First Grid*/
Grid DynamicGrid = new Grid();
DynamicGrid.Width = 230;
DynamicGrid.HorizontalAlignment = HorizontalAlignment.Left;
DynamicGrid.VerticalAlignment = VerticalAlignment.Top;
DynamicGrid.Margin = new Thickness(24, 0, 0, 0);
AddColumns(DynamicGrid, 2);
AddRows(DynamicGrid, 3);
TextBlock txtBlock1 = CreateTextBlock(DateTime.Now.ToString("M/d/yy"), Colors.Black, FontWeights.Normal);
Grid.SetRow(txtBlock1, 0);
Grid.SetColumn(txtBlock1, 1);
.
.
.
.
return ParentGrid;
}
Printer Events Code
//Register Print Contract
async Task RegisterPrintContract()
{
PrintManager manager = PrintManager.GetForCurrentView();
manager.PrintTaskRequested += OnPrintTaskRequested;
await PrintManager.ShowPrintUIAsync();
}
//Unregister Print Contract
void UnregisterPrintContract()
{
PrintManager printMan = PrintManager.GetForCurrentView();
printMan.PrintTaskRequested -= OnPrintTaskRequested;
}
void OnPrintTaskRequested(PrintManager sender, PrintTaskRequestedEventArgs args)
{
// If I need to be asynchronous, I can get a deferral. I don't *need*
// to do this here, I'm just faking it.
var deferral = args.Request.GetDeferral();
PrintTask printTask = args.Request.CreatePrintTask("My Print Job", OnPrintTaskSourceRequestedHandler);
printTask.Completed += OnPrintTaskCompleted;
deferral.Complete();
}
void OnPrintTaskCompleted(PrintTask sender, PrintTaskCompletedEventArgs args)
{
// TODO: Tidy up.
this._document = null;
this._pages = null;
}
async void OnPrintTaskSourceRequestedHandler(PrintTaskSourceRequestedArgs args)
{
var deferral = args.GetDeferral();
await Dispatcher.RunAsync(CoreDispatcherPriority.Normal,
() =>
{
this._document = new PrintDocument();
this._document.Paginate += OnPaginate;
this._document.GetPreviewPage += OnGetPreviewPage;
this._document.AddPages += OnAddPages;
// Tell the caller about it.
args.SetSource(this._document.DocumentSource);
});
deferral.Complete();
}
void OnAddPages(object sender, AddPagesEventArgs e)
{
// Loop over all of the preview pages and add each page to be printed.
// We should have all pages ready at this point...
foreach (var page in this._pages)
{
//this._pages[page.Key]
this._document.AddPage(this._pages[page.Key]);
}
PrintDocument printDoc = (PrintDocument)sender;
// Indicate that all of the print pages have been provided
printDoc.AddPagesComplete();
}
async void OnGetPreviewPage(object sender, GetPreviewPageEventArgs e)
{
Grid x = await CreateDynamicWPFGrid();
await Dispatcher.RunAsync(CoreDispatcherPriority.Normal,
() =>
{
// NB: assuming it's ok to keep all these pages in
// memory. might not be the right thing to do
// of course.
if (this._pages == null)
{
this._pages = new Dictionary<int, UIElement>();
}
if (!this._pages.ContainsKey(e.PageNumber))
{
this._pages[e.PageNumber] = x;
}
if (this._document == null)
this._document = new PrintDocument();
this._document.SetPreviewPage(e.PageNumber, this._pages[e.PageNumber]);
}
);
}
async void OnPaginate(object sender, PaginateEventArgs e)
{
await Dispatcher.RunAsync(CoreDispatcherPriority.Normal,
() =>
{
// I have one page and that's *FINAL* !
this._document.SetPreviewPageCount(e.CurrentPreviewPageNumber, PreviewPageCountType.Final);
});
}
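Since the content survives the preview but is clipped on paper, one thing worth checking is whether the grid is laid out against the printer's actual imageable area. The Paginate handler can read that area from PrintTaskOptions; the sketch below only illustrates the idea (the _printableWidth/_printableHeight fields are hypothetical, not part of the original code), and CreateDynamicWPFGrid would then constrain the grid to those dimensions instead of a fixed width:
async void OnPaginate(object sender, PaginateEventArgs e)
{
    // Hypothetical variation on the handler above: capture the printable area so the
    // dynamically built grid can be sized to it rather than to arbitrary widths.
    PrintTaskOptions options = e.PrintTaskOptions;
    PrintPageDescription pageDescription = options.GetPageDescription(0);
    await Dispatcher.RunAsync(CoreDispatcherPriority.Normal,
        () =>
        {
            // Content outside ImageableRect is clipped by the driver even though the
            // preview window scales everything to fit.
            this._printableWidth = pageDescription.ImageableRect.Width;
            this._printableHeight = pageDescription.ImageableRect.Height;
            this._document.SetPreviewPageCount(e.CurrentPreviewPageNumber, PreviewPageCountType.Final);
        });
}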

Issue in Invoking camera and saving images in SD card

In my task I have to invoke the camera on a button click, take a picture, save it, and display the image on the same screen. I have tried it and it succeeds in the emulator, but it is not working on a real device and I get some errors. I have tried a lot but cannot find the issue. Moreover, it works perfectly in the 9700 emulator but shows an error on the 9500.
public class CameraScreen extends MainScreen implements FieldChangeListener
{
/** The camera's video controller */
private VideoControl _videoControl;
private Field _videoField;
private EncodingProperties[] _encodings;
private int _indexOfEncoding = 0;
private static String FILE_NAME = System.getProperty("fileconn.dir.photos")+"IMAGE"; //"file:///SDCard/" + "myphotos/" + "IMAGE";//
private static String EXTENSION = ".bmp";
private static int _counter;
int flag = 0;
BitmapField imageField = new BitmapField();
HorizontalFieldManager menuBar = new HorizontalFieldManager(Field.USE_ALL_WIDTH);
VerticalFieldManager main_vfm = new VerticalFieldManager();
VerticalFieldManager camera_vfm = new VerticalFieldManager();
VerticalFieldManager image_vfm = new VerticalFieldManager();
ButtonField bt = new ButtonField("Click",ButtonField.CONSUME_CLICK);
ButtonField front_bt = new ButtonField("Front",ButtonField.CONSUME_CLICK);
ButtonField back_bt = new ButtonField("Back",ButtonField.CONSUME_CLICK);
ButtonField side1_bt = new ButtonField("Side 1",ButtonField.CONSUME_CLICK);
ButtonField side2_bt = new ButtonField("Side 2",ButtonField.CONSUME_CLICK);
public CameraScreen()
{
setTitle("First Screen");
bt.setChangeListener(this);
front_bt.setChangeListener(this);
back_bt.setChangeListener(this);
side1_bt.setChangeListener(this);
side2_bt.setChangeListener(this);
image_vfm.add(menuBar);
menuBar.add(front_bt);
menuBar.add(back_bt);
menuBar.add(side1_bt);
menuBar.add(side2_bt);
image_vfm.add(bt);
try {
Bitmap image = Bitmap.createBitmapFromBytes( readFile(),0, -1, 5 );
imageField.setBitmap(image);
image_vfm.add(imageField);
} catch(Exception e) {
System.out.println(e.toString());
}
main_vfm.add(image_vfm);
add(main_vfm);
front_bt.setFocus();
// Initialize the camera object and video field
initializeCamera();
// Initialize the list of possible encodings
initializeEncodingList();
// If the field was constructed successfully, create the UI
if(_videoField != null)
{
createUI();
}
// If not, display an error message to the user
else
{
camera_vfm.add( new RichTextField( "Error connecting to camera." ) );
}
}
/**
* Takes a picture with the selected encoding settings
*/
public void takePicture()
{
try
{
// A null encoding indicates that the camera should
// use the default snapshot encoding.
String encoding = null;
if( _encodings != null )
{
// Use the user-selected encoding
encoding = _encodings[_indexOfEncoding].getFullEncoding();
}
// Retrieve the raw image from the VideoControl and
// create a screen to display the image to the user.
createImageScreen( _videoControl.getSnapshot( encoding ) );
}
catch(Exception e)
{
home.errorDialog("ERROR " + e.getClass() + ": " + e.getMessage());
}
}
/**
* Prevent the save dialog from being displayed
* @see net.rim.device.api.ui.container.MainScreen#onSavePrompt()
*/
protected boolean onSavePrompt()
{
return true;
}
/**
* Initializes the Player, VideoControl and VideoField
*/
private void initializeCamera()
{
try
{
// Create a player for the Blackberry's camera
Player player = Manager.createPlayer( "capture://video" );
// Set the player to the REALIZED state (see Player javadoc)
player.realize();
// Grab the video control and set it to the current display
_videoControl = (VideoControl)player.getControl( "VideoControl" );
if (_videoControl != null)
{
// Create the video field as a GUI primitive (as opposed to a
// direct video, which can only be used on platforms with
// LCDUI support.)
_videoField = (Field) _videoControl.initDisplayMode (VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
_videoControl.setDisplayFullScreen(true);
//_videoControl.setDisplaySize(50, 50);
_videoControl.setVisible(true);
}
// Set the player to the STARTED state (see Player javadoc)
player.start();
}
catch(Exception e)
{
home.errorDialog("ERROR " + e.getClass() + ": " + e.getMessage());
}
}
/**
* Initialize the list of encodings
*/
private void initializeEncodingList()
{
try
{
// Retrieve the list of valid encodings
String encodingString = System.getProperty("video.snapshot.encodings");
// Extract the properties as an array of word
String[] properties = StringUtilities.stringToKeywords(encodingString);
// The list of encodings
Vector encodingList = new Vector();
//Strings representing the four properties of an encoding as
//returned by System.getProperty().
String encoding = "encoding";
String width = "width";
String height = "height";
String quality = "quality";
EncodingProperties temp = null;
for(int i = 0; i < properties.length ; ++i)
{
if( properties[i].equals(encoding))
{
if(temp != null && temp.isComplete())
{
// Add a new encoding to the list if it has been
// properly set.
encodingList.addElement( temp );
}
temp = new EncodingProperties();
// Set the new encoding's format
++i;
temp.setFormat(properties[i]);
}
else if( properties[i].equals(width))
{
// Set the new encoding's width
++i;
temp.setWidth(properties[i]);
}
else if( properties[i].equals(height))
{
// Set the new encoding's height
++i;
temp.setHeight(properties[i]);
}
else if( properties[i].equals(quality))
{
// Set the new encoding's quality
++i;
temp.setQuality(properties[i]);
}
}
// If there is a leftover complete encoding, add it.
if(temp != null && temp.isComplete())
{
encodingList.addElement( temp );
}
// Convert the Vector to an array for later use
_encodings = new EncodingProperties[ encodingList.size() ];
encodingList.copyInto((Object[])_encodings);
}
catch (Exception e)
{
// Something is wrong, indicate that there are no encoding options
_encodings = null;
home.errorDialog(e.toString());
}
}
/**
* Adds the VideoField to the screen
*/
private void createUI()
{
// Add the video field to the screen
camera_vfm.add(_videoField);
}
/**
* Create a screen used to display a snapshot
* @param raw A byte array representing an image
*/
private void createImageScreen( byte[] raw )
{
main_vfm.replace(camera_vfm, image_vfm);
fieldChanged(raw);
Bitmap image1 = Bitmap.createBitmapFromBytes( readFile(),0, -1, 5 );
try{
if(flag == 1){
}
else{
image_vfm.delete(imageField);
}
imageField.setBitmap(image1);
image_vfm.add(imageField);
}
catch(Exception e){System.out.println(e.toString());}
}
private byte[] readFile() {
byte[] result = null;
FileConnection fconn = null;
try {
fconn = (FileConnection)Connector.open(FILE_NAME + "_front" + EXTENSION);
} catch (IOException e) {
System.out.print("Error opening file");
}
if (!fconn.exists()) {
//Dialog.inform("file not exist");
} else {
InputStream in = null;
ByteVector bytes = new ByteVector();
try {
in = fconn.openInputStream();
} catch (IOException e) {
System.out.print("Error opening input stream");
}
try {
int c = in.read();
while (-1 != c) {
bytes.addElement((byte) c);
c = in.read();
}
result = bytes.getArray();
} catch (IOException e) {
System.out.print("Error reading input stream");
}
try {
fconn.close();
} catch (IOException e) {
System.out.print("Error closing file");
}
}
return result;
}
public void fieldChanged( final byte[] _raw )
{
try
{
flag ++;
// Create the connection to a file that may or
// may not exist.
FileConnection file = (FileConnection)Connector.open(FILE_NAME + "_front" + EXTENSION);
// If the file exists, increment the counter until we find
// one that hasn't been created yet.
if (file.exists()) {
file.delete();
file.close();
file = (FileConnection) Connector.open(FILE_NAME + "_front" + EXTENSION);
}
//FileConnection file_temp = (FileConnection)Connector.open(FILE_NAME + "tempimg" + EXTENSION);
//file_temp.delete();
// We know the file doesn't exist yet, so create it
file.create();
// Write the image to the file
OutputStream out = file.openOutputStream();
out.write(_raw);
// Close the connections
//out.close();
file.close();
//Dialog.inform( "Saved to " + FILE_NAME + "_front" + EXTENSION );
}
catch(Exception e)
{
home.errorDialog("ERROR " + e.getClass() + ": " + e.getMessage());
Dialog.inform( "File not saved this time");
}
}
/**
* Sets the index of the encoding in the 'encodingList' Vector
* @param index The index of the encoding in the 'encodingList' Vector
*/
public void setIndexOfEncoding(int index)
{
_indexOfEncoding = index;
}
/**
* @see net.rim.device.api.ui.Screen#invokeAction(int)
*/
protected boolean invokeAction(int action)
{
boolean handled = super.invokeAction(action);
if(!handled)
{
switch(action)
{
case ACTION_INVOKE: // Trackball click
{
takePicture();
return true;
}
}
}
return handled;
}
public void fieldChanged(Field field, int context) {
// TODO Auto-generated method stub
srn2 screen2 = new srn2();
srn3 screen3 = new srn3();
srn4 screen4 = new srn4();
if(field==bt)
{
main_vfm.replace(image_vfm, camera_vfm);
}
if(field==back_bt)
{
UiApplication.getUiApplication().popScreen(UiApplication.getUiApplication().getActiveScreen());
UiApplication.getUiApplication().pushScreen(screen2);
}
if(field==side1_bt)
{
UiApplication.getUiApplication().popScreen(UiApplication.getUiApplication().getActiveScreen());
UiApplication.getUiApplication().pushScreen(screen3);
}
if(field==side2_bt)
{
UiApplication.getUiApplication().popScreen(UiApplication.getUiApplication().getActiveScreen());
UiApplication.getUiApplication().pushScreen(screen4);
}
}
}
Note: This error is displayed first: "javax.microedition.media.MediaException: There is already another active Player. Call Player.close() on the existing Player to free up the resources." The camera then opens, and when I try to take a picture this error is displayed: "Error Class Java.lang.ArrayIndexOutOfBoundsException: Index 0>=0".
After _videoControl.getSnapshot( encoding ) has been called you need to close the player (i.e. call player.close()). That is exactly what the exception is telling you.
However, this method of taking images is highly unreliable - you will not be able to use it on every BB device model. I don't know why RIM put it in the SDK samples; by doing that, RIM pushes developers in the wrong direction. As alishaik786 mentions, the proper way to take images on BB is to use Invoke.invokeApplication(Invoke.APP_TYPE_CAMERA, new CameraArguments()) together with a FileSystemJournalListener implementation. Just search StackOverflow for those for the implementation details. I vaguely recall the implementation is painful (like many other parts of BB), but once done it will work on any device.
You got two errors:
1. "javax.microedition.media.MediaException: There is already another active Player. Call Player.close()"
This exception is thrown if you try to open the camera (player) while another instance of the camera is already open. In your code you need to define the player object at class level, and you must close the player after taking the snapshot (you also need to close the player if you push from one screen to another).
2. "Error Class Java.lang.ArrayIndexOutOfBoundsException: Index 0>=0"
This error may occur when you access the encoding array while its size is zero.
You can confirm this by using _videoControl.getSnapshot(null), which takes the snapshot with the default encoding.
So first check these issues and reply to me.
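A minimal sketch of that advice applied to the code above (a sketch only: the _player field is an assumption, since the original class only keeps the VideoControl, and initializeCamera would assign its Player to this field instead of a local variable):
// Assumed class-level field, assigned inside initializeCamera() in place of the local 'player':
private Player _player;
public void takePicture()
{
    try
    {
        // A null encoding uses the default snapshot encoding, which also sidesteps
        // the ArrayIndexOutOfBoundsException seen when the encoding list is empty.
        createImageScreen( _videoControl.getSnapshot( null ) );
    }
    catch (Exception e)
    {
        home.errorDialog("ERROR " + e.getClass() + ": " + e.getMessage());
    }
    finally
    {
        // Free the camera so another Player can be created later without the
        // "There is already another active Player" MediaException.
        if (_player != null)
        {
            _player.close();
            _player = null;
        }
    }
}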
