I have a custom camera implementation and I would like to play my own sound when a picture is taken. How do I control the shutter sound with the hardware camera? I need to play only my own camera sound, not the default one.
private async void TakePhotoButtonTapped(object sender, EventArgs e)
{
    Android.Hardware.Camera.CameraInfo info = new Android.Hardware.Camera.CameraInfo();
    // mCameraFacing is the camera ID.
    Android.Hardware.Camera.GetCameraInfo(mCameraFacing, info);
    // disable the default shutter sound where the device allows it
    if (info.CanDisableShutterSound)
        camera.EnableShutterSound(false);

    Bitmap image = textureView.Bitmap;
    using (var imageStream = new MemoryStream())
    {
        await image.CompressAsync(Bitmap.CompressFormat.Jpeg, 100, imageStream);
        image.Recycle();
        imageBytes = imageStream.ToArray();
        AddImages(imageBytes);
        Toast.MakeText(Android.App.Application.Context, _languageCache.Translate("Photo Captured"), ToastLength.Short).Show();
    }
    await Task.Delay(300);
}
You could use the code below for the sound settings and put it into the code you use to take the photo.
I use the system ShutterClick sound for reference; you could use your own as well.
private void SoundSettings()
{
    MediaActionSound sound = new MediaActionSound();
    AudioManager meng = (AudioManager)Context.GetSystemService(Context.AudioService);
    int volume = meng.GetStreamVolume(Android.Media.Stream.Notification);
    // only play the click if the notification stream isn't muted
    if (volume != 0)
        sound.Play(MediaActionSoundType.ShutterClick);
}
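If you want your own sound rather than the system click, one option is to load it from a raw resource with MediaPlayer and trigger it right after the capture. A minimal sketch, assuming a raw resource named Resource.Raw.shutter exists in your project (the name is a placeholder):

private MediaPlayer shutterPlayer;

private void PlayCustomShutterSound()
{
    // Resource.Raw.shutter is a placeholder; point this at your own sound file.
    if (shutterPlayer == null)
        shutterPlayer = MediaPlayer.Create(Android.App.Application.Context, Resource.Raw.shutter);
    shutterPlayer.Start();
}

You would then call PlayCustomShutterSound() in TakePhotoButtonTapped right after disabling the default shutter sound.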
Media Projection and Image Reader duo not working on Netflix/YouTube on Android TV.
I am trying to capture images from an Android TV with MediaProjection and ImageReader.
On the log side, a warning catches my eye, probably because ImageReader uses OpenGL:
W/libEGL: EGLNativeWindowType 0xd7995a68 disconnect failed
Although it works correctly in the emulator, ImageReader does not work properly with apps such as YouTube and Netflix on licensed Android TV devices.
It works fine on CNBC-e and in the Android TV user interface.
I'm starting to think it's due to HDCP. Any guesses? Or do you think these apps protect themselves against MediaProjection?
Images captured on Netflix and YouTube are always the same; it's as if time has stopped.
The core of my code is as follows.
private class ImageAvailableListener implements ImageReader.OnImageAvailableListener {
    @Override
    public void onImageAvailable(ImageReader reader) {
        FileOutputStream fos = null;
        Bitmap bitmap = null;
        try (Image image = mImageReader.acquireLatestImage()) {
            if (image != null) {
                Image.Plane[] planes = image.getPlanes();
                ByteBuffer buffer = (ByteBuffer) planes[0].getBuffer().rewind();
                int pixelStride = planes[0].getPixelStride();
                int rowStride = planes[0].getRowStride();
                int rowPadding = rowStride - pixelStride * mWidth;
                // create a bitmap wide enough to account for the row padding
                bitmap = Bitmap.createBitmap(mWidth + rowPadding / pixelStride, mHeight, Bitmap.Config.ARGB_8888);
                bitmap.copyPixelsFromBuffer(buffer);
                // write the bitmap to a file
                fos = new FileOutputStream(mStoreDir + "/myscreen_" + IMAGES_PRODUCED + ".png");
                bitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
                IMAGES_PRODUCED++;
                Log.e(TAG, "captured image: " + IMAGES_PRODUCED);
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (fos != null) {
                try {
                    fos.close();
                } catch (IOException ioe) {
                    ioe.printStackTrace();
                }
            }
            if (bitmap != null) {
                bitmap.recycle();
            }
        }
    }
}
@SuppressLint("WrongConstant")
private void createVirtualDisplay() {
    // get width and height
    mWidth = Resources.getSystem().getDisplayMetrics().widthPixels;
    mHeight = Resources.getSystem().getDisplayMetrics().heightPixels;
    // start the capture reader
    mImageReader = ImageReader.newInstance(mWidth, mHeight, PixelFormat.RGBA_8888, 2);
    mVirtualDisplay = mMediaProjection.createVirtualDisplay(SCREENCAP_NAME, mWidth, mHeight,
            mDensity, getVirtualDisplayFlags(), mImageReader.getSurface(), null, mHandler);
    mImageReader.setOnImageAvailableListener(new ImageAvailableListener(), mHandler);
}
I am using Xamarin.iOS in Visual Studio on Windows to extract metadata from a photo taken by the iPhone camera.
private void GetMetaData(NSUrl url)
{
    CGImageSource myImageSource;
    myImageSource = CGImageSource.FromUrl(url, null);
    var ns = new NSDictionary();
    var imageProperties = myImageSource.CopyProperties(ns, 0);
    var gps = imageProperties.ObjectForKey(CGImageProperties.GPSDictionary) as NSDictionary;
    var lat = gps[CGImageProperties.GPSLatitude];
    var latref = gps[CGImageProperties.GPSLatitudeRef];
    var lon = gps[CGImageProperties.GPSLongitude];
    var lonref = gps[CGImageProperties.GPSLongitudeRef];
    var loc = String.Format("GPS: {0} {1}, {2} {3}", lat, latref, lon, lonref);
    Console.WriteLine(loc);
}
The url being passed into the method is: file:///var/mobile/Media/DCIM/100APPLE/IMG_0006.JPG
CGImageSource.FromUrl(url, null) returns null and my app crashes. Can anyone explain how I can fix this?
Edit: this is how I am getting the URL for the image.
protected void Handle_FinishedPickingMedia(object sender, UIImagePickerMediaPickedEventArgs e)
{
    NSUrl url = null;
    try
    {
        void ImageData(PHAsset asset)
        {
            if (asset == null) throw new Exception("PHAsset is null");
            PHImageManager.DefaultManager.RequestImageData(asset, null, (data, dataUti, orientation, info) =>
            {
                //Console.WriteLine(data);
                Console.WriteLine(info);
                url = info.ValueForKey(new NSString("PHImageFileURLKey")) as NSUrl;
                // Call method to get metadata from the image URL
                GetMetaData(url);
            });
        }
As I said in your last post (Xamarin iOS camera and photos): if you capture photos from the camera, the event will not return a ReferenceUrl. But you can get the metadata with another key:
protected void Handle_FinishedPickingMedia(object sender, UIImagePickerMediaPickedEventArgs e)
{
    var metadataInfo = e.Info["UIImagePickerControllerMediaMetadata"]; // or e.MediaMetadata
}
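For reference, a minimal sketch of inspecting that dictionary inside the same handler (the ImageIO key constants are standard, but which entries actually appear depends on the device and capture mode):

var metadata = e.MediaMetadata;
if (metadata != null)
{
    // dump the top-level entries, e.g. Orientation, PixelWidth, PixelHeight
    foreach (var key in metadata.Keys)
        Console.WriteLine("{0}: {1}", key, metadata[key]);
    // nested EXIF dictionary, if present
    if (metadata[CGImageProperties.ExifDictionary] is NSDictionary exif)
        Console.WriteLine(exif[CGImageProperties.ExifDateTimeOriginal]);
}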
This will contain some basic information which may be helpful to you, but this NSDictionary doesn't contain GPS information. Because the picture was just created by you, you should get the GPS coordinates manually with CLLocationManager:
CLLocationManager manager;
manager = new CLLocationManager();
manager.RequestWhenInUseAuthorization();
manager.LocationsUpdated += (locationSender, e) =>
{
    // get current locations
    manager.StopUpdatingLocation();
};
manager.StartUpdatingLocation();
Please note that you need to add the keys NSLocationWhenInUseUsageDescription and, on iOS 11, NSLocationAlwaysAndWhenInUseUsageDescription to Info.plist; if you want to deploy to iOS 10, add NSLocationAlwaysUsageDescription as well.
This way, there's no need to get metadata from CGImageSource when you take photos with the camera.
I've been following various answers to various questions on the subject, and I've arrived at some code which looks like it works. I'm getting stuck with the NSURL part of it. I have two mp3 tracks in the assets folder of my iOS Gluon project. I've made the IOSAudioService class to handle playback, and I'm passing an argument from the play button in the view to the Play() method. Everything other than the actual file registers as working. I'm getting an NSError, which from looking at the code is a nil value, so either the argument isn't being passed correctly or it can't find the file. Code below.
public AVAudioPlayer backgroundMusic;
private double currentPosition;
NSURL songURL = null;

@Override
public void Play(String filename) {
    songURL = new NSURL(filename);
    try {
        if (backgroundMusic != null) {
            Resume();
        } else {
            // start the audio at the beginning
            currentPosition = 0;
            backgroundMusic = new AVAudioPlayer(songURL);
            // create the media player and assign it to the audio
            backgroundMusic.prepareToPlay();
            backgroundMusic.play();
        }
        // catch the audio error
    } catch (NSErrorException e) {
        System.out.println("error: " + e);
    }
}

@Override
public void Stop() {
    backgroundMusic.stop();
    backgroundMusic = null;
}

@Override
public void Pause() {
    currentPosition = backgroundMusic.getCurrentTime();
    backgroundMusic.pause();
}

@Override
public void Resume() {
    backgroundMusic.setCurrentTime(currentPosition);
    backgroundMusic.play();
}
try {
    services = (AudioService) Class.forName("com.gluonhq.charm.down.plugins.ios.IOSAudioService").newInstance();
} catch (ClassNotFoundException | InstantiationException | IllegalAccessException ex) {
    System.out.println("Error " + ex);
}
I'm getting the error at the catch block for NSErrorException e.
if (services != null) {
    final HBox hBox = new HBox(10,
            MaterialDesignIcon.PLAY_ARROW.button(e -> services.Play("/audio.mp3")),
            MaterialDesignIcon.PAUSE.button(e -> {
                if (!pause) {
                    services.Pause();
                    pause = true;
                } else {
                    services.Resume();
                    pause = false;
                }
            }),
            MaterialDesignIcon.STOP.button(e -> services.Stop()));
    // set the HBox alignment
    hBox.setAlignment(Pos.CENTER);
    hBox.getStyleClass().add("hbox");
    // create and set up a VBox with the image, audio controls and the text, then set the alignment
    final VBox vBox = new VBox(5, Image(), hBox, text1);
    vBox.setAlignment(Pos.CENTER);
    setCenter(new StackPane(vBox));
} else {
    // show an error if the service is null
    setCenter(new StackPane(new Label("Only for Android")));
}
Services.get(LifecycleService.class).ifPresent(s -> s.addListener(LifecycleEvent.PAUSE, () -> services.Stop()));
}
I've also followed the advice on creating the service factory class and the interface from Audio performance with JavaFX for Android (MediaPlayer and NativeAudioService), leaving out the add-audio element, and I intend to do this on a view-by-view basis if possible.
Problem solved after much fiddling and looking in the Javadocs. Solved by adding/replacing some code with the below.
songURL = new NSURL("file:///" + NSBundle.getMainBundle().findResourcePath(filename, "mp3"));
try {
    songURL.checkResourceIsReachable();
} catch (NSErrorException d) {
    System.out.println("Song not found! " + d);
}
I'm trying to add a barcode scanner feature to my Xamarin.iOS app. I'm developing in Visual Studio and I've added the ZXing.Net.Mobile component from the Xamarin component store.
I've implemented it as shown in the samples:
ScanButton.TouchUpInside += async (sender, e) =>
{
    //var options = new ZXing.Mobile.MobileBarcodeScanningOptions();
    //options.AutoRotate = false;
    //options.PossibleFormats = new List<ZXing.BarcodeFormat>() {
    //    ZXing.BarcodeFormat.EAN_8, ZXing.BarcodeFormat.EAN_13
    //};
    var scanner = new ZXing.Mobile.MobileBarcodeScanner(this);
    //scanner.TopText = "Hold camera up to barcode to scan";
    //scanner.BottomText = "Barcode will automatically scan";
    //scanner.UseCustomOverlay = false;
    scanner.FlashButtonText = "Flash";
    scanner.CancelButtonText = "Cancel";
    scanner.Torch(true);
    scanner.AutoFocus();
    var result = await scanner.Scan(true);
    HandleScanResult(result);
};
void HandleScanResult(ZXing.Result result)
{
    if (result != null && !string.IsNullOrEmpty(result.Text))
        TextField.Text = result.Text;
}
The problem is that when I tap the scan button, the capture view is shown correctly, but if I try to capture a barcode nothing happens; the scanner doesn't seem to recognize any barcode.
Has anyone experienced this issue? How can I make it work?
Thanks in advance for your help!
I answered a similar question here. I couldn't get barcodes to scan because the default camera resolution was set too low. The specific implementation for this case would be:
ScanButton.TouchUpInside += async (sender, e) =>
{
    var options = new ZXing.Mobile.MobileBarcodeScanningOptions
    {
        CameraResolutionSelector = HandleCameraResolutionSelectorDelegate
    };
    var scanner = new ZXing.Mobile.MobileBarcodeScanner(this);
    .
    .
    .
    scanner.AutoFocus();
    // call Scan with the options created above
    var result = await scanner.Scan(options, true);
    HandleScanResult(result);
};
And then the definition for HandleCameraResolutionSelectorDelegate:
CameraResolution HandleCameraResolutionSelectorDelegate(List<CameraResolution> availableResolutions)
{
    // Don't know if this will ever be null or empty
    if (availableResolutions == null || availableResolutions.Count < 1)
        return new CameraResolution() { Width = 800, Height = 600 };

    // Debugging revealed that the last element in the list
    // expresses the highest resolution. This could probably be more thorough.
    return availableResolutions[availableResolutions.Count - 1];
}
I have a custom camera app on Windows Phone 8. I need to add a watermark image to each frame of the camera capture and then record the frames to a video.
I can customise each frame from the preview using the following code:
int[] pixelData = new int[(int)(camera.PreviewResolution.Width * camera.PreviewResolution.Height)];
camera.GetPreviewBufferArgb32(pixelData);
return pixelData;
and write it back to the preview.
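For the watermark step itself, a minimal sketch of blending an ARGB watermark into that pixel buffer might look like this (watermarkPixels, watermarkWidth, and watermarkHeight are hypothetical names, assumed to be loaded elsewhere, with the watermark fitting inside the frame):

private void BlendWatermark(int[] framePixels, int frameWidth,
                            int[] watermarkPixels, int watermarkWidth, int watermarkHeight)
{
    for (int y = 0; y < watermarkHeight; y++)
    {
        for (int x = 0; x < watermarkWidth; x++)
        {
            int src = watermarkPixels[y * watermarkWidth + x];
            int srcA = (src >> 24) & 0xFF;
            if (srcA == 0) continue;           // fully transparent, skip

            int dstIndex = y * frameWidth + x; // top-left placement
            int dst = framePixels[dstIndex];

            // simple source-over alpha blend, per channel
            int r = (((src >> 16) & 0xFF) * srcA + ((dst >> 16) & 0xFF) * (255 - srcA)) / 255;
            int g = (((src >> 8) & 0xFF) * srcA + ((dst >> 8) & 0xFF) * (255 - srcA)) / 255;
            int b = ((src & 0xFF) * srcA + (dst & 0xFF) * (255 - srcA)) / 255;

            framePixels[dstIndex] = (0xFF << 24) | (r << 16) | (g << 8) | b;
        }
    }
}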
My problem is that while I can show the frames on screen while the user is recording a movie, I can't find a working solution on WP8 to encode the frames and audio and save them to a video file.
I have already tried OpenCV, libav, and others without success.
If anyone can point me in the right direction, it would be greatly appreciated.
You can do it like this.
private void GetCameraPicture_Click(object sender, RoutedEventArgs e)
{
    Microsoft.Phone.Tasks.CameraCaptureTask cameraCaptureTask = new Microsoft.Phone.Tasks.CameraCaptureTask();
    cameraCaptureTask.Completed += cct_Completed;
    cameraCaptureTask.Show();
}
void cct_Completed(object sender, Microsoft.Phone.Tasks.PhotoResult e)
{
    try
    {
        if (e.TaskResult == Microsoft.Phone.Tasks.TaskResult.OK)
        {
            var imageStream = e.ChosenPhoto;
            var name = e.OriginalFileName;
            using (MemoryStream mem = new MemoryStream())
            {
                // render a timestamp over the captured photo
                TextBlock tb = new TextBlock() { Text = DateTime.Now.ToString("dd MMM yyyy, HH:mm"), Foreground = new SolidColorBrush(Color.FromArgb(128, 0, 0, 0)), FontSize = 40 };
                BitmapImage finalImage = new BitmapImage();
                finalImage.SetSource(imageStream);
                WriteableBitmap wbFinal = new WriteableBitmap(finalImage);
                wbFinal.Render(tb, null);
                wbFinal.Invalidate();
                wbFinal.SaveJpeg(mem, wbFinal.PixelWidth, wbFinal.PixelHeight, 0, 100);
                mem.Seek(0, System.IO.SeekOrigin.Begin);
                MediaLibrary lib = new MediaLibrary();
                lib.SavePictureToCameraRoll("Copy" + name, mem.ToArray());
            }
        }
    }
    catch (Exception exp) { MessageBox.Show(exp.Message); }
}
Hope it may help you.