Copy raw video frame buffer to IDirect3DSurface - directx

Normally, MediaPlayer provides the method CopyFrameToVideoSurface(IDirect3DSurface destination) to copy the current video frame to a Direct3D surface.
But now I receive a video frame as a byte[] pbuffer with a width and height. How do I transfer it to an IDirect3DSurface the way MediaPlayer does?
Thanks
Edit:
To render a video frame from the MediaPlayer, I use:
public void VideoFrameAvailable(Windows.Media.Playback.MediaPlayer sender, object args)
{
    var swapChain = _swapChain.Object;
    using var dxgiSurface = swapChain.GetBuffer<IDXGISurface>(0);
    var dxgiSurfacePtr = Marshal.GetComInterfaceForObject<IDXGISurface, IDXGISurface>(dxgiSurface.Object);
    CreateDirect3D11SurfaceFromDXGISurface(dxgiSurfacePtr, out var graphicsSurface).ThrowOnError();
    Marshal.Release(dxgiSurfacePtr);
    using var d3DSurface = MarshalInterface<IDirect3DSurface>.FromAbi(graphicsSurface)!;
    Marshal.Release(graphicsSurface);
    sender.CopyFrameToVideoSurface(d3DSurface);
    swapChain.Present(1, 0).ThrowOnError();
}
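For the byte[] case, one possible sketch, assuming you can take a dependency on Win2D (Microsoft.Graphics.Canvas, which is not used in the code above) and that the buffer is tightly packed BGRA8, is to wrap the bytes in a CanvasBitmap, since CanvasBitmap implements IDirect3DSurface:
using Microsoft.Graphics.Canvas;
using Windows.Graphics.DirectX;
using Windows.Graphics.DirectX.Direct3D11;

// Sketch only: pbuffer, width and height are the values from the question;
// the B8G8R8A8 pixel format and the absence of row padding are assumptions.
IDirect3DSurface CreateSurfaceFromBuffer(CanvasDevice device, byte[] pbuffer, int width, int height)
{
    CanvasBitmap bitmap = CanvasBitmap.CreateFromBytes(
        device, pbuffer, width, height,
        DirectXPixelFormat.B8G8R8A8UIntNormalized);
    // CanvasBitmap implements IDirect3DSurface, so it can be handed to any
    // API that expects a Direct3D surface, or drawn to the swap chain.
    return bitmap;
}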

Related

How to EnableShutterSound while taking picture in android xamarin

I have a custom camera implementation and I would like to play my own sound when the picture is taken. How do I control the shutter sound of the hardware camera? I need to play only my camera sound and not the default one.
private async void TakePhotoButtonTapped(object sender, EventArgs e)
{
    Android.Hardware.Camera.CameraInfo info = new Android.Hardware.Camera.CameraInfo();
    // mCameraFacing is CameraID.
    Android.Hardware.Camera.GetCameraInfo(mCameraFacing, info);
    if (info.CanDisableShutterSound)
        camera.EnableShutterSound(false);
    else
        camera.EnableShutterSound(true);
    Bitmap image = textureView.Bitmap;
    using (var imageStream = new MemoryStream())
    {
        await image.CompressAsync(Bitmap.CompressFormat.Jpeg, 100, imageStream);
        image.Recycle();
        imageBytes = imageStream.ToArray();
        AddImages(imageBytes);
        Toast.MakeText(Android.App.Application.Context, _languageCache.Translate("Photo Captured"), ToastLength.Short).Show();
    }
    await Task.Delay(300);
}
You could use the sound-settings code below and call it from the code you use to take the photo.
I use the system ShutterClick sound for reference; you could use your own as well.
private void SoundSettings()
{
    MediaActionSound sound = new MediaActionSound();
    AudioManager meng = (AudioManager)Context.GetSystemService(Context.AudioService);
    int volume = meng.GetStreamVolume(Android.Media.Stream.Notification);
    if (volume != 0)
        sound.Play(MediaActionSoundType.ShutterClick);
}
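If you want to play your own sound instead of the system shutter click, a minimal sketch (assuming a sound file added to the project as a raw resource; Resource.Raw.my_shutter and PlayCustomShutterSound are hypothetical names, not from the code above) could look like this:
private void PlayCustomShutterSound()
{
    // Plays Resources/raw/my_shutter after the built-in sound has been disabled
    // with camera.EnableShutterSound(false).
    var player = Android.Media.MediaPlayer.Create(Android.App.Application.Context, Resource.Raw.my_shutter);
    player.Completion += (s, e) => player.Release(); // free the player once playback finishes
    player.Start();
}
You would call it right before capturing the frame in TakePhotoButtonTapped.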

Showing downloaded image in ImageView not working in Xamarin.Android

I have a small PNG image I'd like to show in an ImageView using Xamarin.Android.
I am downloading the file using the following code:
private void Download()
{
    var url = "https://hns.d7u.de/v4/images/hvvstoerungen_facebook.png";
    var directory = Environment.GetFolderPath(Environment.SpecialFolder.Personal) + "/myapp/";
    var fileName = url.Substring(url.LastIndexOf("/") + 1);
    var path = directory + fileName;
    System.Net.WebClient wC = new System.Net.WebClient();
    wC.Headers.Add(System.Net.HttpRequestHeader.AcceptEncoding, "gzip");
    wC.DownloadDataCompleted += WC_DownloadDataCompleted;
    wC.DownloadDataAsync(new Uri(url), path);
}
private void WC_DownloadDataCompleted(object sender, System.Net.DownloadDataCompletedEventArgs e)
{
    var path = e.UserState.ToString();
    var bytes = e.Result;
    if (File.Exists(path))
        File.Delete(path);
    if (!File.Exists(path))
        File.WriteAllBytes(path, bytes);
}
It is stored at /data/user/0/myapp/files/hns/hvvstoerungen_facebook.png, and File.Exists(...) returns true for that path, so I am sure that the file is downloaded and exists.
When I want to show it in the ImageView, I do it like this:
if (System.IO.File.Exists(imageFilePath))
{
    Android.Net.Uri andrUri = Android.Net.Uri.Parse(imageFilePath);
    ImageIcon.SetImageURI(andrUri);
    //Also not working:
    //Bitmap bitmap = BitmapFactory.DecodeFile(imageFilePath);
    //ImageIcon.SetImageBitmap(bitmap);
    //And also not working:
    //Android.Net.Uri andrUri = Android.Net.Uri.Parse(imageFilePath);
    //Bitmap bmp = BitmapFactory.DecodeStream(Android.App.Application.Context.ContentResolver.OpenInputStream(andrUri));
    //ImageIcon.SetImageBitmap(bmp);
}
The Output window shows the following when the image should be shown:
02-01 23:41:24.770 E/Drawable(19815): Unable to decode stream: android.graphics.ImageDecoder$DecodeException: Failed to create image decoder with message 'unimplemented'. Input contained an error.
02-01 23:41:24.770 W/ImageView(19815): resolveUri failed on bad bitmap uri: /data/user/0/myapp/files/hns/hvvstoerungen_facebook.png
But I cannot figure out what exactly this means.
One additional thing: if I run the app in a brand-new Android emulator instance, this image and all others of its kind are not shown.
If I run the app in an old Android emulator instance, where the app was already running before but on an Android.Forms basis, the old images known to the old project are shown while the newly downloaded images are not. All images are in the same folder and I cannot see any differences between them.
Does anyone have an idea?
Edit:
My working version has the following Download() method instead:
private void Download()
{
    var noCompression = new string[] { ".png", ".jpg", ".jpeg", ".gif", ".zip", ".7z", ".mp3", ".mp4" };
    var url = "https://hns.d7u.de/v4/images/hvvstoerungen_facebook.png";
    var directory = Environment.GetFolderPath(Environment.SpecialFolder.Personal) + "/myapp/";
    var fileName = url.Substring(url.LastIndexOf("/") + 1);
    var path = directory + fileName;
    System.Net.WebClient wC = new System.Net.WebClient();
    if (!noCompression.Contains(url.Substring(url.LastIndexOf('.'))))
        wC.Headers.Add(System.Net.HttpRequestHeader.AcceptEncoding, "gzip");
    wC.DownloadDataCompleted += WC_DownloadDataCompleted;
    wC.DownloadDataAsync(new Uri(url), path);
}
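An alternative sketch (my assumption, not part of the question or answers) is to let the HTTP stack decompress gzip itself, so the bytes written to disk are always the decoded file regardless of extension, e.g. with HttpClient:
// Sketch only: HttpClientHandler decompresses gzip before the bytes reach you,
// so the saved file is always the raw image data.
private async Task<string> DownloadAsync(string url, string directory)
{
    var handler = new System.Net.Http.HttpClientHandler
    {
        AutomaticDecompression = System.Net.DecompressionMethods.GZip
    };
    using (var client = new System.Net.Http.HttpClient(handler))
    {
        var bytes = await client.GetByteArrayAsync(url);
        var path = System.IO.Path.Combine(directory, url.Substring(url.LastIndexOf('/') + 1));
        System.IO.File.WriteAllBytes(path, bytes);
        return path;
    }
}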
You could try the code below.
Download the image from Url:
public Bitmap GetImageBitmapFromUrl(string url)
{
    Bitmap imageBitmap = null;
    using (var webClient = new WebClient())
    {
        var imageBytes = webClient.DownloadData(url);
        if (imageBytes != null && imageBytes.Length > 0)
        {
            imageBitmap = BitmapFactory.DecodeByteArray(imageBytes, 0, imageBytes.Length);
        }
    }
    return imageBitmap;
}
Usage:
bitmap = GetImageBitmapFromUrl("https://hns.d7u.de/v4/images/hvvstoerungen_facebook.png");
And save the image as png:
void ExportBitmapAsPNG(Bitmap bitmap)
{
    var folderPath = System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal);
    filePath = System.IO.Path.Combine(folderPath, "test.png");
    var stream = new FileStream(filePath, FileMode.Create);
    bitmap.Compress(Bitmap.CompressFormat.Png, 100, stream);
    stream.Close();
}
Usage:
ExportBitmapAsPNG(bitmap);
Check whether the file exists and set it into the ImageView:
if (File.Exists(filePath))
{
    Bitmap myBitmap = BitmapFactory.DecodeFile(filePath);
    imageview.SetImageBitmap(myBitmap);
}

How to write to file (encode) modified camera frames in Windows Phone 8

I have a custom camera app in Windows Phone 8. I need to add a watermark image to each frame from the camera capture and then record the result to a video.
I can customise each frame from the preview using the following code:
int[] pixelData = new int[(int)(camera.PreviewResolution.Width * camera.PreviewResolution.Height)];
camera.GetPreviewBufferArgb32(pixelData);
return pixelData;
and write it back to the preview.
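For reference, blending a watermark into that ARGB buffer amounts to a simple per-pixel alpha blend along these lines; the sketch below is only illustrative, and the names (BlendWatermark, watermarkPixels, offsetX, offsetY) are hypothetical rather than taken from my code:
// Sketch only: blends a pre-decoded ARGB32 watermark into the preview buffer
// returned by GetPreviewBufferArgb32. No bounds checking for brevity.
void BlendWatermark(int[] frame, int frameWidth, int[] watermarkPixels, int watermarkWidth, int watermarkHeight, int offsetX, int offsetY)
{
    for (int y = 0; y < watermarkHeight; y++)
    {
        for (int x = 0; x < watermarkWidth; x++)
        {
            int src = watermarkPixels[y * watermarkWidth + x];
            int a = (src >> 24) & 0xFF;
            if (a == 0)
                continue; // fully transparent watermark pixel

            int dstIndex = (offsetY + y) * frameWidth + (offsetX + x);
            int dst = frame[dstIndex];

            // out = (src * alpha + dst * (255 - alpha)) / 255, per channel
            int r = (((src >> 16) & 0xFF) * a + ((dst >> 16) & 0xFF) * (255 - a)) / 255;
            int g = (((src >> 8) & 0xFF) * a + ((dst >> 8) & 0xFF) * (255 - a)) / 255;
            int b = ((src & 0xFF) * a + (dst & 0xFF) * (255 - a)) / 255;

            frame[dstIndex] = (0xFF << 24) | (r << 16) | (g << 8) | b;
        }
    }
}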
My problem is that, while I can show the frames on the screen while the user is recording a movie with the camera, I can't find a working solution for WP8 to encode the frames and audio and save them to a file.
I have already tried OpenCV, libav and others without success.
If anyone can point me in the right direction it would be greatly appreciated.
You can do it like this.
private void GetCameraPicture_Click(object sender, RoutedEventArgs e)
{
    Microsoft.Phone.Tasks.CameraCaptureTask cameraCaptureTask = new Microsoft.Phone.Tasks.CameraCaptureTask();
    cameraCaptureTask.Completed += cct_Completed;
    cameraCaptureTask.Show();
}

void cct_Completed(object sender, Microsoft.Phone.Tasks.PhotoResult e)
{
    try
    {
        if (e.TaskResult == Microsoft.Phone.Tasks.TaskResult.OK)
        {
            var imageStream = e.ChosenPhoto;
            var name = e.OriginalFileName;
            using (MemoryStream mem = new MemoryStream())
            {
                TextBlock tb = new TextBlock() { Text = DateTime.Now.ToString("dd MMM yyyy, HH:mm"), Foreground = new SolidColorBrush(Color.FromArgb(128, 0, 0, 0)), FontSize = 40 };
                BitmapImage finalImage = new BitmapImage();
                finalImage.SetSource(imageStream);
                WriteableBitmap wbFinal = new WriteableBitmap(finalImage);
                wbFinal.Render(tb, null);
                wbFinal.Invalidate();
                wbFinal.SaveJpeg(mem, wbFinal.PixelWidth, wbFinal.PixelHeight, 0, 100);
                mem.Seek(0, System.IO.SeekOrigin.Begin);
                MediaLibrary lib = new MediaLibrary();
                lib.SavePictureToCameraRoll("Copy" + name, mem.ToArray());
            }
        }
    }
    catch (Exception exp) { MessageBox.Show(exp.Message); }
}
Hope it may help you.

WinRT StorageFile write downloaded file

I'm struggling with an easy problem. I want to download an image from the web using this code:
WebRequest requestPic = WebRequest.Create(@"http://something.com/" + id + ".jpg");
WebResponse responsePic = await requestPic.GetResponseAsync();
Now I want to write the WebResponse's stream to a StorageFile (e.g. create a file id.jpg in the app's storage), but I haven't found a way to achieve that. I searched the web for it, but with no success: all approaches run into incompatible Stream types and so on.
Could you please help?
I have found the following solution, which works and is not too complicated.
public async static Task<StorageFile> SaveAsync(Uri fileUri, StorageFolder folder, string fileName)
{
    var file = await folder.CreateFileAsync(fileName, CreationCollisionOption.ReplaceExisting);
    var downloader = new BackgroundDownloader();
    var download = downloader.CreateDownload(fileUri, file);
    var res = await download.StartAsync();
    return file;
}
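A possible call for the question's scenario (the file name is only illustrative):
// Hypothetical call site: downloads the picture straight into local app storage.
StorageFile file = await SaveAsync(
    new Uri("http://something.com/" + id + ".jpg"),
    ApplicationData.Current.LocalFolder,
    id + ".jpg");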
You will need to read the response stream into a buffer and then write the data to a StorageFile. The following code shows an example:
var fStream = responsePic.GetResponseStream();
var file = await ApplicationData.Current.LocalFolder.CreateFileAsync("testfile.txt");
using (var ostream = await file.OpenStreamForWriteAsync())
{
    int count = 0;
    do
    {
        var buffer = new byte[1024];
        count = fStream.Read(buffer, 0, 1024);
        await ostream.WriteAsync(buffer, 0, count);
    }
    while (fStream.CanRead && count > 0);
}
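The manual read/write loop could also be replaced by Stream.CopyToAsync, which does the same buffering internally; a minimal equivalent sketch:
// Same copy using the framework's built-in buffering.
var fStream = responsePic.GetResponseStream();
var file = await ApplicationData.Current.LocalFolder.CreateFileAsync("testfile.txt", CreationCollisionOption.ReplaceExisting);
using (var ostream = await file.OpenStreamForWriteAsync())
{
    await fStream.CopyToAsync(ostream);
}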
That can be done using the C++ REST SDK in Windows Store apps. It's explained in the HTTP Client Tutorial.

save bmpData as jpg using adobe flex in air application

I use the following code:
protected function videoDisplay_playheadUpdateHandler(event:mx.events.VideoEvent):void
{
    if (!captured && videoDisplay.playheadTime >= 0)
    {
        capture();
    }
}
private function capture():void
{
    var bmpData:BitmapData = new BitmapData(videoDisplay.width, videoDisplay.height);
    bmpData.draw(videoDisplay);
    captured = true;
    store(...); //????????
}
in order to capture a frame from a videoDisplay object.
1) Is this correct, or am I doing something wrong?
2) What can I do to store the bmpData as a .jpg on my computer?
I am using Flex 4.5 and it is an AIR app.
Any ideas? Thanks in advance!
The following code should help you
var jpegEncoder:JPEGEncoder = new JPEGEncoder(90);
var jpgSource:BitmapData = new BitmapData(videoDisplay.width,videoDisplay.height);
jpgSource.draw(this);
var fileReference:FileReference = new FileReference();
fileReference.save(jpegEncoder.encode(jpgSource),"videoImage.jpg");
To use the JPEG encoder you need to import:
import mx.graphics.codec.JPEGEncoder;
The above changes should be enough to allow the user to take a snapshot of a running video.
Please note that with this the user will be prompted to select the location of the file.
In case you want a silent save, let me know and I will put up the required code.
Somewhere in your application keep an image tag as follows. The most apt place would be just below the video.
<mx:Image scaleContent="true" width="150" height="120" maintainAspectRatio="false" id="myScaledSnapshot"/>
Now with this done, do the following changes in your code:
private function capture(filename:String):void
{
    var bitmapData:BitmapData = new BitmapData(videoDisplay.width, videoDisplay.height);
    bitmapData.draw(videoDisplay, new Matrix());
    var bitmap:Bitmap = new Bitmap(bitmapData);
    var jpg:JPEGEncoder = new JPEGEncoder();
    var ba:ByteArray = jpg.encode(bitmapData);
    myScaledSnapshot.source = ba;
    var jpegEncoder:JPEGEncoder = new JPEGEncoder(50);
    var imageSnapshot:ImageSnapshot = ImageSnapshot.captureImage(this.myScaledSnapshot, 90, jpegEncoder);
    var imageByteArray:ByteArray = imageSnapshot.data;
    var newImage:File = File.desktopDirectory.resolvePath("KioskThumbs/" + filename + ".jpg");
    fileStream = new FileStream();
    fileStream.open(newImage, FileMode.UPDATE);
    fileStream.writeBytes(imageByteArray);
    fileStream.close();
    captured = true;
}
The above code really doesn't do anything special.
It just uses an Image component from Flex, makes it do the work of scaling the video image, then takes a snapshot of this resized image component and writes it to a file.
My final capture function is:
private function capture(filename:String):void
{
    var bitmapData:BitmapData = new BitmapData(videoDisplay.width, videoDisplay.height);
    bitmapData.draw(videoDisplay, new Matrix());
    var bitmap:Bitmap = new Bitmap(bitmapData);
    var jpg:JPEGEncoder = new JPEGEncoder();
    var ba:ByteArray = jpg.encode(bitmapData);
    var newImage:File = File.desktopDirectory.resolvePath("KioskThumbs/" + filename + ".jpg");
    fileStream = new FileStream();
    fileStream.open(newImage, FileMode.UPDATE);
    fileStream.writeBytes(ba);
    fileStream.close();
    captured = true;
}
It works fine except for the fact that I want to scale the photo, let's say to 150 width and 120 height, and not BitmapData(videoDisplay.width, videoDisplay.height). What can I do to solve that?
Thanks a lot, all of you!
