I'm working on a UWP app for school where I'm trying to display an MJPEG stream from my Raspberry Pi in the application. All the available decoders seem to work for Windows Phone 8.1 but not for the new UWP apps.
Is there anything I can do to use these streams in my application?
If not, is there a tool I can use to convert the streams and re-serve them on another port in the right format? It could run on the Raspberry Pi or just on Windows.
Thanks in advance
Here is an MJPEG decoder that supports UWP apps. To use it, we can download the MJPEG Decoder binaries and then reference MjpegProcessor.winmd in the project.
After this, we can use the following code to display the MJPEG stream.
public sealed partial class MainPage : Page
{
    private MjpegDecoder mjpegDecoder;

    public MainPage()
    {
        this.InitializeComponent();
        mjpegDecoder = new MjpegDecoder();
        mjpegDecoder.FrameReady += mjpeg_FrameReady;
    }

    protected override void OnNavigatedTo(NavigationEventArgs e)
    {
        mjpegDecoder.ParseStream(new Uri("URI HERE"));
    }

    private async void mjpeg_FrameReady(object sender, FrameReadyEventArgs e)
    {
        using (InMemoryRandomAccessStream ms = new InMemoryRandomAccessStream())
        {
            await ms.WriteAsync(e.FrameBuffer);
            ms.Seek(0);
            var bmp = new BitmapImage();
            await bmp.SetSourceAsync(ms);
            // image is the Image control in XAML
            image.Source = bmp;
        }
    }
}
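If no ready-made decoder works for your platform, the core of MJPEG parsing is simple enough to do by hand: scan the multipart byte stream for the JPEG start-of-image marker (0xFF 0xD8) and end-of-image marker (0xFF 0xD9) and cut each frame out. A minimal sketch of that marker-scanning step (shown here in Java; the class and method names are hypothetical, and the logic is language-independent):

```java
import java.util.ArrayList;
import java.util.List;

public class MjpegFrameScanner {
    // Extract complete JPEG frames from a raw MJPEG byte stream by
    // scanning for the JPEG SOI (0xFF 0xD8) and EOI (0xFF 0xD9) markers.
    public static List<byte[]> extractFrames(byte[] data) {
        List<byte[]> frames = new ArrayList<>();
        int start = -1;
        for (int i = 0; i + 1 < data.length; i++) {
            if ((data[i] & 0xFF) == 0xFF && (data[i + 1] & 0xFF) == 0xD8) {
                start = i;       // frame begins at the SOI marker
            } else if (start >= 0
                    && (data[i] & 0xFF) == 0xFF && (data[i + 1] & 0xFF) == 0xD9) {
                int end = i + 2; // frame ends just after the EOI marker
                byte[] frame = new byte[end - start];
                System.arraycopy(data, start, frame, 0, frame.length);
                frames.add(frame);
                start = -1;      // look for the next frame
            }
        }
        return frames;
    }
}
```

Each extracted byte array is a standalone JPEG that can be handed to any image decoder (e.g. BitmapImage.SetSourceAsync in the C# code above).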
I combined learnings from two different sources to detect open windows using UIAutomation and to move the windows:
Detect the opening of a new window in C#
Move a UI Automation Element
I can get practically every app window to move (Win32 apps and UWP apps). Except for the Microsoft Edge browser window!
QUESTION: Does anyone know how to use UIAutomation to move the Edge window? Does anyone know what is special about the Edge browser that prevents it from adhering to Microsoft's own UIAutomation library?
Here's my full console app code (required: Add references to UIAutomationClient and UIAutomationTypes):
using System;
using System.Threading;
using System.Windows.Automation;

namespace UiAutomationTest
{
    class Program
    {
        static void Main(string[] args)
        {
            Automation.AddAutomationEventHandler(
                eventId: WindowPattern.WindowOpenedEvent,
                element: AutomationElement.RootElement,
                scope: TreeScope.Children,
                eventHandler: OnWindowOpened);
            Console.ReadLine();
            Automation.RemoveAllEventHandlers();
        }

        private static void OnWindowOpened(object sender, AutomationEventArgs e)
        {
            try
            {
                var element = sender as AutomationElement;
                if (element != null)
                {
                    var hWnd = new IntPtr(element.Current.NativeWindowHandle);
                    Console.WriteLine($"Opened: {element.Current.Name} (Pid:{element.Current.ProcessId}), hWnd:{hWnd}");

                    var _windowPattern = GetControlPattern(element, WindowPattern.Pattern) as WindowPattern;
                    if (_windowPattern == null)
                    {
                        Console.WriteLine("WindowPattern is null! Aborting.");
                        return;
                    }
                    if (false == _windowPattern.WaitForInputIdle(10000))
                    {
                        Feedback("Object not responding in a timely manner.");
                        return;
                    }
                    Console.WriteLine("Window is ready for input");

                    var _transformPattern = GetControlPattern(element, TransformPattern.Pattern) as TransformPattern;
                    if (_transformPattern == null)
                    {
                        Console.WriteLine("TransformPattern is null! Aborting.");
                        return;
                    }

                    // Is the TransformPattern object moveable?
                    if (_transformPattern.Current.CanMove)
                    {
                        Console.WriteLine("Waiting 3 seconds before move...");
                        // Wait a bit
                        Thread.Sleep(3000);
                        // Move element
                        _transformPattern.Move(250, 500);
                        Console.WriteLine("Window was moved!");
                    }
                    else
                    {
                        Feedback("Window is not moveable.");
                    }
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine($"{ex.GetType()}: {ex.Message}\n{ex.StackTrace}");
            }
        }

        /// <summary>
        /// Gets a specified control pattern.
        /// </summary>
        /// <param name="ae">The automation element we want to obtain the control pattern from.</param>
        /// <param name="ap">The control pattern of interest.</param>
        /// <returns>A ControlPattern object.</returns>
        private static object GetControlPattern(AutomationElement ae, AutomationPattern ap)
        {
            if (false == ae.TryGetCurrentPattern(ap, out object oPattern))
            {
                Feedback("Object does not support the " + ap.ProgrammaticName + " Pattern");
                return null;
            }
            Feedback("Object supports the " + ap.ProgrammaticName + " Pattern.");
            return oPattern;
        }

        private static void Feedback(string message)
        {
            Console.WriteLine(message);
        }
    }
}
Here's a video showing that I can open Internet Explorer and move it, Chrome and move it, but Edge won't move!
CantMoveEdgeWindow.mp4
System Details:
Windows 10 1809
Visual Studio 2019 16.7
.NET Framework 4.7.2 C# Console App
Lastly, I have even tried old-school Win32 APIs such as EnumWindows, MoveWindow, SetWindowPlacement, and SetWindowPos. The Edge browser appears to be housed inside an ApplicationFrameHost.exe process window. When I try to move the window, I get the same result as with the UIAutomation libraries: it "says" it passed, but the window doesn't actually move!
For a couple of days I've been struggling with a problem reading temperature/humidity data from a sensor (DHT11) using an Android Things kit (i.MX7D). I've googled many examples and all of them were made for Arduino, Raspberry Pi or STM boards (so C/C++), but none for the i.MX7D and Java.
My problem is that I cannot read real temperature/humidity values, because all I get from the sensor is a boolean value indicating HIGH/LOW state. I haven't found any library for this sensor that would help convert it to real degrees/percent values.
Do you know if it's even possible to obtain these real values using the hardware that I have? If it is, could you please give me a hint or show some code how to do that, so that I can finally make some progress? I'd much appreciate any kind of help.
Here is my piece of code:
import android.app.Activity;
import android.os.Bundle;
import com.google.android.things.pio.Gpio;
import com.google.android.things.pio.GpioCallback;
import com.google.android.things.pio.PeripheralManager;
import java.io.IOException;

public class MainActivity extends Activity {

    private Gpio gpio;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        PeripheralManager manager = PeripheralManager.getInstance();
        try {
            gpio = manager.openGpio("GPIO2_IO05");
            configureInput(gpio);
            configureOutput(gpio);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private GpioCallback gpioCallback = new GpioCallback() {
        @Override
        public boolean onGpioEdge(Gpio gpio) {
            try {
                if (gpio.getValue()) {
                    System.out.println("high");
                } else {
                    System.out.println("low");
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            return true;
        }

        @Override
        public void onGpioError(Gpio gpio, int error) {
            System.out.println(gpio + ": Error event " + error);
        }
    };

    public void configureInput(Gpio gpio) throws IOException {
        gpio.setDirection(Gpio.DIRECTION_IN);
        gpio.setActiveType(Gpio.ACTIVE_HIGH);
        gpio.setEdgeTriggerType(Gpio.EDGE_BOTH);
        gpio.registerGpioCallback(gpioCallback);
    }

    public void configureOutput(Gpio gpio) throws IOException {
        gpio.setDirection(Gpio.DIRECTION_OUT_INITIALLY_HIGH);
        gpio.setActiveType(Gpio.ACTIVE_LOW);
        gpio.setValue(true);
    }
}
It's not possible to read that sensor from Android Things. It uses a one-wire protocol similar to I2C, but the GPIO speed required to read it is too fast for Android Things.
As suggested, you can read it with an Arduino and then connect the Arduino as an I2C slave, or you can use a different temperature and humidity sensor, such as the BME280, which communicates over I2C and is still reasonably cheap.
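For context on why the timing matters: the DHT11 answers with 40 bits, each encoded as a short low pulse followed by a high pulse whose width distinguishes a 0 (roughly 26-28 µs) from a 1 (roughly 70 µs), so decoding needs microsecond-resolution edge timing that the Android Things GPIO callback cannot deliver. Purely as an illustration, here is a hypothetical sketch of the decoding step itself, assuming some faster sampler had already captured the 40 high-pulse widths:

```java
public class Dht11Decoder {
    // Decode 40 DHT11 data bits from measured high-pulse widths (in µs).
    // A high pulse near 70 µs encodes a 1; near 26-28 µs encodes a 0.
    // Returns {humidityInt, humidityDec, tempInt, tempDec}, or null if
    // the input is malformed or the checksum byte does not match.
    public static int[] decode(int[] highPulseMicros) {
        if (highPulseMicros == null || highPulseMicros.length != 40) return null;
        int[] bytes = new int[5];
        for (int i = 0; i < 40; i++) {
            bytes[i / 8] <<= 1;
            if (highPulseMicros[i] > 50) { // threshold between ~28 µs and ~70 µs
                bytes[i / 8] |= 1;
            }
        }
        // The 5th byte is the low 8 bits of the sum of the first four.
        int checksum = (bytes[0] + bytes[1] + bytes[2] + bytes[3]) & 0xFF;
        if (checksum != bytes[4]) return null;
        return new int[] { bytes[0], bytes[1], bytes[2], bytes[3] };
    }
}
```

The decoding itself is trivial; the hard part, and the reason the answer above recommends offloading to an Arduino or switching sensors, is capturing those pulse widths accurately in the first place.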
I'm working on a (Universal Windows) C++/CX DirectX project, which builds to a DLL used in a C# UWP project.
I'm using the DirectX Toolkit to load textures.
I already use it to create a texture from a file, but now I need to create a texture from a byte array that was sent from the UWP project.
But when trying to use CreateWICTextureFromMemory(), the HRESULT says 0x88982F50: "The component cannot be found".
All I can find about this problem indicates the bytes are not a correct image, but I tested it in the UWP project: there I get the byte array from Bing Maps (it's a static map image), and I could make a working image from these bytes.
Does anyone know what I'm doing wrong?
UWP C# download code (to get the bytes):
private async Task DownloadTexture()
{
    byte[] buffer = null;
    try
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(_url);
        WebResponse response = await request.GetResponseAsync();
        using (Stream stream = response.GetResponseStream())
        using (MemoryStream ms = new MemoryStream())
        {
            stream.CopyTo(ms);
            buffer = ms.ToArray();
        }
    }
    catch (Exception exception)
    {
        Logger.Error($"Could not Download Texture: {exception}");
    }
    _track3D.SetImage(out buffer[0], (ulong)buffer.Length);
}
DirectX C++ code (that fails):
void Track3D::SetImage(uint8_t* ddsData, size_t ddsDataSize)
{
    HRESULT result = CreateWICTextureFromMemory(_d3dDevice.Get(), _d3dContext.Get(), ddsData, ddsDataSize, nullptr, _Terrain.ReleaseAndGetAddressOf());
    // here it goes wrong
    // TODO: use the texture
}
UWP C# test code that works (displays image):
private async void setImage(byte[] buffer) // test
{
    try
    {
        BitmapImage bmpImage = new BitmapImage();
        using (InMemoryRandomAccessStream stream = new InMemoryRandomAccessStream())
        {
            await stream.WriteAsync(buffer.AsBuffer());
            stream.Seek(0);
            await bmpImage.SetSourceAsync(stream);
        }
        Image image = new Image();
        image.Source = bmpImage;
        ((Grid)Content).Children.Add(image);
    }
    catch (Exception exception)
    {
        Logger.Error($"{exception}");
    }
}
EDIT:
OK, turns out the first byte in the buffer is different in the C++ code, than it was when sent from UWP. When I change that first byte to the correct value in the C++ code (as a test), the texture is correctly created.
Which raises the question, why did the value of the first byte change?
(Or what did I do wrong?)
As requested, the function SetImage() looks like this in C#:
[MethodImpl]
public void __ITrack3DPublicNonVirtuals.SetImage(out byte ddsData, [In] ulong ddsDataSize);
(also, I just realised the parameter names still have 'dds' in their name, sorry about that, will change that in my code as it is misleading)
0x88982F50: “The component cannot be found”
This is WINCODEC_ERR_COMPONENTNOTFOUND, which happens whenever WIC can't determine which format codec to use for a file/binary. The problem is that your transfer of the data from managed to native code is wrong.
Your interop method is set to:
[MethodImpl]
public void __ITrack3DPublicNonVirtuals.SetImage(out byte ddsData, [In] ulong ddsDataSize);
With the C++ method signature being:
void Track3D::SetImage(uint8_t* ddsData, size_t ddsDataSize)
Because of the out, your first parameter is being passed as a safe array with the length in the first element.
Instead you should use:
SetImage([In] byte ddsData, [In] ulong ddsDataSize); // C#
void Track3D::SetImage(const uint8_t* ddsData, size_t ddsDataSize); // C++.
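WIC resolves the codec by sniffing the leading bytes of the buffer, which is why a single corrupted first byte was enough to produce "component not found". A quick, hypothetical sanity check before handing a downloaded buffer to any decoder is to test the magic numbers yourself (illustrated here in Java; the byte comparisons are the same in any language):

```java
public class ImageSniffer {
    // Identify a few common image formats by their leading magic bytes,
    // mirroring the kind of header-based codec lookup a decoder performs.
    public static String sniff(byte[] buf) {
        if (buf.length >= 8
                && (buf[0] & 0xFF) == 0x89 && buf[1] == 'P' && buf[2] == 'N' && buf[3] == 'G') {
            return "png";
        }
        if (buf.length >= 3
                && (buf[0] & 0xFF) == 0xFF && (buf[1] & 0xFF) == 0xD8 && (buf[2] & 0xFF) == 0xFF) {
            return "jpeg";
        }
        if (buf.length >= 6 && buf[0] == 'G' && buf[1] == 'I' && buf[2] == 'F') {
            return "gif";
        }
        return null; // unknown header: a decoder would likely fail to find a codec
    }
}
```

Corrupting just the first byte, as happened in the question, is enough to make the sniff fail, which matches the observed behavior of CreateWICTextureFromMemory.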
I am developing a web scraper using the JavaFX WebView. For scraping purposes, I don't need the images to be loaded. When a page is being loaded, WebKit spawns lots of UrlLoader threads, so I think it's better to disable images and save a lot of system resources. Does anyone know how to disable automatic image loading in WebView?
Solution Approach
Define your own protocol handler for http and filter out anything with an image mime type or content.
URL.setURLStreamHandlerFactory(new HandlerFactory());
Sample Code
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.layout.StackPane;
import javafx.scene.web.*;
import javafx.stage.Stage;
import java.io.IOException;
import java.net.*;

public class LynxView extends Application {
    private static final String BLANK_IMAGE_LOC =
            "https://upload.wikimedia.org/wikipedia/commons/c/ce/Transparent.gif";
    public static final String WEBSITE_LOC =
            "http://fxexperience.com";
    public static final String IMAGE_MIME_TYPE_PREFIX =
            "image/";

    @Override
    public void start(Stage stage) throws Exception {
        WebView webView = new WebView();
        WebEngine engine = webView.getEngine();
        engine.load(WEBSITE_LOC);
        stage.setScene(new Scene(new StackPane(webView)));
        stage.show();
    }

    public static void main(String[] args) throws IOException {
        URL.setURLStreamHandlerFactory(new URLStreamHandlerFactory() {
            @Override
            public URLStreamHandler createURLStreamHandler(String protocol) {
                if ("http".equals(protocol)) {
                    return new sun.net.www.protocol.http.Handler() {
                        @Override
                        protected URLConnection openConnection(URL url, Proxy proxy) throws IOException {
                            String[] fileParts = url.getFile().split("\\?");
                            String contentType = URLConnection.guessContentTypeFromName(fileParts[0]);
                            // this small hack is required because, weirdly, svg is not picked up
                            // by guessContentTypeFromName: for Java 8, svg is not in
                            // $JAVA_HOME/lib/content-types.properties
                            if (fileParts[0].endsWith(".svg")) {
                                contentType = "image/svg";
                            }
                            System.out.println(url.getFile() + " : " + contentType);
                            if (contentType != null && contentType.startsWith(IMAGE_MIME_TYPE_PREFIX)) {
                                return new URL(BLANK_IMAGE_LOC).openConnection();
                            } else {
                                return super.openConnection(url, proxy);
                            }
                        }
                    };
                }
                return null;
            }
        });
        Application.launch();
    }
}
Sample Notes
The sample uses concepts from:
Getting A File's Mime Type In Java
The sample only probes the filename to determine the content type and not the input stream attached to the url. Though probing the input stream would be a more accurate way to determine if the resource the url is connected to is actually an image or not, it is slightly less efficient to probe the stream, so the solution presented trades accuracy for efficiency.
The provided solution only demonstrates locations served by a http protocol, and not locations served by a https protocol.
The provided solution uses a sun.net.www.protocol.http.Handler class which may not be publicly visible in Java 9, (so the solution might not work for Java 9).
The urlStreamHandlerFactory is a global setting for the JVM, so once it is set, it will stay that way (e.g. all images for any java.net.URL connections will be ignored).
The sample solution returns a blank (transparent) image, which it loads over the net. For efficiency, the image could be loaded as a resource from the classpath instead of over the net.
You could return a null connection rather than a connection to a blank image. If you do so, the WebView code will start reporting null pointer exceptions to the console because it is not getting the URL connection it expects, and it will replace all images with an "x" image to show that the image is missing (I wouldn't really recommend an approach which returns a null connection).
public URLStreamHandler createURLStreamHandler(String protocol) {
    if ("http".equals(protocol)) {
        return new URLFortuneHandler();
    } else {
        return null;
    }
}

public class URLFortuneHandler extends sun.net.www.protocol.http.Handler {
    protected URLConnection openConnection(URL url) throws IOException {
        String file = url.getFile();
        int mid = file.lastIndexOf('.');
        String ext = file.substring(mid + 1);
        if ("jpg".equals(ext) || "png".equals(ext))
            return somethinghere;
        else
            return super.openConnection(url);
    }
}
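The filename-probing step used by both snippets above can be exercised in isolation. This is a hypothetical refactoring (the class name and the "image/svg+xml" fallback are my own choices, not part of the original samples), showing how the query string is stripped and the Java 8 svg gap is papered over:

```java
import java.net.URLConnection;

public class ImageTypeProbe {
    // Guess a content type from the file part of a URL, adding the .svg
    // special case that is missing from Java 8's content-types.properties.
    public static String guessContentType(String urlFile) {
        String path = urlFile.split("\\?")[0]; // strip the query string
        String contentType = URLConnection.guessContentTypeFromName(path);
        if (contentType == null && path.endsWith(".svg")) {
            contentType = "image/svg+xml";
        }
        return contentType;
    }

    // True if the URL looks like it points at an image resource.
    public static boolean isImage(String urlFile) {
        String contentType = guessContentType(urlFile);
        return contentType != null && contentType.startsWith("image/");
    }
}
```

A handler would call isImage() and substitute the blank-image connection only when it returns true, leaving all other resources to the default handler.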
I am trying to display a PNG image on a BlackBerry device running OS 5.0 using a J2ME MIDlet class instead of a BlackBerry RIMlet class. Can I use a J2ME MIDlet instead of RIMlets? Would it be compatible with BlackBerry, given that BlackBerry does support J2ME? Can I get the image from it?
To display an image on the screen of a BlackBerry® device, create an Image object and populate it by calling the static Image.createImage() method. Provide the location of the image as a parameter.
Refer to: display a PNG image using J2ME MIDlet classes on a BlackBerry device
Can I use J2ME MIDlet instead of RIMlets...
YES, but RIMlets have certain advantages, as mentioned here.
And if you want to go with a MIDlet, here is an example using ImageItem:
import javax.microedition.lcdui.*;
import javax.microedition.midlet.*;

public class ImageItemMIDlet extends MIDlet implements CommandListener {

    private Command exit;
    private ImageItem imageItem;
    private Image image;
    private Display display;
    private Form form;

    public ImageItemMIDlet() {
        try {
            image = Image.createImage("/yourImage.png");
        } catch (Exception e) { }
        imageItem = new ImageItem("This is the IMAGE_ITEM Application",
                image, ImageItem.LAYOUT_DEFAULT, "image");
    }

    public void startApp() {
        form = new Form("ImageItem Example");
        display = Display.getDisplay(this);
        exit = new Command("Exit", Command.EXIT, 1);
        form.append(imageItem);
        form.addCommand(exit);
        form.setCommandListener(this);
        display.setCurrent(form);
    }

    public void pauseApp() { }

    public void destroyApp(boolean unconditional) {
        notifyDestroyed();
    }

    public void commandAction(Command c, Displayable d) {
        String label = c.getLabel();
        if (label.equals("Exit")) {
            destroyApp(true);
        }
    }
}
import java.io.IOException;
import javax.microedition.lcdui.*;
import javax.microedition.midlet.*;

public class Midlet extends MIDlet {

    public Display display;

    public void startApp() {
        Canvas obj = new DrawImage();
        display = Display.getDisplay(this);
        display.setCurrent(obj);
    }

    public void pauseApp() {
    }

    public void destroyApp(boolean unconditional) {
    }

    public class DrawImage extends Canvas {
        int width = getWidth();
        int height = getHeight();

        protected void paint(Graphics g) {
            try {
                Image image = Image.createImage("/Waterfall.png");
                if (image != null)
                    g.drawImage(image, 0, 0, Graphics.TOP | Graphics.LEFT);
                else
                    System.out.println("image not found");
            } catch (IOException ex) {
                System.out.println(ex);
            }
        }
    }
}
It's better to use a MIDlet with a Canvas: if you use a MIDlet with a Form, it shows the image, but it also shows the phone's theme in the background of the form. With a Canvas you can also draw a background image behind your foreground image.
Thanks