I am creating software in Delphi 10.2 Tokyo using FMX. I want to show a video in the background, behind all components, using TMediaPlayerControl. I send the TMediaPlayerControl to the back and play the video, but instead of playing behind the components, the video plays on top of them.
I put a TMemo in the client area with a TMediaPlayerControl behind it. It should play the video behind the TMemo, but instead it plays on top of it.
How can I play video in the background behind the TMemo?
If you want to do this, you will need a video player that can render directly on the OpenGL surface (i.e. render onto the Delphi form itself). As far as I know there is only one video player that can do this: the TALVideoPlayer implemented in Alcinoe (https://sourceforge.net/projects/alcinoe/). The current implementation is only for iOS/Android, but the Windows implementation seems quite simple to do, and the macOS version should be trivial since it uses the same API as iOS.
You can use libvlc to render directly to a form or whatever window you want.
Call libvlc_media_player_set_hwnd() to give the media player the window handle where it should render its video output (that is the Windows call; libvlc_media_player_set_xwindow() and libvlc_media_player_set_nsobject() are the X11 and macOS equivalents).
That way you can show a video in the background, behind all components.
I'm using the wrapper classes and components from
http://prog.olsztyn.pl/paslibvlc
Very simple indeed.
Checked in FMX with Delphi 10.4.2, and it worked properly (libVLC 3.0.14, the latest at the time).
I'm developing a hybrid app for iOS using Ionic/Cordova. The app implements some typical WebRTC features, such as video calls and file transfers between two peers.
For this purpose I'm using cordova-plugin-iosrtc which exposes all W3C WebRTC APIs.
While the RTCPeerConnection, getUserMedia and other JavaScript WebRTC API implementations are pretty good, the video element to which streams are attached is replaced by a native UIView layer (see Usage).
As a result, you can't fully control the pseudo-video element (that is, the UIView) via JS. For example, it's not possible to resize the video, reposition it, change its CSS properties, and so on. The UIView's size and position stay fixed at the initial values of the HTML video element.
Is there a workaround or an alternative for this limitation of the iosrtc Cordova plugin (opened as an issue)?
You're right that the video element is not an actual HTML DOM element displaying the video; however, the library mimics the CSS of the video element to the best of its ability (for dimensions and positioning).
You can still manipulate the video element using JavaScript, but you must call an iosrtc method afterwards to update the UIView:
iosrtc.refreshVideos()
More information can be found here: https://github.com/eface2face/cordova-plugin-iosrtc/blob/master/docs/iosrtc.md#iosrtcrefreshvideos
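To make the sequencing concrete, here is a minimal sketch of the pattern described above: change the element's CSS first, then tell the plugin to re-sync the native UIView. The helper name and the injectable `refresh` parameter are my own additions for illustration and testability; in a real Cordova app you would pass the plugin's `iosrtc.refreshVideos` function as the callback.

```javascript
// Sketch: reposition/resize the video element via JS, then ask the plugin to
// re-read the element's CSS and move the native UIView accordingly.
// `refresh` is injected so the helper can be exercised outside Cordova;
// in the app, pass cordova.plugins.iosrtc.refreshVideos (see the docs above).
function repositionVideo(videoEl, styles, refresh) {
  Object.assign(videoEl.style, styles); // e.g. { top: '10px', width: '320px' }
  refresh();                            // plugin syncs the UIView with the new CSS
}
```

The key point is simply that the CSS change alone does nothing to the UIView; the refresh call after it is what propagates the new geometry to the native layer.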
cordova-plugin-iosrtc has since been updated and the issue is fixed; see https://github.com/eface2face/cordova-plugin-iosrtc/pull/179.
Update the plugin and it will solve the issue.
Is there a way to use actual iOS controls inside a Qt application? As one specific example, iOS has a switch control with a very specific look that is quite different from the QML Switch, and I don't think there is a QWidget that does quite the same thing (QCheckBox is probably the closest).
Creating my own control where I build all the UI pieces so it looks like the actual iOS control is not an option, as one of the requirements I've been given is that the resulting program should match the control style of whichever iOS version it's installed on (within reason of course).
Some ideas:
Find out if there's a way for iOS to render a control off-screen to a buffer. Then use that as an image for your control.
Make a proxy for the native control, and overlay it on top of your UI. The proxy should relay the position from QML to native, and the size and state from native to QML.
Have a short lived (could be just one frame) screen that renders the controls you wish, in the states you need, then capture that. This only makes sense if you can't render the controls off-screen.
I'm hoping it's possible to make interactive presentations that play sounds on certain user events using the canvas, and have them work properly and load on iOS, either as a mobile site, a PhoneGap-type thing, or a wrapped webview. I know HTML5 video is a total dead end on iOS (stupid) because of the no-autoplay and gobble-all-clicks issues. If I'm not using video, just moving pictures around (think "Ken Burns effect" with buttons too), and I want some short audio clips to play when a button is clicked, will that work on iOS, or are there crazy no-autoplay/no-preload restrictions that prevent that too? Any pointers to working examples would be fine too; the signal-to-noise ratio when searching for this is abysmal. ;-)
For playing audio on iOS and Android you have to use the HTML5 audio tag:
<audio controls>
<source src="yourFile.mp3" type="audio/mpeg">
</audio>
This comes with built-in controls. If you want a custom look and feel, there are some jQuery libraries you can use for that.
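On the no-autoplay question: iOS will generally allow playback that is started synchronously inside a user-gesture handler, so a click-triggered sound effect is workable without the built-in controls. Below is a small sketch of that pattern; the helper name and the injectable `audioFactory` are my own (for testability), and in a page you would pass something like `() => new Audio('click.mp3')` with your own file path.

```javascript
// Sketch: play a short clip in direct response to a tap/click. On iOS the
// play() call must happen synchronously inside the gesture handler itself;
// deferring it (e.g. via setTimeout) can get it blocked.
// `audioFactory` is injected so the helper can be tested without a browser.
function bindClickSound(button, audioFactory) {
  button.addEventListener('click', function () {
    audioFactory().play(); // synchronous play inside the user gesture
  });
}
```

The design point is that the sound is created and started from within the event handler; that is what distinguishes a permitted gesture-initiated sound from the autoplay that iOS blocks.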
I am working on a sample application like Vine. My requirement is to create a 'ghost' filter for video, as in Vine.
The actual requirement is:
-Record a video on long press on the view
-On pausing the recording, show the last frame of the recorded video above my view. Please see the expected behaviour here
I have checked the PBJVision library and found this feature working there, but I need to implement it separately in my application.
While analysing the code, I found that this can be achieved using OpenGL ES. I tried using a GLKView, but it just shows a dark shade instead of the image frame. Since I am new to this area, please help me.
Hi, I've been trying to find an answer to this question, but all I can find are people talking about converting the SWF file to play on an Xbox. I don't want to convert it, I just want to play it. I'm trying to make a full Flash menu that can be shown while the XNA game is paused; the menu does not have any control over any XNA parameters.
Thanks..VilkaS
You might want to take a look at Robin Debreuil's blog, where he shows how to use SWF files in XNA.