JavaFX WebView and YouTube videos

I'm building a simple web browser using JavaFX, but I've found that there are some problems with playing YouTube videos in it.
To reproduce the problem, just run this simple code:
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.web.WebView;
import javafx.stage.Stage;

public class Main extends Application {
    @Override
    public void start(Stage primaryStage) {
        try {
            WebView root = new WebView();
            Scene scene = new Scene(root, 400, 400);
            primaryStage.setScene(scene);
            primaryStage.show();
            root.getEngine().load("http://youtube.com");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        launch(args);
    }
}
When I try to play a YouTube video, it sometimes works, but more often it shows the message "An error occurred, please try again later" (with the same video).
As far as I know, JavaFX uses WebKit, so there should not be big differences between playing YouTube (HTML5) videos in Chrome/Firefox and in WebView. I tried JRE 1.8.0 and 1.8.20 on two different notebooks, and it didn't help.
When the video does somehow start working, there is this error message:
Outstanding resource locks detected:
D3D Vram Pool: 15 529 382 used (5,8%), 15 529 382 managed (5,8%), 268 435 456 total
41 total resources being managed
average resource age is 12.8 frames
0 resources at maximum supported age (0,000000)
10 resources marked permanent (24,400000)
3 resources have had mismatched locks (7,300000)
3 resources locked (7,300000)
15 resources contain interesting data (36,600000)
0 resources disappeared (0,000000)
So I think these two problems may be connected to memory leaks in WebView. Am I right, or do you think there is another way to solve this problem?

Related

ngx-videogular not playing vgHls stream in iOS browsers

I am using ngx-videogular in one of my applications to stream live media. It works perfectly everywhere except the browsers on iPhone/iPad. I am using Hls.js along with ngx-videogular for streaming. Is there anything else I need to consider to make it work in browsers (Chrome/Safari/Firefox) on iOS (iPhone or iPad)?
Thanks in advance
I have found the solution, so I'm posting it here in case someone needs it in the future.
Before reading further, please take a look at the issue described in detail here, which helped me understand the root cause:
https://github.com/video-dev/hls.js#embedding-hlsjs
If you are using npm, you need to add Hls.js to angular.json.
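For reference, the entry could look roughly like this under your project's architect > build > options section in angular.json (the node_modules/hls.js/dist path is an assumption based on the standard hls.js package layout, so adjust it to your setup):
"scripts": [
  "node_modules/hls.js/dist/hls.min.js"
]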
You have to import Hls from hls.js in your videoPlayer component:
import Hls from 'hls.js';
and you need to update your ngAfterViewInit() method like below:
ngAfterViewInit(): void {
    this.videoElement = this.videoElementRef?.nativeElement;
    if (Hls.isSupported()) {
        // Browsers without native HLS support (Chrome, Firefox, desktop Safari)
        const hls = new Hls();
        hls.loadSource('your media url');
        hls.attachMedia(this.videoElement);
        // Use an arrow function so `this` still refers to the component
        hls.on(Hls.Events.MANIFEST_PARSED, () => {
            this.videoElement.play();
        });
    } else if (this.videoElement.canPlayType('application/vnd.apple.mpegurl')) {
        // iOS Safari plays HLS natively, so just set the source directly
        this.videoElement.src = 'your media url';
    }
}
Make sure you have the HTMLVideoElement and the element reference declared in your .ts file:
@ViewChild('videoPlayer') private videoElementRef: ElementRef;
videoElement!: HTMLVideoElement;
And your HTML video tag needs to be updated with the reference like this:
<video #videoPlayer autoplay controls playsinline></video>
Have a good day!

Mono audio output in iOS app when using a WebRTC-powered video call

The app I'm writing contains two parts:
An audio player that plays stereo MP3 files
Video conferencing using webRTC
Each part works perfectly in isolation, but the moment I try them together, one of two things happens:
The video conference audio fades out and we just hear the audio files (in stereo)
We get audio output from both, but the audio files are played in mono, coming out of both ears equally
My digging has taken me down a few routes:
https://developer.apple.com/forums/thread/90503
&
https://github.com/twilio/twilio-video-ios/issues/77
Both suggest that the issue could be with the audio session category, mode, or options.
However, I've tried lots of the combos and am struggling to get anything working as intended.
Does anyone have a better understanding of the audio options who can point me in the right direction?
My most recent combination:
class BBAudioClass {
    static private var audioCategory: AVAudioSession.Category = AVAudioSession.Category.playAndRecord
    static private var audioCategoryOptions: AVAudioSession.CategoryOptions = [
        AVAudioSession.CategoryOptions.mixWithOthers,
        AVAudioSession.CategoryOptions.allowBluetooth,
        AVAudioSession.CategoryOptions.allowAirPlay,
        AVAudioSession.CategoryOptions.allowBluetoothA2DP
    ]
    static private var audioMode = AVAudioSession.Mode.default

    static func setCategory() -> Void {
        do {
            let audioSession: AVAudioSession = AVAudioSession.sharedInstance()
            try audioSession.setCategory(
                BBAudioClass.audioCategory,
                mode: BBAudioClass.audioMode,
                options: BBAudioClass.audioCategoryOptions
            )
        } catch {
            // setCategory failed; the error is silently ignored here
        }
    }
}
Update
I managed to get everything working as I wanted by:
Starting the audio session
Connecting to the video conference (at this point all audio is mono)
Forcing all output to the speaker
Forcing output back to the headphones
Obviously this is a crazy thing to have to do, but it does prove that it should work.
It would be great if anyone knew WHY this works, so that I can actually get things working properly the first time without going through all these hacky steps.
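For illustration, a minimal sketch of what steps 3 and 4 (forcing output to the speaker and then back) might look like with AVAudioSession, assuming the session has already been configured as above and the calls run on the main thread:
import AVFoundation

func forceSpeakerThenBack() {
    let session = AVAudioSession.sharedInstance()
    do {
        // Step 3: route all output to the built-in speaker
        try session.overrideOutputAudioPort(.speaker)
        // Step 4: remove the override so audio returns to the normal route
        // (headphones, if connected)
        try session.overrideOutputAudioPort(.none)
    } catch {
        print("Failed to override audio output port: \(error)")
    }
}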

Testing React Native with Appium: how to get past "Permit drawing over other apps"?

In my initial proof of concept for testing a React Native mobile app with Appium, I noticed that when I load the APK to start my test, I am presented with an Android prompt to "Permit drawing over other apps" while my AndroidDriver instance is being created. If I move the slider manually, then click the back button, all is good -- the app loads fully, and my test proceeds. However, I don't see how to do this automatically from my Appium script, because it looks like the driver instantiation will not complete until the slider is moved.
Most people don't see this in Appium testing as it appears to be specific to React Native in dev mode, as seen here...
Here's my code where I've put in conditional code to click the slider, but I never get there because it waits at the "driver = ..." line:
AndroidDriver driver = null;

@Before
public void setUp() throws Exception {
    // < defining capabilities (emulator6p) up here...>
    // initialize driver object
    try {
        driver = new AndroidDriver<WebElement>(new URL("http://127.0.0.1:4723/wd/hub"), emulator6p);
    } catch (MalformedURLException e) {
        System.out.println(e.getMessage());
    }
}

@Test
public void test1() throws Exception {
    By BY_slider_permitDrawing = By.id("android:id/switchWidget");
    boolean present = (driver.findElements(BY_slider_permitDrawing).size() == 1);
    if (present) {
        driver.findElement(BY_slider_permitDrawing).click();
        driver.navigate().back();
    }
    WebElement button_begin = driver.findElementByAccessibilityId("button-lets-begin");
    button_begin.click();
}
I definitely hear plenty of people say that Appium is a viable solution for React Native testing, and I really need to get over this hump.
Thanks in advance for any suggestions!
jph
p.s. In case it wasn't really clear, the test does NOT hang at the "driver = " line if I am not loading the APK from scratch, but I will need to do that for CI testing in the future.
Setting the right capabilities helped me run Appium tests in debug mode on Android:
{
...
"appWaitPackage": "com.android.settings, com.yourPackage",
"appWaitActivity": "com.android.settings.Settings$AppDrawOverlaySettingsActivity, com.yourPackage.MainActivity",
}
Replace yourPackage with the real package name used in your Android app. Some docs here: https://appium.io/docs/en/writing-running-appium/caps/#android-only
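If you build the capabilities in Java (as in the question's setUp()), the same two entries could be set on the DesiredCapabilities object, roughly like this; com.yourPackage is still the placeholder from the answer:
// inside setUp(), before the driver is created:
emulator6p.setCapability("appWaitPackage", "com.android.settings, com.yourPackage");
emulator6p.setCapability("appWaitActivity",
        "com.android.settings.Settings$AppDrawOverlaySettingsActivity, com.yourPackage.MainActivity");
// the driver creation line from the question stays the same:
driver = new AndroidDriver<WebElement>(new URL("http://127.0.0.1:4723/wd/hub"), emulator6p);
With these set, Appium treats both the settings overlay screen and your app's main activity as valid launch targets, so the driver instantiation can complete even when the permission screen appears first.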

Problem playing sound on iPad device with MonoTouch

I am using the following code to play a .CAF file in an iPad application via Monotouch:
public void PlayClick()
{
PlaySound("Media/Click.caf");
}
private void PlaySound(string soundFile)
{
//var mediaFile = NSUrl.FromFilename(soundFile);
//var audioPlayer = new SystemSound(mediaFile);
//audioPlayer.PlaySystemSound();
var audioPlayer = new SystemSound(soundFile);
if (audioPlayer != null)
{
audioPlayer.PlaySystemSound();
}
}
It works fine in the simulator - in fact, all variations I've tried (SystemSound, AVAudioPlayer, etc.) appear to work OK in the simulator, but I've not gotten a version to play on a real device yet. The sound files are all marked as Content, and I checked the bundle uploaded to the iPad and the files are definitely there in a subfolder named "Media". If I change the code to use SystemSound (via the constructor with Url), I get an InvalidOperationException with the details:
Could not create system sound ID for url file://localhost/private/var/mobile/Applications/AC24496E-12E9-4690-B154-BA1AD1123EDC/Sample.app/Media/Click.caf; error=SystemSoundUnspecified
Anyone know what I am doing wrong? Thanks for any pointers to get me past this issue!
The simulator's file system is case-insensitive, but the actual device's is case-sensitive. This has tripped up many people.
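One quick way to catch a casing mismatch is to check for the file at runtime with the exact path you pass to SystemSound; on the device this check goes through the case-sensitive file system, so it fails if the folder or file name casing differs. A minimal sketch, using the question's Media/Click.caf path:
using System;
using System.IO;
using MonoTouch.Foundation;

// File.Exists is a pass-through to the device's case-sensitive file system,
// so a casing mismatch in "Media" or "Click.caf" shows up here.
string path = Path.Combine(NSBundle.MainBundle.BundlePath, "Media/Click.caf");
if (!File.Exists(path))
{
    Console.WriteLine("Click.caf was not found - check the exact casing of the folder and file name.");
}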

BlackBerry camera programming

I want to take pictures using the BlackBerry camera for my app. Is it possible in OS 5.0, and if yes, then how?
Yes, it is definitely possible, but it's not a very simple task if you don't get some advice up front.
First and foremost, there is some sample code shipped with the Eclipse package at least (CameraDemo) that shows how to create a viewfinder using a Field, a Player, and a VideoControl. The biggest issue is that third-party developers cannot overlay anything on top of the viewfinder (which is what they'll call the Field after you set it as such with a VideoControl).
Also, you are very limited in what size you can set the Field to -- I only got half-size and fullscreen working; some dimensions got ignored and others caused it not to be displayed at all.
Here is some code that shows this:
private Field _videoField;
private Player _player;
private VideoControl _videoControl;

private void initCamera() {
    try {
        _player = Manager.createPlayer("capture://video?encoding=jpeg&width=640&height=480");
        _player.realize();
        _player.prefetch();
        _videoControl = (VideoControl) _player.getControl("VideoControl");
        _player.start();
        if (_videoControl != null) {
            // Render the viewfinder as a Field so it can be added to the screen
            _videoField = (Field) _videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
            _videoControl.setDisplayFullScreen(true);
            add(_videoField);
        }
    } catch (Exception e) {
        // show error
    }
}
After you do this you can use
byte[] image = _videoControl.getSnapshot(sizeAndEncodingParameters);
to snap the picture. To determine what sizeAndEncodingParameters your device supports, you can use System.getProperty("video.snapshot.encodings"), which returns a String of space-separated encodings that you can parse to decide what to use.
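As a rough sketch of that last step (the space-separated parsing and picking the first advertised encoding are assumptions, not something a particular device guarantees):
// Query the encodings the device supports for snapshots, e.g.
// "encoding=jpeg&width=640&height=480 encoding=jpeg&width=1024&height=768 ..."
String encodings = System.getProperty("video.snapshot.encodings");
if (encodings != null && _videoControl != null) {
    // Pick the first advertised encoding; a real app might search for a preferred size
    int space = encodings.indexOf(' ');
    String firstEncoding = (space > 0) ? encodings.substring(0, space) : encodings;
    try {
        byte[] image = _videoControl.getSnapshot(firstEncoding);
        // image now holds the snapshot bytes, ready to save or display
    } catch (MediaException e) {
        // handle the error (e.g. snapshot not allowed or unsupported encoding)
    }
}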
Take a look at the samples that come with the BB SDK installation on your PC. There is a CameraDemo sample. You can just try searching for CameraDemo.java on your HDD if you're unsure where those samples are.
