Help - Blackberry BrowserField2, Media Player and Threads - blackberry

In my application, I have a BrowserField2 added to a MainScreen and a media player based on the "Streaming media - Start to finish" sample. I am trying to open the media player from the browser using extended JavaScript. The plan is that when the user clicks certain links in the web page, I call an extended JavaScript function with parameters such as the URL of the video to stream. This function in turn pushes the media player screen with the URL that was passed in. The media player works well and streams video when used stand-alone, but it doesn't play video when coupled with the BrowserField via extended JavaScript.
I suspect the issue is synchronization with the event thread, or something else threading-related. I push the screen containing the media player using a Runnable, and the screen is displayed. But when I click the play button (which starts some threads to fetch the video and play it), nothing happens and my application freezes. I am not able to figure out the exact problem. I would appreciate it if someone could pinpoint it.
Thank you.
Some relevant code listings as below:
public void extendJavaScript() throws Exception
{
    // Expose bb.playVideo(url) to the page loaded in the BrowserField
    ScriptableFunction playVideo = new ScriptableFunction()
    {
        public Object invoke(Object thiz, Object[] args) throws Exception
        {
            openMediaPlayer(args[0].toString());
            return Boolean.FALSE;
        }
    };
    _bf2.extendScriptEngine("bb.playVideo", playVideo);
}

private void openMediaPlayer(final String url)
{
    // Push the player screen on the event thread; invokeAndWait blocks the
    // calling (scripting) thread until the runnable has completed.
    UiApplication.getUiApplication().invokeAndWait(new Runnable() {
        public void run() {
            PlayerScreen _playerScreen = new PlayerScreen(url + ";deviceside=true");
            UiApplication.getUiApplication().pushScreen(_playerScreen);
        }
    });
}
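For reference, here is the same method with invokeLater instead of invokeAndWait (just a sketch: invokeLater queues the runnable on the event thread instead of blocking the BrowserField's scripting thread until the screen push completes):

private void openMediaPlayer(final String url)
{
    // Non-blocking variant: the scripting thread returns immediately and the
    // screen is pushed when the event thread gets to the runnable.
    UiApplication.getUiApplication().invokeLater(new Runnable() {
        public void run() {
            PlayerScreen playerScreen = new PlayerScreen(url + ";deviceside=true");
            UiApplication.getUiApplication().pushScreen(playerScreen);
        }
    });
}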

Never mind. Got it resolved. It turned out that the video I was trying to access from the web page was in an incompatible format, so it threw an error and froze the media player.
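In case it helps anyone hitting the same symptom, here is a minimal sketch of failing fast on an unsupported format instead of freezing. It assumes the player screen ultimately creates its Player through MMAPI's Manager, as in the "Streaming media - Start to finish" sample; the startPlayback method name is purely illustrative:

import java.io.IOException;
import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;

void startPlayback(String url)
{
    try {
        Player player = Manager.createPlayer(url);
        player.realize();   // throws MediaException for unsupported content types
        player.prefetch();
        player.start();
    } catch (MediaException me) {
        // Surface the problem (e.g. a Dialog on the event thread) rather than hanging
        System.err.println("Unsupported media: " + me.getMessage());
    } catch (IOException ioe) {
        System.err.println("I/O problem: " + ioe.getMessage());
    }
}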

Related

How to record audio and play it back in Safari on iOS in one click?

Problem: Safari will throw exceptions when certain actions are performed on audio elements inside of a callback.
This includes:
Playback
Setting the src property
Playing an object URL that was recorded/generated by the user
I need to record audio and visualize the data in one step. When a button is clicked, the recorded audio should be played back.
My application works like this: The user presses a record button, and the application begins recording audio once the user gives permission for the app to record. When the user presses the stop button, the recording is stopped and then two asynchronous operations happen:
Recording Phase
I. The user stops the recording by pressing a stop button in the UI. This invokes the stop method on the recording controller class.
II. Two asynchronous methods fire here:
1. The audio is decoded into an object URL and set as the src of an HTMLAudioElement that is not loaded into the DOM.
2. The audio is decoded into a buffer and then visualized on the screen.
//RecordingController
public stopRecording(): Promise<Blob>
{
    return new Promise((resolve, reject) =>
    {
        if (this._isRecording)
        {
            this._isRecording = false;
            this._mediaRecorder.stop();
            this._mediaRecorder.onstop = () => {
                const blob = new Blob(this._recordBuffer, { 'type': 'audio/ogg; codecs=opus' });
                this._recordBuffer = [];
                this.emitter.emit("recording-stopped", blob);
                //another view subscribes to recording-stopped and visualizes the data
                resolve(blob);
            };
        }
        else {
            reject();
        }
    });
}
//Vue UI - invoked when pressing the stop button
protected stopRecording()
{
    this.recording = false;
    const stopRecording = () => {
        this.controller.stopRecording().then(blob => {
            console.log("RECORDING STOPPED!");
            const url = URL.createObjectURL(blob);
            if (this._data.item)
            {
                this._data.item.src = url;
            }
            this._data.hidePlay = false;
            this.emitter.emit("recording-stopped");
        });
    };
    if (!this.isIos)
    {
        setTimeout(stopRecording, 500);
    }
    else {
        stopRecording();
    }
}
Everything works perfectly in Firefox and Chrome on Android and Windows, and also in Chromium-based Edge. I've devised an unacceptable workaround on Safari iOS that works, partially. When the user presses the play button, it decodes the object URL into a data URL, performs the visualization, and then sets the src to the data URL instead of the object URL. Play is also invoked when the data URL is returned, but it fails because it's done in an async method. When play is pressed a second time, it plays but is not audible.
Giving an on-screen, in-DOM HTMLAudioElement the data URL will play back the audio data and you can hear it. The unacceptable workaround is to use the default HTMLAudioElement.

JavaFX WebEngine timeout handling

I'm wondering if anyone has figured out a way to properly handle timeouts in the JavaFX 8 (jdk 1.8.0_31) WebView. The problem is the following:
Consider you have an instance of WebView and you tell it to load a specific URL. Furthermore, you want to process the document once it's loaded, so you attach a listener to the stateProperty of the LoadWorker of the WebEngine powering the web view. However, a certain website times out during loading, which causes the stateProperty to transition into Worker.State.RUNNING and remain stuck there.
The web engine is then completely stuck. I want to implement a system that detects a timeout and cancels the load. To that end, I was thinking of adding a listener to the progressProperty and using some form of Timer. The idea is the following:
We start a load request on the web view, and a timeout timer starts running immediately. On every progress update, the timer is reset. If the progress reaches 100%, the timer is invalidated and stopped. However, if the timer fires (because there were no progress updates within a certain time frame, we assume a timeout), the load request is cancelled and an error is thrown.
Does anyone know the best way to implement this?
Kind regards
UPDATE
I've produced a code snippet with the behavior described in the question. The only thing still troubling me is that I can't cancel the LoadWorker: calling LoadWorker#cancel hangs (the call never returns).
public class TimeOutWebEngine implements Runnable {
    private final WebEngine engine = new WebEngine();
    private ScheduledExecutorService exec;
    private ScheduledFuture<?> future;
    private long timeOutPeriod;
    private TimeUnit timeOutTimeUnit;

    public TimeOutWebEngine() {
        engine.getLoadWorker().progressProperty().addListener((ObservableValue<? extends Number> observable, Number oldValue, Number newValue) -> {
            // Reset the timeout timer on every progress update
            if (future != null) future.cancel(false);
            if (newValue.doubleValue() < 1.0) scheduleTimer();
            else cleanUp();
        });
    }

    public void load(String s, long timeOutPeriod, TimeUnit timeOutTimeUnit) {
        this.timeOutPeriod = timeOutPeriod;
        this.timeOutTimeUnit = timeOutTimeUnit;
        exec = Executors.newSingleThreadScheduledExecutor();
        engine.load(s);
    }

    private void scheduleTimer() {
        future = exec.schedule(TimeOutWebEngine.this, timeOutPeriod, timeOutTimeUnit);
    }

    private void cleanUp() {
        future = null;
        exec.shutdownNow();
    }

    @Override
    public void run() {
        System.err.println("TIMED OUT");
        // This function call stalls...
        // engine.getLoadWorker().cancel();
        cleanUp();
    }
}
I don't think you can handle timeouts properly right now. Look at this method: as you can see, it has a hardcoded value for the setReadTimeout method. That means a SocketTimeoutException will only be raised after an hour of loading the site, and the state will only change to FAILED after that event.
So for now you have only one way: work around the problem with timers, as you described above.
P.S.
Try creating an issue in the JavaFX issue tracker. Maybe someone will fix it after 5 years...
I had the same problem and used a simple PauseTransition. Same behavior, not as complicated. =)
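For anyone who wants to see it, here is a rough sketch of that PauseTransition approach (assuming 30 seconds with no progress counts as a timeout; the class and field names are illustrative, not from the original code). Because PauseTransition fires on the JavaFX Application Thread, no separate executor is needed:

import javafx.animation.PauseTransition;
import javafx.scene.web.WebEngine;
import javafx.util.Duration;

class TimeoutLoader {
    private final WebEngine engine = new WebEngine();
    private final PauseTransition watchdog = new PauseTransition(Duration.seconds(30));

    TimeoutLoader() {
        watchdog.setOnFinished(e -> {
            System.err.println("TIMED OUT");
            engine.getLoadWorker().cancel(); // may still stall, as noted in the question
        });
        engine.getLoadWorker().progressProperty().addListener((obs, oldV, newV) -> {
            if (newV.doubleValue() < 1.0) {
                watchdog.playFromStart(); // restart the timer on every progress update
            } else {
                watchdog.stop();          // page finished loading
            }
        });
    }

    void load(String url) {
        watchdog.playFromStart();
        engine.load(url);
    }
}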

Audio and Recording Reuse on iPhone with Monotouch

I just started testing this very simple audio recording application, built with MonoTouch, on actual iPhone devices today. I encountered an issue with what seems to be the re-use of the AVAudioRecorder and AVPlayer objects after their first use, and I am wondering how I might solve it.
Basic Overview
The application consists of the following three sections :
List of Recordings (TableViewController)
Recording Details (ViewController)
New Recording (ViewController)
Workflow
When creating a recording, the user would click the "Add" button from the List of Recordings area and the application pushes the New Recording View Controller.
Within the New Recording Controller, the following variables are available:
AVAudioRecorder recorder;
AVPlayer player;
Each is initialized prior to its use:
//Initialized during the ViewDidLoad event
recorder = AVAudioRecorder.Create(audioPath, audioSettings, out error);
and
//Initialized in the "Play" event
player = new AVPlayer(audioPath);
Each of these works as intended on the initial load of the New Recording Controller area; however, any further attempts do not seem to work (no audio playback).
The Details area also has a playback portion to allow the user to play back any recordings; however, much like the New Recording Controller, playback doesn't function there either.
Disposal
They are both disposed as follows (upon exiting / leaving the View) :
if(recorder != null)
{
recorder.Dispose();
recorder = null;
}
if(player != null)
{
player.Dispose();
player = null;
}
I have also attempted to remove any observers that could possibly keep any of the objects "alive", in hopes that would solve the issue, and have ensured they are each instantiated on every display of the New Recording area; however, I still get no audio playback after the initial recording session.
I would be happy to provide more code if necessary. (This is using MonoTouch 6.0.6)
After further investigation, I determined that the issue was being caused by the AudioSession as both recording and playback were occurring within the same controller.
The two solutions that I determined were as follows:
Solution 1 (AudioSessionCategory.PlayAndRecord)
//A single declaration of this will allow both AVAudioRecorders and AVPlayers
//to perform alongside each other.
AudioSession.Category = AudioSessionCategory.PlayAndRecord;
//Upon noticing very quiet playback, I added this second line, which allowed
//playback to come through the main phone speaker
AudioSession.OverrideCategoryDefaultToSpeaker = true;
Solution 2 (AudioSessionCategory.RecordAudio & AudioSessionCategory.MediaPlayback)
void YourRecordingMethod()
{
    //This sets the session to record audio explicitly
    AudioSession.Category = AudioSessionCategory.RecordAudio;
    MyRecorder.Record();
}

void YourPlaybackMethod()
{
    //This sets the session for playback only
    AudioSession.Category = AudioSessionCategory.MediaPlayback;
    YourAudioPlayer.Play();
}
For some additional information on usage of the AudioSession, visit Apple's AudioSession Development Area.

MediaElement.Stop doesn't work when playing a live streaming source in windows store app

I want to make a Windows Store app play a live streaming source. The source plays fine, but I can't stop it from playing once it has begun. When I call Stop() on the instance of Windows.UI.Xaml.Controls.MediaElement, nothing happens.
Below is my code:
public MainPage()
{
    this.InitializeComponent();
    this.mediaplayer.AutoPlay = true;
    this.mediaplayer.Source = new Uri("mms://somedomain/mylive");
}
...
void StopButton_Click(object sender, RoutedEventArgs e)
{
    //I can reach here when I set a breakpoint
    this.mediaplayer.Stop();
}
I just came across the same problem... and was able to solve it by calling MediaElement.Pause() instead.

Blackberry simple progressbar for BrowserField2

In my app I have a BrowserField2 loading different pages and I want to show a simple spinning progressbar/indicator. As simple as possible really, without percent etc. - just a small animation to indicate to the user that something is happening.
I come from Android development, where such a thing is called a ProgressBar, though for Blackberry it may be called something completely different? (A progress bar for Blackberry always seems to involve calculating the progress made.)
What should I be looking for?
I solved it in a rather unorthodox way, something I probably wouldn't recommend to ANYONE, but I'll write it up anyway since it may help someone who's in a hurry to get it done. Just remember this is a bad way of doing it.
My app basically consists of 4 buttons and a browserfield.
To display a spinning "load animation" I use alishaik786's tip (see his comments) of a custom PopupScreen triggered by a BrowserFieldListener:
// BrowserFieldListener to catch when a page starts loading and when it has finished
BrowserFieldListener listener = new BrowserFieldListener() {
    public void documentCreated(BrowserField browserField, ScriptEngine scriptEngine, Document document) throws Exception {
        displayLoadAnimation(); // see method below
    }

    public void documentLoaded(BrowserField browserField, Document document) throws Exception {
        try {
            popUp.close();
        } catch (IllegalStateException es) {
            es.printStackTrace();
        }
    }
};

// The method for showing the popup
private void displayLoadAnimation() {
    popUp = new LoadingPopupScreen();
    UiApplication.getUiApplication().invokeLater(new Runnable() {
        public void run() {
            UiApplication.getUiApplication().pushScreen(popUp);
        }
    });
}
Then, in my custom PopupScreen, I check where the user is clicking in "protected boolean touchEvent(TouchEvent event)" by reading event.getGlobalY() & event.getGlobalX() of the touch and comparing them to the positions of the buttons. If the user presses within the X & Y of a button, the popup screen is closed and I trigger the button being pressed.
As I said, this is a bad way of doing it (many things need to be static), but it works if you want a quick and dirty solution.
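For completeness, a minimal sketch of what the LoadingPopupScreen could look like (the original class isn't shown above; a real implementation would typically replace the LabelField with an animated image, e.g. the AnimatedGIFField sample from the BlackBerry knowledge base):

import net.rim.device.api.ui.component.LabelField;
import net.rim.device.api.ui.container.PopupScreen;
import net.rim.device.api.ui.container.VerticalFieldManager;

class LoadingPopupScreen extends PopupScreen {
    LoadingPopupScreen() {
        // A bare popup with a single label; swap in an animated field for a real spinner
        super(new VerticalFieldManager());
        add(new LabelField("Loading..."));
    }
}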
