BlackBerry - Possible to Hide Video Field?

I want to write an application like a Flashlight (with the help of the camera LED).
Player player = javax.microedition.media.Manager.createPlayer("capture://video?encoding=video/3gpp");
player.realize();
VideoControl videoControl = (VideoControl) player.getControl("VideoControl");
if (videoControl != null)
{
    videoField = (Field) videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
    try
    {
        videoControl.setDisplaySize(1, 1);
    }
    catch (Exception e)
    {
        PGLogUtil.logString(e.toString());
    }
    videoControl.setVisible(true);
    add(videoField);
    FlashControl flashControl = (FlashControl)
        player.getControl("javax.microedition.amms.control.camera.FlashControl");
    setFlashlight(true);
}
player.start();
The code above works perfectly, but I want to hide the videoField. When I remove add(videoField) or use videoControl.setVisible(false), the flashlight no longer works. Can someone explain why?
How can I turn the light on with a hidden videoField?

I just got a BlackBerry with a flash and wanted to try my hand at this same issue. I finally got it to work.
One thing I observed during testing was that if the videoField is hidden, as you said, the flash won't work. So the trick I used was
<pre>
_videoControl.setDisplaySize( 1 , 1 );
</pre>
That did the job for me. You could also set it to
<pre>
_videoControl.setDisplaySize( 0 , 0 );
</pre>
But whatever you do, make sure you set
<pre>
_videoControl.setVisible(true);
</pre>
otherwise your flash will not work.
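The setFlashlight(true) helper from the question isn't shown, so here is a minimal sketch of what it might look like, assuming the JSR-234 FlashControl is available on the device and that FlashControl.FORCE keeps the LED lit (exact behavior varies by BlackBerry model and OS version):
<pre>
import javax.microedition.amms.control.camera.FlashControl;

// Hypothetical helper: toggles the camera LED via the player's FlashControl.
// Assumes 'player' is the realized capture Player from the question's code.
private void setFlashlight(boolean on)
{
    FlashControl flashControl = (FlashControl)
        player.getControl("javax.microedition.amms.control.camera.FlashControl");
    if (flashControl != null)
    {
        // FORCE requests the flash unconditionally; OFF disables it.
        flashControl.setMode(on ? FlashControl.FORCE : FlashControl.OFF);
    }
}
</pre>
As with the visible 1x1 videoField trick above, the player still has to be realized and started for the control to take effect.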

Related

SWT: Integrate clickable link into StyledText

With the help of this question I was able to figure out how I can display a link inside a StyledText widget in SWT. The color is correct and even the cursor changes shape when hovering over the link.
So far so good, but the link is not actually clickable. Although the cursor changes its shape, nothing happens when clicking on the link. Therefore I am asking how I can make clicking the link actually open it in the browser.
I thought of using a MouseListener, tracking the click location back to the text it was performed on, and then deciding whether to open the link or not. However, that seems way too complicated given that there is already some routine going on for changing the cursor accordingly. I believe there is an easier way to do this (one that also keeps the clicking behavior consistent with when the cursor changes its shape).
Does anyone have any suggestions?
Here's an MWE demonstrating what I have done so far:
import java.net.MalformedURLException;
import java.net.URL;

import org.eclipse.swt.SWT;
import org.eclipse.swt.custom.StyleRange;
import org.eclipse.swt.custom.StyledText;
import org.eclipse.swt.layout.GridData;
import org.eclipse.swt.layout.GridLayout;
import org.eclipse.swt.widgets.Display;
import org.eclipse.swt.widgets.Shell;

public static void main(String[] args) throws MalformedURLException {
    final URL testURL = new URL("https://stackoverflow.com/questions/1494337/can-html-style-links-be-added-to-swt-styledtext");
    Display display = new Display();
    Shell shell = new Shell(display);
    shell.setLayout(new GridLayout(1, true));

    StyledText sTextWidget = new StyledText(shell, SWT.READ_ONLY);
    final String firstPart = "Some text before ";
    String msg = firstPart + testURL.toString() + " some text after";
    sTextWidget.setText(msg);
    sTextWidget.setLayoutData(new GridData(SWT.FILL, SWT.FILL, true, true));

    StyleRange linkStyleRange = new StyleRange(firstPart.length(), testURL.toString().length(), null, null);
    linkStyleRange.underline = true;
    linkStyleRange.underlineStyle = SWT.UNDERLINE_LINK;
    linkStyleRange.data = testURL.toString();
    sTextWidget.setStyleRange(linkStyleRange);

    shell.open();
    while (!shell.isDisposed()) {
        if (!display.readAndDispatch()) {
            display.sleep();
        }
    }
    display.dispose();
}
Okay, I was a little too fast in posting this question... There's a snippet that deals with exactly this problem, and it shows that one indeed has to use an extra mouse listener to get things working.
The snippet can be found here, and this is the relevant part setting up the listener:
styledText.addListener(SWT.MouseDown, event -> {
    // It is up to the application to determine when and how a link should be activated.
    // In this snippet links are activated on mouse down when the control key is held down
    if ((event.stateMask & SWT.MOD1) != 0) {
        int offset = styledText.getOffsetAtLocation(new Point(event.x, event.y));
        if (offset != -1) {
            StyleRange style1 = null;
            try {
                style1 = styledText.getStyleRangeAtOffset(offset);
            } catch (IllegalArgumentException e) {
                // no character under event.x, event.y
            }
            if (style1 != null && style1.underline && style1.underlineStyle == SWT.UNDERLINE_LINK) {
                System.out.println("Click on a Link");
            }
        }
    }
});
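To actually open the clicked link in the browser instead of just logging it, the URL that was stored in the StyleRange's data field can be handed to the system browser. A minimal sketch, assuming the detection code above and using SWT's Program.launch:

if (style1 != null && style1.underline && style1.underlineStyle == SWT.UNDERLINE_LINK) {
    // The target URL was stored in StyleRange.data when the range was created.
    String url = (String) style1.data;
    if (url != null) {
        // Opens the URL with the operating system's default handler (usually the browser).
        org.eclipse.swt.program.Program.launch(url);
    }
}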

onaudioprocess not called on ios11

I am trying to get audio capture from the microphone working in Safari on iOS 11, after support was recently added.
However, the onaudioprocess callback is never called. Here's an example page:
<html>
<body>
  <button onclick="doIt()">DoIt</button>
  <ul id="logMessages">
  </ul>
  <script>
    function debug(msg) {
      if (typeof msg !== 'undefined') {
        var logList = document.getElementById('logMessages');
        var newLogItem = document.createElement('li');
        if (typeof msg === 'function') {
          msg = Function.prototype.toString(msg);
        } else if (typeof msg !== 'string') {
          msg = JSON.stringify(msg);
        }
        var newLogText = document.createTextNode(msg);
        newLogItem.appendChild(newLogText);
        logList.appendChild(newLogItem);
      }
    }

    function doIt() {
      var handleSuccess = function (stream) {
        var context = new AudioContext();
        var input = context.createMediaStreamSource(stream);
        var processor = context.createScriptProcessor(1024, 1, 1);
        input.connect(processor);
        processor.connect(context.destination);
        processor.onaudioprocess = function (e) {
          // Do something with the data, i.e Convert this to WAV
          debug(e.inputBuffer);
        };
      };

      navigator.mediaDevices.getUserMedia({audio: true, video: false})
        .then(handleSuccess);
    }
  </script>
</body>
</html>
On most platforms, you will see items being added to the messages list as the onaudioprocess callback is called. However, on iOS, this callback is never called.
Is there something else that I should do to try and get it called on iOS 11 with Safari?
There are two problems. The main one is that Safari on iOS 11 seems to automatically suspend new AudioContexts that aren't created in response to a tap. You can resume() them, but only in response to a tap.
(Update: Chrome mobile also does this, and Chrome desktop will have the same limitation starting in version 70 / December 2018.)
So, you have to either create it before you get the MediaStream, or else get the user to tap again later.
The other issue with your code is that AudioContext is prefixed as webkitAudioContext in Safari.
Here's a working version:
<html>
<body>
  <button onclick="beginAudioCapture()">Begin Audio Capture</button>
  <script>
    function beginAudioCapture() {
      var AudioContext = window.AudioContext || window.webkitAudioContext;
      var context = new AudioContext();
      var processor = context.createScriptProcessor(1024, 1, 1);
      processor.connect(context.destination);

      var handleSuccess = function (stream) {
        var input = context.createMediaStreamSource(stream);
        input.connect(processor);

        var receivedAudio = false;
        processor.onaudioprocess = function (e) {
          // This will be called multiple times per second.
          // The audio data will be in e.inputBuffer
          if (!receivedAudio) {
            receivedAudio = true;
            console.log('got audio', e);
          }
        };
      };

      navigator.mediaDevices.getUserMedia({audio: true, video: false})
        .then(handleSuccess);
    }
  </script>
</body>
</html>
(You can set the onaudioprocess callback sooner, but then you get empty buffers until the user approves of microphone access.)
Oh, and one other iOS bug to watch out for: Safari on the iPod touch (as of iOS 12.1.1) reports that it does not have a microphone (it does). So, getUserMedia will incorrectly reject with an Error: Invalid constraint if you ask for audio there.
FYI: I maintain the microphone-stream package on npm that does this for you and provides the audio in a Node.js-style ReadableStream. It includes this fix, if you or anyone else would prefer to use that over the raw code.
Tried it on iOS 11.0.1, and unfortunately this problem still isn't fixed.
As a workaround, I wonder if it makes sense to replace the ScriptProcessor with a function that takes the stream data from a buffer and processes it every x milliseconds. But that's a big change to the functionality.
Just wondering... do you have the setting enabled in Safari settings? It comes enabled by default in iOS11, but maybe you just disabled it without noticing.

YouTube video with hyphen not working in iOS

I am using the Cordova YoutubeVideoPlayer plugin in my Ionic project. It works well on Android, but on iOS it does not work with a video id that contains a hyphen (-) (e.g. "6L-ZHjUhcQY"). It works fine with all other URLs. How can I solve this?
.controller('menuController', function () {
    var id = "6L-ZHjUhcQY";
    YoutubeVideoPlayer.openVideo(id);
});
In the debugger, are the video ids with hyphens correctly saved in the variable? It might be a character encoding issue, or an issue with the plug-in itself.
Fixed the iOS issue by updating some files:
https://github.com/fingentffts/CordovaYoutubeVideoPlayer
Now I found some issues on Android tablets: videos keep playing even if the screen is locked, and videos with hyphens are not playing on a Samsung tab.
The Android issue was also fixed by editing the code in the plugin's YoutubeVideoPlayer.java file:
private Intent createYoutubeIntent(String videoId) {
    // if (Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP){
    Intent intent;
    Context cordovaContext = cordova.getActivity();
    String version = YouTubeIntents.getInstalledYouTubeVersionName(cordovaContext);
    if (version != null && version.startsWith("11.16") && YouTubeIntents.canResolvePlayVideoIntent(cordovaContext)) {
        intent = YouTubeIntents.createPlayVideoIntent(cordovaContext, videoId);
    } else {
        if (YouTubeIntents.canResolvePlayVideoIntentWithOptions(cordovaContext)) {
            intent = YouTubeIntents.createPlayVideoIntentWithOptions(cordovaContext, videoId, true, true);
        } else {
            intent = new Intent(Intent.ACTION_VIEW, Uri.parse("http://www.youtube.com/watch?v=" + videoId), cordovaContext, YouTubeActivity.class);
            intent.putExtra("videoId", videoId);
            ConfigXmlParser parser = new ConfigXmlParser();
            parser.parse(cordovaContext);
            CordovaPreferences prefs = parser.getPreferences();
            intent.putExtra("YouTubeApiId", prefs.getString("YouTubeDataApiKey", "YOUTUBE_API_KEY"));
        }
    }
    return intent;
    // }
    // return new Intent(null, Uri.parse("ytv://" + videoId), cordova.getActivity(), OpenYouTubePlayerActivity.class);
}

YouTube IFrame API stopping before end

Is it possible, using the YouTube API, to stop a video one second before the end, or just to stop it from resetting back to the beginning? This should work regardless of the length of the video itself.
Thanks in advance
I disagree with the argument that a video can NOT be stopped before it ends. Specifically, in the API, you have access to:
https://developers.google.com/youtube/js_api_reference#Playback_status
Player.getCurrentTime();
Player.getDuration();
If you were to monitor the player object and compare these two, you could easily stop the video before it ends. An example would be (although probably not good in production):
setInterval(function() {
    var player = getPlayer(); // retrieve the player object
    if (player) {
        var duration = player.getDuration();
        var current = player.getCurrentTime();
        if ((duration - current) <= 1) {
            player.stopVideo();
        }
    }
}, 1000); // every second
I ran into an issue where, for some reason, the iframe player was stopping the video before the video's end and therefore not triggering the ENDED event, and I found this question while trying to find a solution for that. Since my approach can also solve your problem as stated, here's what I did (I left the console.log() debug lines in to make it easier for someone else to play around with this):
var player_timeout;

function onPlayerStateChange(event) {
    if (event.data == YT.PlayerState.PLAYING) {
        console.log("PLAY [" + event.data + "]");
        timeout_set();
    }
    if (event.data == YT.PlayerState.PAUSED) {
        console.log("PAUSED [" + event.data + "]");
        clearTimeout(player_timeout);
    }
}

function timeout_func() {
    console.log("timeout_func");
    player.stopVideo(); // or whatever other thing you want to do 1 second before the end of the video
}

function timeout_set() {
    console.log("timeout_set");
    var almostEnd_ms = (player.getDuration() - player.getCurrentTime() - 1) * 1000;
    console.log("almostEnd_ms: " + almostEnd_ms);
    console.log("player.getCurrentTime(): " + player.getCurrentTime());
    player_timeout = setTimeout(timeout_func, almostEnd_ms);
}

AIR4: How to restore audio in iOS 8 (beta 5) after requesting mic access?

Problem: iOS 8 (beta 5) fades out all sound output in my app after listening for sample data, and never restores it, but only if sound has been played before requesting microphone sample data.
Another user noted that this behaviour stems from requesting microphone access, but sound plays as expected in our released app on iOS 7 in both of the cases below, and the workaround of closing/reopening the app isn't a viable solution, since microphone recording is a recurring part of our app.
Conditions:
Flex 4.6.0 & AIR 4.0
iPad 3 (MD335LL/A)
iOS 8 (12A4345D)
Both test cases assume that microphone permission has been granted.
Test case 0:
Play sound
Stop sound channel
Audio stops
Connect to microphone, remove listener once sample data received
Attempt to play sound
No audible output, nor does the sound complete event ever fire
Test case 1:
Connect to microphone, remove listener once sample data received
Sound plays without a problem and the sound complete event is fired
Sample code:
protected var _url:String = 'audio/organfinale.mp3';
protected var _sound:Sound;
protected var _soundChannel:SoundChannel;
protected var _microphone:Microphone;

public function init():void
{
    initSound();
}

/** SOUND **/
protected function initSound():void
{
    // Set up to mimic app
    SoundMixer.audioPlaybackMode = AudioPlaybackMode.MEDIA;
    _sound = new Sound( new URLRequest( _url ) );
    _sound.addEventListener(Event.COMPLETE, onSoundLoaded );
}

protected function onSoundLoaded(e:Event):void
{
    _sound.removeEventListener(Event.COMPLETE, onSoundLoaded);
    switch ( 0 )
    {
        // Will cut audio completely, prevent audio dispatch
        case 0:
            playSound();
            setTimeout( initMicrophone, 1500 );
            break;
        // Works perfectly (because no overlap between sound/microphone?)
        case 1:
            initMicrophone();
            break;
    }
}

protected function playSound():void
{
    trace( 'Play sound' );
    if ( _soundChannel && _soundChannel.hasEventListener( Event.SOUND_COMPLETE ) )
        _soundChannel.removeEventListener( Event.SOUND_COMPLETE, onSoundCompleted );
    _soundChannel = _sound.play( 0 );
    _soundChannel.addEventListener( Event.SOUND_COMPLETE, onSoundCompleted, false, 0, true );
}

protected function onSoundCompleted( e:Event ):void
{
    trace( "Sound complete" );
}

/** MICROPHONE **/
protected function initMicrophone():void
{
    if ( Microphone.isSupported )
    {
        // Testing pre-emptive disposal of sound that might be conflicting with microphone in iOS 8
        if ( _soundChannel )
        {
            // Tested this, but it throws an error because sound is not streaming
            // _sound.close();
            _soundChannel.removeEventListener( Event.SOUND_COMPLETE, onSoundCompleted );
            _soundChannel.stop();
            _soundChannel = null;
            // Instead the sound will be cut abruptly, and will fail to dispatch the complete event
        }
        _microphone = Microphone.getMicrophone();
        _microphone.setSilenceLevel(0);
        // _microphone.setUseEchoSuppression(true);
        // _microphone.gain = 50;
        // _microphone.rate = 44;
        _microphone.addEventListener( SampleDataEvent.SAMPLE_DATA, onSampleDataReceived, false, 0, true );
    }
    else
    {
        trace( 'Microphone is not supported!' );
    }
}

protected function onSampleDataReceived( e:SampleDataEvent ):void
{
    trace( 'Sample data received' );
    _microphone.removeEventListener( SampleDataEvent.SAMPLE_DATA, onSampleDataReceived );
    _microphone = null;
    setTimeout( playSound, 1500 );
}
Notes:
I stopped the sound channel before adding the mic listener, assuming that might have been causing a conflict, but it made no difference.
I've tested using the same device/OS after compiling the app with Flex 4.6.0 & AIR 14.0, and the problem persists.
By comparison, testing this app on iOS 7 worked in both cases.
Any ideas?
Update 1: Bug has been logged here: https://bugbase.adobe.com/index.cfm?event=bug&id=3801262
Update 2: Bug is fixed as of AIR 15.0.0.274: http://labs.adobe.com/downloads/air.html
