Roku: Using urlTransfer to Call a Script File

I'm not sure how Roku and BrightScript actually work. I need to call a script file just before the channel starts to stream; the script will convert the stream on the fly. I asked how to do this on the Roku forum and was told to use urlTransfer, but as far as I can see the SDK gives little help explaining how. I ran across this post on Stack Overflow:
How to make api request to some server in roku
It gives a good example, which I think I understand. My confusion is where and how the function is called: it has to happen right before the video URL is requested so the conversion can start.
Any advice appreciated.

If you are using roVideoPlayer, make the call just before you call the play function; if you are using roVideoScreen, just before the show function.
Example snippet:
roVideoPlayer
player = CreateObject("roVideoPlayer")
' Your code to add content for the player
' Your call to the script
player.play()
roVideoScreen
player = CreateObject("roVideoScreen")
' Your code to add content for the player
' Your call to the script
player.show()
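As for the urlTransfer call itself, here is a minimal sketch using roUrlTransfer; the endpoint URL is a placeholder, so substitute the address of your conversion script. You would call this right before play()/show():
sub callConversionScript()
    ' Placeholder endpoint; point this at your own conversion script
    request = CreateObject("roUrlTransfer")
    request.SetUrl("http://yourserver.example/start_conversion")
    response = request.GetToString() ' synchronous GET; returns the response body as a string
    print "conversion script responded: "; response
end sub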
Hope this helps

Related

Google Cloud Speech very inaccurate and misses words on clean audio

I am using Google Cloud Speech through Python and am finding that many transcriptions are inaccurate and miss several words. This is a simple script I'm using to return a transcript of an audio file, in this case 'out307.wav':
import io
from google.cloud import speech

client = speech.SpeechClient()
with io.open('out307.wav', 'rb') as audio_file:
    content = audio_file.read()
audio = speech.types.RecognitionAudio(content=content)
config = speech.types.RecognitionConfig(
    enable_word_time_offsets=True,
    language_code='en-US',
    audio_channel_count=1)
response = client.recognize(config, audio)
for result in response.results:
    alternative = result.alternatives[0]
    print(u'Transcript: {}'.format(alternative.transcript))
This returns the following transcript:
to do this the tensions and suspicions except
This is very far off from what the actual audio says (I've uploaded it at https://vocaroo.com/i/s1zdZ0SOH1Ki). The audio is a .wav and is very clear, with no background noise. This is worse than average: in some cases it gets the transcription fully correct on a 10-second audio file, or misses just a couple of words. Is there anything I can do to improve the results?
This is odd: I tried your audio file with your code and got the same result, but if I change the language_code to "en-UK" I get the full transcription.
I work for Google Cloud, and I have created a public issue for this here; you can track updates there.
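For reference, the only change to the question's config is the language code. Note that the standard BCP-47 tag for British English is 'en-GB'; 'en-UK' is the spelling reported in the answer above:
config = speech.types.RecognitionConfig(
    enable_word_time_offsets=True,
    language_code='en-GB',  # 'en-UK' as reported above; 'en-GB' is the standard tag
    audio_channel_count=1)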

Request Fullscreen in Dart

How does requestFullscreen work in Dart? I want to enable fullscreen mode on mobile devices.
I wrote the following code, but it changes nothing.
querySelector(".btn").onTouchEnd.listen((l) {
var body = document.body;
body.requestFullscreen();
});
It didn't work; on every click I get the same error: document.body.requestFullscreen is not a function.
Seems to be something like https://api.dartlang.org/stable/1.24.3/dart-html/VideoElement/enterFullscreen.html, so you need to call it on your video element.
Edit: Oh, yes, there's also https://api.dartlang.org/stable/1.24.3/dart-html/Element/requestFullscreen.html -- that might be the one you want.
Edit2: Apparently, this has already been asked and answered, and needs a workaround: How to request fullscreen in compiled dart
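A minimal sketch of the suggestion from the first link, assuming the page has a video element and using the pre-2.0 dart:html API that the links above document (per Edit2, a JS-interop workaround may still be needed in practice):
import 'dart:html';

void main() {
  querySelector('.btn').onTouchEnd.listen((_) {
    // Fullscreen must be requested from inside a user-gesture handler,
    // so it stays inside the touch listener.
    VideoElement video = querySelector('video'); // assumes a <video> element exists
    video.enterFullscreen(); // the VideoElement-specific call from the first link
  });
}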

How to play an RTSP URL from within an app in iOS

I have found many suggestions on Stack Overflow about using FFmpeg, along with a GitHub link for DFURTSPPlayer, but it does not compile. And after integrating FFmpeg, what do I have to write? For example, with HTTP URLs I write:
moviePath = "http://path.mp4"
movieURL = NSURL.URLWithString(moviePath!)
moviePlayer = MPMoviePlayerController(contentURL: movieURL)
moviePlayer!.play()
So what kind of code should I write to use RTSP URLs?
Here is another post with example FFmpeg code that receives an RTSP stream (it also decodes the stream to YUV420, stores it in pic, converts the frame to RGB24, stores it in picrgb, and writes it to a file). To achieve something similar to what you have for HTTP, you should:
1) Write a wrapper Objective-C class for the FFmpeg C code, or just wrap the code in functions that you call directly from Objective-C code (see the interface sketch after this list). You need a way to pass the RTSP URL to the class or function and to provide a callback for each new frame. In the class/function, start a new thread that executes something similar to the code in the example and calls the callback for each newly decoded frame. NOTE: FFmpeg can perform asynchronous I/O using your own custom I/O context, which would let you avoid creating the thread, but if you are new to FFmpeg, maybe start with the basics and improve your code later on.
2) In the callback, update the view (or whatever you are using for display) with the decoded frame data.
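A minimal sketch of what the wrapper's public interface might look like in Swift; all names here are hypothetical, and the FFmpeg decode loop itself is left as a placeholder:
import Foundation

// Hypothetical delegate, invoked for every decoded frame.
protocol RTSPPlayerDelegate: AnyObject {
    // Called on the decode thread with RGB24 frame data.
    func rtspPlayer(_ player: RTSPPlayer, didDecodeFrame frame: Data, width: Int, height: Int)
}

final class RTSPPlayer {
    weak var delegate: RTSPPlayerDelegate?
    private let url: String

    init(url: String) {
        self.url = url
    }

    func start() {
        // Run the FFmpeg loop off the main thread, as described in step 1.
        Thread.detachNewThread { [weak self] in
            self?.runDecodeLoop()
        }
    }

    private func runDecodeLoop() {
        // Placeholder: open the RTSP stream with the wrapped FFmpeg C code,
        // decode to YUV420, convert each frame to RGB24, then per frame call:
        // delegate?.rtspPlayer(self, didDecodeFrame: data, width: w, height: h)
    }
}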

How does PhoneGap (Cordova) work internally, iOS specific

I have started developing HTML applications for multiple platforms. I recently heard about Cordova 2.0 (PhoneGap), and ever since I have been curious to know how the bridge works.
After a lot of code walking, I saw that exec.js is where the call from JS to native happens:
execXhr = execXhr || new XMLHttpRequest();
// Changing this to a GET will make the XHR reach the URIProtocol on 4.2.
// For some reason it still doesn't work though...
execXhr.open('HEAD', "file:///!gap_exec", true);
execXhr.setRequestHeader('vc', cordova.iOSVCAddr);
if (shouldBundleCommandJson()) {
    execXhr.setRequestHeader('cmds', nativecomm());
}
execXhr.send(null);
} else {
    execIframe = execIframe || createExecIframe();
    execIframe.src = "gap://ready";
But I want to understand how that works. What is the concept here? What do file:///!gap_exec and gap://ready do, and how does the call propagate to the lower (native code) layers?
Thanks a bunch in advance.
The trick is easy:
There is a webview. This displays your app. The webview will handle all navigation events.
If the browser navigates to:
file:///!gap_exec
or
gap://
the webview will cancel the navigation. Everything after these prefixes is reused as an identifier to locate the concrete plugin and plugin method, and to pass the parameters:
pseudo-url example:
gap://echoplugin/echothistext?Hello World
This will cause phonegap to look for an echoplugin and call the echothistext method to send the text "Hello World" to the (native) plugin.
Update:
The way back from native to JavaScript is (or may be) loading a javascript: URL into the webview.
The concrete implementation is a little more complex, because the JavaScript has to send a callback ID to the native code, since more than one native call can be running at the same time. But in fact this is no magic at all: it is just a number used to deliver the correct JSON to the right JavaScript callback.
There are different ways to communicate between the platform and JavaScript. For Android, there are three or four different bridges.
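To make this concrete, a call into the bridge from application JavaScript looks roughly like this, using the hypothetical echo plugin from the example above:
// cordova.exec is the entry point that funnels every plugin call across the bridge.
cordova.exec(
    function (result) { console.log('native said: ' + result); }, // success callback
    function (error) { console.error(error); },                   // error callback
    'EchoPlugin',    // service: which native plugin to look up
    'echothistext',  // action: which method on that plugin
    ['Hello World']  // arguments, serialized to JSON for the native side
);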
I am trying to figure this out in more detail, too. Basically, there are two methods on the iOS side that can help:
- webView:shouldStartLoadWithRequest:navigationType: and
- stringByEvaluatingJavaScriptFromString:
From the sources, it seems Cordova sends a "READY" message using webView:shouldStartLoadWithRequest:... and then picks up results with the second method, but I am not sure.
Cordova Sources iOSExec
There is much to learn there.
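For illustration, here is a rough, simplified sketch of how a UIWebView delegate could intercept the bridge URL. This is a sketch of the idea, not Cordova's actual implementation; the nativeFetchMessages call is how cordova-js exposes the queued commands, as far as I can tell:
- (BOOL)webView:(UIWebView *)theWebView shouldStartLoadWithRequest:(NSURLRequest *)request
                                                    navigationType:(UIWebViewNavigationType)navigationType
{
    NSURL *url = [request URL];
    if ([[url scheme] isEqualToString:@"gap"]) {
        // Bridge request: pull the queued commands (JSON) out of the JS side...
        NSString *queuedCommands = [theWebView stringByEvaluatingJavaScriptFromString:
            @"cordova.require('cordova/exec').nativeFetchMessages()"];
        // ...then decode the JSON, look up the plugin by service name, invoke the
        // action, and later return the result via a javascript: URL and callback ID.
        // (Omitted here.)
        return NO; // cancel the navigation; it was only a signal
    }
    return YES; // a normal page load
}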

Not able to play the .wav sound that I recorded through code in BlackBerry

I have tried to record audio in WAV format and succeeded, but when I try to play the file, it gives the error 'media file is of unsupported format'.
The main portion of the code is as follows:
player = Manager.createPlayer("capture://audio?encoding=pcm&rate=44100&bits=16&channels=1");
player.realize();
controller = (RecordControl) player.getControl("RecordControl");
controller.setRecordLocation("file:///SDCard/BlackBerry/voicenotes/voice.wav");
//controller.setRecordSizeLimit(396900);
controller.startRecord();
player.start();
Thread.sleep(7000);
controller.commit();
player.close();
This code works well and gave me a voice.wav file, but I am not able to play that file. Is there something I forgot?
Thanks in advance.
Sounds like the same problem as here:
http://supportforums.blackberry.com/t5/Java-Development/Bug-in-media-Manager-Player/td-p/1009027/page/2
Essentially, you need to add the RIFF/WAVE header to the recorded data yourself; see the last post in the thread for an example.
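For illustration, a minimal sketch of building the standard 44-byte RIFF/WAVE header in Java, assuming the capture settings above (44100 Hz, 16-bit, mono PCM); prepend it to the raw recorded bytes before writing the file. The helper names are my own:
// Build a standard 44-byte RIFF/WAVE header for raw PCM data.
static byte[] wavHeader(int pcmLength, int sampleRate, int bitsPerSample, int channels) {
    int byteRate = sampleRate * channels * bitsPerSample / 8;
    int blockAlign = channels * bitsPerSample / 8;
    byte[] h = new byte[44];
    writeAscii(h, 0, "RIFF");
    writeIntLE(h, 4, 36 + pcmLength);  // total chunk size
    writeAscii(h, 8, "WAVE");
    writeAscii(h, 12, "fmt ");
    writeIntLE(h, 16, 16);             // fmt sub-chunk size
    writeShortLE(h, 20, 1);            // audio format: 1 = PCM
    writeShortLE(h, 22, channels);
    writeIntLE(h, 24, sampleRate);
    writeIntLE(h, 28, byteRate);
    writeShortLE(h, 32, blockAlign);
    writeShortLE(h, 34, bitsPerSample);
    writeAscii(h, 36, "data");
    writeIntLE(h, 40, pcmLength);      // data sub-chunk size
    return h;
}

static void writeAscii(byte[] b, int off, String s) {
    for (int i = 0; i < s.length(); i++) b[off + i] = (byte) s.charAt(i);
}
static void writeIntLE(byte[] b, int off, int v) {
    b[off] = (byte) v; b[off + 1] = (byte) (v >> 8);
    b[off + 2] = (byte) (v >> 16); b[off + 3] = (byte) (v >> 24);
}
static void writeShortLE(byte[] b, int off, int v) {
    b[off] = (byte) v; b[off + 1] = (byte) (v >> 8);
}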
