Simultaneously playing and recording video on a BlackBerry smartphone - blackberry

I intend to record and play video at the same time in my application. I am using BlackBerry JDE 5.0 and the BlackBerry 9550.
I initialize two players: one to play media and another to record media. I used Manager.createPlayer("rtsp://<server-ip>:<port>") to play video.
String myEncoding = "encoding=video/3gpp&width=360&height=480&fps=6&video_rate=16000&rate=8000";
mRecordPlayer = javax.microedition.media.Manager.createPlayer("capture://video?" + myEncoding);
mRecordPlayer.realize();
_recordControl = (RecordControl) mRecordPlayer.getControl("RecordControl");
mRecordPlayer.start();
I record to a file, but the file is always 0 kB. I do not believe createPlayer("capture://video?" + myEncoding) can record video from another player; it only records video from the camera.
Is it possible to achieve the desired functionality with the APIs exposed by RIM?
I would really appreciate it if anyone could provide some pointers or help with the problems mentioned above.
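For reference, the capture locator the snippet above builds is just a query string assembled from encoding parameters. A minimal plain-Java sketch of that assembly (parameter names and values are taken from the snippet above; this is only string construction, with no device APIs involved):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CaptureLocator {
    // Builds a BlackBerry-style "capture://video?..." locator string from
    // encoding parameters. Pure string assembly; no RIM APIs are used.
    static String build(Map<String, String> params) {
        StringBuilder sb = new StringBuilder("capture://video?");
        boolean first = true;
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (!first) sb.append('&');
            sb.append(e.getKey()).append('=').append(e.getValue());
            first = false;
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> p = new LinkedHashMap<String, String>();
        p.put("encoding", "video/3gpp"); // requested container/codec
        p.put("width", "360");
        p.put("height", "480");
        p.put("fps", "6");
        p.put("video_rate", "16000");    // video bit rate
        p.put("rate", "8000");           // audio sampling rate
        System.out.println(build(p));
    }
}
```

The resulting locator is what gets passed to Manager.createPlayer() in the question's code.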

Related

How to broadcast wowza video from one iPad to multiple iPad devices - swift

I want to broadcast existing videos to multiple users through Wowza...
Suppose I want to broadcast one uploaded video (on the Wowza server) to multiple users; how can I do that? Can Wowza call any API to start streaming on other users' devices? That is, when I start streaming video from my application, it should also start on the other devices through the Wowza API.
Are you talking about broadcasting (streaming) an MP4 file as a simulated Live stream (playout) or as Video On Demand (VOD)?
Obviously you cannot force devices to start playing a stream. That'd only work if you develop an App that can listen for commands and trigger playback accordingly. Wowza doesn't have such an App, nor any built-in features that can do this.
If you want devices to access a stream on-demand you can simply upload the file to Wowza's content folder. If you want to have a programmed playout, like a TV channel, then you can check out this article: https://www.wowza.com/docs/how-to-schedule-streaming-with-wowza-streaming-engine-streampublisher
(the source code of the plug-in that is used in that article is available from https://github.com/WowzaMediaSystems/wse-plugin-streampublisher)
As per your question, you can broadcast a stream successfully, and you may have used the Wowza GoCoder SDK to do it.
When broadcasting a live stream, the recorded videos are stored in the CONTENT directory inside the Streaming Engine installation directory.
You can find all of the streamed videos there.
Now, if you want to broadcast a particular stored video, you can do so by loading the specific URL for that video. Re-broadcasting that video is not possible, but you can play it as shown below, and it will be accessible to all your application users:
The URLs for videos stored in the CONTENT directory are as follows:
iOS: http://[Host Address]:[PORT]/vod/mp4:sample.mp4/playlist.m3u8
Android: rtsp://[Host Address]:[PORT]/vod/sample.mp4
Here, sample is the name of your stream. Broadcast each live stream with a different stream name so that all videos are accessible.
In this way, you can play stored live stream videos.
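The two playback URL patterns above are straightforward to assemble from a host, port, and stream name. A small plain-Java sketch (host and port are placeholders; the path patterns are the ones shown in the answer above):

```java
public class VodUrls {
    // HLS playlist URL, as used on iOS for a file in Wowza's content folder.
    static String hlsUrl(String host, int port, String stream) {
        return "http://" + host + ":" + port + "/vod/mp4:" + stream + ".mp4/playlist.m3u8";
    }

    // RTSP URL, as used on Android for the same file.
    static String rtspUrl(String host, int port, String stream) {
        return "rtsp://" + host + ":" + port + "/vod/" + stream + ".mp4";
    }

    public static void main(String[] args) {
        System.out.println(hlsUrl("example.com", 1935, "sample"));
        System.out.println(rtspUrl("example.com", 1935, "sample"));
    }
}
```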

iOS MobileVLCKit and VideoCore conflict

I'm using MobileVLCKit to stream video and audio down from a Wowza RTMP server. At the same time, I'm using VideoCore to stream audio up to the Wowza RTMP server (I closed off the video channel in VideoCore). I'm attempting to make a sort of teleconferencing solution. I'm limited to RTMP or RTSP rather than a proper teleconferencing stack (WebRTC or SIP or the like, which I am not familiar with at the moment) because of limitations on the other end of the line.
The above setup doesn't work. Running the two functions (video and audio streaming down, audio streaming up) individually works fine, but not simultaneously: audio cannot be heard on the other end. In fact, when the app starts with VideoCore streaming audio upstream, as soon as I start streaming down via MobileVLCKit, audio can no longer be heard on the other end, even though the stream is open. It appears the microphone is somehow wrested away from VideoCore, even though MobileVLCKit should not need the microphone.
However, when I split the two into separate apps and allowed them to run in the background (audio & AirPlay background mode), the two ran fine together, with one app streaming down video and audio and the other picking up microphone audio and streaming it to the other end.
Is there any reason why the two functions appear to be in conflict within the same app, and any ideas on how to resolve the conflict?
I encountered the same problem. Say I have two objects: a VLC player and an audio processor that listens to the microphone. Both work at the same time in the simulator, but they conflict on an iPhone device. I think the root cause is that only one object can hold the right to listen to the microphone, and VLC takes that right, so my audio processor cannot work. For various reasons I cannot modify the VLC code, so I had to find a workaround, and I found one.
The problem is that VLC takes the right but doesn't even use the microphone, while my audio processor does. So the solution becomes clear: start the VLC player first, and only then create the other object instance (the audio processor in my case) that needs to listen to the microphone. Since the audio processor is created after the VLC player, it takes back the right to listen to the microphone, and both work properly.
For your reference; I hope it helps.

how to do live streaming in iphone from the server?

I want to broadcast video from a local server to the iPhone.
I only get a link from the web service, and the video can have any extension.
The video can be in any format, e.g.: avi, wmv, rmvb, flv, f4v, swf, mkv, dat, vob, mts, ogg, mpg, wma.
So, which player is better for my app:
1) MPMoviePlayerController, or
2) AVPlayer?
Please help me.
From the MPMoviePlayerController docs:
Supported Formats
This class supports any movie or audio files that already play correctly on an iPod or iPhone. This includes both streamed content and fixed-length files. For movie files, this typically means files with the extensions .mov, .mp4, .mpv, and .3gp and using one of the following compression standards:
H.264 Baseline Profile Level 3.0 video, up to 640 x 480 at 30 fps. (The Baseline profile does not support B frames.)
MPEG-4 Part 2 video (Simple Profile)
If you use this class to play audio files, it displays a white screen with a QuickTime logo while the audio plays. For audio files, this class supports AAC-LC audio up to 48 kHz, and MP3 (MPEG-1 Audio Layer 3) up to 48 kHz, stereo audio.
You have to use third-party libraries for the case you mention.
The built in media player won't support any of those formats.
Your only option is really a third-party library like VLCKit. I've never used it, but it likely supports the formats you require:
https://wiki.videolan.org/VLCKit/
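As a quick way to apply the format list from the docs quoted above, an app can check a URL's file extension against the natively supported set before deciding to fall back to a third-party player. A plain-Java sketch (the supported set is taken from the MPMoviePlayerController docs quoted above; the URLs are made-up examples):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Locale;
import java.util.Set;

public class NativePlayback {
    // Extensions the built-in iOS player typically handles, per the docs above.
    static final Set<String> SUPPORTED =
            new HashSet<String>(Arrays.asList("mov", "mp4", "mpv", "3gp"));

    // Returns true if the URL's extension is in the natively supported set.
    static boolean nativelyPlayable(String url) {
        int dot = url.lastIndexOf('.');
        if (dot < 0) return false;
        String ext = url.substring(dot + 1).toLowerCase(Locale.ROOT);
        return SUPPORTED.contains(ext);
    }

    public static void main(String[] args) {
        System.out.println(nativelyPlayable("http://example.com/clip.mp4")); // true
        System.out.println(nativelyPlayable("http://example.com/clip.mkv")); // false -> use a third-party player
    }
}
```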
Though I've never tried it before, I am sure you will get help from this Apple documentation.
There is a nice discussion here about your problem. Sorry for not giving a direct answer. Hope this helps.. :)
This AVPlayer SDK documentation may be helpful for you. But given your requirements, you will have to go for a third-party library or your own custom implementation.

Play the Video/Audio which is getting from the server

Is there any solution for the following?
I have the video/audio URLs.
My requirement is:
Is it possible to get the video/audio from the server and, at the same time, open the player to play it (like showing live video directly in a browser field)?
That is:
stream into a buffer in the back end and show it in the player at the same time.
If the above is possible,
I also want to save that particular video/audio stream data to a file.
This BlackBerry KB link explains streaming video from a server. It may help you.
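The buffer-and-save part of the question (play the incoming stream while also writing it to a file) can be sketched as a generic "tee" over the input stream. This is only the general I/O pattern in plain Java, not BlackBerry-specific code, and the stream and sink names are made up for the example:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class TeeStream {
    // Copies everything read from 'in' to both 'playerSink' (the consumer
    // that plays the data) and 'fileSink' (the saved copy), chunk by chunk.
    static void tee(InputStream in, OutputStream playerSink, OutputStream fileSink)
            throws IOException {
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            playerSink.write(buf, 0, n); // feed the player as data arrives
            fileSink.write(buf, 0, n);   // save the same bytes to the file
        }
        playerSink.flush();
        fileSink.flush();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "pretend-media-bytes".getBytes("US-ASCII");
        ByteArrayOutputStream player = new ByteArrayOutputStream();
        ByteArrayOutputStream file = new ByteArrayOutputStream();
        tee(new ByteArrayInputStream(data), player, file);
        System.out.println(player.size() + " " + file.size());
    }
}
```

On a real device the input stream would come from the network connection and the file sink from the filesystem; the copy loop itself is unchanged.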

Play audio file on BlackBerry Storm

I want to play an MP4 audio file from a URL. This code works on a non-touch device, but I am facing a problem playing it on the Storm.
LoginScreen.player = javax.microedition.media.Manager.createPlayer(url); //_source
System.out.println("******************LoginScreen.player" + LoginScreen.player);
playerListener = new MediaPlayerListener();
LoginScreen.player.addPlayerListener(playerListener);
LoginScreen.player.realize();
LoginScreen.player.prefetch();
LoginScreen.player.start();
As you are using an HTTP location, you should specify an interface.
For Wi-Fi, append: ;interface=wifi
Thus your URL should look like this:
String url= "http://yoursite.com/file.m4a;deviceside=true;interface=wifi";
Player player = javax.microedition.media.Manager.createPlayer(url);//_source
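The connection descriptors from the answer above are plain suffixes appended to the URL. A minimal plain-Java helper (the suffix values are the ones shown above; the host is a placeholder):

```java
public class BbUrl {
    // Appends BlackBerry connection-descriptor suffixes to a media URL.
    // ";deviceside=true" requests a direct TCP connection and
    // ";interface=wifi" pins it to the Wi-Fi radio, as in the answer above.
    static String withWifiSuffix(String url) {
        return url + ";deviceside=true;interface=wifi";
    }

    public static void main(String[] args) {
        System.out.println(withWifiSuffix("http://yoursite.com/file.m4a"));
    }
}
```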
Check whether your file was created with the codecs supported by the BlackBerry Storm:
http://docs.blackberry.com/en/smartphone_users/deliverables/18349/711-01774-123_Supported_Media_Types_on_BlackBerry_Smartphones.pdf
That may be the reason why you can't play your audio file.
For MP4 audio files playing on BlackBerry Storm smartphones, the supported codecs are: AAC-LC, AAC+, eAAC+, AMR-NB, QCELP, and EVRC.
In my experience you can sometimes play files created with other codecs, but you will run into problems during playback... =(
Hope it helps.
