How to build a video wall with multiple streams - DirectX

I am attempting to make a mosaic display for our video wall. The mosaic will play video files (WMV) in a matrix view (probably 4x2). I am looking for a programmatic approach to stream multiple videos using MMS.
I have accomplished something similar using the VLC mosaic plugin, but it terminates when the first video finishes playing; I want the videos to loop continuously.
Here is a VLC mosaic example: https://gist.github.com/1367589
First question: which technology would be easiest to implement this in: the DirectX SDK, the Windows SDK for Media Foundation, OpenGL, or libvlc?
Are there any tutorials on coding multi-video playback?

VLC and GStreamer are both good options for this, and each has support for picture-in-picture video mixing.
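If you go the libvlc route, a rough sketch of the idea (not a drop-in implementation) is to run one media player per cell of the wall, each rendering into its own child window, with every clip set to repeat so the wall keeps running after a video ends. The grid size, window handles, and file paths below are placeholders; for MMS sources you could use libvlc_media_new_location with an mms:// URL instead of a file path.

    // Minimal libvlc sketch: one media player per mosaic cell (here a 4x2 wall),
    // each rendered into its own child window and set to repeat its clip.
    // The child HWNDs and file paths are placeholders you would supply yourself.
    #include <vlc/vlc.h>
    #include <cstddef>
    #include <vector>

    struct MosaicCell {
        libvlc_media_player_t *player;
        libvlc_media_t        *media;
    };

    std::vector<MosaicCell> startMosaic(libvlc_instance_t *vlc,
                                        const std::vector<const char *> &files,
                                        const std::vector<void *> &cellWindows)
    {
        std::vector<MosaicCell> cells;
        for (std::size_t i = 0; i < files.size() && i < cellWindows.size(); ++i) {
            libvlc_media_t *media = libvlc_media_new_path(vlc, files[i]);
            // Repeat the clip (practically) forever so a finished video
            // does not leave a black cell on the wall.
            libvlc_media_add_option(media, "input-repeat=65535");

            libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);
            libvlc_media_player_set_hwnd(player, cellWindows[i]); // draw into this cell
            libvlc_media_player_play(player);

            cells.push_back({player, media});
        }
        return cells;
    }

    void stopMosaic(std::vector<MosaicCell> &cells)
    {
        for (MosaicCell &c : cells) {
            libvlc_media_player_stop(c.player);
            libvlc_media_player_release(c.player);
            libvlc_media_release(c.media);
        }
        cells.clear();
    }

You create the libvlc instance once with libvlc_new(0, nullptr) and lay out the eight child windows yourself; alternatively, you can attach a handler for the end-reached event and restart playback there instead of relying on the repeat option.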

Related

SoundCloud waveform generation mechanics and display

I am developing an app for iOS devices that is supposed to display waveforms of music files, like SoundCloud does. I can already generate a waveform for a fully downloaded file; the problem is how to generate a waveform for streaming audio while it is playing. If anyone is aware of how SoundCloud presents its waveforms, please reply.
As far as SoundCloud goes, my guess is that they store some precomputed metadata for each track and use it to draw its waveform. Why does that seem likely? Because the waveform is drawn for each track even before it starts playing, without waiting for the stream. Applying the same approach might be a suitable solution for your issue.
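Whatever format SoundCloud actually uses internally (that part is a guess), the precomputation itself is simple and platform-independent: reduce the decoded samples to a small, fixed number of peak values, store that array with the track, and draw it before playback starts. Here is a rough C++ sketch of the reduction step, assuming you already have the decoded PCM samples; the same idea translates directly to Objective-C on iOS.

    // Sketch: reduce decoded 16-bit PCM samples to numBuckets peak values that
    // can be stored as per-track metadata and drawn as a waveform up front.
    #include <algorithm>
    #include <cstdint>
    #include <cstdlib>
    #include <vector>

    std::vector<float> buildWaveformPeaks(const std::vector<int16_t> &samples,
                                          std::size_t numBuckets = 200)
    {
        std::vector<float> peaks(numBuckets, 0.0f);
        if (samples.empty() || numBuckets == 0)
            return peaks;

        const std::size_t samplesPerBucket =
            std::max<std::size_t>(1, samples.size() / numBuckets);

        for (std::size_t b = 0; b < numBuckets; ++b) {
            const std::size_t begin = b * samplesPerBucket;
            const std::size_t end   = std::min(samples.size(), begin + samplesPerBucket);
            int peak = 0;
            for (std::size_t i = begin; i < end; ++i)
                peak = std::max(peak, std::abs(static_cast<int>(samples[i])));
            peaks[b] = static_cast<float>(peak) / 32768.0f;   // normalise to 0..1
        }
        return peaks;
    }

For the streaming case, you can run the same reduction incrementally on each chunk of audio as it arrives and append new peak values to the array you are drawing.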
However, I suggest checking out this library; it might contain what you are looking for (drawing the waveform while streaming the audio file).
This Q&A might also be helpful for your case.
Hope this helps.

Play video using GPUImage

In my app I have to play an alpha channel video as an overlay over the current view (I'm planning to achieve this using GPUImageAlphaBlendFilter or GPUImageChromaKeyBlendFilter), so I wanted to know whether the output video, after applying these filters, can be played using GPUImage. If it can, could I get some sample code for that?
I know AVAnimator is an option, but I want to apply filters such as brightness and saturation to these overlay videos, and the changes have to be visible while the video is playing, which is why I can't use AVAnimator. That is the next step, though; for now I want to know how to play video using GPUImage.
Thanks in advance! :]
Well, even though I like telling people about AVAnimator, Brad Larson's GPUImage is specifically designed to be a GPU-based filtering framework for iOS apps. Applying real-time effects like the ones you describe is exactly what GPUImage was designed for. See the GPUImage chroma key filter.

iOS: overlaying alpha channel video on another video

I have been trying to create a video template which uses an alpha channel video overlaid on MP4 videos and images.
This is the kind of video I need to create: http://viewptch.ptchcdn.com/rendered/52b28a9f8d4f980f3a3f99c3_cb44bf2b/52b28a9f8d4f980f3a3f99c3_lrg_main_main.mov
For overlaying alpha video on other videos I have used AVAnimator; I succeeded in playing a preview using AVFoundation, AVSynchronizedLayer and AVAnimator.
When rendering video from the composition, the frames of the alpha channel videos render very slowly.
I need to create a video with alpha channel video on top of another video.
Can anyone please suggest possible ways to render a video like http://viewptch.ptchcdn.com/rendered/52b28a9f8d4f980f3a3f99c3_cb44bf2b/52b28a9f8d4f980f3a3f99c3_lrg_main_main.mov ?
You mention that you have looked at AVAnimator; did you download the KittyBoom example project and try it out? The specifics of how it works are detailed in this post. One thing to note: when you build and run on the device, you need to turn Debug mode off, otherwise it will not execute quickly, because a number of extra checks are done in debug mode. Also, make sure to test on an actual device; the simulator is not a good measure of performance on real hardware. Performance is a key problem with video that contains an alpha channel, as iOS does not support alpha-channel video by default.

BlackBerry Sound Graphic Analyser

In my BlackBerry application I need to play/record sound and simultaneously show a sound graphic analyser (as shown in the attached image) within the application. I have searched the forums but have found nothing significant.
I want to show graphics like those shown while playing or recording music, depending on the pitch of the sound. Is this possible?
As far as I know, there is no API in the RIM SDK to obtain such data while playing a media file. But you can analyze the sound file contents yourself, draw the diagram, and implement a "cursor" (the vertical green line in your image) whose position is based on the time elapsed since playback started.
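The cursor part is simple bookkeeping once you have an amplitude diagram precomputed from the file: on each repaint, map the time elapsed since playback started to a column of the diagram. Below is a small, platform-neutral C++ sketch of that mapping; the RIM drawing and media APIs are deliberately left out, and the names are only illustrative.

    // Sketch: map elapsed playback time to the x position of the "cursor"
    // (the vertical line) drawn over a precomputed amplitude diagram.
    #include <cstdint>

    int cursorX(int diagramWidthPx, std::int64_t elapsedMs, std::int64_t totalDurationMs)
    {
        if (totalDurationMs <= 0 || elapsedMs <= 0)
            return 0;
        if (elapsedMs >= totalDurationMs)
            return diagramWidthPx - 1;                     // clamp to the last column
        return static_cast<int>((elapsedMs * diagramWidthPx) / totalDurationMs);
    }

Repaint on a timer while playback or recording is running, and read the bar height for that column from the analysis you computed up front.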

Playing an MPG file in XNA

Is there any way to play MPG files in XNA? (I want to develop a game in which a video stream plays in the background.)
XNA has built-in video playback. A good place to get started using it might be Catalin's XNA 3.1 Video Sample.
One downside to XNA's built-in functionality is that it has limited format support (specifically WMV9). So you will need to convert your video to that format. Two options for encoding are Windows Movie Maker and Windows Media Encoder (which seems to have recently become Expression Encoder 4).
Once in that format, you can simply add it as content to your project. Then load it as a Video through the content manager, and use VideoPlayer to play it back, calling videoPlayer.GetTexture() to get a texture of the current video frame that you can set on the device or pass to spriteBatch.Draw().
