Using the Windows Media Center SDK from C++ Builder or Delphi - delphi

I already have a working application that shows images, videos, etc. I just want to add support for TV channels, and I want to use Windows Media Center from my program.
I have downloaded the Windows Media Center SDK, but I don't know how to instantiate the classes ... Microsoft.MediaCenter, Microsoft.MediaCenter.Hosting, etc.
Any help would be welcome, since I'm a bit lost.
The main goal is to be able to control WMC, start the TV tuner, and change TV channels programmatically.
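For what it's worth, the Microsoft.MediaCenter and Microsoft.MediaCenter.Hosting classes are managed (.NET) assemblies intended for hosted WMC add-ins, so they can't be instantiated directly from native Delphi or C++ Builder code. One avenue to try from native code is driving a running WMC instance with WM_APPCOMMAND remote-control messages. A minimal sketch in C++, assuming WMC is already running and that its main window title is "Windows Media Center" (an assumption; verify it on your machine, e.g. with Spy++):

    #include <windows.h>

    int main() {
        // Locate the running Media Center window. The window title used
        // here is an assumption; check the real one with Spy++ or similar.
        HWND wmc = FindWindowW(NULL, L"Windows Media Center");
        if (wmc == NULL)
            return 1; // WMC is not running

        // WM_APPCOMMAND carries the command in the high-order word of
        // lParam; APPCOMMAND_MEDIA_CHANNEL_UP / _DOWN emulate the channel
        // buttons on a media remote.
        SendMessageW(wmc, WM_APPCOMMAND, (WPARAM)wmc,
                     MAKELPARAM(0, APPCOMMAND_MEDIA_CHANNEL_UP));
        return 0;
    }

The same message can be sent from a Delphi unit through its Windows API wrappers; whether WMC honors every APPCOMMAND is something to test on your own setup.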

Related

UWP Game Capture on Xbox DirectX or MMF

I'm investigating building a "Game Capture" app that works within UWP on Xbox One. For capturing the actual content of the screen during gameplay, there appear to be two ways to go within the wider ecosystem of Microsoft libraries:
DirectX (now part of the Windows API)
Microsoft Media Foundation
With that in mind, my assumption is that DirectX is natively accessible to UWP apps via the Windows Runtime API, and that, aside from limitations on DirectX feature sets and hardware, basic APIs exist for capturing the content of the Xbox's screen.
I'm less sure about MMF; it does expose some interesting access to hardware-accelerated video encoding, but it does not appear to be part of the UWP subset of APIs available on the Xbox.
Beyond the choice of library, are there any other known limitations on developing apps that "capture" the Xbox's screen and run natively on the device?
Thanks
It's not possible at this time.
The Xbox One is a closed platform and not as open as Windows 10 running on a desktop PC, for example.
On a PC it's possible to use existing APIs to capture the output from a game, app, etc. On Xbox One, this is handled by the system only. The console is recording all the time, but the user decides when to save that footage or broadcast it via Twitch, YouTube, etc.
UWP apps running on Xbox One cannot record footage themselves or access the built-in APIs for this functionality.

Do I use AIR or native code for iOS application?

I have been using Flex for desktop development for years. I'm new to mobile development and need to create an app for the iPhone 6, but I have no idea whether I should use Flex/AIR or native code.
The requirements of the iOS app are as follows:
Record video to local storage within app
Playback video from local storage within app
Pause/seek video within app
Overlay controls on top of the video, such as TextInput, TextArea, and DropDownList, so the user can make notes as the video is being recorded
Data entered via TextInput/TextArea/DropDownList is saved to local storage
Option to upload video and text data to a local server
Can I achieve all of this using AIR or should I be using native code?
Can I achieve all of this using AIR?
Yes. Use CameraUI to capture video on the device (using the device's native camera UI), and StageVideo to play the video back (GPU-accelerated). Overlay your app's UI (Flex or your own) to control video playback. Save local data using File, EncryptedLocalStore, or SharedObject. Upload to a server using File.upload().
Or should I be using native code?
That's up to you. There are trade-offs, and everyone will have their own opinion. If you have a lot of experience with Flex/AS3, then you should be able to work out this app in AIR quite easily and see if it meets your needs.
You can achieve this using Adobe AIR, but performance will take a hit.
You will need to put significant effort into optimizing performance, especially for the video-capture functionality.
It is better to do it natively.

"flick" an image from an iPad to a screen

I'm wondering if the following has ever been done before, ideally in Unity.
What I want is to be able to take an image I have on my iPad and send it to a screen to be displayed with a flick gesture. Much like what you do with a window on a computer with dual monitors.
You drag it, and it instantly appears on the other monitor.
If this hasn't been done before, how would you go about making it possible? I know it's going to require a fair deal of networking if I'm to pass an image from one device to another.
This depends strongly on what you want to include under the designation of "monitor" in this particular context.
If it's the screen connected to a computer, you can theoretically do all this by creating an iPad client application and a computer server application, and using some way to communicate between the two (direct Wi-Fi, over the internet, etc.). Of course, you'd need to build separate server applications for the operating systems you'd like to target (Windows, Linux, etc.).
If the screen is a TV, you should check which models are "smart" enough to allow an external wireless connection (or a wired one, if you prefer) that an iPad can use. Then, for each of those TVs, you'd have to check what APIs they provide, and whether those APIs give you access to the connectors in the way you want. Then it's the same story as before: choose which platforms you want to target and create the software for those.
One way you can explore this is with your own tablet and computer (either any Android tablet plus any computer, or an iPad with a Mac). Try doing some simple coding just to see how you can transfer an image from tablet to computer; a minimal server sketch follows below. Once that's done, the presentation part of your iPad application (image gallery, finger-flick support, etc.) can be done with Unity.
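To make that experiment concrete, here is a minimal sketch of the computer-side server in C++ using POSIX sockets. The port number (5555), the 4-byte big-endian length prefix, and the output file name are all assumptions of this sketch, not part of any standard; the tablet client would connect, send the length, then stream the raw image bytes:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdint>
    #include <fstream>
    #include <vector>

    int main() {
        // Listen on an arbitrary port (assumption: 5555).
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = INADDR_ANY;
        addr.sin_port = htons(5555);
        bind(srv, (sockaddr*)&addr, sizeof addr);
        listen(srv, 1);

        int client = accept(srv, nullptr, nullptr);

        // Toy protocol (assumption): a 4-byte big-endian length,
        // followed by the raw JPEG/PNG bytes of the flicked image.
        uint32_t netLen = 0;
        recv(client, &netLen, sizeof netLen, MSG_WAITALL);
        uint32_t len = ntohl(netLen);

        std::vector<char> buf(len);
        size_t got = 0;
        while (got < len) {
            ssize_t n = recv(client, buf.data() + got, len - got, 0);
            if (n <= 0) break;  // connection dropped
            got += static_cast<size_t>(n);
        }

        // Save the image; a real app would hand it to the display layer.
        std::ofstream("flicked_image.jpg", std::ios::binary)
            .write(buf.data(), static_cast<std::streamsize>(got));

        close(client);
        close(srv);
        return 0;
    }

Error handling is stripped to the bone here; a real server would loop on accept() and validate the announced length before allocating.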

Do Apple devices support Windows Media Player?

I have a podcast page where the MP3 podcasts are played in an "object" tag using Windows Media Player, but they don't work on the iPhone, iPad, or iPod touch. Is that because Apple devices don't support WMP?
Do I need to use HTML5 for this?
Yes, Apple devices won't run Microsoft's Windows Media Player plugin.
More generally, if you are building a cross-platform application, you will run into challenges like this.
I would recommend Flowplayer; I have used it in past projects and it works like a charm:
http://flash.flowplayer.org/plugins/javascript/ipad.html
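If you go the HTML5 route instead, iPhone and iPad can play MP3 audio natively through the audio element, with no plugin involved. A minimal sketch (the file name is just a placeholder):

    <audio controls preload="none">
      <source src="podcast.mp3" type="audio/mpeg">
      Your browser does not support the audio element.
    </audio>

Desktop browsers that understand HTML5 pick this up too, so it can serve as the primary player rather than just an Apple fallback.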

Saving video screencast of iPhone application

Is there a way to capture video of the screen from your own application? As far as I can see, there is no way to do it with UIImagePickerController (cameras only), but maybe there is a way with the iOS 4 AV Foundation or Core Video frameworks?
There seem to be two ways of capturing the content of the application while it's running:
Use the private UIGetScreenImage() API, which Apple now seems to accept;
Use the following thread's captureView method to capture the image.
You'll have to capture it many times per second (I'd guess 24 frames per second should be enough for persistence of vision), then produce the movie from those frames. Perhaps you could use the ffmpeg iPhone port.
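For the movie-production step, the usual pattern is to dump each captured frame as a numbered image and stitch them together afterwards. For illustration, with a desktop build of ffmpeg (the frame naming pattern and output name are assumptions):

    ffmpeg -framerate 24 -i frame%04d.png -c:v libx264 -pix_fmt yuv420p screencast.mp4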
Alternatively, if you'd like to capture your application running for a demo, your best bet is to run it in the simulator and use Mac OS X screencast software to capture it. See also SimFinger, a "bundle of little tricks to make a screen capture of the iPhone Simulator suck less".
Finally, perhaps the following StackOverflow thread might help you produce better screencasts.
SimFinger and ScreenFlow are great if you can shoot in the simulator.
If you have to shoot on the device (e.g., when the accelerometer, GPS, or camera are used), you currently have to resort to the jailbreak world. The app "Display Recorder", available for $5 in the Cydia Store, lets you create an AVI movie of the iPhone's screen content. This works across all apps. There's a YouTube video showing it. The movie files can then be uploaded to YouTube or pulled off the iPhone via the built-in web server.
