Is it possible to create an application for BBOS10 that shares its screen with another phone or PC over a Wi-Fi network, just like Radmin does on Windows?
I presume you are talking about sharing the screen regardless of content, rather than sharing the screen for a specific application that you have written.
I am not aware of any APIs for doing this.
This leaves, I believe, two options, which are loosely:
capture screen shots and forward these
capture a video of the screen and forward that
Now the screenshot API has been available since fairly early in BB10's evolution. To use it, you would create a background thread and take screenshots at regular intervals, which you would then send, presumably over a socket interface, to the receiving user. I suspect the biggest issue with this is that it is likely to be extremely data heavy, since the screenshots are complete images, as opposed to a streaming video, which is (in my understanding) typically encoded as a series of diffs from the preceding frame.
Until very recently, it has not been possible to capture video of the BB screen, but it seems that with 10.2 you now can. Please review this thread:
Capture Video
on the BB10 forum.
Looking at this, it would appear you can capture each video frame and forward it, or presumably capture the entire stream and forward that.
I want to build an application that blocks screenshots taken by the user, as the Netflix application does; a screenshot should come back as a black image, as it does in Netflix. So far I have been unable to find anything about this.
I don't know how Netflix handles this.
Is there any way to detect a captured image and obscure it?
Netflix (and other providers) use "FairPlay Streaming", which is what prevents capture of the video content: https://developer.apple.com/streaming/fps/
Note that you can take a screenshot of the Netflix UI (menus and such), just not the streaming content.
If you do a little searching, you will find plenty of discussion explaining that you cannot block screen captures in iOS.
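The closest you can get is detecting that a screenshot was taken, after the fact. Here is a minimal Swift sketch using UIApplication.userDidTakeScreenshotNotification (available since iOS 7); note that it fires after the image has already been saved, so it detects rather than blocks:

import UIKit

final class ScreenshotObserver {
    private var token: NSObjectProtocol?

    // iOS posts this notification *after* the screenshot has already been
    // saved, so the app can react (warn the user, blur sensitive views for
    // next time) but cannot retroactively black out the captured image.
    func start() {
        token = NotificationCenter.default.addObserver(
            forName: UIApplication.userDidTakeScreenshotNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("Screenshot detected")
        }
    }

    deinit {
        if let token = token {
            NotificationCenter.default.removeObserver(token)
        }
    }
}

As far as I can tell, the black frames in Netflix screenshots come from FairPlay's protected decode path, not from the app intercepting the capture.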
Even with the release of watchOS 2, it seems developers still can't access the iPhone's camera for a live feed/stream. So I was wondering about different ways this could be done; basically, I want to see on my Apple Watch what is visible through my iPhone's camera.
Apple's built-in Camera app (on the watch) does this, and so does the garage door control app.
I read an approach to the above problem in this post and found it quite plausible, but I still have a few questions and doubts:
Camera live View in apple watch
At what frame rate should the images be captured? For example, for 10 frames/second, should a method to capture an image be fired every 0.1 seconds?
Aside from this, should the image be shared from the iPhone to the Watch using a shared app group or MMWormhole? Or is MMWormhole just for "notifications" between the devices about what's going on?
Secondly, do I need to physically save each image on the device, transfer it, and delete it (from the iPhone and the Apple Watch) once the next image comes in, or can I just work with an in-memory image object?
Aside from this, I also want to show an overlay frame over the camera feed (like the frames that image-editing apps on the iPhone show). So should I merge/overlay my frame image onto the feed image directly on the iPhone side before sending the frame over to the watch?
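For concreteness, the loop being discussed might look roughly like the Swift sketch below, under some loud assumptions: captureCurrentFrame() is a hypothetical stand-in for however you pull frames out of your AVCaptureSession pipeline, overlayImage is your decorative frame, and instead of MMWormhole or a shared app group it uses watchOS 2's WatchConnectivity (an already-activated WCSession). It fires every 0.1 s for roughly 10 frames/second, composites the overlay on the phone, and sends JPEG bytes in memory, so nothing is ever written to disk or needs deleting:

import UIKit
import WatchConnectivity

final class FrameStreamer {
    private var timer: Timer?
    private let overlayImage: UIImage

    init(overlayImage: UIImage) {
        self.overlayImage = overlayImage
    }

    // 10 frames/second => fire every 0.1 s.
    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            self?.sendFrame()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }

    private func sendFrame() {
        // Hypothetical helper: however you grab the latest camera frame.
        guard let frame = captureCurrentFrame() else { return }

        // Composite the overlay on the phone so the watch only ever has
        // to display a single finished image.
        let composed = compose(frame: frame, overlay: overlayImage)

        // Heavy JPEG compression keeps each message small. The Data object
        // travels in memory over WCSession; no file is saved or deleted on
        // either device. Assumes the session is already activated.
        guard let data = composed.jpegData(compressionQuality: 0.3),
              WCSession.default.isReachable else { return }
        WCSession.default.sendMessageData(data, replyHandler: nil, errorHandler: nil)
    }

    private func captureCurrentFrame() -> UIImage? {
        return nil // placeholder: return the most recent camera frame here
    }

    private func compose(frame: UIImage, overlay: UIImage) -> UIImage {
        let renderer = UIGraphicsImageRenderer(size: frame.size)
        return renderer.image { _ in
            frame.draw(at: .zero)
            overlay.draw(in: CGRect(origin: .zero, size: frame.size))
        }
    }
}

On the watch side, WCSessionDelegate's session(_:didReceiveMessageData:) hands you the bytes to turn back into a UIImage. Compositing on the phone also settles the overlay question: the watch only ever displays one finished image.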
I'm looking for a way for my users to take a video (defaulting to the front-facing camera, but with the ability to switch) lasting 2 seconds, display that video immediately, and have it loop indefinitely (no controls displayed), essentially mimicking a 2-second GIF. I would like to do this in-app so they can see the video before posting, and potentially retake it. Any ideas? I've found some functionality here: https://developer.apple.com/library/ios/documentation/MediaPlayer/Reference/MPMoviePlayerController_Class/index.html#//apple_ref/c/tdef/MPMovieControlStyle but it doesn't seem to address the entire problem set.
Check out this guide from Apple:
Using Video
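For the chrome-less looping display half of the problem, one route (a sketch, not necessarily the guide's approach) is AVFoundation: an AVPlayerLayer draws no transport controls, and observing AVPlayerItemDidPlayToEndTime lets you rewind and replay for a GIF-style loop. Recording the 2-second clip, e.g. via UIImagePickerController with the front camera, is left out:

import UIKit
import AVFoundation

// A view that plays a short clip on repeat with no visible controls.
final class LoopingClipView: UIView {
    private let player = AVPlayer()
    private var loopToken: NSObjectProtocol?

    // Back the view with an AVPlayerLayer; since no player view controller
    // is involved, no transport controls are ever drawn.
    override class var layerClass: AnyClass { AVPlayerLayer.self }
    private var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

    func play(url: URL) {
        let item = AVPlayerItem(url: url)
        playerLayer.player = player
        playerLayer.videoGravity = .resizeAspectFill
        player.replaceCurrentItem(with: item)

        loopToken = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemDidPlayToEndTime,
            object: item,
            queue: .main
        ) { [weak self] _ in
            // Rewind and resume: an indefinite, GIF-style loop.
            self?.player.seek(to: .zero)
            self?.player.play()
        }
        player.play()
    }
}

On iOS 10 and later, AVPlayerLooper with an AVQueuePlayer does the rewinding for you and avoids the brief seek hiccup.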
My app uses AVQueuePlayer to show video clips back to back. In testing on my Apple TV, it seems that when I switch to the next video in the queue, there is a small gap during which the Apple TV 'takes over' the screen and its home screen is displayed. Is there any way to prevent this gap? Even a black screen or a loading indicator would be a better experience.
I was having this exact same problem and was dismayed to see that no one had found an answer yet.
The solution I ended up going with involved taking advantage of the multiple-display support added in iOS 5. There are some helpful links for learning how to do this here:
https://developer.apple.com/library/ios/documentation/WindowsViews/Conceptual/WindowAndScreenGuide/Introduction/Introduction.html#//apple_ref/doc/uid/TP40012555-CH1-SW1
http://blog.redfin.com/devblog/2012/05/creating_a_dual-screen_airplay_experience_for_ios_and_apple_tv.html#.UjCe5mRATZY
The app I am designing basically looks for the availability of an external screen. If it finds one, it plays the videos on the view controller I provide for that external screen. That view controller can have a black background, so any pause between videos looks natural and no longer has the Apple TV home screen pop into view for a moment.
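In Swift, the screen-detection piece might look roughly like this; the class and controller names are my own, but UIScreen's connect/disconnect notifications are the real (pre-window-scene) API. Wire your AVQueuePlayer's AVPlayerLayer into the black-background controller:

import UIKit

final class ExternalDisplayManager {
    private var externalWindow: UIWindow?
    private var tokens: [NSObjectProtocol] = []

    func startObserving() {
        // An external (AirPlay or HDMI) screen may already be attached at launch.
        if UIScreen.screens.count > 1 {
            attach(to: UIScreen.screens[1])
        }

        tokens.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main
        ) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.attach(to: screen)
        })

        tokens.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil, queue: .main
        ) { [weak self] _ in
            self?.externalWindow = nil // tear down when the screen goes away
        })
    }

    private func attach(to screen: UIScreen) {
        // A black-background controller: any pause between queued videos
        // now shows black instead of the Apple TV home screen.
        let playbackVC = UIViewController()
        playbackVC.view.backgroundColor = .black

        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        window.rootViewController = playbackVC
        window.isHidden = false
        externalWindow = window
    }
}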
I want to play interactive (user input/action) Flash content and videos on iOS devices. I have FLV files in which users can provide inputs such as option selection, page turns, etc.
I have two approaches in mind for this functionality. Please correct me if I am wrong:
1. Adobe AIR can be used on iPad devices. Does it have the ability to parse Flash content at runtime (i.e., use Flash content as resources/a bundle)?
2. With the help of the FFmpeg library, Flash files/videos will play, but will it support user actions/interactions?
No to both. Adobe AIR can only be used to create applications; you can't actually play Flash files with it. FFmpeg will only play Flash video; it will not allow interaction.
Basically, if you have interactive Flash content that you want to display on the iPhone, you are going to have to think of a different way to present it.
Since AIR 3.8 it has become much easier to run SWF content on iOS, though it is a pain if your content is external, yes...
Also, I was able to play dynamic video using Starling, with FLV in the background, at 12 fps on an iPod and 24 fps on iPhones and iPads.
The key there is how you upload the bitmapData to the GPU:
flash.display3D.textures.Texture(videoLayer.texture.base).uploadFromBitmapData(bitmapData);
Where videoLayer is a Starling Image and bitmapData is a BitmapData drawn from the video.
The rest of the code is trivial stuff that you can find easily online, but Texture.uploadFromBitmapData is what really made a performance difference.
See it in action in the app store: https://itunes.apple.com/app/id723967141?mt=8