Camera feed from iPhone to Watch and image editing

Even with the release of watchOS 2, it seems developers still can't access the iPhone's camera for a live feed/stream. So I was wondering about the different ways this could be done: basically, I want to see on my Apple Watch what is visible through my iPhone's camera.
Apple's built-in Camera app (on the Watch) does this, and so does the garage door control app.
I read an approach to this problem in the post below and found it quite feasible, but I still have a few questions and doubts:
Camera live View in apple watch
At what frame rate should the images be captured? For example, for 10 frames/second, should a capture method be fired every 0.1 seconds?
Also, should the image be shared from the iPhone to the Watch using a shared app group or MMWormhole? Or is MMWormhole just for "notifications" between devices about what's going on?
Secondly, do I need to physically save each image on the device, transfer it, and delete it (from the iPhone and the Apple Watch) once the next image arrives, or can I just pass an image object?
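For what it's worth, here is a minimal sketch of one way the transfer could be wired up with MMWormhole. The app group identifier and the latestCameraFrame helper are placeholders you must supply yourself, and the MMWormhole method names are from its README, so double-check them. MMWormhole writes each message into the shared app-group container and signals the other process with a Darwin notification, so there is no manual save/transfer/delete bookkeeping.

#import "MMWormhole.h"

@interface FeedSender : NSObject
@property (nonatomic, strong) MMWormhole *wormhole;
@property (nonatomic, strong) NSTimer *timer;
- (UIImage *)latestCameraFrame; // placeholder: your AVCaptureSession's latest frame
@end

@implementation FeedSender

- (void)start {
    // "group.com.example.camfeed" is a placeholder app group identifier.
    self.wormhole = [[MMWormhole alloc]
        initWithApplicationGroupIdentifier:@"group.com.example.camfeed"
                         optionalDirectory:@"feed"];
    // 10 frames/second means firing the capture method every 0.1 s.
    self.timer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                  target:self
                                                selector:@selector(sendFrame)
                                                userInfo:nil
                                                 repeats:YES];
}

- (void)sendFrame {
    // Heavy JPEG compression keeps each payload small enough to move
    // ten times a second.
    NSData *jpeg = UIImageJPEGRepresentation([self latestCameraFrame], 0.4);
    // Each write replaces the previous message in the shared container,
    // so old frames never need to be deleted by hand.
    [self.wormhole passMessageObject:jpeg identifier:@"latestFrame"];
}

@end

On the Watch side, listenForMessageWithIdentifier:listener: hands the NSData back, and [UIImage imageWithData:] turns it into something a WKInterfaceImage can display.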
Besides this, I also want to show an overlay frame over the camera feed (like the frames image-editing apps on the iPhone show, etc.).
For this, should I merge/overlay my frame image onto the feed image on the iPhone side before sending the image frame over to the Watch?
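Merging on the iPhone side is the simpler route, since the Watch then only ever receives one flattened image. A minimal sketch using UIKit, assuming frame and overlay are the same size:

#import <UIKit/UIKit.h>

// Flatten the overlay onto the camera frame before sending, so the
// Watch receives a single composited image.
UIImage *CompositeOverlay(UIImage *frame, UIImage *overlay) {
    UIGraphicsBeginImageContextWithOptions(frame.size, YES, 1.0);
    [frame drawInRect:(CGRect){CGPointZero, frame.size}];
    [overlay drawInRect:(CGRect){CGPointZero, frame.size}]; // drawn on top
    UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return merged;
}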

Related

Video Preview Layer Running While in Split-View?

I currently have an app that displays the front-facing camera in a video preview layer. By default in iOS 9, the preview layer is interrupted/paused and will not resume until Split View is dismissed. Given the nature of the app, keeping the camera preview layer running while multitasking is essential.
Is there any way to force the capture session to continue previewing while in Split View?
Update: It seems Apple does not allow any camera use while the device has more than one application open. You can, however, invoke UIImagePickerController to take a photo while in Split View. Of course, this solution only lets you snap a single photo and nothing more. Hope this helps someone!
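For anyone needing it, a minimal sketch of that UIImagePickerController fallback; the presenting view controller is assumed to adopt both UINavigationControllerDelegate and UIImagePickerControllerDelegate:

#import <UIKit/UIKit.h>

// Fallback for Split View: present UIImagePickerController to grab a
// single photo, since a live AVCaptureSession stays paused.
- (void)takePhotoInSplitView {
    if (![UIImagePickerController isSourceTypeAvailable:
            UIImagePickerControllerSourceTypeCamera]) {
        return; // no camera available
    }
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *photo = info[UIImagePickerControllerOriginalImage];
    // ...use the photo...
    [picker dismissViewControllerAnimated:YES completion:nil];
}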

Trying to create an Xcode Objective-C function that records a video capture of my UIView contents and saves to phone

I'm trying to create an Xcode Objective-C function, callable from a button tap, that records the contents of a UIView and its subviews (or a fixed section of the screen, e.g. 320x320 in the center) and then lets the user save the video to their iPhone camera roll. I also want to include the audio being played by the app at the time of the recording, e.g. background music and sound effects.
I'm having trouble finding any info on this, as there seem to be a lot of people trying to record their running app for external purposes like the App Store video preview. I need the video captured within the app, to be used as an app feature.
Does anyone know if this can be done, or know a website or tutorial where I can learn what's needed? Thanks
I know this post is two years old, but for anybody who comes along needing to record their iOS app's screens and save them to the phone's camera roll, or even to a specific URL, take a look at https://github.com/alskipp/ASScreenRecorder
I've tried it and it works! The frame rate isn't 60 fps, so I don't know how well it would work if you were trying to record an action game, but it's still pretty awesome.
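For a quick start, here is roughly how the project's README wires it to a button tap; the method names below are from memory of that README, so verify them against the repo:

#import "ASScreenRecorder.h"

// Toggle recording from a button; by default ASScreenRecorder writes
// the finished video to the camera roll.
- (IBAction)recordButtonTapped:(id)sender {
    ASScreenRecorder *recorder = [ASScreenRecorder sharedInstance];
    if (recorder.isRecording) {
        [recorder stopRecordingWithCompletion:^{
            NSLog(@"Recording finished");
        }];
    } else {
        [recorder startRecording];
    }
}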
You can't do that with just one function; check out this project:
https://github.com/coolstar/RecordMyScreen

BlackBerry OS 10 development to share screen

Is it possible to create an application for BB10 that shares the screen with another phone/PC on a Wi-Fi network, just like Radmin on Windows?
I presume you are talking about sharing the screen regardless of content, rather than sharing the screen for a specific application that you have written.
I am not aware of any APIs for doing this.
This leaves, I believe, two options, which are loosely:
capture screen shots and forward these
capture a video of the screen and forward that
Now, the screenshot API has been available since fairly early in BB10's evolution. To use it you would just create a background thread and take screenshots at regular intervals, which you would then send, presumably over a socket interface, to the receiving user. I suspect the biggest issue with this is that it is likely to be extremely data-heavy, since the screenshots are complete images, as opposed to streaming video, which is (in my understanding) typically a series of diffs from the preceding frame. A structure-only sketch of this loop follows.
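To make the shape of the first option concrete: capture_screen_png() and send_all() below are hypothetical placeholders, not real BB10 APIs; the actual screenshot call lives in the BB10 native SDK. The point is only the structure: a background thread grabbing full frames at an interval and pushing them over a socket, which is why the data volume is high compared with a diff-based video stream.

#include <stdlib.h>
#include <unistd.h>

extern volatile int keep_running;                              /* set to 0 to stop */
unsigned char *capture_screen_png(size_t *len);                /* hypothetical */
void send_all(int sock, const unsigned char *buf, size_t len); /* hypothetical */

void *screen_share_loop(void *arg) {
    int sock = *(int *)arg;            /* connected TCP socket to the receiver */
    while (keep_running) {
        size_t len = 0;
        unsigned char *png = capture_screen_png(&len);
        send_all(sock, png, len);      /* every frame is a complete image */
        free(png);
        usleep(200 * 1000);            /* ~5 full frames per second */
    }
    return NULL;
}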
Until very recently, it was not possible to capture video of the BB10 screen, but it seems that with 10.2 you now can. Please review this thread on the BB10 forum:
Capture Video
Looking at this, it appears you can capture each video frame and forward it, or presumably capture the entire stream and forward that.

iOS Concurrent Camera Usage

I am developing a background app that periodically makes use of the camera. This app is for my jailbroken device, so public SDK restrictions are not a problem. I want to take a photo of the different places I go during the day, in an automatic manner.
I am using AVCaptureSession and grabbing frames from its video output so that no shutter sound is produced.
The problem is that if another application is using the camera (the Camera app, for instance) and my app tries to take a photo, the Camera interface gets frozen, and I then need to reopen the Camera app for it to work again. I guess this is because AVCaptureSession's startRunning method blocks the use of previous camera sessions.
Is there any way to use the camera in a shared mode, or concurrently?
I don't know if there is some kind of mixWithOthers property, like the one provided for audio compatibility.
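For reference, a minimal sketch of the silent-capture setup described above, using AVCaptureVideoDataOutput so frames arrive without any shutter sound; the containing class is assumed to adopt AVCaptureVideoDataOutputSampleBufferDelegate:

#import <AVFoundation/AVFoundation.h>

// Frames are delivered to -captureOutput:didOutputSampleBuffer:fromConnection:
// silently, because nothing here uses the still-image/photo pipeline.
- (AVCaptureSession *)makeSilentCaptureSession {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) {
        [session addInput:input];
    }

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("frames",
                                                          DISPATCH_QUEUE_SERIAL)];
    [session addOutput:output];

    // startRunning is the call that takes exclusive ownership of the
    // camera; AVFoundation documents no shared/"mixWithOthers"-style
    // mode for capture sessions.
    [session startRunning];
    return session;
}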

Is face detection possible on iPad camera without showing camera feed?

I am creating an iPad app to be used in a public kiosk, and am playing with different ways of knowing when a user is present, has gone away, or a new user has come along. I wondered, therefore, whether this would be possible using the iOS 5 Core Image face detector, but I need to be able to use the camera feed without actually showing it to the user. Is this in any way possible?
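One way this could work, sketched below: run a capture session with no preview layer attached and feed each frame to CIDetector (available since iOS 5) in the sample-buffer callback, so nothing is ever drawn on screen. This assumes a session/output setup like the one in the previous question and a class adopting AVCaptureVideoDataOutputSampleBufferDelegate:

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// No preview layer exists anywhere, so the feed is never visible;
// faces are detected directly on the raw pixel buffers.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixels = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixels];

    static CIDetector *detector = nil;
    if (!detector) {
        // Low accuracy is fine for presence detection and much cheaper.
        detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                      context:nil
                                      options:@{CIDetectorAccuracy :
                                                    CIDetectorAccuracyLow}];
    }

    BOOL userPresent = ([detector featuresInImage:image].count > 0);
    // React to userPresent here: reset the kiosk, greet a new user, etc.
    (void)userPresent;
}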
