iOS Concurrent Camera Usage

I am developing a background app that periodically makes use of the camera. The app is for my jailbroken device, so public SDK restrictions are not a problem. I want to automatically take a photo of the different places I visit during the day.
I am using AVCaptureSession and grabbing frames from the video output so that no shutter sound is produced.
The problem is that if another application is using the camera (the Camera app, for instance) and my app tries to take a photo, the Camera interface freezes. I then have to relaunch the Camera app to get it working again. I guess this is because the startRunning method of AVCaptureSession blocks any camera session that was already running.
Is there any way to use the Camera in shared mode, or concurrently?
I don't know whether there is some kind of mixWithOthers property, like the option AVAudioSession provides for audio compatibility.
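For reference, here is a minimal sketch of that kind of setup, using AVCaptureVideoDataOutput to grab frames with no shutter sound (the delegate queue and error handling are illustrative, not from the original post):

    #import <AVFoundation/AVFoundation.h>

    // Minimal sketch: silent frame capture via AVCaptureSession.
    // self must conform to AVCaptureVideoDataOutputSampleBufferDelegate.
    - (void)startSilentCapture {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (!input) return; // camera unavailable or already claimed
        [session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [output setSampleBufferDelegate:self
                                  queue:dispatch_queue_create("capture", NULL)];
        [session addOutput:output];

        // startRunning claims the camera exclusively: iOS interrupts any
        // other session using the device, which matches the freezing
        // behavior described above.
        [session startRunning];
    }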

Related

multiple webRTC replaceTrack() calls for swapping two video streams

I'm making a video call application with two video sources, camera and screen recording, and I let users switch between camera and screen recording multiple times during a call.
RTCRtpSender.replaceTrack() allows me to replace the camera stream with the screen record stream.
But if I try to switch back to the camera stream, it doesn't work on iOS.
On Android, I found this document. It seems that replaceTrack() auto-disposes the track when it's no longer needed; if I use setTrack(_, takeOwnership=false) instead, it works fine.
I wonder if I can do the same on ios.
I'm developing a Flutter application and using this library, but I wonder whether this is even possible in native iOS.
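For what it's worth, in the stock Objective-C WebRTC framework the sender's track is an ordinary settable property and tracks are plain ARC objects, so holding strong references to both tracks should keep the camera track alive across switches. A rough sketch (property names are illustrative):

    // Sketch for native iOS WebRTC: keep both tracks in strong properties so
    // ARC never deallocates the idle one, then reassign the sender's track.
    @property(nonatomic, strong) RTCVideoTrack *cameraTrack;
    @property(nonatomic, strong) RTCVideoTrack *screenTrack;
    @property(nonatomic, strong) RTCRtpSender *videoSender;

    - (void)useScreenShare:(BOOL)useScreen {
        // RTCRtpSender.track is settable; reassigning it swaps the outgoing
        // video source without renegotiation, like replaceTrack().
        self.videoSender.track = useScreen ? self.screenTrack : self.cameraTrack;
    }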

Camera feed from iPhone to Watch and image editing

Even with the release of watchOS 2, it seems developers still can't access the iPhone's camera for a live feed/stream. So I was wondering about different ways this could be done: basically, I want to see on my Apple Watch what is visible through my iPhone's camera.
Apple's built-in camera app on the Watch does this, and so does the garage door control app.
I read an approach to this problem in the post below and found it quite feasible, but I still have a few questions and doubts:
Camera live View in apple watch
At what frame rate should the images be captured? For example, for 10 frames/second, should a capture method be fired every 0.1 seconds?
Besides that, should the image be shared from the iPhone to the Watch using a shared App Group or MMWormhole? Or is MMWormhole just for passing "notifications" between the devices about what's going on?
Secondly, do I need to physically save each image on the device, transfer it, and delete it (from both the iPhone and the Apple Watch) once the next image comes in, or can I just pass an image object?
Besides this, I also want to show an overlay frame over the camera feed (like the frames that image editing apps on the iPhone show).
So should I merge/overlay my frame image onto the feed image on the iPhone side, before sending each frame over to the Watch?
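To make the questions concrete, here is a hedged sketch of the iPhone side, assuming MMWormhole's passMessageObject:identifier: over a shared App Group (the identifier, throttle property, and interval are illustrative); it composites the overlay on the phone and sends an in-memory JPEG, with no manual file saving or deleting:

    // iPhone side (sketch): throttle to ~10 fps, composite the overlay onto
    // the frame, then push the JPEG to the Watch via MMWormhole. MMWormhole
    // archives the object into the shared app group container itself, so no
    // manual save/transfer/delete bookkeeping is needed.
    - (void)sendFrame:(UIImage *)frame overlay:(UIImage *)overlay {
        NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];
        if (now - self.lastSendTime < 0.1) return; // 0.1 s => ~10 fps
        self.lastSendTime = now;

        // Composite on the phone; the Watch only displays the result.
        UIGraphicsBeginImageContextWithOptions(frame.size, YES, 1.0);
        [frame drawAtPoint:CGPointZero];
        [overlay drawInRect:(CGRect){CGPointZero, frame.size}];
        UIImage *composited = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        NSData *jpeg = UIImageJPEGRepresentation(composited, 0.5);
        [self.wormhole passMessageObject:jpeg
                              identifier:@"cameraFrame"]; // illustrative id
    }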

Trying to create an Xcode Objective-C function that records a video capture of my UIView contents and saves to phone

I'm trying to create an Xcode Objective-C function, callable from a button tap, that records the contents of a UIView and its subviews (or a fixed section of the screen, e.g. 320x320 in the center) and then lets the user save the video to their iPhone camera roll. I also want to include the audio the app is playing at the time of the recording, e.g. background music and sound effects.
I'm having trouble finding any info on this, as there seem to be a lot of people trying to record their running app for external purposes like App Store video previews. I need the video captured within the app, to be used as an app feature.
Does anyone know if this can be done, or know a website or tutorial where I can learn what's needed? Thanks.
I know this post is two years old, but for anybody who comes along and needs to record their iOS app's screens and save the result to the camera roll or a specific URL, take a look at https://github.com/alskipp/ASScreenRecorder
I've tried it and it works! The frame rate isn't 60 fps, so I don't know how well it would handle an action game, but it's still pretty awesome.
You can't do that with just one function; check out this project:
https://github.com/coolstar/RecordMyScreen
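For reference, the core technique both of those projects rely on is snapshotting the view's layer into a bitmap each frame and appending the pixels to an AVAssetWriter. A hedged sketch of a single frame of that loop (writer/adaptor setup, timing, and the audio track are omitted):

    // Sketch: render one frame of a UIView into a pixel buffer and append it
    // to an AVAssetWriter via a pixel buffer adaptor.
    - (void)appendFrameOfView:(UIView *)view
                       atTime:(CMTime)time
                      adaptor:(AVAssetWriterInputPixelBufferAdaptor *)adaptor {
        CVPixelBufferRef buffer = NULL;
        CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &buffer);
        CVPixelBufferLockBaseAddress(buffer, 0);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context =
            CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                  CVPixelBufferGetWidth(buffer),
                                  CVPixelBufferGetHeight(buffer),
                                  8,
                                  CVPixelBufferGetBytesPerRow(buffer),
                                  colorSpace,
                                  kCGImageAlphaNoneSkipFirst);

        // renderInContext: draws the view's whole layer tree into the bitmap.
        [view.layer renderInContext:context];
        [adaptor appendPixelBuffer:buffer withPresentationTime:time];

        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(buffer, 0);
        CVPixelBufferRelease(buffer);
    }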

How to record video when application is in background ios

How can I record/capture video from my application so that, once recording has started, it continues even when I press the home button and the application goes to the background?
Can the GPUImage framework help with this?
You can't.
Applications running in the background don't have access to the camera. GPUImage also requires access to OpenGL ES, which background applications are likewise not allowed to use.

Is face detection possible on iPad camera without showing camera feed?

I am creating an iPad app to be used in a public kiosk, and am experimenting with different ways of knowing when a user is present, has gone away, or a new user has come along. I wondered whether this would be possible using iOS 5's Core Image face detection, but I need to use the camera feed without actually showing it to the user. Is this possible in any way?
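It should be: an AVCaptureSession does not need an AVCaptureVideoPreviewLayer attached, so you can feed frames from an AVCaptureVideoDataOutput straight into a CIDetector with nothing drawn on screen. A rough sketch (the presence logic is left as a stub):

    // Sketch: face detection on live camera frames with no visible preview.
    // The session is set up as usual with a video data output, but no
    // preview layer is ever added to the view hierarchy.
    - (void)setUpDetector {
        self.faceDetector =
            [CIDetector detectorOfType:CIDetectorTypeFace
                               context:nil
                               options:@{CIDetectorAccuracy : CIDetectorAccuracyLow}];
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        CVPixelBufferRef pixels = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *frame = [CIImage imageWithCVPixelBuffer:pixels];
        NSArray *faces = [self.faceDetector featuresInImage:frame];
        if (faces.count > 0) {
            // a user is present; update kiosk state here
        }
    }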
