I am creating an iPad app that is to be used in a public kiosk, and am experimenting with different ways of knowing when a user is present, has gone away, or a new user has come along. I wondered, therefore, whether this would be possible using iOS 5's Core Image face detection, but I need to be able to use the camera feed without actually showing it to the user. Is this in any way possible?
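For what it's worth, Core Image face detection doesn't require a visible preview at all: you can attach an AVCaptureVideoDataOutput to the capture session and run each frame through a CIDetector without ever adding an AVCaptureVideoPreviewLayer. A minimal Swift sketch of that idea (modern API names rather than the iOS 5-era ones; the PresenceDetector class and onPresenceChange callback are just illustrative):

```swift
import AVFoundation
import CoreImage

// Sketch: camera frames -> CIDetector, with no preview layer anywhere,
// so nothing from the camera is ever drawn on screen.
final class PresenceDetector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let detector = CIDetector(ofType: CIDetectorTypeFace, context: nil,
                                      options: [CIDetectorAccuracy: CIDetectorAccuracyLow])!
    var onPresenceChange: ((Bool) -> Void)?   // true while a face is visible

    func start() throws {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .front)
        else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "face-detect"))
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let faces = detector.features(in: CIImage(cvPixelBuffer: pixelBuffer))
        DispatchQueue.main.async { self.onPresenceChange?(!faces.isEmpty) }
    }
}
```

In practice you'd probably want to throttle detection to a few frames per second and debounce the presence signal, since running the detector on every frame is wasteful for a kiosk.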
I want to add enhanced casting to my iOS app.
In simple terms, I want to be able to customise the view on the screen the user is casting to.
For example:
The user is watching a video on an iOS device.
The user taps AirPlay to cast the video to an external device, such as a TV.
The device's screen still shows the video, but the external display shows the video with some custom widgets on it, plus a window showing the feed from the iOS device's camera.
I couldn't find any related information on the web; maybe you'll have an idea of what I can do and what is possible?
Thanks!
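In case it helps, the classic way to do this (before window scenes) is to enable AirPlay mirroring and then, when an external UIScreen appears, hang your own UIWindow on it; whatever you put in that window replaces the mirrored image, while the device keeps showing its own UI. A rough Swift sketch under those assumptions (ExternalViewController is a placeholder for your video-plus-widgets-plus-camera layout):

```swift
import UIKit

// Sketch: when AirPlay mirroring (or a cable) attaches an external screen,
// put a custom window on it instead of the mirrored image.
final class ExternalDisplayController {
    private var externalWindow: UIWindow?
    private var tokens: [NSObjectProtocol] = []

    func startObserving() {
        let center = NotificationCenter.default
        tokens.append(center.addObserver(forName: UIScreen.didConnectNotification,
                                         object: nil, queue: .main) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.attach(to: screen)
        })
        tokens.append(center.addObserver(forName: UIScreen.didDisconnectNotification,
                                         object: nil, queue: .main) { [weak self] _ in
            self?.externalWindow = nil   // tear the external window down
        })
    }

    private func attach(to screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        // Placeholder: your own controller containing the video, the widgets,
        // and a camera view (e.g. an AVCaptureVideoPreviewLayer).
        window.rootViewController = ExternalViewController()
        window.isHidden = false
        externalWindow = window
    }
}
```

On iOS 13+ the same idea is expressed through UIWindowScene instead, and you'd also want to check UIScreen.screens at launch in case the display was already connected.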
I've developed a dual-camera app in iOS with this layout. While developing it, I came across the challenge of saving the video in this specific layout. I couldn't think of any other way of recording the video, so I came up with the solution of recording the entire screen and saving that as a video. But recently I've seen apps that do not record the screen and yet are able to save the video in a custom layout; one example of such an app is this. Now I'm unable to figure out how to produce and save a video in a custom layout without screen recording (since that's not a perfect solution), so any help in this regard would be appreciated.
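One approach that avoids screen recording entirely, assuming both camera feeds are (or can be) saved as ordinary video files: lay the two tracks out with an AVMutableVideoComposition and export the result. A hedged Swift sketch; frontURL/backURL and the 30%-scale corner placement are just illustrative:

```swift
import AVFoundation
import CoreGraphics

// Sketch: compose two recorded clips picture-in-picture style, then export.
// No screen recording involved; the layout is baked in by the composition.
func composePiP(frontURL: URL, backURL: URL,
                completion: @escaping (URL?) -> Void) {
    let composition = AVMutableComposition()
    let frontAsset = AVAsset(url: frontURL)
    let backAsset = AVAsset(url: backURL)

    guard
        let backTrack = composition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid),
        let frontTrack = composition.addMutableTrack(withMediaType: .video,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid),
        let backSource = backAsset.tracks(withMediaType: .video).first,
        let frontSource = frontAsset.tracks(withMediaType: .video).first
    else { completion(nil); return }

    let range = CMTimeRange(start: .zero, duration: backAsset.duration)
    try? backTrack.insertTimeRange(range, of: backSource, at: .zero)
    try? frontTrack.insertTimeRange(range, of: frontSource, at: .zero)

    // Back camera full-frame; front camera scaled down into a corner.
    let backInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: backTrack)
    let frontInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: frontTrack)
    let scale = CGAffineTransform(scaleX: 0.3, y: 0.3)
    frontInstruction.setTransform(scale.concatenating(
        CGAffineTransform(translationX: 40, y: 40)), at: .zero)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = range
    instruction.layerInstructions = [frontInstruction, backInstruction]  // first = on top

    let videoComposition = AVMutableVideoComposition()
    videoComposition.instructions = [instruction]
    videoComposition.renderSize = backSource.naturalSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)

    let outURL = FileManager.default.temporaryDirectory.appendingPathComponent("pip.mp4")
    let export = AVAssetExportSession(asset: composition,
                                      presetName: AVAssetExportPresetHighestQuality)
    export?.videoComposition = videoComposition
    export?.outputURL = outURL
    export?.outputFileType = .mp4
    export?.exportAsynchronously {
        completion(export?.status == .completed ? outURL : nil)
    }
}
```

If you need the layout baked in live rather than after the fact, the alternative is an AVAssetWriter whose pixel buffers you compose yourself (e.g. with Core Image) as the frames arrive.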
Even with the release of watchOS 2, it seems developers still can't access the iPhone's camera for a live feed/stream. So I was wondering about different ways this could be done; basically, I want to see on my Apple Watch what is visible through my iPhone's camera.
Apple's built-in Camera app (on the Watch) does this, and so does the garage-door control app.
I read an approach to this problem in the post below and found it quite plausible, but I still have a few questions and doubts:
Camera live View in apple watch
At what frame rate should the images be captured? For example, for 10 frames/second, should a capture method be fired every 0.1 seconds?
Also, should the image be shared from the iPhone to the Watch using a shared app group or MMWormhole? Or is MMWormhole just for "notifications" between the devices about what's going on?
Secondly, do I need to physically save each image on the device, transfer it, and delete it (from both the iPhone and the Apple Watch) once the next image comes in, or can I just pass an image object?
Besides this, I also want to show an overlay frame over the camera feed (like the frames that image-editing apps on the iPhone display).
So, for this, should I merge/overlay my frame image onto the feed image on the iPhone side before sending the frame over to the Watch?
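For what it's worth: yes, 10 frames/second means firing a capture roughly every 0.1 s; with watchOS 2 the usual transport is WatchConnectivity's WCSession rather than a shared app group or MMWormhole (MMWormhole's Darwin-notification approach was built around watchOS 1 extensions); and nothing forces you to write files to disk, since a frame can travel as an in-memory Data object. Merging the overlay on the iPhone before sending is also the cheaper option, as the Watch has little headroom for compositing. A rough Swift sketch of the iPhone side under those assumptions (FrameStreamer and latestFrame are illustrative, and the 156x195 target size is only a guess at a Watch-friendly resolution):

```swift
import WatchConnectivity
import UIKit

// Sketch: a timer fires every 1/fps seconds, grabs the latest camera frame
// (however you capture it), shrinks it hard, and ships the JPEG bytes over
// WCSession. No files are saved or deleted; the image is in-memory Data.
final class FrameStreamer: NSObject, WCSessionDelegate {
    private var timer: Timer?
    var latestFrame: UIImage?   // set this from your AVCapture delegate

    func activate() {
        guard WCSession.isSupported() else { return }
        WCSession.default.delegate = self
        WCSession.default.activate()
    }

    func startStreaming(fps: Double = 10) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / fps,
                                     repeats: true) { [weak self] _ in
            guard let image = self?.latestFrame,
                  WCSession.default.isReachable else { return }
            // Downscale and compress aggressively; the Watch link is slow.
            let target = CGSize(width: 156, height: 195)
            let small = UIGraphicsImageRenderer(size: target).image { _ in
                image.draw(in: CGRect(origin: .zero, size: target))
            }
            if let data = small.jpegData(compressionQuality: 0.3) {
                WCSession.default.sendMessageData(data, replyHandler: nil,
                                                  errorHandler: nil)
            }
        }
    }

    // WCSessionDelegate requirements (left empty for the sketch).
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}
    func sessionDidBecomeInactive(_ session: WCSession) {}
    func sessionDidDeactivate(_ session: WCSession) {}
}
```

On the Watch side, the WCSessionDelegate's didReceiveMessageData callback would decode the bytes back into an image and display it.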
I'm trying to create an Xcode Objective-C function that can be called from a button tap, which will record the contents of a UIView and its subviews (or a fixed section of the screen, e.g. 320x320 in the center) and then allow the user to save the video to their iPhone camera roll. I also want to include the audio being played by the app at the time of the recording, e.g. background music and sound effects.
I'm having trouble finding any info on this, as there seem to be a lot of people trying to record their running app for external purposes, like the App Store video preview. I need the video captured within the app, to be used as an app feature.
Does anyone know if this can be done, or know of a website or tutorial where I can learn what's needed? Thanks
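One way this is commonly done without any screen-recording API: snapshot the view on a CADisplayLink and push the frames into an AVAssetWriter. A hedged Swift sketch of the video half (ViewRecorder is illustrative; app audio would need a second AVAssetWriterInput fed from an audio tap such as AVAudioEngine's installTap, which is omitted here):

```swift
import AVFoundation
import CoreImage
import UIKit

// Sketch: capture one UIView's contents frame by frame with a CADisplayLink
// and feed the frames to an AVAssetWriter.
final class ViewRecorder: NSObject {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private let ciContext = CIContext()
    private var displayLink: CADisplayLink?
    private var startTime: CFTimeInterval = 0
    private let view: UIView

    init(view: UIView, outputURL: URL) throws {
        self.view = view
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let size = view.bounds.size
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: Int(size.width),
            AVVideoHeightKey: Int(size.height)])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
                kCVPixelBufferWidthKey as String: Int(size.width),
                kCVPixelBufferHeightKey as String: Int(size.height)])
        writer.add(input)
        super.init()
    }

    func start() {
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
        startTime = CACurrentMediaTime()
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick() {
        guard input.isReadyForMoreMediaData,
              let pool = adaptor.pixelBufferPool else { return }
        var buffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &buffer)
        guard let pixelBuffer = buffer else { return }

        // Snapshot the view hierarchy, then blit the result into the buffer.
        let snapshot = UIGraphicsImageRenderer(bounds: view.bounds).image { _ in
            view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
        }
        guard let cgImage = snapshot.cgImage else { return }
        ciContext.render(CIImage(cgImage: cgImage), to: pixelBuffer)

        let time = CMTime(seconds: CACurrentMediaTime() - startTime,
                          preferredTimescale: 600)
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }

    func stop(completion: @escaping () -> Void) {
        displayLink?.invalidate()
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

Once finishWriting completes, the file at outputURL can be handed to UISaveVideoAtPathToSavedPhotosAlbum (or the Photos framework) to land in the camera roll.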
I know this post is two years old, but for anybody who comes along needing to record their iOS app's screen and save it to the phone's camera roll, or even to a specific URL, take a look at https://github.com/alskipp/ASScreenRecorder
I've tried it and it works! The frame rate isn't 60 fps, so I don't know how well it would work for recording an action game, but it's still pretty awesome.
You can't do that with just one function; check out this project:
https://github.com/coolstar/RecordMyScreen
I am developing a background app that periodically makes use of the camera. The app is for my jailbroken device, so public SDK restrictions are not a problem. I want to take a photo of the different places I go during the day, automatically.
I am using AVCaptureSession and grabbing frames from the video output so that no shutter sound is produced.
The problem is that if another application is using the camera (the Camera app, for instance) and my app tries to take a photo, the Camera interface freezes, and I need to reopen the Camera app for it to work again. I guess this is because AVCaptureSession's startRunning method blocks any camera session that was already in use.
Is there any way to use the camera in a shared mode, or concurrently?
I don't know whether there is some kind of mixWithOthers property, like the one provided for audio compatibility.
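As far as I know there is no mixWithOthers equivalent for the camera; the capture pipeline is exclusive, so the practical fix is to back off while someone else holds it. A hedged Swift sketch using the session interruption notifications (whether these fire reliably for another app's session on a jailbroken device is something you'd have to test; PoliteCapturer is illustrative):

```swift
import AVFoundation

// Sketch: the camera is exclusive hardware, so rather than fighting a
// foreground app for it, watch for interruptions and defer the shot.
final class PoliteCapturer {
    private let session = AVCaptureSession()
    private var interrupted = false
    private var tokens: [NSObjectProtocol] = []

    func observe() {
        let center = NotificationCenter.default
        tokens.append(center.addObserver(
            forName: AVCaptureSession.wasInterruptedNotification,
            object: session, queue: .main) { [weak self] _ in
                self?.interrupted = true
                self?.session.stopRunning()   // release the hardware promptly
        })
        tokens.append(center.addObserver(
            forName: AVCaptureSession.interruptionEndedNotification,
            object: session, queue: .main) { [weak self] _ in
                self?.interrupted = false
        })
    }

    func takePhotoIfIdle(_ capture: () -> Void) {
        guard !interrupted else { return }   // skip; retry on the next timer tick
        if !session.isRunning { session.startRunning() }
        capture()
    }
}
```

Keeping the session stopped between shots, and only starting it for the moment a photo is needed, also reduces the window in which your app and the Camera app collide.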