iOS: using both cameras at the same time? [duplicate]

This question already has an answer here:
How can I use front camera and back camera simultaneously on the iPhone
(1 answer)
Closed 9 years ago.
Is there anything in the HIG or in Apple's docs that says you cannot use the front camera and the back camera simultaneously to take a photo? Or is this technically impossible because of some limitation? Thank you

It is not possible, and not allowed, to access both cameras simultaneously.
You cannot use both cameras at the same time, because as soon as one camera session begins, the other stops.

Related

Record videos simultaneously with both of iOS device's front and back cameras [closed]

Closed. This question needs debugging details. It is not currently accepting answers.
Edit the question to include desired behavior, a specific problem or error, and the shortest code necessary to reproduce the problem. This will help others answer the question.
Closed 4 years ago.
Is it possible to record videos simultaneously with both front and back cameras? I have seen apps doing this for capturing images, but I have not seen a single app recording video simultaneously from both front and back camera.
From the AVFoundation Programming Guide:
Note: Media capture does not support simultaneous capture of both the front-facing and back-facing cameras on iOS devices.

How do I detect if a node has left the screen using SpriteKit [duplicate]

This question already has answers here:
How do you tell if a node is on the screen spritekit swift
(6 answers)
Closed 5 years ago.
I'm making a game, and I want to check whether a node (the "player" node) has left the screen.
There are two ways I thought of doing it:
Check the x and y coordinates, and if they exceed the screen boundary, run gameOver().
Place a node just off screen; if the player node comes into contact with that node, run gameOver().
I have no idea how I would do either of these, so could someone please help? Thanks!
It depends on the number of exit points. If there are many possible positions off screen, check the coordinates. If there are only a few, place nodes at those spots with a property marking them as off-screen.
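The coordinate-check approach boils down to comparing the node's bounding box against the scene's visible rectangle. A language-agnostic sketch in Python (SpriteKit would do this with a node's position and the scene's size; the 32×32 node size and 750×1334 scene size below are just example values):

```python
def is_off_screen(x, y, width, height, scene_w, scene_h):
    """Return True when a node centered at (x, y) with the given size
    no longer overlaps the visible scene rectangle (origin at the
    bottom-left, as in SpriteKit's coordinate system)."""
    half_w, half_h = width / 2, height / 2
    return (x + half_w < 0 or x - half_w > scene_w or
            y + half_h < 0 or y - half_h > scene_h)

# A 32x32 node centered at (-20, 100) on a 750x1334 scene has
# crossed the left edge entirely:
print(is_off_screen(-20, 100, 32, 32, 750, 1334))  # True
```

In a real scene you would run this check in the per-frame update callback and call gameOver() when it returns True.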

Detect Bump On Bottom Of iPhone [duplicate]

This question already has answers here:
Detect when an iphone has been bumped
(5 answers)
Closed 6 years ago.
I have been doing a lot of research in the area of the iPhone's magnetometer, gyroscope, and accelerometer, but can't seem to reach a good conclusion.
Things I have tried so far:
Acceleration in one direction to detect a stop. (Always triggers while the person is simply holding the phone and moving it in that direction.)
Gyroscope angle changing direction depending on how the person is holding the device. (Not consistent, because I am trying to get shake detection to work with this as well.)
Bump API/Existing Code (Does not work)
Has anyone come across a solid solution for detecting a tap on the bottom of an iPhone, against an object? Preferably with sample code.
Perhaps this is what you are looking for? It's a set of custom wrappers around the hardware that is designed specifically for detecting "accidents", which sounds like what you're looking for.
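A common way to approach this, independent of any wrapper library, is to look for a short, sharp spike in acceleration magnitude: a tap against a hard object produces a much larger momentary reading than ordinary hand movement. A hypothetical sketch in Python (on iOS the samples would come from CoreMotion's accelerometer updates; the 1.5 g threshold is an assumption you would have to tune):

```python
import math

def detect_bump(accel_samples, spike_g=1.5):
    """Flag a bump when the acceleration magnitude (in g units)
    exceeds spike_g -- a short, sharp spike typical of a tap
    against a hard surface, as opposed to slow hand movement.

    accel_samples: iterable of (x, y, z) readings in g.
    """
    for x, y, z in accel_samples:
        if math.sqrt(x * x + y * y + z * z) > spike_g:
            return True
    return False

# At rest the magnitude is ~1 g (gravity); a tap spikes well past it:
print(detect_bump([(0.0, 0.0, 1.0), (0.5, 0.5, 2.0)]))  # True
```

To distinguish a bottom bump specifically, you could additionally require that the spike's dominant component lies along the device's vertical axis.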

Lua smooth damp/tween algorithm [duplicate]

This question already has an answer here:
Smooth damp or tween algorithm
(1 answer)
Closed 8 years ago.
I am trying to make an FPS but I need help on how to do smooth damp on my gun. It currently follows the mouse's position exactly but I want it to take a second to get there. Like a delay. I need this in Lua and can't use libraries.
Yes! I did some thinking and came up with an algorithm. Basically, I take the mouse's current position and, whenever it differs from the gun's position, divide the difference by 5 and add that to the gun's position each frame.
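That divide-by-5 idea is exponential smoothing: each frame the value closes a fixed fraction of the remaining gap, so it eases toward the target instead of snapping. A minimal sketch (shown in Python for clarity; the core line is a single expression and ports to Lua directly, e.g. `gun.x = gun.x + (mouse.x - gun.x) / 5`):

```python
def smooth_damp(current, target, smoothing=5.0):
    """Move 1/smoothing of the remaining distance toward the target.
    Called once per frame, this eases the value in rather than
    jumping straight to the target."""
    return current + (target - current) / smoothing

# Example: gun x-position chasing the mouse at x = 100
x = 0.0
for _ in range(3):
    x = smooth_damp(x, 100.0)
# after 3 frames: 0 -> 20 -> 36 -> 48.8
print(round(x, 1))  # 48.8
```

Larger smoothing values give a longer delay; smoothing = 1 removes the lag entirely. Note the fraction-per-frame form is frame-rate dependent, which is usually acceptable for a fixed-timestep game loop.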

ios detect heart rate [duplicate]

This question already has answers here:
Detecting heart rate using the camera
Closed 10 years ago.
I need the same functionality as the application Instant Heart Rate.
The basic process requires the user to:
Place the tip of the index finger gently on the camera lens.
Apply even pressure and cover the entire lens.
Hold it steady for 10 seconds and get the heart rate.
This can be accomplished by turning the flash on and watching the light change as blood moves through the index finger.
How can I start?
I'd start by using AVFoundation to turn the light on. The answers in the linked post below include examples of how to do this:
How to turn the iPhone camera flash on/off?
Then as far as detecting the light change goes, you can probably use Brad Larson's GPUImage Framework. This framework includes a couple of helpful functions that you may be able to use to achieve this, including:
GPUImageAverageLuminanceThresholdFilter
GPUImageAverageColor
GPUImageLuminosity
Using the filters listed above you should be able to measure color variations in your finger, and monitor the time intervals between the occurrences of these changes. Using this framework you may even be able to specify an arbitrary variance requirement for the color/luminosity change.
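Once you have a per-frame average luminance (or color) value, detecting a heartbeat reduces to finding rising edges in that signal: each pulse briefly changes how much light passes through the fingertip. A hedged sketch of the idea in Python (the threshold and sample values are assumptions, not GPUImage API; in practice you would feed in the filter's per-frame output):

```python
def detect_beats(timestamps, samples, threshold):
    """Return the timestamps at which the luminance signal crosses
    above `threshold` on a rising edge -- one crossing per beat."""
    beats = []
    above = False
    for t, s in zip(timestamps, samples):
        if s > threshold and not above:
            beats.append(t)  # rising edge: count a new pulse
        above = s > threshold
    return beats

# A toy signal oscillating once per second around a 0.5 threshold:
print(detect_beats([0.0, 0.5, 1.0, 1.5, 2.0, 2.5],
                   [0.2, 0.8, 0.2, 0.8, 0.2, 0.8],
                   0.5))  # [0.5, 1.5, 2.5]
```

Real camera data is noisy, so you would likely want to smooth the signal and derive the threshold from a running average rather than hard-coding it.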
From there all you have to do is convert the time interval in between the color changes to a pulse. Here's an example of how to calculate pulse.
http://www.wikihow.com/Calculate-Your-Target-Heart-Rate
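The conversion from beat timestamps to a pulse is simple arithmetic: average the intervals between successive beats, then divide into 60 to get beats per minute. A minimal sketch:

```python
def bpm_from_beats(beat_times):
    """Convert a list of beat timestamps (seconds) to beats per
    minute by averaging the intervals between successive beats."""
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    avg_interval = sum(intervals) / len(intervals)
    return 60.0 / avg_interval  # seconds-per-beat -> beats-per-minute

# Beats arriving 0.8 s apart correspond to a 75 BPM pulse:
print(bpm_from_beats([0.0, 0.8, 1.6, 2.4]))  # 75.0
```

Averaging over the full 10-second window the app asks for helps smooth out jitter in individual beat detections.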
