When using the iPhone Camera app with the rear-facing camera, the video-mode torch option enables the True Tone flash, switching on all 4 LEDs (2 "white", 2 warmer).
However, when accessing the camera in video mode through AVCaptureDevice, the torchMode options are only "on | off | auto", and none of them enables the additional 2 warmer LEDs.
Is there a hidden function that enables this?
More generally, how are features like this enabled for the stock apps but not for others? Is it a case of hidden functions in the API that are possible to find, or something more low-level?
In addition to torchMode on/off, there is also a torch level (set with setTorchModeOn(level:) / setTorchModeOnWithLevel:) that takes a value from 0.0 to 1.0. I have tested this on several models of the iPhone, using a simple app I created to play with the torch level value (a rough sketch of it is below). Here are the results.
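The core of the test app boils down to the following (a minimal Swift sketch; the device lookup and error handling here are simplified assumptions):

    import AVFoundation

    // Minimal sketch: drive the torch to an arbitrary level (0.0 turns it off).
    func setTorch(level: Float) {
        guard let device = AVCaptureDevice.default(for: .video), device.hasTorch else { return }
        do {
            try device.lockForConfiguration()
            if level <= 0 {
                device.torchMode = .off
            } else {
                // Valid levels are in (0.0, 1.0]; maxAvailableTorchLevel is 1.0.
                try device.setTorchModeOn(level: min(level, AVCaptureDevice.maxAvailableTorchLevel))
            }
            device.unlockForConfiguration()
        } catch {
            print("Torch configuration failed: \(error)")
        }
    }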
Using the built-in iPhone flashlight
iPhone 6s white LED only (one LED)
iPhone 8 white LEDs only (two LEDs)
iPhone 10S white LEDs only (two LEDs)
iPhone SE2 white LEDs only (two LEDs)
All phones use only the white LEDs, through four levels of brightness, in the built-in flashlight.
Using the built-in Camera app
iPhone 6s - iOS 13.7
white and yellow LEDs when flash turned from OFF to ON in video mode
white and yellow LEDs when flash is turned from AUTO to ON in video mode
white LED only when flash is left ON in video mode but you switch to photo mode and back to video mode
iPhone 8 - iOS 13.7
white and yellow LEDs on when flash turned on (AUTO) in video mode in a dark environment
iPhone SE - iOS 13.7 and iPhone 10S - iOS 14.0
white and yellow LEDs on when flash is turned on (AUTO) AND recording in a dark environment
All phones tend to use both the white and yellow LEDs while recording video, except that the iPhone 6s can be "tricked" into using just the white LED.
Using setTorchLevel = 0.0 through 1.0
iPhone 6s - iOS 13.7
Brightness transition levels with the white LED only (yellow LED off); each line is the torchLevel value at which the brightness steps up, followed by the step number:
0.0 OFF
0.005 1
0.5 2
0.835 3
0.995 4
1.000 5 - Full Brightness
The iPhone 6s only lights up the White LED when using setTorchLevel to adjust the camera LED brightness.
iPhone 8 - iOS 13.7, iPhone SE - iOS 13.7, iPhone 10S - iOS 14.0
On the newer phones, both the white and yellow LEDs are normally active when setting the torch level, but you can "trick" the phones into using mainly the white LEDs.
Brightness transition levels when the white and yellow LEDs are active (torchLevel value at each step-up, followed by the step number):
0.0 OFF
0.005 1
0.125 2
0.165 3
0.245 4
0.285 5
0.325 6
0.405 7
0.445 8
0.525 9
0.565 10
0.605 11
0.685 12
0.725 13
0.805 14
0.845 15
0.885 16
0.965 17
1.000 18 - Full Brightness
To turn off the yellow LEDs, use the Camera app to take a flash picture in a dark environment. After that, setting torchLevel = 1.00 turns on the white LEDs at full power and the yellow LEDs at less than half power.
Reducing the torch level slowly from this point drops the white LED brightness as in the table above and slowly extinguishes the yellow LEDs.
By half power the yellow LEDs are nearly out. Switching back to full power yields the white LEDs on full and the yellow LEDs barely on.
As the torch level is reduced, the yellow LEDs remain at very low power and are fully extinguished at some levels (0.660, 0.330, 0.180, 0.100 and others).
Once the torch level is reduced to 0.090, the yellow LEDs "come alive" and their brightness tracks the white LED brightness over the full range of torch levels until you take another flash picture with the Camera app.
Related
We are capturing video from iOS devices and processing it using OpenGL but are encountering a FPS difference between the devices when trying to set them at their maximum rate.
The iPhone 6S and iPad Pro FaceTime cameras are maxing at 60 fps.
The 7+ and iPhone X however are maxing at 30 fps.
It seems unusual that the FPS would decrease with later versions of Apple's hardware. We are trying to figure out if this is a software issue or simply the technical specifications for the frame rate for the FaceTime camera.
We looked on Apple's technical specifications pages, but the frame rates of the FaceTime cameras aren't listed (though the rear cameras' are). For example:
https://www.apple.com/iphone-6s/specs/
https://www.apple.com/iphone-x/specs/
What are the FPS of the FaceTime Camera for the iPhone 6S vs 7+ vs X vs iPad Pro?
It looks like you’re correct. According to Apple’s iOS Device Compatibility Reference...
iPhone 6s series front camera does up to 60 FPS (all formats)
iPhone 7 series front camera does up to 30 FPS
iPhone 8 series / X front camera does up to 60 FPS, but only in binned formats (so you probably need to set the device's activeFormat, not just choose a preset; see the sketch after this list)
iPad Pro info is also in the tables at that link.
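If it helps, selecting and applying such a format could look roughly like this (a minimal Swift sketch; the device lookup and error handling are simplified assumptions):

    import AVFoundation

    // Minimal sketch: pick the front camera format with the highest supported
    // frame rate and lock the device to it (instead of relying on a session preset).
    func configureFrontCameraForMaxFrameRate() {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .front) else { return }

        var bestFormat: AVCaptureDevice.Format?
        var bestRate = 0.0
        for format in device.formats {
            for range in format.videoSupportedFrameRateRanges where range.maxFrameRate > bestRate {
                bestRate = range.maxFrameRate
                bestFormat = format
            }
        }

        guard let format = bestFormat else { return }
        do {
            try device.lockForConfiguration()
            device.activeFormat = format
            let frameDuration = CMTime(value: 1, timescale: CMTimeScale(bestRate))
            device.activeVideoMinFrameDuration = frameDuration
            device.activeVideoMaxFrameDuration = frameDuration
            device.unlockForConfiguration()
        } catch {
            print("Could not lock device for configuration: \(error)")
        }
    }

Note that setting activeFormat directly takes priority over (and effectively replaces) whatever session preset was chosen.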
I'm looking into AR.js for an augmented reality use case where 3D objects do not appear directly on the hiro marker, but somewhere around the marker.
When I view my AR scene through my iPhone 7 from the top, everything looks fine, but when I tilt the camera to get more perspective, AR.js doesn't apply the same perspective to the AR world, so distant virtual objects appear as if they were located on an inclined plane.
I created an example page to illustrate this behaviour: When viewed from above, the grids match perfectly, but when viewed from the side, the planes mismatch.
Are there any settings I can apply to configure AR.js (or ARToolKit, which it depends on)? Maybe there's a way to define the field of view of my webcam there?
[EDIT] One week later, I would reword my question to: How can I use a device-specific camera_para.dat ARToolkit camera calibration file in AR.js without generating side effects such as a distorted rendering?
Updating the intrinsic optical characteristics of the camera, also known as calibration, might help!
The artoolkitx-calibration app is made for calibrating cameras. Unfortunately, the app is not available on the App Store currently. You can, however, deploy it to your development device using Xcode.
Alternatively, the ARToolKitX calibration server might contain camera calibration results for your smartphone. Unfortunately, it returns 204 (No Content) for the iPhone 7 (a.k.a. apple/iPhone/iPhone9,3, camera 0, aspect ratio 16:9).
By the way, camera_para.dat files for several older iOS devices can be found on GitHub:
iPad 2: 0.7 MP
iPad Air 2: 8 MP, f/2.4
iPad Mini 3: 5 MP, f/2.4
iPhone 4: 5 MP, f/2.8
iPhone 4s: 8 MP, f/2.4
iPhone 5: 8 MP, f/2.4
iPhone 5s: 8 MP, f/2.2 (similar to iPhone 6, iPhone 6 Plus)
iPhone 6s Plus: 12 MP, f/2.2 (similar to iPhone SE, iPhone 6s)
Unfortunately, newer iPhone cameras have different specs (e.g. iPhone 7 or iPhone 8: 12 MP, f/1.8) so I doubt that any of these camera calibration settings would fit perfectly for them...
Is there an efficient or programmatic way to force .png files to display as the same color regardless of whether they are viewed on an iPad Pro or an iPad with a Retina display (for my purposes, iPad 3 and later, excluding the Pro)?
My iPad app contains a large set of art that is basically just red and blue lines all saved as .png files. They are meant to be viewed with the most commonly available red/blue 3D glasses on the market. We can't change the physical glasses that people use to look at the app to get that pop out 3D effect.
The tolerance between all of the iPads with retina displays has been fine for our purposes. But once we run the app on an iPad Pro, the blue is no longer the same color blue as on the retina display.
This is with the iPad Pro brightness at max, True Tone either on or off, and Night Shift off (I also tried it with Night Shift on, but that just gives a different blue).
Is the best solution to have all of the art re-created and use a UIDevice call to display one set for an iPad with retina display and another set of art for an iPad Pro?
The app is written entirely in Objective-C and is for iPad only.
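If the two-art-sets route is taken, one possible check, sketched here in Swift even though the app is Objective-C, and assuming the relevant difference is the Pro's wide-color (P3) panel rather than anything a UIDevice call exposes, is to branch on the trait collection's displayGamut:

    import UIKit

    // Sketch: pick between two hypothetical art sets based on the screen's display gamut
    // (P3 wide color on recent iPad Pros vs. sRGB on earlier Retina iPads).
    func imageNameForCurrentDisplay(baseName: String) -> String {
        switch UIScreen.main.traitCollection.displayGamut {
        case .P3:
            return baseName + "_p3"    // hypothetical naming convention for the Pro art set
        default:
            return baseName + "_srgb"  // hypothetical naming convention for the Retina art set
        }
    }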
Let's assume my home screen has a UIView that is 80% black. Does this help save battery? I know the iPhone's backlight will be on no matter what the screen color is, but does having more black pixels put less strain on the battery?
That's true for OLED displays only, but the iPhone uses an LCD display. So the answer is: a black screen will not save you any battery.
I made an iPhone app using Apple's SceneKit. On an iPhone 5s the app runs at 60 fps with very occasional drops to 40 fps (I think that happens when not much is changing in the scene). I tried to run the same app, with absolutely no changes to the code, on an iPad 3, and it has a huge lag; the fps never goes above 16. I compared the GPU models of the 5s and the iPad 3, and they are both A7. Why am I experiencing such a drastic drop in fps? Is this a hardware problem, meaning I need to lower the graphical intensity of my app?
Edit: the iPad 3 has been in use for 3 years and runs iOS 9; the 5s is 7 months old and runs iOS 8.4.
They have different GPUs and CPUs.
The iPad 3 was the first with a Retina display, which means 4 times the display resolution of the iPad 2. It was quickly replaced by the iPad 4, which had a stronger GPU with a higher clock speed. By the way, the iPad 3 had an ARM Cortex-A9 CPU at 1 GHz, while the iPad 4 came with one of Apple's first own CPU designs, called Apple Swift, running at 1.4 GHz.
As for your scene: it might be best to create fewer objects on that device, or to exclude it from the compatibility list.
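A rough Swift sketch of that idea (the memory threshold and the "detail_" node-naming convention are purely hypothetical):

    import SceneKit

    // Sketch: on hardware that looks like iPad 3-class devices, cap the frame rate
    // and strip decorative nodes so the scene stays lighter.
    func configureScene(_ scene: SCNScene, in view: SCNView) {
        // Heuristic only: iPad 3-era devices have well under 1.5 GB of RAM.
        let isLowEndDevice = ProcessInfo.processInfo.physicalMemory < 1_500_000_000

        if isLowEndDevice {
            view.preferredFramesPerSecond = 30
            view.antialiasingMode = .none

            // Hypothetical convention: decorative nodes are named with a "detail_" prefix.
            scene.rootNode.childNodes(passingTest: { node, _ in
                node.name?.hasPrefix("detail_") == true
            }).forEach { $0.removeFromParentNode() }
        }
    }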