Selecting a specific camera through Qt when multiple 'lenses' are present? (iOS)

I have a Qt 5.15 QWidgets application (targeted primarily at iOS). I would like the user to be able to select a specific rear-facing camera on a phone that has multiple rear-facing cameras of different focal lengths (e.g. the iPhone 12 Pro Max has three rear-facing cameras: ultra-wide (0.5x), wide (1x) and telephoto (2.5x)). Qt Quick solutions are also welcome.
const QList<QCameraInfo> cameras = QCameraInfo::availableCameras(); returns only two cameras on the iPhone (one front, one rear), even though there are really four (one front, three rear).
QCameraFocus::zoomTo can increase the digital zoom of the rear-facing camera, but changing the optical zoom has no effect; QCameraFocus::maximumOpticalZoom returns 1.0 for the rear camera as well.
How can I use any of the rear-facing cameras other than the default with Qt?
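
One direction that seems workable: Qt 5.15's iOS multimedia backend appears to surface only one logical device per position, so reaching the individual lenses means calling AVFoundation directly from the native side of the Qt project, e.g. via an Objective-C++ file. Below is a minimal Swift sketch of the discovery call such a bridge would wrap; the builtInUltraWideCamera device type assumes iOS 13+.

import AVFoundation

// Enumerate the individual back lenses that QCameraInfo does not surface.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInUltraWideCamera, .builtInWideAngleCamera, .builtInTelephotoCamera],
    mediaType: .video,
    position: .back
)

for device in discovery.devices {
    print(device.localizedName, device.deviceType.rawValue)
}

// A capture session built on one of these devices uses that physical lens,
// bypassing Qt's camera selection entirely.
if let tele = discovery.devices.first(where: { $0.deviceType == .builtInTelephotoCamera }) {
    let session = AVCaptureSession()
    if let input = try? AVCaptureDeviceInput(device: tele), session.canAddInput(input) {
        session.addInput(input)
    }
    // Attach outputs or a preview layer here, then call session.startRunning().
}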

Related

Is there a way to detect real darkness with iOS camera devices?

I am developing an app for blind users, and I am blind too. Working with photos, images, cameras and so on is not rewarding work. I am encountering this brain teaser:
I implemented an auto torch mode algorithm. The torch switches on and off almost appropriately based on the EXIF brightness values. The problem is that in a dark scene the torch switches on and off repeatedly as it meets the trigger condition. What I cannot determine is which darkness is real and which is artificial. For example, if I put my hand behind the camera the darkness is artificial; if I put my phone on the table the darkness is not real either. What kind of data and values could I use to mitigate or eliminate this issue?
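
One contributing factor is a feedback loop rather than the scene itself: the torch brightens what the camera sees, the brightness value rises past the "off" condition, the torch turns off, the scene goes dark again, and the cycle repeats. A common mitigation, independent of whether the darkness is "real", is hysteresis: separate on/off thresholds plus a minimum hold time. A minimal Swift sketch, assuming the EXIF BrightnessValue is already being read per frame; the threshold and timing constants are illustrative, not calibrated:

import AVFoundation
import Foundation

// Torch hysteresis: distinct on/off thresholds plus a minimum hold time, so
// the torch's own light cannot immediately re-trigger the opposite state.
final class TorchController {
    private let onBelow: Float = -2.0       // turn on only when clearly dark
    private let offAbove: Float = 1.0       // turn off only when clearly bright
    private let holdTime: TimeInterval = 2.0
    private var lastToggle = Date.distantPast

    func update(brightness: Float, device: AVCaptureDevice) {
        guard device.hasTorch,
              Date().timeIntervalSince(lastToggle) > holdTime else { return }
        let isOn = device.torchMode == .on
        // While on, stay on until brightness clearly exceeds offAbove;
        // while off, switch on only once brightness drops below onBelow.
        let wantOn = isOn ? brightness <= offAbove : brightness <= onBelow
        guard wantOn != isOn else { return }
        do {
            try device.lockForConfiguration()
            device.torchMode = wantOn ? .on : .off
            device.unlockForConfiguration()
            lastToggle = Date()
        } catch {
            // Could not lock the device; skip this sample.
        }
    }
}

Telling a covered lens apart from a genuinely dark room is harder; one heuristic worth experimenting with (a guess, not established practice) is combining brightness with motion data, since a hand covering the lens usually coincides with accelerometer movement while a phone lying on a table does not.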

Why are ARCore-supported devices limited?

What makes a device support ARCore?
Which features make a device support ARCore?
What is the difference between an ARCore device and a non-supported device?
It is not about how new the phone is, but whether the model went through certain tests and measurements when it was designed and built.
In practice, your phone needs hardware like the following:
Accelerometer: measures acceleration, which is change in speed divided by time. Simply put, it’s the measure of change in velocity. Acceleration forces can be static/continuous—like gravity—or dynamic, such as movement or vibrations.
Gyroscope: measures and/or maintains orientation and angular velocity. When you change the rotation of your phone while using an AR experience, the gyroscope measures that rotation and ARCore ensures that the digital assets respond correctly.
Phone Camera: with mobile AR, your phone camera supplies a live feed of the surrounding real world upon which AR content is overlaid. In addition to the camera itself, ARCore-capable phones like the Google Pixel rely on complementary technologies like machine learning, complex image processing, and computer vision to produce high-quality images and spatial maps for mobile AR.
Magnetometer: gives smartphones a simple orientation related to the Earth's magnetic field. Because of the magnetometer, your phone always knows which direction is North, allowing it to auto-rotate digital maps depending on your physical orientation. This device is key to location-based AR apps.
GPS: a global navigation satellite system that provides geolocation and time information to a GPS receiver, like in your smartphone. For ARCore-capable smartphones, this device helps enable location-based AR apps.
Display: the display on your smartphone is important for crisp imagery and displaying 3D rendered assets. For instance, Google Pixel XL’s display specification is 5.5" AMOLED QHD (2560 x 1440) 534ppi display, which means that the phone can display 534 pixels per inch—making for rich, vivid images.
You can find this information and more in the Introduction to Augmented Reality and ARCore course by Google AR & VR.
In order to deliver an AR experience, an ARCore-compatible device must have four sensors:
Accelerometer
Gyroscope
Back RGB Camera
Magnetometer (it'll be supported in the future)
If your smartphone doesn't have a gyroscope (or has only a virtual gyroscope) to measure orientation (rotation) in space, your Android device is considered ARCore-incompatible. Every ARCore-compatible device must support 3D tracking with six degrees of freedom (XYZ position and XYZ rotation).
When you're developing AR apps for the Google Play Store, you should write code that checks for ARCore compatibility.
How to perform runtime checks.
Check whether ARCore is supported (for AR Optional apps only).
AR Optional apps can use ArCoreApk.checkAvailability() to determine whether the current device supports ARCore. On devices that do not support ARCore, AR Optional apps should disable AR-related functionality and hide associated UI elements:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Enable AR-related functionality on ARCore-supported devices only.
    maybeEnableArButton();
}

void maybeEnableArButton() {
    ArCoreApk.Availability availability = ArCoreApk.getInstance().checkAvailability(this);
    if (availability.isTransient()) {
        // Re-query at 5Hz while compatibility is checked in the background.
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                maybeEnableArButton();
            }
        }, 200);
    }
    if (availability.isSupported()) {
        // Show and enable the AR entry point.
        mArButton.setVisibility(View.VISIBLE);
        mArButton.setEnabled(true);
    } else { // Unsupported or unknown.
        mArButton.setVisibility(View.INVISIBLE);
        mArButton.setEnabled(false);
    }
}
ARCore requires devices that can track six degrees of freedom (6DoF); unsupported devices usually track only three degrees of freedom (3DoF).

Is it possible to obtain individual left and right image (or cvpixelbuffer) from iPhone dual camera

AVFoundation uses the dual camera on some of the recent "Plus" iPhones to compute a depth map. However, I am trying to obtain the individual left and right images as taken by the cameras. I have searched around, but there is nothing yet, either from Apple's developer documentation or from people who have tried it and written it up.
Note: I am not trying to get the depth map (which is well documented); I would like the raw individual left and right images, so I can process the parallax information in other ways.
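
One direction worth noting: since iOS 13, AVCaptureMultiCamSession can run the wide and telephoto back cameras simultaneously on supporting hardware (roughly iPhone XS and later) and deliver each camera's frames to its own video data output, which yields both raw streams for your own parallax processing. A minimal Swift sketch; the function name and delegate wiring are illustrative:

import AVFoundation

// Stream the wide and telephoto back cameras at once (iOS 13+, multi-cam
// capable hardware only); each lens gets its own video data output.
func makeDualLensSession(delegate: AVCaptureVideoDataOutputSampleBufferDelegate,
                         queue: DispatchQueue) -> AVCaptureMultiCamSession? {
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }
    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    for type: AVCaptureDevice.DeviceType in [.builtInWideAngleCamera, .builtInTelephotoCamera] {
        guard let device = AVCaptureDevice.default(type, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              let port = input.ports(for: .video,
                                     sourceDeviceType: device.deviceType,
                                     sourceDevicePosition: .back).first
        else { return nil }
        session.addInputWithNoConnections(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(delegate, queue: queue)
        session.addOutputWithNoConnections(output)

        // Wire each camera to its own output explicitly. In the delegate's
        // captureOutput(_:didOutput:from:), CMSampleBufferGetImageBuffer
        // returns the CVPixelBuffer for that lens's frame.
        let connection = AVCaptureConnection(inputPorts: [port], output: output)
        guard session.canAddConnection(connection) else { return nil }
        session.addConnection(connection)
    }
    return session
}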

Objective-C: iPhone take photos with both cameras simultaneously

I need to take one picture with the front camera and another with the back camera. I read that it wouldn't be possible at the same time, but do you know whether it is possible to switch between the cameras in minimal time and take one front and one back picture?
EDIT:
As I said before, I want to capture from both cameras at the same time. I know that this is not possible on iPhone devices, so I tried to switch cameras very quickly, but the iPhone wastes a lot of time switching between cameras. The ideal would be to show the back camera in the preview and record frames from it, while recording frames from the front camera at the same time without previewing it, and without losing the preview; see the sketch below.
Thanks in advance.
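
The usual way to make the swap itself as fast as possible is to reconfigure one running AVCaptureSession inside a single beginConfiguration/commitConfiguration pair, rather than tearing the session down and rebuilding it. A sketch in Swift rather than Objective-C; it assumes the wide-angle device type for both positions:

import AVFoundation

// Swap a running session between back and front cameras in one atomic
// reconfiguration, avoiding a full session teardown and rebuild.
func switchCamera(session: AVCaptureSession, to position: AVCaptureDevice.Position) {
    guard let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video,
                                                  position: position),
          let newInput = try? AVCaptureDeviceInput(device: newDevice) else { return }

    session.beginConfiguration()
    // Remove the current video input(s) before adding the other camera.
    for input in session.inputs.compactMap({ $0 as? AVCaptureDeviceInput })
        where input.device.hasMediaType(.video) {
        session.removeInput(input)
    }
    if session.canAddInput(newInput) {
        session.addInput(newInput)
    }
    session.commitConfiguration() // applied as one atomic change
}

Genuinely simultaneous front-plus-back capture only became possible later, with AVCaptureMultiCamSession on iOS 13+ devices that support it.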

Simultaneous pictures with iPhone 7 Plus cameras

Is there a way to take a picture with the telephoto lens and the wide-angle lens of the iPhone 7 Plus?
I explored the different methods, but the best I can come up with is to change the camera by removing the AVCaptureDeviceTypeBuiltInTelephotoCamera input and adding the AVCaptureDeviceTypeBuiltInWideAngleCamera input. This takes about 0.5 seconds, however, and I would like to capture simultaneously. From a hardware point of view it should be possible, since Apple does the same when using the AVCaptureDeviceTypeBuiltInDuoCamera.
Does anybody know other methods to capture a photo from both cameras at (almost) the same time?
Thanks!
I wanted to capture from both cameras too, but what I've found is this:
When you are using the AVCaptureDeviceTypeBuiltInDualCamera that automatically switches between wide and tele, they are synchronized to the same clock. Simultaneous running of the AVCaptureDeviceTypeBuiltInTelephotoCamera and AVCaptureDeviceTypeBuiltInWideAngleCamera cameras is not supported.
Source - https://forums.developer.apple.com/thread/63347
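
Note for later readers: that statement predates iOS 13. AVCaptureMultiCamSession (see the sketch under the dual-camera question above) now allows the telephoto and wide-angle cameras to run simultaneously on hardware that supports multi-cam capture, though that does not include the iPhone 7 Plus.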
