How to change iPad LiDAR resolution

I have a project involving LiDAR scanning.
The scan is not very detailed, so we have a lot of problems scanning objects.
Along the way I found an app called 3D Scanner App.
It has settings such as resolution and max depth for LiDAR scanning.
Could I get this information about resolution and max depth when I use ARWorldTrackingConfiguration?

Related

How do I size an image to actual scale using the known distance from the surface to the camera?

I'm working on an iOS project that can create scaled photos using only the center-point distance from the LiDAR sensor on newer iOS devices. I have a Spike by IkeGPS that achieves this result using a Bluetooth laser device to obtain a distance measurement. I am looking for a formula I can reproduce in Xcode. So far, my assumptions are that I will need the resolution of the photos, the focal length of the lens, and the distance to the surface.
All I have achieved so far is real-time distance information from the center point, but I would like to use this to create a scaled image.
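Under a pinhole-camera model, those three assumptions combine into one relation: real size = pixel size × distance / focal length, with the focal length expressed in pixels (obtainable from camera intrinsics, e.g. ARKit's `ARCamera.intrinsics`). A minimal sketch of the arithmetic; the function names are hypothetical, not from any framework:

```python
def object_size_mm(size_px: float, distance_mm: float, focal_length_px: float) -> float:
    """Pinhole model: a feature spanning size_px pixels, on a surface
    distance_mm from the camera, has real-world size
    size_px * distance_mm / focal_length_px."""
    return size_px * distance_mm / focal_length_px

def mm_per_pixel(distance_mm: float, focal_length_px: float) -> float:
    """Scale of the image at the measured distance: mm of surface per pixel."""
    return distance_mm / focal_length_px

# Example: with a focal length of 1500 px and a surface 2 m away,
# a feature spanning 300 px is 300 * 2000 / 1500 = 400 mm wide.
```

Note this only holds for a surface roughly perpendicular to the optical axis at the measured center point; a tilted surface needs the surface normal as well.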

How to match pixel data obtained by capture card with original pixel data

I have two PCs connected by a capture card. My goal is for PC1 to obtain PC2's screen pixel data, with no difference in pixel values, through the capture card.
The two images ("original screen image of PC2" and "image obtained with OpenCV VideoCapture() through the capture card") don't look much different to the naked eye, but their pixel values differ.
PC2's monitor resolution is FHD (1920x1080), and my capture card's specification is:
HDMI input resolution: max 4K@30Hz
Supported video format: 8/10/12-bit deep color
Video output format: YUV, JPEG
Video output resolution: max 1080p@30Hz
If it is not simply a matter of the device's specification, how can I obtain perfectly identical pixel data?
If it is a matter of the device's specification, what kind of capture card should I buy?
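One concrete reason exact pixel equality is unlikely with this card: its output format is YUV (or JPEG), and even before chroma subsampling, the RGB → YUV → RGB round trip with 8-bit integer quantization is already lossy. A minimal pure-Python sketch, assuming BT.601 full-range coefficients (the card may actually use BT.709 or limited range, which would change the values but not the conclusion):

```python
def clamp(x: float) -> int:
    """Quantize to an 8-bit channel value."""
    return max(0, min(255, round(x)))

def rgb_to_yuv(r: int, g: int, b: int):
    # BT.601 full-range RGB -> YUV, quantized to 8 bits per channel
    y = clamp(0.299 * r + 0.587 * g + 0.114 * b)
    u = clamp(-0.169 * r - 0.331 * g + 0.500 * b + 128)
    v = clamp(0.500 * r - 0.419 * g - 0.081 * b + 128)
    return y, u, v

def yuv_to_rgb(y: int, u: int, v: int):
    r = clamp(y + 1.402 * (v - 128))
    g = clamp(y - 0.344 * (u - 128) - 0.714 * (v - 128))
    b = clamp(y + 1.772 * (u - 128))
    return r, g, b

# Pure green does not survive the round trip:
# (0, 255, 0) -> YUV (150, 44, 21) -> back to RGB (0, 255, 1)
```

If exact pixel values matter, you would need a capture device that delivers uncompressed RGB end to end; otherwise comparing with a small per-channel tolerance is the realistic option.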

How to get ultra wide camera calibration data?

Is it possible to get the calibration data (AVCapturePhoto.cameraCalibrationData) for the ultra-wide camera?
Documentation says:
Camera calibration data is present only if you specified the cameraCalibrationDataDeliveryEnabled and dualCameraDualPhotoDeliveryEnabled settings when requesting capture.
but dualCameraDualPhotoDeliveryEnabled has been deprecated.
I tried setting cameraCalibrationDataDeliveryEnabled for builtInDualWideCamera and builtInUltraWideCamera without any success.
The calibration data is meant to give you information about the intrinsics of multiple cameras in a virtual-camera capture scenario. This used to be the dual camera (introduced with the iPhone X), but with the release of the iPhone 11 Pro the API changed its naming. It's now called isVirtualDeviceConstituentPhotoDeliveryEnabled, and you can specify the set of cameras that should be involved in the capture with virtualDeviceConstituentPhotoDeliveryEnabledDevices.
Note that the calibration data only seems to be available for virtual devices with at least two constituent cameras (so builtInDualCamera, builtInDualWideCamera, and builtInTripleCamera).

Can I use Xtion Pro Live outdoors?

I am working with the Xtion Pro Live on Ubuntu 12.04 with OpenCV 2.4.10. I want to do object recognition in daylight.
So far I have achieved object recognition indoors by producing a depth map and a disparity map. When I go outdoors, those maps are black and I cannot perform object recognition.
I would like to ask whether the Asus Xtion Pro Live can work outdoors.
If it cannot, is there a way to fix it (through code) in order to do object detection outdoors?
I have searched around and found that I should get another stereoscopic camera. Could anyone help?
After some research I discovered that the Xtion Pro Live camera cannot be used outdoors because of its IR sensor. This sensor is responsible for producing the depth map and is affected by sunlight, so it gives no clear results outdoors. Without clear results, creating depth and disparity maps (with proper values) is impossible.

Is there any way to figure out a good estimate of the sensor size for OpenCV?

I have extracted a camera calibration matrix using OpenCV and need to translate the focal length to mm. The easiest way to do this seems to be calibrationMatrixValues(). For this, I need to supply apertureWidth and apertureHeight, which are actually the width and height of the physical sensor. I took all my photos with the built-in FaceTime HD camera of the mid-2011 27" iMac, and I haven't been able to find any technical specifications for it. While I've asked Apple Stack Exchange for the exact sensor size, is there a way to move forward with some sort of good estimate?
If your camera can save JPEG images, check the EXIF header to see whether it contains what you're looking for, and verify that it provides an accurate value.
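If no spec sheet or EXIF data turns up, note that the conversion calibrationMatrixValues() performs is simple enough to run by hand once you have any estimate of the sensor width, and that the field of view needs no sensor size at all. A sketch of that math, where fx comes from the calibrated camera matrix; the 4.8 mm sensor width below is a hypothetical value for illustration, not the FaceTime HD camera's actual sensor size:

```python
import math

def focal_length_mm(fx_px: float, image_width_px: int, sensor_width_mm: float) -> float:
    """Convert a focal length in pixels (from the camera matrix) to millimetres.
    The result scales linearly with the assumed sensor width."""
    return fx_px * sensor_width_mm / image_width_px

def horizontal_fov_deg(fx_px: float, image_width_px: int) -> float:
    """Horizontal field of view from the camera matrix alone -- no sensor size needed."""
    return math.degrees(2.0 * math.atan(image_width_px / (2.0 * fx_px)))

# e.g. fx = 1000 px on a 1280 px wide image with an assumed 4.8 mm sensor:
# focal_length_mm(1000, 1280, 4.8) == 3.75 (mm)
```

This also shows why an estimate can be good enough: if your sensor-width guess is off by 10%, the focal length in mm is off by exactly 10%, while quantities derived purely from the camera matrix (FOV, reprojection) are unaffected.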
