Is the iPhone GPS signal being smoothed?

I have been testing GPS tracking on both iPhone and Android, and noticed some serious differences in precision. The pictures show a two-way trip around a few blocks. As you will see, Android tracked the difference between the two directions of travel (the lines do not align), while on iPhone the route appears to align perfectly, which is highly unlikely given that the trip was taken in both directions.
The purple line is from the Android device; the other one is from an iPhone 7 Plus.
The question is: does the iPhone do any GPS smoothing, and if so, is it possible to disable it?
I am using kCLLocationAccuracyBestForNavigation, and here is how I set up the location manager:
manager = CLLocationManager()
manager?.delegate = self
manager?.distanceFilter = 5.0
manager?.desiredAccuracy = kCLLocationAccuracyBestForNavigation
manager?.headingFilter = 1.0
manager?.pausesLocationUpdatesAutomatically = false
if #available(iOS 9.0, *) {
    manager?.allowsBackgroundLocationUpdates = true
} else {
    // Fallback on earlier versions
}
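For what it's worth, when comparing traces like this it helps to log each raw fix together with its reported uncertainty. A minimal sketch (my own illustration, not part of the original question):
import CoreLocation

// Hypothetical delegate used only to inspect raw fixes.
final class RawFixLogger: NSObject, CLLocationManagerDelegate {
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        for location in locations {
            // horizontalAccuracy is the uncertainty radius in meters; negative means the fix is invalid.
            print("\(location.timestamp) \(location.coordinate.latitude), \(location.coordinate.longitude) ±\(location.horizontalAccuracy) m")
        }
    }
}
If the iPhone points come back with ordinary accuracy radii yet trace an implausibly clean overlapping line, that is at least consistent with filtering happening below the CLLocationManager API.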

Related

Choosing suitable camera for barcode scanning when using AVCaptureDeviceTypeBuiltInTripleCamera

I've had some barcode scanning code in my iOS app for many years now. Recently, users have begun complaining that it doesn't work with an iPhone 13 Pro.
During investigation, it seemed that I should be using the built in triple camera if available. Doing that did fix it for iPhone 13 Pro but subsequently broke it for iPhone 12 Pro, which seemed to be working fine with the previous code.
How are you supposed to choose a suitable camera for all devices? It seems bizarre to me that Apple has suddenly made it so difficult to use this previously working code.
Here is my current code. The "fallback" section is what the code has used for years.
_session = [[AVCaptureSession alloc] init];
// Must use the macro camera for barcode scanning on newer devices, otherwise the image is blurry
if (@available(iOS 13.0, *)) {
    AVCaptureDeviceDiscoverySession *discoverySession =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInTripleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:AVCaptureDevicePositionBack];
    if (discoverySession.devices.count == 0) {
        // No built-in triple camera on this device
        _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    } else {
        _device = discoverySession.devices.firstObject;
    }
} else {
    // Fallback on earlier versions
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
The accepted answer works, but not all the time. Because lenses have different minimum focus distances, it is harder for the device to focus on small barcodes: you have to bring the device closer than the minimum focus distance, so it will never autofocus on them. This used to work on older lenses with a minimum focus distance of 10-12 cm, but newer lenses, especially those on the iPhone 14 Pro with a 20 cm minimum, are problematic.
The solution is ideally to use AVCaptureDeviceTypeBuiltInWideAngleCamera and to set videoZoomFactor on the AVCaptureDevice to zoom in a little so the barcode is nicely in focus. The factor should be calculated from the input video properties and the minimum barcode size.
For details, please refer to this WWDC 2021 video where they address exactly this issue: https://developer.apple.com/videos/play/wwdc2021/10047/?time=133.
Here is an implementation of a class that sets the zoom factor on a device; it works for me. Instantiate it with your device instance and call applyAutomaticZoomFactorIfNeeded() just before you commit your capture session configuration.
///
/// Calling this method will automatically zoom the device to compensate for its minimum focus distance. This distance becomes problematic
/// when the barcode being scanned is too small or the device's minimum focus distance is too large (around 20 cm on iPhone 14 Pro/Pro Max,
/// 15 cm on iPhone 13 Pro, less on iPhone 12 and older). By zooming the input, the device can focus on the preview and complete the scan more easily.
///
/// - See https://developer.apple.com/videos/play/wwdc2021/10047/?time=133 for a more detailed explanation and
/// - See https://developer.apple.com/documentation/avfoundation/capture_setup/avcambarcode_detecting_barcodes_and_faces
///   for implementation instructions.
///
@available(iOS 15.0, *)
final class DeviceAutomaticVideoZoomFactor {

    enum Errors: Error {
        case minimumFocusDistanceUnknown
        case deviceLockFailed
    }

    private let device: AVCaptureDevice
    private let minimumCodeSize: Float

    init(device: AVCaptureDevice, minimumCodeSize: Float) {
        self.device = device
        self.minimumCodeSize = minimumCodeSize
    }

    ///
    /// Optimize the user experience for scanning QR codes down to smaller sizes (determined by `minimumCodeSize`, for example 2x2 cm).
    /// When scanning a QR code of that size, the user may need to get closer than the camera's minimum focus distance to fill the rect of interest.
    /// To have the QR code both fill the rect and still be in focus, we may need to apply some zoom.
    ///
    func applyAutomaticZoomFactorIfNeeded() throws {
        let deviceMinimumFocusDistance = Float(self.device.minimumFocusDistance)
        guard deviceMinimumFocusDistance != -1 else {
            throw Errors.minimumFocusDistanceUnknown
        }

        Logger.logIfStaging("Video Zoom Factor", "using device: \(self.device)")
        Logger.logIfStaging("Video Zoom Factor", "device minimum focus distance: \(deviceMinimumFocusDistance)")

        /*
         Set an initial square rect of interest that is 100% of the view's shortest side.
         This means that the region of interest will appear in the same spot regardless
         of whether the app starts in portrait or landscape.
         */
        let formatDimensions = CMVideoFormatDescriptionGetDimensions(self.device.activeFormat.formatDescription)
        let rectOfInterestWidth = Double(formatDimensions.height) / Double(formatDimensions.width)
        let deviceFieldOfView = self.device.activeFormat.videoFieldOfView
        let minimumSubjectDistanceForCode = self.minimumSubjectDistanceForCode(fieldOfView: deviceFieldOfView,
                                                                               minimumCodeSize: self.minimumCodeSize,
                                                                               previewFillPercentage: Float(rectOfInterestWidth))

        Logger.logIfStaging("Video Zoom Factor", "minimum subject distance: \(minimumSubjectDistanceForCode)")

        guard minimumSubjectDistanceForCode < deviceMinimumFocusDistance else {
            return
        }

        let zoomFactor = deviceMinimumFocusDistance / minimumSubjectDistanceForCode
        Logger.logIfStaging("Video Zoom Factor", "computed zoom factor: \(zoomFactor)")

        try self.device.lockForConfiguration()
        self.device.videoZoomFactor = CGFloat(zoomFactor)
        self.device.unlockForConfiguration()

        Logger.logIfStaging("Video Zoom Factor", "applied zoom factor: \(self.device.videoZoomFactor)")
    }

    private func minimumSubjectDistanceForCode(fieldOfView: Float,
                                               minimumCodeSize: Float,
                                               previewFillPercentage: Float) -> Float {
        /*
         Given the camera horizontal field of view, we can compute the distance (mm) to make a code
         of minimumCodeSize (mm) fill the previewFillPercentage.
         */
        let radians = self.degreesToRadians(fieldOfView / 2)
        let filledCodeSize = minimumCodeSize / previewFillPercentage
        return filledCodeSize / tan(radians)
    }

    private func degreesToRadians(_ degrees: Float) -> Float {
        return degrees * Float.pi / 180
    }
}
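For context, here is a minimal usage sketch, assuming an AVCaptureSession named session and an AVCaptureDevice named device that you are configuring (both hypothetical names; the 20 mm value corresponds to the 2x2 cm example above):
session.beginConfiguration()
// ... add inputs and outputs here ...
if #available(iOS 15.0, *) {
    // Apply the zoom just before committing the session configuration, as described above.
    let zoom = DeviceAutomaticVideoZoomFactor(device: device, minimumCodeSize: 20)
    try? zoom.applyAutomaticZoomFactorIfNeeded()
}
session.commitConfiguration()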
Thankfully, with the help of Reddit, I was able to figure out that the solution is simply to replace
AVCaptureDeviceTypeBuiltInTripleCamera
with
AVCaptureDeviceTypeBuiltInWideAngleCamera

How to obtain detailed zoom specs for iPhone camera(-s)? [closed]

I'm making a zoom control for my app, and I'd like to make it advanced, like the one in Apple's default Camera app.
I did some research, and still have some questions about it.
Is it possible to get the focal length value programmatically? There are labels like 13mm and 26mm for the different back cameras in the default app, but there is no such property on AVCaptureDevice. (It is probably needed to determine the zoom values; see the next question.)
How can we determine the zoom values to display in the UI? The thing is that AVCaptureDevice's minAvailableVideoZoomFactor always starts from 1x, but in the Camera app we can see that on devices with an ultra-wide camera the scale starts at 0.5x, so there should be some way to map these values onto each other. As I understand it, Apple treats the "usual" back camera as the default (that is, 1x), and all other values are relative to it: 13mm is 0.5 * 26mm, so the first value on the iPhone 13 Pro zoom control is 0.5x; the second value is the "default", 1x (26mm); and the telephoto camera is 77mm, so the third value is 3x (26mm * 3 = 78mm ~= 77mm). Please clarify how it is actually calculated and correct me if my assumption is wrong (see the worked sketch after this list).
What is the correct way to get the max zoom value? If I try AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInTripleCamera], mediaType: .video, position: .back).devices.first!.maxAvailableVideoZoomFactor, it says 123.75 (iPhone 13 Pro), but in the default Camera app the max zoom value is 15x. Why is it exactly 15x, and where does it come from? (My assumption is that the max digital zoom for all iPhones is 5x, so on the 13 Pro the telephoto camera zooms 3x relative to the "usual" camera, and thus we get 3x * 5x = 15x max zoom.)
Is there any universal way to get "the best" (i.e. with all features) camera? For example, right now I can specify [.builtInTripleCamera, .builtInDualWideCamera, .builtInDualCamera, .builtInWideAngleCamera] for the discovery session and pick the first item in the devices array, but if Apple releases, let's say, some ".builtInQuadrupleCamera" in a couple of years, this code will have to be modified, because it won't be included automatically.
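To make the mapping in the second question concrete, here is a small worked sketch. The focal lengths are the ones quoted above; treating the wide camera's 26mm as the 1x reference is my assumption, consistent with the question:
// Focal lengths (mm) quoted in the question for the iPhone 13 Pro back cameras.
let ultraWideFocalLength: Float = 13
let wideFocalLength: Float = 26   // assumed to be the 1x reference
let telephotoFocalLength: Float = 77

// UI zoom is the focal length relative to the wide (1x) camera.
func uiZoom(for focalLength: Float) -> Float {
    return focalLength / wideFocalLength
}

print(uiZoom(for: ultraWideFocalLength))  // 0.5
print(uiZoom(for: wideFocalLength))       // 1.0
print(uiZoom(for: telephotoFocalLength))  // ~2.96, shown as 3x in the UI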
To sum up (TL;DR):
I suppose the final code should look something like this:
let deviceTypes: [AVCaptureDevice.DeviceType]
if #available(iOS 13, *) {
    deviceTypes = [.builtInTripleCamera, .builtInDualWideCamera, .builtInDualCamera, .builtInWideAngleCamera]
} else {
    deviceTypes = [.builtInDualCamera, .builtInWideAngleCamera]
}

let session = AVCaptureDevice.DiscoverySession(
    deviceTypes: deviceTypes,
    mediaType: .video,
    position: .back
)
if let device = session.devices.first {
    let zoomValues = device.getUIZoomValues()
}
extension AVCaptureDevice {
    func getUIZoomValues() -> [Float] {
        // Hardcoded. It seems all iPhones limit digital zoom to 5x.
        let maxDigitalZoom: Float = 5

        // Fallback for old iOS versions.
        guard #available(iOS 13, *) else { return [1, maxDigitalZoom] }

        let uiZoomValues: [Float]
        let factors = virtualDeviceSwitchOverVideoZoomFactors

        switch deviceType {
        case .builtInTripleCamera, .builtInDualWideCamera:
            // The ultra-wide camera is available - start the zoom scale at 0.5x.
            let firstZoom: Float = 1.0 / factors.first!.floatValue
            uiZoomValues = [firstZoom] + factors.map { $0.floatValue * firstZoom } + [firstZoom * factors.last!.floatValue * maxDigitalZoom]
        case .builtInDualCamera:
            // No ultra-wide camera - start from 1x.
            uiZoomValues = [1.0] + factors.map { $0.floatValue } + [factors.last!.floatValue * maxDigitalZoom]
        case .builtInWideAngleCamera:
            // Just a single "usual" camera.
            uiZoomValues = [1, maxDigitalZoom]
        default:
            fatalError("this should not happen on a real device")
        }
        return uiZoomValues
    }
}
2 main concerns about this code:
1 - We have to hardcode maxDigitalZoom. Is there any way to get it programmatically? Apple states 5x in the iPhone specs, and there is AVCaptureDevice.maxAvailableVideoZoomFactor, but those values differ (for example, the iPhone 13 Pro has 15x in the specs vs 123.75x in maxAvailableVideoZoomFactor).
2 - The builtInDualCamera case (iPhone XS Max, for example). All the code above relies on the virtualDeviceSwitchOverVideoZoomFactors var, which is available only from iOS 13, but builtInDualCamera is available from iOS 10.2. So what happens if a user has an XS Max? Will it work on iOS >= 13 but break on earlier versions, or will it not work at all?
Taking the questions in order:
I think not.
What works for me: create a dictionary, var zoomFactors: [String: CGFloat] = ["1": 1], then manage the AVCaptureDevice; I think you can play with getApproximation() to achieve the goal.
No.
On the code questions:
No, but I think one of the easiest methods is to play with the getApproximation() idea.
I believe it will crash.

CLLocationManager - returns wrong speed

I am trying to calculate the user's current driving speed, but there is a huge difference between the CLLocationManager speed and the actual driving speed.
When I am driving at 50 km/h, CLLocationManager shows ~72-73 km/h. Below is the code I am using.
locationManager = CLLocationManager()
locationManager.delegate = self
locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
locationManager.requestWhenInUseAuthorization()
locationManager.allowsBackgroundLocationUpdates = true
locationManager.pausesLocationUpdatesAutomatically = false
locationManager.distanceFilter = 1.0
locationManager.headingFilter = 0.1
locationManager.startUpdatingLocation()
And below is the location manager delegate method:
func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
    // Use the most recent fix reported by the system.
    guard let location = locations.last else { return }
    let speedInKmph = location.speed * 3.6
    if speedInKmph > 10 {
        MyRide.shared.speedInfo.append(SpeedInfo(speed: speedInKmph))
        self.view.showToast("\(speedInKmph) **********", position: .bottom, popTime: 2.0, dismissOnTap: false)
    }
}
@Anand, the CLLocationManager could be obtaining speed in one of two ways:
measuring the distance between two lat/long/alt triplets and dividing by some sort of time reference
using the native Doppler speed measurement on the GPS chip. Doppler speed is computed for all of the satellites that the particular GPS chip sees. Maybe Apple publishes the names and model numbers of the GPS chips inside of iPhones, I don't know.
I don't know which technique is used by CLLocationManager.
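For reference, the first technique (distance over time) is easy to reproduce yourself. A minimal sketch (my own illustration, not the answer author's code):
import CoreLocation

// Hypothetical helper: speed derived from two consecutive fixes.
func derivedSpeed(from previous: CLLocation, to current: CLLocation) -> Double {
    let meters = current.distance(from: previous)                       // ground distance in meters
    let seconds = current.timestamp.timeIntervalSince(previous.timestamp)
    guard seconds > 0 else { return 0 }
    return meters / seconds                                             // m/s; multiply by 3.6 for km/h
}
Comparing this derived value against location.speed at a few known speeds would tell you which of the two the reported value tracks.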
I think you have to test your system at a couple of speeds and in three directions (due north, due east, and at a 45-degree angle to due north and due east).
If all measurements show the same fudge factor of 1.45, you are probably good to go with:
let speedInKmph = location.speed * 3.6 * (1/1.45)
If your measurements don't match each other, then more study is required. If I figure it out, I will post it here. The linked YouTube video might be a good idea.

How to stop the CLLocationManager heading value from changing when the iPhone rolls a little bit

My CLLocationManager is working fine; I get notified when the heading value changes.
However, I found that the heading value differs when the iPhone rolls by even a small angle.
I have set the
locationManager.headingOrientation = CLDeviceOrientation.landscapeRight
I have also set other properties:
locationManager.requestWhenInUseAuthorization()
orientation = getCLDeviceOrientation(by: UIDevice.current.orientation)
locationManager.desiredAccuracy = kCLLocationAccuracyBest
locationManager.headingFilter = 0.1
locationManager.headingOrientation = CLDeviceOrientation.landscapeRight
locationManager.startUpdatingHeading()
locationManager.delegate = self
The headingOrientation property that you are setting is only used as a reference point when you don't want the default, which is that the top of the device in portrait mode represents due north (0 degrees).
Having set the reference, you will get changes in degrees for every yaw movement.
You can see it documented here: https://developer.apple.com/reference/corelocation/cllocationmanager/1620556-headingorientation
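If the goal is to keep that reference consistent as the user re-orients the device, one approach is to update headingOrientation whenever the device orientation changes. A sketch, assuming this lives in your view controller and reuses the question's getCLDeviceOrientation(by:) helper:
// Device orientation notifications must be enabled first.
UIDevice.current.beginGeneratingDeviceOrientationNotifications()
NotificationCenter.default.addObserver(forName: UIDevice.orientationDidChangeNotification,
                                       object: nil,
                                       queue: .main) { [weak self] _ in
    guard let self = self else { return }
    // Keep the heading reference aligned with how the user is holding the device.
    self.locationManager.headingOrientation = self.getCLDeviceOrientation(by: UIDevice.current.orientation)
}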

iPhone GPS User Location is moving back and forth while the phone is still

I am doing some MapKit and Core Location programming where I map out a user's route. E.g. they go for a walk and it shows the path they took.
On the simulator things are working 100% fine.
On the iPhone I've run into a major snag and I don't know what to do. To determine if the user has 'stopped' I basically check if the speed is (almost) 0 for a certain period of time.
However just keeping the phone still spits out this log for newly updated location changes (from the location manager delegate). These are successive updates in the locationManager(_:didUpdateLocations:) callback.
speed 0.021408926025254 with distance 0.192791659974976
speed 0.0532131983839802 with distance 0.497739230237728
speed 11.9876451887096 with distance 15.4555990691609
speed 0.230133198005176 with distance 3.45235789063791
speed 0.0 with distance 0.0
speed 0.984378335092039 with distance 11.245049843458
speed 0.180509147029171 with distance 2.0615615724029
speed 0.429749086272364 with distance 4.91092459284206
Now I have the accuracy set to best:
_locationManager = CLLocationManager()
_locationManager.delegate = self
_locationManager.distanceFilter = kCLDistanceFilterNone
_locationManager.desiredAccuracy = kCLLocationAccuracyBest
Do you know if there is a setting I can change to prevent this back-and-forth behaviour? Even the user pin moves wildly left and right every few seconds while the phone is still.
Or is there something else I need to code to account for this wild swinging?
I check if the user has moved a certain distance within a certain time to determine if they have stopped (thanks to rmaddy for the info):
/**
 Return true if the user is stopped. Because GPS is inaccurate, the user must stay within a threshold distance to be considered stopped.
 */
private func userHasStopped() -> Bool {
    // No stop checks yet, so return false and set a new reference location.
    if _lastLocationForStopAnalysis == nil {
        _lastLocationForStopAnalysis = _currentLocation
        return false
    }

    // If the distance is greater than the 'not stopped' threshold, set a new reference location.
    if _lastLocationForStopAnalysis.distance(from: _currentLocation) > 50 {
        _lastLocationForStopAnalysis = _currentLocation
        return false
    }

    // The user has been 'still' for long enough that they are considered stopped.
    if _currentLocation.timestamp.timeIntervalSince(_lastLocationForStopAnalysis.timestamp) > 180 {
        return true
    }

    // There hasn't been a timeout or a threshold pass, so they haven't stopped yet.
    return false
}
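Separately from the stop check above, a common mitigation for the jitter itself is to discard poor fixes before they reach any distance or speed logic. A minimal sketch (my own illustration, with an arbitrary 20 m threshold; not part of the original answer):
func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
    for location in locations {
        // Ignore invalid or low-quality fixes.
        guard location.horizontalAccuracy >= 0, location.horizontalAccuracy <= 20 else { continue }
        // Ignore cached fixes older than a few seconds.
        guard abs(location.timestamp.timeIntervalSinceNow) < 5 else { continue }
        // ... process the fix ...
    }
}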
