Laggy WCSession sendMessageData - iOS

I am polling the Apple Watch for Core Motion data at an update interval of 0.01 s (100 Hz). The purpose of the application is to see movement in real time. To capture data as quickly as possible, I leverage the didReceiveMessageData/sendMessageData functions. On the iPhone, I have a simple function that reads the data:
func session(_ session: WCSession, didReceiveMessageData messageData: Data) {
    let records: [Double] = try! NSKeyedUnarchiver.unarchivedObject(ofClasses: [NSArray.self], from: messageData) as! [Double]
}
And on an Apple Watch Series 6, I have a simple function that sends the data. However, sending suffers from sporadic yet significant delays.
class MyController: WKInterfaceController, WCSessionDelegate {
    private let motion = CMMotionManager()
    private let motionQueue = OperationQueue()
    private let messagingQueue = OperationQueue()
    private let healthStore = HKHealthStore()
    private var workoutSession: HKWorkoutSession? // declared here; referenced below
    private var stack: QuaternionStack = QuaternionStack()

    override init() {
        super.init()
        if WCSession.isSupported() {
            let session = WCSession.default
            session.delegate = self
            if session.activationState == .notActivated { session.activate() }
        }
        // Serial queues for sample handling and messaging.
        messagingQueue.qualityOfService = .userInteractive
        messagingQueue.maxConcurrentOperationCount = 1
        motionQueue.qualityOfService = .userInteractive
        motionQueue.maxConcurrentOperationCount = 1
        startGettingData()
    }

    func startGettingData() {
        // If we have already started the workout, do nothing.
        if workoutSession != nil { return }
        if !motion.isDeviceMotionAvailable { return }
        let workoutConfiguration = HKWorkoutConfiguration()
        workoutConfiguration.activityType = .functionalStrengthTraining
        workoutConfiguration.locationType = .indoor
        do {
            workoutSession = try HKWorkoutSession(healthStore: healthStore, configuration: workoutConfiguration)
        } catch { fatalError("Unable to create the workout session!") }
        // Start the workout session and device motion updates.
        workoutSession!.startActivity(with: Date())
        motion.deviceMotionUpdateInterval = 0.01
        motion.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: motionQueue) { [self] (deviceMotion: CMDeviceMotion?, _: Error?) in
            guard let motion = deviceMotion else { return }
            let attitude = motion.attitude.quaternion
            stack.push(Quaternion(x: attitude.x, y: attitude.y, z: attitude.z, w: attitude.w))
            guard let quaternions = stack.pop() else { return }
            // Drop any send that hasn't started yet; only the newest batch matters.
            messagingQueue.cancelAllOperations()
            let blockOperation = BlockOperation()
            blockOperation.addExecutionBlock({ [unowned blockOperation] in
                if blockOperation.isCancelled { return }
                self.sendDataToPhone(quaternions: quaternions)
            })
            messagingQueue.addOperation(blockOperation)
        }
    }

    private func sendDataToPhone(quaternions: [Quaternion]) {
        if WCSession.default.isReachable {
            var capturedQuaternions: [Double] = []
            for quat in quaternions { capturedQuaternions.append(contentsOf: [quat.x, quat.y, quat.z, quat.w]) }
            WCSession.default.sendMessageData(try! NSKeyedArchiver.archivedData(withRootObject: capturedQuaternions, requiringSecureCoding: false), replyHandler: nil, errorHandler: nil)
        }
    }
}
I've implemented a stack as follows:
struct QuaternionStack {
    private let max = 2
    private var array: [Quaternion] = []

    mutating func push(_ element: Quaternion) {
        array.append(element)
        if array.count > max { array.removeFirst() }
    }

    mutating func pop() -> [Quaternion]? {
        if array.count < max { return nil }
        var results: [Quaternion] = []
        for _ in 0..<max { results.append(array.popLast()!) }
        results.reverse()
        array.removeAll()
        return results
    }
}
If I set QuaternionStack.max to a large number, like 10, I see no obvious throttling on the iPhone when receiving data, because I send more data less often. However, decreasing the number degrades performance. As an example, imagine I send every 2 incoming packets (QuaternionStack.max = 2). Sometimes a few seconds pass between when packets are received; when this happens, the watch seems to send them very quickly in an effort to catch up. Another example of this issue is when listening to music on paired AirPods or receiving an incoming call: sendMessageData from the watch becomes very inconsistent.
What must I do to increase the throughput of WCSession's sendMessageData? The application I am writing requires very fast (100 Hz) and continuous motion updates.
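For reference, one way to shrink the per-message serialization cost (a sketch, not part of the original post; WCSession only sees opaque Data either way) is to skip NSKeyedArchiver and send the raw Double bytes, with the decoding side matching the layout exactly:
// Sketch: pack quaternions as raw Doubles instead of a keyed archive.
func pack(_ quaternions: [Quaternion]) -> Data {
    var doubles: [Double] = []
    doubles.reserveCapacity(quaternions.count * 4)
    for q in quaternions { doubles.append(contentsOf: [q.x, q.y, q.z, q.w]) }
    return doubles.withUnsafeBufferPointer { Data(buffer: $0) }
}

// Matching decode on the iPhone side (assumes the Data is suitably aligned for Double).
func unpack(_ data: Data) -> [Double] {
    return data.withUnsafeBytes { Array($0.bindMemory(to: Double.self)) }
}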

Related

Unwanted "smoothing" in AVDepthData on iPhone 13 (not evident in iPhone 12)

We are writing an app which analyzes real-world 3D data by using the TrueDepth camera on the front of an iPhone, and an AVCaptureSession configured to produce AVDepthData along with image data. This worked great on iPhone 12, but the same code on iPhone 13 produces an unwanted "smoothing" effect which makes the scene impossible to process and breaks our app. We are unable to find any information on this effect, from Apple or otherwise, much less how to avoid it, so we are asking you experts.
At the bottom of this post (Figure 3) is our code which configures the capture session, using an AVCaptureDataOutputSynchronizer, to produce frames of 640x480 image and depth data. I boiled it down as much as possible, sorry it's so long. The two main parts are the configure function, which sets up our capture session, and the dataOutputSynchronizer function, near the bottom, which fires when a synced set of data is available. In the latter function I've included my code which extracts the information from the AVDepthData object, including looping through all 640x480 depth data points (in meters). I've excluded further processing for brevity (believe it or not :)).
On an iPhone 12 device, the PNG data and the depth data merge nicely. The front view and side view of the merged point cloud are below (Figure 1). The angles visible in the side view are due to the application of the focal length, which "de-perspectives" the data and places the points in their proper positions in xyz space.
The same code on an iPhone 13 produces depth maps that result in the point cloud further below (Figure 2 - straight-on view, angled view, and side view). There is no longer any clear distinction between objects and the background, because the depth data appears to be "smoothed" between the mannequin and the background - i.e., there are seven or eight points between the subject and background that are not realistic and make it impossible to do any meaningful processing, such as segmenting the scene.
Has anyone else encountered this issue, or have any insight into how we might change our code to avoid it? Any help or ideas are MUCH appreciated, since this is a definite showstopper (we can't tell people to only run our app on older phones :)). Thank you!
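A quick sanity check before digging deeper (a sketch relying only on standard AVFoundation properties, not code from the post): log the depth configuration the device actually selected at runtime, since the filtering flag and the active depth format are the first things to rule out when two phones behave differently.
// Sketch: log the depth configuration actually in effect.
func logDepthConfiguration(device: AVCaptureDevice, depthOutput: AVCaptureDepthDataOutput) {
    print("Depth filtering enabled:", depthOutput.isFilteringEnabled) // should be false for raw depth
    if let format = device.activeDepthDataFormat {
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        print("Active depth format: \(dims.width)x\(dims.height)")
    }
}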
Figure 1 -- Merged depth data and image into point cloud, from iPhone 12
Figure 2 -- Merged depth data and image into point cloud, from iPhone 13; unwanted smoothing effect visible
Figure 3 -- Our configuration code and capture handler; edited to remove downstream processing of captured data (which was basically formatting it into an XML file and uploading to the cloud)
import Foundation
import Combine
import AVFoundation
import Photos
import UIKit
import FirebaseStorage
public struct AlertError {
    public var title: String = ""
    public var message: String = ""
    public var primaryButtonTitle = "Accept"
    public var secondaryButtonTitle: String?
    public var primaryAction: (() -> ())?
    public var secondaryAction: (() -> ())?

    public init(title: String = "", message: String = "", primaryButtonTitle: String = "Accept", secondaryButtonTitle: String? = nil, primaryAction: (() -> ())? = nil, secondaryAction: (() -> ())? = nil) {
        self.title = title
        self.message = message
        self.primaryButtonTitle = primaryButtonTitle
        self.secondaryButtonTitle = secondaryButtonTitle // was missing from the original init
        self.primaryAction = primaryAction
        self.secondaryAction = secondaryAction
    }
}
///////////////////////////////////////////////////////////////////////////////////
///////////////////////////////////////////////////////////////////////////////////
//
//
// this is the CameraService class, which configures and runs a capture session
// which acquires syncronized image and depth data
// using an AVCaptureDataOutputSynchronizer
//
//
///////////////////////////////////////////////////////////////////////////////////
///////////////////////////////////////////////////////////////////////////////////
public class CameraService: NSObject,
                            AVCaptureVideoDataOutputSampleBufferDelegate,
                            AVCaptureDepthDataOutputDelegate,
                            AVCaptureDataOutputSynchronizerDelegate,
                            MyFirebaseProtocol,
                            ObservableObject {

    @Published public var shouldShowAlertView = false
    @Published public var shouldShowSpinner = false
    public var labelStatus: String = "Ready"
    var images: [UIImage?] = []
    public var alertError: AlertError = AlertError()
    public let session = AVCaptureSession()
    var isSessionRunning = false
    var isConfigured = false
    var setupResult: SessionSetupResult = .success
    private let sessionQueue = DispatchQueue(label: "session queue") // Communicate with the session and other session objects on this queue.
    @objc dynamic var videoDeviceInput: AVCaptureDeviceInput!
    private let videoDeviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInTrueDepthCamera], mediaType: .video, position: .front)
    var videoCaptureDevice: AVCaptureDevice? = nil
    let videoDataOutput: AVCaptureVideoDataOutput = AVCaptureVideoDataOutput() // Define frame output.
    let depthDataOutput = AVCaptureDepthDataOutput()
    var outputSynchronizer: AVCaptureDataOutputSynchronizer? = nil
    let dataOutputQueue = DispatchQueue(label: "video data queue", qos: .userInitiated, attributes: [], autoreleaseFrequency: .workItem)
    var scanStateCounter: Int = 0
    var m_DepthDatasetsToUpload = [AVCaptureSynchronizedDepthData]()
    var m_FrameBufferToUpload = [AVCaptureSynchronizedSampleBufferData]()
    var firebaseDepthDatasetsArray: [String] = []
    @Published var firebaseImageUploadCount = 0
    @Published var firebaseTextFileUploadCount = 0
public func configure() {
/*
Setup the capture session.
In general, it's not safe to mutate an AVCaptureSession or any of its
inputs, outputs, or connections from multiple threads at the same time.
Don't perform these tasks on the main queue because
AVCaptureSession.startRunning() is a blocking call, which can
take a long time. Dispatch session setup to the sessionQueue, so
that the main queue isn't blocked, which keeps the UI responsive.
*/
sessionQueue.async {
self.configureSession()
}
}
// MARK: Checks for user's permisions
public func checkForPermissions() {
switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
// The user has previously granted access to the camera.
break
case .notDetermined:
/*
The user has not yet been presented with the option to grant
video access. Suspend the session queue to delay session
setup until the access request has completed.
*/
sessionQueue.suspend()
AVCaptureDevice.requestAccess(for: .video, completionHandler: { granted in
if !granted {
self.setupResult = .notAuthorized
}
self.sessionQueue.resume()
})
default:
// The user has previously denied access.
setupResult = .notAuthorized
DispatchQueue.main.async {
self.alertError = AlertError(title: "Camera Access", message: "SwiftCamera doesn't have access to use your camera, please update your privacy settings.", primaryButtonTitle: "Settings", secondaryButtonTitle: nil, primaryAction: {
UIApplication.shared.open(URL(string: UIApplication.openSettingsURLString)!,
options: [:], completionHandler: nil)
}, secondaryAction: nil)
self.shouldShowAlertView = true
}
}
}
// MARK: Session Management
// Call this on the session queue.
/// - Tag: ConfigureSession
private func configureSession() {
if setupResult != .success {
return
}
session.beginConfiguration()
session.sessionPreset = AVCaptureSession.Preset.vga640x480
// Add video input.
do {
var defaultVideoDevice: AVCaptureDevice?
let frontCameraDevice = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front)
// Use the front TrueDepth camera as the default video device.
defaultVideoDevice = frontCameraDevice
videoCaptureDevice = defaultVideoDevice
guard let videoDevice = defaultVideoDevice else {
print("Default video device is unavailable.")
setupResult = .configurationFailed
session.commitConfiguration()
return
}
let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
if session.canAddInput(videoDeviceInput) {
session.addInput(videoDeviceInput)
self.videoDeviceInput = videoDeviceInput
} else if session.inputs.isEmpty == false {
self.videoDeviceInput = videoDeviceInput
} else {
print("Couldn't add video device input to the session.")
setupResult = .configurationFailed
session.commitConfiguration()
return
}
} catch {
print("Couldn't create video device input: \(error)")
setupResult = .configurationFailed
session.commitConfiguration()
return
}
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
// MARK: add video output to session
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
videoDataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString) : NSNumber(value: kCVPixelFormatType_32BGRA)] as [String : Any]
videoDataOutput.alwaysDiscardsLateVideoFrames = true
videoDataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera_frame_processing_queue"))
if session.canAddOutput(self.videoDataOutput) {
session.addOutput(self.videoDataOutput)
} else if session.outputs.contains(videoDataOutput) {
} else {
print("Couldn't create video device output")
setupResult = .configurationFailed
session.commitConfiguration()
return
}
guard let connection = self.videoDataOutput.connection(with: AVMediaType.video),
connection.isVideoOrientationSupported else { return }
connection.videoOrientation = .portrait
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
// MARK: add depth output to session
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
// Add a depth data output
if session.canAddOutput(depthDataOutput) {
session.addOutput(depthDataOutput)
depthDataOutput.isFilteringEnabled = false
depthDataOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth_frame_processing_queue"))
if let connection = depthDataOutput.connection(with: .depthData) {
connection.isEnabled = true
} else {
print("No AVCaptureConnection")
}
} else if session.outputs.contains(depthDataOutput){
} else {
print("Could not add depth data output to the session")
session.commitConfiguration()
return
}
// Search for highest resolution with half-point depth values
let depthFormats = videoCaptureDevice!.activeFormat.supportedDepthDataFormats
let filtered = depthFormats.filter({
CMFormatDescriptionGetMediaSubType($0.formatDescription) == kCVPixelFormatType_DepthFloat16
})
let selectedFormat = filtered.max(by: {
first, second in CMVideoFormatDescriptionGetDimensions(first.formatDescription).width < CMVideoFormatDescriptionGetDimensions(second.formatDescription).width
})
do {
try videoCaptureDevice!.lockForConfiguration()
videoCaptureDevice!.activeDepthDataFormat = selectedFormat
videoCaptureDevice!.unlockForConfiguration()
} catch {
print("Could not lock device for configuration: \(error)")
session.commitConfiguration()
return
}
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
// Use an AVCaptureDataOutputSynchronizer to synchronize the video data and depth data outputs.
// The first output in the dataOutputs array, in this case the AVCaptureVideoDataOutput, is the "master" output.
//////////////////////////////////////////////////////////////////////////////////////////////////////////////
outputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [videoDataOutput, depthDataOutput])
outputSynchronizer!.setDelegate(self, queue: dataOutputQueue)
session.commitConfiguration()
self.isConfigured = true
//self.start()
}
// MARK: Device Configuration
/// - Tag: Stop capture session
public func stop(completion: (() -> ())? = nil) {
sessionQueue.async {
//print("entered stop")
if self.isSessionRunning {
//print(self.setupResult)
if self.setupResult == .success {
//print("entered success")
DispatchQueue.main.async{
self.session.stopRunning()
self.isSessionRunning = self.session.isRunning
if !self.session.isRunning {
DispatchQueue.main.async {
completion?()
}
}
}
}
}
}
}
/// - Tag: Start capture session
public func start() {
// We use our capture session queue to ensure our UI runs smoothly on the main thread.
sessionQueue.async {
if !self.isSessionRunning && self.isConfigured {
switch self.setupResult {
case .success:
self.session.startRunning()
self.isSessionRunning = self.session.isRunning
if self.session.isRunning {
}
case .configurationFailed, .notAuthorized:
print("Application not authorized to use camera")
DispatchQueue.main.async {
self.alertError = AlertError(title: "Camera Error", message: "Camera configuration failed. Either your device camera is not available or its missing permissions", primaryButtonTitle: "Accept", secondaryButtonTitle: nil, primaryAction: nil, secondaryAction: nil)
self.shouldShowAlertView = true
}
}
}
}
}
// ------------------------------------------------------------------------
// MARK: CAPTURE HANDLERS
// ------------------------------------------------------------------------
public func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
//printWithTime("Capture")
guard let syncedDepthData: AVCaptureSynchronizedDepthData =
synchronizedDataCollection.synchronizedData(for: depthDataOutput) as? AVCaptureSynchronizedDepthData else {
return
}
guard let syncedVideoData: AVCaptureSynchronizedSampleBufferData =
synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
return
}
///////////////////////////////////////////////////////////////////////////////////
///////////////////////////////////////////////////////////////////////////////////
//
//
// Below is the code that extracts the information from depth data
// The depth data is 640x480, which matches the size of the synchronized image
// I save this info to a file, upload it to the cloud, and merge it with the image
// on a PC to create a pointcloud
//
//
///////////////////////////////////////////////////////////////////////////////////
///////////////////////////////////////////////////////////////////////////////////
let depth_data : AVDepthData = syncedDepthData.depthData
let cvpixelbuffer : CVPixelBuffer = depth_data.depthDataMap
let height : Int = CVPixelBufferGetHeight(cvpixelbuffer)
let width : Int = CVPixelBufferGetWidth(cvpixelbuffer)
let quality : AVDepthData.Quality = depth_data.depthDataQuality
let accuracy : AVDepthData.Accuracy = depth_data.depthDataAccuracy
let pixelsize : Float = depth_data.cameraCalibrationData!.pixelSize
let camcaldata : AVCameraCalibrationData = depth_data.cameraCalibrationData!
let intmat : matrix_float3x3 = camcaldata.intrinsicMatrix
let cal_lensdistort_x : CGFloat = camcaldata.lensDistortionCenter.x
let cal_lensdistort_y : CGFloat = camcaldata.lensDistortionCenter.y
let cal_matrix_width : CGFloat = camcaldata.intrinsicMatrixReferenceDimensions.width
let cal_matrix_height : CGFloat = camcaldata.intrinsicMatrixReferenceDimensions.height
let intrinsics_fx : Float = camcaldata.intrinsicMatrix.columns.0.x
let intrinsics_fy : Float = camcaldata.intrinsicMatrix.columns.1.y
let intrinsics_ox : Float = camcaldata.intrinsicMatrix.columns.2.x
let intrinsics_oy : Float = camcaldata.intrinsicMatrix.columns.2.y
let pixelformattype : OSType = CVPixelBufferGetPixelFormatType(cvpixelbuffer)
CVPixelBufferLockBaseAddress(cvpixelbuffer, CVPixelBufferLockFlags(rawValue: 0))
// The map is DepthFloat16, so read it as Float16 values (2 bytes each).
let float16Buffer = unsafeBitCast(CVPixelBufferGetBaseAddress(cvpixelbuffer), to: UnsafeMutablePointer<Float16>.self)
let float16PerRow = CVPixelBufferGetBytesPerRow(cvpixelbuffer) / 2
for row in 0..<height {
    for col in 0..<width {
        let luma = float16Buffer[row * float16PerRow + col] // depth in meters
        /////////////////////////
        // SAVE DEPTH VALUE 'luma' to FILE FOR PROCESSING
    }
}
CVPixelBufferUnlockBaseAddress(cvpixelbuffer, CVPixelBufferLockFlags(rawValue: 0))
}
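One hedged aside on the depth read-out above: if Float16 precision is ever a concern when comparing devices, AVDepthData can convert the map to 32-bit floats before reading (a standard AVDepthData API; the snippet assumes the same depth_data variable as in the handler above):
// Sketch: convert to Float32 depth before reading values.
let depth32 = depth_data.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
let buffer32 = depth32.depthDataMap
CVPixelBufferLockBaseAddress(buffer32, .readOnly)
let floatPtr = unsafeBitCast(CVPixelBufferGetBaseAddress(buffer32), to: UnsafeMutablePointer<Float>.self)
let floatsPerRow = CVPixelBufferGetBytesPerRow(buffer32) / MemoryLayout<Float>.size
let sampleDepth = floatPtr[0] // depth in meters at row 0, column 0
CVPixelBufferUnlockBaseAddress(buffer32, .readOnly)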

Clicks / Distortion in AudioKit

When I have a bunch (20-40) of samples playing and overlapping each other simultaneously, it sometimes starts getting distorted, and then some waving, oscillating, and clicking begins to happen. A similar sound happens when the samples are playing and the app crashes - it sounds like an abrupt, crunchy halt.
Notice the waviness begins between 0:05 and 0:10; nasty clicks start around 0:15.
Listen Here
How can I make it smoother? I am spawning AKPlayer objects (from AudioKit 4.1) that play 4-8 second .wav files. Those go into AKBoosters, which go into AKMixers, which go into the final AKMixer for output.
Edit:
Many PenAudioNodes get plugged into the mixer of the AudioReceiver singleton.
Here's my AudioReceiver singleton:
class AudioReceiver {
    static var sharedInstance = AudioReceiver()
    private var audioNodes = [UUID: AudioNode]()
    private let mixer = AKMixer()
    private let queue = DispatchQueue(label: "audio-queue")

    // MARK: - Setup & Teardown
    init() {
        AudioKit.output = mixer //peakLimiter
        AudioKit.start()
    }

    // MARK: - Public
    func audioNodeBegan(_ message: AudioNodeMessage) {
        queue.async {
            var audioNode: AudioNode?
            switch message.senderType {
            case .pen:
                audioNode = PenAudioNode()
            case .home:
                audioNode = LoopingAudioNode(with: AudioHelper.homeLoopFile())
            default:
                break
            }
            if let audioNode = audioNode {
                self.audioNodes[message.senderId] = audioNode
                self.mixer.connect(input: audioNode.output)
                audioNode.start(message)
            }
        }
    }

    func audioNodeMoved(_ message: AudioNodeMessage) {
        queue.async {
            if let audioNode = self.audioNodes[message.senderId] {
                audioNode.update(message)
            }
        }
    }

    func audioNodeEnded(_ message: AudioNodeMessage) {
        queue.async {
            if let audioNode = self.audioNodes[message.senderId] {
                audioNode.stop(message)
            }
            self.audioNodes[message.senderId] = nil
        }
    }
}
Here's my PenAudioNode:
class PenAudioNode {
    fileprivate var mixer: AKMixer?
    fileprivate var playersBoosters = [AKPlayer: AKBooster]()
    fileprivate var finalOutput: AKNode?
    fileprivate let file: AKAudioFile = AudioHelper.randomBellSampleFile()

    // MARK: - Setup & Teardown
    init() {
        mixer = AKMixer()
        finalOutput = mixer!
    }
}

extension PenAudioNode: AudioNode {
    var output: AKNode {
        return finalOutput!
    }

    func start(_ message: AudioNodeMessage) {
    }

    func update(_ message: AudioNodeMessage) {
        if let velocity = message.velocity {
            let newVolume = Swift.min((velocity / 50) + 0.1, 1)
            mixer!.volume = newVolume
        }
        if let isClimactic = message.isClimactic, isClimactic {
            let player = AKPlayer(audioFile: file)
            player.completionHandler = { [weak self] in
                self?.playerCompleted(player)
            }
            let booster = AKBooster(player)
            playersBoosters[player] = booster
            booster.rampTime = 1
            booster.gain = 0
            mixer!.connect(input: booster)
            player.play()
            booster.gain = 1
        }
    }

    func stop(_ message: AudioNodeMessage) {
        for (_, booster) in playersBoosters {
            booster.gain = 0
        }
        DispatchQueue.global().asyncAfter(deadline: DispatchTime.now() + 1) {
            self.mixer!.stop()
            self.output.disconnectOutput()
        }
    }

    private func playerCompleted(_ player: AKPlayer) {
        playersBoosters.removeValue(forKey: player)
    }
}
This sounds like you are not releasing objects, and you are eventually overloading the audio engine with too many instances of processing nodes connected in the graph. In particular, not releasing AKBoosters will cause an issue like this. I can't tell exactly what your code is doing, but if you are spawning objects without releasing them properly, it will lead to garbled audio.
You want to conserve objects as much as possible and make sure you are using the absolute minimum number of AKNode-based processors.
There are various ways to debug this, but you can start by printing out the current state of the underlying AVAudioEngine:
print(AudioKit.engine.description)
That will show how many nodes are connected in the graph at any given moment.
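For instance, here is a sketch of a completion handler that actually releases the per-note nodes. It reuses the same disconnectOutput() call your stop() already makes; whether anything more is needed to fully detach an AKPlayer in 4.1 is an assumption worth verifying:
// Sketch: tear down a finished player and its booster so the graph shrinks again.
private func playerCompleted(_ player: AKPlayer) {
    if let booster = playersBoosters[player] {
        booster.disconnectOutput() // remove the booster from the mixer's inputs
    }
    player.disconnectOutput()      // remove the player from the booster
    playersBoosters.removeValue(forKey: player)
}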

Can't get returned values in a structure in Swift 3

I have this code:
//
// Measurement.swift
// BicycleSpeed
import Foundation
import CoreBluetooth
// CSC Measurement
// https://developer.bluetooth.org/gatt/characteristics/Pages/CharacteristicViewer.aspx?u=org.bluetooth.characteristic.csc_measurement.xml
//
// Flags : 1 byte. Bit 0: Wheel. Bit 1: Crank
// Cumulative Wheel revolutions: 4 bytes uint32
// Last wheel event time: 2 bytes. uint16 (1/1024s)
// Cumulative Crank revolutions: 2 bytes uint16
// Last cranck event time: 2 bytes. uint16 (1/1024s)
struct Measurement : CustomDebugStringConvertible {
let hasWheel:Bool
let hasCrank:Bool
let cumulativeWheel:UInt32
let lastWheelEventTime:TimeInterval
let cumulativeCrank:UInt16
let lastCrankEventTime:TimeInterval
let wheelSize:UInt32
init(data:Data, wheelSize:UInt32) {
self.wheelSize = wheelSize
// Flags
var flags:UInt8=0
(data as NSData).getBytes(&flags, range: NSRange(location: 0, length: 1))
hasWheel = ((flags & BTConstants.WheelFlagMask) > 0)
hasCrank = ((flags & BTConstants.CrankFlagMask) > 0)
var wheel:UInt32=0
var wheelTime:UInt16=0
var crank:UInt16=0
var crankTime:UInt16=0
var currentOffset = 1
var length = 0
if ( hasWheel ) {
length = MemoryLayout<UInt32>.size
(data as NSData).getBytes(&wheel, range: NSRange(location: currentOffset, length: length))
currentOffset += length
length = MemoryLayout<UInt16>.size
(data as NSData).getBytes(&wheelTime, range: NSRange(location: currentOffset, length: length))
currentOffset += length
}
if ( hasCrank ) {
length = MemoryLayout<UInt16>.size
(data as NSData).getBytes(&crank, range: NSRange(location: currentOffset, length: length))
currentOffset += length
length = MemoryLayout<UInt16>.size
(data as NSData).getBytes(&crankTime, range: NSRange(location: currentOffset, length: length))
currentOffset += length
}
cumulativeWheel = CFSwapInt32LittleToHost(wheel)
lastWheelEventTime = TimeInterval( Double(CFSwapInt16LittleToHost(wheelTime))/BTConstants.TimeScale)
cumulativeCrank = CFSwapInt16LittleToHost(crank)
lastCrankEventTime = TimeInterval( Double(CFSwapInt16LittleToHost(crankTime))/BTConstants.TimeScale)
}
func timeIntervalForCurrentSample( _ current:TimeInterval, previous:TimeInterval ) -> TimeInterval {
var timeDiff:TimeInterval = 0
if( current >= previous ) {
timeDiff = current - previous
}
else {
// passed the maximum value
timeDiff = ( TimeInterval((Double( UINT16_MAX) / BTConstants.TimeScale)) - previous) + current
}
return timeDiff
}
func valueDiffForCurrentSample<T:UnsignedInteger>( _ current:T, previous:T , max:T) -> T {
var diff:T = 0
if ( current >= previous ) {
diff = current - previous
}
else {
diff = ( max - previous ) + current
}
return diff
}
func valuesForPreviousMeasurement( _ previousSample:Measurement? ) -> ( cadenceinRPM:Double?, distanceinMeters:Double?, speedInMetersPerSecond:Double?)? {
var distance:Double?, cadence:Double?, speed:Double?
guard let previousSample = previousSample else {
return nil
}
if ( hasWheel && previousSample.hasWheel ) {
let wheelTimeDiff = timeIntervalForCurrentSample(lastWheelEventTime, previous: previousSample.lastWheelEventTime)
let valueDiff = valueDiffForCurrentSample(cumulativeWheel, previous: previousSample.cumulativeWheel, max: UInt32.max)
distance = Double( valueDiff * wheelSize) / 1000.0 // distance in meters
if distance != nil && wheelTimeDiff > 0 {
speed = (wheelTimeDiff == 0 ) ? 0 : distance! / wheelTimeDiff // m/s
}
}
if( hasCrank && previousSample.hasCrank ) {
let crankDiffTime = timeIntervalForCurrentSample(lastCrankEventTime, previous: previousSample.lastCrankEventTime)
let valueDiff = Double(valueDiffForCurrentSample(cumulativeCrank, previous: previousSample.cumulativeCrank, max: UInt16.max))
cadence = (crankDiffTime == 0) ? 0 : Double(60.0 * valueDiff / crankDiffTime) // RPM
}
print( "Cadence: \(String(describing: cadence)) RPM. Distance: \(String(describing: distance)) meters. Speed: \(String(describing: speed)) Km/h" )
return ( cadenceinRPM:cadence, distanceinMeters:distance, speedInMetersPerSecond:speed)
}
var debugDescription:String {
get {
return "Wheel Revs: \(cumulativeWheel). Last wheel event time: \(lastWheelEventTime). Crank Revs: \(cumulativeCrank). Last Crank event time: \(lastCrankEventTime)"
}
}
var myMeasurement = ((Measurement?) -> (cadenceinRPM: Double?, DistanceinMeters: Double?, speedinMetersPerSecond: Double?)).self
struct dataVariables {
static var mySpeed = myMeasurement.speedInMetersPerSecond
static var myCadence : Double?
static var miDistance : Double?
static var myLastWheelEventTime : Double? = Measurement.valuesForPreviousMeasurement(lastWheelEventTime)
static var myLastCrankEventTime : Double? = Measurement.valuesForPreviousMeasurement(lastCrankEventTime)
}
}
}
and I'm trying to assign the returned values cadenceinRPM, distanceinMeters, and speedInMetersPerSecond, as well as lastWheelEventTime and lastCrankEventTime, to the static variables inside the struct dataVariables, so I can access them in another class. But I'm getting the following errors:
on var mySpeed: Instance member 'myMeasurement' cannot be used on type 'Measurement'
on var myLastWheelEventTime and var myLastCrankEventTime: Instance member 'valuesForPreviousMeasurement' cannot be used on type 'Measurement'; did you mean to use a value of this type instead?
How can I reference those returned values, then?
Can somebody explain the error? I searched other similar questions, but I haven't reached a solution yet.
I have tried changing var myMeasurement to
var myMeasurement = Measurement.self
but the errors stayed the same.
and this other code
//
// CadenceSensor.swift
import Foundation
import CoreBluetooth
/*
// Bluetooth "Cycling Speed and Cadence"
https://developer.bluetooth.org/gatt/services/Pages/ServiceViewer.aspx?u=org.bluetooth.service.cycling_speed_and_cadence.xml
Service Cycling Speed and Cadence. Characteristic [2A5B] // Measurement
Service Cycling Speed and Cadence. Characteristic [2A5C] // Supported Features
Service Cycling Speed and Cadence. Characteristic [2A5D] // Sensor location
Service Cycling Speed and Cadence. Characteristic [2A55] // Control Point
*/
public struct BTConstants {
static let CadenceService = "1816"
static let CSCMeasurementUUID = "2a5b"
static let CSCFeatureUUID = "2a5c"
static let SensorLocationUUID = "2a5d"
static let ControlPointUUID = "2a55"
static let WheelFlagMask:UInt8 = 0b01
static let CrankFlagMask:UInt8 = 0b10
static let DefaultWheelSize:UInt32 = UInt32(myVariables.circonferenzaRuota!) // In millimeters. 700x30 (by default my bike's wheels) :)
static let TimeScale = 1024.0
}
protocol CadenceSensorDelegate {
func errorDiscoveringSensorInformation(_ error:NSError)
func sensorReady()
func sensorUpdatedValues( speedInMetersPerSecond speed:Double?, cadenceInRpm cadence:Double?, distanceInMeters distance:Double? )
}
class CadenceSensor: NSObject {
let peripheral:CBPeripheral
var sensorDelegate:CadenceSensorDelegate?
var measurementCharasteristic:CBCharacteristic?
var lastMeasurement:Measurement?
let wheelCircunference:UInt32
init(peripheral:CBPeripheral , wheel:UInt32=BTConstants.DefaultWheelSize) {
self.peripheral = peripheral
wheelCircunference = wheel
}
func start() {
self.peripheral.discoverServices(nil)
self.peripheral.delegate = self
}
func stop() {
if let measurementCharasteristic = measurementCharasteristic {
peripheral.setNotifyValue(false, for: measurementCharasteristic)
}
}
func handleValueData( _ data:Data ) {
let measurement = Measurement(data: data, wheelSize: wheelCircunference)
print("\(measurement)")
let values = measurement.valuesForPreviousMeasurement(lastMeasurement)
lastMeasurement = measurement
sensorDelegate?.sensorUpdatedValues(speedInMetersPerSecond: values?.speedInMetersPerSecond, cadenceInRpm: values?.cadenceinRPM, distanceInMeters: values?.distanceinMeters)
}
}
extension CadenceSensor : CBPeripheralDelegate {
func peripheral(_ peripheral: CBPeripheral, didUpdateNotificationStateFor characteristic: CBCharacteristic, error: Error?) {
guard error == nil else {
sensorDelegate?.errorDiscoveringSensorInformation(NSError(domain: CBErrorDomain, code: 0, userInfo: [NSLocalizedDescriptionKey:NSLocalizedString("Error receiving measurements updates", comment:"")]))
return
}
print("notification status changed for [\(characteristic.uuid)]...")
}
func peripheral(_ peripheral: CBPeripheral, didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
print("Updated [\(characteristic.uuid)]...")
guard error == nil , let data = characteristic.value else {
return
}
handleValueData(data)
}
func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
guard error == nil else {
sensorDelegate?.errorDiscoveringSensorInformation(error! as NSError)
return
}
// Find the cadence service
guard let cadenceService = peripheral.services?.filter({ (service) -> Bool in
return service.uuid == CBUUID(string: BTConstants.CadenceService)
}).first else {
sensorDelegate?.errorDiscoveringSensorInformation(NSError(domain: CBErrorDomain, code: NSNotFound, userInfo: [NSLocalizedDescriptionKey:NSLocalizedString("Cadence service not found for this peripheral", comment:"")]))
return
}
// Discover the cadence service characteristics
peripheral.discoverCharacteristics(nil, for:cadenceService )
print("Cadence service discovered")
}
func peripheral(_ peripheral: CBPeripheral, didDiscoverCharacteristicsFor service: CBService, error: Error?) {
guard let characteristics = service.characteristics else {
sensorDelegate?.errorDiscoveringSensorInformation(NSError(domain: CBErrorDomain, code: NSNotFound, userInfo: [NSLocalizedDescriptionKey:NSLocalizedString("No characteristics found for the cadence service", comment:"")]))
return
}
print("Received characteristics");
// Enable notifications for the measurement characteristic
for characteristic in characteristics {
print("Service \(service.uuid). Characteristic [\(characteristic.uuid)]")
if characteristic.uuid == CBUUID(string: BTConstants.CSCMeasurementUUID) {
print("Found measurement characteristic. Subscribing...")
peripheral.setNotifyValue(true, for: characteristic)
measurementCharasteristic = characteristic
}
}
sensorDelegate?.sensorReady()
}
}
and this other code
//
// MainViewController.swift
// BicycleSpeed
import UIKit
import CoreBluetooth
class MainViewController: UIViewController {
struct Constants {
static let ScanSegue = "ScanSegue"
static let SensorUserDefaultsKey = "lastsensorused"
}
var bluetoothManager:BluetoothManager!
var sensor:CadenceSensor?
weak var scanViewController:ScanViewController?
var infoViewController:InfoTableViewController?
var accumulatedDistance:Double?
lazy var distanceFormatter:LengthFormatter = {
let formatter = LengthFormatter()
formatter.numberFormatter.maximumFractionDigits = 1
return formatter
}()
//@IBOutlet var labelBTStatus:UILabel!
@IBOutlet var scanItem:UIBarButtonItem!
@IBOutlet weak var idLabel: UILabel!
override func viewDidLoad() {
bluetoothManager = BluetoothManager()
bluetoothManager.bluetoothDelegate = self
scanItem.isEnabled = false
}
deinit {
disconnectSensor()
}
@IBAction func unwindSegue( _ segue:UIStoryboardSegue ) {
bluetoothManager.stopScan()
guard let sensor = (segue as? ScanUnwindSegue)?.sensor else {
return
}
print("Need to connect to sensor \(sensor.peripheral.identifier)")
connectToSensor(sensor)
}
func disconnectSensor( ) {
if sensor != nil {
bluetoothManager.disconnectSensor(sensor!)
sensor = nil
}
accumulatedDistance = nil
}
func connectToSensor(_ sensor:CadenceSensor) {
self.sensor = sensor
bluetoothManager.connectToSensor(sensor)
// Save the sensor ID
UserDefaults.standard.set(sensor.peripheral.identifier.uuidString, forKey: Constants.SensorUserDefaultsKey)
UserDefaults.standard.synchronize()
}
// TODO: REconnect. Try this every X seconds
func checkPreviousSensor() {
guard let sensorID = UserDefaults.standard.object(forKey: Constants.SensorUserDefaultsKey) as? String else {
return
}
guard let sensor = bluetoothManager.retrieveSensorWithIdentifier(sensorID) else {
return
}
self.sensor = sensor
connectToSensor(sensor)
}
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
if let infoVC = segue.destination as? InfoTableViewController {
infoViewController = infoVC
}
if segue.identifier == Constants.ScanSegue {
// Scan segue
bluetoothManager.startScan()
scanViewController = (segue.destination as? UINavigationController)?.viewControllers.first as? ScanViewController
}
}
}
extension MainViewController : CadenceSensorDelegate {
func errorDiscoveringSensorInformation(_ error: NSError) {
print("An error occurred discovering the sensor services/characteristics: \(error)")
}
func sensorReady() {
print("Sensor ready to go...")
accumulatedDistance = 0.0
}
func updateSensorInfo() {
let name = sensor?.peripheral.name ?? ""
let uuid = sensor?.peripheral.identifier.uuidString ?? ""
OperationQueue.main.addOperation { () -> Void in
self.infoViewController?.showDeviceName(name , uuid:uuid )
}
}
func sensorUpdatedValues( speedInMetersPerSecond speed:Double?, cadenceInRpm cadence:Double?, distanceInMeters distance:Double? ) {
accumulatedDistance? += distance ?? 0
let distanceText = (accumulatedDistance != nil && accumulatedDistance! >= 1.0) ? distanceFormatter.string(fromMeters: accumulatedDistance!) : "N/A"
let speedText = (speed != nil) ? distanceFormatter.string(fromValue: speed!*3.6, unit: .kilometer) + NSLocalizedString("/h", comment:"(km) Per hour") : "N/A"
let cadenceText = (cadence != nil) ? String(format: "%.2f %@", cadence!, NSLocalizedString("RPM", comment:"Revs per minute") ) : "N/A"
OperationQueue.main.addOperation { () -> Void in
self.infoViewController?.showMeasurementWithSpeed(speedText , cadence: cadenceText, distance: distanceText )
}
}
}
extension MainViewController : BluetoothManagerDelegate {
func stateChanged(_ state: CBCentralManagerState) {
print("State Changed: \(state)")
var enabled = false
var title = ""
switch state {
case .poweredOn:
title = "Bluetooth ON"
enabled = true
// When the bluetooth changes to ON, try to reconnect to the previous sensor
checkPreviousSensor()
case .resetting:
title = "Resetting"
case .poweredOff:
title = "Bluetooth Off"
case .unauthorized:
title = "Bluetooth not authorized"
case .unknown:
title = "Unknown"
case .unsupported:
title = "Bluetooth not supported"
}
infoViewController?.showBluetoothStatusText( title )
scanItem.isEnabled = enabled
}
func sensorConnection( _ sensor:CadenceSensor, error:NSError?) {
print("")
guard error == nil else {
self.sensor = nil
print("Error connecting to sensor: \(sensor.peripheral.identifier)")
updateSensorInfo()
accumulatedDistance = nil
return
}
self.sensor = sensor
self.sensor?.sensorDelegate = self
print("Sensor connected. \(String(describing: sensor.peripheral.name)). [\(sensor.peripheral.identifier)]")
updateSensorInfo()
sensor.start()
}
func sensorDisconnected( _ sensor:CadenceSensor, error:NSError?) {
print("Sensor disconnected")
self.sensor = nil
}
func sensorDiscovered( _ sensor:CadenceSensor ) {
scanViewController?.addSensor(sensor)
}
}
In MainViewController.swift, there is this func sensorUpdatedValues that converts the three values I want to strings and passes them to the func showMeasurementWithSpeed in InfoTableViewController.swift.
Could I just return the three values inside the function sensorUpdatedValues instead, so I can store them in new variables?
The better way to solve this is by passing the sensor object on to the InfoTableViewController in prepare(for:sender:); then, inside InfoTableViewController, you can read sensor.lastMeasurement (and derive speed and the rest via valuesForPreviousMeasurement) or any other var that is in there. Since the class is passed by reference, it will retain the data even when you transition to a new ViewController.
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
    if let infoVC = segue.destination as? InfoTableViewController {
        infoVC.sensor = self.sensor
    }
    if segue.identifier == Constants.ScanSegue {
        // Scan segue
        bluetoothManager.startScan()
        scanViewController = (segue.destination as? UINavigationController)?.viewControllers.first as? ScanViewController
    }
}
Then, of course, you can do whatever you want with this data in the new VC (assign the values to labels or whatnot).
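A minimal sketch of the receiving side (the sensor property and the refresh method are hypothetical names, and the UITableViewController superclass is an assumption; the tuple labels come from valuesForPreviousMeasurement in the question's code):
class InfoTableViewController: UITableViewController {
    var sensor: CadenceSensor?           // set in MainViewController.prepare(for:sender:)
    private var previousSample: Measurement?

    func refreshDisplayedValues() {
        guard let current = sensor?.lastMeasurement else { return }
        // Speed/cadence/distance are diffs against an earlier sample, so keep one around.
        if let values = current.valuesForPreviousMeasurement(previousSample) {
            print("speed:", values.speedInMetersPerSecond ?? 0,
                  "cadence:", values.cadenceinRPM ?? 0,
                  "distance:", values.distanceinMeters ?? 0)
        }
        previousSample = current
    }
}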
I believe it's because you have that func declared as an instance-level function rather than a class/struct-level function. You should be able to simply add the static keyword to make it accessible the way you are using it in your sample code, i.e. "static func valuesForPreviousMeasurement ...".
** UPDATE - Added a simple example to show the difference between class and instance functions.
// This is an instance function being used. It's called that because
// you need an actual object instance in order to call the func.
var myCar: Car = Car()
myCar.startEngine()

// This is a class-level function being used. It's called that because
// you don't actually need an object instance: it's simply part of the class.
Car.PriceForModel("HondaCivic")
Finally solved - it was easier than I thought. As I understood it, the three values I'm interested in were passed somehow to the InfoTableViewController I wanted to get rid of. And they were passed, already converted to String, by the function sensorUpdatedValues inside MainViewController:
func sensorUpdatedValues( speedInMetersPerSecond speed:Double?, cadenceInRpm cadence:Double?, distanceInMeters distance:Double? ) {
accumulatedDistance? += distance ?? 0
let distanceText = (accumulatedDistance != nil && accumulatedDistance! >= 1.0) ? distanceFormatter.string(fromMeters: accumulatedDistance!) : "N/A"
let speedText = (speed != nil) ? distanceFormatter.string(fromValue: speed!*3.6, unit: .kilometer) + NSLocalizedString("/h", comment:"(km) Per hour") : "N/A"
let cadenceText = (cadence != nil) ? String(format: "%.2f %@", cadence!, NSLocalizedString("RPM", comment:"Revs per minute") ) : "N/A"
OperationQueue.main.addOperation { () -> Void in
self.infoViewController?.showMeasurementWithSpeed(speedText , cadence: cadenceText, distance: distanceText )
}
}
So I traced the function down into my InfoSpeedoViewController (Xcode was asking for it because it was present in InfoTableViewController, and rerouting to InfoSpeedoViewController made it necessary), and inside that function's body I connected the values to the labels.
func showMeasurementWithSpeed( _ speed:String, cadence:String, distance:String ) {
speedDisplayLabel.text = speed
cadenceDisplayLabel.text = cadence
distanceDisplayLabel.text = distance
// showDetailText(speed, atSection: Constants.MeasurementsSection, row:Constants.SpeedRow)
// showDetailText(cadence, atSection: Constants.MeasurementsSection, row:Constants.CadenceRow)
// showDetailText(distance, atSection: Constants.MeasurementsSection, row:Constants.DistanceRow)
}
The commented-out parts were the old InfoTableViewController code that filled the cells.
Many thanks for your help. I learned a few things and consolidated others. I guess we went the hard way, trying to catch these values in the wrong place, but because I had all the project's files printed out, I could trace the data flow more easily. Not fully understanding the code, given my limited knowledge of Swift, made it a bit more difficult, but I had a feeling this solution was the logical one: often the simple way is the best way, and often one just can't see it.
Thanks again.
I have two more values I want to get. A new adventure begins. Should I post another question or continue this one?

Allowing background audio with Swift not working

I want to allow background audio while the app is not in focus. I currently have this code, which should allow that:
do {
    try AKSettings.setSession(category: .playback, with: .mixWithOthers)
} catch {
    print("error")
}
AKSettings.playbackWhileMuted = true
I also have the setting 'Audio, Airplay and Picture in Picture' enabled in capabilities settings. However, when I press the home button on my device the audio doesn't keep playing. What am I doing wrong? I am using AudioKit to produce sounds if that matters.
I am using a singleton to house all of the AudioKit components which I named AudioPlayer.swift. Here is what I have in my AudioPlayer.swift singleton file:
class AudioPlayer: NSObject {
    var currentFrequency = String()
    var soundIsPlaying = false
    var leftOscillator = AKOscillator()
    var rightOscillator = AKOscillator()
    var rain = try! AKAudioFile()
    var rainPlayer: AKAudioPlayer!
    var envelope = AKAmplitudeEnvelope()

    override init() {
        super.init()
        do {
            try AKSettings.setSession(category: .playback, with: .mixWithOthers)
        } catch {
            print("error")
        }
        AKSettings.playbackWhileMuted = true
        AudioKit.output = envelope
        AudioKit.start()
    }

    func setupFrequency(left: AKOscillator, right: AKOscillator, frequency: String) {
        currentFrequency = frequency
        leftOscillator = left
        rightOscillator = right
        let leftPanner = AKPanner(leftOscillator)
        leftPanner.pan = -1
        let rightPanner = AKPanner(rightOscillator)
        rightPanner.pan = 1

        // Set up rain and rainPlayer
        do {
            rain = try AKAudioFile(readFileName: "rain.wav")
            rainPlayer = try AKAudioPlayer(file: rain, looping: true, deferBuffering: false, completionHandler: nil)
        } catch { print(error) }
        let mixer = AKMixer(leftPanner, rightPanner, rainPlayer)

        // Put mixer in sound envelope
        envelope = AKAmplitudeEnvelope(mixer)
        envelope.attackDuration = 2.0
        envelope.decayDuration = 0
        envelope.sustainLevel = 1
        envelope.releaseDuration = 0.2

        // Start AudioKit stuff
        AudioKit.output = envelope
        AudioKit.start()
        leftOscillator.start()
        rightOscillator.start()
        rainPlayer.start()
        envelope.start()
        soundIsPlaying = true
    }
}
And here is an example of one of my sound effect view controllers, which reference the AudioKit singleton to send it a certain frequency (I have about a dozen of these view controllers, each with its own frequency settings):
class CalmView: UIViewController {
    let leftOscillator = AKOscillator()
    let rightOscillator = AKOscillator()

    override func viewDidLoad() {
        super.viewDidLoad()
        leftOscillator.amplitude = 0.3
        leftOscillator.frequency = 220
        rightOscillator.amplitude = 0.3
        rightOscillator.frequency = 230
    }

    @IBAction func playSound(_ sender: Any) {
        if shared.soundIsPlaying == false {
            AudioKit.stop()
            shared.setupFrequency(left: leftOscillator, right: rightOscillator, frequency: "Calm")
        } else if shared.soundIsPlaying == true && shared.currentFrequency != "Calm" {
            AudioKit.stop()
            shared.leftOscillator.stop()
            shared.rightOscillator.stop()
            shared.rainPlayer.stop()
            shared.envelope.stop()
            shared.setupFrequency(left: leftOscillator, right: rightOscillator, frequency: "Calm")
        } else {
            shared.soundIsPlaying = false
            shared.envelope.stop()
        }
    }
}
I instantiated the AudioPlayer singleton in my ViewController.swift file.
It depends on when you are doing your configuration in relation to when AudioKit is started. If you're using AudioKit, you should be using its AKSettings to manage your session category - not only the playback category but also mixWithOthers. By default, it does this:
/// Set the audio session type
@objc open static func setSession(category: SessionCategory,
                                  with options: AVAudioSessionCategoryOptions = [.mixWithOthers]) throws {
So you'd do something like this in your ViewController:
do {
    if #available(iOS 10.0, *) {
        try AKSettings.setSession(category: .playAndRecord, with: [.defaultToSpeaker, .allowBluetooth, .allowBluetoothA2DP])
    } else {
        // Fallback on earlier versions
    }
} catch {
    print("Errored setting category.")
}
So I think it's a matter of getting that straight. It might also help to have inter-app audio set up. If you still have trouble and can provide more information, I can help more, but this is as good an answer as I can muster based on the info you've given so far.
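To make the ordering point concrete, here is a sketch using only calls that already appear in the question - all session configuration happens before the engine starts:
// Sketch: configure AKSettings first, start AudioKit last.
do {
    AKSettings.playbackWhileMuted = true
    try AKSettings.setSession(category: .playback, with: [.mixWithOthers])
} catch {
    print("Error setting session category: \(error)")
}
AudioKit.output = envelope // the same output node the singleton already uses
AudioKit.start()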

Getting "modifying the autolayout engine from a background thread after the engine was accessed from the main thread" even while using DispatchQueue

Hello, I'm trying to create a simple app that gets annotations from my web server and then populates the map with them. The only problem is that when I call the function again 30 seconds later (via the counter bob) to get a new annotation from another location, it gives me the error above. I tried to fix it using DispatchQueue.main.async, to no avail. Any help is appreciated.
Here is the function in question
    // this is the test to see if it can add a new annotation after 30 seconds
    if bob == 30 {
        let user_lat_temp = 26.7709
        let user_lng_temp = -80.1067
        DispatchQueue.main.async {
            // Do stuff to UI
            self.GetAnnotations(lat: user_lat_temp, lng: user_lng_temp)
        }
        // reset it to see if it breaks
        bob = 0
    }
    bob = bob + 1
    print("bob: ", bob)
}
Here is the full code
import UIKit
import MapKit
import CoreLocation
class ViewController: UIViewController, MKMapViewDelegate, CLLocationManagerDelegate {
@IBOutlet weak var mapView: MKMapView!
let access_token = ""
let manager = CLLocationManager()
var firstTime = 1
var user_lat = 0.0
var user_lng = 0.0
var bob = 0
override func viewDidLoad() {
super.viewDidLoad()
manager.delegate = self
manager.desiredAccuracy = kCLLocationAccuracyBest
manager.requestWhenInUseAuthorization()
manager.startUpdatingLocation()
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
let userLocation = locations[0]
user_lat = userLocation.coordinate.latitude
user_lng = userLocation.coordinate.longitude
self.mapView.showsUserLocation = true
if firstTime == 1{
GetAnnotations(lat: user_lat, lng: user_lng)
firstTime = 0
}
// this is the test to see if it can add a new annotation after 30 seconds
if bob == 30{
let user_lat_temp = 26.7709
let user_lng_temp = -80.1067
DispatchQueue.main.async() {
// Do stuff to UI
self.GetAnnotations(lat: user_lat_temp, lng: user_lng_temp)
}
// reset it to see if it breaks
bob = 0
}
bob = bob + 1
print("bob: ", bob)
}
func GetAnnotations(lat: Double, lng: Double){
guard let url = URL(string: "http://192.168.1.10:7888/api/?controller=location&action=get_locations") else {return}
var request = URLRequest(url: url)
request.httpMethod = "POST"
let postString = "access_token=\(access_token)&lat=\(lat)&lng=\(lng)";
request.httpBody = postString.data(using: String.Encoding.utf8)
URLSession.shared.dataTask(with: request) { (data, response, err) in
if let error = err {
print("the server is not responding \(error)")
}
if let response = response {
// if the user has a bad access token or is logged out
if let httpResponse = response as? HTTPURLResponse {
if httpResponse.statusCode == 401{
print("bad access token")
return
}
}else{
print("the server is not responding")
}
print(response)
guard let data = data else { return }
// parse the json for the locations
do {
let mapJSON = try JSONDecoder().decode(parseJsonLocations.self, from: data)
let user_id = mapJSON.user_info.user_id
print(user_id)
print(mapJSON.locations.count)
// do map
let distanceSpan:CLLocationDegrees = 10000
let userLocation:CLLocationCoordinate2D = CLLocationCoordinate2DMake(lat, lng)
self.mapView.setRegion(MKCoordinateRegionMakeWithDistance(userLocation, distanceSpan, distanceSpan), animated: true)
self.mapView.delegate = self
var i = 0
while i < mapJSON.locations.count {
let location_id = mapJSON.locations[i].location_id
let location_name = mapJSON.locations[i].location_name
let location_lat = mapJSON.locations[i].lat
let location_lng = mapJSON.locations[i].lng
let locationsLocation:CLLocationCoordinate2D = CLLocationCoordinate2DMake(location_lat, location_lng)
let subtitle = "location_id: \(location_id)"
let userAnnotation = Annotation(title: location_name, subtitle: subtitle, coordinate: locationsLocation)
self.mapView.addAnnotation( userAnnotation )
i = i + 1
}
} catch {
print("error trying to convert data to JSON")
print(error)
}
}
}.resume()
}
}
There are a lot of other places where you are not paying attention to the question of what thread you might be on.
You are saying
if firstTime == 1 {
GetAnnotations( // ...
without making sure you're on the main thread.
And then, inside GetAnnotations, you are saying
self.mapView.setRegion
// ... and all that follows ...
without making sure you're on the main thread.
I'm not saying you're not on the main thread at those moments, but there is no reason to think that you are, either. You should check and get it sorted out.
You have the right idea - explicitly dispatch the UI updates on the main queue - but unfortunately you have dispatched the wrong function.
GetAnnotations (which, by convention, should be getAnnotations) makes an asynchronous network call via the URLSession data task, with the results returned in the completion handler. With network operations such as these, there is a good chance that the completion handler will not be called on the main queue, and that is the case here: URLSession calls its completion handlers on a background queue unless you tell it otherwise.
Since the completion handler is not executing on the main queue and you update the UI from it, you get the error message.
You need to dispatch those UI operations onto the main queue.
For example:
guard let data = data else { return }
DispatchQueue.main.async {
    // parse the json for the locations
    do {
        let mapJSON = try JSONDecoder().decode(parseJsonLocations.self, from: data)
        let user_id = mapJSON.user_info.user_id
        print(user_id)
        print(mapJSON.locations.count)
        // do map
        let distanceSpan: CLLocationDegrees = 10000
        let userLocation: CLLocationCoordinate2D = CLLocationCoordinate2DMake(lat, lng)
        self.mapView.setRegion(MKCoordinateRegionMakeWithDistance(userLocation, distanceSpan, distanceSpan), animated: true)
        self.mapView.delegate = self
        var i = 0
        while i < mapJSON.locations.count {
            let location_id = mapJSON.locations[i].location_id
            let location_name = mapJSON.locations[i].location_name
            let location_lat = mapJSON.locations[i].lat
            let location_lng = mapJSON.locations[i].lng
            let locationsLocation: CLLocationCoordinate2D = CLLocationCoordinate2DMake(location_lat, location_lng)
            let subtitle = "location_id: \(location_id)"
            let userAnnotation = Annotation(title: location_name, subtitle: subtitle, coordinate: locationsLocation)
            self.mapView.addAnnotation(userAnnotation)
            i = i + 1
        }
    } catch {
        print("error trying to convert data to JSON")
        print(error)
    }
}
In the case of the location manager delegate call, the documentation states:
The methods of your delegate object are called from the thread in which you started the corresponding location services. That thread must itself have an active run loop, like the one found in your application’s main thread.
So, as long as you called startUpdatingLocation from the main queue, your delegate callbacks will be on the main queue
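As a further aside, a slightly different split of the same work keeps the JSON decoding off the main queue and hops to main only for the map updates (a sketch built from the question's own types):
URLSession.shared.dataTask(with: request) { data, response, err in
    guard let data = data else { return }
    do {
        // Decode on whatever background queue URLSession called us on...
        let mapJSON = try JSONDecoder().decode(parseJsonLocations.self, from: data)
        let annotations = mapJSON.locations.map { location in
            Annotation(title: location.location_name,
                       subtitle: "location_id: \(location.location_id)",
                       coordinate: CLLocationCoordinate2DMake(location.lat, location.lng))
        }
        // ...and touch MapKit only on the main queue.
        DispatchQueue.main.async {
            self.mapView.addAnnotations(annotations)
        }
    } catch {
        print("error trying to convert data to JSON: \(error)")
    }
}.resume()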
