I am using the Core Motion framework for detecting the device activity,
i.e. Walking, Running, Automotive, Stationary.
The main issue is that I am able to detect walking and running with great accuracy, but my device is not able to detect the Automotive mode.
Here is my code
var motionActivityManager: CMMotionActivityManager?

if CMMotionActivityManager.isActivityAvailable() {
    // The manager must be created before starting updates;
    // calling through a nil optional silently does nothing.
    motionActivityManager = CMMotionActivityManager()
    motionActivityManager?.startActivityUpdatesToQueue(NSOperationQueue.currentQueue()!, withHandler: { activityData in
        if activityData!.walking == true {
            self.lblActivityStatus?.text = "Walking"
        } else if activityData!.running == true {
            self.lblActivityStatus?.text = "Running"
        } else if activityData!.automotive == true {
            self.lblActivityStatus?.text = "Automotive"
        } else if activityData!.stationary == true {
            self.lblActivityStatus?.text = "Stationary"
        }
        print("Activity Data: ", activityData)
    })
}
I finally got the answer by testing the same app on multiple devices.
The M7 chip in first-generation motion-coprocessor devices does not work properly for the "Automotive" mode in particular.
When I was testing on the iPhone 5s and iPad Air, Automotive mode was not detected as it should be. But testing the same app on an iPhone 6 Plus, it worked fine.
So the problem was with the device, not with the framework.
I'm working on an app that connects to a device on the same local WiFi network. The device takes measurements and streams the data to the iPhone. The iPhone plots it in real time.
The problem I've run into is that the iPhone delays the packets by up to 200 ms every half second or so. This causes noticeable stutter in the UI that I'd like to eliminate.
I started debugging the issue by placing signposts in the code when a packet is received.
Looking at the profiler, you can easily see the gaps in data.
Zooming in on the space after a gap, you can see a burst of packets received.
I've checked and it isn't dropping any packets. It is simply delaying them from my app.
The weird thing is this isn't an issue on the simulator or with the Android version of the app so I know it isn't an issue with the device or the WiFi network.
Here is the same code running on the simulator showing a much more even distribution of packets.
Has anyone experienced anything like this? Is this just some kind of battery-saving limitation of the iPhone hardware? Is there any way to ensure a more timely delivery of the data to my application?
I tried rewriting the connection using SwiftNIO and ended up with the same results. I've also tried changing the serviceClass parameter of the connection to all the possibilities with no change.
Here is the relevant connection code.
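(The code below references log and signpostID, which are defined elsewhere in the app, roughly like this:)
import os.signpost

// Assumed definitions for the signpost handles used below;
// the subsystem name is a placeholder.
let log = OSLog(subsystem: "com.example.app", category: .pointsOfInterest)
let signpostID = OSSignpostID(log: log)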
private func udpReceive() {
    if udpConnection?.state == .ready {
        udpConnection?.receive(minimumIncompleteLength: 1, maximumLength: Int(Hangboard.shared.BufferSize), completion: { content, contentContext, isComplete, error in
            // Mark the start of packet handling for the Instruments timeline
            os_signpost(
                .begin,
                log: log,
                name: "udpReceive",
                signpostID: signpostID
            )
            Task {
                if let content = content {
                    let _ = await asyncResult(for: Hangboard.shared.udpDataReceivedNative(data: content.toKotlinByteArray(), offset: 0, length: Int32(content.count)))
                }
                os_signpost(
                    .end,
                    log: log,
                    name: "udpReceive",
                    signpostID: signpostID
                )
                // Re-arm the receive for the next datagram
                self.udpReceive()
            }
        })
    } else {
        disconnect(hadError: true)
    }
}
private func startUdpListener(port: NWEndpoint.Port) {
    let params = NWParameters.udp
    params.allowFastOpen = true
    params.serviceClass = .responsiveData
    let listener = try? NWListener(using: params, on: port)
    self.udpListener = listener
    listener?.newConnectionLimit = 1
    listener?.newConnectionHandler = { connection in
        connection.parameters.serviceClass = .responsiveData
        self.startUdpConnection(connection: connection)
    }
    listener?.start(queue: .global(qos: .userInteractive))
}
private func startUdpConnection(connection: NWConnection) {
    self.udpConnection = connection
    connection.stateUpdateHandler = { state in
        switch state {
        case .ready:
            self.udpReceive()
        case .failed(let error):
            print("Connection error! \(error.localizedDescription)")
            self.disconnect(hadError: true)
        default:
            break
        }
    }
    connection.start(queue: .global(qos: .userInteractive))
}
It turns out the reason for this was that I was still running a Bonjour search in the background.
Disabling the search when connecting and restarting it on disconnect removed the latency issues.
Apple's tech support mentioned this can be a problem when includePeerToPeer is enabled on a connection.
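Here is a minimal sketch of that fix, assuming the connection code above; the browser property, the method names, and the "_hangboard._udp" service type are hypothetical, not from my actual app.
import Network

private var browser: NWBrowser?

private func startBrowsing() {
    // includePeerToPeer is the option Apple's tech support flagged.
    let params = NWParameters()
    params.includePeerToPeer = true
    // "_hangboard._udp" is an assumed service type for illustration.
    browser = NWBrowser(for: .bonjour(type: "_hangboard._udp", domain: nil), using: params)
    browser?.start(queue: .global(qos: .utility))
}

private func stopBrowsing() {
    // Cancel the Bonjour search while a connection is live;
    // leaving it running is what caused the periodic packet bursts.
    browser?.cancel()
    browser = nil
}
Calling stopBrowsing() at the start of startUdpConnection(connection:) and startBrowsing() from disconnect(hadError:) keeps the search inactive while packets are flowing.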
I really want this to work:
if UIDevice.current.userInterfaceIdiom == .pad {
    print("iPad")
} else {
    print("not iPad")
}
However, my app only prints "not iPad" even though I am using an iPad. I have Devices (under Deployment Info) set to iPhone. If I change this to Universal it works, but I don't want a universal app; I just want to be able to detect whether an iPhone or iPad is being used (even though the app is for iPhones, it can still be run on iPads via compatibility mode).
So how can I detect whether the device is an iPad or iPhone without changing my app to Universal? Thanks.
You can check the model:
if UIDevice.current.model.hasPrefix("iPad") {
    print("it is an iPad")
} else {
    print("it is not an iPad")
}
One thing you can do is get the inner width of the screen. Phones are generally below 786 px, and you can call everything else an iPad. You can do something like this:
var width = window.innerWidth;
if (width > 786) {
    console.log('ipad');
} else {
    console.log('not ipad');
}
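For a native iOS app, the same idea would look roughly like this sketch in Swift; the 768-point cutoff is illustrative, and note the compatibility-mode caveat in the comments.
let width = UIScreen.main.bounds.width
// iPads report at least 768 points in portrait; current iPhones are narrower.
// Caveat: an iPhone-only app running in compatibility mode on an iPad reports
// iPhone-sized bounds, so this check may not help for the asker's exact case.
if width >= 768 {
    print("ipad")
} else {
    print("not ipad")
}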
An iPhone-only app can be downloaded to an iPad. But in the current scenario we don't have devices with a much smaller resolution (deployment target: iOS 9).
OBJ-C
if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone && [[[UIDevice currentDevice] model] hasPrefix:@"iPad"]) {
    // This app is an iPhone app running on an iPad
    NSLog(@"This app is an iPhone app running on an iPad");
}
Swift
if UIDevice.current.userInterfaceIdiom == .phone, UIDevice.current.model.hasPrefix("iPad") {
    print("iPad")
}
Try this:
print("Model - \(UIDevice.current.model)")
I know how to turn the flashlight on/off using the back camera, and I know how to switch between the front and back cameras. But I don't know how to turn the flashlight on/off independently of the active camera.
What I mean is: if the active camera is the front one, then when I turn on the flashlight it freezes, and when I turn it off it unfreezes.
My code so far:
// Find the back camera (only the back camera has a torch); devices
// presumably comes from AVCaptureDevice.devices() (Swift 3-era API),
// as the casts below suggest.
let devices = AVCaptureDevice.devices()
var back_cam: AVCaptureDevice?
for device in devices {
    if (device as AnyObject).hasMediaType(AVMediaTypeVideo) && (device as AnyObject).position == .back {
        back_cam = device as? AVCaptureDevice
    }
}
guard let cam = back_cam else {
    print("no back cam?")
    return
}
if cam.hasTorch {
    do {
        try cam.lockForConfiguration()
        if cam.torchMode == .on {
            cam.torchMode = .off
        } else {
            do {
                try cam.setTorchModeOnWithLevel(1)
            } catch {
                print(error)
            }
        }
        cam.unlockForConfiguration()
    } catch {
        print(error)
    }
}
EDIT
In case I'm not clear, I'd like to keep the light on regardless of whether the active camera is the back one or the front one.
You need to check whether torch mode is supported on the device before you turn it on/off.
As per Apple's documentation for
var torchMode: AVCaptureTorchMode { get set }
Before setting the value of this property, call the
isTorchModeSupported(_:) method to make sure the device supports the
desired mode. Setting the device to an unsupported torch mode results
in the raising of an exception.
Also, when you switch to the front camera, the default behaviour is torch mode off. Compare it with the built-in Camera app on iPhone/iPad.
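For example, a minimal sketch of that check, assuming the cam back-camera device from the question's code (Swift 3-era AVFoundation):
if cam.hasTorch && cam.isTorchModeSupported(.on) {
    do {
        try cam.lockForConfiguration()
        cam.torchMode = .on // safe now: .on is known to be supported
        cam.unlockForConfiguration()
    } catch {
        print(error)
    }
} else {
    print("Torch mode .on is not supported on this device")
}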
I'm trying to replicate the "Pitch Lock" on / off feature of the DJI Go app. How can I do this?
I'm using Xcode 8.2.1, building for iOS 10.1, connecting to an Osmo Mobile with an iPhone 6s attached. The Osmo Mobile has the latest firmware (version 01.30.01.52).
Everything works so far (registerApp, connecting via bluetooth, getting handheld button presses, getting gimbal battery updates, getting gimbal updates).
Setting setGimbalWorkMode to either .freeMode or .yawFollowMode doesn't seem to have any effect. No error is returned in the completion block, but there's no effect on Gimbal operation.
The gimbal behaves as if it is in .freeMode (always moves to the exact direction handheld stick is pointing), but DJIGimbalDelegate only receives .yawFollowMode updates (which is what the pitchLock mode should do).
Setting setGimbalWorkMode to other modes results in an error (as expected with Osmo Mobile device).
Here's how I'm trying to toggle pitchLock on/off.
@IBAction func pitchLockPressed(_ sender: UIButton) {
    pitchLock = !pitchLock
    if let gimbal = fetchGimbal() {
        var workMode: DJIGimbalWorkMode = .freeMode // .fpvMode and .unknown return an error with the Osmo Mobile
        if pitchLock {
            workMode = .yawFollowMode
        }
        gimbal.setGimbalWorkMode(workMode, withCompletion: { (error) in
            if (error != nil) {
                print("error workMode: \(error?.localizedDescription)")
                self.pitchLock = !(self.pitchLock) // back to previous
            }
        })
    }
}
Here's the delegate, which only reports .yawFollowMode no matter what I do:
func gimbal(_ gimbal: DJIGimbal, didUpdate gimbalState: DJIGimbalState) {
    // var needUpdate = false
    if lastReportedWorkMode != gimbalState.workMode {
        lastReportedWorkMode = gimbalState.workMode
        switch lastReportedWorkMode {
        case DJIGimbalWorkMode.fpvMode:
            print("FPV\n")
        case DJIGimbalWorkMode.freeMode:
            print("Free\n")
        case DJIGimbalWorkMode.yawFollowMode:
            print("Yaw-follow\n")
        case DJIGimbalWorkMode.unknown:
            print("Unknown\n")
        }
    }
}
Anyone getting setGimbalWorkMode to actually change gimbal modes?
For my iOS app I want to implement a feature where the screen turns off (like when you answer a phone call) when the device is face down.
So I've started by detecting the device orientation:
//in my ViewDidLoad
NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(self.rotated(_:)), name: UIDeviceOrientationDidChangeNotification, object: nil)
// called when the device changes orientation
func rotated(notification: NSNotification) {
    if UIDevice.currentDevice().orientation == UIDeviceOrientation.FaceDown {
        print("device = faced down")
    } else {
        print("device != faced down")
    }
}
When the device is face down I've called
UIDevice.currentDevice().proximityMonitoringEnabled = true
and otherwise
UIDevice.currentDevice().proximityMonitoringEnabled = false
The problem is that UIDeviceOrientationDidChangeNotification seems to fire a little late: by the time the rotated() function is called, the device is already face down, and it turns out that for proximityMonitoringEnabled = true to turn off the screen, the proximity sensor must not already be covered!
I'm pretty sure this is an Apple limitation, but maybe someone out there has found a solution or come across a workaround!
Thanks in advance.
Approach:
Since iOS doesn't provide a notification before the orientation changes, we can't rely on UIDeviceOrientationDidChangeNotification. Instead, we can use the Core Motion framework and the hardware gyroscope to detect likely face-down rotations and set proximityMonitoringEnabled appropriately.
Gyroscope Data:
Based on gyroscope observations, the values below can plausibly detect a rotation toward a face-down orientation.
let gyroData = (minX:-3.78, minY:-3.38, minZ:-5.33, maxX:3.29, maxY:4.94, maxZ:3.36)
Solution in Swift:
import UIKit
import CoreMotion

class ProximityViewController: UIViewController {
    let cmManager = CMMotionManager(), gyroData = (minX: -3.78, minY: -3.38, minZ: -5.33, maxX: 3.29, maxY: 4.94, maxZ: 3.36)

    override func viewDidLoad() {
        super.viewDidLoad()
        // Using GyroMotion
        experimentCoreMotion()
    }

    // MARK: - Using Core Motion
    func experimentCoreMotion() {
        if cmManager.gyroAvailable {
            // Enable device orientation notifications
            UIDevice.currentDevice().beginGeneratingDeviceOrientationNotifications()
            cmManager.gyroUpdateInterval = 0.1
            handleFaceDownOrientation()
        }
    }

    func handleFaceDownOrientation() {
        cmManager.startGyroUpdatesToQueue(NSOperationQueue.currentQueue()!, withHandler: { (data: CMGyroData?, error: NSError?) in
            if self.isGyroDataInRange(data!) {
                // Only enable proximity monitoring once the rotation looks like a face-down flip
                UIDevice.currentDevice().proximityMonitoringEnabled = (UIDevice.currentDevice().orientation == .FaceDown)
                if UIDevice.currentDevice().orientation == .FaceDown { print("FaceDown detected") }
                else { print("Orientation is not face down") }
            }
        })
    }

    func isGyroDataInRange(val: CMGyroData) -> Bool {
        return ((val.rotationRate.x > gyroData.minX && val.rotationRate.x < gyroData.maxX) &&
                (val.rotationRate.y > gyroData.minY && val.rotationRate.y < gyroData.maxY) &&
                (val.rotationRate.z > gyroData.minZ && val.rotationRate.z < gyroData.maxZ))
    }
}
Hope my solution solves your query. Let me know if it works for your requirement.
Gyroscope & Accelerometer Observations:
I've experimented with the possible values for the face-down orientation using the gyroscope and accelerometer. IMO the gyroscope data works fine, but it's open to explore the other hardware sensors to detect a face-down orientation.