I can't get my head around how to read out simple status data, like the current gimbal pitch for example.
I have not found a solid connection between the DJI SDK documentation and what actually works in Xcode. The SDK gives me hints, and together with Xcode's autocompletion I inch forward, slowly.
The class GimbalState has a member getAttitudeInDegrees() with the description:
"The current gimbal attitude in degrees. Roll, pitch and yaw are 0 if the gimbal is level with the aircraft and points in the forward direction of North Pole." - Great!
However, it does not autocomplete in Xcode, nor does it compile.
Other approaches tested:
var gimbalStateInformation = DJIGimbalState()
print(gimbalStateInformation.attitudeInDegrees.pitch.description)
--> All pitch, roll and yaw values come out as 0.0
var gimbalStateInformation = DJIGimbalAttitude()
print(gimbalStateInformation.pitch.description)
--> All pitch, roll and yaw values come out as 0.0
I've tried to reach the information via keys, but my app crashes when I run the code.
func getGimbalAttitude() {
    // Get the key
    guard let gimbalAttitudeKey = DJIGimbalKey(param: DJIGimbalParamAttitudeInDegrees) else {
        print("Could not create DJIGimbalParamAttitudeInDegrees key")
        return
    }
    // Get the key manager
    guard let keyManager = DJISDKManager.keyManager() else {
        print("Could not get the key manager, make sure you are registered")
        return
    }
    // Test if the key is available
    let testing = keyManager.isKeySupported(gimbalAttitudeKey)
    self.statusLabel.text = String(testing) // This comes out true
    // Use the key to retrieve the info
    let gimbalAttitudeValue = keyManager.getValueFor(gimbalAttitudeKey)
    let gimbalAttitude = gimbalAttitudeValue?.value as! DJIGimbalState
    _ = gimbalAttitude.attitudeInDegrees.pitch
    // --> Application crashes on the line above
}
I'm working with a Mavic Mini.
Please advise in general terms how to connect the DJI Mobile SDK to Swift and specifically how I can read out the current gimbal pitch value.
You can access the current gimbal pitch through its didUpdate state delegate method:
import DJISDK
class GimbalController: NSObject, DJIGimbalDelegate {

    let gimbal: DJIGimbal

    init(gimbal: DJIGimbal) {
        self.gimbal = gimbal
        super.init()
        gimbal.delegate = self
    }

    func gimbal(_ gimbal: DJIGimbal, didUpdate state: DJIGimbalState) {
        print(state.attitudeInDegrees.pitch)
    }
}
// create an instance of the custom gimbal controller in some other class
// and pass it the gimbal instance
if let aircraft = DJISDKManager.product() as? DJIAircraft {
    if let gimbal = aircraft.gimbal {
        let gimbalController = GimbalController(gimbal: gimbal)
    }
}
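Regarding the more general question of how to connect the DJI Mobile SDK from Swift: you typically register the app with DJISDKManager, wait for the registration and product-connection callbacks, and only then grab the gimbal and attach a controller like the one above. Below is a minimal, hedged sketch; the delegate method names are from Mobile SDK 4.x and it assumes your DJI app key is already configured in Info.plist, so adjust to the SDK version you actually use.
import DJISDK

class ConnectionManager: NSObject, DJISDKManagerDelegate {

    var gimbalController: GimbalController?

    func start() {
        // Registers the app using the DJI app key from Info.plist
        DJISDKManager.registerApp(with: self)
    }

    func appRegisteredWithError(_ error: Error?) {
        if let error = error {
            print("Registration failed: \(error)")
            return
        }
        // Start looking for the aircraft once registration succeeds
        DJISDKManager.startConnectionToProduct()
    }

    func productConnected(_ product: DJIBaseProduct?) {
        // Once the Mavic Mini is connected, keep a GimbalController around
        // so its didUpdate delegate starts delivering attitude values
        if let aircraft = product as? DJIAircraft, let gimbal = aircraft.gimbal {
            gimbalController = GimbalController(gimbal: gimbal)
        }
    }

    func productDisconnected() {
        gimbalController = nil
    }

    // Required by newer SDK versions; an empty stub is fine here
    func didUpdateDatabaseDownloadProgress(_ progress: Progress) {}
}
Create one ConnectionManager early in the app lifecycle (for example in the app delegate), keep a strong reference to it, and call start().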
Related
There is a new heading property of CMDeviceMotion in iOS 11.
I'm trying to use it but find that it's always -1.0. It's supposed to hold degrees as a Double from 0.0 to 360.0.
My app targets iOS 11+, and I'm testing on a physical device (iPhone) running iOS 11.
let mmgr = CMMotionManager()
mmgr.showsDeviceMovementDisplay = true // for calibrating magnetometer, maybe not needed?
mmgr.deviceMotionUpdateInterval = 0.1
mmgr.startDeviceMotionUpdates(to: .main, withHandler: { (motionData: CMDeviceMotion?, error: Error?) in
    if let motion = motionData {
        print("heading:", motion.heading) // always -1.0
    }
})
I can get other properties just fine, such as motion.attitude.roll. Is there something I'm missing?
The issue is that I needed to start motion updates with a different method signature which includes the CMAttitudeReferenceFrame option:
let mmgr = CMMotionManager()
mmgr.deviceMotionUpdateInterval = 0.1
mmgr.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main, withHandler: { (motionData: CMDeviceMotion?, error: Error?) in
    if let motion = motionData {
        print("heading:", motion.heading) // works
    }
})
The What's New in iOS 11 guide states that if you use xArbitraryZVertical (default) or xArbitraryCorrectedZVertical for the CMAttitudeReferenceFrame option, the heading will always be -1.
This helpful tidbit is not stated in the heading property reference.
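If you also want to guard against devices that cannot supply a magnetic-north frame (for example, no magnetometer), one hedged addition is to check the available reference frames before starting updates. This is standard Core Motion API, but treat the fallback behaviour as an assumption for your app:
import CoreMotion

let mmgr = CMMotionManager()
mmgr.deviceMotionUpdateInterval = 0.1

// heading is only populated for the magnetic/true north reference frames
if CMMotionManager.availableAttitudeReferenceFrames().contains(.xMagneticNorthZVertical) {
    mmgr.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { motionData, _ in
        if let motion = motionData {
            print("heading:", motion.heading)
        }
    }
} else {
    print("Magnetic-north reference frame not available on this device")
}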
OK, I am new to URL querying and this whole aspect of Swift and need help. As it stands, I have an iMessage app that contains an SKScene. For the users to take turns playing the game, I need to send the game back and forth in messages within one session, as I learned here: https://medium.com/lost-bananas/building-an-interactive-imessage-application-for-ios-10-in-swift-7da4a18bdeed.
So far I have my scene working; however, I've pored over Apple's ice cream demo, where they send the continuously built ice cream back and forth, and I can't understand how to "query" everything in my SKScene so I can send the scene.
I'm unclear as to how URLQueryItems work, as the documentation does not relate to SpriteKit scenes.
Apple queries their "ice cream" in its current state like this:
init?(queryItems: [URLQueryItem]) {
    var base: Base?
    var scoops: Scoops?
    var topping: Topping?
    for queryItem in queryItems {
        guard let value = queryItem.value else { continue }
        if let decodedPart = Base(rawValue: value), queryItem.name == Base.queryItemKey {
            base = decodedPart
        }
        if let decodedPart = Scoops(rawValue: value), queryItem.name == Scoops.queryItemKey {
            scoops = decodedPart
        }
        if let decodedPart = Topping(rawValue: value), queryItem.name == Topping.queryItemKey {
            topping = decodedPart
        }
    }
    guard let decodedBase = base else { return nil }
    self.base = decodedBase
    self.scoops = scoops
    self.topping = topping
}
fileprivate func composeMessage(with iceCream: IceCream, caption: String, session: MSSession? = nil) -> MSMessage {
    var components = URLComponents()
    components.queryItems = iceCream.queryItems

    let layout = MSMessageTemplateLayout()
    layout.image = iceCream.renderSticker(opaque: true)
    layout.caption = caption

    let message = MSMessage(session: session ?? MSSession())
    message.url = components.url!
    message.layout = layout

    return message
}
But I can't figure out how to "query" an SKScene. How can I "send" an SKScene back and forth? Is this possible?
You do not need to send an SKScene back and forth :) What you need to send is the information describing your game state - such as the number of turns, or whose turn it is - so that the app at the other end can rebuild the scene from it.
Without knowing more about how your scene is set up and how it uses the information received from the other player's session, I can't give you many specifics. But if you are using URLQueryItems to pass the information, simply retrieve the list of query items in your scene and configure the scene based on the received values.
If you have specific questions about how this could be done, either share the full project or post the relevant bits of code showing where one player sends a message and how the other player receives the information and sets up the scene, and I (or somebody else) should be able to help.
Also, if you look at composeMessage in the code you posted above, you will see how in that particular example the scene/game information is sent to the other user. At the receiving end, the message's url property is decomposed to get the values of the various query items, and the scene is then set up from those values. Look at how that is done in order to figure out how your scene should be set up; a rough sketch of the same pattern applied to a game state follows.
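As a hedged illustration of that pattern (the GameState type, its fields, and the configureScene function below are hypothetical placeholders, not part of your project or Apple's sample), encoding a turn-based game state into query items and rebuilding the scene from them could look like this:
import Foundation
import SpriteKit
import Messages

// Hypothetical game state: replace the fields with whatever your scene actually needs.
struct GameState {
    var turnCount: Int
    var currentPlayer: String

    // Encode the state as query items for the MSMessage URL
    var queryItems: [URLQueryItem] {
        return [
            URLQueryItem(name: "turnCount", value: String(turnCount)),
            URLQueryItem(name: "currentPlayer", value: currentPlayer)
        ]
    }

    // Decode the state from the query items of a received message URL
    init?(queryItems: [URLQueryItem]) {
        var turnCount: Int?
        var currentPlayer: String?
        for item in queryItems {
            guard let value = item.value else { continue }
            if item.name == "turnCount" { turnCount = Int(value) }
            if item.name == "currentPlayer" { currentPlayer = value }
        }
        guard let decodedTurnCount = turnCount, let decodedPlayer = currentPlayer else { return nil }
        self.turnCount = decodedTurnCount
        self.currentPlayer = decodedPlayer
    }
}

// Receiving side: pull the state out of the selected message and hand it
// to a freshly built scene instead of trying to transmit the scene itself.
func configureScene(from message: MSMessage, in view: SKView) {
    guard let url = message.url,
          let components = URLComponents(url: url, resolvingAgainstBaseURL: false),
          let items = components.queryItems,
          let state = GameState(queryItems: items) else { return }

    let scene = SKScene(size: view.bounds.size)
    // Hypothetical scene setup driven by the decoded state
    let label = SKLabelNode(text: "Turn \(state.turnCount): \(state.currentPlayer)")
    label.position = CGPoint(x: scene.size.width / 2, y: scene.size.height / 2)
    scene.addChild(label)
    view.presentScene(scene)
}
The point is the same as in the ice cream sample: the message carries only the few values needed to rebuild the scene, never the SKScene object itself.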
Currently I'm working on an update for an existing app (migration to Swift 3). I have targets for Today, Search, Message and Watch extensions. Every target needs to access the Core Data model of my app, so I created an App Group and enabled the capability for every target. Additionally, I've subclassed NSPersistentContainer so that everything is stored in the shared container:
import CoreData
class PFPersistentContainer: NSPersistentContainer {
    override open class func defaultDirectoryURL() -> URL {
        if let url = FileManager.default.containerURL(forSecurityApplicationGroupIdentifier: "add-groupname-here") {
            return url
        }
        // Fallback
        return super.defaultDirectoryURL()
    }
}
This class file, as well as my Core Data model and the following class, have target membership in all of the mentioned targets. The Codegen of the entities is set to Class Definition. So far I'm using the default implementation of the Core Data stack:
class DataManager {

    /**
     * Singleton Implementation
     */
    internal static let sharedInstance: DataManager = {
        let instance = DataManager()
        return instance
    }()

    // MARK: - Core Data stack

    lazy var persistentContainer: PFPersistentContainer = {
        let container = PFPersistentContainer(name: "Data")
        container.loadPersistentStores(completionHandler: { (storeDescription, error) in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
        })
        return container
    }()

    // MARK: - Core Data Saving support

    func saveContext() {
        let context = persistentContainer.viewContext
        if context.hasChanges {
            do {
                try context.save()
            } catch {
                let nserror = error as NSError
                fatalError("Unresolved error \(nserror), \(nserror.userInfo)")
            }
        }
    }
}
Now the weird part: accessing this data from the Today and Message extensions works gracefully (I'm assuming the Search extension works too, but as a developer I'm not able to test this so far). Trying to fetch the data with the same request from the Watch app's extension results in an empty array. Here is my fetch code:
internal func getLimitedPOIsWithCalculatedDistance(andCurrentLocation currentLocation: CLLocation) -> Array<POI> {
    var poiArray: Array<POI> = []
    guard let context = self.backgroundContext else { return [] }
    do {
        // Initialize Fetch Request
        let request: NSFetchRequest<NSFetchRequestResult> = NSFetchRequest(entityName: "POI")
        // Create Entity Description
        let entityDescription = NSEntityDescription.entity(forEntityName: "POI", in: context)
        // Configure Fetch Request
        request.entity = entityDescription
        // Configure Predicate
        let predicate = NSPredicate(format: "(disabled == NO) AND (locationDistanceCalculated <= \(SETTINGS_MAXIMUM_FETCH_DISTANCE))")
        request.predicate = predicate
        if let result = try context.fetch(request) as? Array<POI> {
            for poi in result {
                let poiLocation = CLLocation(latitude: poi.locationLatitude, longitude: poi.locationLongitude)
                let distance = currentLocation.distance(from: poiLocation)
                poi.locationDistanceTransient = Double(distance)
            }
            poiArray = result.filter({ (poi) -> Bool in
                return poi.locationDistanceTransient > 0.0
            })
            poiArray.sort { (first, second) -> Bool in
                return first.locationDistanceTransient < second.locationDistanceTransient
            }
        }
    } catch {
        print("Error in WatchDataInterface.getLimitedPOIsWithCalculatedDistance(): \(error.localizedDescription)")
    }
    return poiArray
}
A bit more context for better understanding: the POI entity contains the latitude and longitude of a location. When the app starts, I pre-calculate the distance from the current user's position to each point in the database. When the Today extension tries to get the nearest x (in my case 8) POIs to the current user's position, fetching all 15k POIs and calculating their distances is too much memory-wise. So I had to pre-calculate the distance (stored in locationDistanceCalculated), then fetch within a given radius (the static SETTINGS_MAXIMUM_FETCH_DISTANCE) and calculate a precise distance during the fetch (stored in the transient property locationDistanceTransient). A sketch of this pre-calculation step is shown below.
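For illustration, here is a sketch of what that pre-calculation pass could look like; it is assumed from the description above, and only the attribute names locationDistanceCalculated, locationLatitude and locationLongitude are taken from the posted code:
import CoreData
import CoreLocation

// Assumed pre-calculation pass: run once on app start after the user's
// position is known, on a background context to keep the UI responsive.
func precalculateDistances(from currentLocation: CLLocation, in context: NSManagedObjectContext) {
    context.perform {
        let request: NSFetchRequest<NSFetchRequestResult> = NSFetchRequest(entityName: "POI")
        do {
            if let pois = try context.fetch(request) as? [POI] {
                for poi in pois {
                    let poiLocation = CLLocation(latitude: poi.locationLatitude, longitude: poi.locationLongitude)
                    // Persisted attribute, later used by the radius predicate
                    poi.locationDistanceCalculated = currentLocation.distance(from: poiLocation)
                }
                try context.save()
            }
        } catch {
            print("Error pre-calculating POI distances: \(error.localizedDescription)")
        }
    }
}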
In the Watch app's ExtensionDelegate class, CLLocationManagerDelegate is implemented and the following code is called when the user's location is updated:
extension ExtensionDelegate: CLLocationManagerDelegate {
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        OperationQueue.main.addOperation {
            if let watchDataManager = self.watchDataManager {
                // getNearestPOIs() calls the getLimitedPOIsWithCalculatedDistance() function and returns only a numberOfPOIs
                self.nearbyPOIs = watchDataManager.getNearestPOIs(numberOfPOIs: 4)
            }
        }
    }
}
This was tested in the Simulator and on a device. The context.fetch always returns an empty array (yes, Core Data contains values, and yes, I've tested it without the predicate). Am I missing anything new in Core Data that I haven't considered, or are there limitations in watchOS 3 that explain why this isn't working? Does anyone have a clue? Thanks for your help.
Update: when using a watch framework target to access Core Data, as described in this project, the fetch is still empty. Maybe this is the right path but a watch framework is the wrong choice. Will keep you up to date.
Update 2: I've already checked the App Programming Guide for watchOS and the transferFile:metadata: function in the API reference, but it doesn't seem to be a suitable way to send such large amounts of data to the Apple Watch. I just can't rely on the user checking the app more often than they travel out of the SETTINGS_MAXIMUM_FETCH_DISTANCE radius, and for locations with a high density of POIs the data is even more extensive.
Update 3: I've reimplemented the nearby feature for POIs to fetch only within a given radius. If this solves a problem for you, check out this public Gist.
Since watchOS 3, you cannot access a Core Data store using the App Group trick, because the watch is supposed to work without the phone.
So you have to use WatchConnectivity to fetch your data.
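A rough, non-authoritative sketch of that approach follows; the message keys, the POI payload format and the reuse of getLimitedPOIsWithCalculatedDistance are assumptions based on the question, and only the WCSession calls themselves are standard API:
import WatchConnectivity
import CoreLocation

// iOS side: answer requests from the watch with a small, plist-friendly payload.
class PhoneSessionHandler: NSObject, WCSessionDelegate {

    func startSession() {
        guard WCSession.isSupported() else { return }
        WCSession.default.delegate = self
        WCSession.default.activate()
    }

    func session(_ session: WCSession, didReceiveMessage message: [String: Any],
                 replyHandler: @escaping ([String: Any]) -> Void) {
        guard let latitude = message["latitude"] as? Double,
              let longitude = message["longitude"] as? Double else {
            replyHandler([:])
            return
        }
        let location = CLLocation(latitude: latitude, longitude: longitude)
        // Hypothetical: reuse whichever manager owns the fetch on the phone side
        let pois = DataManager.sharedInstance.getLimitedPOIsWithCalculatedDistance(andCurrentLocation: location)
        let payload = pois.prefix(4).map { poi -> [String: Double] in
            ["latitude": poi.locationLatitude,
             "longitude": poi.locationLongitude,
             "distance": poi.locationDistanceTransient]
        }
        replyHandler(["pois": payload])
    }

    // Required delegate methods, stubbed for brevity
    func session(_ session: WCSession, activationDidCompleteWith activationState: WCSessionActivationState, error: Error?) {}
    func sessionDidBecomeInactive(_ session: WCSession) {}
    func sessionDidDeactivate(_ session: WCSession) {}
}

// watchOS side: ask the phone for the nearby POIs instead of reading Core Data.
// Assumes the watch's own WCSession has already been activated with a delegate
// and that the phone is currently reachable.
func requestNearbyPOIs(for location: CLLocation) {
    let message = ["latitude": location.coordinate.latitude,
                   "longitude": location.coordinate.longitude]
    WCSession.default.sendMessage(message, replyHandler: { reply in
        if let pois = reply["pois"] as? [[String: Double]] {
            print("Received \(pois.count) nearby POIs")
        }
    }, errorHandler: { error in
        print("WCSession error: \(error.localizedDescription)")
    })
}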
Idea
My idea is to write a little application for my iOS device that starts recording a motion when a UIButton is touched and saves the accelerometer data for that motion. After the recording, I can store this data and use it in the finished application to detect the same motion again.
So what I'm looking for is a way to compare two NSMutableArrays, each holding a set of motion data, and check whether the current NSMutableArray represents the same motion as the recorded one.
Problems
To realize this idea, I ran into three problems which I can't solve myself:
How to compare two NSMutableArrays and get, for example, a similarity index or a percentage, e.g. 97.32% equivalent.
What if the user performs the motion at different speeds, e.g. slowly or fast?
The last problem: how to handle different device orientations? Is there a way to transform the accelerometer data to a neutral reference in any orientation? I think I have an approach for this last problem but don't know how to turn it into code: we have the attitude data for pitch, roll and yaw. Maybe we can use these values to compute neutral accelerometer data?
Code
At the moment I only have the code to get the accelerometer and attitude data, both from the CMMotionManager. I've played with my code for hours and searched a lot on Google, but can't find the right way...
import UIKit
import CoreMotion

class FirstViewController: UIViewController {

    var motionManager = CMMotionManager()
    var accelerometerData = NSMutableArray()

    var x: Float = 0.0
    var y: Float = 0.0
    var z: Float = 0.0
    var roll: Float = 0.0
    var pitch: Float = 0.0
    var yaw: Float = 0.0

    override func viewDidLoad() {
        super.viewDidLoad()
        motionManager.accelerometerUpdateInterval = 1
        motionManager.deviceMotionUpdateInterval = 1
    }

    func degrees(radians: Double) -> Double {
        return radians * 180.0 / .pi
    }

    @IBAction func startRecording() {
        accelerometerData = NSMutableArray()
        motionManager.startDeviceMotionUpdates(using: CMAttitudeReferenceFrame.xMagneticNorthZVertical,
                                               to: OperationQueue.current!,
                                               withHandler: { (deviceMotion, error) -> Void in
            guard let deviceMotion = deviceMotion else { return }
            let attitude = deviceMotion.attitude
            self.roll = Float(self.degrees(radians: attitude.roll))
            self.pitch = Float(self.degrees(radians: attitude.pitch))
            self.yaw = Float(self.degrees(radians: attitude.yaw))
            self.x = Float(deviceMotion.userAcceleration.x)
            self.y = Float(deviceMotion.userAcceleration.y)
            self.z = Float(deviceMotion.userAcceleration.z)
        })
    }
}
Update
I will update this question, when I know more.
Problem 3: Solution:
With the CMAttitudeReferenceFrame.xMagneticNorthZVertical option of the motionManager.startDeviceMotionUpdates method you can normalize all sensor data to a defined reference frame. You can access the accelerometer data there too. I've updated the code above accordingly.
I'm trying to get a label to rotate based on the tilt of the device in an app I'm making in Swift. Given that I should be fine with rotating the label itself, does anybody know how to find the current accelerometer values of an iPhone? For example, getting it to print its x, y and z values once every second or so.
Thanks in advance.
Sample code:
Add the CoreMotion framework to your project.
Add the following code in the appropriate places:
import UIKit
import CoreMotion

var motionManager = CMMotionManager()
var accelX: CGFloat = 0

// Set up accelerometer detection
if motionManager.isAccelerometerAvailable {
    motionManager.startAccelerometerUpdates(to: OperationQueue()) { (data, error) -> Void in
        if let data = data {
            self.outputAccelerationData(data.acceleration)
        }
    }
}

func outputAccelerationData(_ acceleration: CMAcceleration) {
    accelX = 0
    if abs(CGFloat(acceleration.x)) > abs(accelX) {
        accelX = CGFloat(acceleration.x)
    }
}
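The question asks for roughly one reading per second; a small, hedged addition is to set the update interval before calling startAccelerometerUpdates (1.0 is just the value implied by the question):
// Deliver roughly one accelerometer sample per second
motionManager.accelerometerUpdateInterval = 1.0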