How does an external Bluetooth GPS like the Dual 150 integrate with CLLocationManager? Does it integrate automatically, or is there a special API for working with the external GPS rather than the internal one?
So it turns out that it does "automatically" integrate with Location Services when the Dual app from the App Store is used:
1. Ensure that the device is in "Apple" mode (switch near the power socket).
2. Ensure that the device is paired with your iOS device (iOS Bluetooth settings).
3. Ensure that the Dual's blue light is solid (Bluetooth paired).
4. Ensure that the Dual's green light is flashing (GPS syncing) or solid (GPS synced); if it is off, rerun the Dual app.
5. If it is still not working, power cycle the Dual and go back to step 1.
6. If it is still not working, power cycle the iOS device and go back to step 1.
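Once the receiver is paired and syncing, no special API is involved: fixes arrive through the regular Core Location callbacks, so the usual CLLocationManager code works unchanged. A minimal sketch of that standard setup (the class and names are generic, nothing here is specific to the Dual):

import CoreLocation

final class LocationReader: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    // Fixes from the external receiver are delivered through the same
    // delegate callback as fixes from the internal GPS.
    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        print("lat \(fix.coordinate.latitude), lon \(fix.coordinate.longitude), ±\(fix.horizontalAccuracy) m")
    }
}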
The ARKit API supports simultaneous world and face tracking via the back and front cameras, but unfortunately, due to hardware limitations, the 2020 iPad Pro cannot use this feature (probably because the LiDAR camera draws considerably more power). This is a bit of a step back.
Here is an updated reference in the example project:
guard ARWorldTrackingConfiguration.supportsUserFaceTracking else {
    fatalError("""
        This sample code requires iOS 13 / iPadOS 13, and an iOS device with \
        a front TrueDepth camera. Note: 2020 iPads do not support user \
        face-tracking while world tracking.
        """)
}
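On devices that do pass this check, the combined mode is simply switched on in the world-tracking configuration. A minimal sketch, assuming an existing ARSession (e.g. sceneView.session):

import ARKit

// Sketch: enable combined world + face tracking where the hardware allows it.
func runCombinedTracking(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        // ARFaceAnchor updates from the front camera arrive alongside world tracking.
        configuration.userFaceTrackingEnabled = true
    }
    session.run(configuration)
}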
There is also a forum conversation suggesting that this is a hardware limitation rather than an intentional software restriction.
It looks like the mobile technology is not "there yet" for both. However, for my use case I just wanted to be able to switch between front and back tracking modes seamlessly, without needing to reconfigure the tracking space. For example, I would like a button to toggle between "now you track and see my face" mode and "world tracking" mode.
There are two cases, possible or impossible, and the alternative approaches differ depending on which one applies.
If it's possible: can the tracking modes be switched without setting up the tracking space again, and how would that be achieved?
If it's impossible:
Even if I don't get face-tracking during world-tracking, is there a way to get a front-facing camera feed that I can use with the Vision framework, for example?
Specifically: how do I enable back-facing tracking, get the front- and back-facing camera feeds simultaneously, and disable one or the other selectively? Even without front-facing tracking, just the basic front feed would work for me.
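For concreteness, the toggle I have in mind would simply re-run the session with the other configuration; whether the mapped tracking space survives such a switch is exactly what I'm asking (a sketch only, with session being an existing ARSession):

import ARKit

// Sketch of the intended toggle between "track my face" and "track the world".
enum TrackingMode { case face, world }

func run(_ mode: TrackingMode, on session: ARSession) {
    switch mode {
    case .face:
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    case .world:
        // No reset options passed, in the hope of keeping the existing anchors.
        session.run(ARWorldTrackingConfiguration(), options: [])
    }
}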
I did a search on ARCore-enabled devices.
https://developers.google.com/ar/discover/supported-devices
The list doesn't say whether the Galaxy Tab S5E will be supported.
Will ARCore work on the Galaxy Tab S5E?
At the moment the Samsung Galaxy Tab S5E is not in the list of ARCore-supported devices because it's new – Google hasn't had time to update the list yet.
But I'm 99% sure it will end up on the list, because the Samsung Galaxy Tab S5E has all the sensors required for an AR experience: a gyroscope, an accelerometer, a magnetometer and a rear RGB camera.
Why not 100%? Because business policy and hardware issues are unpredictable things.
Is it possible to always use the GPS chip in the Apple Watch for a WatchKit app, even when an iPhone is connected? (This applies to Apple Watch Series 2 and above, because those models have a GPS chip on board.)
For my app on watchOS I need the most accurate GPS data. Unfortunately, when an iPhone is connected, location requests (using the standard code) are delegated to the iPhone, and the iPhone's GPS chip is used. This makes perfect sense from a power-conservation perspective, but not from an accuracy perspective: the iPhone could be tucked away in a coat, a bag, etc. In that case the phone has no clear view of the sky while the watch does, and the GPS data from the watch is much, much more accurate (while the iPhone's GPS data is extremely inaccurate).
Is there a way to configure the CLLocationManager to block delegation of GPS location requests to the iPhone and always use the GPS chip in the watch?
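For context, the "standard code" referred to above is nothing more exotic than the sketch below; nothing in it lets you choose which GPS chip produces the fix, as that decision stays with watchOS (the class name is generic):

import CoreLocation

final class WorkoutLocator: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        // Ask for the highest accuracy; whether the fix comes from the watch's
        // own GPS or is relayed from the paired iPhone is decided by the system.
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.activityType = .fitness
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // horizontalAccuracy degrades noticeably when the relayed iPhone fix is poor.
        print("±\(fix.horizontalAccuracy) m")
    }
}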
I have a requirement to support the iPod touch as part of a cross-platform Cordova application. The application should provide an indoor positioning system, using Wi-Fi triangulation or any other approach, to locate and map objects.
Does Cordova support the iPod touch? If so, what are the pros, cons and challenges of implementing such a solution?
Can the iPod touch be used for an indoor positioning system, and how can this be achieved with Cordova, e.g. using Wi-Fi triangulation or any other third-party SDK (please suggest one)?
Can the iPod touch camera be used to capture and process images, e.g. to read QR codes? Please suggest any third-party APIs.
Thanks,
Phani
I'm not an expert with iPod touches, but in theory there should be no problem with using them for indoor positioning.
Wi-Fi triangulation is not possible on iOS devices (you can only obtain information about the Wi-Fi base station you are currently attached to), so it's not the optimal solution. I would do this with Bluetooth beacons, which the iPod touch should support well. With beacons you can do triangulation and get to approximately 2 m accuracy. The easiest way to get that working is through a Cordova plugin. I work for a company called Proximi.io, which provides a unified positioning platform: all you have to do is place your beacons at their correct positions in our portal, and our libraries handle the rest. We also have a Cordova plugin.
That being said, we haven't tested our SDKs with iPod touches, so I would recommend testing it first. I also think that the camera usage should be possible, but unfortunately cannot help you more with that.
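For reference, the beacon ranging that such a plugin wraps looks roughly like this in native Core Location (Swift); the UUID below is a placeholder for whatever your own beacons advertise:

import CoreLocation

final class BeaconRanger: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Placeholder UUID: use the UUID programmed into your own beacons.
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!)

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: constraint)
    }

    // Called roughly once per second with the beacons currently in range,
    // each with an estimated distance (accuracy, in metres).
    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        for beacon in beacons {
            print("major \(beacon.major), minor \(beacon.minor), ~\(beacon.accuracy) m")
        }
    }
}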
First of all: this question is not directly programming related. However, the problem only exists for developers, so I'm trying to find an answer here anyway, since there may be other people in this community who have already solved it.
I want to record the screen of the iPad 2 to be able to create demo videos of an app.
Since I'm using motion data, I cannot use the simulator to create the video and have to use the actual iPad itself.
I've seen various websites where different methods were discussed.
iPad 2 <==> Apple Digital AV Adapter <==> Blackmagic Design Intensity Pro <==> Playback software <==> TechSmith Camtasia screen recorder on the playback software to circumvent the HDCP flag
iPad 2 <==> Apple VGA Adapter <==> VGA2USB <==> Recording software
...
Everyone seems to have his own hacky solution to this problem.
My setup is the following:
iPad 2 (without Jailbreak)
Apple Mac mini with Lion Server
PC with non-HDCP compliant main board
Non-HDCP compliant displays
It doesn't matter whether the recording has to be on the mac or on the PC.
My questions:
Is it possible to disable the HDCP flag programmatically from within the application itself?
HDMI offers better quality than VGA. Will the first method I've listed work with my setup even though I don't have a full HDCP chain?
What about the Intensity Extreme box? Can I use it, connect it to the Thunderbolt port of the Mac mini and record from there?
Is the Thunderbolt port of the Mac mini bidirectional and also suited for capturing? Is the Mac mini HDCP compliant? If it does not work because my screens are not HDCP compliant, will it work if I start the recording software and then disconnect the screens? Will it work if I use an iPad 2 over VNC as a screen, since the iPad must be HDCP compliant if it can send HDCP streams?
If I have to fall back to the VGA solution: will the VGA adapter mirror everything that's shown on the iPad 2 screen, or do I have to program a special presentation mode that sends everything to the VGA cable instead of the iPad screen (sketched after this list)? Will the proposed VGA2USB setup deliver good quality, or would you recommend other tools?
What about the Apple Composite AV Cable? Maybe another approach?
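To clarify what I mean by a "special presentation mode": driving the external output explicitly, instead of relying on mirroring, would look roughly like the generic UIKit sketch below; it is not tied to any specific adapter:

import UIKit

// Generic sketch: show dedicated content on an external display instead of
// relying on built-in mirroring. Some long-lived object must keep the window alive.
final class ExternalDisplayController {
    private var externalWindow: UIWindow?

    func startObserving() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main
        ) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            let window = UIWindow(frame: screen.bounds)
            window.screen = screen
            window.rootViewController = UIViewController() // replace with the real content
            window.isHidden = false
            self?.externalWindow = window
        }
    }
}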
I decided to use the Blackmagic Design Intensity Pro together with the Apple Digital AV adapter on a machine with a working HDCP chain.
This approach worked well: capturing is possible in 720p, with the iPad's screen centered within the captured video. This means capturing does not happen at the full native resolution of the iPad's screen, and black borders are added to fill the 720p video frames.
I posted info about displaying HDMI HDTV output on a non-HDCP monitor; look for my posts, perhaps it will be of use. Alternatively, how about just using a cell phone to record a video of your tablet's screen, with proper lighting, low reflectance, etc.? It won't be 100%, but it might be sufficient.